WO2025080091A1 - An electronic device for generating color-corrected image and controlling method thereof - Google Patents
An electronic device for generating color-corrected image and controlling method thereof
- Publication number
- WO2025080091A1 (PCT/KR2024/096294)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- electronic device
- color
- representation
- parameters
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06T5/60: Image enhancement or restoration using machine learning, e.g. neural networks (G: Physics; G06: Computing or calculating; counting; G06T: Image data processing or generation, in general; G06T5/00: Image enhancement or restoration)
- G06T11/10
- H04N1/60: Colour correction or control (H: Electricity; H04: Electric communication technique; H04N: Pictorial communication, e.g. television; H04N1/00: Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission, details thereof; H04N1/46: Colour picture communication systems; H04N1/56: Processing of colour picture signals)
- G06T2207/10024: Color image (G06T2207/00: Indexing scheme for image analysis or image enhancement; G06T2207/10: Image acquisition modality)
- G06T2207/20084: Artificial neural networks [ANN] (G06T2207/20: Special algorithmic details)
Definitions
- the storage 211 may be implemented with integrated hardware and software.
- the hardware may include a hardware disk controller with programmable search capabilities or a software system running on general-purpose hardware.
- examples of the storage 211 include, but are not limited to, in-memory databases, cloud databases, distributed databases, embedded databases, and the like.
- the storage 211 serves as a repository for storing data processed, received, and generated by one or more of the processors, and the modules/engines/units.
- the network interface 213 may be configured to provide network connectivity and enable communication with other devices, including the sender device, over one or more networks.
- At least one of a plurality of models, such as convolutional neural network (CNN), deep neural network (DNN), recurrent neural network (RNN), and restricted Boltzmann machine (RBM) models, may be implemented to execute the mechanism of the present subject matter through an AI model.
- a function associated with an AI module may be performed through the non-volatile memory, the volatile memory, and the processor.
- the processor may include one or a plurality of processors.
- one or a plurality of processors may be a general-purpose processor, such as a central processing unit (CPU), an application processor (AP), or the like, a graphics-only processing unit such as a graphics processing unit (GPU), a visual processing unit (VPU), and/or an AI-dedicated processor such as a neural processing unit (NPU).
- One or a plurality of processors control the processing of the input data in accordance with a predefined operating rule or artificial intelligence (AI) model stored in the non-volatile memory and the volatile memory.
- the predefined operating rule or artificial intelligence model is provided through training or learning.
- an input image 303 is provided to the data extraction module 219.
- the term 'input image' may be referred to as 'received input image' or 'received image' when described in the context of the receiver device. Further, the term 'input image' may be referred to as 'captured input image' when described in the context of the device involved in capturing the 'input image' such as the sender device.
- the input image 303 is received on the receiver device from the sender device as depicted at step 401 of FIG. 4. In an embodiment, the input image 303 may be captured from the sender device or may have been captured using another device, received by the sender device and then forwarded to the receiver device. Further, one or more effects may or may not have been applied on the captured input image 303 on the sender device or on another device used to capture the input image 303.
- the data extraction module 219 determines a plurality of parameters associated with the received input image 303.
- the plurality of parameters may include one or more image parameters 305 and one or more hardware parameters 307 involved in capturing of the received input image 303.
- the one or more image parameters 305 include edge maps, red, green, and blue (RGB) information, and segmented objects associated with the received input image 303. The determination of the one or more image parameters 305 is described below in greater detail in the forthcoming paragraphs.
- the one or more hardware parameters 307 are determined from metadata associated with the received input image 303. The determination of the one or more hardware parameters 307 is described below in greater detail in the forthcoming paragraphs.
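- By way of illustration only, the following is a minimal sketch of how such hardware parameters might be read from image metadata using the Pillow library; the particular EXIF fields selected here (camera make and model, ISO, exposure time, aperture, white balance, color space) are assumptions, since the disclosure does not enumerate which hardware parameters are extracted.

```python
# Hypothetical sketch using Pillow; the tag selection below is an assumption,
# since the disclosure only states that the hardware parameters are determined
# from the metadata of the received image.
from PIL import Image
from PIL.ExifTags import TAGS

def extract_hardware_parameters(image_path):
    image = Image.open(image_path)
    exif = image.getexif()
    # Camera settings live in the Exif sub-IFD (pointer tag 0x8769); merge it
    # with the base IFD so both kinds of tags are visible.
    merged = dict(exif)
    merged.update(exif.get_ifd(0x8769))
    wanted = {"Make", "Model", "ISOSpeedRatings", "ExposureTime",
              "FNumber", "WhiteBalance", "ColorSpace"}
    return {TAGS.get(tag, str(tag)): value
            for tag, value in merged.items()
            if TAGS.get(tag, str(tag)) in wanted}

# Example (path is illustrative only):
# print(extract_hardware_parameters("received_image.jpg"))
```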
- one or more effects 309 applied on the captured input image 303 are determined, at step 405 of FIG. 4, by comparing a first representation 601 with a second representation 609.
- the first representation 601 is associated with the received input image 303 and is generated based on one or more image parameters 305, and one or more hardware parameters 307.
- the second representation 609 is associated with an intermediate unfiltered image such that the intermediate unfiltered image is generated using a first 4D embedding representation. The generation of the first representation 601 and the second representation 609 is described below in greater detail in the forthcoming paragraphs in conjunction with the FIG. 5.
- the reconstruction module 221 reconstructs the received input image 303 by applying the one or more effects 309 on the intermediate unfiltered image to obtain a reconstructed image 313.
- the operation performed by the reconstruction module 221 to generate the reconstructed image 313 is described below in greater detail in the forthcoming paragraphs in conjunction with FIG. 7.
- the image simulation module 227 generates a color-corrected image as an output image 317 by performing image simulation based on a comparison of the corresponding optical perception metric scores of the reconstructed image 313 and the received input image 303.
- the generated output image 317 appears the same as the input image 303 would appear on the sender device.
- the data extraction module 219 may be configured to determine the one or more image parameters 305 and the one or more hardware parameters 307 involved in capturing the received input image 303.
- the data extraction module 219 may be further configured to determine the one or more effects 309 applied on the captured input image 303.
- the data extraction module 219 segments out the individual objects from the received input image 303 and creates a map.
- the segmented objects from the received input image 303 may be considered as the key of the map, and corresponding values may be defined using corresponding color scale hue saturation value (HSV) histograms, grayscale values, and edge maps.
- the one or more image parameters 305 may refer to edge maps, red, green, blue (RGB) information, and luminance (L) information.
- the RGB information represents the colors in the corresponding segmented object.
- the L information represents the lightness or brightness value of the corresponding segmented object.
- the corresponding edge may be created using predefined edge detector filters.
- Edge detector filters refer to image processing techniques that may be used to identify the edges or boundaries within an image. Such filters aim to detect areas of rapid intensity transitions, such as changes in brightness or color.
- the edge detector filters may comprise a high-distance sensitive filter and a low-distance sensitive filter.
- the high-distance sensitive filter may be used to identify sub-objects defined by clear boundaries within the segmented object.
- the low-distance sensitive filter may be used to identify parts defined by usable boundaries within the same object.
- the usable boundaries refer to boundaries that distinguish the pixels of one object from those of other objects. For example, in an image of a person wearing a t-shirt and a cap with a logo, the t-shirt and the cap may be considered as segmented objects.
- the high distance sensitive filter may be used to identify sub-objects such as the logo, whereas the low distance sensitive filter may be used to identify parts of the logo.
- an edge map for each segmented object may be generated using the clear boundaries and the usable boundaries identified by the high-distance and low-distance sensitive filters of the edge detector filter.
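- As an illustrative sketch of the per-object map and edge maps described above, the following uses OpenCV; Canny edge detection with two threshold settings stands in for the high-distance and low-distance sensitive filters, since the disclosure does not name a concrete filter, and the segmentation masks are assumed to be supplied by a separate segmentation step.

```python
# Illustrative sketch only. Canny with two threshold settings stands in for the
# high-distance and low-distance sensitive filters; `object_masks` (label ->
# binary uint8 mask) is assumed to come from a separate segmentation step.
import cv2

def build_object_map(image_bgr, object_masks):
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    object_map = {}
    for label, mask in object_masks.items():
        # Per-object HSV histogram restricted to the object's mask.
        hist = cv2.calcHist([hsv], [0, 1, 2], mask, [30, 32, 32],
                            [0, 180, 0, 256, 0, 256])
        masked_gray = cv2.bitwise_and(gray, gray, mask=mask)
        # High thresholds: only strong ("clear") boundaries survive.
        edges_clear = cv2.Canny(masked_gray, 150, 250)
        # Low thresholds: weaker ("usable") boundaries are also kept.
        edges_usable = cv2.Canny(masked_gray, 30, 90)
        object_map[label] = {
            "hsv_histogram": hist,
            "grayscale": masked_gray,
            "edge_map": cv2.bitwise_or(edges_clear, edges_usable),
        }
    return object_map
```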
- the image 501 may correspond to the received image 303.
- the color palette 503 and the color adjustment factors 505 may correspond to a first color palette and first color adjustment factors, respectively, associated with the received image 303.
- the 4D embedding representation 507 may correspond to the first representation 601 associated with the received image 303.
- the first representation 601 may be used to generate the unfiltered image 607 as shown in FIG. 6 below.
- the second representation 609 associated with the unfiltered image 607 is generated using similar steps as described in FIG. 5.
- the image 501 corresponds to the unfiltered image 607.
- the color palette 503 and the color adjustment factors 505 may correspond to a second color palette and second color adjustment factors, respectively, associated with the unfiltered image 607.
- the one or more effects 309 are determined by comparing the first representation 601 and the second representation 609, as also described in conjunction with FIG. 6.
- FIG. 6 illustrates a block diagram 600 depicting the determining of one or more effects 309, according to an embodiment of the present disclosure.
- a predefined neural network 603 and a predefined decoder 605 may be applied in sequence on the first representation 601 associated with the received image 303 to generate the unfiltered image 607.
- the second representation 609 associated with the unfiltered image 607 may be generated by following the steps described in FIG. 5.
- the one or more effects 309 are determined by comparing the first representation 601 and the second representation 609 using a predefined comparator unit 611. Based on the determined one or more effects 309 and image parameters 305, the reconstruction module 221 generates the reconstructed image 313 as described below in conjunction with FIG. 7.
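- The sketch below illustrates only the control flow of this comparison step; the predefined neural network 603, the decoder 605, and the representation-building step of FIG. 5 are abstracted as callables supplied by the caller, and the effects are modeled, as an assumption, as element-wise differences between the two representations.

```python
# Control-flow sketch only; `encoder` (the predefined neural network 603),
# `decoder` (605), and `make_representation` (the FIG. 5 steps) are assumed
# callables, and modeling the effects as an element-wise difference of the two
# representations is an assumption of this illustration.
import numpy as np

def determine_effects(first_representation, encoder, decoder, make_representation):
    # Generate the intermediate unfiltered image from the first representation.
    unfiltered_image = decoder(encoder(first_representation))
    # Build the second representation from the unfiltered image.
    second_representation = make_representation(unfiltered_image)
    # Comparator (611): the residual between the representations is read as the
    # one or more effects that were applied to the captured image.
    effects = np.asarray(first_representation) - np.asarray(second_representation)
    return unfiltered_image, effects
```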
- FIG. 7 illustrates a block diagram 700 depicting the operation of reconstruction module 221 to generate the reconstructed image 313, according to an embodiment of the present disclosure.
- the reconstruction module 221 calculates HSV degradation to display values and provides them as input to a color filing unit 705.
- the HSV degradation to display indicates a difference in perception of the HSV when the same image is viewed on two different screens.
- the unfiltered image 607 is segmented in a similar manner as the received image 303 was segmented.
- the HSV degradation to display values are calculated for each segmented object of the unfiltered image 607 using predefined techniques.
- the predefined techniques may include a color picker application.
- the color filing unit 705 may also receive as input the edge maps of the segmented objects of the received image 303.
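- A minimal sketch of this reconstruction step is given below, assuming the HSV degradation to display values are available as one (dH, dS, dV) offset per segmented object; the color filing unit 705 is modeled simply as shifting the HSV values of each object's region of the unfiltered image by that offset.

```python
# Sketch under stated assumptions: `object_masks` maps a label to a binary
# uint8 mask and `hsv_degradation` maps the same label to a (dH, dS, dV) offset.
import cv2
import numpy as np

def reconstruct_image(unfiltered_bgr, object_masks, hsv_degradation):
    hsv = cv2.cvtColor(unfiltered_bgr, cv2.COLOR_BGR2HSV).astype(np.float32)
    for label, mask in object_masks.items():
        dh, ds, dv = hsv_degradation[label]
        region = mask.astype(bool)
        hsv[region, 0] = (hsv[region, 0] + dh) % 180           # hue wraps at 180 in OpenCV
        hsv[region, 1] = np.clip(hsv[region, 1] + ds, 0, 255)  # saturation
        hsv[region, 2] = np.clip(hsv[region, 2] + dv, 0, 255)  # value / brightness
    return cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR)
```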
- FIG. 8 illustrates a block diagram 800 depicting an exemplary operation of the score calculation and dynamic rendering module 223, according to an embodiment of the present disclosure.
- the OPMS module 225 of the score calculation and dynamic rendering module 223 calculates OPMS 801 and 805 for the received image 303 and the reconstructed image 313, respectively. Further, the OPMS module 225 also determines neutral aspects 803 and 807 of the received image 303 and the reconstructed image 313, respectively.
- the neutral aspects 803 and 807 may be determined for each pixel of the corresponding segmented objects of the respective image.
- the neutral aspects 803 and 807 may refer to saturation per pixel, brightness per pixel, and contrast per pixel of the corresponding image.
- the neutral aspects of the image may change with respect to the change in the angle of perception of the corresponding image.
- the OPMS module 225 determines the final neutral aspects 809 based on the OPMS 801, 805, and neutral aspects 803, 807 of the corresponding received image 303 and the reconstructed image 313.
- the final neutral aspects 809 may be determined with respect to the received image 303 and may relate to the neutral aspect of the color-corrected image 317 to be generated by the image simulation module 227.
- OPMS vectors affecting the image colors are determined by comparing the OPMS 801 and 805 to determine current conditions of the captured image.
- the current conditions may define the angle of perception as 30 degrees, the display type as active-matrix organic light-emitting diode (AMOLED), and the image zoom as 30%.
- the final neutral aspect 809 is determined based on the current conditions of the captured image and the neutral aspects 803, 807 of the received image 303 and the reconstructed image 313.
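- The disclosure does not give formulas for the OPMS or the final neutral aspects, so the sketch below is only one plausible reading: the neutral aspects are computed as per-pixel saturation, brightness, and local contrast, and the final neutral aspects blend the aspects of the two images with a weight derived from the current viewing conditions. The weighting rule is an assumption, not the disclosed method.

```python
# Illustrative model only: per-pixel saturation, brightness (value), and local
# contrast stand in for the neutral aspects; the blending weight derived from
# the viewing angle is an assumption, not the disclosed OPMS formula.
import cv2
import numpy as np

def neutral_aspects(image_bgr):
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV).astype(np.float32)
    saturation, brightness = hsv[..., 1], hsv[..., 2]
    mean = cv2.blur(brightness, (9, 9))
    # Local contrast approximated as the local standard deviation of brightness.
    contrast = np.sqrt(np.maximum(cv2.blur(brightness ** 2, (9, 9)) - mean ** 2, 0.0))
    return np.stack([saturation, brightness, contrast], axis=-1)

def final_neutral_aspects(received_bgr, reconstructed_bgr, conditions):
    """conditions: e.g. {"angle_deg": 30, "display": "AMOLED", "zoom": 0.3} (assumed form)."""
    a_received = neutral_aspects(received_bgr)
    a_reconstructed = neutral_aspects(reconstructed_bgr)
    # Weight toward the reconstructed image as the viewing angle grows.
    w = min(abs(conditions.get("angle_deg", 0)) / 90.0, 1.0)
    return (1.0 - w) * a_received + w * a_reconstructed
```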
- the color-corrected image 317 is generated by applying image simulation on the reconstructed image 313 using the final neutral aspects 809. The operation of the image simulation module 227 is described in conjunction with FIG. 9.
- the sub-pixel regions 905 may be identified using sub-pixel region identification unit 903.
- the sub-pixel region identification unit 903 may be configured to identify the degraded region in the received image 303. It is to be noted that not all regions of the received image 303 would be in need of correction, as only some of the regions are degraded.
- the sub-pixel region identification unit 903 uses a degradation-aware region identification technique to identify the sub-pixel regions 905 that are degraded or prone to degradation.
- the degradation-aware region identification technique includes a degradation-aware region filter for calculation of a probability distribution of degradation based on the contribution of various degrading factors.
- the degrading factors correspond to the color adjustment factors 505 i.e., brightness, contrast, and saturation.
- the sub-pixel regions 905 are typically the regions where the brightness, saturation, or contrast is the highest.
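- A hypothetical degradation-aware region filter is sketched below: the degradation probability is modeled as a normalized, weighted sum of per-pixel brightness, saturation, and local contrast (reusing neutral_aspects from the previous sketch), and the sub-pixel regions 905 are taken to be the top percentile of that map. The weights and the percentile cut-off are illustrative assumptions.

```python
# Hypothetical degradation-aware region filter; the weights and percentile are
# illustrative, and neutral_aspects() comes from the previous sketch.
import numpy as np

def identify_subpixel_regions(image_bgr, weights=(0.4, 0.3, 0.3), percentile=90):
    aspects = neutral_aspects(image_bgr)
    saturation, brightness, contrast = (aspects[..., i] for i in range(3))
    # Contribution of the degrading factors (brightness, saturation, contrast).
    score = weights[0] * brightness + weights[1] * saturation + weights[2] * contrast
    probability = score / (score.max() + 1e-6)   # normalized degradation likelihood
    threshold = np.percentile(probability, percentile)
    return (probability >= threshold).astype(np.uint8)  # binary mask of regions 905
```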
- the user may tilt the user device 201, zoom into the image, or view the image on a different display.
- any of the above listed events may change the OPMS 801 and 805.
- the final neutral aspects 809 may have to be recalculated in real-time based on the event and rendered accordingly.
- Such rendering of an image after recalculation of final neutral aspects 809 is referred to as dynamic rendering.
- the output of the dynamic rendering unit 907 may be provided as feedback to the OPMS module 225 to improve the OPMS calculation.
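- The dynamic-rendering loop might then look like the following sketch, in which the event source, the image simulation step, and the rendering call are assumed callables; only the recalculation-on-event control flow corresponds to the description above.

```python
# Hypothetical control flow only: `events` yields viewing-condition dicts
# (e.g. produced from tilt, zoom, or display-change events), `simulate` is the
# image simulation module, and `render` draws to the display; all three are
# assumptions of this sketch.
def dynamic_render(received_bgr, reconstructed_bgr, events, simulate, render):
    for conditions in events:
        # Recalculate the final neutral aspects for the new conditions
        # (reusing final_neutral_aspects from the earlier sketch).
        aspects = final_neutral_aspects(received_bgr, reconstructed_bgr, conditions)
        corrected = simulate(reconstructed_bgr, aspects)
        render(corrected)
```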
- FIG. 10 illustrates a flow diagram depicting the method 1000 for generating color-corrected image 317, according to an embodiment of the present disclosure.
- the method 1000 includes a series of operations 1001 through 1011 executed by one or more components of the system 203 of the receiver device 201, in particular the processor 205.
- the processor 205 receives the image 303 captured from the sender device.
- the processor 205 determines a plurality of parameters associated with the received image such that the plurality of parameters includes the one or more image parameters 305 and the one or more hardware parameters 307 involved in capturing of the received image.
- the one or more image parameters 305 comprise edge maps, red, green, blue (RGB) information, and luminance (L) information associated with each segmented object of the received image 303.
- the one or more hardware parameters 307 are determined from metadata associated with the received image 303.
- the processor 205 determines the one or more effects 309 applied to the captured image by comparing the first representation associated with the received image with a second representation associated with an intermediate unfiltered image.
- the first representation is generated based on the determined plurality of parameters.
- the intermediate unfiltered image is generated using the first representation.
- the processor 205 reconstructs the received image by applying the determined one or more effects 309 on the intermediate unfiltered image.
- the processor 205 generates the color-corrected image 317 based on the reconstructed image 313.
- the processor 205 dynamically renders the generated color-corrected image on an interface associated with the receiver device, wherein the color consistency associated with the rendered image is the same as the color consistency associated with the captured image.
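- For orientation, the following sketch mirrors operations 1001 through 1011 of the method 1000; each operation is delegated to a callable standing in for the corresponding module described above, so only the control flow is shown, and the module interfaces are assumptions of this illustration.

```python
# Hypothetical orchestration of method 1000; the `modules` object and its
# method names are assumptions introduced only for this sketch.
def generate_color_corrected_image(received_image, modules):
    # 1001: the received image is the input to this function.
    image_params, hw_params = modules.extract_parameters(received_image)        # 1003
    effects, unfiltered = modules.determine_effects(received_image,
                                                    image_params, hw_params)    # 1005
    reconstructed = modules.reconstruct(unfiltered, effects, image_params)       # 1007
    corrected = modules.simulate(received_image, reconstructed)                  # 1009
    modules.render(corrected)                                                    # 1011: dynamic rendering
    return corrected
```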
- Fig. 11 is a block diagram illustrating an electronic device 1101 in a network environment 1100 according to various embodiments.
- the electronic device 1101 in the network environment 1100 may communicate with an electronic device 1102 via a first network 1198 (e.g., a short-range wireless communication network), or at least one of an electronic device 1104 or a server 1108 via a second network 1199 (e.g., a long-range wireless communication network).
- the electronic device 1101 may communicate with the electronic device 1104 via the server 1108.
- the electronic device 1101 may include a processor 1120, memory 1130, an input module 1150, a sound output module 1155, a display module 1160, an audio module 1170, a sensor module 1176, an interface 1177, a connecting terminal 1178, a haptic module 1179, a camera module 1180, a power management module 1188, a battery 1189, a communication module 1190, a subscriber identification module (SIM) 1196, or an antenna module 1197.
- at least one of the components (e.g., the connecting terminal 1178) may be omitted from the electronic device 1101, or one or more other components may be added in the electronic device 1101.
- some of the components (e.g., the sensor module 1176, the camera module 1180, or the antenna module 1197) may be implemented as a single component (e.g., the display module 1160).
- the processor 1120 may execute, for example, software (e.g., a program 1140) to control at least one other component (e.g., a hardware or software component) of the electronic device 1101 coupled with the processor 1120, and may perform various data processing or computation. According to one embodiment, as at least part of the data processing or computation, the processor 1120 may store a command or data received from another component (e.g., the sensor module 1176 or the communication module 1190) in volatile memory 1132, process the command or the data stored in the volatile memory 1132, and store resulting data in non-volatile memory 1134.
- the processor 1120 may include a main processor 1121 (e.g., a central processing unit (CPU) or an application processor (AP)), or an auxiliary processor 1123 (e.g., a graphics processing unit (GPU), a neural processing unit (NPU), an image signal processor (ISP), a sensor hub processor, or a communication processor (CP)) that is operable independently from, or in conjunction with, the main processor 1121.
- the auxiliary processor 1123 may be adapted to consume less power than the main processor 1121, or to be specific to a specified function.
- the auxiliary processor 1123 may be implemented as separate from, or as part of the main processor 1121.
- the auxiliary processor 1123 may control at least some of functions or states related to at least one component (e.g., the display module 1160, the sensor module 1176, or the communication module 1190) among the components of the electronic device 1101, instead of the main processor 1121 while the main processor 1121 is in an inactive (e.g., sleep) state, or together with the main processor 1121 while the main processor 1121 is in an active state (e.g., executing an application).
- the auxiliary processor 1123 may include a hardware structure specified for artificial intelligence model processing.
- An artificial intelligence model may be generated by machine learning. Such learning may be performed, e.g., by the electronic device 1101 where the artificial intelligence is performed or via a separate server (e.g., the server 1108). Learning algorithms may include, but are not limited to, e.g., supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning.
- the artificial intelligence model may include a plurality of artificial neural network layers.
- the memory 1130 may store various data used by at least one component (e.g., the processor 1120 or the sensor module 1176) of the electronic device 1101.
- the various data may include, for example, software (e.g., the program 1140) and input data or output data for a command related thereto.
- the memory 1130 may include the volatile memory 1132 or the non-volatile memory 1134.
- the input module 1150 may receive a command or data to be used by another component (e.g., the processor 1120) of the electronic device 1101, from the outside (e.g., a user) of the electronic device 1101.
- the input module 1150 may include, for example, a microphone, a mouse, a keyboard, a key (e.g., a button), or a digital pen (e.g., a stylus pen).
- the audio module 1170 may convert a sound into an electrical signal and vice versa. According to an embodiment, the audio module 1170 may obtain the sound via the input module 1150, or output the sound via the sound output module 1155 or a headphone of an external electronic device (e.g., an electronic device 1102) directly (e.g., wiredly) or wirelessly coupled with the electronic device 1101.
- the sensor module 1176 may detect an operational state (e.g., power or temperature) of the electronic device 1101 or an environmental state (e.g., a state of a user) external to the electronic device 1101, and then generate an electrical signal or data value corresponding to the detected state.
- the sensor module 1176 may include, for example, a gesture sensor, a gyro sensor, an atmospheric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
- the haptic module 1179 may convert an electrical signal into a mechanical stimulus (e.g., a vibration or a movement) or electrical stimulus which may be recognized by a user via his tactile sensation or kinesthetic sensation.
- the haptic module 1179 may include, for example, a motor, a piezoelectric element, or an electric stimulator.
- the camera module 1180 may capture a still image or moving images.
- the camera module 1180 may include one or more lenses, image sensors, image signal processors, or flashes.
- the power management module 1188 may manage power supplied to the electronic device 1101. According to one embodiment, the power management module 1188 may be implemented as at least part of, for example, a power management integrated circuit (PMIC).
- the communication module 1190 may support establishing a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 1101 and the external electronic device (e.g., the electronic device 1102, the electronic device 1104, or the server 1108) and performing communication via the established communication channel.
- the communication module 1190 may include one or more communication processors that are operable independently from the processor 1120 (e.g., the application processor (AP)) and support a direct (e.g., wired) communication or a wireless communication.
- the wireless communication module 1192 may support various technologies for securing performance on a high-frequency band, such as, e.g., beamforming, massive multiple-input and multiple-output (massive MIMO), full dimensional MIMO (FD-MIMO), array antenna, analog beam-forming, or large scale antenna.
- the wireless communication module 1192 may support various requirements specified in the electronic device 1101, an external electronic device (e.g., the electronic device 1104), or a network system (e.g., the second network 1199).
- the antenna module 1197 may transmit or receive a signal or power to or from the outside (e.g., the external electronic device) of the electronic device 1101.
- the antenna module 1197 may include an antenna including a radiating element composed of a conductive material or a conductive pattern formed in or on a substrate (e.g., a printed circuit board (PCB)).
- the antenna module 1197 may include a plurality of antennas (e.g., array antennas). In such a case, at least one antenna appropriate for a communication scheme used in the communication network, such as the first network 1198 or the second network 1199, may be selected, for example, by the communication module 1190 (e.g., the wireless communication module 1192) from the plurality of antennas.
- the signal or the power may then be transmitted or received between the communication module 1190 and the external electronic device via the selected at least one antenna.
- the antenna module 1197 may form a mmWave antenna module.
- the mmWave antenna module may include a printed circuit board, a RFIC disposed on a first surface (e.g., the bottom surface) of the printed circuit board, or adjacent to the first surface and capable of supporting a designated high-frequency band (e.g., the mmWave band), and a plurality of antennas (e.g., array antennas) disposed on a second surface (e.g., the top or a side surface) of the printed circuit board, or adjacent to the second surface and capable of transmitting or receiving signals of the designated high-frequency band.
- At least some of the above-described components may be coupled mutually and communicate signals (e.g., commands or data) therebetween via an inter-peripheral communication scheme (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)).
- commands or data may be transmitted or received between the electronic device 1101 and the external electronic device 1104 via the server 1108 coupled with the second network 1199.
- Each of the electronic devices 1102 or 1104 may be a device of the same type as, or a different type from, the electronic device 1101.
- all or some of operations to be executed at the electronic device 1101 may be executed at one or more of the external electronic devices 1102, 1104, or 1108.
- the electronic device 1101, instead of, or in addition to, executing the function or the service may request the one or more external electronic devices to perform at least part of the function or the service.
- the one or more external electronic devices receiving the request may perform the at least part of the function or the service requested, or an additional function or an additional service related to the request, and transfer an outcome of the performing to the electronic device 1101.
- the electronic device 1101 may provide the outcome, with or without further processing of the outcome, as at least part of a reply to the request.
- a cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used, for example.
- the electronic device 1101 may provide ultra low-latency services using, e.g., distributed computing or mobile edge computing.
- the external electronic device 1104 may include an internet-of-things (IoT) device.
- the server 1108 may be an intelligent server using machine learning and/or a neural network.
- the external electronic device 1104 or the server 1108 may be included in the second network 1199.
- the electronic device 1101 may be applied to intelligent services (e.g., smart home, smart city, smart car, or healthcare) based on 5G communication technology or IoT-related technology.
- the electronic device may be one of various types of electronic devices.
- the electronic devices may include, for example, a portable communication device (e.g., a smartphone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance. According to an embodiment of the disclosure, the electronic devices are not limited to those described above.
- when an element (e.g., a first element) is described as being coupled with another element (e.g., a second element), the element may be coupled with the other element directly (e.g., wiredly), wirelessly, or via a third element.
- the term “module” may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with other terms, for example, “logic,” “logic block,” “part,” or “circuitry”.
- a module may be a single integral component, or a minimum unit or part thereof, adapted to perform one or more functions.
- the module may be implemented in a form of an application-specific integrated circuit (ASIC).
- Various embodiments as set forth herein may be implemented as software (e.g., the program 1140) including one or more instructions that are stored in a storage medium (e.g., internal memory 1136 or external memory 1138) that is readable by a machine (e.g., the electronic device 1101).
- a processor (e.g., the processor 1120) of the machine (e.g., the electronic device 1101) may invoke at least one of the one or more instructions stored in the storage medium, and execute it, with or without using one or more other components under the control of the processor. This allows the machine to be operated to perform at least one function according to the at least one instruction invoked.
- the one or more instructions may include a code generated by a compiler or a code executable by an interpreter.
- the machine-readable storage medium may be provided in the form of a non-transitory storage medium.
- non-transitory simply means that the storage medium is a tangible device, and does not include a signal (e.g., an electromagnetic wave), but this term does not differentiate between where data is semi-permanently stored in the storage medium and where the data is temporarily stored in the storage medium.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Processing (AREA)
Abstract
Disclosed are a system (203) and a method (1000) for generating a color-corrected image (317) on a receiver device (201). An image captured from a sender device is received on a receiver device (201). Further, a plurality of parameters associated with the received image (303) are determined. Based on the plurality of parameters, one or more effects applied to the captured image are determined. Furthermore, the received image (303) is reconstructed by applying the determined one or more effects on an intermediate unfiltered image (607). Finally, the color-corrected image (317) is generated based on the reconstructed image (313).
Description
The present disclosure generally relates to the field of image processing, and more specifically relates to a method and a system for generating a color-corrected image on a receiver device.
Various services and additional functions provided via an electronic device, for example, a portable electronic device such as a smartphone, have gradually increased. To increase the effective value of such electronic devices, and to satisfy various desires of users, communication service providers and electronic device manufacturers have provided various functions and have competitively developed electronic devices distinguished from those of other companies. Accordingly, the various functions provided via an electronic device have gradually advanced.
One of the common challenges encountered when displaying an image on the display screens of different devices is the variation in the appearance of the image across those devices, as depicted in FIG. 1. FIG. 1 illustrates a pictorial diagram 100 depicting an exemplary problem scenario, where a user associated with a sender device captures an image of a red-colored dress using the sender device and shares the image with a user associated with a receiver device. In the problem scenario as depicted in FIG. 1, the image may appear different on the receiver device from the sender device. For example, the color of the dress in the received image may appear pink on the receiver device while the color of the dress is red in the captured image on the sender device.
While the different devices are capable of producing the same range of colors, the perception of the colors can vary based on several factors. One contributing factor is the use of different display technologies, such as liquid crystal display (LCD), organic light-emitting diode (OLED), etc. in the different devices. The different display technologies can affect color accuracy and saturation. For example, LCDs use a backlight to illuminate pixels while OLED displays emit light on a per-pixel basis. Such differences influence how colors are rendered and perceived on a particular device.
Additionally, the angle at which the image is viewed also influences the appearance of the image. When the image on a display screen is viewed from various angles, the colors and contrast can change due to the technical limitations of the underlying display technology. Therefore, this viewing angle dependency can cause variations in color accuracy and brightness.
Furthermore, other factors related to display settings such as brightness level, contrast ratio, color temperature, and color cast settings on a device can also affect the overall visual experience. Any adjustment in the display settings on a particular device can impact how the images are perceived.
As a result of the above, an image when displayed on two different devices may appear different on each device.
Therefore, there lies a need for an improved method and system that can overcome the above-described limitations and problems associated with color perception while displaying an image across different devices.
This summary is provided to introduce a selection of concepts, in a simplified format, that are further described in the detailed description of the disclosure. This summary is neither intended to identify essential inventive concepts of the disclosure nor is it intended for determining the scope of the disclosure.
According to one embodiment of the present disclosure, disclosed herein is a method for generating a color-corrected image on a receiver device. The method includes receiving an image captured from a sender device on the receiver device. Further, the method includes determining a plurality of parameters associated with the received image, wherein the plurality of parameters comprises one or more image parameters and one or more hardware parameters involved in capturing of the received image. Furthermore, the method includes determining one or more effects applied to the captured image by comparing a first representation associated with the received image with a second representation associated with an intermediate unfiltered image, wherein the first representation is generated based on the determined plurality of parameters, and wherein the intermediate unfiltered image is generated using the first 4D embedding representation. Moreover, the method includes reconstructing the received image by applying the determined one or more effects on the intermediate unfiltered image. Finally, the method includes generating the color-corrected image based on the reconstructed image.
According to another embodiment of the present disclosure, disclosed is a system for generating a color-corrected image on a receiver device. The system includes a memory and a processor coupled to the memory. The processor is configured to receive, on the receiver device, an image captured from a sender device. Further, the processor is configured to determine a plurality of parameters associated with the received image, wherein the plurality of parameters comprise one or more image parameters and one or more hardware parameters involved in capturing of the received image. Furthermore, the processor is configured to determine one or more effects applied to the captured image by comparing a first representation associated with the received image with a second representation associated with an intermediate unfiltered image, wherein the first representation is generated based on the determined plurality of parameters, and wherein the intermediate unfiltered image is generated using the first 4D embedding representation. Moreover, the processor is configured to reconstruct the received image by applying the determined one or more effects on the intermediate unfiltered image. Finally, the processor is configured to generate the color-corrected image based on the reconstructed image.
To further clarify the advantages and features of the present disclosure, a more particular description of the disclosure will be rendered by reference to specific embodiments thereof, which are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the disclosure and are therefore not to be considered limiting of its scope. The disclosure will be described and explained with additional specificity and detail in the accompanying drawings.
The method described in the embodiments herein ensures that the color consistency of the rendered image on the receiver device is the same as the color consistency of the captured image as it appeared on the device involved in capturing it. In an advantage, the method described in the embodiments herein may be utilized during a conference call or in screen sharing mode such that colors are shown accurately to all participants. In another advantage, the method described in the embodiments herein may be utilized by online stores, such that images of the products may be generalized based on generalized true colors. In yet another advantage, the method described in the embodiments herein may be utilized in online gaming where the displays of the players are synced, such that colors of the rendered content appear consistent across the synced displays.
These and other features, aspects, and advantages of the present disclosure will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
FIG. 1 illustrates a pictorial diagram depicting an exemplary problem scenario, according to existing art;
FIG. 2 illustrates a block diagram depicting an exemplary system for generating color corrected image on a user device, according to an embodiment of the present disclosure;
FIG. 3 illustrates a block diagram depicting an operational flow of the system for generating color corrected image, according to an embodiment of the present disclosure;
FIG. 4 illustrates a flow chart of the operation flow depicted in FIG. 3, according to an embodiment of the present disclosure;
FIG. 5 illustrates a block diagram depicting the general steps involved in generating a representation of an image, according to an embodiment of the present disclosure;
FIG. 6 illustrates a block diagram depicting determining of one or more effects 309, according to an embodiment of the present disclosure;
FIG. 7 illustrates a block diagram depicting the operation of reconstruction module to generate the reconstructed image, according to an embodiment of the present disclosure;
FIG. 8 illustrates a block diagram depicting an exemplary operation of the score calculation and dynamic rendering module, according to an embodiment of the present disclosure;
FIG. 9 illustrates a block diagram depicting an exemplary operation of the image simulation module, according to an embodiment of the present disclosure; and
FIG. 10 illustrates a flow diagram depicting the method for generating color-corrected image, according to an embodiment of the present disclosure.
FIG. 11 illustrates a block diagram illustrating an electronic device in a network environment according to various embodiments.
Further, skilled artisans will appreciate that elements in the drawings are illustrated for simplicity and may not have necessarily been drawn to scale. For example, the flow charts illustrate the method in terms of the most prominent steps involved to help improve understanding of aspects of the present disclosure. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the drawings by conventional symbols, and the drawings may show only those specific details that are pertinent to understanding the embodiments of the present disclosure so as not to obscure the drawings with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
For the purpose of promoting an understanding of the principles of the disclosure, reference will now be made to the various embodiments and specific language will be used to describe the same. It should be understood at the outset that although illustrative implementations of the embodiments of the present disclosure are illustrated below, the present disclosure may be implemented using any number of techniques, whether currently known or in existence. The present disclosure is not necessarily limited to the illustrative implementations, drawings, and techniques illustrated below, including the exemplary design and implementation illustrated and described herein, but may be modified within the scope of the present disclosure.
It will be understood by those skilled in the art that the foregoing general description and the following detailed description are explanatory of the disclosure and are not intended to be restrictive thereof.
Reference throughout this specification to "an aspect" "another aspect" or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, appearances of the phrase "in an embodiment", "in another embodiment" and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
It is to be understood that as used herein, terms such as, "includes," "comprises," "has," etc. are intended to mean that the one or more features or elements listed are within the element being defined, but the element is not necessarily limited to the listed features and elements, and that additional features and elements may be within the meaning of the element being defined. In contrast, terms such as, "consisting of" are intended to exclude features and elements that have not been listed.
The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted to not unnecessarily obscure the embodiments herein. Also, the various embodiments described herein are not necessarily mutually exclusive, as some embodiments can be combined with one or more other embodiments to form new embodiments. The term "or" as used herein, refers to a non-exclusive or unless otherwise indicated. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein can be practiced and to further enable those skilled in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
As is traditional in the field, embodiments may be described and illustrated in terms of blocks that carry out a described function or functions. These blocks, which may be referred to herein as units or modules or the like, are physically implemented by analog or digital circuits such as logic gates, integrated circuits, microprocessors, microcontrollers, memory circuits, passive electronic components, active electronic components, optical components, hardwired circuits, or the like, and may optionally be driven by firmware and software. The circuits may, for example, be embodied in one or more semiconductor chips, or on substrate supports such as printed circuit boards and the like. The circuits constituting a block may be implemented by dedicated hardware, by a processor (e.g., one or more programmed microprocessors and associated circuitry), or by a combination of dedicated hardware to perform some functions of the block and a processor to perform other functions of the block. Each block of the embodiments may be physically separated into two or more interacting and discrete blocks without departing from the scope of the disclosure. Likewise, the blocks of the embodiments may be physically combined into more complex blocks without departing from the scope of the disclosure.
The accompanying drawings are used to help easily understand various technical features and it should be understood that the embodiments presented herein are not limited by the accompanying drawings. As such, the present disclosure should be construed to extend to any alterations, equivalents, and substitutes in addition to those which are particularly set out in the accompanying drawings. Although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are generally only used to distinguish one element from another.
The present disclosure specifically addresses the scenario when an image, captured using a sender device and shared with a receiver device, appears different on the receiver device. A similar scenario was discussed above in conjunction with FIG. 1, where the color of the dress appears pink on the receiver device, while the original color of the dress is red as captured using the sender device.
An object of the present disclosure is to provide an improved solution for displaying images across different devices such that the image appears the same on the receiver device as on the sender device.
The present disclosure achieves the above-described objective by providing a technique to display an image such that the color consistency of the image is maintained across different devices.
The disclosed techniques relate to a system and method for generating color-corrected images on the receiver device.
A person skilled in the art would appreciate that both the sender device and the receiver device correspond to user devices with a display capable of transmitting and receiving images. Such user devices may include, but are not limited to smartphones, tablets, laptops, desktops, smartwatches, and the like. A user device sending an image may be considered the 'sender device', whereas a user device receiving the image may be considered the 'receiver device'.
In an embodiment, the image may be captured using a camera of the sender device, may or may not be edited by applying filters, and then shared with the receiver device. In an alternate embodiment, the image may be received by the sender device from a third device, may or may not be edited by applying filters, and then shared with the receiver device.
The disclosed system and method for generating a color-corrected image on the receiver device are described below in the forthcoming paragraphs.
FIG. 2 illustrates a block diagram 200 depicting an exemplary system 203 for generating a color-corrected image on a user device 201. The user device 201 may correspond to a receiver device and may be interchangeably referred to as 'the receiver device 201' in the present disclosure. The user device 201 may include the system 203 for generating the color-corrected image.
The system 203 may include a processor 205, a memory 207, a display 209, a storage 211, a network interface 213, and module(s) 215 coupled with each other. The module(s) 215 may include a data extraction and reconstruction module 217 and a score calculation and dynamic rendering module 223. The data extraction and reconstruction module 217 may further include sub-modules such as a data extraction module 219 and a reconstruction module 221. The score calculation and dynamic rendering module 223 may further include sub-modules such as an optical perception metric score (OPMS) calculation module 225 and an image simulation module 227.
In an example, the processor 205 may be a single processing unit or a number of units, all of which could include multiple computing units. The processor 205 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logical processors, virtual processors, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor 205 is configured to fetch and execute computer-readable instructions and data stored in the memory 207.
The memory 207 may include any non-transitory computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read-only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes.
The display 209 may be configured to display an image. The display 209 may be one of various types of display panels such as a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), an electrophoretic display (EPD), an electrochromic display (ECD), and a plasma display panel (PDP).
As an example, the storage 211 may be implemented with integrated hardware and software. The hardware may include a hardware disk controller with programmable search capabilities or a software system running on general-purpose hardware. Examples of the storage 211 include, but are not limited to, in-memory databases, cloud databases, distributed databases, embedded databases, and the like. The storage 211, amongst other things, serves as a repository for storing data processed, received, and generated by one or more of the processors and the modules/engines/units.
The network interface 213 may be configured to provide network connectivity and enable communication with other devices, including the sender device, over one or more networks.
As an example, the module(s) 215 may include a program, a subroutine, a portion of a program, a software component, or a hardware component capable of performing a stated task or function. As used herein, the module(s) 215 may be implemented on a hardware component such as a server independently of other modules, or a module may exist with other modules on the same server or within the same program. The module(s) 215 may be implemented on a hardware component such as a processor, e.g., one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. The module(s) 215, when executed by the processor(s) 205, may be configured to perform any of the described functionalities.
In an embodiment, the module(s) 215 may be implemented using one or more AI modules that may include a plurality of neural network layers. Examples of neural networks include, but are not limited to, a Convolutional Neural Network (CNN), a Deep Neural Network (DNN), a Recurrent Neural Network (RNN), and a Restricted Boltzmann Machine (RBM). Further, 'learning' may be referred to in the disclosure as a method for training a predetermined target device (for example, a robot) using a plurality of learning data to cause, allow, or control the target device to make a determination or prediction. Examples of learning techniques include, but are not limited to, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning. At least one of a plurality of CNN, DNN, RNN, RBM models and the like may be implemented to thereby achieve execution of the present subject matter's mechanism through an AI model. A function associated with an AI module may be performed through the non-volatile memory, the volatile memory, and the processor. The processor may include one or a plurality of processors. The one or a plurality of processors may be a general-purpose processor, such as a central processing unit (CPU), an application processor (AP), or the like, a graphics-only processing unit such as a graphics processing unit (GPU) or a visual processing unit (VPU), and/or an AI-dedicated processor such as a neural processing unit (NPU). The one or a plurality of processors control the processing of the input data in accordance with a predefined operating rule or artificial intelligence (AI) model stored in the non-volatile memory and the volatile memory. The predefined operating rule or artificial intelligence model is provided through training or learning.
A detailed working of the various modules of FIG. 2 and of the system 203 for generating the color-corrected image is explained in the forthcoming paragraphs in conjunction with FIG. 3 to FIG. 10.
FIG. 3 illustrates a block diagram depicting an operational flow 300 of the system 203 for generating the color-corrected image. The operational flow 300 is implemented in the receiver device 201 and is explained through operation steps 301 to 315. Further, FIG. 4 illustrates a flow chart 400 of the operational flow 300 and is therefore explained together with the operational flow 300 for the sake of brevity and ease of reference. Accordingly, the operational flow 300 is explained in the forthcoming paragraphs in conjunction with FIG. 2 to FIG. 10. Further, the reference numerals are kept the same for similar components throughout the disclosure for ease of explanation and understanding.
In an embodiment, initially, at operation step 301, an input image 303 is provided to the data extraction module 219. The term 'input image' may be referred to as 'received input image' or 'received image' when described in the context of the receiver device. Further, the term 'input image' may be referred to as 'captured input image' when described in the context of the device involved in capturing the 'input image' such as the sender device. The input image 303 is received on the receiver device from the sender device as depicted at step 401 of FIG. 4. In an embodiment, the input image 303 may be captured from the sender device or may have been captured using another device, received by the sender device and then forwarded to the receiver device. Further, one or more effects may or may not have been applied on the captured input image 303 on the sender device or on another device used to capture the input image 303.
In an embodiment, the data extraction module 219, at step 403 of FIG. 4, determines a plurality of parameters associated with the received input image 303. The plurality of parameters may include one or more image parameters 305 and one or more hardware parameters 307 involved in capturing the received input image 303. In an embodiment, the one or more image parameters 305 include edge maps, red, green, and blue (RGB) information, and segmented objects associated with the received input image 303. The determination of the one or more image parameters 305 is described in greater detail in the forthcoming paragraphs. In an embodiment, the one or more hardware parameters 307 are determined from metadata associated with the received input image 303. The determination of the one or more hardware parameters 307 is described in greater detail in the forthcoming paragraphs.
Further, one or more effects 309 applied on the captured input image 303 are determined, at step 405 of FIG. 4, by comparing a first representation 601 with a second representation 609. In an embodiment, the first representation 601 is associated with the received input image 303 and is generated based on one or more image parameters 305, and one or more hardware parameters 307. In an embodiment, the second representation 609 is associated with an intermediate unfiltered image such that the intermediate unfiltered image is generated using a first 4D embedding representation. The generation of the first representation 601 and the second representation 609 is described below in greater detail in the forthcoming paragraphs in conjunction with the FIG. 5.
Furthermore, the reconstruction module 221, at step 407 of FIG. 4, reconstructs the received input image 303 by applying the one or more effects 309 on the intermediate unfiltered image to obtain a reconstructed image 313. The operation performed by the reconstruction module 221 to generate the reconstructed image 313 is described below in greater detail in the forthcoming paragraphs in conjunction with FIG. 7.
In an embodiment, at operation 315, the reconstructed image 313 is provided to the optical perception metric score (OPMS) module 225. The OPMS module 225 determines corresponding optical perception metric scores associated with the reconstructed image 313 and the received input image 303. In an embodiment, the optical perception metric scores refer to a vector, wherein the vector comprises an angle of perception, an image zoom factor associated with the received image, and a display type associated with the receiver device. The operation of the OPMS module 225 is described below in greater detail in the forthcoming paragraphs.
Finally, the image simulation module 227, at step 409 of FIG. 4 and step 315 of FIG. 3, generates a color-corrected image as an output image 317 by performing image simulation based on a comparison of the corresponding optical perception metric scores of the reconstructed image 313 and the received input image 303. The generated output image 317 appears the same as the input image 303 would appear on the sender device. The operation of the image simulation module 227 is described below in greater detail in the forthcoming paragraphs in conjunction with the FIG. 9.
Each of the modules 215 is now described below.
The data extraction module 219 may be configured to determine the one or more image parameters 305 and the one or more hardware parameters 307 involved in capturing the received input image 303. The data extraction module 219 may be further configured to determine the one or more effects 309 applied on the captured input image 303.
In an embodiment, the data extraction module 219 segments out the individual objects from the received input image 303 and creates a map. In the created map, the segmented objects from the received input image 303 may be considered the keys of the map, and the corresponding values may be defined using corresponding color scale hue saturation value (HSV) histograms, grayscale values, and edge maps, as sketched below.
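As one non-limiting illustration, the per-object map may be built along the following lines. This is a minimal sketch that assumes segmentation masks are already available from any segmentation step; the function and key names are illustrative rather than part of the disclosed system, and the edge detection shown here is a single-pass placeholder for the two-filter approach described further below.

```python
import cv2
import numpy as np

def build_object_map(image_bgr, masks):
    """masks: dict of {object_name: boolean mask} from any segmentation step."""
    object_map = {}
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    for name, mask in masks.items():
        m = mask.astype(np.uint8)
        # HSV histogram restricted to the object's pixels
        hsv_hist = cv2.calcHist([hsv], [0, 1, 2], m, [32, 32, 32],
                                [0, 180, 0, 256, 0, 256])
        # Mean grayscale value of the object
        gray_value = float(cv2.mean(gray, mask=m)[0])
        # Placeholder edge map for the object region (thresholds illustrative)
        edges = cv2.Canny(gray, 50, 150) & (m * 255)
        object_map[name] = {"hsv_hist": hsv_hist,
                            "grayscale": gray_value,
                            "edge_map": edges}
    return object_map
```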
In an embodiment, for each segmented object of the received input image 303, the one or more image parameters 305 may refer to edge maps, red, green, blue (RGB) information, and luminance (L) information. In an embodiment, the RGB information represents the colors in the corresponding segmented object. In an embodiment, the L information represents the lightness or brightness value of the corresponding segmented object.
In an embodiment, for each segmented object of the received input image 303, the corresponding edge map may be created using predefined edge detector filters. Edge detector filters refer to image processing techniques that may be used to identify the edges or boundaries within an image. Such filters aim to detect areas of rapid intensity transition, such as changes in brightness or color. The edge detector filters may comprise a high-distance sensitive filter and a low-distance sensitive filter.
The high-distance sensitive filter may be used to identify sub-objects defined by clear boundaries within the segmented object. The low-distance sensitive filter may be used to identify parts of the same object defined by usable boundaries. In an embodiment, the usable boundaries refer to boundaries that clearly distinguish the pixels of one object from those of other objects. For example, in an image of a person wearing a t-shirt and a cap with a logo, the t-shirt and the cap may be considered segmented objects. Within the segmented object represented by the cap with the logo, the high-distance sensitive filter may be used to identify sub-objects such as the logo, whereas the low-distance sensitive filter may be used to identify parts of the logo.
In an embodiment, an edge map for each segmented object may be generated using the clear boundaries and the usable boundaries identified by the high-distance and low-distance sensitive filters of the edge detector filter.
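The disclosure does not name concrete filter implementations; the following sketch approximates the high-distance and low-distance sensitive filters with two Canny threshold pairs and merges their outputs into one edge map per segmented object. The threshold values are assumptions.

```python
import cv2

def edge_map_for_object(gray_object):
    # Strict thresholds: only strong transitions, i.e. clear sub-object boundaries
    clear_boundaries = cv2.Canny(gray_object, 100, 200)
    # Loose thresholds: weaker transitions, i.e. usable part boundaries
    usable_boundaries = cv2.Canny(gray_object, 30, 90)
    # The edge map combines both kinds of boundaries
    return cv2.bitwise_or(clear_boundaries, usable_boundaries)
```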
In an embodiment, the one or more hardware parameters 307 are determined from metadata associated with the received image 303. The one or more hardware parameters include but are not limited to, display type, pixel per inch (PPI), resolution, and camera parameters such as sensitivity of the image sensor to light, focus, and white balance. In an embodiment, the metadata may be accessed using an option available to view detailed information associated with the received image 303.
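One way to read such hardware-related values is from the EXIF metadata embedded in the image file. The sketch below uses Pillow purely for illustration; the exact tags available depend on the capturing device, so every field name here is an assumption rather than a guaranteed part of the metadata.

```python
from PIL import Image, ExifTags

def extract_hardware_parameters(path):
    img = Image.open(path)
    exif = img.getexif()
    # Camera-specific tags (ISO, white balance, focal length) live in the Exif IFD
    camera_ifd = exif.get_ifd(ExifTags.IFD.Exif)
    tags = {ExifTags.TAGS.get(tag_id, tag_id): value
            for tag_id, value in list(exif.items()) + list(camera_ifd.items())}
    return {
        "resolution": img.size,                    # (width, height) in pixels
        "iso": tags.get("ISOSpeedRatings"),        # sensitivity of the image sensor to light
        "white_balance": tags.get("WhiteBalance"),
        "focal_length": tags.get("FocalLength"),
    }
```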
In an embodiment, the one or more effects 309 correspond to one or more filters applied to the captured image. Such applied filters are determined on the receiver device 201 by comparing a first representation 601 associated with the received image 303 with a second representation 609 associated with an intermediate unfiltered image, as shown in FIG. 6. In an embodiment, the first representation 601 corresponds to a four-dimensional (4D) embedding representation determined using local statistics computation on the received image 303. Generation of the first representation 601 is described below in conjunction with FIG. 5.
FIG. 5 illustrates a block diagram 500 depicting the general steps involved in generating a representation of an image, according to an embodiment of the present disclosure. In general, according to embodiments of the present disclosure, a representation or 4D embedding representation 507 of an image 501 may be generated using a color pallet 503 and color adjustment factors 505 associated with the image 501. The color pallet 503 may be determined by applying density-based filtering on the image 501. The color pallet refers to a minimal list of prominent colors that can be used to represent all the colors in the image 501. In an exemplary embodiment, density-based spatial clustering of applications with noise (DBSCAN) may be used to determine the color pallet 503. Further, the color adjustment factors 505 indicate the brightness, saturation, and contrast of the image 501. The color adjustment factors are determined by applying one or more image processing operations on the image 501. Similar steps are performed for generating the first representation 601 and the second representation 609, as shown in FIG. 6 below.
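A minimal sketch of these steps is given below, assuming a DBSCAN clustering over sampled pixels for the color pallet and simple global statistics for the color adjustment factors. The returned structure and all numeric settings (sample size, eps, min_samples) are illustrative assumptions, not a disclosed format of the 4D embedding representation 507.

```python
import cv2
import numpy as np
from sklearn.cluster import DBSCAN

def embedding_representation(image_bgr, sample=4000):
    pixels = image_bgr.reshape(-1, 3).astype(np.float32)
    rng = np.random.default_rng(0)
    idx = rng.choice(len(pixels), size=min(sample, len(pixels)), replace=False)
    # Density-based filtering: keep one prominent color per dense cluster
    labels = DBSCAN(eps=12.0, min_samples=20).fit_predict(pixels[idx])
    pallet = np.array([pixels[idx][labels == label].mean(axis=0)
                       for label in set(labels) if label != -1])

    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV).astype(np.float32)
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    adjustment_factors = {
        "brightness": float(hsv[..., 2].mean()),
        "saturation": float(hsv[..., 1].mean()),
        "contrast": float(gray.std()),
    }
    return {"color_pallet": pallet, "adjustment_factors": adjustment_factors}
```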
In an embodiment, when the first representation 601 associated with the received image 303 is generated, the image 501 may correspond to the received image 303. Accordingly, the color pallet 503 and the color adjustment factors 505 may correspond to a first color pallet and first color adjustment factors, respectively, associated with the received image 303. Accordingly, the 4D embedding representation 507 may correspond to the first representation 601 associated with the received image 303. Further, the first representation 601 may be used to generate the unfiltered image 607, as shown in FIG. 6 below.
The second representation 609 associated with the unfiltered image 607 is generated using similar steps as described in FIG. 5. When the second representation 609 is generated, the image 501 corresponds to the unfiltered image 607. Further, the color pallet 503 and the color adjustment factors 505 may correspond to a second color pallet and second color adjustment factors, respectively, associated with the unfiltered image 607.
As mentioned above, the one or more effects 309 are determined by comparing the first representation 601 and the second representation 609, as also described in conjunction with FIG. 6.
FIG. 6 illustrates a block diagram 600 depicting the determining of one or more effects 309, according to an embodiment of the present disclosure. As depicted in the figure, a predefined neural network 603 and a predefined decoder 605 may be applied in sequence on the first representation 601 associated with the received image 303 to generate the unfiltered image 607. Thereafter, the second representation 609 associated with the unfiltered image 607 may be generated by following the steps described in FIG. 5. Finally, the one or more effects 309 are determined by comparing the first representation 601 and the second representation 609 using a predefined comparator unit 611. Based on the determined one or more effects 309 and image parameters 305, the reconstruction module 221 generates the reconstructed image 313 as described below in conjunction with FIG. 7.
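For illustration, the comparator step may be sketched as below: the applied effects are estimated from the difference between the two representations' adjustment factors and the shift between their color pallets. The distance measures and field names reuse the hypothetical structure from the earlier sketch and are assumptions.

```python
import numpy as np

def determine_effects(first_rep, second_rep):
    f = first_rep["adjustment_factors"]
    s = second_rep["adjustment_factors"]
    # Per-factor difference: how much brightness/saturation/contrast was added
    effects = {key: f[key] - s[key] for key in ("brightness", "saturation", "contrast")}
    # Average nearest-neighbour shift between the two color pallets
    p1, p2 = first_rep["color_pallet"], second_rep["color_pallet"]
    if len(p1) and len(p2):
        distances = np.linalg.norm(p1[:, None, :] - p2[None, :, :], axis=-1)
        effects["pallet_shift"] = float(distances.min(axis=1).mean())
    return effects
```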
FIG. 7 illustrates a block diagram 700 depicting the operation of the reconstruction module 221 to generate the reconstructed image 313, according to an embodiment of the present disclosure. As depicted in the figure, initially, at step 701, the reconstruction module 221 calculates HSV degradation to display values and provides them as input to a color filling unit 705. The HSV degradation to display indicates a difference in perception of the HSV values when the same image is viewed on two different screens. In an embodiment, before calculating the HSV degradation to display, the unfiltered image 607 is segmented in a similar manner as the received image 303 was segmented. In an embodiment, the HSV degradation to display values are calculated for each segmented object of the unfiltered image 607 using predefined techniques. In an exemplary embodiment, the predefined techniques may include a color picker application. The color filling unit 705 may also receive as input the edge maps of the segmented objects of the received image 303.
In an embodiment, the color filling unit 705 may perform a color filling operation on each edge map of the segmented objects of the received image 303 using the HSV degradation to display values. The color filling unit 705 outputs an image 707 as seen on the receiver device 201 without the one or more effects 309.
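As a rough illustration, the color filling operation for a single segmented object might look like the sketch below: the object's HSV value is compensated by its degradation offsets and the interior of its edge map is filled with the corrected color. The offset format is an assumption, and the OpenCV hue range (0-179) is not handled separately here.

```python
import cv2
import numpy as np

def fill_object(edge_map, object_hsv, degradation_hsv):
    # Compensate the object's HSV value with its per-object degradation offsets
    corrected = np.clip(np.array(object_hsv, np.float32) +
                        np.array(degradation_hsv, np.float32), 0, 255).astype(np.uint8)
    # Trace the object's outer boundary from the edge map and fill its interior
    contours, _ = cv2.findContours(edge_map, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    canvas = np.zeros((*edge_map.shape, 3), np.uint8)
    cv2.drawContours(canvas, contours, -1,
                     tuple(int(c) for c in corrected), thickness=cv2.FILLED)
    return cv2.cvtColor(canvas, cv2.COLOR_HSV2BGR)
```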
In an embodiment, the reconstruction module 221 generates the reconstructed image 313 by applying the one or more effects 309 on the image 707 and performing a focus integration operation on blurred areas of the image 707 using the determined one or more hardware parameters 307. Further, the score calculation and dynamic rendering module 223 determines optical perception metric scores associated with the reconstructed image 313 and the received image 303, and generates the color-corrected image by performing image simulation based on a comparison of the corresponding optical perception metric scores. Operation of the score calculation and dynamic rendering module 223 is described below in conjunction with FIG. 8.
FIG. 8 illustrates a block diagram 800 depicting an exemplary operation of the score calculation and dynamic rendering module 223, according to an embodiment of the present disclosure. Initially, the OPMS module 225 of the score calculation and dynamic rendering module 223 calculates OPMS 801 and 805 for the received image 303 and the reconstructed image 313, respectively. Further, the OPMS module 225 also determines neutral aspects 803 and 807 of the received image 303 and the reconstructed image 313, respectively.
In an embodiment, the OPMS 801 and 805 refer to a vector. In an embodiment, the vector includes an angle of perception, an image zoom factor associated with the received image 303, and a display type associated with the receiver device 201. In general, an OPMS is calculated using vector components such as a predicted color tone, a color shade, and a color tint of the corresponding image, for each permutation and combination of angle of perception, image zoom, and display type. To reduce the number of data points, the computation is performed only for the most probable combinations of the vector components. The OPMS of the received image 303 and the OPMS of the reconstructed image 313 are compared by comparing their individual vector components. In an embodiment, the angle of perception refers to the perspective or angle at which the image 303 is viewed.
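A simple way to picture the OPMS is as a small vector type compared component-wise, as in the sketch below. The field names, units, and the comparison rule are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class OPMS:
    angle_of_perception: float   # degrees at which the image is viewed
    zoom_factor: float           # e.g. 1.3 for 30% zoom
    display_type: str            # e.g. "AMOLED", "LCD"

def compare_opms(received: OPMS, reconstructed: OPMS):
    # Component-wise comparison of the two score vectors
    return {
        "angle_delta": reconstructed.angle_of_perception - received.angle_of_perception,
        "zoom_delta": reconstructed.zoom_factor - received.zoom_factor,
        "display_mismatch": received.display_type != reconstructed.display_type,
    }
```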
In an embodiment, the neutral aspects 803 and 807 may be determined for each pixel of the corresponding segmented objects of the respective image. In an embodiment, the neutral aspects 803 and 807 may refer to saturation per pixel, brightness per pixel, and contrast per pixel of the corresponding image. According to embodiments of the present disclosure, the neutral aspects of the image may change with respect to the change in the angle of perception of the corresponding image.
In an embodiment, the OPMS module 225 determines the final neutral aspects 809 based on the OPMS 801 and 805 and the neutral aspects 803 and 807 of the corresponding received image 303 and reconstructed image 313. In an embodiment, the final neutral aspects 809 may be determined with respect to the received image 303 and may relate to the neutral aspects of the color-corrected image 317 to be generated by the image simulation module 227. In an embodiment, OPMS vectors affecting the image colors are determined by comparing the OPMS 801 and 805 to determine current conditions of the captured image. For example, the current conditions may define the angle of perception as 30 degrees, the display type as active-matrix organic light-emitting diode (AMOLED), and the image zoom as 30%.
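A minimal sketch of the per-pixel neutral aspects and of blending them into the final neutral aspects 809 is shown below, assuming a simple weighted average as the blending rule; the local-contrast definition and the weight are assumptions.

```python
import cv2
import numpy as np

def neutral_aspects(image_bgr):
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV).astype(np.float32)
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    # Local contrast: deviation of each pixel from a blurred neighbourhood
    contrast = np.abs(gray - cv2.GaussianBlur(gray, (7, 7), 0))
    return {"saturation": hsv[..., 1], "brightness": hsv[..., 2], "contrast": contrast}

def final_neutral_aspects(received_aspects, reconstructed_aspects, weight=0.5):
    # Blend the per-pixel aspects of the received and reconstructed images
    return {key: weight * received_aspects[key] + (1 - weight) * reconstructed_aspects[key]
            for key in received_aspects}
```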
In an embodiment, the final neutral aspects 809 are determined based on the current conditions of the captured image and the neutral aspects 803 and 807 of the received image 303 and the reconstructed image 313. The color-corrected image 317 is generated by applying image simulation on the reconstructed image 313 using the final neutral aspects 809. The operation of the image simulation module 227 is described in conjunction with FIG. 9.
FIG. 9 illustrates a block diagram 900 depicting an exemplary operation of the image simulation module 227, according to an embodiment of the present disclosure. Initially, at step 901, neutral aspect conversion is determined by equating neutral aspects 803 with the final neutral aspects 809 of each pixel of the received image 303. Further, the image simulation module 227 identifies sub-pixel regions 905 in the received image 303.
In an embodiment, the sub-pixel regions 905 may be identified using a sub-pixel region identification unit 903. The sub-pixel region identification unit 903 may be configured to identify the degraded regions in the received image 303. It is to be noted that not all regions of the received image 303 need correction, as only some of the regions are degraded. In an exemplary embodiment, the sub-pixel region identification unit 903 uses a degradation-aware region identification technique to identify the sub-pixel regions 905 that are degraded or prone to degradation. The degradation-aware region identification technique includes a degradation-aware region filter for calculating a probability distribution of degradation based on the contribution of various degrading factors. The degrading factors correspond to the color adjustment factors 505, i.e., brightness, contrast, and saturation. The sub-pixel regions 905 are typically the regions where the brightness, saturation, or contrast is the highest.
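The degradation-aware region filter may be sketched as a weighted combination of normalized brightness, saturation, and contrast maps followed by a threshold, as below. The weights and the threshold are assumptions, and the aspect maps reuse the hypothetical structure from the earlier neutral-aspects sketch.

```python
import numpy as np

def degraded_regions(aspects, weights=(0.4, 0.3, 0.3), threshold=0.7):
    def normalize(a):
        return (a - a.min()) / (a.max() - a.min() + 1e-6)
    # Probability of degradation per pixel from the contributing factors
    probability = (weights[0] * normalize(aspects["brightness"]) +
                   weights[1] * normalize(aspects["saturation"]) +
                   weights[2] * normalize(aspects["contrast"]))
    # Boolean mask of sub-pixel regions that need correction
    return probability >= threshold
```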
In an embodiment, based on the identified sub-pixel regions 905, and the determined neutral aspect conversion, a dynamic rendering unit 907 of the image simulation module 227 dynamically renders the generated color-corrected image 317 on an interface of the receiver device 201. When the color-corrected image 317 is dynamically displayed on the receiver device 201, the color consistency of the rendered image 317 is the same as the color consistency of the captured image as it appeared on the device involved in capturing the image 303.
In an embodiment, at any point in time the user may tilt the user device 201, zoom into the image, or view the image on a different display. In an embodiment, any of the above-listed events may change the OPMS 801 and 805. Hence, the final neutral aspects 809 may have to be recalculated in real time based on the event and rendered accordingly. Such rendering of an image after recalculation of the final neutral aspects 809 is referred to as dynamic rendering. In an embodiment, the output of the dynamic rendering unit 907 may be provided as feedback to the OPMS module 225 to improve the OPMS calculation.
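A bare-bones sketch of such an event-driven recalculation is given below; it reuses the hypothetical OPMS type and final_neutral_aspects helper from the earlier sketches, and render() stands in for whatever display path the device provides.

```python
def on_view_event(event, state):
    # Update the viewing conditions that feed the OPMS
    opms = state["opms_received"]
    if event["type"] == "tilt":
        opms.angle_of_perception = event["angle"]
    elif event["type"] == "zoom":
        opms.zoom_factor = event["factor"]
    elif event["type"] == "display_change":
        opms.display_type = event["display"]
    # Recompute the final neutral aspects for the new viewing conditions
    state["final_aspects"] = final_neutral_aspects(
        state["aspects_received"], state["aspects_reconstructed"])
    # Re-render the color-corrected image; render() is assumed to exist
    return render(state["reconstructed_image"], state["final_aspects"])
```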
The method 1000 for generating color-corrected image 317 on the receiver device 201 is described below in conjunction with FIG. 10.
FIG. 10 illustrates a flow diagram depicting the method 1000 for generating color-corrected image 317, according to an embodiment of the present disclosure. The method 1000 includes a series of operations 1001 through 1011 executed by one or more components of the system 203 of the receiver device 201, in particular the processor 205. At step 1001, the processor 205 receives the image 303 captured from the sender device.
At step 1003, the processor 205 determines a plurality of parameters associated with the received image such that the plurality of parameters includes the one or more image parameters 305 and the one or more hardware parameters 307 involved in capturing of the received image. In an embodiment, the one or more image parameters 305 comprise edge maps, red, green, blue (RGB) information, and luminance (L) information associated with each segmented object of the received image 303. In an embodiment, the one or more hardware parameters 307 are determined from metadata associated with the received image 303.
At step 1005, the processor 205 determines the one or more effects 309 applied to the captured image by comparing the first representation associated with the received image with a second representation associated with an intermediate unfiltered image. In an embodiment, the first representation is generated based on the determined plurality of parameters. In an embodiment, the intermediate unfiltered image is generated using the first representation.
At step 1007, the processor 205 reconstructs the received image by applying the determined one or more effects 309 on the intermediate unfiltered image.
At step 1009, the processor 205 generates the color-corrected image 317 based on the reconstructed image 313.
At step 1011, the processor 205 dynamically renders the generated color-corrected image on an interface associated with the receiver device, wherein the color consistency associated with the rendered image is the same as the color consistency associated with the captured image.
FIG. 11 is a block diagram illustrating an electronic device 1101 in a network environment 1100 according to various embodiments. Referring to FIG. 11, the electronic device 1101 in the network environment 1100 may communicate with an electronic device 1102 via a first network 1198 (e.g., a short-range wireless communication network), or at least one of an electronic device 1104 or a server 1108 via a second network 1199 (e.g., a long-range wireless communication network). According to an embodiment, the electronic device 1101 may communicate with the electronic device 1104 via the server 1108. According to an embodiment, the electronic device 1101 may include a processor 1120, memory 1130, an input module 1150, a sound output module 1155, a display module 1160, an audio module 1170, a sensor module 1176, an interface 1177, a connecting terminal 1178, a haptic module 1179, a camera module 1180, a power management module 1188, a battery 1189, a communication module 1190, a subscriber identification module (SIM) 1196, or an antenna module 1197. In some embodiments, at least one of the components (e.g., the connecting terminal 1178) may be omitted from the electronic device 1101, or one or more other components may be added in the electronic device 1101. In some embodiments, some of the components (e.g., the sensor module 1176, the camera module 1180, or the antenna module 1197) may be implemented as a single component (e.g., the display module 1160).
The processor 1120 may execute, for example, software (e.g., a program 1140) to control at least one other component (e.g., a hardware or software component) of the electronic device 1101 coupled with the processor 1120, and may perform various data processing or computation. According to one embodiment, as at least part of the data processing or computation, the processor 1120 may store a command or data received from another component (e.g., the sensor module 1176 or the communication module 1190) in volatile memory 1132, process the command or the data stored in the volatile memory 1132, and store resulting data in non-volatile memory 1134. According to an embodiment, the processor 1120 may include a main processor 1121 (e.g., a central processing unit (CPU) or an application processor (AP)), or an auxiliary processor 1123 (e.g., a graphics processing unit (GPU), a neural processing unit (NPU), an image signal processor (ISP), a sensor hub processor, or a communication processor (CP)) that is operable independently from, or in conjunction with, the main processor 1121. For example, when the electronic device 1101 includes the main processor 1121 and the auxiliary processor 1123, the auxiliary processor 1123 may be adapted to consume less power than the main processor 1121, or to be specific to a specified function. The auxiliary processor 1123 may be implemented as separate from, or as part of the main processor 1121.
The auxiliary processor 1123 may control at least some of functions or states related to at least one component (e.g., the display module 1160, the sensor module 1176, or the communication module 1190) among the components of the electronic device 1101, instead of the main processor 1121 while the main processor 1121 is in an inactive (e.g., sleep) state, or together with the main processor 1121 while the main processor 1121 is in an active state (e.g., executing an application). According to an embodiment, the auxiliary processor 1123 (e.g., an image signal processor or a communication processor) may be implemented as part of another component (e.g., the camera module 1180 or the communication module 1190) functionally related to the auxiliary processor 1123. According to an embodiment, the auxiliary processor 1123 (e.g., the neural processing unit) may include a hardware structure specified for artificial intelligence model processing. An artificial intelligence model may be generated by machine learning. Such learning may be performed, e.g., by the electronic device 1101 where the artificial intelligence is performed or via a separate server (e.g., the server 1108). Learning algorithms may include, but are not limited to, e.g., supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning. The artificial intelligence model may include a plurality of artificial neural network layers. The artificial neural network may be a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), deep Q-network or a combination of two or more thereof but is not limited thereto. The artificial intelligence model may, additionally or alternatively, include a software structure other than the hardware structure.
The memory 1130 may store various data used by at least one component (e.g., the processor 1120 or the sensor module 1176) of the electronic device 1101. The various data may include, for example, software (e.g., the program 1140) and input data or output data for a command related thereto. The memory 1130 may include the volatile memory 1132 or the non-volatile memory 1134.
The program 1140 may be stored in the memory 1130 as software, and may include, for example, an operating system (OS) 1142, middleware 1144, or an application 1146.
The input module 1150 may receive a command or data to be used by another component (e.g., the processor 1120) of the electronic device 1101, from the outside (e.g., a user) of the electronic device 1101. The input module 1150 may include, for example, a microphone, a mouse, a keyboard, a key (e.g., a button), or a digital pen (e.g., a stylus pen).
The sound output module 1155 may output sound signals to the outside of the electronic device 1101. The sound output module 1155 may include, for example, a speaker or a receiver. The speaker may be used for general purposes, such as playing multimedia or playing record. The receiver may be used for receiving incoming calls. According to an embodiment, the receiver may be implemented as separate from, or as part of the speaker.
The display module 1160 may visually provide information to the outside (e.g., a user) of the electronic device 1101. The display module 1160 may include, for example, a display, a hologram device, or a projector and control circuitry to control a corresponding one of the display, hologram device, and projector. According to an embodiment, the display module 1160 may include a touch sensor adapted to detect a touch, or a pressure sensor adapted to measure the intensity of force incurred by the touch.
The audio module 1170 may convert a sound into an electrical signal and vice versa. According to an embodiment, the audio module 1170 may obtain the sound via the input module 1150, or output the sound via the sound output module 1155 or a headphone of an external electronic device (e.g., an electronic device 1102) directly (e.g., wiredly) or wirelessly coupled with the electronic device 1101.
The sensor module 1176 may detect an operational state (e.g., power or temperature) of the electronic device 1101 or an environmental state (e.g., a state of a user) external to the electronic device 1101, and then generate an electrical signal or data value corresponding to the detected state. According to an embodiment, the sensor module 1176 may include, for example, a gesture sensor, a gyro sensor, an atmospheric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
The interface 1177 may support one or more specified protocols to be used for the electronic device 1101 to be coupled with the external electronic device (e.g., the electronic device 1102) directly (e.g., wiredly) or wirelessly. According to an embodiment, the interface 1177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, a secure digital (SD) card interface, or an audio interface.
A connecting terminal 1178 may include a connector via which the electronic device 1101 may be physically connected with the external electronic device (e.g., the electronic device 1102). According to an embodiment, the connecting terminal 1178 may include, for example, a HDMI connector, a USB connector, a SD card connector, or an audio connector (e.g., a headphone connector).
The haptic module 1179 may convert an electrical signal into a mechanical stimulus (e.g., a vibration or a movement) or electrical stimulus which may be recognized by a user via his tactile sensation or kinesthetic sensation. According to an embodiment, the haptic module 1179 may include, for example, a motor, a piezoelectric element, or an electric stimulator.
The camera module 1180 may capture a still image or moving images. According to an embodiment, the camera module 1180 may include one or more lenses, image sensors, image signal processors, or flashes.
The power management module 1188 may manage power supplied to the electronic device 1101. According to one embodiment, the power management module 1188 may be implemented as at least part of, for example, a power management integrated circuit (PMIC).
The battery 1189 may supply power to at least one component of the electronic device 1101. According to an embodiment, the battery 1189 may include, for example, a primary cell which is not rechargeable, a secondary cell which is rechargeable, or a fuel cell.
The communication module 1190 may support establishing a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 1101 and the external electronic device (e.g., the electronic device 1102, the electronic device 1104, or the server 1108) and performing communication via the established communication channel. The communication module 1190 may include one or more communication processors that are operable independently from the processor 1120 (e.g., the application processor (AP)) and supports a direct (e.g., wired) communication or a wireless communication. According to an embodiment, the communication module 1190 may include a wireless communication module 1192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 1194 (e.g., a local area network (LAN) communication module or a power line communication (PLC) module). A corresponding one of these communication modules may communicate with the external electronic device via the first network 1198 (e.g., a short-range communication network, such as BluetoothTM, wireless-fidelity (Wi-Fi) direct, or infrared data association (IrDA)) or the second network 1199 (e.g., a long-range communication network, such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (e.g., LAN or wide area network (WAN)). These various types of communication modules may be implemented as a single component (e.g., a single chip), or may be implemented as multi components (e.g., multi chips) separate from each other. The wireless communication module 1192 may identify and authenticate the electronic device 1101 in a communication network, such as the first network 1198 or the second network 1199, using subscriber information (e.g., international mobile subscriber identity (IMSI)) stored in the subscriber identification module 1196.
The wireless communication module 1192 may support a 5G network, after a 4G network, and next-generation communication technology, e.g., new radio (NR) access technology. The NR access technology may support enhanced mobile broadband (eMBB), massive machine type communications (mMTC), or ultra-reliable and low-latency communications (URLLC). The wireless communication module 1192 may support a high-frequency band (e.g., the mmWave band) to achieve, e.g., a high data transmission rate. The wireless communication module 1192 may support various technologies for securing performance on a high-frequency band, such as, e.g., beamforming, massive multiple-input and multiple-output (massive MIMO), full dimensional MIMO (FD-MIMO), array antenna, analog beam-forming, or large scale antenna. The wireless communication module 1192 may support various requirements specified in the electronic device 1101, an external electronic device (e.g., the electronic device 1104), or a network system (e.g., the second network 1199). According to an embodiment, the wireless communication module 1192 may support a peak data rate (e.g., 20Gbps or more) for implementing eMBB, loss coverage (e.g., 164dB or less) for implementing mMTC, or U-plane latency (e.g., 0.5ms or less for each of downlink (DL) and uplink (UL), or a round trip of 1ms or less) for implementing URLLC.
The antenna module 1197 may transmit or receive a signal or power to or from the outside (e.g., the external electronic device) of the electronic device 1101. According to an embodiment, the antenna module 1197 may include an antenna including a radiating element composed of a conductive material or a conductive pattern formed in or on a substrate (e.g., a printed circuit board (PCB)). According to an embodiment, the antenna module 1197 may include a plurality of antennas (e.g., array antennas). In such a case, at least one antenna appropriate for a communication scheme used in the communication network, such as the first network 1198 or the second network 1199, may be selected, for example, by the communication module 1190 (e.g., the wireless communication module 1192) from the plurality of antennas. The signal or the power may then be transmitted or received between the communication module 1190 and the external electronic device via the selected at least one antenna. According to an embodiment, another component (e.g., a radio frequency integrated circuit (RFIC)) other than the radiating element may be additionally formed as part of the antenna module 1197.
According to various embodiments, the antenna module 1197 may form a mmWave antenna module. According to an embodiment, the mmWave antenna module may include a printed circuit board, a RFIC disposed on a first surface (e.g., the bottom surface) of the printed circuit board, or adjacent to the first surface and capable of supporting a designated high-frequency band (e.g., the mmWave band), and a plurality of antennas (e.g., array antennas) disposed on a second surface (e.g., the top or a side surface) of the printed circuit board, or adjacent to the second surface and capable of transmitting or receiving signals of the designated high-frequency band.
At least some of the above-described components may be coupled mutually and communicate signals (e.g., commands or data) therebetween via an inter-peripheral communication scheme (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)).
According to an embodiment, commands or data may be transmitted or received between the electronic device 1101 and the external electronic device 1104 via the server 1108 coupled with the second network 1199. Each of the electronic devices 1102 or 1104 may be a device of a same type as, or a different type, from the electronic device 1101. According to an embodiment, all or some of operations to be executed at the electronic device 1101 may be executed at one or more of the external electronic devices 1102, 1104, or 1108. For example, if the electronic device 1101 should perform a function or a service automatically, or in response to a request from a user or another device, the electronic device 1101, instead of, or in addition to, executing the function or the service, may request the one or more external electronic devices to perform at least part of the function or the service. The one or more external electronic devices receiving the request may perform the at least part of the function or the service requested, or an additional function or an additional service related to the request, and transfer an outcome of the performing to the electronic device 1101. The electronic device 1101 may provide the outcome, with or without further processing of the outcome, as at least part of a reply to the request. To that end, a cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used, for example. The electronic device 1101 may provide ultra low-latency services using, e.g., distributed computing or mobile edge computing. In another embodiment, the external electronic device 1104 may include an internet-of-things (IoT) device. The server 1108 may be an intelligent server using machine learning and/or a neural network. According to an embodiment, the external electronic device 1104 or the server 1108 may be included in the second network 1199. The electronic device 1101 may be applied to intelligent services (e.g., smart home, smart city, smart car, or healthcare) based on 5G communication technology or IoT-related technology.
The electronic device according to various embodiments may be one of various types of electronic devices. The electronic devices may include, for example, a portable communication device (e.g., a smartphone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance. According to an embodiment of the disclosure, the electronic devices are not limited to those described above.
It should be appreciated that various embodiments of the present disclosure and the terms used therein are not intended to limit the technological features set forth herein to particular embodiments and include various changes, equivalents, or replacements for a corresponding embodiment. With regard to the description of the drawings, similar reference numerals may be used to refer to similar or related elements. It is to be understood that a singular form of a noun corresponding to an item may include one or more of the things, unless the relevant context clearly indicates otherwise. As used herein, each of such phrases as "A or B," "at least one of A and B," "at least one of A or B," "A, B, or C," "at least one of A, B, and C," and "at least one of A, B, or C," may include any one of, or all possible combinations of the items enumerated together in a corresponding one of the phrases. As used herein, such terms as "1st" and "2nd," or "first" and "second" may be used to simply distinguish a corresponding component from another, and does not limit the components in other aspect (e.g., importance or order). It is to be understood that if an element (e.g., a first element) is referred to, with or without the term "operatively" or "communicatively", as "coupled with," "coupled to," "connected with," or "connected to" another element (e.g., a second element), it means that the element may be coupled with the other element directly (e.g., wiredly), wirelessly, or via a third element.
As used in connection with various embodiments of the disclosure, the term "module" may include a unit implemented in hardware, software, or firmware, and may interchangeably be used with other terms, for example, "logic," "logic block," "part," or "circuitry". A module may be a single integral component, or a minimum unit or part thereof, adapted to perform one or more functions. For example, according to an embodiment, the module may be implemented in a form of an application-specific integrated circuit (ASIC).
Various embodiments as set forth herein may be implemented as software (e.g., the program 1140) including one or more instructions that are stored in a storage medium (e.g., internal memory 1136 or external memory 1138) that is readable by a machine (e.g., the electronic device 1101). For example, a processor (e.g., the processor 1120) of the machine (e.g., the electronic device 1101) may invoke at least one of the one or more instructions stored in the storage medium, and execute it, with or without using one or more other components under the control of the processor. This allows the machine to be operated to perform at least one function according to the at least one instruction invoked. The one or more instructions may include a code generated by a compiler or a code executable by an interpreter. The machine-readable storage medium may be provided in the form of a non-transitory storage medium. Here, the term "non-transitory" simply means that the storage medium is a tangible device, and does not include a signal (e.g., an electromagnetic wave), but this term does not differentiate between where data is semi-permanently stored in the storage medium and where the data is temporarily stored in the storage medium.
According to an embodiment, a method according to various embodiments of the disclosure may be included and provided in a computer program product. The computer program product may be traded as a product between a seller and a buyer. The computer program product may be distributed in the form of a machine-readable storage medium (e.g., compact disc read only memory (CD-ROM)), or be distributed (e.g., downloaded or uploaded) online via an application store (e.g., PlayStoreTM), or between two user devices (e.g., smart phones) directly. If distributed online, at least part of the computer program product may be temporarily generated or at least temporarily stored in the machine-readable storage medium, such as memory of the manufacturer's server, a server of the application store, or a relay server.
According to various embodiments, each component (e.g., a module or a program) of the above-described components may include a single entity or multiple entities, and some of the multiple entities may be separately disposed in different components. According to various embodiments, one or more of the above-described components may be omitted, or one or more other components may be added. Alternatively or additionally, a plurality of components (e.g., modules or programs) may be integrated into a single component. In such a case, according to various embodiments, the integrated component may still perform one or more functions of each of the plurality of components in the same or similar manner as they are performed by a corresponding one of the plurality of components before the integration. According to various embodiments, operations performed by the module, the program, or another component may be carried out sequentially, in parallel, repeatedly, or heuristically, or one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.
By virtue of the foregoing, the present subject matter provides at least the following advantages:
The method described in the embodiments herein ensures that the color consistency of the rendered image on the receiver device is the same as the color consistency of the captured image as it appeared on the device involved in capturing it. In an advantage, the method described in the embodiments herein may be utilized during a conference call or in a screen sharing mode such that colors are shown accurately to all participants. In another advantage, the method described in the embodiments herein may be utilized by online stores, such that images of the products may be generalized based on generalized true colors. In yet another advantage, the method described in the embodiments herein may be utilized in online gaming where the displays of the players are synced, such that colors of the rendered content appear consistent throughout the synced displays.
While specific language has been used to describe the present subject matter, no limitation arising on account thereof is intended. As would be apparent to a person skilled in the art, various working modifications may be made to the method in order to implement the inventive concept as taught herein. The drawings and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment.
Claims (15)
- An electronic device comprising:
a display,
at least one processor; and
memory storing instructions,
wherein the instructions, when executed by the at least one processor, cause the electronic device to:
receive (1001) an image (303) captured from an external device;
determine (1003) a plurality of parameters associated with the received image (303), wherein the plurality of parameters include one or more image parameters (305) and one or more hardware parameters (307) related to the received image (303);
determine (1005) one or more effects (309) applied to the captured image by comparing a first representation associated with the received image (303) with a second representation associated with an intermediate unfiltered image (607), wherein the first representation is generated based on the determined plurality of parameters, and wherein the intermediate unfiltered image (607) is generated using the first representation;
reconstruct (1007) the received image (303) by applying the determined one or more effects on the intermediate unfiltered image (607);
generate (1009) the color-corrected image (317) based on the reconstructed image (313), and
display, via the display, the color-corrected image (317).
- The electronic device as claimed in claim 1, wherein the one or more hardware parameters (307) are determined from metadata associated with the received image (303).
- The electronic device as claimed in claim 1, wherein the one or more image parameters (305) comprise edge maps, red, green, blue (RGB) information, and luminance (L) information associated with each segmented object of the received image (303).
- The electronic device as claimed in claim 1, wherein the instructions cause the electronic device to generate the first representation based on a first color pallet and first color adjustment factors associated with the received image (303).
- The electronic device as claimed in claim 4, wherein the first color pallet is determined by applying density-based filtering on the received image (303) as seen on the receiver device (201).
- The electronic device as claimed in claim 4, wherein the first color adjustment factors are determined by applying one or more image processing operations on the received image (303) as seen on the receiver device (201).
- The electronic device as claimed in claim 1, wherein the instructions cause the electronic device to generate the intermediate unfiltered image (607) based on a predefined neural network (NN) and the first representation.
- The electronic device as claimed in claim 1, wherein the instructions cause the electronic device to generate the second representation based on a second color pallet and second color adjustment factors associated with the intermediate unfiltered image (607).
- The electronic device as claimed in claim 8, wherein the second color pallet is determined by applying density-based filtering on the intermediate unfiltered image (607).
- The electronic device as claimed in claim 8, wherein the second color adjustment factors are determined by applying one or more image processing operations on the intermediate unfiltered image (607).
- The electronic device as claimed in claim 1, wherein the instructions cause the electronic device to:
  determine corresponding optical perception metric scores associated with the reconstructed image (313) and the received image (303), and
  generate the color-corrected image (317) by performing image simulation based on a comparison of the corresponding optical perception metric scores.
- The electronic device as claimed in claim 11, wherein the corresponding optical perception metric scores are indicative of a vector, wherein the vector comprises an angle of perception, an image zoom factor associated with the received image (303), and a display type associated with the receiver device (201).
- The electronic device as claimed in claim 1, wherein the instructions cause the electronic device to dynamically render (1011) the generated color-corrected image (317) on an interface associated with the receiver device (201), wherein a color consistency associated with the rendered image is the same as a color consistency associated with the captured image.
- A non-transitory storage medium storing one or more programs, the one or more programs comprising executable instructions configured to, when executed by at least one processor of an electronic device, cause the electronic device to:
  receive an image captured from an external device;
  determine a plurality of parameters associated with the received image (303), wherein the plurality of parameters include one or more image parameters and one or more hardware parameters related to the received image (303);
  determine one or more effects applied to the captured image by comparing a first representation associated with the received image (303) with a second representation associated with an intermediate unfiltered image (607), wherein the first representation is generated based on the determined plurality of parameters, and wherein the intermediate unfiltered image (607) is generated using the first representation;
  reconstruct the received image (303) by applying the determined one or more effects on the intermediate unfiltered image (607);
  generate the color-corrected image (317) based on the reconstructed image (313), and
  display, via a display of the electronic device, the color-corrected image (317).
- The non-transitory storage medium as claimed in claim 14, wherein the one or more hardware parameters are determined from metadata associated with the received image (303).
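For illustration only, the sketches below show one possible reading of several claimed operations; they are not part of the disclosure and every concrete choice in them (library, parameter value, helper name) is an assumption. The first sketch addresses claims 2 and 3, which recite hardware parameters taken from image metadata and per-object image parameters (edge map, RGB information, luminance). It assumes Pillow, OpenCV and NumPy, and assumes a precomputed binary mask per segmented object; the segmentation step itself is not specified by the claims.

```python
# Illustrative sketch only: extracts the kinds of parameters named in claims 2-3.
# The per-object mask is a hypothetical input produced by an unspecified
# segmentation step; the exact parameter set is an assumption.
import numpy as np
import cv2
from PIL import Image
from PIL.ExifTags import TAGS

def hardware_parameters(path):
    """Read hardware-related parameters from the image metadata (EXIF)."""
    exif = Image.open(path).getexif()
    return {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}

def image_parameters(rgb, mask):
    """Edge map, mean RGB and mean luminance for one segmented object.

    rgb  : HxWx3 uint8 array in RGB order
    mask : HxW boolean array selecting the object's pixels
    """
    gray = cv2.cvtColor(rgb, cv2.COLOR_RGB2GRAY)
    edges = cv2.Canny(gray, 100, 200) * mask           # edge map restricted to the object
    pixels = rgb[mask].astype(np.float32)              # Nx3 object pixels
    mean_rgb = pixels.mean(axis=0)
    luminance = float(np.dot(mean_rgb, [0.2126, 0.7152, 0.0722]))  # Rec. 709 weights
    return {"edge_map": edges, "rgb": mean_rgb, "luminance": luminance}
```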
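Claims 5, 9 and 11 determine a color palette by "density-based filtering" without naming an algorithm. The next sketch reads this as DBSCAN clustering over a subsample of pixel colors, keeping only dense clusters and ordering them by size; the eps, min_samples and subsampling values are arbitrary assumptions.

```python
# Hypothetical reading of "density-based filtering" (claims 5, 9, 11) as DBSCAN
# clustering in RGB space; parameter values are not taken from the disclosure.
import numpy as np
from sklearn.cluster import DBSCAN

def color_palette(rgb, max_pixels=20000, eps=8.0, min_samples=50):
    """Return dominant colors (one per dense cluster), largest cluster first."""
    pixels = rgb.reshape(-1, 3).astype(np.float32)
    if len(pixels) > max_pixels:                       # subsample for tractability
        idx = np.random.default_rng(0).choice(len(pixels), max_pixels, replace=False)
        pixels = pixels[idx]
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit(pixels).labels_
    palette = []
    for label in set(labels) - {-1}:                   # -1 marks low-density noise
        members = pixels[labels == label]
        palette.append((len(members), members.mean(axis=0)))
    palette.sort(key=lambda item: item[0], reverse=True)
    return [color for _, color in palette]
```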
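Claim 7 generates the intermediate unfiltered image with a "predefined neural network" driven by the first representation, but discloses neither the architecture nor the training of that network. The toy PyTorch decoder below only illustrates the shape of such a mapping (representation vector in, RGB image out); every layer choice, size and the output resolution are assumptions.

```python
# Toy stand-in for the "predefined neural network" of claim 7: maps a fixed-size
# representation vector to a small RGB image. Architecture and sizes are assumed.
import torch
from torch import nn

class IntermediateImageDecoder(nn.Module):
    def __init__(self, rep_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(rep_dim, 128 * 8 * 8),
            nn.ReLU(),
            nn.Unflatten(1, (128, 8, 8)),
            nn.ConvTranspose2d(128, 64, kernel_size=4, stride=2, padding=1),  # 8 -> 16
            nn.ReLU(),
            nn.ConvTranspose2d(64, 3, kernel_size=4, stride=2, padding=1),    # 16 -> 32
            nn.Sigmoid(),                               # RGB values in [0, 1]
        )

    def forward(self, representation):                  # (batch, rep_dim)
        return self.net(representation)                 # (batch, 3, 32, 32)

decoder = IntermediateImageDecoder()
intermediate = decoder(torch.randn(1, 64))              # one unfiltered 32x32 image
```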
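Claims 1 and 14 determine the applied effects by comparing the first representation (received image) with the second representation (intermediate unfiltered image), then reconstruct the image by applying those effects. The form of the comparison and of the "effects" is not spelled out; the sketch below assumes the simplest case, a per-channel gain derived from the two palettes' dominant colors, applied back onto the intermediate unfiltered image.

```python
# Hypothetical reading of "determine one or more effects by comparing the first
# and second representations" (claim 1): a per-channel gain between dominant
# palette colors. The gain model is an assumption, not taken from the claims.
import numpy as np

def estimate_effect(first_palette, second_palette, eps=1e-6):
    """Per-channel gain mapping the unfiltered dominant color onto the received one."""
    received = np.asarray(first_palette[0], dtype=np.float32)     # dominant color, received image
    unfiltered = np.asarray(second_palette[0], dtype=np.float32)  # dominant color, intermediate image
    return received / (unfiltered + eps)

def reconstruct(intermediate_rgb, gain):
    """Apply the estimated effect to the intermediate unfiltered image."""
    out = intermediate_rgb.astype(np.float32) * gain
    return np.clip(out, 0, 255).astype(np.uint8)
```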
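Claims 11 and 12 compare "optical perception metric scores", each indicative of a vector of viewing angle, zoom factor and display type. How the vectors are compared and when image simulation is triggered is not disclosed; the sketch assumes a one-hot display-type encoding, a weighted distance, and an arbitrary threshold.

```python
# Hypothetical comparison of the "optical perception metric scores" of claims 11-12.
# The vector layout, weights and threshold are assumptions for illustration only.
import numpy as np

DISPLAY_TYPES = ("lcd", "oled", "amoled")               # assumed encoding

def perception_vector(angle_deg, zoom_factor, display_type):
    one_hot = [1.0 if display_type == d else 0.0 for d in DISPLAY_TYPES]
    return np.array([angle_deg / 90.0, zoom_factor] + one_hot, dtype=np.float32)

def needs_simulation(score_received, score_reconstructed,
                     weights=(1.0, 0.5, 2.0, 2.0, 2.0), threshold=0.1):
    """True if the two perception vectors differ enough to warrant image simulation."""
    diff = (score_received - score_reconstructed) * np.asarray(weights, dtype=np.float32)
    return float(np.linalg.norm(diff)) > threshold
```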
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| IN202311068785 | 2023-10-12 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025080091A1 (en) | 2025-04-17 |
Family
ID=95395279
Family Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/KR2024/096294 Pending WO2025080091A1 (en) | 2023-10-12 | 2024-10-10 | An electronic device for generating color-corrected image and controlling method thereof |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025080091A1 (en) |
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20210224575A1 (en) * | 2015-09-02 | 2021-07-22 | Apple Inc. | Detecting Keypoints In Image Data |
| US20230073310A1 (en) * | 2020-04-13 | 2023-03-09 | Snap Inc. | Augmented reality content generators including 3d data in a messaging system |
| US20230043536A1 (en) * | 2021-08-06 | 2023-02-09 | Ford Global Technologies, Llc | White Balance and Color Correction for Interior Vehicle Camera |
| US20230164451A1 (en) * | 2021-11-24 | 2023-05-25 | Canon Kabushiki Kaisha | Information processing apparatus, method, medium, and system for color correction |
| US20230239553A1 (en) * | 2022-01-25 | 2023-07-27 | Qualcomm Incorporated | Multi-sensor imaging color correction |
Similar Documents
| Publication | Title | Publication Date |
|---|---|---|
| WO2020162673A1 (en) | Electronic device for providing avatar animation and method thereof | |
| EP4200837A1 (en) | Electronic device including display having variable screen size and method for compensating degradation of the display same | |
| CN108234882B (en) | Image blurring method and mobile terminal | |
| CN107958470A (en) | A kind of color correcting method, mobile terminal | |
| CN112184581B (en) | Image processing method, device, computer equipment and medium | |
| CN112070096A (en) | Color recognition method and device, terminal equipment and storage medium | |
| EP4327272A1 (en) | System and method for learning tone curves for local image enhancement | |
| CN112634155B (en) | Image processing method, device, electronic equipment and storage medium | |
| CN108038889A (en) | The processing method and mobile terminal of a kind of image color cast | |
| WO2025080091A1 (en) | An electronic device for generating color-corrected image and controlling method thereof | |
| WO2024154920A1 (en) | Electronic device and method for changing display state | |
| WO2024005333A1 (en) | Electronic device including camera and method therefor | |
| WO2021071124A1 (en) | Electronic device with improved visibility of user interface | |
| WO2022220590A1 (en) | Method of processing image and electronic device for performing same | |
| WO2022231168A1 (en) | Method and device for face recognition through color inversion on screen | |
| CN116107465A (en) | Processing method of icon color on desktop and electronic equipment | |
| WO2023153790A1 (en) | Method and apparatus for generating three-dimensional (3d) lookup table for tone mapping or other image processing functions | |
| WO2025143382A1 (en) | Method of identifying and colorizing partially colorized image and electronic device | |
| CN117316122B (en) | Color temperature calibration method, electronic equipment and medium | |
| WO2024185951A1 (en) | Electronic device and control method for selecting scaler on basis of image characteristics | |
| WO2025100798A1 (en) | Electronic device, method, and storage medium for controlling brightness of display | |
| WO2025146929A1 (en) | Electronic device, method, and non-transitory computer-readable storage medium for controlling brightness level | |
| WO2026029355A1 (en) | Method and electronic device for enhancing image visibility | |
| WO2024101684A1 (en) | Electronic device, method, and non-transitory computer-readable storage medium for changing driving frequency | |
| WO2025159397A1 (en) | Electronic device and image editing method for electronic device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24877626; Country of ref document: EP; Kind code of ref document: A1 |