
WO2013153252A1 - Method and apparatus for producing special effects in digital photography - Google Patents


Info

Publication number
WO2013153252A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
sub
focus
processor
corresponding portion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/FI2012/050363
Other languages
French (fr)
Inventor
Petri Nenonen
Markus VARTIAINEN
Matti SUKSI
Martti Ilmoniemi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nokia Inc
Original Assignee
Nokia Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Inc filed Critical Nokia Inc
Priority to PCT/FI2012/050363 priority Critical patent/WO2013153252A1/en
Publication of WO2013153252A1 publication Critical patent/WO2013153252A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2621Cameras specially adapted for the electronic generation of special effects during image pickup, e.g. digital cameras, camcorders, video cameras having integrated special effects capability
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof

Definitions

  • Any foregoing memory medium may comprise a digital data storage such as a data disc or diskette, optical storage, magnetic storage, holographic storage, opto-magnetic storage, phase-change memory, resistive random access memory, magnetic random access memory, solid-electrolyte memory, ferroelectric random access memory, organic memory or polymer memory.
  • the memory medium may be formed into a device without other substantial functions than storing memory or it may be formed as part of a device with other functions, including but not limited to a memory of a computer, a chip set, and a sub assembly of an electronic device.
  • Fig. 1 shows a schematic system for use as a reference with which some example embodiments of the invention can be explained;
  • Fig. 2 shows a block diagram of an apparatus of an example embodiment of the invention;
  • Fig. 3 shows a block diagram of a camera unit of an example embodiment of the invention;
  • Fig. 4 shows a flow chart illustrating basic operations in a process according to an example embodiment;
  • Fig. 5 shows an example of an image with a focus grid illustrating focus measurement blocks of an autofocus unit;
  • Fig. 6 shows the image of Fig. 5 with a target area in focus;
  • Fig. 7 shows an image taken from the view of Fig. 5 with non-target area out of focus;
  • Fig. 8 shows weight factors for the blurred part of the image and the smooth transition of the weight factors;
  • Fig. 9 shows a final image in which the non-focus blurred surroundings are merged with the crisp image of the target object and the result is color enhanced; and
  • Fig. 10 shows a schematic diagram illustrating forming of the smooth transition between the image of the target image object and its blurred surroundings.
  • Fig. 1 shows a schematic system 100 for use as a reference with which some example embodiments of the invention can be explained.
  • the system 100 comprises a device 110 such as a camera phone or a digital camera having a camera unit 120 with a field of view 130.
  • the system 100 further comprises a display 140.
  • Fig. 1 also shows a target image object 150 that is being imaged by the camera unit 120.
  • Fig. 1 also shows two other image objects: a proximate object 160 and a distant object 170 both clearly at a spatial distance with relation to the target image object 150.
  • The three objects in Fig. 1 will be used to describe different example embodiments which enable simulating a diorama effect. Some of these embodiments employ only circuitries within the camera unit 120 while some other embodiments use circuitries external to the camera unit 120. Before further explaining the operations, let us introduce some example structures with which at least some of the described example embodiments can be implemented.
  • Fig. 2 shows a block diagram of an apparatus 200 of an example embodiment of the invention.
  • the apparatus 200 is suited for operating as the device 110.
  • the apparatus 200 comprises a communication interface 220, a processor 210 coupled to the communication interface module 220, and a memory 240 coupled to the processor 210.
  • the memory 240 comprises a work memory and a non- volatile memory such as a read-only memory, flash memory, optical or magnetic memory.
  • in the memory 240, typically at least initially in the non-volatile memory, there is stored software 250 operable to be loaded into and executed by the processor 210.
  • the software 250 may comprise one or more software modules and can be in the form of a computer program product that is software stored in a memory medium.
  • the apparatus 200 further comprises a camera unit 260 and a viewfinder 270 each coupled to the processor.
  • the communication interface module 220 is configured to provide local communications over one or more local links.
  • the links may be wired and/or wireless links.
  • the communication interface 220 may further or alternatively implement telecommunication links suited for establishing links with other users or for data transfer (e.g. using the Internet).
  • Such telecommunication links may be links using any of: wireless local area network links, Bluetooth, ultra-wideband, cellular or satellite communication links.
  • the communication interface 220 may be integrated into the apparatus 200 or into an adapter, card or the like that may be inserted into a suitable slot or port of the apparatus 200. While Fig. 2 shows one communication interface 220, the apparatus may comprise a plurality of communication interfaces 220.
  • the processor 210 is, for instance, a central processing unit (CPU), a microprocessor, a digital signal processor (DSP), a graphics processing unit, an application specific integrated circuit (ASIC), a field programmable gate array, a microcontroller or a combination of such elements.
  • Figure 2 shows one processor 210, but the apparatus 200 may comprise a plurality of processors.
  • the memory 240 may comprise volatile and a nonvolatile memory, such as a read-only memory (ROM), a programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), a random-access memory (RAM), a flash memory, a data disk, an optical storage, a magnetic storage, a smart card, or the like.
  • the apparatus 200 may comprise other elements, such as microphones, displays, as well as additional circuitry such as further input/output (I/O) circuitries, memory chips, application-specific integrated circuits (ASIC), processing circuitry for specific purposes such as source coding/decoding circuitry, channel coding/decoding circuitry, ciphering/deciphering circuitry, and the like. Additionally, the apparatus 200 may comprise a disposable or rechargeable battery (not shown) for powering the apparatus when an external power supply is not available.
  • the term "apparatus" refers to the processor 210, with an input for the processor 210 configured to receive information from the camera unit and an output for the processor 210 configured to provide information to the camera unit for adjusting focus settings.
  • Fig. 3 shows a block diagram of a camera unit 260 of an example embodiment of the invention.
  • the camera unit 260 comprises an objective 261, an autofocus unit 262 configured to adjust focusing of the objective 261, an optional mechanical shutter 263, an image sensor 264 and an input and/or output 265.
  • the camera unit 260 is configured in one example embodiment to output autofocus information from the autofocus unit 262.
  • the camera unit is also configured to receive through the I/O 265 instructions e.g. from the processor 210 for the autofocus unit 262.
  • the camera unit 260 further comprises, in one example embodiment, an effect processor 266 communicatively connected to the autofocus unit 262 and to the image sensor 264.
  • the effect processor 266 can enable simulating the diorama effect within the camera unit 260.
  • the effect processor can be any type of processor, e.g. one of the alternatives described with reference to Fig. 2.
  • Fig. 4 shows a flow chart illustrating basic operations in a process according to an example embodiment. First, two or more images are captured with different focus settings so that:
  • in step 410, one image is taken with the target area, i.e. the target image object 150, in focus.
  • This can be arranged as a normal autofocus operation with an autofocus target spot on the target image object 150.
  • the focus can be driven slightly closer to the camera while still keeping the target image object 150 in the depth of field when the target image object 150 is far.
  • the objects behind the target image object 150 become more blurred.
  • the focus may be set slightly behind the target image object 150.
  • in step 420, another image is taken with significant blur, e.g. maximum possible blur, in other areas, i.e. in the other image objects 160, 170.
  • in step 430, additionally one or more images are taken in an example embodiment with different focus settings for causing significant blur in other image objects.
  • the autofocus unit 262 moves the objective 261 to its most distant possible focus, i.e. infinity focus, to blur objects in macro range.
  • the proximate object 160 becomes blurred in such an image.
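  The capture sequence of steps 410 to 430 can be sketched as follows. The Camera methods used here (autofocus, set_focus, capture) are hypothetical placeholders for illustration only, not any real camera API:

```python
def capture_focus_series(camera):
    """Capture one frame with the target in focus (step 410) plus
    defocused frames in which the non-target objects are blurred
    (steps 420 and 430). `camera` is a hypothetical camera object."""
    camera.autofocus()                 # lock focus on the target object
    in_focus = camera.capture()
    camera.set_focus("macro")          # nearest focus: distant objects blur
    blur_distant = camera.capture()
    camera.set_focus("infinity")       # farthest focus: proximate objects blur
    blur_proximate = camera.capture()
    return in_focus, [blur_distant, blur_proximate]
```

  The macro-focused frame blurs distant objects such as the object 170, while the infinity-focused frame blurs proximate objects such as the object 160, matching the roles described above.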
  • the autofocus unit 262 is used to measure, and provide for further use, focus values from a focus block grid or other multi-block arrangement of focus value blocks for each of the captured images.
  • the autofocus unit 262 produces 440 focus measurements indicative of how well the pixels inside each block are in focus e.g. based on contrast between adjacent pixels and gives focus values for each block. These focus measurements can also be suited for normal autofocus operations.
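  A minimal sketch of such a per-block focus measure, using the mean absolute difference between adjacent pixels as the contrast metric (the document does not specify the exact metric, so this is an assumption):

```python
import numpy as np

def block_focus_values(gray, block=32):
    """Contrast-based focus measure on a block grid: higher values
    indicate sharper (better focused) blocks, as produced by an
    autofocus unit's focus measurement grid."""
    g = gray.astype(np.float64)
    # per-pixel contrast: absolute horizontal + vertical differences
    c = np.zeros_like(g)
    c[:, :-1] += np.abs(np.diff(g, axis=1))
    c[:-1, :] += np.abs(np.diff(g, axis=0))
    # crop to a whole number of blocks and average within each block
    h = (g.shape[0] // block) * block
    w = (g.shape[1] // block) * block
    grid = c[:h, :w].reshape(h // block, block, w // block, block)
    return grid.mean(axis=(1, 3))    # one focus value per block
```

  A sharply textured region (strong adjacent-pixel contrast) scores high, while a defocused or flat region scores near zero, which is the basis for comparing the captures block by block.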
  • the areas to be blurred are selected 450 based on the areas in focus and comparing the focus values in captured images and based on the spatial location of areas.
  • the target object 150 resides somewhere in the middle of the image in the spatial direction.
  • the target object is identified on locking focus as the object at which a focus setting point is directed.
  • a diorama effect image is formed by merging 460 the image having the target area in focus and one or more of the other captured images.
  • the merging employs pixelwise weighted averaging to smooth the boundaries between the target image object 150 and other parts of the merged image. At simplest, two images are merged, i.e. one with the target in focus and the most blurred one (typically taken in the macro range).
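  The pixelwise weighted averaging can be sketched as a simple two-image merge (assuming the captures are already registered to each other):

```python
import numpy as np

def merge_images(in_focus, blurred, weight):
    """Pixelwise weighted average of two registered images.
    `weight` is a per-pixel map in [0, 1]: 1 keeps the in-focus
    pixel, 0 keeps the blurred pixel, and the two weights always
    sum to 1 so overall brightness is preserved."""
    w = np.asarray(weight, dtype=np.float64)
    if np.asarray(in_focus).ndim == 3:   # color image: broadcast over channels
        w = w[..., None]
    return w * in_focus + (1.0 - w) * blurred
```

  Because the blurred image receives exactly the complementary weight 1 - W at every pixel, the general brightness of the combined image is unchanged, as noted further on for the gradient zone.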
  • the weight is selected for each pixel so that:
  • the area detected to be in the target area has weight 1.0 for the image in focus and weight 0.0 for the blurred images;
  • the area spatially far from the target area has weight 1.0 for the most blurred image and weight 0.0 for the other images.
  • the spatial distance considered to be far is, depending on embodiment, e.g. based on a predetermined threshold or computed/adjusted dynamically based on the areas detected to be in focus at each image.
  • the most blurred images can be simply determined, based on the autofocus measurements and the used focus settings, as the ones where the difference between the correct focus setting and the used focus setting has the greatest blurring impact.
  • additional image processing operations are applied 470 to the result image for enhancing the miniature appearance.
  • These additional image processing operations can include one or more of the following:
  • Fig. 5 shows an example of an image with a focus grid illustrating focus measurement blocks of an autofocus unit.
  • Fig. 6 shows the image of Fig. 5 with a target area in focus. Focus blocks corresponding to the target image object 150 are shown as a target image object grid 610. The focus values are stored.
  • Fig. 7 shows an image taken from the view of Fig. 5 with non-target area out of focus.
  • Fig. 7 also shows the used autofocus block in the upper left corner at a proximate branch of a tree. This example is thus taken with macro focus.
  • the focus values are stored for the focused blocks. In another example embodiment, the focus values are stored for all the blocks. The focus values can subsequently be used for selecting the image from which a given block will be taken into a final image to gain target image with sharp focused objects and relatively strong blur around the focused objects.
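  One way to sketch that selection: compare each block's focus value in the target-focused capture against its value in a defocused capture, and treat the blocks where the target-focused capture is clearly sharper as the target area. The margin factor below is a hypothetical tuning parameter, not a value from the document:

```python
import numpy as np

def target_mask(fv_in_focus, fv_blurred, margin=1.5):
    """Per-block boolean mask of the target area: True where the
    target-focused capture measures clearly sharper than the
    defocused capture (`margin` is an assumed tuning factor)."""
    return fv_in_focus > margin * fv_blurred
```

  Blocks inside the mask are taken from the in-focus image and the rest from a blurred image; the transition between them is smoothed as described next.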
  • Fig. 8 shows weight factors for the blurred part of the image and the smooth transition of the weight factors.
  • the weight factor 1 area is represented by solid black color; in the transition or gradient zone, the region surrounding the target image object 150 goes through darkening shades of grey to black, representing the smooth change from the crisp target image object 150 to the blurred surroundings.
  • Fig. 9 shows a final image in which the non-focus blurred surroundings are merged with the crisp image of the target object 150 and the result is color enhanced.
  • the final image has a good diorama effect with realistic blur that naturally depends on the spatial (depth) positions of the objects, independent of the objects' location within the image frame.
  • the creation of the blur by use of the autofocus unit 262 produces the blur without heavy computational operations. Only smoothing the boundary regions requires some combining of pixel values, but the number of pixels concerned is far smaller than the total number of pixels in the image.
  • Fig. 10 shows a schematic diagram illustrating forming of the smooth transition between the image of the target image object 150 and its blurred surroundings.
  • a portion of the target image object grid 610 is shown on an illustration of the target image object 150.
  • the weight for each pixel of an in focus image is calculated with a smoothing function:
  • W = f(d_x, d_y), wherein W is the weight, and d_x and d_y are the distances from the nearest reference point that corresponds to the target image object 150.
  • the reference points can refer to, for example, a centerline of the focus blocks that cover the desired image. Alternatively, the reference points can refer to borderlines of the focus blocks. The choice of the reference points can be taken into account in the smoothing function f. For instance, when the centerline of focus blocks is used for defining the reference points, the weight W of a given pixel of the in focus image can be such as:
  • the focus block width is the width of the focus block (square focus blocks) expressed in common units with d_x and d_y
  • scale factor is a factor used to determine the width of a smoothing zone.
  • the scale factor is e.g. 0.35 to cause that the edges of the desired image remain fairly sharp and the full blur is reached at about the distance of one focus block width from the border of the outmost focus blocks of the target image object 150.
  • the boundary region of the target image object 150 can be slightly blurred without excessive subjective impairing of the image and thus the scale factor can also be greater, e.g. 0.5 to 0.75.
  • the weight for the blurred image in the gradient zone should be 1 - W at each pixel so that the general brightness of the image remains unchanged.
  • the weight function used in the foregoing example is only one example; in another example, the function is any function selected so that the weight of in-focus image pixels decreases as the distance from the in-focus area increases, so that the pixels of the blurred image become more prevalent than the pixels of the in-focus image.
  • linear weighting is applied. For example,
  • parameters d1 to d4 are distances from the centerline of the target object blocks to the exterior border of the target object blocks and further to the border of the blurred area as shown in Fig. 10, and the function min(value1; value2) produces the minimum of its arguments (value1 and value2).
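  A one-dimensional reading of that linear scheme can be sketched as follows. Here d_edge stands for the distance from the centerline to the exterior border of the target blocks and d_blur for the distance at which full blur is reached; collapsing the document's d1 to d4 into these two parameters is a simplifying assumption:

```python
import numpy as np

def linear_weight(dx, dy, d_edge, d_blur):
    """Weight of the in-focus image for a pixel at per-axis distances
    (dx, dy) from the target centerline: 1 inside the target blocks,
    ramping linearly down to 0 at d_blur, taking the minimum of the
    per-axis ramps in the spirit of min(value1; value2)."""
    def ramp(d):
        # 1 for |d| <= d_edge, 0 for |d| >= d_blur, linear in between
        return float(np.clip((d_blur - abs(d)) / (d_blur - d_edge), 0.0, 1.0))
    return min(ramp(dx), ramp(dy))
```

  The complementary weight 1 - W is then applied to the blurred image in the gradient zone, so the general brightness of the image remains unchanged as stated above.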
  • a technical effect of one or more of the example embodiments disclosed herein is that miniature lookalike photos can be produced without necessarily using any special hardware. Also, heavy computation can be avoided by using optical blurring with the autofocus unit. Another technical effect of one or more of the example embodiments disclosed herein is that both near and far objects can be excluded from the in-focus target area with automatic masking that requires low computational complexity. Another technical effect of one or more of the example embodiments disclosed herein is that the photographs need not be carefully designed as with using a tilt-shift lens, where the camera orientation combined with the tilt-shift settings determines which objects appear crisp and which become blurred. Yet another technical effect of one or more of the example embodiments disclosed herein is that the blur obtained is very natural and close to the effect of using a real tilt-shift objective.
  • Embodiments of the present invention may be implemented in software, hardware, application logic or a combination of software, hardware and application logic.
  • the software, application logic and/or hardware may reside on the camera unit, a host device that uses the camera unit or even on a plug-in module.
  • the application logic, software or an instruction set is maintained on any one of various conventional computer-readable media.
  • a "computer-readable medium" may be any media or means that can contain, store, communicate, propagate or transport the instructions for use by or in connection with an instruction execution system, apparatus, or device, with two examples of a suited apparatus being described and depicted in Figs. 2 and 3.
  • a computer-readable medium may comprise a computer-readable storage medium that may be any media or means that can contain or store the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer.
  • the different functions discussed herein may be performed in a different order and/or concurrently with each other. Furthermore, if desired, one or more of the previously described functions may be optional or may be combined.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Description

METHOD AND APPARATUS FOR PRODUCING SPECIAL EFFECTS IN DIGITAL PHOTOGRAPHY

TECHNICAL FIELD
[0001] The present application generally relates to producing of special effects in digital photography.
BACKGROUND
[0002] In photographing, images are formed with very different objectives. Sometimes, it is desired that the so-called depth of field (DOF) is very long, extending from the nearest objects to the farthest objects. This is often desired in landscape images. However, often the photographer desires to compress the DOF so as to emphasize some image objects. Typically, a smaller DOF is created by using a larger lens aperture (i.e. a smaller f-number). While every objective has exactly one exactly sharp focal plane at a distance that depends on the lens aperture and other properties of the camera, the blur builds gradually and does not become perceivable within the DOF.
[0003] In portrait and macro images, the DOF is typically shortened by use of relatively large lens apertures and by short range. There are also some professional photographers who have taken portrait images using very special objectives that can tilt and shift with relation to the camera's exposing frame of film or image sensor. Such objectives are referred to as tilt and shift objectives. Tilt and shift objectives are among the most expensive objectives. Tilt and shift objectives yet enable photographing tall buildings so that the tops of the buildings do not seem to turn towards each other. While this geometric error can also be corrected by digital processing, the use of a tilt-shift objective with correct settings results in higher accuracy by removing the need to stretch image areas.
[0004] Tilt and shift objectives are also sometimes used to produce a so-called diorama effect or "diorama illusion". In the diorama effect, a life-size object or scene is made to look like a photograph of a miniature scale model. In miniature model photographs, it is easy to produce strong blur in front of and behind the focal plane because of the basic laws of optics and because of the short range to the target. In life-size photographing, the lens aperture cannot be increased in proportion as much as the distances grow in comparison to macro imaging. Hence, when an image is taken e.g. from a helicopter or a tall building, the DOF is relatively far greater than in macro imaging. Tilt-shift objectives yet produce a wedge-shaped DOF when the objective is tilted.
SUMMARY
[0005] Various aspects of examples of the invention are set out in the claims.
[0006] According to a first example aspect of the present invention, there is provided an apparatus comprising:
[0007] an interface configured to exchange information with a camera unit;
[0008] a processor configured to cause taking images in a series and with differing focus settings with the camera unit so that in at least one first image, a target object is in focus and in at least one second image, objects neighboring the target object are imaged out of focus so as to cause blur in the neighboring objects;
[0009] the processor being further configured to combine the at least one first image and the at least one second image to form a combined image so that:
[0010] a first sub-image is formed for the combined image from a corresponding portion of the at least one first image independently of a corresponding portion of the at least one second image;
[0011] a second sub-image is formed for the combined image from a corresponding portion of the at least one second image independently of a corresponding portion of the at least one first image; and
[0012] a third sub-image between the first sub-image and the second sub-image is formed for the combined image by merging pixels of matching position from the at least one first image and from the at least one second image.
[0013] The target object may be defined as an object appearing in the focused depth of the camera unit at a given moment of time. The given moment of time may be the time when a user expresses a desire to take a photograph with simulated diorama effect.
[0014] The processor may be further configured to form a gradual transition between the first sub-image and the second sub-image in the image by blending images of different focus settings with gradient weights about a border region surrounding the desired at least one image object. The processor may be configured to form a gradual transition between the first sub-image and the second sub-image by mixing pixels of two or more images with varying weights so that pixels closer to the target object are formed with higher weight from an image in which the target object is in focus and pixels farther from the target object are formed with greater weight from an image in which the second sub-image is out of focus.
[0015] According to a second example aspect of the present invention, there is provided a method comprising:
[0016] exchanging information with a camera unit;
[0017] causing taking images in a series and with differing focus settings so that in at least one first image, a target object is in focus and in at least one second image, objects neighboring the target object are imaged out of focus so as to cause blur in the neighboring objects;
[0018] combining the at least one first image and the at least one second image to form a combined image so that:
[0019] a first sub-image is formed for the combined image from a corresponding portion of the at least one first image independently of a corresponding portion of the at least one second image;
[0020] a second sub-image is formed for the combined image from a corresponding portion of the at least one second image independently of a corresponding portion of the at least one first image; and
[0021] a third sub-image between the first sub-image and the second sub-image is formed for the combined image by merging pixels of matching position from the at least one first image and from the at least one second image.
[0022] According to a third example aspect of the present invention, there is provided a computer program comprising:
[0023] code for causing exchanging information with a camera unit;
[0024] code for causing taking images in a series and with differing focus settings so that in at least one first image, a target object is in focus and in at least one second image, objects neighboring the target object are imaged out of focus so as to cause blur in the neighboring objects; and
[0025] code for combining the at least one first image and the at least one second image to form a combined image so that:
[0026] a first sub-image is formed for the combined image from a corresponding portion of the at least one first image independently of a corresponding portion of the at least one second image;
[0027] a second sub-image is formed for the combined image from a corresponding portion of the at least one second image independently of a corresponding portion of the at least one first image; and
[0028] a third sub-image between the first sub-image and the second sub-image is formed for the combined image by merging pixels of matching position from the at least one first image and from the at least one second image;
[0029] when executed by an apparatus.
[0030] According to a fourth example aspect of the present invention, there is provided a memory medium comprising the computer program of the third example aspect.
[0031] According to a fifth example aspect of the present invention, there is provided a method comprising:
[0032] taking images in a series with differing focus settings so that in at least one first image, a target object is in focus and in at least one second image, objects neighboring the target object are out of focus so as to blur the neighboring objects;
[0033] combining the at least one first and second images into a combined image with:
[0034] a first sub-image formed from a corresponding portion of the at least one first image independently of a corresponding portion of the at least one second image;
[0035] a second sub-image formed from a corresponding portion of the at least one second image independently of a corresponding portion of the at least one first image; and
[0036] a third sub-image between the first sub-image and the second sub-image formed by merging pixels of matching position from the at least one first image and from the at least one second image.
[0037] Any foregoing memory medium may comprise a digital data storage such as a data disc or diskette, optical storage, magnetic storage, holographic storage, opto-magnetic storage, phase-change memory, resistive random access memory, magnetic random access memory, solid-electrolyte memory, ferroelectric random access memory, organic memory or polymer memory. The memory medium may be formed into a device without other substantial functions than storing memory or it may be formed as part of a device with other functions, including but not limited to a memory of a computer, a chip set, and a sub assembly of an electronic device.
[0038] Different non-binding example aspects and embodiments of the present invention have been illustrated in the foregoing. The foregoing embodiments are used merely to explain selected aspects or steps that may be utilized in implementations of the present invention. Some embodiments may be presented only with reference to certain example aspects of the invention. It should be appreciated that corresponding embodiments may apply to other example aspects as well.

BRIEF DESCRIPTION OF THE DRAWINGS
[0039] For a more complete understanding of example embodiments of the present invention, reference is now made to the following descriptions taken in connection with the accompanying drawings in which:
[0040] Fig. 1 shows a schematic system for use as a reference with which some example embodiments of the invention can be explained;
[0041] Fig. 2 shows a block diagram of an apparatus of an example embodiment of the invention;
[0042] Fig. 3 shows a block diagram of a camera unit of an example embodiment of the invention;
[0043] Fig. 4 shows a flow chart illustrating basic operations in a process according to an example embodiment;
[0044] Fig. 5 shows an example of an image with a focus grid illustrating focus measurement blocks of an autofocus unit;
[0045] Fig. 6 shows the image of Fig. 5 with a target area in focus;
[0046] Fig. 7 shows an image taken from the view of Fig. 5 with non-target area out of focus;
[0047] Fig. 8 shows weight factors for the blurred part of the image and the smooth transition of the weight factors;
[0048] Fig. 9 shows a final image in which the non-focus blurred surroundings are merged with the crisp image of the target object and the result is color enhanced; and
[0049] Fig. 10 shows a schematic diagram illustrating forming of the smooth transition between the image of the target image object and its blurred surroundings.

DETAILED DESCRIPTION OF THE DRAWINGS
[0050] An example embodiment of the present invention and its potential advantages are understood by referring to Figs. 1 through 10 of the drawings.
[0051] Fig. 1 shows a schematic system 100 for use as a reference with which some example embodiments of the invention can be explained. The system 100 comprises a device 110 such as a camera phone or a digital camera having a camera unit 120 with a field of view 130. The system 100 further comprises a display 140. Fig. 1 also shows a target image object 150 that is being imaged by the camera unit 120. Fig. 1 also shows two other image objects: a proximate object 160 and a distant object 170 both clearly at a spatial distance with relation to the target image object 150.
[0052] The three objects in Fig. 1 will be used to describe different example embodiments which enable simulating of a diorama effect. Some of these embodiments employ only circuitries within the camera unit 120 while some other embodiments use circuitries external to the camera unit 120. Before further explaining the operations, let us introduce some example structures with which at least some of the described example embodiments can be implemented.
[0053] Fig. 2 shows a block diagram of an apparatus 200 of an example embodiment of the invention. The apparatus 200 is suited for operating as the device 110. The apparatus 200 comprises a communication interface 220, a processor 210 coupled to the communication interface module 220, and a memory 240 coupled to the processor 210. The memory 240 comprises a work memory and a non-volatile memory such as a read-only memory, flash memory, optical or magnetic memory. In the memory 240, typically at least initially in the non-volatile memory, there is stored software 250 operable to be loaded into and executed by the processor 210. The software 250 may comprise one or more software modules and can be in the form of a computer program product that is software stored in a memory medium. The apparatus 200 further comprises a camera unit 260 and a viewfinder 270 each coupled to the processor.
[0054] It shall be understood that any coupling in this document refers to functional or operational coupling; there may be intervening components or circuitries in between coupled elements.
[0055] The communication interface module 220 is configured to provide local communications over one or more local links. The links may be wired and/or wireless links. The communication interface 220 may further or alternatively implement telecommunication links suited for establishing links with other users or for data transfer (e.g. using the Internet). Such telecommunication links may be links using any of: wireless local area network links, Bluetooth, ultra-wideband, cellular or satellite communication links. The communication interface 220 may be integrated into the apparatus 200 or into an adapter, card or the like that may be inserted into a suitable slot or port of the apparatus 200. While Fig. 2 shows one communication interface 220, the apparatus may comprise a plurality of communication interfaces 220.
[0056] The processor 210 is, for instance, a central processing unit (CPU), a microprocessor, a digital signal processor (DSP), a graphics processing unit, an application specific integrated circuit (ASIC), a field programmable gate array, a microcontroller or a combination of such elements. Figure 2 shows one processor 210, but the apparatus 200 may comprise a plurality of processors.
[0057] As mentioned in the foregoing, the memory 240 may comprise volatile and non-volatile memory, such as a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), a random-access memory (RAM), a flash memory, a data disk, an optical storage, a magnetic storage, a smart card, or the like. In some example embodiments, only volatile or non-volatile memory is present in the apparatus 200. Moreover, in some example embodiments, the apparatus comprises a plurality of memories. In some example embodiments, various elements are integrated. For instance, the memory 240 can be constructed as a part of the apparatus 200 or inserted into a slot, port, or the like. Further still, the memory 240 may serve the sole purpose of storing data, or it may be constructed as a part of an apparatus serving other purposes, such as processing data. Similar options are conceivable for various other elements as well.
[0058] A skilled person appreciates that in addition to the elements shown in Figure 2, the apparatus 200 may comprise other elements, such as microphones and displays, as well as additional circuitry such as further input/output (I/O) circuitries, memory chips, application-specific integrated circuits (ASIC), and processing circuitry for specific purposes such as source coding/decoding circuitry, channel coding/decoding circuitry, ciphering/deciphering circuitry, and the like. Additionally, the apparatus 200 may comprise a disposable or rechargeable battery (not shown) for powering the apparatus when an external power supply is not available.
[0059] It is also useful to realize that the term apparatus is used in this document with varying scope. In some of the broader claims and examples, the apparatus may refer to only a subset of the features presented in Fig. 2 or even be implemented without any one of the features of Fig. 2. In one example embodiment, the term apparatus refers to the processor 210, with an input for the processor 210 configured to receive information from the camera unit and an output for the processor 210 configured to provide information to the camera unit for adjusting the focus setting.
[0060] Fig. 3 shows a block diagram of a camera unit 260 of an example embodiment of the invention. The camera unit 260 comprises an objective 261, an autofocus unit 262 configured to adjust focusing of the objective 261, an optional mechanical shutter 263, an image sensor 264 and an input and/or output 265. The camera unit 260 is configured in one example embodiment to output autofocus information from the autofocus unit 262. In one example embodiment, the camera unit is also configured to receive, through the I/O 265, instructions for the autofocus unit 262, e.g. from the processor 210.
[0061] The camera unit 260 further comprises, in one example embodiment, an effect processor 266 communicatively connected to the autofocus unit 262 and to the image sensor 264. When implemented, the effect processor 266 can enable simulating the diorama effect within the camera unit 260. The effect processor can be any type of a processor e.g. such as the alternatives described with reference to Fig. 2.
[0062] Fig. 4 shows a flow chart illustrating basic operations in a process according to an example embodiment. First, two or more images are captured with different focus settings so that:
• In step 410, one image is taken with the target area, i.e. the target image object 150, in focus. This can be arranged as a normal autofocus operation with an autofocus target spot on the target image object 150. For better detaching the surrounding parts of the image, the focus can be driven slightly closer to the camera while still keeping the target image object 150 in the depth of field when the target image object 150 is far. Thus, the objects behind the target image object 150 become more blurred. On the other hand, when the target image object 150 is near the camera, the focus may be set slightly behind the target image object 150.
• In step 420, another image is taken with significant blur, e.g. the maximum possible blur, in the other areas, i.e. in the other image objects 160, 170. For instance, the autofocus unit 262 brings the objective 261 or lens into its closest possible focus, i.e. macro focus, to blur the distant object 170.
• In an optional step 430, one or more further images are additionally taken in an example embodiment with different focus settings for causing significant blur in other image objects. For instance, the autofocus unit 262 moves the objective 261 to its most distant possible focus, i.e. infinity focus, to blur objects in the macro range. In Fig. 1, the proximate object 160 becomes blurred in such an image.
[0063] The autofocus unit 262 is used to measure, and provide for further use, focus values from a focus block grid or other multi-block arrangement of focus value blocks for each of the captured images. In one example embodiment, the autofocus unit 262 produces 440 focus measurements indicative of how well the pixels inside each block are in focus, e.g. based on contrast between adjacent pixels, and gives focus values for each block. These focus measurements can also be suited for normal autofocus operations.
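As an illustration of such per-block focus measurements, the sketch below computes a simple contrast-based focus value for each block of a grid. The grid size, the specific contrast metric and the function name are illustrative assumptions, not details from the text; a real autofocus unit computes comparable statistics in hardware.

```python
import numpy as np

def block_focus_values(gray, grid=(8, 8)):
    """Split a grayscale image into a grid of focus blocks and return,
    for each block, a contrast-based focus value: the mean absolute
    difference between horizontally adjacent pixels. Sharper content
    yields larger values; flat or defocused content yields values
    near zero."""
    h, w = gray.shape
    bh, bw = h // grid[0], w // grid[1]
    values = np.empty(grid)
    for r in range(grid[0]):
        for c in range(grid[1]):
            block = gray[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw]
            # Contrast between horizontally adjacent pixels inside the block.
            values[r, c] = np.abs(np.diff(block.astype(float), axis=1)).mean()
    return values
```

Comparing such per-block values across the differently focused captures indicates which blocks are in focus in which image.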
[0064] The areas to be blurred are selected 450 based on the areas in focus, by comparing the focus values in the captured images and based on the spatial location of the areas. Often, the target object 150 resides somewhere in the middle of the image in the spatial direction. In one example embodiment, the target object is identified on locking focus as the object at which a focus setting point is directed.

[0065] A diorama effect image is formed by merging 460 the image having the target area in focus and one or more of the other captured images. In an example embodiment, the merging employs pixelwise weighted averaging to smooth the boundaries between the target image object 150 and other parts of the merged image. At its simplest, two images are merged, i.e. one with the target in focus and a most blurred one (typically taken in the macro range).
[0066] The weight is selected for each pixel so that:
[0067] The area detected to be in the target area has weight 1.0 for the image in focus and weight 0.0 for the blurred images;
[0068] The area spatially far from the target area, as determined based on the focus measurements, has weight 1.0 for the most blurred image and weight 0.0 for the other images. The spatial distance considered to be far is, depending on the embodiment, e.g. based on a predetermined threshold or computed and adjusted dynamically based on the areas detected to be in focus in each image. The most blurred images can be determined simply, based on the autofocus measurements and the used focus settings, as the ones where the difference between the correct focus setting and the used focus setting has the greatest blurring impact.
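The weight rules above amount to a per-pixel blend of the two captures. A minimal sketch, assuming floating-point image arrays and a precomputed per-pixel weight map in [0, 1] (the function name is hypothetical):

```python
import numpy as np

def merge_with_weights(in_focus, blurred, weight):
    """Pixelwise weighted average of the in-focus capture and the
    optically blurred capture: weight 1.0 keeps the crisp target,
    weight 0.0 keeps the blur, and intermediate values form the smooth
    transition zone. `weight` has the same height/width as the images;
    for colour images it is broadcast over the channel axis."""
    w = weight[..., None] if in_focus.ndim == 3 else weight
    return w * in_focus + (1.0 - w) * blurred
```

Because most pixels have weight exactly 0.0 or 1.0, only the transition zone actually mixes pixel values.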
[0069] In one example embodiment, additional image processing operations are applied 470 to the result image for enhancing the miniature appearance. These additional image processing operations can include one or more of the following:
• emphasizing contrast, saturation and colors over the entire image; and
• applying a moderate edge-preserving smoothing filter on the in-focus target area.
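A possible reading of the global enhancement step is sketched below: contrast is stretched about mid-grey and saturation is amplified as each channel's distance from the per-pixel luminance. The 1.3 and 1.15 factors and the function name are assumptions for illustration, not values from the text.

```python
import numpy as np

def enhance_miniature(img, saturation=1.3, contrast=1.15):
    """Global colour/contrast boost over an 8-bit-range RGB image
    (float array, shape H x W x 3), in the spirit of step 470."""
    img = img.astype(float)
    luma = img.mean(axis=2, keepdims=True)
    img = luma + (img - luma) * saturation   # push channels away from grey
    img = 128.0 + (img - 128.0) * contrast   # stretch contrast about mid-grey
    return np.clip(img, 0.0, 255.0)
```

Neutral grey pixels are left unchanged, while already colourful pixels become more vivid, which supports the toy-like miniature appearance.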
[0070] Fig. 5 shows an example of an image with a focus grid illustrating focus measurement blocks of an autofocus unit.
[0071] Fig. 6 shows the image of Fig. 5 with a target area in focus. Focus blocks corresponding to the target image object 150 are shown as a target image object grid 610. The focus values are stored.
[0072] Fig. 7 shows an image taken from the view of Fig. 5 with the non-target area out of focus. Fig. 7 also shows the used autofocus block in the upper left corner at a proximate branch of a tree. This example is thus taken with macro focus. Once the image is taken, the focus values are stored for the focused blocks. In another example embodiment, the focus values are stored for all the blocks. The focus values can subsequently be used for selecting the image from which a given block will be taken into the final image, to obtain a target image with sharply focused objects and relatively strong blur around the focused objects.
[0073] Fig. 8 shows weight factors for the blurred part of the image and the smooth transition of the weight factors. The weight factor 1 area is represented by solid black color, and in the transition or gradient zone, the region surrounding the target image object 150 goes through darkening shades of grey to black, representing the smooth change from the crisp target image object 150 to the blurred surroundings.
[0074] Fig. 9 shows a final image in which the out-of-focus blurred surroundings are merged with the crisp image of the target object 150 and the result is color enhanced. The final image has a good diorama effect with realistic blur that naturally depends on the spatial distances of the objects from the camera, independently of the objects' location within the image frame. Moreover, the creation of the blur by use of the autofocus unit 262 produces the blur without heavy computational operations. Only the smoothing of the boundary regions requires some combining of pixel values, but the number of pixels concerned is much smaller than the total number of pixels in the image.
[0075] Fig. 10 shows a schematic diagram illustrating forming of the smooth transition between the image of the target image object 150 and its blurred surroundings. A portion of the target image object grid 610 is shown on an illustration of the target image object 150. The weight for each pixel of an in focus image is calculated with a smoothing function:
[0076] W = f(dx, dy), wherein W is the weight, and dx and dy are the distances from the nearest reference point that corresponds to the target image object 150. The reference points can refer to, for example, a centerline of the focus blocks that cover the desired image. Alternatively, the reference points can refer to borderlines of the focus blocks. The choice of the reference points can be taken into account in the smoothing function f. For instance, when the centerline of the focus blocks is used for defining the reference points, the weight W of a given pixel of the in-focus image can be such as:
W = max(1 - (max(dx ; dy) · scale factor) / focus block width ; 0)   (1)
[0077] when at least one of dx, dy is greater than 0, wherein max(value1; value2) refers to the greater of value1 and value2, the focus block width is the width of the focus block (square focus blocks) expressed in common units with dx, dy, and the scale factor is a factor used to determine the width of the smoothing zone. The scale factor is e.g. 0.35, which causes the edges of the desired image to remain fairly sharp while the full blur is reached at about the distance of one focus block width from the border of the outermost focus blocks of the target image object 150. However, with the edge-preserving smoothing filter applied, the boundary region of the target image object 150 can be slightly blurred without excessive subjective impairment of the image, and thus the scale factor can also be greater, e.g. 0.5 to 0.75.
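One consistent reading of this distance-based weight is sketched below. The original formula (1) is partly garbled in the source text, so the linear falloff, the clamping to [0, 1] and the use of the greater of the two distances are reconstructions, not verbatim details:

```python
def transition_weight(dx, dy, focus_block_width, scale_factor=0.35):
    """Weight of an in-focus image pixel in the transition zone:
    1.0 on the target (dx = dy = 0), decreasing with the distance
    from the nearest reference point, clamped at 0 for pixels taken
    entirely from the blurred capture. Reconstruction of formula (1)."""
    d = max(dx, dy)  # greater of the two distances, as max() is defined in the text
    return max(1.0 - (d * scale_factor) / focus_block_width, 0.0)
```

A larger scale factor widens the reach of the blur into the transition zone sooner, matching the 0.5 to 0.75 values suggested when the edge-preserving filter is also applied.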
[0078] The weight for the blurred image in the gradient zone should be 1 - W at each pixel so that the general brightness of the image remains unchanged. It is also understood that the weight function used in the foregoing example is only one example; in another example, the function is any function selected so that the weight of the in-focus image pixels decreases as the distance from the in-focus area increases, and thus the pixels of the blurred image become more prevailing than the pixels of the in-focus image. In another example, linear weighting is applied. For example,
Wx = min(1 ; max(0 ; (d2 - dx) / (d2 - d1)))   (2)
Wy = min(1 ; max(0 ; (d4 - dy) / (d4 - d3)))   (3)
W = min(Wx ; Wy)   (4), wherein
[0079] the parameters d1 to d4 are distances from the centerline of the target object blocks to the exterior border of the target object blocks and further to the border of the blurred area, as shown in Fig. 10, and the function min(value1; value2) produces the minimum of its arguments (value1 and value2).
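The linear weighting alternative can be sketched as follows. The exact original formulas (2) to (4) are garbled in the source, so the ramp endpoints here are a reconstruction from the d1 to d4 description: weight 1 inside the target-block border, falling linearly to 0 at the border of the fully blurred area.

```python
def linear_weight(dx, dy, d1, d2, d3, d4):
    """Linear per-axis weight ramps: Wx ramps from 1 at distance d1
    (exterior border of the target blocks, x direction) to 0 at d2
    (border of the blurred area); Wy likewise over d3..d4. The pixel
    weight is the smaller of the two, so a pixel must be close to the
    target in both directions to keep the in-focus value."""
    wx = min(1.0, max(0.0, (d2 - dx) / (d2 - d1)))
    wy = min(1.0, max(0.0, (d4 - dy) / (d4 - d3)))
    return min(wx, wy)
```

Taking the minimum of Wx and Wy yields a rectangular transition zone aligned with the focus-block grid, which suits the block-based target area of Fig. 10.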
[0080] Without in any way limiting the scope, interpretation, or application of the claims appearing below, a technical effect of one or more of the example embodiments disclosed herein is that miniature-lookalike photos can be produced without necessarily using any special hardware. Heavy computation can also be avoided by using optical blurring with the autofocus unit. Another technical effect of one or more of the example embodiments disclosed herein is that both near and far objects can be excluded from the in-focus target area with automatic masking that requires low computational complexity. Another technical effect of one or more of the example embodiments disclosed herein is that the photographs need not be carefully designed as when using a tilt-shift lens, where the camera orientation combined with the tilt-shift settings determines the objects that appear crisp and the objects that become blurred. Yet another technical effect of one or more of the example embodiments disclosed herein is that the blur obtained is very natural and close to the effect of using a real tilt-shift objective.
[0081] Embodiments of the present invention may be implemented in software, hardware, application logic or a combination of software, hardware and application logic. The software, application logic and/or hardware may reside on the camera unit, a host device that uses the camera unit or even on a plug-in module. In an example embodiment, the application logic, software or an instruction set is maintained on any one of various conventional computer-readable media. In the context of this document, a "computer-readable medium" may be any media or means that can contain, store, communicate, propagate or transport the instructions for use by or in connection with an instruction execution system, apparatus, or device, with two examples of a suited apparatus being described and depicted in Figs. 2 and 3. A computer-readable medium may comprise a computer-readable storage medium that may be any media or means that can contain or store the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer.
[0082] If desired, the different functions discussed herein may be performed in a different order and/or concurrently with each other. Furthermore, if desired, one or more of the previously described functions may be optional or may be combined.
[0083] Although various aspects of the invention are set out in the independent claims, other aspects of the invention comprise other combinations of features from the described embodiments and/or the dependent claims with the features of the independent claims, and not solely the combinations explicitly set out in the claims.
[0084] It is also noted herein that while the foregoing describes example embodiments of the invention, these descriptions should not be viewed in a limiting sense. Rather, there are several variations and modifications which may be made without departing from the scope of the present invention as defined in the appended claims.

Claims

WHAT IS CLAIMED IS
1. An apparatus, comprising:
an interface configured to exchange information with a camera unit; and
a processor configured to cause taking images in a series and with differing focus settings with the camera unit so that in at least one first image, a target object is in focus and in at least one second image, objects neighboring the target object are imaged out of focus so as to cause blur in the neighboring objects;
the processor being further configured to combine the at least one first image and the at least one second image to form a combined image so that:
a first sub-image is formed for the combined image from a corresponding portion of the at least one first image independently of a corresponding portion of the at least one second image;
a second sub-image is formed for the combined image from a corresponding portion of the at least one second image independently of a corresponding portion of the at least one first image; and a third sub-image between the first sub-image and the second sub-image is formed for the combined image by merging pixels of matching position from the at least one first image and from the at least one second image.
2. The apparatus of claim 1, wherein the target image object is defined as an object appearing in the focused depth of the camera unit at a given moment of time.
3. The apparatus of claim 2, wherein the given moment of time is the time when a user expresses a desire to take a photograph with simulated diorama effect.
4. The apparatus of any of preceding claims, wherein the processor is further configured to form a gradual transition between the first sub-image and the second sub-image in the image by blending images of different focus settings with gradient weights about a border region surrounding the desired at least one image object.
5. The apparatus of any of preceding claims, wherein the processor is further configured to form a gradual transition between the first sub-image and the second sub-image by mixing pixels of two or more images with varying weights so that pixels closer to the target object are formed with higher weight from an image in which the target object is in focus and pixels farther from the target object are formed with greater weight from an image in which the second sub-image is out of focus.
6. The apparatus of any of preceding claims, wherein the processor is further configured to receive focusing information from the camera unit.
7. The apparatus of claim 6, wherein the processor is further configured to determine the in focus image from the focusing information.
8. The apparatus of claim 6 or 7, wherein the processor is further configured to determine in focus image blocks based on the focusing information.
9. The apparatus of any of preceding claims, wherein the processor is further configured to apply an edge preserving smoothing filter on the image at the target image object.
10. The apparatus of any of preceding claims, wherein the processor is further configured to enhance colors of the formed image.
11. The apparatus of any of preceding claims, wherein the apparatus is built into a camera unit.
12. A method comprising:
exchanging information with a camera unit;
causing taking images in a series and with differing focus settings with the camera unit so that in at least one first image, a target object is in focus and in at least one second image, objects neighboring the target object are imaged out of focus so as to cause blur in the neighboring objects;
combining the at least one first image and the at least one second image to form a combined image so that:
a first sub-image is formed for the combined image from a corresponding portion of the at least one first image independently of a corresponding portion of the at least one second image; a second sub-image is formed for the combined image from a corresponding portion of the at least one second image independently of a corresponding portion of the at least one first image; and
a third sub-image between the first sub-image and the second sub-image is formed for the combined image by merging pixels of matching position from the at least one first image and from the at least one second image.
13. The method of claim 12, wherein the target image object is defined as an object appearing in the focused depth of the camera unit at a given moment of time.
14. The method of claim 13, wherein the given moment of time is the time when a user expresses a desire to take a photograph with simulated diorama effect.
15. The method of any of claims 12 to 14, further comprising forming a gradual transition between the first sub-image and the second sub-image in the image by blending images of different focus settings with gradient weights about a border region surrounding the desired at least one image object.
16. The method of any of claims 12 to 14, further comprising forming a gradual transition between the first sub-image and the second sub-image by mixing pixels of two or more images with varying weights so that pixels closer to the target object are formed with higher weight from an image in which the target object is in focus and pixels farther from the target object are formed with greater weight from an image in which the second sub-image is out of focus.
17. The method of any of claims 12 to 16, further comprising receiving focusing information from the camera unit.
18. The method of claim 17, further comprising determining the in focus image from the focusing information.
19. The method of claim 17 or 18, further comprising determining in focus image blocks based on the focusing information.
20. The method of any of claims 12 to 19, further comprising applying an edge preserving smoothing filter on the image at the target image object.
21. The method of any of claims 12 to 20, further comprising enhancing colors of the formed image.
22. A computer program comprising:
code for causing exchanging information with a camera unit;
code for causing taking images in a series and with differing focus settings with the camera unit so that in at least one first image, a target object is in focus and in at least one second image, objects neighboring the target object are imaged out of focus so as to cause blur in the neighboring objects; and code for combining the at least one first image and the at least one second image to form a combined image so that:
a first sub-image is formed for the combined image from a corresponding portion of the at least one first image independently of a corresponding portion of the at least one second image; a second sub-image is formed for the combined image from a corresponding portion of the at least one second image independently of a corresponding portion of the at least one first image; and
a third sub-image between the first sub-image and the second sub-image is formed for the combined image by merging pixels of matching position from the at least one first image and from the at least one second image;
when executed by an apparatus.
23. The computer program of claim 22, further comprising computer code for causing performing any of claims 13 to 21, when executed by the apparatus.
24. A memory medium comprising the computer program of claim 22.
PCT/FI2012/050363 2012-04-13 2012-04-13 Method and apparatus for producing special effects in digital photography Ceased WO2013153252A1 (en)
Publications (1)

Publication Number Publication Date
WO2013153252A1 true WO2013153252A1 (en) 2013-10-17

Family

ID=49327153

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2012/050363 Ceased WO2013153252A1 (en) 2012-04-13 2012-04-13 Method and apparatus for producing special effects in digital photography

Country Status (1)

Country Link
WO (1) WO2013153252A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060098970A1 (en) * 2004-11-10 2006-05-11 Pentax Corporation Image signal processing unit and digital camera
US20090040321A1 (en) * 2007-08-10 2009-02-12 Megachips Corporation Digital camera system
US20090160963A1 (en) * 2007-12-21 2009-06-25 Samsung Techwin Co., Ltd. Apparatus and method for blurring image background in digital image processing device
US20110193980A1 (en) * 2010-02-05 2011-08-11 Canon Kabushiki Kaisha Imaging apparatus and image processing method
US20120057070A1 (en) * 2010-09-02 2012-03-08 Samsung Electronics Co. Ltd. Digital image processing apparatus, digital image processing method, and recording medium storing the digital image processing method

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104270560A (en) * 2014-07-31 2015-01-07 三星电子(中国)研发中心 Multi-point focusing method and device
JP2016200701A (en) * 2015-04-09 2016-12-01 キヤノン株式会社 Image capturing device, control method therefor, program, and storage medium
CN105631911A (en) * 2015-12-31 2016-06-01 小米科技有限责任公司 Image generation method, device and system
CN111246106A (en) * 2020-01-22 2020-06-05 维沃移动通信有限公司 Image processing method, electronic device, and computer-readable storage medium
WO2021147921A1 (en) * 2020-01-22 2021-07-29 维沃移动通信有限公司 Image processing method, electronic device and computer-readable storage medium
CN111246106B (en) * 2020-01-22 2021-08-03 维沃移动通信有限公司 Image processing method, electronic device, and computer-readable storage medium
US11792351B2 (en) 2020-01-22 2023-10-17 Vivo Mobile Communication Co., Ltd. Image processing method, electronic device, and computer-readable storage medium
US12260534B2 (en) 2022-04-26 2025-03-25 Communications Test Design, Inc. Method to detect camera blemishes

Similar Documents

Publication Publication Date Title
US8989517B2 (en) Bokeh amplification
US10827107B2 (en) Photographing method for terminal and terminal
EP4050881B1 (en) High-dynamic range image synthesis method and electronic device
CN105765967B (en) Method, system and medium for adjusting settings of a first camera using a second camera
US8335390B2 (en) Blur function modeling for depth of field rendering
US20150086127A1 (en) Method and image capturing device for generating artificially defocused blurred image
EP1924966B1 (en) Adaptive exposure control
KR101364421B1 (en) System and method for generating robust depth maps utilizing a multi-resolution procedure
CN113888437A (en) Image processing method, apparatus, electronic device, and computer-readable storage medium
US20140176592A1 (en) Configuring two-dimensional image processing based on light-field parameters
CN103780840A (en) High-quality imaging double camera shooting and imaging device and method thereof
WO2014172059A2 (en) Reference image selection for motion ghost filtering
WO2017039853A1 (en) Photo-realistic shallow depth-of-field rendering from focal stacks
Kim et al. Radiometric alignment of image sequences
KR20230074136A (en) Salience-based capture or image processing
CN101088104A (en) Electronic device and method in an electronic device for processing image data
CN103136745B (en) Utilization defocuses the system and method for pillbox image execution estimation of Depth
CN113298735A (en) Image processing method, image processing device, electronic equipment and storage medium
CN110022430A (en) Image weakening method, device, mobile terminal and computer readable storage medium
WO2013153252A1 (en) Method and apparatus for producing special effects in digital photography
CN107346531A (en) A kind of image optimization method, device and terminal
US20160292842A1 (en) Method and Apparatus for Enhanced Digital Imaging
KR20150032764A (en) Method and image capturing device for generating artificially defocused blurred image
US12254644B2 (en) Imaging system and method
US9262833B2 (en) Methodology for performing depth estimation with defocused images under extreme lighting conditions

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 12874319
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 12874319
    Country of ref document: EP
    Kind code of ref document: A1