
WO2016199483A1 - Image processing apparatus, image processing method, and program - Google Patents


Info

Publication number
WO2016199483A1
WO2016199483A1 (application PCT/JP2016/060689; related application JP 2016060689 W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
score
composition
frame
threshold
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2016/060689
Other languages
English (en)
Japanese (ja)
Inventor
Toshiki Ono (小野 俊樹)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Priority to DE112016002564.5T priority Critical patent/DE112016002564T5/de
Priority to JP2017523136A priority patent/JP6729572B2/ja
Publication of WO2016199483A1 publication Critical patent/WO2016199483A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/91Television signal processing therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95Computational photography systems, e.g. light-field imaging systems
    • H04N23/951Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/78Television signal recording using magnetic recording
    • H04N5/782Television signal recording using magnetic recording on tape

Definitions

  • the present technology relates to an image processing apparatus, an image processing method, and a program, and particularly to a technical field regarding automatic still image storage.
  • Regarding automatic still image capturing, Patent Document 1 discloses a technique in which the frequency of still image capturing is changed according to the user's intention and the state of the image capturing apparatus, such as battery operation.
  • Patent Document 2 discloses a technique for preventing many images having the same subject content and composition from being captured in automatic still image capturing.
  • In automatic still image capturing, a suitable frame is identified within the captured image data of a subject acquired by an imaging device, based on determinations such as human face detection, smile detection, and face size; the image data of that frame is then extracted and stored as a still image. Hereinafter, this automatic capture-and-store operation is referred to as "automatic still image storage".
  • An image processing apparatus according to the present technology includes a threshold setting unit that sets a threshold condition for determining whether to perform image storage according to composition, and an image storage determination unit that, when the composition in a frame satisfies the threshold condition set by the threshold setting unit, determines the image data corresponding to that frame as image data to be stored. According to this configuration, a frame with a good composition can be determined automatically from continuous frames such as a moving image or a through image.
  • In the image processing apparatus described above, when the composition in a frame satisfies the threshold condition set by the threshold setting unit, the image storage determination unit may determine the image data of that frame as image data to be stored according to a further condition determination. That is, by determining other conditions in addition to the composition, a frame of the best attainable quality can be determined automatically.
  • All or some of the continuous frames are used as score calculation target frames, and a score calculation unit calculates a score, which is an evaluation value of the composition of the image, for each score calculation target frame.
  • When the score calculated by the score calculation unit for a certain frame satisfies the threshold condition set by the threshold setting unit, the image storage determination unit determines the image data corresponding to that frame as image data to be stored. That is, it is determined whether one frame of image data corresponds to a preferable composition, and a score indicating the degree of correspondence is obtained.
  • The score calculation unit can calculate a score with each of a plurality of compositions as a reference; that is, a score is obtained for each of several reference compositions.
  • The score calculation unit may select the compositions used as references for score calculation according to the image content of the frame, and calculate a score for each of the selected one or more compositions. Depending on the image content, for example the type of subject, certain compositions are desirable or suitable: a composition suited to a person as the subject, a composition suited to a landscape, and so on. By selecting the reference compositions according to the image content of the frame, the degree of fit to a composition appropriate for that content is obtained.
  • The image storage determination unit may determine whether the maximum value among the scores calculated by the score calculation unit for a certain frame, based on the plurality of compositions, satisfies the threshold condition set by the threshold setting unit. If at least one score satisfies the threshold condition, the frame can be evaluated as corresponding to some composition; it therefore suffices to test the threshold condition against the score having the maximum value.
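As a hedged illustration of this maximum-score rule (the composition names, score values, and function name below are assumptions for illustration, not taken from the patent):

```python
# Sketch of the maximum-score rule: a frame passes the threshold check
# when the best of its per-composition scores meets the threshold.
def satisfies_threshold(frame_scores, threshold):
    """frame_scores: dict mapping composition name -> score (0-100)."""
    return max(frame_scores.values()) >= threshold

scores = {"hinomaru": 62.0, "rule_of_thirds": 91.5, "bust_up": 48.0}
print(satisfies_threshold(scores, 90))  # -> True (rule-of-thirds passes)
```

Only one composition needs to fit well; the frame is not penalized for scoring poorly against the other references.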
  • As the further condition determination, the image storage determination unit may determine whether the score of the target frame is in the vicinity of a peak of the score value as it fluctuates over a plurality of consecutive frames. That is, rather than storing every frame whose score satisfies the threshold condition as a still image, a frame whose score is near the peak value can be selected.
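A hedged sketch of such a peak-vicinity check follows; the class name, window size, and tolerance are illustrative assumptions, not the patent's parameters:

```python
# Keep a short history of recent scores and accept a frame only when
# its score lies within a tolerance of the local maximum.
from collections import deque

class PeakDetector:
    def __init__(self, window=5, tolerance=2.0):
        self.history = deque(maxlen=window)   # recent score values
        self.tolerance = tolerance

    def near_peak(self, score):
        self.history.append(score)
        # True when the current score is close to the windowed maximum
        return max(self.history) - score <= self.tolerance
```

A frame whose score has just fallen well below the recent maximum is rejected even if it still clears the threshold, so storage concentrates near the best moment.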
  • As the further condition determination, the image storage determination unit may determine whether the image of the target frame is in focus. That is, even if a frame's score satisfies the threshold condition, an out-of-focus image is not stored as a still image.
  • As the further condition determination, the image storage determination unit may determine whether the target frame contains an image of the subject of interest in the process of moving. That is, even if a frame's score satisfies the threshold condition, an image estimated to be motion-blurred is not stored as a still image.
  • As the further condition determination, the image storage determination unit may consider the shutter speed at the time the frame was captured. That is, rather than simply storing a frame whose score satisfies the threshold condition, an image that is likely to be blurred is stored as a still image only when it can be determined from the shutter speed that there is no blur.
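An illustrative shutter-speed check can be sketched with the classic 1/focal-length handheld rule; this heuristic and the function name are assumptions for illustration, not the patent's formula:

```python
# Treat the frame as safe from motion blur when the exposure time is
# no longer than 1/(focal length in mm) seconds (handheld rule of thumb).
def likely_unblurred(shutter_s, focal_length_mm):
    return shutter_s <= 1.0 / focal_length_mm

print(likely_unblurred(1 / 250, 50))  # -> True: 1/250 s at 50 mm is fast enough
print(likely_unblurred(1 / 30, 50))   # -> False: 1/30 s at 50 mm risks blur
```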
  • The threshold setting unit may variably set the threshold condition according to an operation input.
  • After image data corresponding to a certain frame has been stored, the image storage determination unit may refrain from determining further image data to be stored for a standby time corresponding to the threshold set by the threshold setting unit. That is, once image data for a frame is stored, no storage is performed for a predetermined waiting time thereafter, and this standby time is determined according to the set threshold condition.
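A minimal sketch of this standby (cooldown) behaviour, assuming one plausible mapping from threshold to wait time; the class name, the specific wait times, and the mapping itself are assumptions, not values from the patent:

```python
# After a save, suppress further saves for a wait time derived from
# the configured threshold.
class SaveGate:
    def __init__(self, threshold):
        # Assumed mapping: a stricter threshold saves rarer, better
        # frames, so a shorter cooldown; a looser threshold gets a
        # longer one to limit storage frequency.
        self.wait_s = {90: 5.0, 80: 15.0, 70: 30.0}.get(threshold, 15.0)
        self.last_save = None

    def try_save(self, now_s):
        if self.last_save is not None and now_s - self.last_save < self.wait_s:
            return False          # still inside the standby time
        self.last_save = now_s
        return True
```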
  • The image processing apparatus described above may include a display control unit that displays image data of frames continuous on the time axis on a display screen, with composition information superimposed on the image. That is, the captured image is displayed in a state in which the degree of evaluation of the image content against a specific composition can be understood.
  • The image processing apparatus described above may include a display control unit that displays image data of frames continuous on the time axis on a display screen, together with information about the score and threshold condition of each frame. That is, whether the score satisfies the threshold condition is shown to the user.
  • The image processing apparatus described above may include a display control unit that displays image data of frames continuous on the time axis on a display screen, and displays an enlarged image that serves as a guide for matching the image content to a specific composition.
  • An image processing method according to the present technology causes an arithmetic processing unit to execute a threshold setting step of setting a threshold condition for determining whether to execute image storage according to composition, and an image storage determination step of determining, when the composition in a frame satisfies the threshold condition set in the threshold setting step, the image data corresponding to that frame as image data to be stored. This makes it possible to obtain still images with suitable compositions through the operation of automatic still image storage.
  • a program according to the present technology is a program that causes an arithmetic processing device to execute the processing of each step described above. With this program, the arithmetic processing unit can execute image processing suitable for automatic still image storage.
  • According to the present technology, automatic image storage can be performed that obtains images with a preferable composition without requiring the user's conscious attention, while the frequency of image storage also satisfies the user's request to some extent.
  • the effects described here are not necessarily limited, and may be any of the effects described in the present disclosure.
  • Brief description of the drawings: a block diagram of the image processing apparatus of the embodiment (FIG. 1); a flowchart of determination processing related to automatic still image storage in the image processing apparatus of the embodiment (FIG. 2); a block diagram of the imaging device of the embodiment (FIG. 3); a flowchart of the automatic still image storage processing of the imaging apparatus of the embodiment; flowcharts of the person composition determination, scenery composition determination, impression composition determination, and still image storage determination/execution processes of the embodiment; and explanatory drawings of score calculation corresponding to the composition patterns of the embodiment, including a score calculation example for the central one-point composition.
  • The automatic still image capture and storage ("automatic still image storage") described here relates to a technique of selecting a frame from consecutive frames and storing it as still image data, and in particular a technique for selecting a frame with a suitable composition.
  • Here, a frame is one of the frames, continuous on the time axis, that constitute a moving image.
  • Automatic still image storage is performed in a situation where image data of continuous frames captured by the image sensor, that is, frame data constituting a moving image, is obtained.
  • Automatic still image storage can be performed in an automatic imaging mode in which imaging by the image sensor is performed continuously, or in the shutter standby state of a normal manual imaging mode (a state in which a through image is generated so that the user can monitor the subject).
  • A situation in which image data of continuous frames is obtained from the image sensor also arises during moving image capture, so automatic still image storage may also be performed while a moving image is being captured.
  • the image processing apparatus 1 described below is an apparatus that performs at least processing for determining a frame to be stored as a still image from among input image data of each frame.
  • the imaging device 10 to be described later is a device that performs automatic still image storage in which a suitable frame is stored as a still image regardless of a user operation in any of an automatic imaging mode, a manual imaging mode, a moving image imaging mode, and the like.
  • Here, composition refers to the screen configuration, mainly taking into account the position of the subject of interest within the frame. Whether a composition is suitable is evaluated according to the distance between the ideal position of the subject of interest, comfortable to an observer viewing the frame, and the actual position of the subject of interest.
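This distance-based evaluation can be illustrated with a small sketch. Everything here is an assumption for illustration: the function name `thirds_score`, the use of rule-of-thirds intersections as the "ideal" positions, and the 0-100 scale are not taken from the patent.

```python
# Score a composition by the distance between the detected subject
# position and the nearest rule-of-thirds intersection.
import math

def thirds_score(subject_xy, frame_wh):
    """Return a 0-100 score: 100 when the subject sits exactly on a
    rule-of-thirds intersection, falling off linearly with distance."""
    w, h = frame_wh
    ideal_points = [(w * i / 3, h * j / 3) for i in (1, 2) for j in (1, 2)]
    x, y = subject_xy
    d = min(math.hypot(x - px, y - py) for px, py in ideal_points)
    d_max = math.hypot(w, h) / 2          # normalisation constant
    return max(0.0, 100.0 * (1.0 - d / d_max))
```

For a 1920x1080 frame, a subject detected exactly at the upper-left thirds intersection (640, 360) scores 100, and the score decreases as the subject drifts away from every intersection.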
  • FIG. 1 shows a configuration example of an image processing apparatus 1 according to the embodiment.
  • the image processing apparatus 1 includes a score calculation unit 1a, a threshold setting unit 1b, and an image storage determination unit 1c.
  • the image processing apparatus 1 outputs determination information SS of a frame to be stored as a still image for the input image data Din.
  • As the image data Din, image data of each frame continuous on the time axis is supplied; that is, frame data constituting a moving image.
  • the image data Din may be obtained by imaging with an image sensor (not shown) or may be transmitted as moving image data from an external device.
  • The score calculation unit 1a calculates a score, as an evaluation value of the composition of the image, for each frame supplied as the image data Din, or for intermittent frames.
  • the composition used as the score calculation reference is based on information of one or a plurality of commonly known photographic compositions.
  • Such compositions include, for example, the Hinomaru composition (central one-point composition), the rule-of-thirds composition, the bust-up composition, and the like.
  • For the evaluation value as the Hinomaru composition, for example, the degree to which the image of the frame under processing corresponds to the Hinomaru composition is quantified.
  • the score calculation unit 1a performs image analysis of a frame to be processed, determines a composition and a subject, and selects a composition that serves as a score calculation reference according to them. Then, a score as an evaluation value for the composition is obtained.
  • the threshold setting unit 1b variably sets a threshold condition for determining whether to execute automatic still image storage. For example, threshold conditions are set according to user operations, image contents, device status, and the like.
  • The threshold is a value to be compared with the score calculated by the score calculation unit 1a. If the maximum score is "100", the threshold can be selectably set to "90", "80", "70", or the like. For example, the threshold condition is regarded as satisfied when the score exceeds the threshold.
  • When the score of a frame satisfies the threshold condition, the image storage determination unit 1c determines, according to a further condition determination, whether the image data of that frame is to be stored as a still image. The determination result is output as the determination information SS. That is, it is first determined from the threshold condition on the score whether the frame under processing is a candidate for storage as a still image; when the threshold condition is satisfied, it is then determined by the further condition determination whether the frame is actually to be stored.
  • Other condition determinations include determination of the focus state, determination of whether the score value is near its peak, determination of whether the image is free of blur, and the like.
  • When the image storage determination unit 1c determines that a certain frame is to be stored as a still image, an image processing system or storage processing system (not shown) applies predetermined processing, such as any necessary resolution conversion and encoding, to the image data of that frame and stores it in a storage medium as still image data. As a result, automatic still image storage is executed.
  • FIG. 2 shows an example of determination processing for automatic still image storage of the image processing apparatus 1.
  • This is a process executed in the image processing apparatus 1 by the functions of the above-described score calculation unit 1a, threshold setting unit 1b, and image storage determination unit 1c.
  • The image processing apparatus 1 first performs threshold setting in step S1. For example, a threshold is selected according to a user operation and set as the processing threshold.
  • In step S2, the image processing apparatus 1 acquires the image data of the frame to be processed from among the frames continuously supplied as the image data Din.
  • In step S3, the image processing apparatus 1 performs composition determination and subject determination on the frame under processing, and calculates a score based on a specific composition according to the results.
  • In step S4, the image processing apparatus 1 checks whether the score satisfies the threshold condition. If it does not, the process proceeds from step S4 through S8 back to S2, and moves on to the next frame to be processed.
  • If the score satisfies the threshold condition, the image processing apparatus 1 proceeds from step S4 to step S5 and performs the further condition determination on the image data of the frame. If the other conditions are not satisfied, the process proceeds from step S6 through S8 back to S2, and processing of the next target frame begins. When the other conditions are satisfied, the image processing apparatus 1 proceeds from step S6 to step S7, determines that the frame is to be stored as a still image, and outputs the determination information SS. When the automatic still image storage operation is finished, the image processing apparatus 1 ends the processing of FIG. 2 from step S8.
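The determination loop described above (steps S1 through S7) can be sketched minimally as follows; `score_fn` and `extra_ok` are placeholder stand-ins for the composition scoring and "other condition" determinations, not the patent's actual implementations:

```python
# Minimal sketch of the Fig. 2 loop: set a threshold, score each frame,
# and mark it for storage only when the score meets the threshold AND
# the extra condition check passes.
def frames_to_store(frames, score_fn, extra_ok, threshold):
    stored = []
    for i, frame in enumerate(frames):          # S2: next target frame
        score = score_fn(frame)                 # S3: composition score
        if score < threshold:                   # S4: threshold check
            continue
        if not extra_ok(frame):                 # S5/S6: other conditions
            continue
        stored.append(i)                        # S7: decide to store
    return stored
```

With a trivial identity score and a condition that rejects one frame, only indices clearing both checks survive, mirroring the two-stage filtering of the flowchart.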
  • Through the above processing, the image processing apparatus 1 can determine, from the frames supplied as a moving image, those whose image data has a good composition and satisfies other conditions such as the focus state, and designate them for storage as still images. Accordingly, still images with high-quality compositions can be stored by automatic still image storage. Further, by setting the threshold condition variably, the frequency of storage as still images can be adjusted.
  • The image processing apparatus 1 having the score calculation unit 1a, the threshold setting unit 1b, and the image storage determination unit 1c can be realized by a CPU (Central Processing Unit) or a DSP (Digital Signal Processor) as an arithmetic processing unit. It is also conceivable to realize the functions of the image processing apparatus 1 as cooperative processing of a plurality of such arithmetic processing devices, for example by distributing the units across a plurality of CPUs, or across a CPU and an image processing DSP.
  • FIG. 3 shows a configuration example of the imaging apparatus 10 according to the embodiment.
  • the imaging device 10 is a so-called digital still camera or digital video camera, and is a device that captures / records still images and moving images, and incorporates an image processing device as defined in the claims.
  • the imaging apparatus 10 can execute automatic still image storage in the automatic imaging mode or the moving image imaging mode.
  • The imaging apparatus 10 includes an optical system 11, an imager 12, an optical system driving unit 13, a sensor unit 14, a storage unit 15, a communication unit 16, a digital signal processing unit 20, a control unit 30, a display unit 34, and an operation unit 35.
  • The optical system 11 includes lenses such as a cover lens, a zoom lens, and a focus lens, and a diaphragm mechanism.
  • the optical system 11 collects light from the subject on the imager 12.
  • the imager 12 includes, for example, an image sensor such as a charge coupled device (CCD) type or a complementary metal oxide semiconductor (CMOS) type.
  • The imager 12 executes, for example, CDS (Correlated Double Sampling) processing and AGC (Automatic Gain Control) processing on the electrical signal obtained by photoelectric conversion in the image sensor, and further performs A/D (Analog/Digital) conversion processing.
  • the imaging signal as digital data is output to the digital signal processing unit 20 at the subsequent stage.
  • the electronic shutter speed of the image sensor in the imager 12 is variably controlled by the control unit 30.
  • The optical system driving unit 13 drives the focus lens in the optical system 11 under the control of the control unit 30 to execute the focus operation. It also drives the diaphragm mechanism in the optical system 11 under the control of the control unit 30 to perform exposure adjustment, and drives the zoom lens in the optical system 11 under the control of the control unit 30 to execute the zoom operation.
  • the digital signal processor 20 is configured as an image processor by a DSP or the like, for example.
  • the digital signal processing unit 20 performs various types of signal processing on the digital signal (captured image data) from the imager 12.
  • The digital signal processing unit 20 includes a preprocessing unit 21, a synchronization unit 22, a YC generation unit 23, a resolution conversion unit 24, a codec unit 25, a display data generation unit 26, an image analysis unit 27, and a focus processing unit 28.
  • The preprocessing unit 21 applies, to the captured image data from the imager 12, clamp processing that clamps the R, G, B black levels to a predetermined level, correction processing between the R, G, B color channels, and the like.
  • the synchronization unit 22 performs demosaic processing so that the image data for each pixel has all the R, G, and B color components.
  • the YC generation unit 23 generates (separates) a luminance (Y) signal and a color (C) signal from R, G, and B image data.
  • the resolution conversion unit 24 performs resolution conversion processing on image data that has been subjected to various types of signal processing.
  • the codec unit 25 performs, for example, a recording or communication encoding process on the resolution-converted image data.
  • the display data generation unit 26 generates display data as, for example, a through image output to the display unit 34 under the control of the control unit 30.
  • the display data as the through image is basically data of each frame as captured image data whose resolution is converted by the resolution conversion unit 24.
  • the display data generation unit 26 also performs processing for displaying various guide images, character images, operation images, and the like in a form of being superimposed on an image such as a through image, based on an instruction from the control unit 30.
  • The image analysis unit 27 performs image analysis processing on the captured image data (luminance signal / color signal) obtained by the YC generation unit 23, for each frame (or each intermittent frame), analyzing the image content: detection of the subject type and main subject, composition determination, determination of luminance and color conditions, and so on. These analysis results are provided for processing by the control unit 30.
  • the image analysis unit 27 can perform, for example, human face detection, body detection, and animal detection such as pets.
  • The image analysis unit 27 can also detect attributes of a subject such as a person. For example, feature points of a face image detected by image analysis are discriminated to identify attributes, and attribute information is generated.
  • the attribute information is, for example, information on whether the subject is an adult or a child, or information on whether the subject is a woman or a man. Further, more detailed information such as age group may be determined.
  • the image analysis unit 27 performs various analysis processes such as line detection such as horizontal lines and vertical lines in the image, and determination of hue / saturation / lightness.
  • the focus processing unit 28 confirms the focus state of the current frame image data for the autofocus operation. For example, an evaluation value for determining a focus state is obtained by a technique such as detection of high-frequency component energy of image data.
  • the control unit 30 performs focus lens drive by the optical system drive unit 13 while checking the evaluation value from the focus processing unit 28, and controls the focus state.
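The high-frequency-energy idea described above can be sketched as a variance-of-Laplacian focus measure; the pure-Python implementation, the 4-neighbour kernel, and the function name are illustrative assumptions, not the device's actual processing:

```python
# Contrast-based focus measure: the variance of a high-pass
# (Laplacian-like) response rises as the image gets sharper.
# Operates on a 2D list of luminance values to stay self-contained.
def focus_measure(gray):
    h, w = len(gray), len(gray[0])
    responses = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (4 * gray[y][x] - gray[y - 1][x] - gray[y + 1][x]
                   - gray[y][x - 1] - gray[y][x + 1])
            responses.append(lap)
    mean = sum(responses) / len(responses)
    return sum((r - mean) ** 2 for r in responses) / len(responses)
```

A perfectly flat patch yields zero, while a high-contrast (sharp) patch yields a large value, which is the evaluation trend an autofocus loop needs.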
  • In the example of FIG. 3, the display data generation unit 26, the image analysis unit 27, and the focus processing unit 28 are each shown as functional configurations executed by the digital signal processing unit 20, but the processing of each of these units may instead be executed by the control unit 30.
  • The control unit 30 is configured by a microcomputer (arithmetic processing unit) including a CPU, a ROM (Read Only Memory), a RAM (Random Access Memory), a flash memory, and the like.
  • When the CPU executes a program stored in the ROM, the flash memory, or other storage, the entire imaging apparatus 10 is comprehensively controlled.
  • The RAM is used as a work area for the CPU's various data processing, for temporary storage of data and programs.
  • The ROM and flash memory (non-volatile memory) store the OS (Operating System) by which the CPU controls each unit, as well as application programs; a program for executing the processing for automatic still image storage is also stored among them.
  • The control unit 30 thus controls the operation of each necessary unit: instructing the various signal processing in the digital signal processing unit 20, imaging and recording operations in response to user operations, reproduction of recorded image files, camera operations such as zoom, focus, and exposure adjustment, user interface operations, and so on.
  • control unit 30 has functions as a score calculation unit 30a, a threshold control unit 30b, an image storage determination unit 30c, and a display control unit 30d, and performs control related to the operation of automatic still image storage to be described later.
  • The score calculation unit 30a, the threshold control unit 30b, and the image storage determination unit 30c function as the score calculation unit 1a, the threshold setting unit 1b, and the image storage determination unit 1c described with reference to FIGS. 1 and 2. That is, in the imaging apparatus 10, the function of the image processing apparatus 1 of FIG. 1 is implemented as a software function of the control unit 30.
  • The display control unit 30d performs control to display composition information, for example superimposed on the through image, to display score and threshold condition information for each frame, and to display an enlarged image serving as a guide for matching the image content to a specific composition.
  • The display unit 34 performs various displays for the user (photographer or the like). Its display device is formed, for example, as an LCD (Liquid Crystal Display) or organic EL (Electro-Luminescence) display on the housing of the imaging device 10, or may be formed using an LCD, organic EL display, or the like in the form of a so-called viewfinder.
  • the display unit 34 includes the display device described above and a display driver that causes the display device to perform display. The display driver executes various displays on the display device based on instructions from the control unit 30.
  • the display driver reproduces and displays still images and moving images that have been picked up and recorded on a recording medium, or displays a through image based on display data from the display data generation unit 26 on the screen of the display device.
  • On the screen, display of various operation menus, icons, messages, guide displays, and so on, that is, display as a GUI (Graphical User Interface), is also executed.
  • the operation unit 35 has an input function for inputting a user operation, and sends a signal corresponding to the input operation to the control unit 30.
  • the operation unit 35 is realized as, for example, various operators provided on the housing of the imaging device 10 or a touch panel formed on the display unit 34.
  • a playback menu activation button, a determination button, a cross key, a cancel button, a zoom key, a slide key, a shutter button (release button), and the like are provided.
  • Various operations may be performed by a touch panel operation using icons, menus, and the like displayed on the touch panel and the display unit 34.
  • the storage unit 15 includes, for example, a non-volatile memory, and functions as a storage area for storing image files (content files) such as still image data and moving image data, image file attribute information, thumbnail images, and the like.
  • the image file is stored in a format such as JPEG (Joint Photographic Experts Group), TIFF (Tagged Image File Format), or GIF (Graphics Interchange Format).
  • Various forms of the storage unit 15 are conceivable.
  • For example, the storage unit 15 may be a flash memory built into the imaging apparatus 10, or it may take the form of a memory card (for example, a portable flash memory card) attachable to and detachable from the imaging apparatus 10, together with a card recording/reproducing unit that performs recording/reproducing access to the memory card.
  • As a form built into the imaging device 10, it may also be realized as an HDD (Hard Disk Drive) or the like.
  • a program for executing processing for the automatic still image storage function may be stored in the storage unit 15.
  • the communication unit 16 performs data communication and network communication with an external device in a wired or wireless manner. For example, captured image data (still image file or moving image file) is communicated between an external display device, a recording device, a playback device, and the like.
• As a network communication unit, for example, the communication unit 16 may perform communication via various networks such as the Internet, a home network, or a LAN (Local Area Network), and transmit and receive various types of data to and from servers and terminals on those networks.
  • the sensor unit 14 comprehensively shows various sensors.
• For example, a gyro sensor (angular velocity sensor), an acceleration sensor, and the like are provided for detecting the overall movement of the imaging apparatus 10, such as camera shake, or the posture and movement (pan movement, tilt movement, etc.) of the imaging apparatus 10.
  • An illuminance sensor that detects external illuminance for exposure adjustment or the like, and a distance measuring sensor that measures the subject distance may be provided.
• There may also be provided a zoom lens position sensor for detecting the position of the zoom lens in the optical system 11 and a focus lens position sensor for detecting the position of the focus lens.
  • a sensor that detects the opening amount of a mechanical iris may be provided.
  • Various sensors of the sensor unit 14 transmit detected information to the control unit 30.
  • the control unit 30 can perform various controls using information detected by the sensor unit 14.
  • the outline of the automatic still image storage operation performed by the imaging apparatus 10 of the present embodiment is as follows.
  • the image analysis unit 27 performs subject determination and composition determination for each frame (or each intermittent frame) for image data of continuous frames captured by the imager 12.
• The control unit 30 calculates a score using the analysis result information. For example, when a main subject (for example, a detected face) exists in the frame screen, a score indicating the degree to which the frame corresponds to a specific composition is calculated. It is first determined whether this score satisfies a threshold condition; frames that do not satisfy the threshold condition are not candidates for still image storage.
  • the threshold condition is variably set by a user operation, for example. If the threshold condition is strict, the frequency of still image storage is low, and if the threshold condition is loose, the frequency of still image storage is high.
• Peak determination: A peak is determined by comparing each frame's score with a temporarily stored buffered score (hereinafter referred to as the "Buf score"). Among the consecutive frames to be processed, a frame whose score is near the peak value is kept as a still image.
• Focus determination: It is determined whether or not the main subject (for example, a face) is in focus, and frames that are not in focus are excluded from still image storage. Note that factors that bring the subject into focus include the user half-pressing or fully pressing the shutter, autofocus (AF), continuous AF, manual focus, and the like.
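• For reference, the peak determination described above can be sketched as follows (a minimal illustration in Python; the function name and the simple "keep the best score seen so far" rule are assumptions, not the patent's exact procedure):

```python
def pick_peak_frame(scores, threshold):
    """Among consecutive frames whose score satisfies the threshold,
    keep the index of the frame nearest the peak, by comparing each
    new score with a temporarily buffered best score ('Buf score')."""
    buf_idx, buf_score = None, None
    for idx, score in enumerate(scores):
        if score < threshold:
            continue                          # not a storage candidate
        if buf_score is None or score > buf_score:
            buf_idx, buf_score = idx, score   # new peak candidate
    return buf_idx                            # None if no frame qualified
```

For example, with scores [10, 85, 92, 88, 70] and threshold 80, the third frame (index 2) is kept.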
  • FIG. 4, FIG. 5, FIG. 6, FIG. 7 and FIG. 8 show processing examples of the control unit 30 for automatic still image storage. These processes are executed by the control unit 30 using functions as the score calculation unit 30a, the threshold control unit 30b, the image storage determination unit 30c, and the display control unit 30d.
  • FIG. 4 shows the overall processing as automatic still image storage.
• When automatic still image storage is started, the control unit 30 proceeds from step S101 to step S102 in FIG. 4. This is the case where the user performs an operation to carry out automatic still image storage during moving image capturing, or where the user instructs automatic still image storage even while moving image capturing is not being performed.
• In step S102, the control unit 30 sets the threshold used for the processing. An interval time corresponding to the threshold is also set.
  • the threshold value is compared with the score as described above, and the still image storage frequency can be adjusted to some extent by the threshold value.
  • the user can select a mode of how often a still image is captured in a moving image frame during automatic still image storage. For example, the user selects “high”, “medium”, “low” or the like as a still image storage frequency mode in advance by a predetermined operation.
  • the control unit 30 stores this operation as an automatic still image storage frequency mode.
  • the threshold value is set according to the mode (frequency “high”, “medium”, or “low”) set by the user's most recent operation.
  • a relatively strict threshold th1 is set when the frequency is “low”
  • an intermediate threshold th2 is set when the frequency is “medium”
  • a relatively low threshold th3 is set when the frequency is “high”.
  • the interval time is a waiting time until a frame is selected as the next still image when one still image is stored.
• Since a frame to be saved as a still image is selected according to the score and other conditions, many images with nearly identical content could otherwise be stored from consecutive frames. Therefore, once a certain frame is stored as a still image, the next still image is not stored during the standby period.
  • This interval time is assumed to correspond to the frequency mode described above. For example, 10 seconds are set for the frequency “low”, 6 seconds are set for the frequency “medium”, and 3 seconds are set for the frequency “high”.
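• For illustration, the correspondence between the frequency mode, the threshold (using the score values 80/60/40 associated with th1/th2/th3 later in this description), and the interval time might be represented as follows; the table and function names are hypothetical:

```python
# Hypothetical mapping of the user-selected frequency mode to the
# threshold (th1..th3 as score values) and interval time described above.
FREQUENCY_MODES = {
    "low":    {"threshold": 80, "interval_s": 10},  # strict th1
    "medium": {"threshold": 60, "interval_s": 6},   # intermediate th2
    "high":   {"threshold": 40, "interval_s": 3},   # loose th3
}

def settings_for_mode(mode):
    """Look up the threshold and standby interval for a frequency mode."""
    return FREQUENCY_MODES[mode]
```

For example, selecting "medium" yields the threshold 60 and a 6-second interval.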
• In step S103, a processing target frame for determining whether or not to store it as a still image is specified.
• In some cases, frames are sequentially set as processing targets in step S103; in other cases, frames are set as processing targets intermittently, for example, every n frames.
  • the control unit 30 subsequently checks the analysis result of the image analysis unit 27 for the target frame.
  • the face image detection result is referred to, and it is confirmed whether or not a human face is included as a subject of the image of the frame. Also, the orientation of the face image is confirmed as a result of the face image analysis.
• When a human face is included, the control unit 30 proceeds from step S104 to step S105 to perform person composition determination. When no face is included, the process proceeds from step S104 to step S106 to perform landscape composition determination.
• In the person composition determination, the control unit 30 compares one or more desirable composition patterns for images including a human face with the composition of the image of the frame, and determines whether the frame matches a desirable composition. A score is then calculated as an evaluation value based on the composition (for example, the degree of correspondence to the composition).
  • the control unit 30 compares one or more desirable composition patterns as the landscape composition with the composition of the image of the frame, and determines whether the frame matches the desired composition. Then, a score as an evaluation value based on the composition is calculated. Specific examples of the person composition determination and the landscape composition determination will be described later.
  • step S107 the control unit 30 causes the display unit 34 to display information based on the composition determination result, display of a guide for achieving a better composition, and the like. For example, various images, icons, enlarged images, and the like are displayed on the through image at that time. A display example will be described later.
• In step S108, the control unit 30 determines whether or not the processing target frame is a frame to be stored as a still image, and when it determines that it is, performs control for the storage operation. Details of the determination process will be described later.
• In step S109, the control unit 30 determines whether or not a trigger for ending the automatic still image storage operation has occurred. If such a trigger has occurred, for example a user operation or the stopping of moving image capturing, the process of FIG. 4 ends.
  • the process branches in step S110 depending on whether still image storage is executed in the immediately preceding step S108. If still image storage has not been executed, the process returns to step S103, and the same processing is performed for the next frame. If still image storage has been performed immediately before, the process proceeds from step S110 to S111 and waits for the above-described interval time to elapse.
  • the time count of the interval time may be started from, for example, the execution time of the still image storage control in step S108.
  • the process returns to step S109 while waiting for the elapse of time.
  • the frames in the standby period are not subjected to processing in step S104 and subsequent steps, and are sequentially set as processing target frames from the frame at the time when the standby period has elapsed.
• In steps S105 and S106 of FIG. 4, the control unit 30 performs processes of determining a person composition and a landscape composition, respectively. The composition determination will now be described.
  • FIG. 5 shows in detail the person composition determination process in step S105 of FIG.
• When the control unit 30 determines in step S104 of FIG. 4 that a face image exists in the target frame, it proceeds to step S105. The control unit 30 then determines the number of face images existing in the target frame, that is, the number of subject persons, and the face orientation; the composition used as the reference for score calculation is switched according to these determination results.
• When there is one face image (one subject person) and the face image is front-facing, the process proceeds from step S201 through S202 to S203. Then, using the whole body composition, the Hinomaru composition, and the bust-up composition as the reference compositions for the target frame, a score based on each of these compositions is calculated. That is, a score quantifying the evaluation based on the whole body composition, a score quantifying the evaluation based on the Hinomaru composition, and a score quantifying the evaluation based on the bust-up composition are obtained.
• When the control unit 30 confirms from the image analysis result of the target frame that there is one face image (one subject person) and the face image is not front-facing, the process proceeds from step S201 through S202 to S204, where a score based on the three-part composition is calculated for the target frame.
• When the control unit 30 confirms from the image analysis result of the target frame that there are two face images (two subject persons) and both face images are front-facing, the process proceeds from step S201 through S205 and S206 to S207, and scores are calculated as evaluations based on the three-part composition and the diagonal composition for the target frame.
• The determination in step S206 of whether the faces are front-facing requires that both faces be front-facing; if even one of the two faces is not determined to be front-facing, score calculation is not performed. As a result, still image storage becomes possible for frames in which both persons face the front. However, an example in which the process proceeds to step S207 when, for example, at least one person faces the front is also conceivable.
• When the control unit 30 confirms from the image analysis result of the target frame that there are three or more face images (three or more subject persons) and the face images are front-facing, the process proceeds from step S201 through S205 and S208 to S209, and a score is calculated as an evaluation based on the multiple-person composition for the target frame.
• The determination in step S208 requires that the faces of all members be front-facing; if even one face is not determined to be front-facing, score calculation is not performed. As a result, still image storage becomes possible for frames in which all members face the front.
• Alternatively, an example is also conceivable in which the process proceeds to step S209 when at least one person is facing the front, or when the majority are facing the front. This is because, with a large number of people, a frame is unlikely to become a candidate for still image storage if the face orientations of all members are strictly required.
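• The branching of steps S201 to S209 described above can be sketched as follows (the strict all-front-facing variant; the function name and composition labels are illustrative, not terms from the patent):

```python
def select_compositions(face_count, all_front_facing):
    """Choose which reference compositions to score, following the
    branching of steps S201-S209: one face / two faces / three or
    more, with orientation deciding the pattern set."""
    if face_count == 1:
        if all_front_facing:
            return ["whole_body", "hinomaru", "bust_up"]   # step S203
        return ["rule_of_thirds"]                          # step S204
    if face_count == 2:
        if all_front_facing:
            return ["rule_of_thirds", "diagonal"]          # step S207
        return []                                          # no score
    if face_count >= 3:
        if all_front_facing:
            return ["multi_person"]                        # step S209
        return []
    return []                                              # no face
```
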
• Taking the Hinomaru composition and the bust-up composition as examples, the relationship between the score calculation and the threshold set in step S102 of FIG. 4 will be described with reference to FIG. 9. The threshold determines how far an image may be from the ideal position (ideal coordinates) and still be stored.
  • FIG. 9A shows the case of the Hinomaru composition.
  • the Hinomaru composition is a composition in which the main subject exists in the center of the image.
• The ideal position IP has coordinates (Hx, Hy). The threshold th1 (indicated by a solid line), the threshold th2 (indicated by a broken line), and the threshold th3 (indicated by a one-dot chain line) are shown. For example, the threshold th1 is set to a value corresponding to a range of ±20 pixels from the ideal position IP in the x and y directions, the threshold th2 to a range of ±40 pixels, and the threshold th3 to a range of ±60 pixels.
  • FIG. 9B shows the case of the bust-up composition.
  • the threshold values th1 (solid line), th2 (broken line), and th3 (one-dot chain line) indicate ranges in which the separation distance from the ideal position IP is different as illustrated.
• The threshold th1 corresponds to a range of ±20 pixels in the y direction from the ideal position IP, the threshold th2 to a range of ±40 pixels, and the threshold th3 to a range of ±60 pixels.
  • threshold values th1, th2, and th3 are prepared according to the composition, and one of the threshold values th1, th2, and th3 is selected by a user operation or the like.
  • a threshold selected from the thresholds th1, th2, and th3 is set as a threshold used for processing as a threshold used in automatic still image storage. For example, when the threshold value th1 is used, it is determined in step S108 in FIG. 4 whether the score of the target frame is a score corresponding to the range of the threshold value th1.
• In each composition, the score is obtained as a value based on the distance between the center position CP of the main subject and the ideal position IP. That is, the score can be calculated from the distance between the coordinates of the ideal position IP and the center coordinates of the detected face. Let the coordinates of the ideal position IP be (Hx, Hy) and the coordinates of the face center position CP be (Fx, Fy); with one pixel of distance corresponding to one score point, for example: score = 100 - sqrt((Hx - Fx)^2 + (Hy - Fy)^2) ... (Equation 1)
• When the threshold th1 is selected, the image of the frame satisfies the threshold condition when the score is 80 to 100.
  • the threshold value th2 is selected, the threshold condition is satisfied when the score is 60 to 100, and when the threshold value th3 is selected, the threshold condition is satisfied when the score is 40 to 100.
  • the probability (frequency) that the target frame will satisfy the threshold condition changes depending on the selection of the thresholds th1, th2, and th3.
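• Using the example above (one pixel of distance costs one score point, thresholds th1/th2/th3 at scores 80/60/40), the score calculation and threshold check can be sketched as follows; treating the distance as Euclidean is an assumption, and the names are illustrative:

```python
import math

def composition_score(ideal, center, max_score=100.0):
    """(Equation 1)-style score: one pixel of distance between the
    subject center CP and the ideal position IP costs one point."""
    d = math.dist(ideal, center)        # Euclidean distance in pixels
    return max(0.0, max_score - d)

def satisfies_threshold(score, threshold):
    """A frame is a storage candidate only if its score reaches the
    selected threshold (th1=80, th2=60, th3=40)."""
    return score >= threshold
```

A face centered 20 pixels from the ideal position thus scores 80 and just passes th1, while a face 30 pixels away does not.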
• FIG. 10A shows orientations of the face image, arranged so that the horizontal (yaw) direction (YAW-R2 to YAW-L2) varies along the horizontal direction of the figure and the axial rotation (roll) direction (ROLL-R1 to ROLL-L1) varies along the vertical direction of the figure.
• The score for the Hinomaru composition is calculated in step S203 only when the face image falls within the range surrounded by the broken line T in FIG. 10A, that is, when the face is close to front-facing.
  • FIG. 10B shows a state where the coordinates of the ideal position IP and the center position CP of the face image match.
  • FIG. 11A shows the face orientation as in FIG. 10A.
  • the score for the three-part composition is calculated in step S204.
• The ideal positions in the case of the three-part composition are the ideal positions IPa, IPb, IPc, and IPd shown in FIG. 11B, that is, the four intersections of the lines dividing the screen into thirds in the x and y directions. In this case, scoring considers whether the face is oriented toward the open space and whether the coordinates of the face center position (CP described above) are close to the coordinates of the ideal position.
• If the composition would cut the subject person at the neck, score calculation may be skipped.
  • threshold values th1, th2, and th3 are shown in accordance with the ideal position IPa.
• For example, a score of 80 or more corresponds to an image whose face center lies within the solid line (th1) of FIG. 11B.
  • the score for the whole body composition is calculated in step S203.
• In the case of the whole body composition, the x coordinate of the ideal position IP is the horizontal center of the screen, and the y coordinate is changed according to the subject; for example, the face image is discriminated as an adult, a child, or an infant, and the ideal position IP is set accordingly. In one case, the y coordinate of the ideal position IP is set at a position below the upper end of the screen by a length corresponding to the face image size Sf, as shown in FIG. 12A; in another case, it is set at a position below the upper end of the screen by twice the face image size (2Sf), as shown in FIG. 12C.
• As a condition for applying the whole body composition, the face image size Sf is assumed to be 10% to 15% of the vertical image size dy. If the face image size Sf does not fall within this 10% to 15% range, it may be decided not to calculate the whole body composition score.
• The score is obtained by (Equation 1) described above as a value indicating whether the coordinates of the center position CP of the face are close to the coordinates of the ideal position IP.
  • threshold values th1, th2, and th3 are shown in accordance with the ideal position IP. For example, when the score related to the whole body composition is 80 or more, the face center is within the solid line (th1).
  • a score calculation example for the bust-up composition will be described.
  • the score for the bust-up composition is calculated in step S203.
  • the coordinates of the ideal position IP in the case of the bust-up composition are a certain y coordinate position, but the y coordinate value is changed according to the subject. For example, for the face image, an adult, a child, and an infant are discriminated, and the ideal position IP is set accordingly.
• Depending on the discrimination result, the y coordinate of the ideal position IP is set in one case at a position 2.5 times the face image size Sf (2.5Sf) above the lower end of the screen, as shown in FIG. 13A, and in another case at a position twice the face image size (2Sf) above the lower end of the screen, as shown in FIG. 13C.
• As a condition for applying the bust-up composition, the face image size Sf is assumed to be 22% to 30% of the vertical image size dy for adults, 15% to 25% for children, and 25% to 35% for infants. If the face image size Sf does not fall within these ranges, it may be decided not to calculate the bust-up composition score.
• The score is obtained by (Equation 1) as a value, up to the full score (SCmax), indicating whether the coordinates of the center position CP of the face are close to the coordinates of the ideal position IP.
  • threshold values th1, th2, and th3 are shown in accordance with the ideal position IP.
• For example, a score of 80 or more for the bust-up composition means an image in which the face center lies within the y-direction range of the solid line (th1).
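• As an illustration, the face-size applicability conditions above (Sf relative to the vertical size dy: 10% to 15% for the whole body composition; 22-30% / 15-25% / 25-35% for adult / child / infant in the bust-up composition) can be sketched as follows; the function names and class labels are hypothetical:

```python
def whole_body_applicable(face_size, image_height):
    """Whole body composition assumes Sf is 10% to 15% of dy."""
    return 0.10 <= face_size / image_height <= 0.15

def bust_up_applicable(face_size, image_height, subject_class):
    """Bust-up composition is scored only when Sf falls in a
    class-dependent fraction of dy: adults 22-30%, children 15-25%,
    infants 25-35%."""
    ranges = {"adult": (0.22, 0.30),
              "child": (0.15, 0.25),
              "infant": (0.25, 0.35)}
    lo, hi = ranges[subject_class]
    return lo <= face_size / image_height <= hi
```
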
  • FIG. 14A shows the face orientation as in FIG. 10A.
  • the score for the three-part composition is calculated in step S207.
  • FIG. 14B shows ideal positions IPa, IPb, IPc, and IPd in the case of a three-part composition.
• When there are two subject persons, the score is high when each face is close to a different one of the ideal positions (IPa, IPb, IPc, IPd).
• FIG. 14B shows a state where the center positions CP1 and CP2 of the faces of the two persons coincide with the ideal positions IPd and IPb; in this case, the score is the full value "100".
• In the calculation, the score of (Equation 1) is obtained for the center position CP1 against the ideal position nearest to it, and likewise for the center position CP2 against the ideal position nearest to it. The average of the two scores can be used as the score in this case.
• Note that the score is calculated only when the orientations of the faces of the two persons fall within the range surrounded by the broken line T in FIG. 14A.
• When the center positions CP1 and CP2 of the two faces are both closest to the same ideal position among IPa, IPb, IPc, and IPd, it is possible not to calculate the score, because the composition is not properly divided into thirds.
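• Under the same (Equation 1) assumption (one point per pixel of distance), the two-person scoring described above might be sketched as follows; assigning each face to its nearest ideal third point, averaging the two scores, and refusing to score when both faces map to the same point are the behaviors described above, while the names are illustrative:

```python
import math

def two_person_thirds_score(cp1, cp2, ideal_points, max_score=100.0):
    """Average of the (Equation 1)-style scores of each face against
    its nearest ideal third point; returns None when both faces are
    closest to the same point (composition not divided in thirds)."""
    def nearest(cp):
        return min(ideal_points, key=lambda ip: math.dist(ip, cp))
    ip1, ip2 = nearest(cp1), nearest(cp2)
    if ip1 == ip2:
        return None
    s1 = max(0.0, max_score - math.dist(ip1, cp1))
    s2 = max(0.0, max_score - math.dist(ip2, cp2))
    return (s1 + s2) / 2
```
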
  • FIG. 14B threshold values th1, th2, and th3 are shown in accordance with the ideal position IPc for reference.
  • FIG. 15A shows an ideal inclination IA in the case of a diagonal composition.
• In the diagonal composition, an ideal inclination IA is used instead of the ideal position IP. The inclination of the diagonal across the entire angle of view of the image serves as the ideal inclination IA.
  • the score for the diagonal composition is calculated in step S207.
• FIG. 15A shows a case where the center positions CP1 and CP2 of the faces of two subject persons lie on the diagonal line; in this case, the score is the perfect score "100". In the calculation, the slope of the line through the coordinates of the center positions CP1 and CP2 is obtained and compared with the ideal inclination IA of the diagonal.
  • the inclination of the line connecting the center positions CP1 and CP2 is obtained and converted into a score as shown in FIG. 15B.
• When the slope is equal to the ideal value (1.8 in this example), the score is 100, and the score decreases as the slope departs from 1.8.
• The threshold th1 corresponds to a score of 80, the threshold th2 to a score of 60, and the threshold th3 to a score of 40, and the corresponding slope values are associated through the score. That is, the difference between the slope of the line through the center positions CP1 and CP2 and the ideal inclination IA is converted into a score (maximum 100), and the thresholds th1, th2, and th3 are applied to that score.
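• The slope-to-score conversion of FIG. 15B could be sketched as follows; the ideal slope of 1.8 follows the example above, while the linear penalty rate of 50 points per unit of slope difference is an assumed shape of the conversion:

```python
def diagonal_score(cp1, cp2, ideal_slope=1.8, max_score=100.0,
                   points_per_unit=50.0):
    """Convert the difference between the slope of the line joining
    two face centers and the ideal inclination IA into a score."""
    dx = cp2[0] - cp1[0]
    if dx == 0:
        return 0.0                      # vertical line: no diagonal fit
    slope = (cp2[1] - cp1[1]) / dx
    return max(0.0, max_score - points_per_unit * abs(slope - ideal_slope))
```
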
• In the multiple-person composition determination, when there are three or more front-facing faces, the coordinates of the ideal position are set so as to form a triangular composition around its center of gravity. For example, in the case of three people, scoring is performed based on whether the coordinates of the center position of the middle face are close to the coordinates of the ideal position.
  • a triangle TR having apexes at the center positions CP1 and CP2 of the faces U1 and U2 other than the center face U3 is considered, and the barycentric coordinate of the triangle TR is set as an ideal position IP2.
• The score is calculated by the above (Equation 1) for the center position CP3 of the middle face U3 against the ideal position IP2.
• In the case of four or more persons, the barycentric coordinates for the interior person are obtained in the same way, and the rest of the concept is the same as in the three-person composition.
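• A minimal sketch of the barycenter-based ideal position IP2 for the interior face might look like this, assuming (Equation 1)'s one-point-per-pixel rule and treating the outer face centers as the triangle's vertices; the names are illustrative:

```python
import math

def interior_ideal_position(outer_centers):
    """Barycenter of the polygon whose vertices are the face centers
    of the outer persons (ideal position IP2 for the interior face)."""
    xs = [p[0] for p in outer_centers]
    ys = [p[1] for p in outer_centers]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def interior_face_score(interior_center, outer_centers, max_score=100.0):
    """(Equation 1)-style score of the interior face center against IP2."""
    ip2 = interior_ideal_position(outer_centers)
    return max(0.0, max_score - math.dist(ip2, interior_center))
```
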
• When the control unit 30 determines in step S104 of FIG. 4 that no face image exists in the target frame, it proceeds to the landscape composition determination process in step S106.
  • An example of landscape composition determination processing is shown in FIG.
  • step S301 in FIG. 6 the control unit 30 confirms the line detection result by image analysis of the image of the target frame. That is, the detection results of various lines are confirmed by referring to the edge detection results such as luminance and color in the image by the image analysis unit 27 for the target frame. In the subsequent processing, score calculation is performed according to the type of existing line.
• When a horizontal line or a vertical line is detected, the control unit 30 proceeds from step S302 to S303 and calculates scores for the two-part composition and the three-part composition.
• When a diagonal line is detected, the control unit 30 proceeds from step S304 to S305 and calculates a score for the diagonal composition.
• When triangular lines are detected, the control unit 30 proceeds from step S306 to S307 and calculates a score for the triangular composition.
• When a curve is detected, the control unit 30 proceeds from step S308 to S309 and calculates a score for the curve composition.
• When lines converging toward a vanishing point are detected, the control unit 30 proceeds from step S310 to S311 and calculates a score for the vanishing point composition. Further, in step S312, the control unit 30 calculates a score for the impression composition.
  • FIG. 17A is an example of an image in which the horizontal line E1 is detected
  • FIG. 17B is an example of an image in which the horizontal line E2 and the vertical line E3 are detected.
• For example, such an image is matched against templates as shown in FIGS. 17C to 17H, and scored according to whether the length and position of the detected line are close to the ideal line.
  • FIG. 17C shows a horizontal ideal line IL1 as a two-part composition.
  • FIG. 17D shows a vertical ideal line IL2 as a two-part composition.
  • FIG. 17E shows an ideal line IL3 above the horizontal line as a three-part composition.
  • FIG. 17F shows an ideal line IL4 below the horizontal line as a three-part composition.
  • FIG. 17G shows an ideal line IL5 on the left side of the vertical line as a three-part composition.
  • FIG. 17H shows an ideal line IL6 on the right side of the vertical line as a three-part composition.
• In the case of FIG. 17A, the horizontal line E1 is relatively far from the ideal line IL3 of FIG. 17E, so the score is low.
• In the case of FIG. 17B, the horizontal line E2 is close to the ideal line IL4 of FIG. 17F, so a high score is obtained for the three-part composition, and the vertical line E3 is close to the ideal line IL2 of FIG. 17D, so a high score is obtained for the two-part composition.
• The condition for these score calculations is that one line is detected as a horizontal line and/or one line as a vertical line. When both are detected, the score calculated for the horizontal line and the score calculated for the vertical line may be added together, or their average may be taken.
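• One way to sketch the template matching for a single detected horizontal (or vertical) line against the two-part ideal (1/2 of the frame) and the three-part ideals (1/3 and 2/3); scoring only the line's position and using a linear penalty scale are assumptions:

```python
def line_position_score(line_pos, image_size, max_score=100.0):
    """Score a detected horizontal (or vertical) line by its position
    relative to the two-part ideal (1/2) and the three-part ideals
    (1/3, 2/3), keeping the best match."""
    ratio = line_pos / image_size
    ideals = [1 / 2, 1 / 3, 2 / 3]      # IL1/IL2 and IL3-IL6 templates
    best = min(abs(ratio - ideal) for ideal in ideals)
    return max(0.0, max_score - 600.0 * best)   # illustrative penalty
```
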
  • FIG. 18A is an example of an image in which a diagonal line E10 is detected.
  • FIG. 18B is an example of an image in which the curve E11 is detected.
  • FIG. 18C shows ideal lines IL10 and IL11 used for the diagonal composition.
  • FIG. 18D shows an ideal line IL12 used for the curve composition.
• The diagonal line E10 detected in the image is compared with the template of FIG. 18C and scored based on whether the length and position of the detected diagonal line E10 are close to the ideal line IL10 or IL11. As in the case of the two-person diagonal composition described above, the degree of approximation of the slope may also be used for scoring. Similarly, the curve E11 detected in the image is compared with the template of FIG. 18D and scored based on whether the length and position of the detected curve E11 are close to the ideal line IL12. The condition for calculating the diagonal composition score can be that one line is detected as a diagonal line; likewise, the condition for calculating the curve composition score can be that one line is detected as a curve.
  • FIG. 19A is an example of an image in which a triangular line E20 is detected.
  • FIG. 19B shows ideal lines IL20 and IL21 used for the triangular composition.
  • the triangular line E20 detected in the image is compared with the template of FIG. 19B and scored based on whether the length and position of the detected triangular line E20 are close to the ideal line IL20 or IL21. This score is calculated when a triangular line is formed by two lines.
  • FIG. 20A is an example of an image in which two lines E30 and E31 are detected.
  • FIG. 20B is a template showing ideal lines IL30 and IL31.
  • FIG. 20C shows the ideal position IPct as the center and the ideal positions IPa, IPb, IPc, and IPd as the three division points.
• Scoring is performed based on whether the intersection of the two lines E30 and E31 detected in the image, that is, the vanishing point of the lines on the image, is close to the coordinates of one of the ideal positions (IPct, IPa, IPb, IPc, IPd).
• For the impression composition determination, the control unit 30 performs, for example, a process as shown in FIG. 7. In step S320 of FIG. 7, the control unit 30 determines the hue of the target frame. For example, referring to the color analysis result of the image analysis unit 27, the top two colors (first color and second color) are confirmed, and it is determined whether they have a complementary color relationship, that is, whether the first color and the second color are a combination of colors positioned on opposite sides of the hue circle, such as "red and blue-green", "purple and yellow-green", or "yellow and blue-purple". If there is a complementary color relationship, the control unit 30 proceeds from step S321 to step S322 and calculates a complementary color score. For example, the score is obtained according to the ratio of the numbers of pixels of the first color and the second color.
  • FIG. 21A shows an example of the score according to the ratio between the first color and the second color.
• The score is 100 when the first color and the second color are at 50:50, and the score decreases as the ratio deviates from 50:50, that is, as either color becomes dominant. In other words, the score is set according to the balance between the first color and the second color.
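• The balance-to-score conversion of FIG. 21A might be sketched as follows, assuming a linear falloff from 100 at a 50:50 split (the exact curve shape in the figure is not specified, so the linear form is an assumption):

```python
def color_balance_score(first_px, second_px, max_score=100.0):
    """Score the pixel-count balance between the top two colors:
    100 at a 50:50 split, falling linearly to 0 as either color
    fully dominates."""
    total = first_px + second_px
    if total == 0:
        return 0.0
    dominant = max(first_px, second_px) / total   # 0.5 .. 1.0
    return max_score * (1.0 - dominant) / 0.5
```

The same conversion is reused for the friendly color score described next in the text.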
• In step S323 of FIG. 7, the control unit 30 determines hue, saturation, and brightness. Here, the top two colors (first color and second color) are confirmed, and it is determined whether they are in a friendly color relationship, for example, whether the first and second colors are a combination such as "red and black", "blue and black", "green and black", "blue and white", "green and white", or "red and white". If they are in a friendly color relationship, the control unit 30 proceeds from step S324 to step S325 and calculates a friendly color score. For example, the ratio of the numbers of pixels of the first color and the second color is converted into a score as in FIG. 21A, as in the complementary color case.
• In step S326 of FIG. 7, the control unit 30 performs luminance determination. For example, the degree of luminance separation is measured using a discriminant analysis method or the like, and the contrast strength is determined. In step S327, a score for contrast is calculated.
  • FIG. 21B and FIG. 21C respectively show luminance histograms for a certain image. For example, FIG. 21B shows a case of an image with a high contrast, and FIG. 21C shows an image with a low contrast.
• In the score calculation, the score quantifies whether the contrast is strong and whether the histogram is well balanced (dark and bright parts at about 50/50).
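• As an illustration of a discriminant analysis approach, the following sketch uses Otsu's between-class variance on a luminance histogram and weights it by how close the dark/bright split is to 50/50; the normalization against the maximum possible separation is an assumption, not the patent's formula:

```python
def contrast_score(histogram, max_score=100.0):
    """Quantify contrast from a luminance histogram: Otsu's
    between-class variance (a discriminant analysis method) measures
    luminance separation; the result is weighted by how close the
    dark/bright split is to a 50/50 balance."""
    total = sum(histogram)
    if total == 0:
        return 0.0
    total_sum = sum(i * h for i, h in enumerate(histogram))
    best_sep, best_w = 0.0, 0.5
    w0 = cum = 0.0
    for t, h in enumerate(histogram[:-1]):   # candidate thresholds
        w0 += h
        cum += t * h
        w1 = total - w0
        if w0 == 0 or w1 == 0:
            continue
        m0 = cum / w0                        # mean of the dark class
        m1 = (total_sum - cum) / w1          # mean of the bright class
        sep = (w0 / total) * (w1 / total) * (m0 - m1) ** 2
        if sep > best_sep:
            best_sep, best_w = sep, w0 / total
    levels = len(histogram) - 1
    strength = min(1.0, best_sep / (0.25 * levels * levels))
    balance = 1.0 - 2.0 * abs(best_w - 0.5)  # 1.0 at a 50/50 split
    return max_score * strength * balance
```
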
  • The person composition determination and the landscape composition determination have been described above, but various other examples of score calculation for person compositions and landscape compositions are conceivable.
  • In the above example, either the person composition determination or the landscape composition determination is performed depending on whether a face exists in the target frame, but the present invention is not limited to such a process. Both the person composition determination (S105) and the landscape composition determination (S106) may be executed.
  • In the above example, a human face is detected as an example of subject detection, but detection other than the human face, for example, dog or cat face detection, flower detection, dish detection, or object recognition, may also be performed, along with composition determination and score calculation according to the detection/recognition result.
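As a concrete sketch of this branching, the selection of score-calculation compositions by image content could look like the following; the function name and the exact grouping of compositions are assumptions drawn from the examples given in the description:

```python
def compositions_for_frame(has_face: bool) -> list:
    """Hypothetical sketch of steps S105/S106: choose which compositions
    to score based on the image content of the target frame. The grouping
    follows the examples in the description (person compositions when a
    face is detected, landscape compositions otherwise)."""
    if has_face:
        # person composition determination (S105)
        return ["whole_body", "hinomaru", "bust_up"]
    # landscape composition determination (S106)
    return ["three_part", "curve", "complementary_color", "friendly_color", "contrast"]
```

Extending this to pet faces, flowers, dishes, or other recognized objects would simply add further branches returning composition sets suited to those subjects.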
  • In step S108 of FIG. 4, a still image storage determination/execution process is performed.
  • That is, using the score calculated according to the composition determined as described above, the control unit 30 determines whether the processing target frame is a frame to be stored as a still image and, if so, performs control for executing the still image storage operation.
  • The processing in step S108 is shown in detail in FIG. 8.
  • The control unit 30 first confirms the scores obtained for the target frame in step S400 of FIG. 8, and selects the maximum score (the score with the largest value) from among them. Of course, if only one score was calculated, that is the max score.
  • As described above, one or more scores are obtained for the current target frame. For example, when a face is present, three scores may be calculated: a whole-body composition score, a Hinomaru composition score, and a bust-up composition score. When there is no face, two scores may be calculated: a curve composition score and a friendly color score.
  • Depending on the condition checked in step S401, the process either proceeds from step S401 to S402, or the process in FIG. 8 (step S108 in FIG. 4) ends. The comparison buffering score used in this process is hereinafter referred to as the "Buf score".
  • The Buf score holds the score to be compared with the current max score in the process of FIG. 8.
  • In step S402, it is confirmed whether the score type selected as the max score differs from the score type selected as the max score for the previous target frame.
  • Here, the type means the type of composition serving as the reference for score calculation as described above, such as the Hinomaru composition, the bust-up composition, or the whole-body composition.
  • The change in score type is confirmed because the peak of a score of a given type is determined by comparison with the Buf score, and that comparison is only meaningful while the type remains the same.
  • If the score type has changed, the control unit 30 proceeds from step S402 to S403 and clears the Buf score. The control unit 30 also performs threshold setting according to the type. This is not the selection among the threshold values th1, th2, and th3, but the setting of the threshold value itself.
  • When the threshold value th1 is used (see step S102 in FIG. 4), for example, the value of th1 may be changed between the score of the three-part composition and the score of the curve composition. If the score type has not changed, this threshold setting need not be performed.
  • In step S404, the control unit 30 compares the max score with the threshold value.
  • For example, when the threshold value th1 is set to be used in step S102 of FIG. 4, the max score is compared with th1. If the max score does not exceed the threshold, it is determined that the current target frame does not satisfy the threshold condition, and the control unit 30 clears the Buf score in step S405 and ends the process of FIG. 8. This is the case where the current target frame has low relevance to any composition for which a score was calculated (it does not reach the degree of relevance set by the threshold), and the frame is judged to rate relatively poorly as any composition.
  • In step S406, the control unit 30 compares the max score with the Buf score for peak determination. If max score < Buf score is not satisfied, the value of the max score is substituted into the Buf score in step S408, and the processing of FIG. 8 ends. If the max score is smaller than the Buf score, it is determined that the score (max score) for a certain composition of the current target frame has just passed its peak; that is, the current frame has a score near the peak and satisfies the peak-vicinity condition.
  • For example, assume that the score of the bust-up composition has continuously been the max score over consecutive target frames.
  • Suppose the max score of six successive target frames is the bust-up composition score, and its value has changed as 85 → 86 → 86 → 90 → 92 → 91.
  • In this case, the Buf score is updated as 85 → 86 → 86 → 90 → 92, and in the process of FIG. 8 for the sixth target frame, max score (91) < Buf score (92) holds. Therefore, it can be estimated that the sixth target frame has a score immediately after the peak, and this target frame is determined to satisfy the peak-vicinity condition.
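The score-type check (S402/S403), threshold check (S404/S405), and peak-vicinity check (S406/S408) can be sketched as follows, omitting the focus and motion checks of S407 to S410. Applied to the 85, 86, 86, 90, 92, 91 example, it flags the sixth frame:

```python
def detect_peak_frames(frames, threshold):
    """Minimal sketch of the peak-vicinity logic of FIG. 8 (steps S400-S406,
    S408), omitting the focus/motion checks of S407-S410.
    `frames` is a sequence of (score_type, max_score) per target frame.
    Returns the indices of frames judged to be just past a score peak."""
    buf_score = 0.0
    prev_type = None
    peaks = []
    for i, (score_type, max_score) in enumerate(frames):
        if score_type != prev_type:      # S402/S403: type changed -> clear Buf
            buf_score = 0.0
            prev_type = score_type
        if max_score < threshold:        # S404/S405: threshold condition failed
            buf_score = 0.0
            continue
        if max_score < buf_score:        # S406: score just past its peak
            peaks.append(i)
        buf_score = max_score            # S408: update the comparison buffer
    return peaks
```

Updating the buffer even on a flagged frame mirrors the behavior noted later in the text, where a frame slightly off the peak can still be stored after an earlier candidate fails the focus or motion checks.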
  • In step S407, the control unit 30 determines whether the subject of interest is in focus in the current target frame. For example, the control unit 30 can recognize the focus state of the subject of interest extracted by the image analysis unit 27, such as a person or a face, by confirming the processing information of the focus processing unit 28. If the subject is not in focus, the value of the max score is substituted into the Buf score in step S408, and the process of FIG. 8 ends. That is, an out-of-focus frame is judged unsuitable for still image storage.
  • If the subject is in focus, the control unit 30 determines in step S409 whether the subject of interest is a moving subject (dynamic subject). Whether the subject is dynamic can be determined by the image analysis unit 27 analyzing the positional variation of the main subject across successive frame images.
  • Here, a dynamic subject does not mean a subject that could move, such as a person, an animal, a car, or a train, but a subject that is actually moving at the time the frame is captured. That is, it is determined whether the frame captures a subject in motion.
  • If the subject of interest is not a dynamic subject, the control unit 30 proceeds from step S409 to step S411 and performs control to store the image of the current target frame as a still image. That is, the control unit 30 designates the image data of the current frame to the digital signal processing unit 20, has predetermined encoding processing, necessary resolution conversion, and the like executed, and causes the storage unit 15 to store the still image data in the recording medium.
  • If the subject of interest is a dynamic subject, the control unit 30 proceeds from step S409 to step S410 and determines whether the target frame was captured with the shutter speed of the imager 12 faster than 1/(focal length) seconds. If the shutter speed is faster than 1/(focal length) seconds, there is a high possibility that the dynamic subject was captured without blurring, so the control unit 30 proceeds to step S411 and performs control to store the current target frame as a still image.
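The 1/(focal length) shutter-speed criterion of step S410 can be sketched as below; treating the focal length as a 35 mm equivalent value, and the function name, are assumptions:

```python
def fast_enough_for_moving_subject(shutter_speed_s: float, focal_length_mm: float) -> bool:
    """Sketch of the step S410 check: judge that a dynamic subject is
    unlikely to be blurred when the exposure time is shorter than
    1/(focal length) seconds. Assumes `focal_length_mm` is the
    35 mm equivalent focal length."""
    return shutter_speed_s < 1.0 / focal_length_mm
```

For instance, at a 100 mm equivalent focal length, a 1/250 s exposure passes the check while a 1/30 s exposure does not.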
  • If the shutter speed condition is not satisfied, the control unit 30 substitutes the value of the max score into the Buf score in step S408, and the process of FIG. 8 ends.
  • Through the above processing, when the max score satisfies the threshold condition and is estimated to be near its peak, the subject of interest is in focus, and it is estimated that there is no motion blur, the image is automatically stored as a still image.
  • Even when a frame is not stored, the Buf score is updated in step S408. For this reason, a frame that does satisfy the conditions of steps S406, S407, S409, or S410 may be stored as a still image with a score slightly different from the peak value, within the range satisfying the threshold condition.
  • The idea is to store as a still image a frame close to the peak value, chosen from frames at or after the peak that have no focus blur or motion blur. Alternatively, depending on the determination results of steps S407, S409, and S410, the update of the Buf score in step S408 may be skipped; in that way, only images actually near the peak value become targets for still image storage.
  • Next, the composition display control in step S107 in FIG. 4 will be described.
  • In step S107, the control unit 30 uses the function of the display control unit 30d to execute a display that guides the user so that, when automatic still image storage is performed as described above, a still image with the best possible composition is stored.
  • FIGS. 22 and 23 are examples in which composition information (center position CP and threshold information) is superimposed and displayed on a through image.
  • FIG. 22A shows a case where the threshold th1 is selected; a frame Wth1 indicating the threshold th1 with respect to the ideal position IP is displayed.
  • FIG. 22B shows a case where the threshold th2 is selected.
  • In this case, a frame Wth2 indicating the threshold th2 with respect to the ideal position IP is displayed.
  • FIG. 22C shows a case where the threshold th3 is selected; similarly, a frame Wth3 indicating the threshold th3 with respect to the ideal position IP is displayed. In FIGS. 22A, 22B, and 22C, a human face is detected in each through image, and a frame WF indicating the face area and the center position CP of the face are superimposed on the through image.
  • FIG. 23A shows a case where the threshold th1 is selected; the ideal position IP and a line Lth1 indicating the threshold th1 for the bust-up composition are displayed.
  • FIG. 23B shows an example in which the ideal position IP and a line Lth2 indicating the threshold th2 are displayed when the threshold th2 is selected.
  • FIG. 23C shows an example in which the ideal position IP and a line Lth3 indicating the threshold th3 are displayed when the threshold th3 is selected.
  • a human face is detected on each through image, and the face frame WF indicating the face area and the center position CP of the face are superimposed and displayed on the through image.
  • That is, for the current target frame, the control unit 30 instructs the display data generation unit 26 to display the ideal position IP, threshold information (frame Wth (Wth1, Wth2, Wth3) or line Lth (Lth1, Lth2, Lth3)), the face frame WF, the center position CP, and so on, and an image as shown is displayed. Such a display can guide the user appropriately. For example, the user can adjust the imaging direction of the imaging apparatus 10 so that the center position CP of the face coincides with the ideal position IP, or at least falls within the frame Wth or the line Lth, which increases the possibility that still image storage is performed with a good composition.
  • Since the ideal position IP varies depending on the composition, various methods are conceivable for deciding which composition type's ideal position IP to indicate at the time of display. For example, at the time of step S107, one of the compositions whose scores were calculated in the immediately preceding step S105 or S106 may be selected, and its ideal position IP and threshold information (Wth, Lth) displayed.
  • The composition having the highest score may be selected as the composition type.
  • Alternatively, the user may be allowed to specify the composition type, and the ideal position IP and threshold information for the specified composition may be displayed.
  • the composition type may be selected according to the image content. For example, if the face image is in landscape orientation, the ideal position IP of the three-part composition is displayed.
  • For the threshold information (Wth, Lth), the color of the line or of the region corresponding to the threshold condition may be changed depending on which of the thresholds th1, th2, and th3 is selected.
  • FIGS. 24A and 24B are examples of displaying the current score and the threshold condition.
  • In these examples, the display unit 34 displays a maximum level 60, a threshold level 61, and a score bar 62 superimposed on the through image. That is, the score obtained for the current frame is expressed by the score bar 62. Since the score display corresponds to each frame of the through image, the length of the score bar 62 varies with each frame's score. For example, in FIG. 24A, the center position CP is within the range of the frame Wth1 and close to the ideal position IP, so the score is high and the score bar 62 extends beyond the threshold level 61. In FIG. 24B, the center position CP is outside the range of the frame Wth1 and far from the ideal position IP, so the score is low and the score bar 62 is short.
  • By displaying the score bar 62 according to the score, together with the maximum level 60 and the threshold level 61, the user can recognize the degree to which the current frame corresponds to a certain composition. If the user adjusts the framing of the subject so that the score bar 62 extends, in particular so that it at least exceeds the threshold level 61, the possibility that automatic still image storage is executed can be increased. Further, by bringing the score close to the maximum, a still image with as good a composition as possible can be stored. The score bar 62 can also change color according to its length, making the score easier for the user to recognize.
  • Note that the position of the threshold level 61 rises and falls depending on which of the thresholds th1, th2, and th3 is selected.
  • The score bar 62 need not be superimposed on the through image; it may be displayed in a separate area divided from the through image, for example at the screen edge.
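The score bar rendering of FIGS. 24A and 24B could be sketched as follows; the pixel width, the color steps, and the function name are assumptions, since the description only states that the bar length follows the score and that the color may change with length:

```python
def score_bar_geometry(score, threshold, max_score=100, bar_max_px=200):
    """Hypothetical sketch of the score bar display (FIGS. 24A/24B):
    maps the current frame's score to a bar length in pixels, positions
    the threshold level 61 on the bar, and picks a bar color by length
    (the specific color steps are assumptions)."""
    length = round(bar_max_px * min(score, max_score) / max_score)
    threshold_px = round(bar_max_px * threshold / max_score)
    if score >= threshold:
        color = "green"        # threshold condition satisfied
    elif score >= threshold * 0.5:
        color = "yellow"       # approaching the threshold
    else:
        color = "red"          # far from the threshold
    return length, threshold_px, color
```

Recomputing this per through-image frame would make the bar grow and shrink as the user reframes the subject, as described above.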
  • FIG. 24C is an example in which an enlarged image 65 is displayed. For example, when the center position CP is within the threshold range (for example, within the frame Wth1), the vicinity of the ideal position IP is enlarged and displayed as shown in the figure.
  • In this state, the user adjusts the imaging direction so that the center position CP coincides with the ideal position IP.
  • the enlarged image 65 makes it easy to recognize the deviation between the center position CP and the ideal position IP by using the enlarged center position CPL and the enlarged ideal position IPL, so that the user can easily adjust the center position CP to the ideal position IP.
  • By displaying the enlarged image 65 only when the center position CP is within the threshold range, the through image is prevented from being unnecessarily hidden by the enlarged image.
  • The program according to the embodiment is a program that causes an arithmetic processing device such as a CPU (Central Processing Unit) or a DSP (Digital Signal Processor) to execute the processing described in the above embodiment. That is, the program according to the embodiment causes the arithmetic processing device to execute a threshold setting step (S1 in FIG. 2) for setting a threshold condition for image storage execution determination according to the composition, and an image storage determination step (S4) for determining, when the composition in a frame satisfies the threshold condition set in the threshold setting step, the image data corresponding to the frame as image data to be stored as a still image.
  • In addition, when the composition in a frame satisfies the threshold condition, the program may cause image data of the frame to be determined as image data to be stored as a still image according to other condition determinations (S4 to S7).
  • Further, a score calculation step (S3) may be provided that calculates, for frames that are score calculation targets among all or a part of the continuous frames, a score that is an evaluation value of the composition of the image. In the image storage determination step, when the score calculated in the score calculation step for a certain frame satisfies the threshold condition set in the threshold setting step, the image data corresponding to the frame may be determined as image data to be stored as a still image (S4).
  • With such a program, an image processing device that controls the automatic still image storage described above can be realized using an arithmetic processing device.
  • Such a program can be recorded in advance in an HDD as a recording medium built in a device such as a computer device, a ROM in a microcomputer having a CPU, or the like.
  • Alternatively, the program can be stored (recorded) temporarily or permanently in a removable recording medium such as a flexible disk, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto Optical) disk, a DVD (Digital Versatile Disc), a Blu-ray Disc (registered trademark), a magnetic disk, a semiconductor memory, or a memory card. Such a removable recording medium can be provided as so-called package software.
  • Such a program can be installed from a removable recording medium into a personal computer or the like, or downloaded from a download site via a network such as a LAN (Local Area Network) or the Internet.
  • Such a program is suitable for providing a wide range of image processing apparatuses according to the embodiments.
  • For example, by downloading the program to a personal computer, a portable information processing device, a mobile phone, a game device, a video device, a PDA (Personal Digital Assistant), or the like, that device can be made to function as the image processing apparatus of the present disclosure.
  • For example, in the computer device 70, the same processing as the processing for automatic still image storage in the image processing device 1 and the imaging device 10 described above can be executed.
  • the CPU 71 of the computer device 70 executes various processes according to a program stored in the ROM 72 or a program loaded from the storage unit 78 to the RAM 73.
  • the RAM 73 also appropriately stores data necessary for the CPU 71 to execute various processes.
  • the CPU 71, ROM 72, and RAM 73 are connected to each other via a bus 74.
  • An input / output interface 75 is also connected to the bus 74.
  • To the input/output interface 75 are connected an input unit 76 such as a keyboard and a mouse, an output unit 77 such as a display (a CRT (Cathode Ray Tube), an LCD, or an organic EL panel) and speakers, a storage unit 78 such as a hard disk, and a communication unit 79 such as a modem. The communication unit 79 performs communication processing via a network including the Internet.
  • A drive 80 is connected to the input/output interface 75 as necessary, a removable medium 81 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory is mounted as appropriate, and a computer program read from it is installed in the storage unit 78 as necessary.
  • When the processing described above is executed by software, a program constituting the software is installed from a network or a recording medium.
  • This recording medium is constituted by the removable medium 81, made of a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory on which the program is recorded, distributed to deliver the program to the user.
  • Alternatively, it is constituted by the ROM 72 in which the program is recorded, or the hard disk included in the storage unit 78, delivered to the user pre-installed in the apparatus body.
  • As described above, the image processing apparatus 1 or the imaging apparatus 10 includes a threshold setting unit (1b, 30b) that sets a threshold condition for determining whether to perform image storage according to the composition, and an image storage determination unit (1c, 30c) that determines the image data corresponding to a frame as image data to be stored when the composition in the frame satisfies the threshold condition set by the threshold setting unit.
  • In addition to the threshold condition, the image storage determination unit (1c, 30c) determines image data to be stored as a still image according to other condition determinations (S406 to S410 in FIG. 8). That is, by determining other conditions in addition to the composition, a frame with as good a quality as possible can be determined automatically.
  • Further, a score calculation unit (1a, 30a) is provided that takes all or a part of the continuous frames as score calculation target frames and calculates, for each such frame, a score as an evaluation value of the composition of the image.
  • When the score calculated for a certain frame satisfies the threshold condition, the image storage determination unit (1c, 30c) determines the image data corresponding to the frame as image data to be stored as a still image. That is, the score calculation unit (1a, 30a) determines whether one frame of image data corresponds to a preferable composition and obtains a score indicating the degree of correspondence.
  • the score is a value indicating how close the image of the frame is to the ideal state of the composition as a score calculation reference.
  • Moreover, since the adjustment of the still image storage frequency by the threshold condition is not a time-based frequency adjustment, opportunities for capturing still images are not reduced unnecessarily.
  • In the embodiment, scores are calculated using a plurality of compositions, such as the Hinomaru composition and the bust-up composition, as references for score calculation (FIGS. 5, 6, and 7). Since a variety of compositions are preferable for still images, scores are obtained using a plurality of compositions as references. Thus, for each frame, if the image data is highly relevant to any one of the compositions, the possibility of being stored as a still image increases. That is, still images corresponding to any of various preferable compositions can be stored. This also prevents still image storage opportunities from being lost by using only a specific composition as a reference.
  • In the embodiment, one or a plurality of compositions to be used for score calculation are selected from the plurality of compositions according to the image content of the frame, and a score is calculated for each of the selected compositions.
  • A desirable or suitable composition depends on the subject: for example, there are compositions suitable for a person as the subject and compositions suitable for a landscape as the subject. Therefore, by selecting the compositions used for score calculation according to the image content of the frame, evaluation values based on compositions suitable for the image content are obtained. Accordingly, the possibility that a frame with a composition desirable for the type of subject is stored as a still image can be increased.
  • In the embodiment, it is determined whether the maximum value among the scores calculated based on the plurality of compositions (the max score selected in step S400 in FIG. 8) satisfies the threshold (S404). If at least one score satisfies the threshold condition, the frame can be judged to correspond to some composition. Therefore, determining the threshold condition with the maximum-value score makes it possible to judge whether the image is appropriate to keep as a still image. When a plurality of scores are calculated, this allows composition suitability to be determined accurately with simple processing.
  • In the embodiment, as a condition determination other than the threshold condition, it is determined whether the score of the target frame is in the vicinity of the peak of the score value fluctuating over a plurality of consecutive frames (S406). That is, a frame whose score satisfies the threshold condition is not simply stored as a still image; a frame whose score is near the peak value is selected. For example, in a continuous frame period, if frames were stored as still images merely for satisfying the threshold condition, many similar images would be stored. By selecting a frame near the peak value, an image more suitable as the composition can be stored as a still image. This improves the quality of the stored still images and prevents a large number of similar images from being stored.
  • Even when the threshold value is low, determining the peak makes it possible to store still images with as good a composition as possible while increasing imaging opportunities.
  • As a modification, the target frame immediately before the frame in which max score < Buf score holds in step S406 in FIG. 8 may be treated as the peak score frame, and that frame may be stored as a still image after the other conditions are determined. In that case, the frame with the highest score can be stored as a still image.
  • In the embodiment, as a condition determination other than the threshold condition, it is determined whether the image of the target frame, for example the image of the subject of interest, is in focus (S407). That is, a frame whose score satisfies the threshold condition but whose image is out of focus is not stored as a still image. Thereby, the stored still images are focused (unblurred) images, and the quality of the stored still images improves.
  • In the embodiment, as a condition determination other than the threshold condition, it is determined whether the target frame includes an image of the subject of interest in the process of moving (S409). If the subject is not moving, the condition for storing the still image is satisfied.
  • If the subject of interest is moving, the shutter speed at the time of capturing the frame is determined; specifically, it is determined whether the image was captured with a shutter speed equal to or faster than the speed obtained from the subject conditions (S410). The image is then stored as a still image when it can be judged from the shutter speed that there is no blur. In an image captured at such a shutter speed, the possibility of blur is low, so a high-quality still image is stored even for a moving subject.
  • In the embodiment, the threshold setting unit variably sets the threshold condition (th1, th2, th3) according to an operation input. Thereby, automatic still image storage is performed at a frequency according to the user's intention.
  • In the embodiment, after image data corresponding to a certain frame is determined to be stored, frames during a waiting time corresponding to the set threshold are not determined as image data to be stored as still images (S111). By making the standby time correspond to the threshold condition, the standby time also corresponds to the frequency implied by the threshold condition. If the standby time were too long, opportunities for storing still images would be excessively impaired, and the storage frequency setting would be affected; therefore, a relatively short time is preferable.
  • In the embodiment, image data of frames continuous on the time axis (for example, a through image) is displayed on the display screen, and composition information (for example, the ideal position IP and threshold information) is superimposed on the image data (FIGS. 22 and 23). That is, for the captured image, the degree to which the image content corresponds to a specific composition is displayed.
  • Thereby, the user can easily determine whether the composition being imaged is a good composition.
  • the threshold condition for example, threshold bar 62, threshold level 61
  • an enlarged image 65 serving as a guide for matching the image content with a specific composition is displayed (FIG. 24C).
  • The enlarged image guides the user so that the center of the main subject can easily be brought to the ideal position IP of the composition, making it easy to adjust the angle of view and the imaging direction during imaging and to obtain a frame with a high score.
  • These display controls are suitable as a guide even when automatic still image storage is not performed, that is, when still image storage is performed by a user's shutter operation. Therefore, even for still image storage by shutter operation, it is useful to calculate composition scores and present composition information and score information.
  • The automatic still image storage operation in the embodiment is a process of determining, from frames of a moving image, a frame having a composition suitable as a still image and storing the image data corresponding to that frame, but automatic storage as a moving image is also possible.
  • For example, when a frame having a good composition is determined, a moving image of a predetermined duration can be automatically stored starting from that frame. The stored moving image may consist of intermittent frames at a low frame rate, or may be stored as pseudo-moving-image data with further thinned frames.
  • Alternatively, when a frame having a good composition, for example a frame satisfying the threshold condition, is determined, the continuous frames up to immediately before a frame that no longer satisfies the threshold condition may be stored as moving image data. That is, a moving image is stored while the composition condition, namely the threshold condition, is satisfied.
  • the process as automatic image storage can be simplified.
  • This technique can also take the following configurations.
  • (1) An image processing apparatus comprising: a threshold setting unit that sets a threshold condition for determining whether to execute image storage according to the composition; and an image storage determination unit that, when the composition in a frame satisfies the threshold condition set by the threshold setting unit, determines the image data corresponding to the frame as image data to be stored.
  • (2) The image processing apparatus according to (1), wherein the image storage determination unit determines image data of the frame to be stored according to another condition determination when the composition in the frame satisfies the threshold condition set by the threshold setting unit.
  • (3) The image processing apparatus according to (1) or (2), further comprising a score calculation unit that takes all or a part of the continuous frames as score calculation target frames and calculates, for each such frame, a score that is an evaluation value of the composition of the image, wherein the image storage determination unit determines, when the score calculated by the score calculation unit for a certain frame satisfies the threshold condition set by the threshold setting unit, the image data corresponding to the frame as image data to be stored.
  • (4) The image processing apparatus according to (3), wherein the score calculation unit is capable of calculating a score with each of a plurality of compositions as a reference.
  • (5) The image processing apparatus according to (4), wherein the score calculation unit selects one or a plurality of compositions as references for score calculation according to the image content of the frame, and calculates a score for each of the selected compositions.
  • (6) The image processing apparatus according to (4) or (5), wherein the image storage determination unit determines whether the maximum value among the scores calculated by the score calculation unit for a certain frame based on the plurality of compositions satisfies the threshold set by the threshold setting unit.
  • (7) The image processing apparatus according to (2), wherein, as the other condition determination, the image storage determination unit determines whether the score of the target frame is near the peak of the score value fluctuating over a plurality of consecutive frames.
  • (8) The image processing apparatus according to (2) or (7), wherein, as the other condition determination, the image storage determination unit determines whether the image of the target frame is in focus.
  • (9) The image processing apparatus according to any one of (2), (7), and (8), wherein, as the other condition determination, the image storage determination unit determines whether the target frame includes an image of the subject of interest in the process of moving.
  • (10) The image processing apparatus according to any one of (2), (7), (8), and (9), wherein, as the other condition determination, the image storage determination unit determines, when the target frame includes an image of the subject of interest in the process of moving, the shutter speed at the time of capturing the frame.
  • (11) The image processing apparatus according to any one of (1) to (10), wherein the threshold setting unit variably sets the threshold condition according to an operation input.
  • (12) The image processing apparatus according to any one of (1) to (11), wherein, after the image storage determination unit determines that the image data corresponding to a certain frame is image data to be stored, the image storage determination unit does not determine image data corresponding to frames within a waiting time corresponding to the threshold set by the threshold setting unit as image data to be stored.
  • (13) The image processing apparatus according to any one of (1) to (12), further comprising a display control unit that displays, on a display screen, image data of frames that are consecutive on the time axis, together with a superimposed display on that image data.
  • (14) The image processing apparatus according to any one of (1) to (13), further comprising a display control unit that displays, on a display screen, image data of frames that are consecutive on the time axis, and displays an enlarged image that serves as a guide for matching the image content to a specific composition.
  • (15) An image processing method comprising: a threshold setting step of setting a threshold condition used to determine whether to execute image storage according to composition; and an image storage determination step of determining, when the composition in a frame satisfies the threshold condition set in the threshold setting step, the image data corresponding to that frame as image data to be stored.
  • (16) A program that causes an information processing apparatus to execute: a threshold setting step of setting a threshold condition used to determine whether to execute image storage according to composition; and an image storage determination step of determining, when the composition in a frame satisfies the threshold condition set in the threshold setting step, the image data corresponding to that frame as image data to be stored.
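The storage-decision items above (a score per candidate composition, the maximum over the compositions, the threshold test, and the waiting time after a store) can be sketched in code. This is an illustrative sketch only, not the patented implementation; the names `frames_to_store`, `threshold`, and `wait_frames`, and the per-frame score lists, are hypothetical.

```python
# Illustrative sketch only: all names are hypothetical, not taken from
# the publication.

def frames_to_store(composition_scores, threshold, wait_frames):
    """Decide which frames to store.

    composition_scores: one list of scores per frame, one score per
    candidate composition. A frame is stored when the maximum score over
    its candidate compositions meets the threshold; frames inside the
    waiting time after a store are skipped.
    """
    stored = []
    skip_until = -1  # index of the last frame covered by the waiting time
    for i, scores in enumerate(composition_scores):
        if i <= skip_until:
            continue  # still inside the waiting time after the last store
        if max(scores) >= threshold:  # best-matching composition decides
            stored.append(i)
            skip_until = i + wait_frames
    return stored

print(frames_to_store(
    [[0.2, 0.4], [0.9, 0.5], [0.95, 0.1], [0.3, 0.2], [0.92, 0.8]],
    threshold=0.9, wait_frames=2))  # -> [1, 4]
```

Note how frame 2 scores above the threshold but is skipped: it falls inside the waiting time opened by storing frame 1.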

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The object of the invention is to make it possible to store, at an appropriate frequency, images having desirable compositions during an automatic image storage operation. A threshold condition used to determine whether to execute automatic image storage is set. Then, a degree of matching with a composition is obtained for each of some or all of the frames occurring consecutively on the time axis as image data. If the composition of a frame satisfies the set threshold condition, the image data corresponding to that frame is determined as image data to be stored.
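The "degree of matching with a composition" in the abstract can be read as a per-frame score against a reference composition. A minimal sketch, assuming a rule-of-thirds reference and an exponential falloff with distance from the nearest intersection (both assumptions, not taken from the publication):

```python
import math

# Hypothetical reference: the four rule-of-thirds intersections, in
# normalized image coordinates (the publication does not specify which
# compositions are used).
THIRDS_POINTS = [(1/3, 1/3), (2/3, 1/3), (1/3, 2/3), (2/3, 2/3)]

def composition_match(subject_xy):
    """Score in (0, 1]: 1.0 when the detected subject sits exactly on a
    rule-of-thirds intersection, decaying with distance from the nearest one."""
    x, y = subject_xy
    d = min(math.hypot(x - px, y - py) for px, py in THIRDS_POINTS)
    return math.exp(-8.0 * d)  # falloff rate chosen for illustration only

print(round(composition_match((1/3, 1/3)), 2))  # subject on an intersection -> 1.0
```

A centered subject scores lower than one on an intersection, so a threshold on this score favors frames whose main subject approaches the reference composition.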
PCT/JP2016/060689 2015-06-08 2016-03-31 Image processing apparatus, image processing method, and program Ceased WO2016199483A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
DE112016002564.5T DE112016002564T5 (de) 2015-06-08 2016-03-31 Image processing device, image processing method, and program
JP2017523136A JP6729572B2 (ja) 2015-06-08 2016-03-31 Image processing apparatus, image processing method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015115750 2015-06-08
JP2015-115750 2015-06-08

Publications (1)

Publication Number Publication Date
WO2016199483A1 (fr)

Family

ID=57504730

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/060689 Ceased WO2016199483A1 (fr) 2015-06-08 2016-03-31 Image processing apparatus, image processing method, and program

Country Status (3)

Country Link
JP (1) JP6729572B2 (fr)
DE (1) DE112016002564T5 (fr)
WO (1) WO2016199483A1 (fr)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021171685A1 (fr) * 2020-02-28 2021-09-02 富士フイルム株式会社 Controller for digital camera equipped with printer, method for operating controller for digital camera equipped with printer, and program for operating controller for digital camera equipped with printer

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06235952A (ja) * 1993-02-12 1994-08-23 Nikon Corp 撮像装置
JP2007306464A (ja) * 2006-05-15 2007-11-22 Fujifilm Corp 撮影制御方法および装置ならびにプログラム
JP2008289189A (ja) * 2008-08-15 2008-11-27 Sony Corp 撮像装置および表情評価装置
JP2009081502A (ja) * 2007-09-25 2009-04-16 Fujifilm Corp 撮影装置及び画像再生装置
JP2009225103A (ja) * 2008-03-17 2009-10-01 Nikon Corp カメラ
JP2011030164A (ja) * 2009-07-29 2011-02-10 Sony Corp 撮像装置、撮像システム、撮像方法、プログラム
JP2011097194A (ja) * 2009-10-27 2011-05-12 Canon Inc 撮像装置、撮像装置の制御方法およびプログラム
JP2012186670A (ja) * 2011-03-07 2012-09-27 Ricoh Co Ltd 撮像装置と撮像方法並びに撮像プログラム
JP2012231327A (ja) * 2011-04-26 2012-11-22 Canon Inc 撮像装置、撮像方法及びプログラム
JP2013195704A (ja) * 2012-03-19 2013-09-30 Casio Comput Co Ltd 撮像装置、撮像方法、及びプログラム

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4844657B2 (ja) * 2009-07-31 2011-12-28 カシオ計算機株式会社 Image processing apparatus and method


Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019124055A1 (fr) * 2017-12-18 2019-06-27 キヤノン株式会社 Image capture device, control method therefor, program, and storage medium
JP2019110525A (ja) * 2017-12-18 2019-07-04 キヤノン株式会社 Imaging apparatus, control method therefor, program, and storage medium
US11303802B2 (en) 2017-12-18 2022-04-12 Canon Kabushiki Kaisha Image capturing apparatus, control method therefor, and storage medium
JP7403218B2 (ja) 2017-12-18 2023-12-22 キヤノン株式会社 Imaging apparatus, control method therefor, program, and storage medium
JP2019186791A (ja) * 2018-04-12 2019-10-24 シャープ株式会社 Imaging apparatus, control method of imaging apparatus, and control program
JP2021082972A (ja) * 2019-11-20 2021-05-27 株式会社エクサウィザーズ Imaging device, information processing device, method, and program
JP2021083073A (ja) * 2020-07-03 2021-05-27 株式会社エクサウィザーズ Imaging device, information processing device, method, and program
JP2022172705A (ja) * 2021-05-06 2022-11-17 キヤノン株式会社 Imaging control device, imaging system, imaging control method, and program
US12317003B2 (en) 2021-05-06 2025-05-27 Canon Kabushiki Kaisha Imaging control apparatus, server apparatus, imaging control method, and non-transitory computer-readable storage medium
WO2024189836A1 (fr) * 2023-03-15 2024-09-19 日本電気株式会社 Extraction device
JP7492793B1 (ja) 2023-05-18 2024-05-30 深圳市宗匠科技有限公司 Electronic device, and skin analysis method and apparatus therefor
JP2024166053A (ja) * 2023-05-18 2024-11-28 深圳市宗匠科技有限公司 Electronic device, and skin analysis method and apparatus therefor

Also Published As

Publication number Publication date
JP6729572B2 (ja) 2020-07-22
JPWO2016199483A1 (ja) 2018-03-29
DE112016002564T5 (de) 2018-03-22

Similar Documents

Publication Publication Date Title
JP6729572B2 (ja) Image processing apparatus, image processing method, and program
US12276899B2 (en) Image pickup device and method of tracking subject thereof
US11012614B2 (en) Image processing device, image processing method, and program
JP5867424B2 (ja) Image processing apparatus, image processing method, and program
US10848662B2 (en) Image processing device and associated methodology for determining a main subject in an image
JP5251215B2 (ja) Digital camera
CN101931752B (zh) Imaging device and focusing method
US20150070526A1 (en) Display control device, display control method, and program
US10455154B2 (en) Image processing device, image processing method, and program including stable image estimation and main subject determination
KR102303514B1 (ko) Information processing apparatus, information processing method, and program
JP5987306B2 (ja) Image processing apparatus, image processing method, and program
JP5331128B2 (ja) Imaging apparatus
US20180232943A1 (en) System and method for generating a virtual viewpoint apparatus
JP2011139282A (ja) Image processing apparatus, imaging apparatus, image processing method, and program
US10275917B2 (en) Image processing apparatus, image processing method, and computer-readable recording medium
US9942460B2 (en) Image processing device, image processing method, and program
US20100188520A1 (en) Imaging device and storage medium storing program
JP2010199694A (ja) Imaging apparatus, angle-of-view adjustment method, and program
JP2014123908A (ja) Image processing apparatus, image cropping method, and program
JP6024135B2 (ja) Subject tracking display control device, subject tracking display control method, and program
JP5533241B2 (ja) Moving image playback device, moving image playback method, and program
US11595565B2 (en) Image capturing apparatus, method for controlling the same, and recording medium for automatic image capturing of a subject
JP2017050593A (ja) Imaging apparatus, imaging control method, and program
CN114342350B (zh) Imaging control device, imaging control method, program, and imaging apparatus
JP2009098850A (ja) Arithmetic device and program therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16807190

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017523136

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 112016002564

Country of ref document: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16807190

Country of ref document: EP

Kind code of ref document: A1