
WO2018185940A1 - Imaging control device, imaging device, imaging system, moving body, imaging control method, and program - Google Patents

Info

Publication number: WO2018185940A1 (PCT/JP2017/014560)
Authority: WO (WIPO, PCT)
Prior art keywords: imaging, lens, spatial frequency, frequency band, image
Legal status: Ceased (the legal status is an assumption and is not a legal conclusion)
Application number: PCT/JP2017/014560
Other languages: English (en), Japanese (ja)
Inventors: 明 邵, 本庄 謙一
Current Assignee: SZ DJI Technology Co Ltd
Original Assignee: SZ DJI Technology Co Ltd
Application filed by SZ DJI Technology Co Ltd
Priority: PCT/JP2017/014560 (WO2018185940A1) and JP2017559465A (JP6503607B2)

Classifications

    • G02B7/28 Systems for automatic generation of focusing signals (G02B7/00 Mountings, adjusting means, or light-tight connections, for optical elements)
    • G02B7/34 Systems for automatic generation of focusing signals using different areas in a pupil plane
    • G02B7/36 Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G03B13/36 Autofocus systems (G03B13/00 Viewfinders; focusing aids for cameras; means for focusing for cameras; autofocus systems for cameras)
    • H04N23/60 Control of cameras or camera modules (H04N23/00 Cameras or camera modules comprising electronic image sensors; control thereof)
    • B64U10/13 Flying platforms (B64U10/10 Rotorcrafts)
    • B64U2101/30 UAVs specially adapted for imaging, photography or videography

Definitions

  • the present invention relates to an imaging control device, an imaging device, an imaging system, a moving body, an imaging control method, and a program.
  • [Patent Document 1] Japanese Patent No. 5932476
  • in the AF processing using the blur amount of the image, at least two images must be captured, so the AF processing time may become long.
  • the accuracy of the AF process may be reduced when a specific frequency component such as a high frequency component is included in the spatial frequency of the image.
  • the imaging control apparatus may include a derivation unit that derives a spatial frequency band of a captured image.
  • the imaging control device may include a control unit that, based on the spatial frequency band of the image derived by the derivation unit, controls the positional relationship between the imaging surface and the lens by selecting either a first AF process, which executes autofocus processing based on the blur amounts of a plurality of images captured in states where the positional relationship between the imaging surface and the lens differs, or a second AF process, which executes autofocus processing by a phase difference method, or by combining both.
  • the control unit may control the positional relationship between the imaging surface and the lens by selecting either the first AF process or the second AF process, or by combining both, based on the spatial frequency band of the image derived by the derivation unit and the spatial frequency band predetermined for the second AF process.
  • the control unit may control the positional relationship between the imaging surface and the lens by selecting either the first AF process or the second AF process, or by combining both, based on the ratio of the image's spatial frequency band, derived by the derivation unit, that falls within the predetermined spatial frequency band.
  • the control unit may control the positional relationship between the imaging surface and the lens by selecting the second AF process when the ratio is equal to or greater than the threshold and selecting the first AF process when the ratio is smaller than the threshold.
  • the control unit may control the positional relationship between the imaging surface and the lens by selecting the second AF process when the ratio is equal to or greater than a first threshold, combining the first AF process and the second AF process when the ratio is smaller than the first threshold and equal to or greater than a second threshold, and selecting the first AF process when the ratio is smaller than the second threshold.
  • the control unit may control the positional relationship between the imaging surface and the lens by comparing the spatial frequency band of the image derived by the derivation unit with a predetermined spatial frequency band that can be reproduced by a plurality of pixels that generate a phase difference detection signal, and selecting either the first AF process or the second AF process, or combining both.
  • the second AF process may be executed based on a phase difference detection signal output from an image sensor in which the plurality of pixels that generate the phase difference detection signal and a plurality of other pixels that generate color component signals are arranged in a predetermined arrangement pattern.
  • An imaging device may include the imaging control device.
  • the imaging device may include an image sensor having an imaging surface.
  • the imaging device may include a lens.
  • An imaging system may include the imaging device.
  • the imaging system may include a support mechanism that supports the imaging device.
  • the moving body according to one embodiment of the present invention may move by mounting the imaging system.
  • the imaging control method may include a step of deriving a spatial frequency band of a captured image.
  • the imaging control method may include a step of controlling the positional relationship between the imaging surface and the lens by selecting either a first AF process, which executes autofocus processing based on the blur amounts of a plurality of images captured in states where the positional relationship between the imaging surface and the lens differs, or a second AF process, which executes autofocus processing by the phase difference method, or by combining both, based on the derived spatial frequency band of the image.
  • the program according to an aspect of the present invention may cause a computer to execute a step of deriving a spatial frequency band of a captured image.
  • the program may cause the computer to execute a step of controlling the positional relationship between the imaging surface and the lens by selecting either a first AF process, which executes autofocus processing based on the blur amounts of a plurality of images captured in states where the positional relationship between the imaging surface and the lens differs, or a second AF process, which executes autofocus processing by the phase difference method, or by combining both, based on the derived spatial frequency band of the image.
  • in the flowcharts and block diagrams in this specification, a block may represent (1) a stage of a process in which an operation is performed or (2) a "unit" of an apparatus having a role of performing the operation.
  • certain stages and "units" may be implemented by programmable circuits and/or processors.
  • dedicated circuitry may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits.
  • programmable circuits may include reconfigurable hardware circuits.
  • reconfigurable hardware circuits may include logic operations such as logical AND, logical OR, logical XOR, logical NAND, and logical NOR, as well as memory elements such as flip-flops, registers, field programmable gate arrays (FPGA), and programmable logic arrays (PLA).
  • the computer readable medium may include any tangible device capable of storing instructions to be executed by a suitable device.
  • a computer-readable medium having instructions stored thereon constitutes an article of manufacture that includes instructions which can be executed to create means for performing the operations specified in the flowcharts or block diagrams.
  • Examples of computer readable media may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like.
  • computer readable media may include floppy disks, diskettes, hard disks, random access memory (RAM), read only memory (ROM), erasable programmable read only memory (EPROM or flash memory), electrically erasable programmable read only memory (EEPROM), static random access memory (SRAM), compact disc read only memory (CD-ROM), digital versatile discs (DVD), Blu-ray (RTM) discs, memory sticks, integrated circuit cards, and the like.
  • the computer readable instructions may include either source code or object code written in any combination of one or more programming languages.
  • the source code or object code may include assembler instructions, instruction set architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state setting data, an object-oriented programming language such as Smalltalk, JAVA, or C++, and a conventional procedural programming language such as the "C" programming language or similar programming languages.
  • computer readable instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing device, or to a programmable circuit, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet.
  • the processor or programmable circuit may execute computer readable instructions to create a means for performing the operations specified in the flowcharts or block diagrams.
  • Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and the like.
  • FIG. 1 shows an example of the external appearance of an unmanned aerial vehicle (UAV) 10 and a remote control device 300.
  • the UAV 10 includes a UAV main body 20, a gimbal 50, a plurality of imaging devices 60, and an imaging device 100.
  • the gimbal 50 and the imaging device 100 are an example of an imaging system.
  • the UAV 10 is an example of a moving body propelled by a propulsion unit.
  • the moving body is a concept that includes, in addition to the UAV, other flying bodies that move through the air, vehicles that move on the ground, ships that move on the water, and the like.
  • the UAV main body 20 includes a plurality of rotor blades.
  • the plurality of rotor blades is an example of a propulsion unit.
  • the UAV main body 20 causes the UAV 10 to fly by controlling the rotation of a plurality of rotor blades.
  • the UAV main body 20 causes the UAV 10 to fly using four rotary wings.
  • the number of rotor blades is not limited to four.
  • the UAV 10 may be a fixed wing machine that does not have a rotating wing.
  • the imaging apparatus 100 is an imaging camera that images a subject included in a desired imaging range.
  • the gimbal 50 supports the imaging device 100 in a rotatable manner.
  • the gimbal 50 is an example of a support mechanism.
  • the gimbal 50 supports the imaging device 100 so as to be rotatable about the pitch axis using an actuator.
  • the gimbal 50 further supports the imaging device 100 using an actuator so as to be rotatable about the roll axis and the yaw axis.
  • the gimbal 50 may change the posture of the imaging device 100 by rotating the imaging device 100 about at least one of the yaw axis, the pitch axis, and the roll axis.
  • the plurality of imaging devices 60 are sensing cameras that image the surroundings of the UAV 10 in order to control the flight of the UAV 10.
  • Two imaging devices 60 may be provided on the front, that is, the nose, of the UAV 10.
  • Two other imaging devices 60 may be provided on the bottom surface of the UAV 10.
  • the two imaging devices 60 on the front side may be paired and function as a so-called stereo camera.
  • the two imaging devices 60 on the bottom side may also be paired and function as a stereo camera. Based on images picked up by a plurality of image pickup devices 60, three-dimensional spatial data around the UAV 10 may be generated.
  • the number of imaging devices 60 included in the UAV 10 is not limited to four.
  • the UAV 10 only needs to include at least one imaging device 60.
  • the UAV 10 may include at least one imaging device 60 on each of the nose, the tail, the side surface, the bottom surface, and the ceiling surface of the UAV 10.
  • the angle of view that can be set by the imaging device 60 may be wider than the angle of view that can be set by the imaging device 100.
  • the imaging device 60 may have a single focus lens or a fisheye lens.
  • the remote operation device 300 communicates with the UAV 10 to remotely operate the UAV 10.
  • the remote operation device 300 may communicate with the UAV 10 wirelessly.
  • the remote control device 300 transmits to the UAV 10 instruction information indicating various commands related to movement of the UAV 10 such as ascending, descending, accelerating, decelerating, moving forward, moving backward, and rotating.
  • the instruction information includes, for example, instruction information for raising the altitude of the UAV 10.
  • the instruction information may indicate the altitude at which the UAV 10 should be located.
  • the UAV 10 moves so as to be located at an altitude indicated by the instruction information received from the remote operation device 300.
  • the instruction information may include an ascending command that raises the UAV 10.
  • the UAV 10 rises while accepting the ascent command. Even if the UAV 10 receives the ascending command, the UAV 10 may limit the ascent when the altitude of the UAV 10 has reached the upper limit altitude.
  • FIG. 2 shows an example of functional blocks of the UAV10.
  • the UAV 10 includes a UAV control unit 30, a memory 32, a communication interface 34, a propulsion unit 40, a GPS receiver 41, an inertial measurement device 42, a magnetic compass 43, a barometric altimeter 44, a temperature sensor 45, a gimbal 50, and the imaging device 100.
  • the communication interface 34 communicates with other devices such as the remote operation device 300.
  • the communication interface 34 may receive instruction information including various commands for the UAV control unit 30 from the remote operation device 300.
  • the memory 32 stores programs and the like necessary for the UAV control unit 30 to control the propulsion unit 40, the GPS receiver 41, the inertial measurement unit (IMU) 42, the magnetic compass 43, the barometric altimeter 44, the temperature sensor 45, the gimbal 50, the imaging devices 60, and the imaging device 100.
  • the memory 32 may be a computer-readable recording medium, and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory.
  • the memory 32 may be provided inside the UAV main body 20, or may be provided so as to be removable from the UAV main body 20.
  • the UAV control unit 30 controls the flight and imaging of the UAV 10 according to a program stored in the memory 32.
  • the UAV control unit 30 may be configured by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like.
  • the UAV control unit 30 controls the flight and imaging of the UAV 10 according to a command received from the remote control device 300 via the communication interface 34.
  • the propulsion unit 40 propels the UAV 10.
  • the propulsion unit 40 includes a plurality of rotating blades and a plurality of drive motors that rotate the plurality of rotating blades.
  • the propulsion unit 40 causes the UAV 10 to fly by rotating a plurality of rotor blades via a plurality of drive motors in accordance with a command from the UAV control unit 30.
  • the GPS receiver 41 receives a plurality of signals indicating times transmitted from a plurality of GPS satellites.
  • the GPS receiver 41 calculates the position of the GPS receiver 41, that is, the position of the UAV 10 based on the received signals.
  • the IMU 42 detects the posture of the UAV 10.
  • the IMU 42 detects, as the posture of the UAV 10, accelerations in the three axial directions of front-back, left-right, and up-down, and angular velocities about the three axes of pitch, roll, and yaw.
  • the magnetic compass 43 detects the heading of the UAV 10.
  • the barometric altimeter 44 detects the altitude at which the UAV 10 flies.
  • the barometric altimeter 44 detects the atmospheric pressure around the UAV 10, converts the detected atmospheric pressure into an altitude, and detects the altitude.
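  • As an illustration of this pressure-to-altitude conversion, the following is a minimal sketch using the standard-atmosphere approximation; the formula and the sea-level reference pressure are textbook values, not taken from the patent.

```python
def pressure_to_altitude(p_hpa: float, p0_hpa: float = 1013.25) -> float:
    """Convert a measured atmospheric pressure (hPa) to an altitude in
    metres using the international standard atmosphere approximation."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** 0.1903)

# Example: about 899 hPa corresponds to roughly 1000 m above sea level.
print(round(pressure_to_altitude(898.75)))
```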
  • the temperature sensor 45 detects the temperature around the UAV 10.
  • the imaging apparatus 100 includes an imaging unit 102 and a lens unit 200.
  • the lens unit 200 is an example of a lens device.
  • the imaging unit 102 includes an image sensor 120, an imaging control unit 110, and a memory 130.
  • the image sensor 120 may be configured by a CCD or a CMOS.
  • the image sensor 120 outputs image data of an optical image formed through the plurality of lenses 210 to the imaging control unit 110.
  • the imaging control unit 110 may be configured by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like.
  • the imaging control unit 110 may control the imaging device 100 in accordance with an operation command for the imaging device 100 from the UAV control unit 30.
  • the memory 130 may be a computer-readable recording medium, and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory.
  • the memory 130 stores a program and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like.
  • the memory 130 may be provided inside the housing of the imaging device 100.
  • the memory 130 may be provided so as to be removable from the housing of the imaging apparatus 100.
  • the lens unit 200 includes a plurality of lenses 210, a lens moving mechanism 212, and a lens control unit 220.
  • the plurality of lenses 210 may function as a zoom lens, a varifocal lens, and a focus lens. At least some or all of the plurality of lenses 210 are arranged to be movable along the optical axis.
  • the lens unit 200 may be an interchangeable lens that is detachably attached to the imaging unit 102.
  • the lens moving mechanism 212 moves at least some or all of the plurality of lenses 210 along the optical axis.
  • the lens control unit 220 drives the lens moving mechanism 212 in accordance with a lens control command from the imaging unit 102 to move one or a plurality of lenses 210 along the optical axis direction.
  • the lens control command is, for example, a zoom control command and a focus control command.
  • the imaging apparatus 100 configured in this manner performs an autofocus process (AF process) and images a desired subject.
  • the imaging apparatus 100 determines the distance from the lens to the subject (subject distance) in order to execute the AF process.
  • a method for determining the subject distance there is a method of determining based on the blur amounts of a plurality of images captured in a state where the positional relationship between the lens and the imaging surface is different.
  • this method is referred to as a blur detection auto focus (BDAF) method.
  • the blur amount (Cost) of the image can be expressed by the following equation (1) using a Gaussian function, where x indicates a pixel position in the horizontal direction and σ represents the standard deviation:

    $\mathrm{Cost}(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left(-\frac{x^2}{2\sigma^2}\right) \quad (1)$
  • FIG. 3 shows an example of a curve represented by Equation (1).
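  • As a concrete illustration, the curve of FIG. 3 can be evaluated numerically. The sketch below assumes equation (1) is the normalized Gaussian reconstructed above; the function name is illustrative.

```python
import numpy as np

def blur_cost(x: np.ndarray, sigma: float) -> np.ndarray:
    """Evaluate the Gaussian blur-amount curve of equation (1):
    x is the horizontal pixel position, sigma the standard deviation."""
    return np.exp(-x**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)

# A sharply focused state (small sigma) yields a narrow, tall curve;
# a defocused state (large sigma) yields a wide, flat one.
x = np.linspace(-20.0, 20.0, 201)
print(blur_cost(x, 2.0).max(), blur_cost(x, 8.0).max())
```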
  • FIG. 4 is a flowchart illustrating an example of a distance calculation procedure of the BDAF method.
  • the imaging apparatus 100 captures the first image I₁ with the lens and the imaging surface in a first positional relationship, and stores it in the memory 130. The focus lens or the imaging surface of the image sensor 120 is then moved in the optical axis direction to put the lens and the imaging surface in a second positional relationship, and the imaging apparatus 100 captures the second image I₂ and stores it in the memory 130 (S101).
  • the focus lens or the imaging surface of the image sensor 120 is moved in the optical axis direction so as not to pass beyond the focal point. The movement amount of the focus lens or the imaging surface of the image sensor 120 may be, for example, 10 μm.
  • the imaging apparatus 100 divides the image I₁ into a plurality of regions (S102). A feature amount may be calculated for each pixel of the image I₁, and groups of pixels having similar feature amounts may each be treated as one region; alternatively, the pixel group within the range set as the AF processing frame in the image I₁ may be divided into a plurality of regions. The imaging apparatus 100 divides the image I₂ into a plurality of regions corresponding to the plurality of regions of the image I₁. The imaging apparatus 100 then calculates, for each of the plurality of regions, the distance to the object included in that region, based on the blur amount of each region of the image I₁ and the blur amount of the corresponding region of the image I₂ (S103).
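  • The bookkeeping of S102 and S103 can be sketched as follows. This is an illustrative sketch only: the patent does not fix a specific blur metric or grouping rule here, so an equal-sized grid stands in for the feature-based region division and the variance of a Laplacian response stands in for the blur amount.

```python
import numpy as np
from scipy import ndimage

def split_into_regions(image: np.ndarray, rows: int, cols: int):
    """Divide an image into a rows x cols grid of regions (cf. S102)."""
    return [np.array_split(band, cols, axis=1)
            for band in np.array_split(image, rows, axis=0)]

def region_sharpness(image: np.ndarray, rows: int = 4, cols: int = 4) -> np.ndarray:
    """Per-region stand-in for the blur amount (cf. S103): a lower variance
    of the Laplacian response indicates less fine detail, i.e. more blur."""
    grid = split_into_regions(image.astype(float), rows, cols)
    return np.array([[ndimage.laplace(cell).var() for cell in row] for row in grid])

# Comparing the same region across I1 and I2 (taken at two lens positions)
# gives the per-region change in blur that drives the distance estimate.
```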
  • the distance calculation procedure will be further described with reference to FIG.
  • let the distance from the lens L (principal point) to the object 510 (object plane) be A, the distance from the lens L (principal point) to the position (image plane) where the object 510 forms an image be B, and the focal length be F.
  • the relationship between the distance A, the distance B, and the focal length F can be expressed by the following lens equation (2):

    $\frac{1}{A} + \frac{1}{B} = \frac{1}{F} \quad (2)$
  • the focal length F is specified by the lens position. Therefore, if the distance B at which the object 510 forms an image on the imaging surface can be specified, the distance A from the lens L to the object 510 can be specified using Expression (2).
  • the distance B is specified by calculating the position at which the object 510 forms an image from the blur size (the circles of confusion 512 and 514) of the object 510 projected on the imaging surface.
  • that is, the distance A can be specified because the imaging position can be determined from the fact that the size of the blur (the blur amount) is proportional to the distance between the imaging surface and the imaging position.
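  • Rearranging equation (2) gives the subject distance directly. A minimal sketch (any consistent length unit, e.g. millimetres):

```python
def subject_distance(b: float, f: float) -> float:
    """Solve 1/A + 1/B = 1/F (equation (2)) for the subject distance A,
    given the image distance B and the focal length F."""
    if b <= f:
        raise ValueError("image distance must exceed the focal length")
    return 1.0 / (1.0 / f - 1.0 / b)

# Example: F = 50 mm and an image formed at B = 52 mm give A = 1300 mm.
print(subject_distance(b=52.0, f=50.0))
```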
  • let the distance from the lens L to the imaging surface on the side nearer the image plane, where the image I₁ is captured, be D₁, and the distance from the lens L to the imaging surface on the side farther from the image plane, where the image I₂ is captured, be D₂.
  • Each image is blurred.
  • the point spread function at this time is PSF, and the images at D₁ and D₂ are I_d1 and I_d2, respectively. The image I₁ can then be expressed by a convolution operation as the following equation (3):

    $I_1 = \mathrm{PSF} \ast I_{d1} \quad (3)$

  • the value C shown in the following equation (4) corresponds to the amount of change between the blur amounts of the images I_d1 and I_d2; that is, C corresponds to the difference between the blur amount of the image I_d1 and the blur amount of the image I_d2:

    $C = \mathrm{Cost}(I_{d1}) - \mathrm{Cost}(I_{d2}) \quad (4)$
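  • Equations (3) and (4) can be exercised numerically. A minimal sketch, assuming a Gaussian PSF for the defocus blur and a gradient-energy measure as an illustrative stand-in for the blur-amount cost:

```python
import numpy as np
from scipy import ndimage

def blurred_pair(ideal: np.ndarray, sigma1: float, sigma2: float):
    """Model I_d1 and I_d2 as the ideal image convolved with a Gaussian
    PSF at two defocus levels (cf. equation (3))."""
    return ndimage.gaussian_filter(ideal, sigma1), ndimage.gaussian_filter(ideal, sigma2)

def _sharpness(img: np.ndarray) -> float:
    # Gradient energy: higher for sharp images, lower for blurred ones.
    gy, gx = np.gradient(img)
    return gy.var() + gx.var()

def blur_change(i_d1: np.ndarray, i_d2: np.ndarray) -> float:
    """Value C of equation (4): the change in blur amount between I_d1
    and I_d2, here measured via the gradient-energy stand-in."""
    return _sharpness(i_d1) - _sharpness(i_d2)
```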
  • AF processing may take time.
  • a phase difference AF processing method that performs AF processing using two parallax images is known as a method that can shorten the AF processing time relatively.
  • the phase difference AF processing method includes an image plane phase difference AF (PDAF) method.
  • an image sensor in which a plurality of pixels are arranged as shown in FIG. 6 is used.
  • the image sensor includes a pixel column 600 including a Bayer array pixel block that outputs a color component detection signal, and a pixel column 602 and a pixel column 604 including pixel blocks including pixels that output a phase difference detection signal.
  • the pixel P1 that outputs the phase difference detection signal included in the pixel column 602 and the pixel P2 that outputs the phase difference detection signal included in the pixel column 604 have different light incident directions. Therefore, an image having a different phase is obtained from the image obtained from the pixel P1 and the image obtained from the pixel P2. Then, the distance to the subject can be derived based on the amount and direction of deviation between the image included in the image obtained from the pixel P1 and the image included in the image obtained from the pixel P2.
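  • The shift between the two phase images can be estimated with a plain cross-correlation. A minimal sketch over one-dimensional pixel rows; converting the shift to a distance depends on the sensor geometry and is omitted.

```python
import numpy as np

def phase_shift(line_p1: np.ndarray, line_p2: np.ndarray) -> int:
    """Estimate the lateral shift (in pixels) between the line image from
    the P1 pixels and the line image from the P2 pixels."""
    a = line_p1 - line_p1.mean()
    b = line_p2 - line_p2.mean()
    corr = np.correlate(a, b, mode="full")
    return int(np.argmax(corr)) - (len(b) - 1)

# Example: a signal shifted right by 3 pixels is recovered as +3.
s = np.sin(np.linspace(0.0, 8.0 * np.pi, 128))
print(phase_shift(np.roll(s, 3), s))
```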
  • the PDAF method can execute AF processing faster than the BDAF method.
  • the pixels for phase difference detection are arranged relatively far apart. Therefore, the image quality of the image obtained from the pixel P1 and the pixel P2 is relatively low. For this reason, if a high frequency component is included in the spatial frequency band of the image captured by the image sensor, the accuracy of the AF processing may decrease.
  • therefore, in the present embodiment, the AF processing of the BDAF method or the AF processing of the PDAF method is selected, or both are combined, based on the spatial frequency band of the image, and the AF process is executed accordingly.
  • the imaging control unit 110 includes a derivation unit 112 and a focusing control unit 114.
  • the deriving unit 112 derives the spatial frequency band of the image captured by the image sensor 120.
  • the deriving unit 112 may derive the spatial frequency band of the image by performing Fourier transform on the image and decomposing the image for each spatial frequency component.
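  • Such a derivation can be sketched with a two-dimensional FFT. The radial binning and the normalized frequency axis below are illustrative choices, not prescribed by the patent.

```python
import numpy as np

def spatial_frequency_profile(image: np.ndarray, bins: int = 32):
    """Fourier-transform the image and accumulate spectral energy per
    radial spatial-frequency band (cf. derivation unit 112)."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image.astype(float)))) ** 2
    h, w = image.shape
    fy, fx = np.meshgrid(np.fft.fftshift(np.fft.fftfreq(h)),
                         np.fft.fftshift(np.fft.fftfreq(w)), indexing="ij")
    radius = np.sqrt(fx**2 + fy**2)  # normalized spatial frequency
    edges = np.linspace(0.0, radius.max() + 1e-9, bins + 1)
    energy = np.array([spectrum[(radius >= lo) & (radius < hi)].sum()
                       for lo, hi in zip(edges[:-1], edges[1:])])
    return edges, energy / energy.sum()
```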
  • the focusing control unit 114 controls the positional relationship between the imaging surface of the image sensor 120 and the lens 210 by selecting either the BDAF processing, which executes autofocus processing based on the blur amounts of a plurality of images captured in states where the positional relationship between the imaging surface of the image sensor 120 and the lens 210 differs, or the PDAF processing, which executes autofocus processing by the phase difference method, or by combining both.
  • the focus control unit 114 may control the position of the focus lens by selecting one of BDAF processing and PDAF processing based on the spatial frequency band of the image, or by combining both.
  • the focus control unit 114 is an example of a control unit.
  • the BDAF process is an example of a first AF process.
  • the PDAF process is an example of a second AF process.
  • the focusing control unit 114 may control the positional relationship between the imaging surface of the image sensor 120 and the lens 210 by selecting either the BDAF process or the PDAF process, or by combining both, based on the spatial frequency band of the image derived by the deriving unit 112 and the spatial frequency band predetermined for the PDAF processing. The focusing control unit 114 may likewise select either the BDAF process or the PDAF process, or combine both, based on the ratio of the image's spatial frequency band that falls within the predetermined spatial frequency band.
  • the focus control unit 114 selects the PDAF process when the ratio is greater than or equal to the threshold value, and selects the BDAF process when the ratio is smaller than the threshold value, thereby determining the positional relationship between the imaging surface of the image sensor 120 and the lens 210. You may control.
  • the focusing control unit 114 may control the positional relationship between the imaging surface of the image sensor 120 and the lens 210 by selecting the PDAF process when the ratio is greater than or equal to the first threshold, combining the BDAF process and the PDAF process when the ratio is smaller than the first threshold and greater than or equal to the second threshold, and selecting the BDAF process when the ratio is smaller than the second threshold.
  • the focusing control unit 114 may weight the distance to the subject specified by the BDAF process and the distance to the subject specified by the PDAF process, and control the position of the focus lens based on the weighted distances.
  • the focus control unit 114 may change the weighting of the BDAF process and the PDAF process depending on the ratio.
  • the focusing control unit 114 may control the positional relationship between the imaging surface of the image sensor 120 and the lens 210 by comparing the spatial frequency band of the image derived by the deriving unit 112 with a predetermined spatial frequency band that can be reproduced by the plurality of pixels that generate the phase difference detection signal, and selecting either the BDAF processing or the PDAF processing, or combining both.
  • the focusing control unit 114 may execute the PDAF process based on the phase difference detection signal output from the image sensor 120, in which a plurality of pixels that generate the phase difference detection signal and a plurality of other pixels that generate color component signals are arranged in a predetermined arrangement pattern.
  • the plurality of pixels that generate the phase difference detection signal are, for example, the pixel P1 and the pixel P2 illustrated in FIG.
  • the plurality of other pixels that generate the color component signal are, for example, the pixel R, the pixel G, and the pixel B illustrated in FIG.
  • the predetermined spatial frequency band may be a spatial frequency band that can be reproduced by the pixel P1 and the pixel P2, for example.
  • the spatial frequency band that can be reproduced by the pixel R, the pixel G, and the pixel B is, for example, a region 700.
  • the spatial frequency band that can be reproduced by the pixel P1 and the pixel P2 is, for example, a region 702.
  • Region 700 and region 702 each contain a low frequency component.
  • the frequency components included in the region 702 have a lower proportion of high frequency components than the frequency components included in the region 700. Therefore, when the image captured by the image sensor 120 includes many high frequency components, the spatial frequency components that can be reproduced by the pixels P1 and P2 are limited, and the focusing control unit 114 may be unable to execute the AF processing with high accuracy by the PDAF processing.
  • the BDAF process is executed based on the color component signal. That is, the focus control unit 114 can perform BDAF processing with high accuracy on an image including a spatial frequency band in the range of the region 700.
  • a region 710 is a region corresponding to the spatial frequency band of the image derived by the deriving unit 112.
  • the region 712 is a region corresponding to the spatial frequency band of the image included in the region 702 in the spatial frequency band of the image.
  • the focusing control unit 114 may control the position of the focus lens by selecting either the BDAF processing or the PDAF processing, or combining both, based on the ratio of the area of the region 712 to the area of the region 710.
  • the focus control unit 114 may select the PDAF process and control the position of the focus lens if the ratio of the region 712 is equal to or greater than a predetermined threshold.
  • the focus control unit 114 may select the BDAF process and control the position of the focus lens when the ratio of the area 712 is smaller than a predetermined threshold.
  • if the maximum spatial frequency component of the spatial frequency band derived by the deriving unit 112 is equal to or smaller than a predetermined spatial frequency component, the focusing control unit 114 may select the PDAF process; if the maximum spatial frequency component is larger than the predetermined spatial frequency component, it may select the BDAF process to control the position of the focus lens.
  • the predetermined spatial frequency component may be determined based on, for example, the maximum spatial frequency component in the spatial frequency band that can be reproduced by the pixel P1 and the pixel P2.
  • FIG. 8 is a flowchart showing an example of the procedure of AF processing.
  • the deriving unit 112 acquires an image captured by the image sensor 120.
  • the deriving unit 112 derives a spatial frequency band of the image.
  • the deriving unit 112 derives the ratio B of the image's spatial frequency band that falls within the spatial frequency band predetermined for the PDAF processing (S200).
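  • Given a spectral profile such as the one sketched earlier, the ratio B can be computed as the fraction of spectral energy inside the band the phase-difference pixels can reproduce. The cutoff value below is an assumed placeholder for that band.

```python
import numpy as np

def pdaf_band_ratio(edges: np.ndarray, energy: np.ndarray, cutoff: float) -> float:
    """Ratio B (cf. S200/S300): the fraction of the image's spectral energy
    lying at or below the highest spatial frequency the phase-difference
    pixels can reproduce."""
    centers = 0.5 * (edges[:-1] + edges[1:])
    return float(energy[centers <= cutoff].sum())

# Usage with spatial_frequency_profile():
#   edges, energy = spatial_frequency_profile(image)
#   ratio_b = pdaf_band_ratio(edges, energy, cutoff=0.15)
```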
  • the focus control unit 114 determines whether or not the derived ratio B is equal to or greater than a predetermined first threshold Th1 (S202). If the ratio B is equal to or greater than the first threshold Th1, the focus control unit 114 selects the PDAF process and controls the positional relationship between the imaging surface of the image sensor 120 and the lens 210 (S204). On the other hand, if the ratio B is smaller than the first threshold Th1, the focusing control unit 114 selects the BDAF process and controls the positional relationship between the imaging surface of the image sensor 120 and the lens 210 (S206).
  • the focus control unit 114 switches between the PDAF process and the BDAF process.
  • when high frequency components are included in the image, the focusing control unit 114 can prevent the accuracy of the AF processing from decreasing by executing the BDAF process.
  • when high frequency components are not included in the image, the focusing control unit 114 can prevent the AF process from taking a long time by executing the PDAF process. Therefore, according to the imaging device 100 of the present embodiment, an increase in AF processing time and a decrease in AF processing accuracy can be suppressed in a well-balanced manner. Low frequency components and high frequency components may be distinguished using the first threshold Th1 as a boundary.
  • FIG. 9 is a flowchart showing another example of the AF processing procedure.
  • the deriving unit 112 derives a spatial frequency band of the image. Further, the deriving unit 112 derives a ratio B of the spatial frequency band of the image that occupies a predetermined spatial frequency band for the PDAF processing (S300).
  • the focus control unit 114 determines whether or not the derived ratio B is equal to or greater than a predetermined first threshold Th1 (S302). If the ratio B is equal to or greater than the first threshold Th1, the focus control unit 114 selects the PDAF process and controls the positional relationship between the imaging surface of the image sensor 120 and the lens 210 (S304). On the other hand, if the ratio B is smaller than the first threshold value Th1, the focusing control unit 114 determines whether the ratio B is smaller than the first threshold value Th1 and is equal to or larger than a predetermined second threshold value Th2 (S306).
  • if the ratio B is smaller than the first threshold Th1 and equal to or greater than the second threshold Th2, the focusing control unit 114 combines the PDAF process and the BDAF process to control the positional relationship between the imaging surface of the image sensor 120 and the lens 210 (S308).
  • the focus control unit 114 may control the position of the focus lens based on the position of the focus lens specified by the PDAF process and the position of the focus lens specified by the BDAF process.
  • the focusing control unit 114 may weight the focus lens position specified by the PDAF process and the focus lens position specified by the BDAF process according to the ratio B, and control the position of the focus lens based on the weighted values.
  • the focusing control unit 114 may set the weights so that the weight of the position specified by the PDAF process becomes larger than the weight of the position specified by the BDAF process as the ratio B becomes larger.
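  • Putting the branches of FIG. 9 together, a minimal control-flow sketch; the threshold values and the linear weighting are illustrative assumptions, since the patent leaves the exact weighting open.

```python
def select_focus_position(ratio_b: float, pos_pdaf: float, pos_bdaf: float,
                          th1: float = 0.7, th2: float = 0.3) -> float:
    """Choose the focus lens position per FIG. 9: PDAF above Th1 (S304),
    BDAF below Th2 (S310), and a ratio-weighted blend in between (S308)."""
    if ratio_b >= th1:
        return pos_pdaf                                # S304: PDAF only
    if ratio_b >= th2:
        w = (ratio_b - th2) / (th1 - th2)              # more PDAF weight as B grows
        return w * pos_pdaf + (1.0 - w) * pos_bdaf     # S308: combine both
    return pos_bdaf                                    # S310: BDAF only
```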
  • if the ratio B is smaller than the second threshold Th2, the focusing control unit 114 selects the BDAF process and controls the positional relationship between the imaging surface of the image sensor 120 and the lens 210 (S310).
  • the focus control unit 114 executes the AF process by switching the PDAF process and the BDAF process, or combining the PDAF process and the BDAF process.
  • when the ratio B is, for example, around 50%, the AF process is executed by combining the PDAF process and the BDAF process, so the accuracy of the AF process can be improved compared with executing only one of the PDAF process and the BDAF process. Therefore, according to the imaging device 100 of the present embodiment, an increase in AF processing time and a decrease in AF processing accuracy can be suppressed in a well-balanced manner.
  • FIG. 10 illustrates an example of a computer 1200 in which aspects of the present invention may be embodied in whole or in part.
  • a program installed in the computer 1200 can cause the computer 1200 to function as one or more "units" of the apparatus according to the embodiment of the present invention, or to perform operations associated with that apparatus; the program can cause the computer 1200 to execute those operations or the functions of those one or more "units".
  • the program can cause the computer 1200 to execute a process according to an embodiment of the present invention or a stage of the process.
  • Such a program may be executed by CPU 1212 to cause computer 1200 to perform certain operations associated with some or all of the blocks in the flowcharts and block diagrams described herein.
  • the computer 1200 includes a CPU 1212 and a RAM 1214, which are connected to each other by a host controller 1210.
  • the computer 1200 also includes a communication interface 1222 and an input / output unit, which are connected to the host controller 1210 via the input / output controller 1220.
  • Computer 1200 also includes ROM 1230.
  • the CPU 1212 operates according to programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
  • the communication interface 1222 communicates with other electronic devices via a network.
  • a hard disk drive may store programs and data used by the CPU 1212 in the computer 1200.
  • the ROM 1230 stores therein a boot program executed by the computer 1200 at the time of activation and / or a program depending on the hardware of the computer 1200.
  • the program is provided via a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or via a network.
  • the program is installed in the RAM 1214 or the ROM 1230 that is also an example of a computer-readable recording medium, and is executed by the CPU 1212.
  • Information processing described in these programs is read by the computer 1200 to bring about cooperation between the programs and the various types of hardware resources.
  • an apparatus or method may thereby be constituted by realizing operations or processing of information in accordance with the use of the computer 1200.
  • for example, the CPU 1212 may execute a communication program loaded in the RAM 1214 and, based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing.
  • under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer area provided in the RAM 1214 or in a recording medium such as a USB memory, transmits the read transmission data to the network, or writes reception data received from the network into a reception buffer area provided on the recording medium.
  • the CPU 1212 may cause the RAM 1214 to read all or a necessary portion of a file or database stored in an external recording medium such as a USB memory, and may execute various types of processing on the data in the RAM 1214. The CPU 1212 may then write the processed data back to the external recording medium.
  • the CPU 1212 may perform, on data read from the RAM 1214, various types of processing described throughout the present disclosure and specified by the instruction sequence of the program, including various types of operations, information processing, condition judgment, conditional branching, unconditional branching, information retrieval/replacement, and the like, and write the result back to the RAM 1214.
  • the CPU 1212 may search for information in files, databases, and the like in the recording medium. For example, when a plurality of entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 1212 may search the plurality of entries for an entry whose attribute value of the first attribute matches a specified condition, read the attribute value of the second attribute stored in that entry, and thereby acquire the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
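  • The search described here amounts to a filtered lookup over attribute pairs. A minimal sketch with illustrative names:

```python
from typing import Callable, Iterable, List, Tuple

def lookup(entries: Iterable[Tuple[str, str]],
           condition: Callable[[str], bool]) -> List[str]:
    """Return the second-attribute values of all entries whose first
    attribute satisfies the specified condition."""
    return [second for first, second in entries if condition(first)]

# Example: entries associating a key attribute with a value attribute.
print(lookup([("a1", "x"), ("b2", "y")], lambda k: k.startswith("a")))
```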
  • the program or software module described above may be stored in a computer-readable storage medium on the computer 1200 or in the vicinity of the computer 1200.
  • a recording medium such as a hard disk or a RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium, whereby the program is transferred to the computer 1200 via the network.


Abstract

An imaging control device may include a derivation unit that derives the spatial frequency band of a captured image. The imaging control device may include a control unit that, based on the spatial frequency band of the image derived by the derivation unit, controls the positional relationship between the imaging surface and the lens by selecting one of the following two processes, or by combining both: a first AF process, in which autofocus processing is performed based on the amount of blur in a plurality of images captured in states having different positional relationships between the imaging surface and the lens, and/or a second AF process, in which autofocus processing is performed with a phase difference method.
PCT/JP2017/014560 2017-04-07 2017-04-07 Imaging control device, imaging device, imaging system, moving body, imaging control method, and program Ceased WO2018185940A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2017/014560 WO2018185940A1 (fr) 2017-04-07 2017-04-07 Imaging control device, imaging device, imaging system, moving body, imaging control method, and program
JP2017559465A JP6503607B2 (ja) 2017-04-07 2017-04-07 Imaging control device, imaging device, imaging system, moving body, imaging control method, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/014560 WO2018185940A1 (fr) 2017-04-07 2017-04-07 Imaging control device, imaging device, imaging system, moving body, imaging control method, and program

Publications (1)

Publication Number Publication Date
WO2018185940A1 (fr) 2018-10-11

Family

ID=63712736

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/014560 Ceased WO2018185940A1 (fr) 2017-04-07 2017-04-07 Imaging control device, imaging device, imaging system, moving body, imaging control method, and program

Country Status (2)

Country Link
JP (1) JP6503607B2 (fr)
WO (1) WO2018185940A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021193412A (ja) * 2020-06-08 2021-12-23 SZ DJI Technology Co., Ltd. Device, imaging device, imaging system, and moving body

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112313940A (zh) * 2019-11-14 2021-02-02 SZ DJI Technology Co., Ltd. Zoom tracking method and system, lens, imaging device, and unmanned aerial vehicle

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009175279A (ja) * 2008-01-22 2009-08-06 Olympus Imaging Corp Camera system
JP2014063142A (ja) * 2012-08-31 2014-04-10 Canon Inc Distance detection device, imaging device, program, recording medium, and distance detection method
WO2014083914A1 (fr) * 2012-11-29 2014-06-05 Fujifilm Corporation Image capture device and focus control method
JP6026695B1 (ja) * 2016-06-14 2016-11-16 SZ DJI Technology Co., Ltd. Control device, moving body, control method, and program


Also Published As

Publication number Publication date
JP6503607B2 (ja) 2019-04-24
JPWO2018185940A1 (ja) 2019-04-11


Legal Events

ENP (Entry into the national phase): Ref document number: 2017559465; Country of ref document: JP; Kind code of ref document: A
121 (Ep: the epo has been informed by wipo that ep was designated in this application): Ref document number: 17904898; Country of ref document: EP; Kind code of ref document: A1
NENP (Non-entry into the national phase): Ref country code: DE
122 (Ep: pct application non-entry in european phase): Ref document number: 17904898; Country of ref document: EP; Kind code of ref document: A1