
WO2018123013A1 - Control device, moving body, control method, and program - Google Patents

Control device, moving body, control method, and program

Info

Publication number
WO2018123013A1
Authority
WO
WIPO (PCT)
Prior art keywords
imaging device
distance
distance range
uav
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2016/089087
Other languages
English (en)
Japanese (ja)
Inventor
本庄 謙一
佳範 永山
高根 靖雄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Priority to PCT/JP2016/089087 priority Critical patent/WO2018123013A1/fr
Priority to JP2017559618A priority patent/JP6515423B2/ja
Publication of WO2018123013A1 publication Critical patent/WO2018123013A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment

Definitions

  • the present invention relates to a control device, a moving body, a control method, and a program.
  • Patent Document 1 discloses a camera that adjusts the position of the photographing lens and the aperture amount of the diaphragm so that the subject falls within the depth of field.
  • Patent Document 1 International Publication No. 2016/056089
  • the control device according to an aspect of the present invention may include a first determination unit that determines, based on at least one of the focal length, aperture value, allowable confusion circle diameter, and subject distance of the imaging device, a distance range from the imaging device within which an object to be imaged by the imaging device should exist.
  • the control device may include a first specifying unit that specifies a distance from the imaging device to the object.
  • the control device may include a control unit that controls the position of the imaging device so that the object is included in the distance range based on the distance range and the distance.
  • the width of the distance range may be narrower than the width of the depth of field of the imaging device determined based on at least one of the focal length, aperture value, allowable confusion circle diameter, and subject distance of the imaging device.
  • the subject distance may be based on the position of the lens of the imaging device.
  • the control device may include an estimation unit that estimates the geographical position of the object at the first future time point.
  • the control device may include a second determination unit that determines a geographical region in which the imaging device should exist at the first time point based on the estimated geographical position and distance range.
  • the control unit may control the position of the imaging device such that the imaging device exists in the geographical area at the first time point.
  • the control device may include a selection unit that selects, from an image captured by the imaging device, a first point whose geographical position is determined in advance and the object.
  • the control device may include a second specifying unit that specifies the geographical position of the object based on the positional relationship between the first point and the object in the image captured by the imaging device and on the geographical position of the first point.
  • the estimation unit may estimate the geographical position of the object at the first time point based on the plurality of geographical positions of the object specified by the second specifying unit at a plurality of time points up to the present time.
  • the control unit may cause the imaging device to image the object when the object exists within the distance range.
  • the moving body according to one embodiment of the present invention may carry the control device and the imaging device and move.
  • the control unit may control the position of the moving body so that the object is included within the distance range.
  • the control unit may cause the moving object to track the object so that the object is included within the distance range.
  • the moving body may be an unmanned aerial vehicle.
  • the control unit may cause the unmanned aircraft to hover so that the object is included in the distance range.
  • a control method according to an aspect of the present invention may include a step of determining, based on at least one of the focal length, aperture value, allowable confusion circle diameter, and subject distance of an imaging device, a distance range from the imaging device within which an object to be imaged by the imaging device should exist.
  • the control method may include a step of specifying a distance from the imaging device to the object.
  • the control method may include a step of controlling the position of the imaging device based on the distance range and the distance so that the object is included in the distance range.
  • a program according to an aspect of the present invention may cause a computer to execute a step of determining, based on at least one of the focal length, aperture value, allowable confusion circle diameter, and subject distance of an imaging device, a distance range from the imaging device within which an object to be imaged by the imaging device should exist.
  • the program may cause the computer to execute a step of specifying the distance from the imaging device to the object.
  • the program may cause the computer to execute a step of controlling the position of the imaging device so that the object is included in the distance range based on the distance range and the distance.
  • Various embodiments of the present invention may be described with reference to flowcharts and block diagrams.
  • the blocks in the flowcharts and block diagrams may represent (1) the stage of the process in which the operation is performed or (2) the “part” of the device responsible for performing the operation.
  • Certain stages and “parts” may be implemented by dedicated circuitry, by programmable circuitry supplied with computer-readable instructions stored on a computer-readable storage medium, and/or by a processor supplied with computer-readable instructions stored on a computer-readable storage medium.
  • Dedicated circuitry may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits.
  • Programmable circuitry may include reconfigurable hardware circuits, such as field-programmable gate arrays (FPGAs) and programmable logic arrays (PLAs), comprising logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logical operations, as well as flip-flops, registers, and memory elements.
  • a computer-readable storage medium may include any tangible device capable of storing instructions to be executed by a suitable device.
  • a computer-readable storage medium having instructions stored thereon constitutes an article of manufacture that includes instructions which can be executed to create means for performing the operations specified in the flowcharts or block diagrams.
  • Examples of computer-readable storage media may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like. More specific examples of computer-readable storage media include floppy disks, diskettes, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile discs (DVD), and Blu-ray (registered trademark) discs.
  • the computer readable instructions may include either source code or object code written in any combination of one or more programming languages.
  • the computer-readable instructions may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source code or object code written in an object-oriented programming language such as Smalltalk, JAVA, or C++, or in a conventional procedural programming language such as the “C” programming language or similar programming languages.
  • Computer-readable instructions may be provided to a processor or programmable circuitry of a general-purpose computer, special-purpose computer, or other programmable data processing device, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet.
  • the processor or programmable circuit may execute computer readable instructions to create a means for performing the operations specified in the flowcharts or block diagrams. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and the like.
  • FIG. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 100.
  • the UAV 100 includes a UAV main body 102, a gimbal 200, an imaging device 300, and a plurality of imaging devices 230.
  • the UAV 100 is an example of a moving object.
  • the moving body is a concept including, in addition to UAV, other aircraft that moves in the air, vehicles that move on the ground, ships that move on the water, and the like.
  • the gimbal 200 and the imaging device 300 are an example of an imaging system.
  • the UAV main body 102 includes a plurality of rotor blades.
  • the UAV main body 102 flies the UAV 100 by controlling the rotation of a plurality of rotor blades.
  • the UAV main body 102 causes the UAV 100 to fly using four rotary wings.
  • the number of rotor blades is not limited to four.
  • the UAV 100 may be a fixed wing aircraft that does not have a rotating wing.
  • the imaging device 300 is a camera for capturing a moving image or a still image.
  • the plurality of imaging devices 230 are sensing cameras that image the surroundings of the UAV 100 in order to control the flight of the UAV 100.
  • Two imaging devices 230 may be provided on the front surface that is the nose of the UAV 100.
  • Two other imaging devices 230 may be provided on the bottom surface of the UAV 100.
  • the two imaging devices 230 on the front side may be paired and function as a so-called stereo camera.
  • the two imaging devices 230 on the bottom side may also be paired and function as a stereo camera.
  • the distance from the UAV 100 to the object may be measured based on images captured by the plurality of imaging devices 230.
  • Three-dimensional spatial data around the UAV 100 may be generated based on images captured by the plurality of imaging devices 230.
  • the number of imaging devices 230 included in the UAV 100 is not limited to four.
  • the UAV 100 only needs to include at least one imaging device 230.
  • the UAV 100 may include at least one imaging device 230 on each of the nose, the tail, the side surface, the bottom surface, and the ceiling surface of the UAV 100.
  • the angle of view that can be set by the imaging device 230 may be wider than the angle of view that can be set by the imaging device 300.
  • the imaging device 230 may have a single focus lens or a fisheye lens.
  • the position of the UAV 100 is controlled so that the object imaged by the imaging apparatus 300 does not deviate from the depth of field of the imaging apparatus 300.
  • FIG. 2 is a schematic diagram for explaining the depth of field.
  • In FIG. 2, the focal length of the lens 510 is denoted f, the aperture value of the diaphragm 514 is denoted F, the subject distance from the lens 510 to the in-focus position 502 of the lens 510 is denoted a, the image distance from the lens 510 to the image plane of the image sensor 512 is denoted b, and the allowable confusion circle diameter is denoted δ.
  • the subject distance a changes depending on the position of the lens 510.
  • the allowable confusion circle diameter ⁇ may vary depending on the type of the image sensor 512.
  • the allowable confusion circle diameter ⁇ may vary depending on the size of the image sensor 512.
  • the distance (shooting distance) from the image plane of the image sensor 512 to the target object 500 that is the main subject is defined as L.
  • the depth of field may be determined based on at least one of the focal length f, the aperture value F, the allowable confusion circle diameter ⁇ , and the subject distance a of the imaging apparatus.
  • the depth of field 520 is the sum of the front depth of field 522 and the rear depth of field 524.
  • the front depth of field 522 is located on the front side from the in-focus position 502.
  • the rear depth of field 524 is located behind the in-focus position 502.
  • the front depth of field 522 and the rear depth of field 524 are derived by the following equations.
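  • in terms of the quantities defined above, the standard expressions for the front depth of field 522 and the rear depth of field 524, to which equations (1) and (2) are presumed to correspond, are:

$$D_{\mathrm{front}} = \frac{\delta F a^{2}}{f^{2} + \delta F a} \quad (1)$$

$$D_{\mathrm{rear}} = \frac{\delta F a^{2}}{f^{2} - \delta F a} \quad (2)$$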
  • the UAV 100 controls its own position, and thereby the position of the imaging device 300, so that the target object 500 is included in the distance range 526 from the imaging device 300 within which the target object 500 should exist and thus does not deviate from the depth of field 520.
  • the width of the distance range 526 may be narrower than the width of the depth of field 520.
  • the distance range 526 may be a range from a distance Lf from the image plane of the image sensor 512 to a distance Lb (> distance Lf) from the image plane of the image sensor 512.
  • the distance range 526 may include a predetermined percentage of the front depth of field 522 and a predetermined percentage of the rear depth of field 524.
  • the distance range 526 is, for example, the range consisting of 95%, 90%, 85%, or 80% of the front depth of field 522 on the in-focus position 502 side and 95%, 90%, 85%, or 80% of the rear depth of field 524 on the in-focus position 502 side.
  • this makes it possible to prevent the object 500 from deviating from the depth of field 520 of the imaging apparatus 300.
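  • as a concrete illustration (not part of the patent text), a minimal Python sketch of this distance-range determination follows; the function names, units, and the 90% margin ratio are illustrative assumptions, and for simplicity distances are measured from the lens rather than from the image plane.

```python
def depth_of_field(f: float, F: float, delta: float, a: float):
    """Front and rear depth of field from the standard formulas (1) and (2).

    f: focal length, F: aperture value (f-number), delta: allowable
    confusion circle diameter, a: subject distance, all in meters.
    """
    front = (delta * F * a ** 2) / (f ** 2 + delta * F * a)
    denom = f ** 2 - delta * F * a
    # Beyond the hyperfocal distance the denominator becomes non-positive
    # and the rear depth of field is effectively infinite.
    rear = float("inf") if denom <= 0 else (delta * F * a ** 2) / denom
    return front, rear


def distance_range(f: float, F: float, delta: float, a: float,
                   ratio: float = 0.9):
    """Distance range [Lf, Lb] within which the object should be kept.

    Keeps only `ratio` (e.g. 90%) of each side of the depth of field,
    measured from the in-focus position, so the object has a margin
    before it actually leaves the depth of field.
    """
    front, rear = depth_of_field(f, F, delta, a)
    Lf = a - ratio * front  # near limit of the distance range
    Lb = a + ratio * rear   # far limit of the distance range
    return Lf, Lb


# illustrative numbers: 50 mm lens at f/2.8, 0.03 mm circle of confusion,
# subject in focus at 5 m -> roughly (4.35, 5.91)
print(distance_range(f=0.050, F=2.8, delta=0.00003, a=5.0))
```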
  • FIG. 3 shows an example of functional blocks of the UAV 100.
  • the UAV 100 includes a UAV control unit 110, a communication interface 150, a memory 160, a gimbal 200, a rotary blade mechanism 210, an imaging device 300, imaging devices 230, a GPS receiver 240, an inertial measurement unit (IMU) 250, a magnetic compass 260, and a barometric altimeter 270.
  • the communication interface 150 communicates with an external transmitter.
  • the communication interface 150 receives various commands for the UAV control unit 110 from a remote transmitter.
  • the memory 160 stores programs necessary for the UAV control unit 110 to control the gimbal 200, the rotary blade mechanism 210, the imaging device 300, the imaging device 230, the GPS receiver 240, the IMU 250, the magnetic compass 260, and the barometric altimeter 270.
  • the memory 160 may be a computer-readable recording medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory.
  • the memory 160 may be provided inside the UAV main body 102.
  • the memory 160 may be provided so as to be removable from the UAV main body 102.
  • the gimbal 200 supports the imaging device 300 so that the imaging direction of the imaging device 300 can be adjusted.
  • the gimbal 200 supports the imaging device 300 rotatably around at least one axis.
  • the gimbal 200 is an example of a support mechanism.
  • the gimbal 200 may support the imaging device 300 rotatably about the yaw axis, the pitch axis, and the roll axis.
  • the gimbal 200 may change the imaging direction of the imaging device 300 by rotating the imaging device 300 about at least one of the yaw axis, the pitch axis, and the roll axis.
  • the rotary blade mechanism 210 includes a plurality of rotary blades and a plurality of drive motors that rotate the plurality of rotary blades.
  • the imaging device 230 captures the surroundings of the UAV 100 and generates image data. Image data of the imaging device 230 is stored in the memory 160.
  • the GPS receiver 240 receives a plurality of signals indicating times transmitted from a plurality of GPS satellites. The GPS receiver 240 calculates the position of the GPS receiver 240, that is, the position of the UAV 100, based on the received signals.
  • the inertial measurement unit (IMU) 250 detects the attitude of the UAV 100. As the attitude of the UAV 100, the IMU 250 detects the accelerations of the UAV 100 in the three axial directions of front-rear, left-right, and up-down, and the angular velocities about the three axes of pitch, roll, and yaw.
  • the magnetic compass 260 detects the heading of the UAV 100.
  • the barometric altimeter 270 detects the altitude at which the UAV 100 flies.
  • the UAV control unit 110 controls the flight of the UAV 100 in accordance with a program stored in the memory 160.
  • the UAV control unit 110 may be configured by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like.
  • the UAV control unit 110 controls the flight of the UAV 100 according to a command received from a remote transmitter via the communication interface 150.
  • the UAV control unit 110 may specify the environment around the UAV 100 by analyzing a plurality of images captured by the plurality of imaging devices 230.
  • the UAV control unit 110 controls the flight while avoiding obstacles based on the environment around the UAV 100, for example.
  • the UAV control unit 110 may generate three-dimensional spatial data around the UAV 100 based on a plurality of images captured by the plurality of imaging devices 230, and control the flight based on the three-dimensional spatial data.
  • the imaging apparatus 300 includes an imaging unit 301 and a lens unit 401.
  • the lens unit 401 may be a lens unit that can be detached from the imaging unit 301.
  • the imaging unit 301 includes an imaging control unit 310, an imaging element 330, and a memory 340.
  • the imaging control unit 310 may be configured by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like.
  • the imaging control unit 310 may control the imaging device 300 in accordance with an operation command for the imaging device 300 from the UAV control unit 110.
  • the memory 340 may be a computer-readable recording medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory.
  • the memory 340 may be provided inside the housing of the imaging unit 301.
  • the memory 340 may be provided so as to be removable from the housing of the imaging unit 301.
  • the image sensor 330 may be configured by a CCD or a CMOS sensor.
  • the image sensor 330 is held inside the housing of the imaging device 300 and outputs image data of the optical image formed through the lenses 432 and 442 to the imaging control unit 310.
  • the imaging control unit 310 performs a series of image processing operations such as noise reduction, demosaicing, gamma correction, and edge enhancement on the image data.
  • the imaging control unit 310 stores image data after a series of image processing in the memory 340.
  • the imaging control unit 310 may output and store the image data in the memory 160 via the UAV control unit 110.
  • the lens unit 401 includes a lens control unit 410, a memory 420, a lens driving unit 430, a lens 432, a position sensor 434, a lens driving unit 440, a lens 442, a position sensor 444, an aperture driving unit 450, and an aperture 452.
  • the lens 432 includes at least one lens.
  • the lens 432 may be a zoom lens.
  • the lens 442 includes at least one lens.
  • the lens 442 may be a focus lens.
  • the lens control unit 410 controls the movement of the lens 432 in the optical axis direction via the lens driving unit 430 in accordance with a lens operation command from the imaging unit 301.
  • the lens control unit 410 controls the movement of the lens 442 in the optical axis direction via the lens driving unit 440 in accordance with a lens operation command from the imaging unit 301.
  • Some or all of the lens 432 and the lens 442 move along the optical axis.
  • the lens control unit 410 performs at least one of a zoom operation and a focus operation by moving at least one of the lens 432 and the lens 442 along the optical axis.
  • the position sensor 434 detects the position of the lens 432.
  • the position sensor 434 may detect the current zoom position.
  • the position sensor 444 detects the position of the lens 442.
  • the position sensor 444 may detect the current focus position.
  • the lens driving unit 430 and the lens driving unit 440 may include an actuator.
  • the lens 432 and the lens 442 may move along the optical axis direction via a lens driving mechanism in response to power from each actuator.
  • the diaphragm 452 adjusts the amount of light incident on the image sensor 330.
  • the aperture driving unit 450 may include an actuator.
  • the aperture driving unit 450 may receive an instruction from the lens control unit 410 and drive the actuator to adjust the size of the aperture opening, that is, the aperture value.
  • the memory 420 stores control values for the lenses 432 and 442 that are moved via the lens driving unit 430 and the lens driving unit 440.
  • the memory 420 may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory.
  • the UAV control unit 110 controls the position of the UAV 100 so that the object 500 does not deviate from the depth of field of the imaging device 300.
  • the UAV control unit 110 includes a distance range determination unit 111, a distance identification unit 112, a position control unit 113, a selection unit 114, a geographical position identification unit 115, an estimation unit 116, and a geographical region determination unit 117.
  • a control unit other than the UAV control unit 110 may include all or part of each unit included in the UAV control unit 110.
  • the imaging control unit 310 may include all or part of each unit included in the UAV control unit 110.
  • a transmitter that remotely controls the UAV 100 may include all or a part of each unit included in the UAV control unit 110.
  • the distance range determination unit 111 determines, based on at least one of the focal length f, the aperture value F, the allowable confusion circle diameter δ, and the subject distance a of the imaging device 300, the distance range 526 from the imaging device 300 within which the target object 500 to be imaged by the imaging device 300 should exist.
  • the distance range determination unit 111 is an example of a first determination unit.
  • the distance range determination unit 111 may derive the front depth of field 522 and the rear depth of field 524 from the above equations (1) and (2) based on the focal length f, the aperture value F, the allowable confusion circle diameter δ, and the subject distance a.
  • the distance range determination unit 111 may determine the distance range 526 based on the derived front depth of field 522 and rear depth of field 524.
  • for example, the distance range determination unit 111 may determine the distance range 526 so that it includes a predetermined ratio (for example, 90%) of the front depth of field 522 on the in-focus position 502 side and a predetermined ratio (for example, 90%) of the rear depth of field 524 on the in-focus position 502 side.
  • the distance specifying unit 112 specifies the distance L from the imaging device 300 to the target object 500.
  • the distance L may be a distance from the image plane of the image sensor 330 to the object 500.
  • the distance specifying unit 112 is an example of a first specifying unit.
  • the distance specifying unit 112 may specify the distance from the UAV 100 to the object 500 as the distance L by a triangulation method based on a plurality of images captured by the plurality of imaging devices 230.
  • the distance specifying unit 112 may specify the distance from the UAV 100 to the object 500 as the distance L using an ultrasonic sensor, an infrared sensor, a radar sensor, or the like.
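  • a rough sketch (not from the patent) of the triangulation mentioned above, assuming a calibrated, rectified stereo pair such as the two forward imaging devices 230; the focal length, baseline, and disparity values are illustrative.

```python
def stereo_depth(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Depth from stereo disparity: Z = f * B / d.

    disparity_px: horizontal pixel shift of the object between the two
    rectified images; focal_px: focal length expressed in pixels;
    baseline_m: distance between the two camera centers in meters.
    """
    if disparity_px <= 0:
        raise ValueError("object must have positive disparity")
    return focal_px * baseline_m / disparity_px


# illustrative numbers: 700 px focal length, 10 cm baseline, 14 px disparity
print(stereo_depth(14.0, 700.0, 0.10))  # -> 5.0 (meters)
```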
  • the position control unit 113 controls the position of the imaging apparatus 300, that is, the position of the UAV 100 so that the object 500 is included in the distance range 526 based on the distance range 526 and the distance L.
  • the position control unit 113 is an example of a control unit.
  • the UAV control unit 110 may cause the imaging device 300 to capture an image of the object when the object is within the distance range.
  • the UAV control unit 110 may stop imaging of the object by the imaging device 300 when the object does not exist within the distance range.
  • when the UAV 100 is tracking an object, the UAV control unit 110 may control the position of the UAV 100 so that the object exists within the distance range determined by the distance range determination unit 111.
  • when the UAV 100 is hovering around the object, the UAV control unit 110 may control the position of the UAV 100 so that the object exists within the distance range determined by the distance range determination unit 111.
  • FIG. 4 is a diagram for explaining an example of adjustment of the position of the distance range. It is assumed that the target object 500 has moved to the position of the target object 500′ and has left the distance range 600 determined from the depth of field 610. In this case, for example, the position control unit 113 controls the position of the UAV 100 so that the target object 500′ comes to the center of the distance range. As the UAV 100 moves, the distance range moves from the distance range 600 to the distance range 600′.
  • the position control unit 113 may adjust the position within the distance range 526 at which the object 500 should be present according to the relative movement direction between the object 500 and the imaging device 300. For example, suppose the object 500 is approaching the imaging device 300. In this case, the object 500 would deviate from the end of the front depth of field 522 opposite to its end on the in-focus position 502 side, that is, from the near end of the front depth of field 522. The position control unit 113 may therefore control the position of the UAV 100 so that the object 500 is positioned within the rear depth of field 524. Conversely, suppose the object 500 is moving away from the imaging device 300.
  • in this case, the position control unit 113 may control the position of the UAV 100 so that the object 500 is positioned within the front depth of field 522.
  • by controlling the position of the UAV 100 so that the object 500 is positioned as described above, deviation of the object 500 from the depth of field 520 of the imaging device 300 can be further suppressed.
  • the frequency with which the position control unit 113 adjusts the position of the UAV 100 based on the distance range 526 can be reduced.
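  • a minimal sketch of this adjustment; the bias points within the range are illustrative assumptions, not values from the patent.

```python
def target_distance(Lf: float, Lb: float, closing_speed: float) -> float:
    """Choose where in the distance range [Lf, Lb] the object should sit.

    closing_speed > 0 means the object is approaching the imaging device.
    An approaching object would leave through the near limit Lf first, so
    it is biased toward the far (rear depth of field) side; a receding
    object is biased toward the near (front depth of field) side.
    """
    center = (Lf + Lb) / 2.0
    if closing_speed > 0:
        return (center + Lb) / 2.0  # rear half of the range
    if closing_speed < 0:
        return (Lf + center) / 2.0  # front half of the range
    return center                   # no relative motion: stay centered
```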
  • the depth of field 520 is derived from equations (1) and (2) as described above. Even if the focal length f is constant, the depth of field 520 varies with the aperture value F; for example, the smaller the aperture value F (that is, the wider the aperture opening), the shallower the depth of field 520 becomes. Since the allowable confusion circle diameter δ varies with the type and size of the image sensor 330, the depth of field 520 also varies with the size of the image sensor 330 even if the focal length f is constant. In practice, for the same framing of the subject, the depth of field 520 becomes shallower as the size of the image sensor 330 increases.
  • the shallower the depth of field, the more accurately the UAV 100 needs to keep the distance to the object constant.
  • FIGS. 5A and 5B show an example of the temporal change in the moving distance of the object 500 and the UAV 100.
  • the width of the distance range 700 shown in FIG. 5B is narrower than the width of the distance range 700 shown in FIG. 5A.
  • the narrower the distance range 700, the shorter the time 702 from when the object 500 starts to move from the center of the distance range 700 until it reaches the end of the distance range 700. Therefore, the shallower the depth of field, the more quickly the UAV 100 needs to operate.
  • the UAV 100 may operate so that the acceleration increases as the depth of field is shallower.
  • the maximum acceleration may be set to increase as the depth of field becomes shallower. The shallower the depth of field, the faster the distance between the UAV 100 and the object must be corrected, so increasing the maximum acceleration as the depth of field becomes shallower can be effective.
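  • one possible form of such a policy, as a sketch; the scaling constant and limits are illustrative assumptions.

```python
def max_acceleration(dof_width_m: float, k: float = 2.0,
                     a_min: float = 1.0, a_max: float = 8.0) -> float:
    """Maximum allowed acceleration (m/s^2) versus depth-of-field width.

    A shallower depth of field leaves less time to correct the distance
    to the object, so a higher acceleration ceiling is allowed; k, a_min,
    and a_max are tuning constants, clamped to keep the value bounded.
    """
    return min(a_max, max(a_min, k / dof_width_m))
```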
  • the UAV control unit 110 may predict the movement of the object, determine a target position that the UAV 100 should reach based on the predicted future position of the object and the distance range, and move the UAV 100 to the target position.
  • the UAV control unit 110 may further include a selection unit 114, a geographical position specifying unit 115, an estimation unit 116, and a geographical region determination unit 117.
  • the selection unit 114 selects a reference point with a predetermined geographical position and an object from images captured by the imaging apparatus 300.
  • the reference point is an example of a first point.
  • the reference point having a predetermined geographical position may be a stationary object.
  • the reference point may be a landmark in which a geographical position (latitude, longitude, altitude, etc.) is registered in advance on a map database that can be referred to by the UAV control unit 110.
  • the selection unit 114 may select a reference point and an object in response to designation from the user from among images captured by the imaging device 300.
  • FIG. 6 shows an example of an image 800 including a reference point 802 and an object 804 imaged by the imaging device 300.
  • the user designates a mountain that can be a landmark from the image 800 as the reference point 802.
  • the image 800 may be displayed on, for example, a display included in a transmitter that remotely controls the UAV 100. Further, the user designates the object 804 as the main subject from the image 800.
  • the selection unit 114 may search the image 800 for an area that can be a landmark by comparing the image 800 with the image corresponding to the landmark registered in advance in the map database. Map information including the current position of the UAV 100, the position of the reference point 802, and the current position of the object 804 may be superimposed on the image 800 captured by the image capturing apparatus 300 and displayed on the display.
  • the geographical position specifying unit 115 specifies the geographical position of the object based on the positional relationship between the reference point and the object in the image captured by the imaging device 300 and the geographical position of the reference point.
  • the geographical position specifying unit 115 is an example of a second specifying unit.
  • the geographical position specifying unit 115 may generate, from two images captured by the imaging device 300 every unit time, a distance image indicating the distance to the imaging device 300 for each unit area in the image.
  • the geographical position specifying unit 115 specifies the positional relationship between the reference point and the object using the distance image.
  • the geographical position specifying unit 115 refers to a map database stored in the memory 160, for example, and specifies the geographical position of the reference point.
  • the geographical position specifying unit 115 specifies the geographical position of the object based on the specified positional relationship and the geographical position of the reference point.
  • the estimation unit 116 estimates the geographical position of the object at the first future time point.
  • the estimating unit 116 estimates the geographical position of the object at the first time point based on the plurality of geographical positions of the object specified by the geographical position specifying unit 115 at a plurality of time points up to the present time.
  • the estimation unit 116 may estimate the geographical position of the object for each unit time determined based on the frame rate.
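  • the estimation method is not detailed beyond the use of past positions; one simple possibility consistent with the description is constant-velocity extrapolation from the most recent observations, sketched below with illustrative names.

```python
from typing import List, Tuple

Position = Tuple[float, float, float]  # (latitude, longitude, altitude)


def estimate_future_position(history: List[Tuple[float, Position]],
                             t_future: float) -> Position:
    """Constant-velocity extrapolation of a geographical position.

    history: (timestamp, position) pairs at time points up to the present,
    oldest first, with distinct timestamps; t_future: the first future
    time point. Only the last two observations are used here; a real
    estimator might fit more points or use a Kalman filter.
    """
    (t0, p0), (t1, p1) = history[-2], history[-1]
    scale = (t_future - t1) / (t1 - t0)
    return tuple(c1 + (c1 - c0) * scale for c0, c1 in zip(p0, p1))
```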
  • the geographical area determination unit 117 determines the geographical area in which the imaging apparatus 300 should exist at the first time point based on the geographical position of the object estimated by the estimation unit 116 and the distance range. That is, the geographical area determination unit 117 determines the geographical area in which the UAV 100 should exist at the first time point based on the estimated geographical position of the object and the distance range. If the UAV 100 exists in this geographical area at the first time point, the object can at least be positioned within the depth of field of the imaging device 300 at the first time point.
  • the position control unit 113 controls the position of the imaging device 300 so that the imaging device 300 exists in the geographical area at the first time point.
  • the position control unit 113 may determine an optimum target position within the geographical area with reference to the geographical area and the map database.
  • the position control unit 113 may determine a position that can be reached at the shortest distance within the geographical area as the target position.
  • the position control unit 113 may refer to the map database, identify obstacles that would hinder flight on the way to the geographical area, and determine as the target position a position within the geographical area that can be reached over the shortest distance while detouring around the obstacles.
  • FIG. 7 is a flowchart illustrating an example of a procedure for controlling the position of the imaging apparatus 300.
  • the selection unit 114 selects an object designated by the user from images captured by the imaging device 300 (S010).
  • the imaging condition of the imaging apparatus 300 that is optimal for imaging the object is determined, and the imaging condition is maintained until the processing is completed. That is, the imaging apparatus 300 maintains the focal length f, the aperture value F, and the subject distance a determined as the imaging conditions.
  • the distance range determination unit 111 acquires information on the focal length f, the aperture value F, the allowable confusion circle diameter ⁇ , the subject distance a, and the image distance b from the imaging device 300.
  • the distance specifying unit 112 specifies the depth of field of the imaging apparatus 300 based on the focal length f, the aperture value F, the allowable confusion circle diameter ⁇ , and the subject distance a.
  • the distance range determination unit 111 determines a distance range from the imaging apparatus 300 where the object should exist based on the depth of field, the subject distance a, and the image distance b (S102).
  • the distance specifying unit 112 specifies the distance L from the imaging device 300 to the object (S104). For example, the distance specifying unit 112 may specify the distance to the object derived from a plurality of images captured by the plurality of imaging devices 230 as the distance L.
  • the position control unit 113 determines whether or not the specified distance L is within the distance range (S106). If the object does not exist within the distance range, the position control unit 113 controls the flight of the UAV 100 so as to adjust the position of the imaging device 300 so that the object is included within the distance range (S108). When the object is within the distance range, the UAV control unit 110 continues the current flight control of the UAV 100.
  • the UAV control unit 110 continues the process from step S104 to step S108 until the process of controlling the position of the imaging device 300 so that the target object is within the distance range is completed (S110).
  • the UAV 100 can thus cause the imaging device 300 to image the target object without the target object deviating from the depth of field of the imaging device 300.
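  • combining the steps of FIG. 7 into one loop, as a sketch that reuses the distance_range helper sketched earlier; uav and camera and their methods are placeholders, not interfaces from the patent.

```python
def keep_object_in_range(uav, camera, ratio: float = 0.9) -> None:
    """Control loop corresponding to steps S102-S110 of FIG. 7 (sketch).

    camera exposes the fixed imaging conditions (f, F, delta, a) chosen
    when the object was selected; uav measures the distance to the object
    and adjusts its own position, and thereby the imaging device's.
    """
    Lf, Lb = distance_range(camera.f, camera.F, camera.delta, camera.a,
                            ratio)                    # S102
    while not uav.done():                             # S110: repeat until done
        L = uav.measure_distance()                    # S104: distance to object
        if L < Lf or L > Lb:                          # S106: outside the range?
            # S108: move along the imaging axis so the object re-enters
            # the range; here we simply aim for the center of the range.
            uav.move_toward(object_distance=(Lf + Lb) / 2.0, current=L)
        # otherwise keep the current flight control unchanged
```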
  • FIG. 8 is a flowchart illustrating an example of a procedure for controlling the position of the imaging apparatus 300 by estimating the future position of the object.
  • the selection unit 114 selects an object designated by the user and a reference point to be a landmark from images captured by the imaging device 300 (S200).
  • the distance range determination unit 111 acquires information on the focal length f, the aperture value F, the allowable confusion circle diameter ⁇ , the subject distance a, and the image distance b from the imaging device 300.
  • the distance specifying unit 112 specifies the depth of field of the imaging apparatus 300 based on the focal length f, the aperture value F, the allowable confusion circle diameter ⁇ , and the subject distance a.
  • the distance range determination unit 111 determines a distance range from the imaging apparatus 300 where the object should exist based on the depth of field, the subject distance a, and the image distance b (S202).
  • the geographical position specifying unit 115 specifies the geographical position of the reference point with reference to the map database (S204).
  • the geographic position specifying unit 115 creates a distance image indicating the distance to the imaging device 300 for each unit region in the image based on the plurality of images captured by the imaging device 300 every unit time (S206).
  • the geographical position specifying unit 115 specifies the geographical position of the object based on the geographical position of the reference point and the distance image (S208).
  • the estimation unit 116 estimates the geographical position of the target object at the first future time point based on the plurality of geographical positions of the target object specified by the geographical position specifying unit 115 so far (S210).
  • the geographical area determination unit 117 determines the geographical area where the UAV 100 should exist at the first time point based on the geographical position of the object at the first time point and the distance range (S212).
  • the position control unit 113 controls the flight of the UAV 100 based on the geographical area and the current geographical position of the UAV 100 (S214).
  • the position control unit 113 controls the flight of the UAV 100 so that the UAV 100 is located in the geographical area at the first time point.
  • the UAV 100 estimates the future position of the object and, based on the estimation result and the distance range within which the object should exist, specifies the geographical region in which the UAV 100 should exist in the future.
  • the UAV 100 can determine an optimum flight path in consideration of an obstacle or the like with reference to the identified geographical area and the map database. As a result, the UAV 100 can fly so that the object does not deviate from the depth of field of the imaging device 300 on the optimal flight path.
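  • as a simplified two-dimensional sketch of steps S212 and S214, the geographical area can be read as an annulus around the object's estimated future position, and the target position as the nearest point of that annulus; obstacle detouring via the map database is omitted and all names are illustrative.

```python
import math


def in_geographical_area(uav_xy, obj_xy, Lf: float, Lb: float) -> bool:
    """True if a horizontal position lies between Lf and Lb away from the
    object's estimated future position (a 2-D reading of S212)."""
    return Lf <= math.dist(uav_xy, obj_xy) <= Lb


def target_point(uav_xy, obj_xy, Lf: float, Lb: float):
    """Nearest point of that annulus to the current UAV position (S214).

    Moves the UAV radially until its distance to the object falls
    inside [Lf, Lb].
    """
    d = math.dist(uav_xy, obj_xy)
    if Lf <= d <= Lb:
        return uav_xy                       # already inside the area
    if d == 0.0:
        return (obj_xy[0] + Lf, obj_xy[1])  # any bearing works from the center
    r = Lf if d < Lf else Lb                # clamp to the nearest boundary
    ux = (uav_xy[0] - obj_xy[0]) / d
    uy = (uav_xy[1] - obj_xy[1]) / d
    return (obj_xy[0] + r * ux, obj_xy[1] + r * uy)
```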
  • FIG. 9 illustrates an example of a computer 1200 in which aspects of the present invention may be embodied in whole or in part.
  • a program installed in the computer 1200 can cause the computer 1200 to perform operations associated with the apparatus according to the embodiment of the present invention, or to function as one or more “units” of the apparatus.
  • the program can cause the computer 1200 to execute the operation or the one or more “units”.
  • the program can cause the computer 1200 to execute a process according to an embodiment of the present invention or a stage of the process.
  • Such a program may be executed by CPU 1212 to cause computer 1200 to perform certain operations associated with some or all of the blocks in the flowcharts and block diagrams described herein.
  • the computer 1200 includes a CPU 1212 and a RAM 1214, which are connected to each other by a host controller 1210.
  • the computer 1200 also includes a communication interface 1222 and an input / output unit, which are connected to the host controller 1210 via the input / output controller 1220.
  • Computer 1200 also includes ROM 1230.
  • the CPU 1212 operates according to programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
  • the communication interface 1222 communicates with other electronic devices via a network.
  • a hard disk drive may store programs and data used by the CPU 1212 in the computer 1200.
  • the ROM 1230 stores therein a boot program executed by the computer 1200 at the time of activation and / or a program depending on the hardware of the computer 1200.
  • the program is provided via a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or via a network.
  • the program is installed in the RAM 1214 or the ROM 1230 that is also an example of a computer-readable recording medium, and is executed by the CPU 1212.
  • Information processing described in these programs is read by the computer 1200 to bring about cooperation between the programs and the various types of hardware resources.
  • An apparatus or method may be configured by implementing information operations or processing in accordance with the use of computer 1200.
  • for example, the CPU 1212 may execute a communication program loaded in the RAM 1214 and, based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing.
  • the communication interface 1222, under the control of the CPU 1212, reads transmission data stored in a transmission buffer area provided in the RAM 1214 or in a recording medium such as a USB memory, and transmits the read transmission data to a network, or writes reception data received from the network into a reception buffer area provided in the recording medium.
  • the CPU 1212 may cause the RAM 1214 to read all or a necessary portion of a file or database stored in an external recording medium such as a USB memory, and may execute various types of processing on the data in the RAM 1214. The CPU 1212 may then write the processed data back to the external recording medium.
  • the CPU 1212 may perform, on data read from the RAM 1214, various types of processing described throughout the present disclosure and specified by the instruction sequences of the programs, including various types of operations, information processing, conditional judgment, conditional branching, unconditional branching, and information retrieval/replacement, and may write the results back to the RAM 1214.
  • the CPU 1212 may also search for information in files, databases, and the like in the recording medium. For example, when a plurality of entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 1212 may search the plurality of entries for an entry whose attribute value of the first attribute matches a specified condition, read the attribute value of the second attribute stored in that entry, and thereby acquire the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
  • the program or software module described above may be stored in a computer-readable storage medium on the computer 1200 or in the vicinity of the computer 1200.
  • a recording medium such as a hard disk or a RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium, whereby the program is transferred to the computer 1200 via the network.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • Exposure Control For Cameras (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
  • Automatic Focus Adjustment (AREA)

Abstract

A control device is disclosed that may include a first determination unit for determining, based on at least one of the focal length, aperture value, allowable confusion circle diameter, and subject distance of an imaging device, a distance range from the imaging device within which an object to be imaged by the imaging device should exist. The control device may include a first specifying unit for specifying the distance from the imaging device to the object. The control device may include a control unit for controlling, based on the distance range and the distance, the position of the imaging device so that the object is included within the distance range.
PCT/JP2016/089087 2016-12-28 2016-12-28 Control device, moving body, control method, and program Ceased WO2018123013A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2016/089087 WO2018123013A1 (fr) Control device, moving body, control method, and program
JP2017559618A JP6515423B2 (ja) Control device, moving body, control method, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/089087 WO2018123013A1 (fr) Control device, moving body, control method, and program

Publications (1)

Publication Number Publication Date
WO2018123013A1 (fr) 2018-07-05

Family

ID=62708116

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/089087 Ceased WO2018123013A1 (fr) 2016-12-28 2016-12-28 Control device, moving body, control method, and program

Country Status (2)

Country Link
JP (1) JP6515423B2 (fr)
WO (1) WO2018123013A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020125414A1 (fr) * 2018-12-19 2020-06-25 深圳市大疆创新科技有限公司 Control device, photographing device, photographing system, mobile body, control method, and program
JP2023534691A (ja) * 2020-07-14 2023-08-10 インターナショナル・ビジネス・マシーンズ・コーポレーション Guided multi-spectral inspection
WO2025205729A1 (fr) * 2024-03-29 2025-10-02 富士フイルム株式会社 Information processing device, image forming device, information processing method, and program

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111835968B (zh) * 2020-05-28 2022-02-08 北京迈格威科技有限公司 Image sharpness restoration method and device, and image capturing method and device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003348428A (ja) * 2002-05-24 2003-12-05 Sharp Corp Imaging system, imaging method, imaging program, and computer-readable recording medium recording the imaging program
JP2014119828A (ja) * 2012-12-13 2014-06-30 Secom Co Ltd Autonomous flying robot
JP2014149620A (ja) * 2013-01-31 2014-08-21 Secom Co Ltd Imaging system

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003348428A (ja) * 2002-05-24 2003-12-05 Sharp Corp Imaging system, imaging method, imaging program, and computer-readable recording medium recording the imaging program
JP2014119828A (ja) * 2012-12-13 2014-06-30 Secom Co Ltd Autonomous flying robot
JP2014149620A (ja) * 2013-01-31 2014-08-21 Secom Co Ltd Imaging system

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020125414A1 (fr) * 2018-12-19 2020-06-25 深圳市大疆创新科技有限公司 Control device, photographing device, photographing system, mobile body, control method, and program
JP2023534691A (ja) * 2020-07-14 2023-08-10 インターナショナル・ビジネス・マシーンズ・コーポレーション Guided multi-spectral inspection
US12430868B2 (en) 2020-07-14 2025-09-30 International Business Machines Corporation Guided multi-spectral inspection
JP7764101B2 (ja) 2020-07-14 2025-11-05 インターナショナル・ビジネス・マシーンズ・コーポレーション Guided multi-spectral inspection
WO2025205729A1 (fr) * 2024-03-29 2025-10-02 富士フイルム株式会社 Information processing device, image forming device, information processing method, and program

Also Published As

Publication number Publication date
JP6515423B2 (ja) 2019-05-22
JPWO2018123013A1 (ja) 2019-01-10

Similar Documents

Publication Publication Date Title
JP6478177B2 (ja) Control device, imaging system, moving body, control method, and program
WO2018185939A1 (fr) Imaging control device, imaging device, imaging system, mobile body, imaging control method, and program
JP6384000B1 (ja) Control device, imaging device, imaging system, moving body, control method, and program
US20210014427A1 (en) Control device, imaging device, mobile object, control method and program
JP6733106B2 (ja) Determination device, moving body, determination method, and program
WO2020011230A1 (fr) Control device, mobile body, control method, and program
JP6565072B2 (ja) Control device, lens device, flying object, control method, and program
JP6515423B2 (ja) Control device, moving body, control method, and program
WO2017203646A1 (fr) Imaging control device, shadow position specifying device, imaging system, moving object, imaging control method, shadow position specifying method, and program
US11066182B2 (en) Control apparatus, camera apparatus, flying object, control method and program
JP6790318B2 (ja) Unmanned aerial vehicle, control method, and program
JP6641574B1 (ja) Determination device, moving body, determination method, and program
WO2019061887A1 (fr) Control device, photographing device, aircraft, control method, and program
JP6587006B2 (ja) Moving object detection device, control device, moving body, moving object detection method, and program
JP6543878B2 (ja) Control device, imaging device, moving body, control method, and program
JP2019205047A (ja) Control device, imaging device, moving body, control method, and program
JP2019096965A (ja) Determination device, control device, imaging system, flying object, determination method, and program
WO2018185940A1 (fr) Imaging control device, imaging device, imaging system, mobile body, imaging control method, and program
JP2019083390A (ja) Control device, imaging device, moving body, control method, and program
JP6696092B2 (ja) Control device, moving body, control method, and program
JP6543879B2 (ja) Unmanned aerial vehicle, determination method, and program
JP6696094B2 (ja) Moving body, control method, and program
JP6569157B1 (ja) Control device, imaging device, moving body, control method, and program
JP6413170B1 (ja) Determination device, imaging device, imaging system, moving body, determination method, and program

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2017559618

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16925956

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16925956

Country of ref document: EP

Kind code of ref document: A1