
WO2018193574A1 - Flight path generation method, information processing device, flight path generation system, program and recording medium - Google Patents


Info

Publication number
WO2018193574A1
Authority
WO
WIPO (PCT)
Prior art keywords
flight path
imaging
imaging position
processing unit
generating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2017/015876
Other languages
French (fr)
Japanese (ja)
Inventor
磊 顧
宗耀 瞿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Priority to PCT/JP2017/015876 priority Critical patent/WO2018193574A1/en
Priority to JP2019513156A priority patent/JP6765512B2/en
Publication of WO2018193574A1 publication Critical patent/WO2018193574A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/0094 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/222 Studio circuitry; Studio devices; Studio equipment
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 2101/00 UAVs specially adapted for particular uses or applications
    • B64U 2101/30 UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • B64U 2101/32 UAVs specially adapted for particular uses or applications for imaging, photography or videography for cartography or topography
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 2201/00 UAVs characterised by their flight controls
    • B64U 2201/20 Remote controls
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 5/00 Traffic control systems for aircraft
    • G08G 5/30 Flight plan management
    • G08G 5/32 Flight plan management for flight plan preparation
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 5/00 Traffic control systems for aircraft
    • G08G 5/50 Navigation or guidance aids
    • G08G 5/53 Navigation or guidance aids for cruising
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 5/00 Traffic control systems for aircraft
    • G08G 5/50 Navigation or guidance aids
    • G08G 5/55 Navigation or guidance aids for a single aircraft
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 5/00 Traffic control systems for aircraft
    • G08G 5/50 Navigation or guidance aids
    • G08G 5/57 Navigation or guidance aids for unmanned aircraft
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 5/00 Traffic control systems for aircraft
    • G08G 5/70 Arrangements for monitoring traffic-related situations or conditions
    • G08G 5/74 Arrangements for monitoring traffic-related situations or conditions for monitoring terrain

Definitions

  • the present disclosure relates to a flight path generation method, an information processing apparatus, a flight path generation system, a program, and a recording medium for generating a flight path of a flying object.
  • a platform for example, an unmanned air vehicle that is equipped with a photographing device and performs photographing while flying on a preset fixed route is known (for example, see Patent Document 1).
  • This platform receives a command such as a flight route and a shooting instruction from the ground base, flies in accordance with the command, performs shooting, and sends an acquired image to the ground base.
  • the platform inclines the imaging device of the platform based on the positional relationship between the platform and the imaging target while flying along the fixed path that has been set.
  • The platform of Patent Document 1 captures images while traversing a fixed path, but it does not sufficiently account for objects (for example, buildings) located in the vertical direction below that path. As a result, it is difficult to adequately obtain captured images of the side surfaces of such an object, or of portions hidden behind parts of the object that are visible from above. The captured images available for estimating a three-dimensional shape are therefore insufficient, and the estimation accuracy of the three-dimensional shape decreases.
  • Conventionally, the flight route on which the unmanned aircraft flies is determined manually in advance: a desired position around the object is designated as an imaging position, and its position (latitude, longitude, altitude) in three-dimensional space is specified by user input. Since each imaging position must be entered by the user, user convenience is reduced. In addition, because detailed information about the object is required in advance in order to determine the flight path, preparation takes time.
  • In one aspect, a flight path generation method for generating a flight path of a flying object that images a subject includes a step of acquiring an approximate shape of an object included in the subject, a step of extracting a side surface of the approximate shape, a step of setting an imaging position corresponding to the side surface, and a step of generating a flight path passing through the imaging position.
  • the step of setting the imaging position may include a step of setting an imaging position facing the side surface for each extracted side surface.
  • the step of setting the imaging position may include a step of setting a plurality of imaging positions having a predetermined imaging position interval corresponding to the side surface.
  • the step of generating the flight route may include a step of determining a shooting route passing through a plurality of imaging positions and generating a flight route including the shooting route.
  • the flight path generation method may further include a step of generating an imaging plane parallel to the side surface at a predetermined imaging distance, and the step of setting the imaging position may include a step of setting, on the imaging plane, a plurality of imaging positions separated by a predetermined imaging position interval.
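As a rough illustration of how such an imaging plane and its imaging positions could be produced, the sketch below offsets a rectangular side face along its normal by the imaging distance and places a grid of positions on the resulting plane. The function name, the rectangular-face representation, and the interval values are assumptions made for this example, not elements of the disclosure.

```python
import numpy as np

def imaging_positions_for_side(corners, imaging_distance, h_interval, v_interval):
    """Offset a rectangular side face by imaging_distance along the face normal
    (derived from the corner ordering, assumed to point away from the object)
    and place imaging positions on the resulting imaging plane.

    corners: 4x3 array of the face corners in order around the face.
    Returns an array of candidate imaging positions (N x 3).
    """
    corners = np.asarray(corners, dtype=float)
    normal = np.cross(corners[1] - corners[0], corners[3] - corners[0])
    normal /= np.linalg.norm(normal)

    # Edges of the face used as the horizontal and vertical axes of the plane.
    u = corners[1] - corners[0]          # horizontal edge
    v = corners[3] - corners[0]          # vertical edge
    width, height = np.linalg.norm(u), np.linalg.norm(v)
    u_dir, v_dir = u / width, v / height

    origin = corners[0] + imaging_distance * normal  # imaging plane origin
    positions = []
    for dv in np.arange(0.0, height + 1e-9, v_interval):
        for du in np.arange(0.0, width + 1e-9, h_interval):
            positions.append(origin + du * u_dir + dv * v_dir)
    return np.array(positions)

# Example: a 30 m wide, 20 m tall wall, imaged from 10 m away.
wall = [(0, 0, 0), (30, 0, 0), (30, 0, 20), (0, 0, 20)]
print(imaging_positions_for_side(wall, 10.0, 8.0, 6.0).shape)
```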
  • the step of setting the imaging position may use, as the predetermined imaging position interval, an interval at which the captured images taken at adjacent imaging positions partially overlap one another.
  • the flight path generation method may further include a step of calculating a polyhedron surrounding the approximate shape of the object, and the step of extracting the side surface may include a step of extracting, as a side surface, a face of the polyhedron that lies along the vertical direction or that stands within a predetermined angle range of the vertical direction.
  • the flight path generation method may further include a step of calculating a polyhedron that simplifies the approximate shape of the object, and the step of extracting the side surface may include a step of extracting, as a side surface, a face of the polyhedron that lies along the vertical direction or that stands within a predetermined angle range of the vertical direction.
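One plausible way to realize this extraction, assuming the simplified polyhedron is available as a list of faces with outward normals, is to treat a face as a side surface when its normal lies close to the horizontal plane, i.e. when the face stands within a predetermined angle range of the vertical direction. The helper name and the 15-degree threshold below are illustrative assumptions.

```python
import numpy as np

def extract_side_faces(face_normals, max_tilt_deg=15.0):
    """Return the indices of faces treated as side surfaces.

    A face is a side surface when its plane is vertical or tilted from the
    vertical by at most max_tilt_deg, i.e. its outward normal is within
    max_tilt_deg of the horizontal plane.
    """
    side_indices = []
    for i, n in enumerate(face_normals):
        n = np.asarray(n, dtype=float)
        n = n / np.linalg.norm(n)
        # Angle between the normal and the horizontal plane (z is "up").
        tilt = np.degrees(np.arcsin(abs(n[2])))
        if tilt <= max_tilt_deg:
            side_indices.append(i)
    return side_indices

# Example with an axis-aligned box: four vertical walls, a roof and a floor.
box_normals = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
print(extract_side_faces(box_normals))  # -> [0, 1, 2, 3]
```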
  • the step of calculating the polyhedron may include a step of calculating a polyhedron corresponding to a plurality of approximate shapes of the object, and combining a plurality of adjacent polyhedrons.
  • the step of generating the flight path may include a step of generating a flight path that passes through the imaging position in one of the side surfaces and a flight path that passes through the imaging position in the next side surface adjacent to the side surface.
  • the flight path generation method may further include a step of acquiring a captured image obtained by imaging the object from above, and the step of acquiring the approximate shape may include a step of acquiring three-dimensional shape data of the approximate shape of the object using the captured image.
  • In one aspect, an information processing apparatus that generates a flight path of a flying object imaging a subject includes a processing unit that executes processing related to the flight path. The processing unit acquires an approximate shape of an object included in the subject, extracts a side surface of the approximate shape, sets an imaging position corresponding to the side surface, and generates a flight path passing through the imaging position.
  • the processing unit may set an imaging position facing the side surface for each extracted side surface in setting the imaging position.
  • the processing unit may set a plurality of imaging positions having a predetermined imaging position interval corresponding to the side surface in setting the imaging position.
  • the processing unit may determine a shooting path that passes through a plurality of imaging positions and generate a flight path including the shooting path.
  • the processing unit may further generate an imaging plane parallel to the side surface at a predetermined imaging distance and, in setting the imaging positions, may set a plurality of imaging positions on the imaging plane separated by a predetermined imaging position interval.
  • the processing unit may use, as the predetermined imaging position interval, an interval at which the captured images taken at adjacent imaging positions partially overlap one another.
  • the processing unit may further calculate a polyhedron surrounding the approximate shape of the object, and in extracting the side surface, a surface along the vertical direction of the polyhedron or a surface standing within a predetermined angle range in the vertical direction may be extracted as the side surface.
  • the processing unit may further calculate a polyhedron that simplifies the approximate shape of the object and, in extracting the side surface, may extract as a side surface a face of the polyhedron that lies along the vertical direction or that stands within a predetermined angle range of the vertical direction.
  • the processing unit may calculate a polyhedron corresponding to a plurality of approximate shapes of the object, and combine a plurality of adjacent polyhedrons.
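Combining a plurality of adjacent polyhedra could be done, for example, by merging axis-aligned bounding boxes whose gap falls below a threshold, as in the sketch below. The box representation, the gap threshold, and the helper names are assumptions for illustration, not the method defined in the disclosure.

```python
def boxes_adjacent(a, b, gap=1.0):
    """True when axis-aligned boxes a and b overlap or are within `gap` metres.
    A box is ((xmin, ymin, zmin), (xmax, ymax, zmax))."""
    (a_min, a_max), (b_min, b_max) = a, b
    return all(a_min[i] - gap <= b_max[i] and b_min[i] - gap <= a_max[i] for i in range(3))

def merge_adjacent_boxes(boxes, gap=1.0):
    """Repeatedly merge adjacent boxes into their common bounding box."""
    boxes = list(boxes)
    merged = True
    while merged:
        merged = False
        for i in range(len(boxes)):
            for j in range(i + 1, len(boxes)):
                if boxes_adjacent(boxes[i], boxes[j], gap):
                    lo = tuple(min(boxes[i][0][k], boxes[j][0][k]) for k in range(3))
                    hi = tuple(max(boxes[i][1][k], boxes[j][1][k]) for k in range(3))
                    boxes[i] = (lo, hi)
                    del boxes[j]
                    merged = True
                    break
            if merged:
                break
    return boxes

# Two adjacent building blocks and one distant block.
blocks = [((0, 0, 0), (10, 10, 30)), ((10.5, 0, 0), (20, 10, 25)), ((100, 100, 0), (110, 110, 15))]
print(merge_adjacent_boxes(blocks))
```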
  • the processing unit may generate a flight path that passes through the imaging position on one of the side surfaces, and a flight path that passes through the imaging position on the next side surface adjacent to the side surface.
  • the processing unit may further acquire a captured image obtained by capturing the object downward, and acquire the three-dimensional shape data of the approximate shape of the object using the captured image in acquiring the approximate shape.
  • In one aspect, a flight path generation system includes a flying object that images a subject and a processing unit that generates a flight path of the flying object. The processing unit acquires an approximate shape of an object included in the subject, extracts a side surface of the approximate shape, sets an imaging position corresponding to the side surface, and generates a flight path passing through the imaging position.
  • In one aspect, a program causes a computer that generates a flight path of a flying object imaging a subject to execute a step of acquiring an approximate shape of an object included in the subject, a step of extracting a side surface of the approximate shape, a step of setting an imaging position corresponding to the side surface, and a step of generating a flight path passing through the imaging position.
  • In one aspect, a recording medium is a computer-readable recording medium recording a program that causes a computer generating a flight path of a flying object imaging a subject to execute a step of acquiring an approximate shape of an object included in the subject, a step of extracting a side surface of the approximate shape, a step of setting an imaging position corresponding to the side surface, and a step of generating a flight path passing through the imaging position.
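Taken together, the claimed steps form a simple pipeline: acquire the approximate shape, extract the side surfaces, set imaging positions facing each side surface, and chain those positions into one flight path. The self-contained sketch below walks through that flow for an object approximated by a single axis-aligned box; all names, the box approximation, and the numeric values are assumptions for illustration only.

```python
import numpy as np

def generate_flight_path(box_min, box_max, imaging_distance=10.0, interval=8.0):
    """Sketch of the claimed flow for a box-shaped approximate object:
    extract the four vertical side faces, set imaging positions facing each
    face at the imaging distance, and chain the positions into one path."""
    (x0, y0, z0), (x1, y1, z1) = box_min, box_max
    # (start corner, horizontal direction, length, outward normal) per side face
    sides = [
        ((x0, y0, z0), (1, 0, 0), x1 - x0, (0, -1, 0)),   # wall at y = y0
        ((x1, y0, z0), (0, 1, 0), y1 - y0, (1, 0, 0)),    # wall at x = x1
        ((x1, y1, z0), (-1, 0, 0), x1 - x0, (0, 1, 0)),   # wall at y = y1
        ((x0, y1, z0), (0, -1, 0), y1 - y0, (-1, 0, 0)),  # wall at x = x0
    ]
    path = []
    for start, u, length, n in sides:
        start, u, n = map(np.array, (start, u, n))
        for dz in np.arange(0.0, (z1 - z0) + 1e-9, interval):      # rows up the face
            for du in np.arange(0.0, length + 1e-9, interval):     # columns along the face
                path.append(start + du * u + imaging_distance * n + np.array([0, 0, dz]))
    return np.array(path)

waypoints = generate_flight_path((0, 0, 0), (30, 20, 25))
print(len(waypoints), "imaging positions")
```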
  • the flight path generation system includes a flying object as an example of a moving object and a platform for remotely controlling the operation or processing of the flying object.
  • the information processing apparatus is a computer included in at least one of the platform and the flying object, and executes various processes related to the operation of the flying object.
  • the flying object includes an aircraft (for example, drone, helicopter) moving in the air.
  • the flying body may be an unmanned aerial vehicle (UAV: Unmanned Aerial Vehicle) having an imaging device.
  • the flying object flies along a predetermined flight path in order to image a subject within an imaging range (for example, the ground features of buildings, roads, parks, and the like within a certain area), and images the subject at a plurality of imaging positions set on the flight path.
  • the subject includes objects such as buildings and roads, for example.
  • the platform is a computer, for example, a transmitter for instructing remote control of various processes including movement of the flying object, or a communication terminal connected to the transmitter or the flying object so as to be able to input and output information and data.
  • the communication terminal may be, for example, a mobile terminal, a PC (Personal Computer), or the like. Note that the flying object itself may be included as a platform.
  • the flight path generation method defines various processes (steps) in an information processing apparatus (platform, flying object) or flight path generation system.
  • the program according to the present disclosure is a program for causing an information processing device (platform, flying object) or a flight path generation system to execute various processes (steps).
  • the recording medium stores a program (that is, a program for causing the information processing apparatus (platform, flying object) or the flight path generation system to execute various processes (steps)).
  • an unmanned aerial vehicle (UAV) is exemplified as the flying object.
  • the unmanned air vehicle sets a flight path including an imaging position at which the side surface of the object can be imaged.
  • FIG. 1 is a schematic diagram illustrating a first configuration example of a flight path generation system 10 according to the embodiment.
  • the flight path generation system 10 includes an unmanned air vehicle 100, a transmitter 50, and a portable terminal 80.
  • the unmanned air vehicle 100, the transmitter 50, and the portable terminal 80 can communicate with each other using wired communication or wireless communication (for example, a wireless local area network (LAN) or Bluetooth (registered trademark)).
  • the transmitter 50 is used in a state of being held by both hands of a person who uses the transmitter 50 (hereinafter referred to as “user”), for example.
  • FIG. 2 is a diagram showing an example of the appearance of the unmanned air vehicle 100.
  • FIG. 3 is a diagram illustrating an example of a specific external appearance of the unmanned air vehicle 100.
  • a side view when the unmanned air vehicle 100 flies in the moving direction STV0 is shown in FIG. 2, and a perspective view when the unmanned air vehicle 100 flies in the moving direction STV0 is shown in FIG.
  • the unmanned air vehicle 100 is an example of a moving body that includes the imaging devices 220 and 230 as an example of an imaging unit and moves.
  • the moving body is a concept including, in addition to the unmanned air vehicle 100, other aircraft that moves in the air, vehicles that move on the ground, ships that move on the water, and the like.
  • the roll axis (see the x-axis in FIGS. 2 and 3) is defined in a direction parallel to the ground and along the movement direction STV0.
  • a pitch axis (see the y-axis in FIGS. 2 and 3) is defined in a direction parallel to the ground and perpendicular to the roll axis, and a yaw axis (see the z-axis in FIGS. 2 and 3) is defined in a direction perpendicular to the ground and perpendicular to both the roll axis and the pitch axis.
  • the unmanned air vehicle 100 includes a UAV main body 102, a gimbal 200, an imaging device 220, and a plurality of imaging devices 230.
  • the unmanned aerial vehicle 100 moves based on a remote control instruction transmitted from a transmitter 50 as an example of a platform.
  • the movement of the unmanned air vehicle 100 means a flight, and includes at least ascending, descending, left turning, right turning, left horizontal movement, and right horizontal movement.
  • the UAV main body 102 includes a plurality of rotor blades (propellers).
  • the UAV main body 102 moves the unmanned air vehicle 100 by controlling the rotation of a plurality of rotor blades.
  • the UAV main body 102 moves the unmanned aerial vehicle 100 using, for example, four rotary wings.
  • the number of rotor blades is not limited to four.
  • the unmanned air vehicle 100 may be a fixed wing aircraft that does not have rotating wings.
  • the imaging device 220 is an imaging camera that images a subject (for example, a building on the ground) included in a desired imaging range.
  • the subject may include, for example, an object such as a building and the like, an aerial view of the unmanned air vehicle 100, and a landscape such as a mountain or a river.
  • the plurality of imaging devices 230 are sensing cameras that image the surroundings of the unmanned air vehicle 100 in order to control the movement of the unmanned air vehicle 100.
  • Two imaging devices 230 may be provided on the front surface that is the nose of the unmanned air vehicle 100.
  • the other two imaging devices 230 may be provided on the bottom surface of the unmanned air vehicle 100.
  • the two imaging devices 230 on the front side may be paired and function as a so-called stereo camera.
  • the two imaging devices 230 on the bottom side may also be paired and function as a stereo camera.
  • Three-dimensional spatial data around the unmanned air vehicle 100 may be generated based on images captured by the plurality of imaging devices 230.
  • the number of imaging devices 230 included in the unmanned air vehicle 100 is not limited to four.
  • the unmanned air vehicle 100 only needs to include at least one imaging device 230.
  • the unmanned air vehicle 100 may include at least one imaging device 230 on each of the nose, the tail, the side surface, the bottom surface, and the ceiling surface of the unmanned air vehicle 100.
  • the angle of view that can be set by the imaging device 230 may be wider than the angle of view that can be set by the imaging device 220.
  • the imaging device 230 may have a single focus lens or a fisheye lens.
  • FIG. 4 is a block diagram showing an example of a hardware configuration of the unmanned air vehicle 100 constituting the flight path generation system 10 of FIG.
  • the unmanned air vehicle 100 includes a processing unit 110, a communication interface 150, a memory 160, a storage 170, a battery 190, a gimbal 200, a rotary wing mechanism 210, an imaging device 220, an imaging device 230, a GPS receiver 240, an inertial measurement unit (IMU: Inertial Measurement Unit) 250, a magnetic compass 260, a barometric altimeter 270, an ultrasonic sensor 280, and a laser measuring instrument 290.
  • the communication interface 150 is an example of a communication unit.
  • the processing unit 110 is configured using a processor such as a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor).
  • the processing unit 110 performs signal processing for overall control of operations of each unit of the unmanned air vehicle 100, data input / output processing with other units, data calculation processing, and data storage processing.
  • the processing unit 110 controls the flight of the unmanned air vehicle 100 according to the program stored in the memory 160.
  • the processing unit 110 controls the movement (that is, the flight) of the unmanned air vehicle 100 according to a command received from the remote transmitter 50 via the communication interface 150.
  • the memory 160 may be removable from the unmanned air vehicle 100.
  • the processing unit 110 acquires image data of a subject imaged by the imaging device 220 and the imaging device 230 (hereinafter sometimes referred to as “captured image”).
  • the processing unit 110 controls the gimbal 200, the rotary blade mechanism 210, the imaging device 220, and the imaging device 230.
  • the processing unit 110 controls the imaging range of the imaging device 220 by changing the imaging direction or angle of view of the imaging device 220.
  • the processing unit 110 controls the imaging range of the imaging device 220 supported by the gimbal 200 by controlling the rotation mechanism of the gimbal 200.
  • the imaging range refers to a geographical range captured by the imaging device 220 or the imaging device 230.
  • the imaging range is defined by latitude, longitude, and altitude.
  • the imaging range may be a range in three-dimensional spatial data defined by latitude, longitude, and altitude.
  • the imaging range is specified based on the angle of view and imaging direction of the imaging device 220 or the imaging device 230, and the position where the unmanned air vehicle 100 is present.
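For a straight-down (nadir) shot, the geographical extent of the imaging range can be estimated from the flight altitude and the angle of view; the small sketch below assumes a rectangular image with known horizontal and vertical angles of view and flat terrain.

```python
import math

def nadir_footprint(altitude_m, hfov_deg, vfov_deg):
    """Ground footprint (width, height in metres) of a straight-down image
    taken from the given altitude with the given angles of view."""
    width = 2.0 * altitude_m * math.tan(math.radians(hfov_deg) / 2.0)
    height = 2.0 * altitude_m * math.tan(math.radians(vfov_deg) / 2.0)
    return width, height

# Example: 50 m altitude, 73.7 x 53.1 degree angles of view -> roughly 75 m x 50 m.
print(nadir_footprint(50.0, 73.7, 53.1))
```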
  • the imaging directions of the imaging device 220 and the imaging device 230 are defined by the azimuth and the depression angle toward which the front surfaces provided with the imaging lenses of the imaging device 220 and the imaging device 230 are directed.
  • the imaging direction of the imaging device 220 is a direction specified from the nose direction of the unmanned air vehicle 100 and the posture state of the imaging device 220 with respect to the gimbal 200.
  • the imaging direction of the imaging device 230 is a direction specified from the nose direction of the unmanned air vehicle 100 and the position where the imaging device 230 is provided.
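The imaging direction described here can be represented as a unit vector built from an azimuth and a depression angle; in the sketch below the axes are taken as x = east, y = north, z = up, which is an illustrative convention rather than the coordinate definition used in the disclosure. When the imaging device 220 is carried on the gimbal 200, the azimuth could, for example, be taken as the nose heading plus the gimbal's yaw rotation, and the depression angle as the gimbal's pitch rotation.

```python
import math

def imaging_direction(azimuth_deg, depression_deg):
    """Unit vector for a camera pointing at the given azimuth (clockwise from
    north) and depression angle below the horizontal. Axes: x=east, y=north, z=up."""
    az = math.radians(azimuth_deg)
    dep = math.radians(depression_deg)
    return (math.cos(dep) * math.sin(az),   # east component
            math.cos(dep) * math.cos(az),   # north component
            -math.sin(dep))                 # downward component

print(imaging_direction(90.0, 30.0))  # facing east, tilted 30 degrees down
```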
  • the processing unit 110 controls the flight of the unmanned air vehicle 100 by controlling the rotary wing mechanism 210. That is, the processing unit 110 controls the position including the latitude, longitude, and altitude of the unmanned air vehicle 100 by controlling the rotary wing mechanism 210.
  • the processing unit 110 may control the imaging ranges of the imaging device 220 and the imaging device 230 by controlling the flight of the unmanned air vehicle 100.
  • the processing unit 110 may control the angle of view of the imaging device 220 by controlling a zoom lens included in the imaging device 220.
  • the processing unit 110 may control the angle of view of the imaging device 220 by digital zoom using the digital zoom function of the imaging device 220.
  • the processing unit 110 causes the imaging device 220 or the imaging device 230 to image the subject in the horizontal direction, a predetermined angle direction, or the vertical direction at an imaging position (a waypoint, described later) located along the set flight path.
  • the direction of the predetermined angle is a direction of a predetermined angle suitable for the information processing apparatus (unmanned air vehicle or platform) to estimate the three-dimensional shape of the subject.
  • the processing unit 110 moves the unmanned air vehicle 100 to a specific position at a specified date and time so that, under the desired environment, the imaging device 220 can capture a desired imaging range.
  • the communication interface 150 communicates with the transmitter 50.
  • the communication interface 150 receives various commands for the processing unit 110 from the remote transmitter 50.
  • the memory 160 is an example of a storage unit.
  • the memory 160 stores programs and the like necessary for the processing unit 110 to control the gimbal 200, the rotary wing mechanism 210, the imaging device 220, the imaging device 230, the GPS receiver 240, the inertial measurement device 250, the magnetic compass 260, the barometric altimeter 270, the ultrasonic sensor 280, and the laser measuring instrument 290.
  • the memory 160 stores captured images captured by the imaging devices 220 and 230.
  • the memory 160 may be a computer-readable recording medium and may include at least one of an SRAM (Static Random Access Memory), a DRAM (Dynamic Random Access Memory), an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), and a flash memory such as a USB memory.
  • the memory 160 may be provided inside the UAV main body 102. It may be provided so as to be removable from the UAV main body 102.
  • the storage 170 is an example of a storage unit.
  • the storage 170 accumulates and holds various data and information.
  • the storage 170 may be an HDD (Hard Disk Drive), an SSD (Solid State Drive), a memory card, a USB memory, or the like.
  • the storage 170 may be provided inside the UAV main body 102.
  • the storage 170 may be provided so as to be removable from the UAV main body 102.
  • the battery 190 has a function as a drive source of each part of the unmanned air vehicle 100 and supplies necessary power to each part of the unmanned air vehicle 100.
  • the gimbal 200 supports the imaging device 220 to be rotatable about at least one axis.
  • the gimbal 200 may support the imaging device 220 rotatably about the yaw axis, pitch axis, and roll axis.
  • the gimbal 200 may change the imaging direction of the imaging device 220 by rotating the imaging device 220 about at least one of the yaw axis, the pitch axis, and the roll axis.
  • the rotary blade mechanism 210 includes a plurality of rotary blades and a plurality of drive motors that rotate the plurality of rotary blades.
  • the imaging device 220 captures a subject within a desired imaging range and generates captured image data.
  • Image data obtained by imaging by the imaging device 220 is stored in a memory included in the imaging device 220 or the memory 160.
  • the imaging device 230 captures the surroundings of the unmanned air vehicle 100 and generates captured image data. Image data of the imaging device 230 is stored in the memory 160.
  • the GPS receiver 240 receives a plurality of signals indicating times and positions (coordinates) of each GPS satellite transmitted from a plurality of navigation satellites (that is, GPS satellites).
  • the GPS receiver 240 calculates the position of the GPS receiver 240 (that is, the position of the unmanned air vehicle 100) based on the received signals.
  • the GPS receiver 240 outputs position information of the unmanned air vehicle 100 to the processing unit 110.
  • the calculation of the position information of the GPS receiver 240 may be performed by the processing unit 110 instead of the GPS receiver 240. In this case, information indicating the time and the position of each GPS satellite included in the plurality of signals received by the GPS receiver 240 is input to the processing unit 110.
  • the inertial measurement device 250 detects the attitude of the unmanned air vehicle 100 and outputs the detection result to the processing unit 110.
  • the inertial measurement device 250 detects, as the attitude of the unmanned aerial vehicle 100, accelerations in the three axial directions of the unmanned air vehicle 100 (front-rear, left-right, and up-down) and angular velocities about the three axes (pitch, roll, and yaw).
  • the magnetic compass 260 detects the nose direction of the unmanned air vehicle 100 and outputs the detection result to the processing unit 110.
  • the barometric altimeter 270 detects the altitude at which the unmanned air vehicle 100 flies, and outputs the detection result to the processing unit 110.
  • the ultrasonic sensor 280 emits ultrasonic waves, detects ultrasonic waves reflected by the ground or an object, and outputs the detection results to the processing unit 110.
  • the detection result may indicate a distance (that is, altitude) from the unmanned air vehicle 100 to the ground, for example.
  • the detection result may indicate a distance from the unmanned air vehicle 100 to the object, for example.
  • Laser measuring device 290 irradiates the object with laser light, receives the reflected light reflected by the object, and measures the distance between unmanned air vehicle 100 and the object using the reflected light.
  • the distance measurement result is input to the processing unit 110.
  • the distance measurement method using laser light may be a time-of-flight method.
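As a reminder of the time-of-flight relation mentioned here, the measured distance follows from the round-trip time of the laser pulse and the speed of light; the snippet below is just that arithmetic.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_time_s):
    """Distance to the object from the measured round-trip time of a laser pulse."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

print(tof_distance(200e-9))  # a 200 ns round trip is roughly 30 m
```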
  • the processing unit 110 may specify the environment around the unmanned air vehicle 100 by analyzing a plurality of images captured by the plurality of imaging devices 230. Based on the environment around the unmanned air vehicle 100, the processing unit 110 controls flight while avoiding obstacles, for example.
  • the processing unit 110 may generate three-dimensional spatial data around the unmanned air vehicle 100 based on a plurality of images captured by the plurality of imaging devices 230, and control the flight based on the three-dimensional spatial data.
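Three-dimensional spatial data from the paired imaging devices 230 is typically derived by stereo triangulation; the sketch below shows the basic depth-from-disparity relation for a rectified stereo pair (focal length in pixels, baseline in metres). This is one common way such data could be produced and is not stated in the disclosure as the method actually used.

```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Depth of a point seen by a rectified stereo camera pair.

    disparity_px: horizontal pixel offset of the point between the two images.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_m / disparity_px

# Example: 700 px focal length, 12 cm baseline, 20 px disparity -> 4.2 m.
print(depth_from_disparity(20.0, 700.0, 0.12))
```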
  • the processing unit 110 acquires date / time information indicating the current date / time.
  • the processing unit 110 may acquire date / time information indicating the current date / time from the GPS receiver 240.
  • the processing unit 110 may acquire date / time information indicating the current date / time from a timer (not shown) mounted on the unmanned air vehicle 100.
  • the processing unit 110 acquires position information indicating the position of the unmanned air vehicle 100.
  • the processing unit 110 may acquire position information indicating the latitude, longitude, and altitude where the unmanned air vehicle 100 exists from the GPS receiver 240.
  • the processing unit 110 receives latitude and longitude information indicating the latitude and longitude where the unmanned air vehicle 100 exists from the GPS receiver 240, and altitude information indicating the altitude where the unmanned air vehicle 100 exists from the barometric altimeter 270 or the ultrasonic sensor 280, respectively. It may be acquired as position information.
  • the processing unit 110 may acquire orientation information indicating the orientation of the unmanned air vehicle 100 from the magnetic compass 260.
  • the orientation information may indicate an orientation corresponding to the nose orientation of the unmanned air vehicle 100, for example.
  • the processing unit 110 may acquire position information indicating a position where the unmanned air vehicle 100 should be present when the imaging device 220 captures an imaging range to be imaged.
  • the processing unit 110 may acquire position information indicating a position where the unmanned air vehicle 100 should exist from the memory 160.
  • the processing unit 110 may acquire position information indicating a position where the unmanned air vehicle 100 should be present from another device such as the transmitter 50 via the communication interface 150.
  • the processing unit 110 may refer to a three-dimensional map database, specify a position at which the unmanned air vehicle 100 can be located in order to capture the imaging range to be imaged, and acquire that position as position information indicating the position where the unmanned air vehicle 100 should be present.
  • the processing unit 110 may acquire imaging information indicating the imaging ranges of the imaging device 220 and the imaging device 230, respectively.
  • the processing unit 110 may acquire angle-of-view information indicating the angle of view of the imaging device 220 and the imaging device 230 from the imaging device 220 and the imaging device 230 as parameters for specifying the imaging range.
  • the processing unit 110 may acquire information indicating the imaging direction of the imaging device 220 and the imaging device 230 as a parameter for specifying the imaging range.
  • the processing unit 110 may acquire posture information indicating the posture state of the imaging device 220 from the gimbal 200 as information indicating the imaging direction of the imaging device 220, for example.
  • the information indicating the posture state of the imaging device 220 may be indicated by, for example, a rotation angle from the reference rotation angle of the pitch axis and yaw axis of the gimbal 200.
  • the processing unit 110 may acquire information indicating the orientation of the unmanned air vehicle 100 as information indicating the imaging direction of the imaging device 220, for example.
  • the processing unit 110 may acquire position information indicating a position where the unmanned air vehicle 100 exists as a parameter for specifying the imaging range.
  • the processing unit 110 may acquire imaging information by defining an imaging range indicating the geographical range captured by the imaging device 220 or the imaging device 230, based on the angle of view and the imaging direction of the imaging device 220 or the imaging device 230 and the position where the unmanned flying object 100 exists, and by generating imaging information indicating that imaging range.
  • the processing unit 110 may acquire imaging information indicating an imaging range to be imaged by the imaging device 220.
  • the processing unit 110 may acquire imaging information to be imaged by the imaging device 220 from the memory 160.
  • the processing unit 110 may acquire imaging information to be imaged by the imaging device 220 from another device such as the transmitter 50 via the communication interface 150.
  • the processing unit 110 acquires stereoscopic information (three-dimensional information) indicating the stereoscopic shape (three-dimensional shape) of objects existing around the unmanned air vehicle 100.
  • the object is a part of a landscape such as a building, a road, a car, and a tree.
  • the three-dimensional information is, for example, three-dimensional space data.
  • the processing unit 110 may acquire the three-dimensional information by generating the three-dimensional information indicating the three-dimensional shape of the object existing around the unmanned air vehicle 100 from the respective images obtained from the plurality of imaging devices 230.
  • the processing unit 110 may acquire the three-dimensional information indicating the three-dimensional shape of the object existing around the unmanned air vehicle 100 by referring to the three-dimensional map database stored in the memory 160.
  • the processing unit 110 may acquire three-dimensional information related to a three-dimensional shape of an object existing around the unmanned air vehicle 100 by referring to a three-dimensional map database managed by a server existing on the network.
  • FIG. 5 is a diagram illustrating an example of the appearance of the transmitter 50 to which the mobile terminal 80 is attached.
  • a smartphone is shown as an example of the mobile terminal 80.
  • the mobile terminal 80 may be a smartphone, a tablet terminal, or the like, for example.
  • the up / down / front / rear / left / right directions with respect to the transmitter 50 are assumed to follow the directions of arrows shown in FIG.
  • the transmitter 50 is used in a state of being held by both hands of a user who uses the transmitter 50, for example.
  • the transmitter 50 includes, for example, a resin casing 50B having a substantially rectangular parallelepiped shape (in other words, a substantially box shape) having a substantially square bottom surface and a height shorter than one side of the bottom surface.
  • a left control rod 53L and a right control rod 53R are provided in a projecting manner at approximately the center of the housing surface of the transmitter 50.
  • the left control rod 53L and the right control rod 53R are used for operations for remotely controlling the movement of the unmanned air vehicle 100 by the user (for example, moving the unmanned air vehicle 100 forward and backward, left and right, up and down, and changing its direction).
  • the left control rod 53L and the right control rod 53R indicate positions in an initial state where no external force is applied from both hands of the user.
  • the left control rod 53L and the right control rod 53R automatically return to a predetermined position (for example, the initial position shown in FIG. 5) after the external force applied by the user is released.
  • the power button B1 of the transmitter 50 is disposed on the front side (in other words, the user side) of the left control rod 53L.
  • When the power button B1 is pressed once by the user, for example, the remaining capacity of the battery built into the transmitter 50 is displayed on the battery remaining amount display unit L2.
  • When the power button B1 is pressed again by the user, for example, the power of the transmitter 50 is turned on, and power is supplied to each part of the transmitter 50 (see FIG. 6) so that it can be used.
  • RTH (Return To Home) button B2 is disposed on the front side (in other words, the user side) of the right control rod 53R.
  • When the RTH button B2 is pressed, the transmitter 50 transmits a signal for automatically returning the unmanned air vehicle 100 to a predetermined position.
  • In this way, the transmitter 50 can automatically return the unmanned air vehicle 100 to a predetermined position (for example, a take-off position stored in the unmanned air vehicle 100).
  • the RTH button B2 can be used, for example, when the user loses sight of the airframe of the unmanned air vehicle 100 during outdoor aerial shooting with the unmanned air vehicle 100, or when operation becomes impossible due to radio interference or an unexpected problem.
  • a remote status display unit L1 and a battery remaining amount display unit L2 are arranged on the front side (in other words, the user side) of the power button B1 and the RTH button B2.
  • the remote status display unit L1 is configured using, for example, an LED (Light Emitting Diode), and displays the wireless connection state between the transmitter 50 and the unmanned air vehicle 100.
  • the battery remaining amount display unit L2 is configured using, for example, an LED, and displays the remaining amount of battery capacity built in the transmitter 50.
  • Two antennas AN1 and AN2 project from the rear side of the housing 50B of the transmitter 50 and rearward from the left control rod 53L and the right control rod 53R.
  • the antennas AN1 and AN2 transmit a signal for controlling the movement of the unmanned air vehicle 100 to the unmanned air vehicle 100 based on the user's operation of the left control rod 53L and the right control rod 53R.
  • the antennas AN1 and AN2 can cover a transmission / reception range of 2 km, for example.
  • the antennas AN1 and AN2 can receive captured images taken by the imaging devices 220 and 230 of the unmanned aerial vehicle 100 wirelessly connected to the transmitter 50, as well as various data acquired by the unmanned aerial vehicle 100, when these images or data are transmitted from the unmanned aerial vehicle 100.
  • the transmitter 50 does not include a display unit, but may include a display unit.
  • the portable terminal 80 may be mounted on the holder HLD.
  • the holder HLD may be bonded and attached to the transmitter 50. Thereby, the portable terminal 80 is attached to the transmitter 50 via the holder HLD.
  • the portable terminal 80 and the transmitter 50 may be connected via a wired cable (for example, a USB cable).
  • the portable terminal 80 and the transmitter 50 may be connected by wireless communication (for example, Bluetooth (registered trademark)).
  • the portable terminal 80 may not be attached to the transmitter 50, and the portable terminal 80 and the transmitter 50 may be provided independently.
  • FIG. 6 is a block diagram illustrating an example of a hardware configuration of the transmitter 50 configuring the flight path generation system 10 of FIG.
  • the transmitter 50 includes a left control rod 53L, a right control rod 53R, a processing unit 61, a wireless communication unit 63, an interface unit 65, a memory 67, a battery 69, a power button B1, an RTH button B2, an operation unit set OPS, a remote status display unit L1, and a battery remaining amount display unit L2.
  • the transmitter 50 is an example of an operation terminal for remotely controlling the unmanned air vehicle 100.
  • the processing unit 61 is configured using a processor (for example, a CPU, MPU, or DSP).
  • the processing unit 61 performs signal processing for overall control of operations of each unit of the transmitter 50, data input / output processing with other units, data calculation processing, and data storage processing.
  • the processing unit 61 may acquire data of a captured image taken by the imaging device 220 of the unmanned air vehicle 100 via the wireless communication unit 63, store it in the memory 67, and output it to the portable terminal 80 via the interface unit 65. In other words, the processing unit 61 may cause the portable terminal 80 to display the captured image taken by the imaging device 220 of the unmanned air vehicle 100.
  • the processing unit 61 may generate a signal for controlling the movement of the unmanned aerial vehicle 100 designated by the user's operation of the left control rod 53L and the right control rod 53R.
  • the processing unit 61 may transmit the generated signal to the unmanned air vehicle 100 via the wireless communication unit 63 and the antennas AN1 and AN2 to remotely control the unmanned air vehicle 100. Thereby, the transmitter 50 can control the movement of the unmanned air vehicle 100 remotely.
  • the processing unit 61 may acquire map information of a map database accumulated by an external server or the like via the wireless communication unit 63.
  • the left control rod 53L is used for an operation for remotely controlling the movement of the unmanned air vehicle 100 by, for example, the user's left hand.
  • the right control rod 53R is used for an operation for remotely controlling the movement of the unmanned air vehicle 100 by, for example, the user's right hand.
  • the movement of the unmanned aerial vehicle 100 includes, for example, forward movement, backward movement, leftward movement, rightward movement, ascending, descending, left turning, right turning, or a combination of these.
  • the battery 69 has a function as a drive source for each part of the transmitter 50 and supplies necessary power to each part of the transmitter 50.
  • the processing section 61 displays the remaining capacity of the battery 69 built in the transmitter 50 on the remaining battery capacity display section L2. Thereby, the user can easily check the remaining capacity of the battery capacity built in the transmitter 50.
  • the processing unit 61 may display the remaining amount of the capacity of the battery built in the unmanned air vehicle 100 in the battery remaining amount display unit L2.
  • the processing unit 61 instructs the battery 69 built in the transmitter 50 to supply power to each unit in the transmitter 50. As a result, the user can turn on the transmitter 50 and easily start using the transmitter 50.
  • When the RTH button B2 is pressed, the processing unit 61 generates a signal for automatically returning the unmanned air vehicle 100 to a predetermined position (for example, the take-off position of the unmanned air vehicle 100) and transmits it to the unmanned air vehicle 100 via the wireless communication unit 63 and the antennas AN1 and AN2.
  • the user can automatically return (return) the unmanned air vehicle 100 to a predetermined position by a simple operation on the transmitter 50.
  • the operation unit set OPS is configured using a plurality of operation units (for example, operation units OP1,..., Operation unit OPn) (n: an integer of 2 or more).
  • the operation unit set OPS is configured using operation units other than the left control rod 53L, the right control rod 53R, the power button B1, and the RTH button B2 shown in FIG. 5 (that is, various operation units that support remote control of the unmanned air vehicle 100 by the transmitter 50).
  • the various operation units referred to here include, for example, a button for instructing still-image capture by the imaging device 220 of the unmanned air vehicle 100, a button for instructing the start and end of video recording by the imaging device 220 of the unmanned air vehicle 100, a control for adjusting the tilt direction of the gimbal 200 (see FIG. 4) of the unmanned air vehicle 100, a button for switching the flight mode of the unmanned air vehicle 100, and a dial for making settings of the imaging device 220 of the unmanned air vehicle 100.
  • the remote status display unit L1 and the remaining battery level display unit L2 have been described with reference to FIG.
  • the wireless communication unit 63 is connected to two antennas AN1 and AN2.
  • the wireless communication unit 63 transmits / receives information and data to / from the unmanned air vehicle 100 via the two antennas AN1 and AN2 using a predetermined wireless communication method (for example, WiFi (registered trademark)).
  • the interface unit 65 inputs and outputs information and data between the transmitter 50 and the portable terminal 80.
  • the interface unit 65 may be a USB port (not shown) provided in the transmitter 50, for example.
  • the interface unit 65 may be an interface other than the USB port.
  • the memory 67 is an example of a storage unit.
  • the memory 67 includes, for example, a ROM (Read Only Memory) in which a program defining the operation of the processing unit 61 and set-value data are stored, and a RAM (Random Access Memory) that temporarily stores various information and data used when the processing unit 61 performs processing.
  • the program and set-value data stored in the ROM of the memory 67 may be copied to a predetermined recording medium (for example, a CD-ROM or DVD-ROM).
  • the memory 67 may also store data of captured images taken by the imaging devices 220 and 230 of the unmanned air vehicle 100.
  • FIG. 7 is a block diagram illustrating an example of a hardware configuration of the mobile terminal 80 that configures the flight path generation system 10 of FIG.
  • the portable terminal 80 may include a processing unit 81, an interface unit 82, an operation unit 83, a wireless communication unit 85, a memory 87, a display unit 88, a storage 89, and a battery 99.
  • the portable terminal 80 has a function as an example of an information processing device, and the processing unit 81 of the portable terminal 80 is an example of a processing unit of the information processing device.
  • the processing unit 81 is configured using a processor (for example, CPU, MPU, or DSP).
  • the processing unit 81 performs signal processing for overall control of operations of each unit of the mobile terminal 80, data input / output processing with other units, data calculation processing, and data storage processing.
  • the processing unit 81 may acquire data and information from the unmanned air vehicle 100 via the wireless communication unit 85.
  • the processing unit 81 may acquire data and information from the transmitter 50 or another device via the interface unit 82.
  • the processing unit 81 may acquire data and information input via the operation unit 83.
  • the processing unit 81 may acquire data and information held in the memory 87.
  • the processing unit 81 may send data and information to the display unit 88 and cause the display unit 88 to display display information based on the data and information.
  • the processing unit 81 may send data and information to the storage 89 and store the data and information.
  • the processing unit 81 may acquire data and information stored in the storage 89.
  • the processing unit 81 may execute an application for instructing control of the unmanned air vehicle 100.
  • the processing unit 81 may generate various data used in the application.
  • the interface unit 82 inputs and outputs information and data between the transmitter 50 or another device and the portable terminal 80.
  • the interface unit 82 may be a USB connector (not shown) provided in the mobile terminal 80, for example.
  • the interface unit 82 may be an interface other than the USB connector.
  • the operation unit 83 receives data and information input by the operator of the mobile terminal 80.
  • the operation unit 83 may include buttons, keys, a touch panel, a microphone, and the like.
  • the operation unit 83 and the display unit 88 are mainly configured by a touch panel.
  • the operation unit 83 can accept a touch operation, a tap operation, a drag operation, and the like.
  • the wireless communication unit 85 communicates with the unmanned air vehicle 100 by various wireless communication methods.
  • the wireless communication method may include, for example, wireless LAN, Bluetooth (registered trademark), short-range wireless communication, or communication via a public wireless line.
  • the wireless communication unit 85 may transmit and receive data and information by communicating with other devices.
  • the memory 87 is an example of a storage unit.
  • the memory 87 may include, for example, a ROM that stores a program defining the operation of the portable terminal 80 and set-value data, and a RAM that temporarily stores various information and data used during processing by the processing unit 81.
  • the memory 87 may include memories other than ROM and RAM.
  • the memory 87 may be provided inside the mobile terminal 80.
  • the memory 87 may be provided so as to be removable from the portable terminal 80.
  • the program may include an application program.
  • the display unit 88 is configured using, for example, an LCD (Liquid Crystal Display) or an organic EL (ElectroLuminescence) display, and displays various information and data output from the processing unit 81.
  • the display unit 88 may display captured image data captured by the imaging devices 220 and 230 of the unmanned air vehicle 100.
  • the storage 89 is an example of a storage unit.
  • the storage 89 stores and holds various data and information.
  • the storage 89 may be a flash memory, an SSD (Solid State Drive), a memory card, a USB memory, or the like.
  • the storage 89 may be provided so as to be removable from the main body of the mobile terminal 80.
  • the battery 99 has a function as a drive source for each part of the mobile terminal 80 and supplies necessary power to each part of the mobile terminal 80.
  • the processing unit 81 as an example of the processing unit of the information processing apparatus includes a flight path processing unit 811 that performs processing related to generation of a flight path of the unmanned air vehicle 100.
  • the processing unit 81 includes a shape data processing unit 812 that performs processing related to estimation and generation of three-dimensional shape data of a subject.
  • the flight path processing unit 811 generates a flight path of the unmanned air vehicle 100 that images the subject.
  • the flight path processing unit 811 may acquire input parameters.
  • the flight path processing unit 811 may acquire the input parameter input by the transmitter 50 by receiving the input parameter via the interface unit 82 or the wireless communication unit 85. Further, the flight path processing unit 811 may acquire at least a part of information included in the input parameter from another device instead of acquiring from the transmitter 50.
  • the flight path processing unit 811 may acquire at least a part of information included in the input parameter from a server or the like existing on the network.
  • the acquired input parameters may be held in the memory 87.
  • the processing unit 81 of the portable terminal 80 may refer to the memory 87 as appropriate (for example, at the time of generating a flight route, at the time of generating three-dimensional shape data).
  • the input parameters may include information on the approximate shape of the object, information on the flight range, information on the flight altitude, information on the imaging distance, and information on the imaging position interval.
  • the input parameters may include set resolution information. The set resolution is the resolution of the captured images taken by the imaging devices 220 and 230 of the unmanned air vehicle 100 (that is, a resolution for obtaining captured images suitable for estimating the three-dimensional shape of the subject with high accuracy), and may be stored in the memory 160 of the unmanned air vehicle 100 or the memory 67 of the transmitter 50.
  • the input parameters include information on imaging positions (that is, waypoints) in the flight path of the unmanned air vehicle 100 and various parameters for generating a flight path that passes through the imaging positions.
  • the imaging position is a position in a three-dimensional space.
  • the input parameter may include information on the overlapping rate of the imaging range when the unmanned air vehicle 100 images the subject at the imaging position, for example.
  • the input parameter may include information on the interval between imaging positions in the flight path.
  • the imaging position interval is an interval (distance) between two adjacent imaging positions among a plurality of imaging positions (waypoints) arranged on the flight path.
  • the input parameter may include information on the angle of view of the imaging device 220 or 230 of the unmanned air vehicle 100.
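For illustration, the input parameters listed above could be gathered into a single structure such as the following; every field name and default value here is an assumption, not an identifier from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class FlightPathInputParameters:
    """Illustrative container for the input parameters described above."""
    object_rough_shape: object          # approximate shape of the object (e.g. a mesh or box)
    flight_range: tuple                 # (lat_min, lon_min, lat_max, lon_max) of the flight area
    flight_altitude_m: float            # flight altitude
    imaging_distance_m: float           # distance kept from each side surface
    imaging_position_interval_m: float  # spacing between adjacent imaging positions
    set_resolution_px: tuple = (4000, 3000)   # captured image resolution
    horizontal_overlap_rate: float = 0.8      # overlap of horizontally adjacent imaging ranges
    vertical_overlap_rate: float = 0.7        # overlap of vertically adjacent imaging ranges
    angle_of_view_deg: tuple = (73.7, 53.1)   # horizontal and vertical angle of view

params = FlightPathInputParameters(
    object_rough_shape=None, flight_range=(35.0, 139.0, 35.01, 139.01),
    flight_altitude_m=60.0, imaging_distance_m=10.0, imaging_position_interval_m=8.0)
print(params.horizontal_overlap_rate)
```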
  • the flight path processing unit 811 may receive and acquire subject identification information.
  • the flight path processing unit 811 may communicate with an external server via the interface unit 82 or the wireless communication unit 85 based on the acquired subject identification information, and may receive and acquire information on the shape of the subject or information on the size of the subject corresponding to the subject identification information.
  • the overlapping ratio of the imaging ranges indicates the ratio by which two imaging ranges overlap when images are captured by the imaging device 220 or the imaging device 230 of the unmanned air vehicle 100 at imaging positions adjacent in the horizontal direction or the vertical direction.
  • the information on the overlapping ratio of the imaging ranges may include at least one of information on the overlapping rate of the imaging ranges in the horizontal direction (also referred to as the horizontal overlap rate) and information on the overlapping rate of the imaging ranges in the vertical direction (also referred to as the vertical overlap rate).
  • the horizontal overlap rate and the vertical overlap rate may be the same or different. When the horizontal overlap rate and the vertical overlap rate are different values, both the horizontal overlap rate information and the vertical overlap rate information may be included in the input parameter. When the horizontal overlap rate and the vertical overlap rate are the same value, information on one overlap rate that is the same value may be included in the input parameter.
  • the imaging position interval is a spatial imaging interval, and is a distance between adjacent imaging positions among a plurality of imaging positions at which the unmanned air vehicle 100 should take an image in the flight path.
  • the imaging position interval may include at least one of an imaging position interval in the horizontal direction (also referred to as a horizontal imaging interval) and an imaging position interval in the vertical direction (also referred to as a vertical imaging interval).
  • the flight path processing unit 811 may calculate and acquire the imaging position interval including the horizontal imaging interval and the vertical imaging interval, or may acquire it from the input parameters.
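For concreteness, the input parameters described in the preceding items can be pictured as a single container. The following Python sketch is purely illustrative; the class and field names are hypothetical and are not part of the embodiment.

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class InputParameters:
    """Hypothetical container for the flight-path generation inputs."""
    flight_range: Tuple[float, float, float, float]  # e.g. (lat_min, lon_min, lat_max, lon_max)
    flight_altitude_m: Optional[float] = None        # imaging altitude
    imaging_distance_m: Optional[float] = None       # shooting distance to the object's side surface
    imaging_interval_m: Optional[float] = None       # spatial interval between imaging positions
    horizontal_overlap: Optional[float] = None       # overlap rate of horizontally adjacent imaging ranges (0..1)
    vertical_overlap: Optional[float] = None         # overlap rate of vertically adjacent imaging ranges (0..1)
    fov_deg: Optional[float] = None                  # angle of view of imaging device 220 or 230
    resolution_m_per_px: Optional[float] = None      # set resolution of the captured image
```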
  • the flight path processing unit 811 may place an imaging position (waypoint) for imaging by the imaging device 220 or 230 on the flight path.
  • the intervals between the imaging positions may be arranged at regular intervals, for example.
  • the imaging positions are arranged so that the imaging ranges related to the captured images at adjacent imaging positions partially overlap. This is to enable estimation of a three-dimensional shape using a plurality of captured images. Since the imaging device 220 or 230 has a predetermined angle of view, a part of both imaging ranges overlaps by shortening the imaging position interval.
  • the flight path processing unit 811 may calculate the imaging position interval based on, for example, the altitude (imaging altitude) at which the imaging positions are arranged and the resolution of the imaging device 220 or 230. The higher the imaging altitude or the longer the imaging distance, the larger each imaging range becomes, so the imaging position interval can be made longer (sparser) while keeping the required overlap. The lower the imaging altitude or the shorter the imaging distance, the smaller each imaging range becomes, so the imaging position interval is shortened (made denser). The flight path processing unit 811 may further calculate the imaging position interval based on the angle of view of the imaging device 220 or 230, or may calculate the imaging position interval by other known methods.
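The qualitative relation above can be made concrete with ordinary pinhole-camera geometry: the footprint of one image grows with the imaging distance, and the interval between imaging positions is the non-overlapping fraction of that footprint. This is a minimal sketch under that assumption, not the embodiment's exact calculation.

```python
import math


def imaging_interval(distance_m: float, fov_deg: float, overlap: float) -> float:
    """Interval between adjacent imaging positions that yields the given overlap
    rate, assuming a camera with angle of view fov_deg pointed perpendicular to
    the imaged surface at distance distance_m."""
    footprint = 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)
    return footprint * (1.0 - overlap)


# Longer imaging distance -> larger footprint -> sparser interval for the same overlap.
print(imaging_interval(30.0, 60.0, 0.8))  # approx. 6.9 m
print(imaging_interval(10.0, 60.0, 0.8))  # approx. 2.3 m
```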
  • the flight path processing unit 811 may acquire the angle of view of the imaging device 220 or the angle of view of the imaging device 230 from the imaging device 220 or the imaging device 230.
  • the angle of view of the imaging device 220 or the angle of view of the imaging device 230 may be the same or different in the horizontal direction and the vertical direction.
  • the angle of view of the imaging device 220 in the horizontal direction or the angle of view of the imaging device 230 is also referred to as a horizontal angle of view.
  • the angle of view of the imaging device 220 or the angle of view of the imaging device 230 in the vertical direction is also referred to as the vertical angle of view.
  • the flight path processing unit 811 may acquire information on one angle of view having the same value when the horizontal angle of view and the vertical angle of view are the same value.
  • the flight path processing unit 811 determines the imaging position (waypoint) of the subject by the unmanned air vehicle 100 based on the flight range and the imaging position interval.
  • the imaging positions by the unmanned air vehicle 100 may be arranged at equal intervals in the horizontal direction, and the distance between the last imaging position and the first imaging position may be shorter than the imaging position interval. This interval is a horizontal imaging interval.
  • the imaging positions by the unmanned air vehicle 100 may be arranged at equal intervals in the vertical direction, and the distance between the last imaging position and the first imaging position may be shorter than the imaging position interval. This interval is the vertical imaging interval.
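One possible reading of the two preceding items is that imaging positions are placed every imaging-position interval along the flight range, with the final gap allowed to be shorter. A minimal sketch under that reading:

```python
def place_positions(span_m: float, interval_m: float) -> list:
    """Positions every interval_m along a span of length span_m; the final gap
    may be shorter than interval_m so that the end of the span is still covered."""
    positions = []
    x = 0.0
    while x < span_m:
        positions.append(x)
        x += interval_m
    positions.append(span_m)
    return positions


print(place_positions(100.0, 30.0))  # [0.0, 30.0, 60.0, 90.0, 100.0]
```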
  • the flight path processing unit 811 generates a flight path passing through the determined imaging position for each imaging plane corresponding to the side surface of the object.
  • the flight path processing unit 811 may generate a flight path that sequentially passes through the imaging positions adjacent to each other in the flight course of one imaging plane, passes through all the imaging positions in that flight course, and then enters the flight course of the next imaging plane.
  • the flight path may be formed such that the altitude decreases as the flight path starts from the sky side.
  • the flight path may be formed such that the altitude increases as the flight path starts from the ground side.
  • the processing unit 110 of the unmanned air vehicle 100 may control the flight of the unmanned air vehicle 100 according to the generated flight path.
  • the processing unit 110 may cause the imaging device 220 or the imaging device 230 to image the subject at an imaging position that exists in the middle of the flight path. Therefore, the imaging device 220 or the imaging device 230 may capture the side surface of the subject at the imaging position in the flight path.
  • the captured image captured by the imaging device 220 or the imaging device 230 may be held in the memory 160 or the storage 170 of the unmanned air vehicle 100 or the memory 87 or the storage 89 of the portable terminal 80.
  • the processing unit 110 may refer to the memory 160 as appropriate (for example, when setting a flight path).
  • the shape data processing unit 812 may generate three-dimensional information (three-dimensional shape data) indicating the three-dimensional shape of an object (subject) based on a plurality of captured images captured at different imaging positions by either of the imaging devices 220 and 230.
  • the captured images used for restoring the three-dimensional shape data may be still images.
  • a known method may be used as the method for generating three-dimensional shape data based on a plurality of captured images. Known methods include, for example, MVS (Multi View Stereo), PMVS (Patch-based MVS), and SfM (Structure from Motion).
  • the plurality of captured images used for generating the three-dimensional shape data include two captured images whose imaging ranges partially overlap each other.
  • the higher the overlapping ratio of the imaging ranges, the larger the number of captured images used for generating the three-dimensional shape data of the same range, so the shape data processing unit 812 can improve the restoration accuracy of the three-dimensional shape.
  • the lower the overlapping ratio of the imaging ranges, the smaller the number of captured images used for generating the three-dimensional shape data of the same range, so the shape data processing unit 812 can shorten the generation time of the three-dimensional shape data. Note that the plurality of captured images need not include two captured images whose imaging ranges partially overlap each other.
  • the shape data processing unit 812 may acquire a plurality of captured images including captured images obtained by imaging the side surfaces of the subject. As a result, compared with the case of acquiring only captured images taken uniformly downward from the sky, the shape data processing unit 812 can collect a large number of image features of the side surfaces of the subject, and can improve the restoration of the three-dimensional shape around the subject.
  • FIG. 8 is a flowchart illustrating an example of a processing procedure of the flight path generation method according to the embodiment.
  • the illustrated example illustrates a process of performing aerial photography of a target region to acquire a rough shape of an object, and generating a flight path for estimating a three-dimensional shape based on the acquired rough shape.
  • the processing unit 81 of the mobile terminal 80 executes the process independently.
  • the flight path processing unit 811 of the processing unit 81 generates a flight path of the unmanned air vehicle 100 for shooting an object when performing shooting for estimating the three-dimensional shape of the object.
  • the flight path processing unit 811 inputs the flight range of the unmanned air vehicle 100 and designates the range of the imaging target region (S11).
  • the processing unit 110 of the unmanned aerial vehicle 100 receives information on the designated flight range, flies within that flight range, and performs aerial photography at predetermined imaging positions while looking down vertically at the objects in the imaging target area (S12). In this case, the processing unit 110 images the objects roughly (hereinafter sometimes referred to as "schematic imaging") at a small number of imaging positions.
  • the processing unit 110 of the unmanned aerial vehicle 100 acquires a bird's-eye shot image at each imaging position and records the captured image in the memory 160.
  • the flight path processing unit 811 of the processing unit 81 acquires a captured image obtained by rough imaging (downward aerial shooting) in the vertical direction of the imaging target region, and stores the acquired image in the memory 87 or the storage 89.
  • the flight path processing unit 811 acquires the approximate shape by estimating the approximate shape of the object (building, ground, etc.) using a known three-dimensional shape restoration technique using the acquired captured image group (S13).
  • the three-dimensional shape data of the approximate shape may include polygon data, for example.
  • the approximate shape of the object may be obtained from a 3D map database held by another device such as the mobile terminal 80 or a server, instead of being acquired by aerial imaging of the target region.
  • in that case, the three-dimensional shape data of the approximate shape may be acquired from the three-dimensional information (for example, polygon data) of buildings, roads, and the like included in the 3D map database.
  • the flight path processing unit 811 generates a detailed imaging flight path for estimating the three-dimensional shape of the object using the acquired schematic shape of the object (S14). Several examples of the flight path generation procedure using the general shape of the object will be described later.
  • the above operation example can generate a flight path for estimating the three-dimensional shape of an object and automate the detailed imaging of the object.
  • the setting of an appropriate flight path for the object can be automated.
  • FIG. 9 is a diagram for explaining an input example of the flight range A1.
  • the processing unit 81 of the portable terminal 80 inputs information on the flight range A1 through the operation unit 83.
  • the operation unit 83 may accept, as the flight range A1, a user input of a desired range in the map information M1 for which generation of the three-dimensional shape data is desired.
  • the information on the flight range A1 is not limited to a desired range, and may be a predetermined flight range.
  • the predetermined flight range may be, for example, one of ranges for periodically generating 3D shape data and measuring the 3D shape.
  • FIG. 10 is a diagram for explaining schematic imaging in the flight path FPA.
  • the flight path processing unit 811 may set the interval between the imaging positions CP (imaging position interval) to the interval d11 in the flight path FPA.
  • the interval d11 is a sparse interval (for example, an interval of several tens of meters) such that the size of an object (for example, a building) can be estimated.
  • the interval d11 is set to an interval where at least imaging ranges at adjacent imaging positions CP partially overlap. Imaging at each imaging position CP at the interval d11 of the flight path FPA may be referred to as schematic imaging.
  • the unmanned air vehicle 100 can reduce the imaging time by capturing images at sparse intervals as compared to capturing images at dense intervals.
  • a landscape including the building BL and the mountain MT may spread in the vertical direction (direction toward the ground, direction of gravity) of the flight path on which the unmanned air vehicle 100 flies. Therefore, the building BL and the mountain MT exist in the imaging range and are imaging targets.
  • the approximate shape of the object can be acquired from the captured image obtained by the approximate imaging.
  • FIG. 11 is a diagram for explaining generation of three-dimensional shape data having a rough shape based on rough imaging obtained by the flight path FPA.
  • the shape data processing unit 812 generates the three-dimensional shape data SD1 of the approximate shape of the object based on the plurality of captured images CI1 obtained at each imaging position CP by the schematic imaging of the flight path FPA.
  • the user can grasp the approximate shape of the ground existing in the vertical direction of the flight path FPA by confirming the three-dimensional shape data SD1 by display or the like.
  • the user can confirm that the mountain MT exists by confirming the shape (schematic shape) obtained from the three-dimensional shape data SD1 based on the schematic imaging, but cannot confirm the existence of the building BL.
  • this is because the mountain MT has a gentle outline, so even when it is imaged from the sky along the flight path FPA, the captured images CI1 contain sufficient information for generating the three-dimensional shape data SD1.
  • in contrast, the outline of the building BL is substantially parallel to the vertical direction, and it is difficult to sufficiently image the side surfaces of the building BL at the imaging positions CP of the flight path FPA, where the unmanned air vehicle 100 travels horizontally above the building BL. That is, the information necessary for three-dimensional shape estimation cannot be acquired from the captured images taken downward in the vicinity of the building BL.
  • therefore, the flight path processing unit 811 uses the data of the approximate shape of the object to generate and set a flight path and imaging positions so that the side surfaces of the object, which are substantially parallel to the vertical direction, are imaged from the horizontal direction (the direction normal to the vertical direction), that is, from the side.
  • the shape data processing unit 812 generates the three-dimensional shape data of the object using the captured image including the captured image of the side of the object captured according to the generated flight path. Thereby, the estimation accuracy of the three-dimensional shape of the object can be improved.
  • FIG. 12 is a flowchart illustrating an example of a processing procedure of the three-dimensional shape estimation method according to the embodiment.
  • the processing unit 81 of the mobile terminal 80 as an example of the processing unit of the information processing apparatus executes the processing independently.
  • the flight path processing unit 811 of the processing unit 81 sets a flight path for the unmanned air vehicle 100 using the generated flight path (S21).
  • the processing unit 110 of the unmanned aerial vehicle 100 flies over the flight range of the imaging target area according to the set flight path, and aerially captures the object laterally at a predetermined imaging position (S22).
  • the processing unit 110 captures an object in detail (hereinafter, may be referred to as “detailed imaging”) by partially overlapping the imaging range at every predetermined imaging position interval.
  • the processing unit 110 of the unmanned air vehicle 100 acquires the captured image at each imaging position and records the captured image in the memory 160.
  • the shape data processing unit 812 of the processing unit 81 acquires a captured image obtained by detailed imaging (side aerial imaging) of the imaging target region and stores it in the memory 87 or the storage 89.
  • the shape data processing unit 812 generates three-dimensional shape data by estimating the three-dimensional shape of an object (building, ground, etc.) from the acquired captured image group using a known three-dimensional shape restoration technique (S23).
  • three-dimensional shape data including the shape of the side surface of the object can be generated using a detailed captured image obtained by capturing the object from the side. Therefore, it is possible to estimate the detailed shape of the side surface that has been difficult to restore in the captured image obtained by capturing the object downward, and to improve the accuracy of the three-dimensional shape data of the object.
  • FIG. 13 is a diagram for describing a first operation example of flight path generation using the schematic shape of an object in the embodiment.
  • the first operation example is an example in which a polyhedron such as a cube surrounding an object is calculated to generate a shooting plane that faces the side of the object.
  • the flight path processing unit 811 of the processing unit 81 uses the acquired schematic shape to calculate a polyhedron that surrounds the outer shape of the object.
  • this polyhedron is a solid that circumscribes the general shape of the object, or is slightly larger than the general shape.
  • a cube 301 is calculated as an example of a polyhedron.
  • the flight path processing unit 811 extracts at least one side surface 303 in the polyhedron of the cube 301.
  • the side surface 303 may be a surface along the vertical direction in the polyhedron or a surface standing within a predetermined angle range in the vertical direction.
  • the flight path processing unit 811 calculates a normal 304 outward of the polyhedron with respect to the extracted side surface 303.
  • the normal 304 can be calculated as the cross product of two vectors along the side surface 303 (for example, vectors connecting its vertices).
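The outward normal mentioned above can be obtained as the cross product of two edge vectors of the side surface. A minimal sketch, assuming the vertices are ordered counter-clockwise when the face is viewed from outside the polyhedron:

```python
def outward_normal(p0, p1, p2):
    """Unit normal of the plane through p0, p1, p2 (each an (x, y, z) tuple),
    computed as the cross product of the edges p0->p1 and p0->p2."""
    ux, uy, uz = p1[0] - p0[0], p1[1] - p0[1], p1[2] - p0[2]
    vx, vy, vz = p2[0] - p0[0], p2[1] - p0[1], p2[2] - p0[2]
    nx, ny, nz = uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx
    length = (nx * nx + ny * ny + nz * nz) ** 0.5
    return (nx / length, ny / length, nz / length)


# A vertical face lying in the x-z plane: the normal is horizontal (along the y axis).
print(outward_normal((0, 0, 0), (0, 0, 3), (1, 0, 0)))  # (0.0, 1.0, 0.0)
```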
  • the flight path processing unit 811 calculates an imaging plane 305 having a predetermined imaging distance and parallel to the side surface 303 using the acquired normal line 304.
  • the imaging plane 305 is located at a predetermined imaging distance from the side surface 303 and is a plane perpendicular to the normal line 304.
  • the flight path processing unit 811 sets a plurality of imaging positions (waypoints) 306 having predetermined imaging position intervals in the generated imaging plane 305 and determines an imaging path 307 passing through each imaging position 306. As a result, a flight path including the imaging path 307 is generated.
  • the shooting direction at each imaging position 306 faces the side surface of the object, that is, it is the direction opposite to the normal 304.
  • in this example, the shooting plane is a vertical plane, and the shooting direction is a horizontal direction perpendicular to the shooting plane.
  • FIG. 14 is a diagram for explaining the setting of a plurality of imaging positions 306 on the imaging plane 305.
  • the flight path processing unit 811 sets a predetermined shooting distance L in the normal direction with respect to the side surface 303 of the polyhedron, and calculates a shooting plane 305 parallel to the side surface 303 at a position away from the side surface 303 by the shooting distance L.
  • the flight path processing unit 811 sets a predetermined imaging position interval d on the imaging plane 305 and determines an imaging position 306 for each imaging position interval d. For example, the following method may be used to set the shooting distance L and the imaging position interval d.
  • the user designates an imaging distance L [m] and an imaging position interval d [m].
  • the processing unit 81 of the portable terminal 80 inputs information on the shooting distance L and the imaging position interval d through the operation unit 83 according to the operation input by the user, and stores the information in the memory 87. Thereby, the imaging position for detailed imaging can be set based on the imaging distance specified by the user and the imaging position interval.
  • the user designates the shooting distance L [m] and the overlapping rate r_side [%] of the imaging ranges, and the imaging position interval d [m] is calculated from the shooting distance L and the overlapping rate r_side.
  • the imaging position interval d can be calculated by equation (1) using the shooting distance L, the overlapping rate r_side, and the angle of view FOV (Field of View) of the imaging device.
  • the processing unit 81 of the portable terminal 80 inputs information on the shooting distance L and the overlapping rate r_side through the operation unit 83 according to the operation input by the user, and stores the information in the memory 87.
  • the processing unit 81 acquires information on the angle of view FOV of the imaging device 220 from the unmanned air vehicle 100 by the interface unit 82 or the wireless communication unit 85 and stores the information in the memory 87.
  • the processing unit 81 calculates the imaging position interval d by the above mathematical formula (1). As a result, the imaging position interval can be calculated based on the shooting distance specified by the user and the overlapping ratio of the imaging ranges, and the imaging position for detailed imaging can be set.
  • the user designates the resolution r [m/pixel] of the captured image and the overlapping rate r_side [%] of the imaging ranges; the imaging position interval d [m] is calculated from the resolution r and the overlapping rate r_side, and the shooting distance L [m] is calculated from the imaging position interval d.
  • the imaging position interval d can be calculated by equation (2) using the resolution r, the width w of the captured image, and the overlapping rate r_side.
  • the shooting distance L can be calculated by equation (3) using the imaging position interval d, the overlapping rate r_side, and the angle of view FOV of the imaging device.
  • the processing unit 81 of the portable terminal 80 inputs information on the resolution r and the overlapping rate r_side through the operation unit 83 according to the operation input by the user, and stores the information in the memory 87.
  • the processing unit 81 calculates the imaging position interval d by the above mathematical formula (2).
  • the processing unit 81 acquires information on the angle of view FOV of the imaging device 220 from the unmanned air vehicle 100 by the interface unit 82 or the wireless communication unit 85 and stores the information in the memory 87.
  • the processing unit 81 calculates the shooting distance L by the above mathematical formula (3). Thereby, the imaging position interval can be calculated based on the resolution of the captured image specified by the user and the overlapping rate of the imaging ranges, and the imaging positions for detailed imaging can be set.
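Equations (1) to (3) themselves are not reproduced in this text, but the three specification modes above are consistent with simple pinhole-camera geometry in which the width of one imaging range equals 2·L·tan(FOV/2) and also equals the resolution r times the image width w in pixels. The sketch below uses those assumed relations; it is not a verbatim transcription of the embodiment's equations.

```python
import math


def interval_from_distance(L: float, r_side: float, fov_deg: float) -> float:
    """Imaging position interval d from shooting distance L and overlap rate r_side."""
    return 2.0 * L * math.tan(math.radians(fov_deg) / 2.0) * (1.0 - r_side)


def interval_from_resolution(r: float, w_px: int, r_side: float) -> float:
    """Imaging position interval d from ground resolution r [m/pixel] and image width w [pixels]."""
    return r * w_px * (1.0 - r_side)


def distance_from_interval(d: float, r_side: float, fov_deg: float) -> float:
    """Shooting distance L back-calculated from the imaging position interval d."""
    return d / (2.0 * math.tan(math.radians(fov_deg) / 2.0) * (1.0 - r_side))


d = interval_from_resolution(r=0.01, w_px=4000, r_side=0.7)  # 12.0 m between imaging positions
L = distance_from_interval(d, r_side=0.7, fov_deg=84.0)      # approx. 22.2 m shooting distance
print(d, L)
```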
  • based on the set imaging position interval d, the flight path processing unit 811 arranges a plurality of imaging positions 306 at equal intervals of d on the imaging plane 305, and determines the imaging path 307 passing through these imaging positions 306.
  • the imaging positions at the ends of the imaging plane 305, such as the imaging positions of the start point and the end point on the imaging plane 305, may be set within a range of d/2 or less from the edges of the imaging plane 305.
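As a concrete illustration, waypoints can be laid out on a rectangular imaging plane as a grid spaced at most d apart, with the outermost rows and columns coinciding with (and therefore within d/2 of) the edges of the plane. The plane size and anchoring below are assumptions for the sketch.

```python
import math


def grid_positions(width_m: float, height_m: float, d: float):
    """(u, v) waypoints on a width_m x height_m imaging plane, spaced at most d apart."""
    def axis(length: float):
        n = max(1, math.ceil(length / d))  # number of gaps along this axis
        step = length / n                  # actual spacing, <= d
        return [i * step for i in range(n + 1)]
    return [(u, v) for v in axis(height_m) for u in axis(width_m)]


print(len(grid_positions(40.0, 20.0, 10.0)))  # 5 columns x 3 rows = 15 waypoints
```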
  • FIG. 15 is a flowchart illustrating a processing procedure of a first operation example of flight path generation using the schematic shape of an object in the embodiment.
  • the flight path processing unit 811 of the processing unit 81 uses the acquired schematic shape to calculate a polyhedron (cube 301) that surrounds the outer shape of the object (S31).
  • the flight path processing unit 811 sequentially extracts at least one (four in the case of a cube) side surfaces 303 in the polyhedron of the cube 301 (S32).
  • the flight path processing unit 811 calculates a normal 304 outward of the polyhedron for one extracted side surface 303 (S33).
  • the flight path processing unit 811 calculates an imaging plane 305 parallel to the side surface 303 at a position away from the predetermined imaging distance L by using the acquired normal line 304 (S34).
  • the flight path processing unit 811 sets a plurality of imaging positions (waypoints) 306 having a predetermined imaging position interval d on the one calculated imaging plane 305, and generates an imaging path 307 for shooting from each of these imaging positions in the direction facing the object (S35).
  • the flight path processing unit 811 determines whether the generation of the imaging paths 307 for all the extracted side surfaces 303 has been completed for the object (S36). If the imaging path generation for all the side surfaces 303 has not been completed, the flight path processing unit 811 returns to step S32, extracts the next side surface 303, and repeats the same processing up to the generation of the imaging path 307 (S32 to S35).
  • when it is determined in step S36 that the imaging path generation for all the side surfaces 303 has been completed, the flight path processing unit 811 combines the imaging paths 307 of the respective imaging planes 305 to generate a flight path (S37).
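For the special case in which the polyhedron of step S31 is an axis-aligned box, the whole loop S32 to S37 reduces to a few lines. The sketch below is a simplified stand-in for that loop (local metric coordinates, no geodetic conversion, no obstacle checks, and rows flown in the same direction rather than in a zigzag); it is not the embodiment's implementation.

```python
import math


def box_side_flight_path(box_min, box_max, L, d):
    """Waypoints around the four vertical side faces of an axis-aligned box
    (box_min and box_max are (x, y, z) corners), offset outward by the shooting
    distance L and spaced at most d apart. Each waypoint is returned as
    (position, shooting_direction), where the shooting direction is the opposite
    of the face's outward normal."""
    (x0, y0, z0), (x1, y1, z1) = box_min, box_max
    # Each vertical face in plan view: (start corner, end corner, outward normal).
    faces = [
        ((x0, y0), (x1, y0), (0.0, -1.0)),
        ((x1, y0), (x1, y1), (1.0, 0.0)),
        ((x1, y1), (x0, y1), (0.0, 1.0)),
        ((x0, y1), (x0, y0), (-1.0, 0.0)),
    ]

    def spaced(a, b):
        n = max(1, math.ceil(abs(b - a) / d))
        return [a + (b - a) * i / n for i in range(n + 1)]

    path = []
    for (ax, ay), (bx, by), (nx, ny) in faces:
        ox, oy = ax + nx * L, ay + ny * L  # start of the imaging plane, offset outward by L
        ex, ey = bx + nx * L, by + ny * L  # end of the imaging plane
        length = math.hypot(ex - ox, ey - oy)
        n = max(1, math.ceil(length / d))
        for z in spaced(z0, z1):           # vertical rows on the imaging plane
            for i in range(n + 1):         # positions along one row
                t = i / n
                x, y = ox + (ex - ox) * t, oy + (ey - oy) * t
                path.append(((x, y, z), (-nx, -ny, 0.0)))
    return path


waypoints = box_side_flight_path((0, 0, 0), (20, 10, 30), L=15.0, d=10.0)
print(len(waypoints))  # 40 waypoints over the four imaging planes
```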
  • the processing unit 110 of the unmanned aerial vehicle 100 communicates with the portable terminal 80 through the communication interface 150, acquires the flight path information generated by the flight path processing unit 811, and sets the flight path of the unmanned air vehicle 100.
  • the processing unit 110 flies around the object according to the set flight path, and images the object by the imaging devices 220 and 230 at each of a plurality of imaging positions (waypoints).
  • the processing unit 110 captures images at each imaging position 306 in order for each imaging plane 305 by a flight path obtained by combining the imaging paths 307 of the imaging planes 305.
  • when the processing unit 110 completes imaging at each imaging position 306 on the imaging plane 305 corresponding to one side surface 303 of the polyhedron (cube 301) approximating the object included in the subject, it moves to the imaging plane corresponding to, for example, the side surface adjacent to the current one, and performs imaging at each imaging position on that plane. In this way, the unmanned aerial vehicle 100 acquires captured images of the sides, taken toward the side surfaces of the object, at the imaging positions of all imaging planes set in the flight path.
  • the processing unit 81 of the portable terminal 80 communicates with the unmanned aerial vehicle 100 through the interface unit 82 or the wireless communication unit 85, and acquires a captured image captured by the unmanned aerial vehicle 100.
  • the shape data processing unit 812 of the processing unit 81 generates the three-dimensional shape data of the objects (buildings, ground, etc.) using the acquired captured images of the sides of the objects, so that a three-dimensional shape including the details of the side surfaces of the objects can be estimated.
  • the captured images may include, in addition to the side captured images, downward captured images obtained by detailed imaging of the object in the vertical direction at the imaging position interval for detailed imaging.
  • in that case, the flight path processing unit 811 sets an imaging path including a plurality of imaging positions facing the upper surface of the object as well as the side surfaces, and generates a flight path.
  • according to the above operation example, by calculating the polyhedron surrounding the approximate shape of the object and extracting, as side surfaces, the surfaces of the polyhedron along the vertical direction or the surfaces standing within a predetermined angle range of the vertical direction, it is possible to extract side surfaces of the approximate shape that can be imaged toward the sides of the object. For this reason, imaging positions at which detailed imaging can be performed with the object viewed from the side can be set using the schematic shape of the object.
  • FIG. 16 is a diagram for describing a second operation example of flight path generation using the schematic shape of an object in the embodiment.
  • the second operation example is an example in which a mesh indicating the schematic shape of an object is simplified and a shooting plane directed to the side of the object is generated.
  • the flight path processing unit 811 of the processing unit 81 uses the acquired approximate shape to simplify the mesh indicating the approximate shape of the object.
  • a known method may be used as the mesh simplification method. Known methods include, for example, the vertex clustering method and the incremental decimation method.
  • by simplifying the polygon data, a complicated shape is smoothed and the number of polygons representing one surface is reduced.
  • the flight path processing unit 811 performs simplification processing on the schematic shape 311 and calculates a simplified polyhedron 312.
  • the flight path processing unit 811 extracts at least one side surface 313 in the simplified polyhedron 312, and calculates a normal 314 outward of the polyhedron with respect to the extracted side surface 313.
  • the direction of the normal is determined for each plane of the polyhedron 312. For example, if the absolute value of the vertical component Nz of the normalized normal is smaller than 0.1 (|Nz| < 0.1), the plane may be regarded as standing along the vertical direction and extracted as a side surface.
  • the side surface 313 may be a surface along the vertical direction in the polyhedron or a surface standing within a predetermined angle range in the vertical direction.
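The side-surface test described in the two items above (keeping the faces whose normalized normal is nearly horizontal, for example |Nz| < 0.1) can be written as a simple filter. The threshold and the face representation below are assumptions for illustration:

```python
def extract_side_faces(faces, nz_threshold=0.1):
    """Keep the faces whose unit normal is nearly horizontal, i.e. faces that
    stand (almost) vertically. Each face is a (label, unit_normal) pair."""
    return [face for face in faces if abs(face[1][2]) < nz_threshold]


faces = [
    ("roof",  (0.0, 0.0, 1.0)),     # normal points straight up: not a side surface
    ("wall",  (0.0, -1.0, 0.0)),    # horizontal normal: a side surface
    ("slope", (0.0, -0.7, 0.714)),  # tilted roof: filtered out with this threshold
]
print([label for label, _ in extract_side_faces(faces)])  # ['wall']
```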
  • the flight path processing unit 811 calculates an imaging plane 315 having a predetermined imaging distance L and parallel to the side surface 313 using the acquired normal line 314.
  • the flight path processing unit 811 sets a plurality of imaging positions (waypoints) 316 having a predetermined imaging position interval d on the calculated imaging plane 315, determines an imaging path 317 passing through each imaging position 316, and this imaging A flight path including the path 317 is generated.
  • in this example, the shooting plane is a plane standing within a predetermined angle range of the vertical direction, and the shooting direction is a substantially horizontal direction toward the side, that is, a direction facing the side surface of the object.
  • FIG. 17 is a flowchart illustrating a processing procedure of the second operation example of the flight path generation using the schematic shape of the object in the embodiment.
  • the flight path processing unit 811 of the processing unit 81 simplifies the mesh of the approximate shape of the object using the acquired approximate shape, and calculates the polyhedron 312 in which the approximate shape 311 is simplified (S41).
  • the flight path processing unit 811 extracts at least one side surface 313 in the polyhedron 312 (S42).
  • the flight path processing unit 811 calculates a normal 314 outward of the polyhedron with respect to the extracted one side 313 (S43).
  • the flight path processing unit 811 calculates the imaging plane 315 parallel to the side surface 313 at a position away from the predetermined imaging distance L using the calculated normal 314 (S44).
  • the flight path processing unit 811 sets a plurality of imaging positions (waypoints) 316 having a predetermined imaging position interval d on the one generated imaging plane 315, and generates an imaging path 317 for shooting from each of these imaging positions in the direction facing the object (S45).
  • the flight path processing unit 811 determines whether the generation of the imaging paths 317 for all the extracted side surfaces 313 has been completed for the object (S46). If the imaging path generation for all the side surfaces 313 has not been completed, the flight path processing unit 811 returns to step S42, extracts the next side surface 313 adjacent to the current one, and repeats the same processing up to the generation of the imaging path 317 (S42 to S45).
  • when it is determined in step S46 that the imaging path generation for all the side surfaces 313 has been completed, the flight path processing unit 811 combines the imaging paths 317 of the imaging planes 315 to generate a flight path (S47).
  • FIG. 18 is a diagram for describing a third operation example of flight path generation using the schematic shape of an object in the embodiment.
  • the third operation example is an example in which a photographing plane directed to the side of the object is generated by combining polyhedrons such as a plurality of cubes surrounding the object.
  • the flight path processing unit 811 of the processing unit 81 calculates a plurality of polyhedrons surrounding the schematic shape of each object, for a plurality of objects such as buildings, using the acquired schematic shape.
  • cubes or rectangular parallelepipeds 321A, 321B, and 321C are shown as an example of the plurality of polyhedrons.
  • the flight path processing unit 811 combines the plurality of polyhedrons 321A, 321B, and 321C, and calculates the combined polyhedron 322. By combining adjacent polyhedrons, a collision of the unmanned air vehicle 100 with an object during detailed side imaging is avoided.
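One simple way to realize this combination is to merge bounding boxes whose extents, expanded by the shooting distance, overlap, so that the offset imaging plane of one object cannot cut through a neighbouring object. The merge criterion and the use of axis-aligned boxes below are assumptions for the sketch, not the embodiment's exact rule.

```python
def merge_close_boxes(boxes, L):
    """Greedily merge axis-aligned boxes ((min_xyz, max_xyz) pairs) whose
    L-expanded extents overlap; each merged group is returned as its combined
    axis-aligned extent (a coarse stand-in for the combined polyhedron 322)."""
    def expanded_overlap(a, b):
        return all(a[0][i] - L <= b[1][i] and b[0][i] - L <= a[1][i] for i in range(3))

    merged = []
    for box in boxes:
        box = (list(box[0]), list(box[1]))
        i = 0
        while i < len(merged):
            if expanded_overlap(box, merged[i]):
                other = merged.pop(i)
                box = ([min(box[0][k], other[0][k]) for k in range(3)],
                       [max(box[1][k], other[1][k]) for k in range(3)])
                i = 0  # the grown box may now touch earlier boxes, so re-check all
            else:
                i += 1
        merged.append(box)
    return merged


boxes = [((0, 0, 0), (10, 10, 30)), ((12, 0, 0), (20, 10, 25)), ((100, 100, 0), (110, 110, 15))]
print(len(merge_close_boxes(boxes, L=15.0)))  # 2: the first two buildings merge, the distant one stays separate
```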
  • the flight path processing unit 811 extracts at least one side surface 323 from the combined polyhedron 322.
  • the side surface 323 may be a surface along the vertical direction in the polyhedron or a surface standing within a predetermined angle range in the vertical direction.
  • the flight path processing unit 811 calculates a normal line 324 outward of the polyhedron with respect to the extracted side surface 323.
  • the flight path processing unit 811 calculates a shooting plane 325 having a predetermined shooting distance L and parallel to the side surface 323 using the calculated normal 324.
  • the flight path processing unit 811 sets a plurality of imaging positions (waypoints) 326 having a predetermined imaging position interval d inside the calculated imaging plane 325, determines an imaging path 327 passing through each imaging position 326, and A flight path including the imaging path 327 is generated.
  • the shooting direction at each imaging position 326 faces the side surface of the object, that is, it is the direction opposite to the normal 324.
  • the shooting plane is a vertical plane, and the shooting direction is a horizontal direction perpendicular to the shooting plane.
  • FIG. 19 is a flowchart illustrating a processing procedure of a third operation example of the flight path generation using the schematic shape of the object in the embodiment.
  • the flight path processing unit 811 of the processing unit 81 calculates a plurality of polyhedrons 321A, 321B, and 321C surrounding the outer shape of each object for a plurality of objects using the acquired schematic shape (S51).
  • the flight path processing unit 811 combines the polyhedrons 321A, 321B, and 321C, and calculates the combined polyhedron 322 (S52).
  • the flight path processing unit 811 sequentially extracts at least one side surface 323 from the combined polyhedron 322 (S32).
  • in step S36, when the imaging path generation for all the side surfaces 323 has not been completed, the flight path processing unit 811 returns to the process of step S32, extracts the next side surface 323 adjacent to the current one, and repeats the same processing up to the generation of the imaging path 327 (S32 to S35).
  • the flight path processing unit 811 combines the shooting paths 327 of the shooting planes 325 in the combined polyhedron 322 to generate a flight path (S57).
  • the above-described second operation example and third operation example may be combined: the schematic shapes of the objects are simplified and the resulting polyhedrons are combined, side surfaces are extracted, the shooting planes and imaging positions are set, the imaging paths are determined, and the flight path is generated.
  • that is, polyhedrons corresponding to a plurality of approximate shapes of objects are calculated, a plurality of adjacent polyhedrons are combined, and the surfaces along the vertical direction of the combined polyhedron, or the surfaces standing within a predetermined angle range of the vertical direction, are extracted as side surfaces.
  • in this way, the mobile terminal 80 functioning as an information processing apparatus can set, using the schematic shape of the object, imaging positions at which detailed imaging can be performed with the object viewed from the side, and can set a flight path passing through those imaging positions.
  • imaging positions for detailed imaging with the object viewed from the side can be set using the approximate shape of the object.
  • by extracting the side surfaces of the schematic shape, it is possible to set side imaging positions for detailed imaging corresponding to the side surfaces of the object.
  • by generating a flight path that passes through the set imaging positions, a flight path for detailed imaging that includes the sides of the object can be set.
  • by setting, for each extracted side surface, an imaging position facing that side surface, it is possible to perform detailed imaging in the horizontal direction with the object viewed from the side.
  • in the present embodiment, by setting a plurality of imaging positions having a predetermined imaging position interval corresponding to the extracted side surfaces, captured images having an appropriate resolution and overlapping rate can be obtained.
  • by determining an imaging path that passes through the plurality of set imaging positions and generating a flight path including the imaging path, it is possible to set a flight path for detailed imaging including the sides of the object.
  • by calculating the normal of the extracted side surface, the imaging position facing the side surface of the object can be easily determined.
  • by using the calculated normal, a shooting plane parallel to the side surface at a predetermined shooting distance can be easily generated.
  • by setting imaging positions at the predetermined imaging position interval on the shooting plane, imaging positions from which captured images having an appropriate overlapping rate can be acquired can be set.
  • by flying through the imaging paths corresponding to the plurality of side surfaces of the object in order, images of each side surface can be taken efficiently.
  • FIG. 20 is a schematic diagram illustrating a second configuration example of the flight path generation system 10A in the embodiment.
  • the flight path generation system 10A includes an unmanned air vehicle 100 and a PC (Personal Computer) 70.
  • the unmanned air vehicle 100 and the PC 70 can communicate with each other using wired communication or wireless communication (for example, wireless LAN or Bluetooth (registered trademark)).
  • the PC 70 may be a computer such as a desktop PC, a notebook PC, or a tablet terminal.
  • the PC 70 may be a computer having a server and a client terminal connected via a network.
  • the PC 70 is an example of an information processing apparatus.
  • the PC 70 may include a processor (for example, a CPU, MPU, or DSP) as an example of a processing unit, a memory as an example of a storage unit, a communication interface, a display, an input device, and a storage.
  • the PC 70 as an example of the information processing apparatus has the same functions as the processing unit 81, the flight path processing unit 811, and the shape data processing unit 812 included in the mobile terminal 80 illustrated in FIG. 7.
  • the PC 70 functioning as the information processing apparatus can set, using the schematic shape of the object, imaging positions for detailed imaging with the object viewed from the side, and can set a flight path passing through those imaging positions.
  • FIG. 21 is a block diagram illustrating an example of a hardware configuration of an unmanned air vehicle 100A according to a third configuration example of the flight path generation system 10B in the embodiment.
  • the unmanned air vehicle 100A of the flight path generation system 10B includes a processing unit 110A instead of the processing unit 110, as compared with the unmanned air vehicle 100 illustrated in FIG.
  • the unmanned air vehicle 100A has a function as an example of an information processing device, and the processing unit 110A of the unmanned air vehicle 100A is an example of a processing unit of the information processing device.
  • the same components as those of the unmanned air vehicle 100 of FIG. 4 are denoted by the same reference numerals, and description thereof is omitted or simplified.
  • the processing unit 110A as an example of the processing unit of the information processing apparatus includes a flight path processing unit 111 and a shape data processing unit 112.
  • the flight path processing unit 111 has the same function as the flight path processing unit 811 provided in the portable terminal 80 shown in FIG.
  • the shape data processing unit 112 has the same function as the shape data processing unit 812 included in the mobile terminal 80 shown in FIG.
  • the processing unit 110A of the unmanned air vehicle 100A functioning as an information processing apparatus can set, using the approximate shape of the object, imaging positions at which detailed imaging can be performed with the object viewed from the side, and can set a flight path passing through those imaging positions.
  • FIG. 22 is a block diagram illustrating an example of a hardware configuration of a transmitter 50A according to a fourth configuration example of the flight path generation system 10C in the embodiment.
  • the transmitter 50A includes a processing unit 61A instead of the processing unit 61, as compared with the transmitter 50 illustrated in FIG.
  • the transmitter 50A has a function as an example of an information processing device, and the processing unit 61A of the transmitter 50A is an example of a processing unit of the information processing device.
  • the same components as those of the transmitter 50 of FIG. 6 are denoted by the same reference numerals, and description thereof is omitted or simplified.
  • the processing unit 61A as an example of the processing unit of the information processing apparatus includes a flight path processing unit 611 and a shape data processing unit 612.
  • the flight path processing unit 611 has the same function as the flight path processing unit 811 provided in the portable terminal 80 shown in FIG.
  • the shape data processing unit 612 has the same function as the shape data processing unit 812 included in the mobile terminal 80 shown in FIG.
  • the processing unit 61A of the transmitter 50A functioning as an information processing apparatus can set, using the approximate shape of the object, imaging positions at which detailed imaging can be performed with the object viewed from the side, and can set a flight path passing through those imaging positions.
  • the generated flight path is set in the flying object, and images including detailed lateral images of the objects are obtained while the flying object flies in the imaging target area according to the flight path.
  • these images may be used for generating the three-dimensional shape data of the objects existing in the imaging target area.
  • a captured image acquired by detailed lateral imaging may be used for inspection of the side surface of the object.
  • an example has been described in which the information processing apparatus that executes the steps of the flight path generation method is provided in any one of the mobile terminal 80, the unmanned air vehicle 100A, and the transmitter 50A; however, another device may be provided to perform the steps of the flight path generation method.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Studio Devices (AREA)

Abstract

This flight path generation method for generating a flight path of a flying object for imaging an imaging subject has a step for obtaining the general shape of an object included in the imaging subject, a step for extracting the lateral surface of the general shape, a step for setting an imaging location that corresponds to the lateral surface, and a step for generating a flight path that passes through the imaging location.

Description

Flight path generation method, information processing apparatus, flight path generation system, program, and recording medium

 The present disclosure relates to a flight path generation method, an information processing apparatus, a flight path generation system, a program, and a recording medium for generating a flight path of a flying object.

 A platform (for example, an unmanned air vehicle) that is equipped with an imaging device and performs imaging while flying along a preset fixed route is known (see, for example, Patent Document 1). This platform receives commands such as a flight route and imaging instructions from a ground base, flies in accordance with the commands, performs imaging, and sends the acquired images to the ground base. When imaging an imaging target, the platform tilts its imaging device based on the positional relationship between the platform and the imaging target while flying along the preset fixed route.

 Conventionally, it is also known to estimate the three-dimensional shape of a subject such as a building based on captured images such as aerial photographs taken by an unmanned aerial vehicle (for example, a UAV: Unmanned Aerial Vehicle) flying in the air. In order to automate imaging (for example, aerial photography) by an unmanned aerial vehicle, a technique for generating the flight path of the unmanned aerial vehicle in advance is used. Therefore, in order to estimate the three-dimensional shape of a subject such as a building using an unmanned aerial vehicle, the unmanned aerial vehicle is flown according to a previously generated flight path, and a plurality of captured images of the subject, taken by the unmanned aerial vehicle at different imaging positions along the flight path, need to be acquired.

Japanese Unexamined Patent Publication No. 2010-61216

 In the platform described in Patent Document 1, imaging is performed while moving along the fixed route, but the presence of objects (for example, buildings) positioned in the vertical direction below the fixed route is not sufficiently taken into account. Therefore, it is difficult to sufficiently acquire captured images of the side surfaces of an object, or captured images of portions hidden behind the parts of the object observable from above. As a result, the captured images for estimating the three-dimensional shape are insufficient, and the estimation accuracy of the three-dimensional shape is lowered.

 When imaging the side surface of an object, it is conceivable that a photographer holds an imaging device and images the side surface of the object. In this case, the user needs to move to the vicinity of the object, which reduces the convenience for the user. In addition, since the imaging is performed manually by the user, captured images in a desired state (for example, a desired imaging position of the object, a desired imaging size of the object, and a desired imaging direction of the object) may not be sufficiently acquired.

 When the side surface of a specific object is imaged by an unmanned aircraft, it is also conceivable to manually determine in advance the flight route along which the unmanned aircraft flies. When a desired position around the object is designated as an imaging position, the position in three-dimensional space (latitude, longitude, altitude) may be designated by user input. In this case, since each imaging position is determined by user input, the convenience for the user is reduced. In addition, detailed information on the object is required in advance for determining the flight route, so the preparation takes time and effort.

 In one aspect, a flight path generation method for generating a flight path of a flying object that images a subject includes a step of acquiring a schematic shape of an object included in the subject, a step of extracting a side surface of the schematic shape, a step of setting an imaging position corresponding to the side surface, and a step of generating a flight path passing through the imaging position.

 The step of setting the imaging position may include a step of setting, for each extracted side surface, an imaging position facing the side surface.

 The step of setting the imaging position may include a step of setting a plurality of imaging positions having a predetermined imaging position interval corresponding to the side surface.

 The step of generating the flight path may include a step of determining an imaging path passing through the plurality of imaging positions and generating a flight path including the imaging path.

 The flight path generation method may further include a step of generating an imaging plane parallel to the side surface at a predetermined shooting distance, and the step of setting the imaging position may include a step of setting a plurality of imaging positions having a predetermined imaging position interval on the imaging plane.

 The step of setting the imaging position may use, as the predetermined imaging position interval, an imaging position interval at which a part of the captured image captured at each imaging position overlaps with the others.

 The flight path generation method may further include a step of calculating a polyhedron surrounding the schematic shape of the object, and the step of extracting the side surface may include a step of extracting, as the side surface, a surface of the polyhedron along the vertical direction or a surface standing within a predetermined angle range of the vertical direction.

 The flight path generation method may further include a step of calculating a polyhedron in which the schematic shape of the object is simplified, and the step of extracting the side surface may include a step of extracting, as the side surface, a surface of the polyhedron along the vertical direction or a surface standing within a predetermined angle range of the vertical direction.

 The step of calculating the polyhedron may include a step of calculating polyhedrons corresponding to a plurality of schematic shapes of objects and combining a plurality of adjacent polyhedrons.

 The step of generating the flight path may include a step of generating a flight path passing through the imaging positions on one side surface and then generating a flight path passing through the imaging positions on the next side surface adjacent to that side surface.

 The flight path generation method may further include a step of acquiring a captured image obtained by imaging the object downward, and the step of acquiring the schematic shape may include a step of acquiring three-dimensional shape data of the schematic shape of the object using the captured image.

 一態様において、被写体を撮像する飛行体の飛行経路を生成する情報処理装置であって、飛行経路に関する処理を実行する処理部を有し、処理部は、被写体に含まれるオブジェクトの概略形状を取得し、概略形状における側面を抽出し、側面に対応する撮像位置を設定し、撮像位置を通過する飛行経路を生成する、情報処理装置である。 In one aspect, an information processing apparatus that generates a flight path of a flying object that captures an image of a subject, and includes a processing unit that executes processing related to the flight path, and the processing unit acquires a schematic shape of an object included in the subject Then, the information processing apparatus extracts a side surface in a schematic shape, sets an imaging position corresponding to the side surface, and generates a flight path passing through the imaging position.

 処理部は、撮像位置の設定において、抽出した側面毎に、側面に対向する撮像位置を設定してよい。 The processing unit may set an imaging position facing the side surface for each extracted side surface in setting the imaging position.

 処理部は、撮像位置の設定において、側面に対応して、所定の撮像位置間隔を持つ複数の撮像位置を設定してよい。 The processing unit may set a plurality of imaging positions having a predetermined imaging position interval corresponding to the side surface in setting the imaging position.

 処理部は、飛行経路の生成において、複数の撮像位置を通る撮影経路を決定し、撮影経路を含む飛行経路を生成してよい。 In the generation of the flight path, the processing unit may determine a shooting path that passes through a plurality of imaging positions and generate a flight path including the shooting path.

 処理部は、更に、側面に対して所定の撮影距離を有して平行する撮影平面を生成し、撮像位置の設定において、撮影平面において所定の撮像位置間隔を持つ複数の撮像位置を設定してよい。 The processing unit may further generate a shooting plane that is parallel to the side surface at a predetermined shooting distance, and, in setting the imaging positions, may set a plurality of imaging positions having a predetermined imaging position interval on the shooting plane.
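 A minimal sketch of placing such a grid of imaging positions on a shooting plane offset from a vertical side surface is shown below; the facade is assumed to be defined by two ground-level corner points and a height, and the outward direction of the normal is an assumption of the example.

```python
import numpy as np

def waypoints_on_shooting_plane(p0, p1, height_m, distance_m, step_m):
    """Grid of imaging positions on a plane parallel to a vertical facade.

    p0, p1     : two ground-level corners of the facade, as (x, y, z)
    height_m   : facade height
    distance_m : shooting distance from the facade
    step_m     : imaging position interval (horizontal and vertical)
    """
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    along = (p1 - p0) / np.linalg.norm(p1 - p0)      # direction along the facade
    normal = np.array([-along[1], along[0], 0.0])    # horizontal normal, assumed to point outward
    cols = np.arange(0.0, np.linalg.norm(p1 - p0) + 1e-9, step_m)
    rows = np.arange(0.0, height_m + 1e-9, step_m)
    return [p0 + c * along + np.array([0.0, 0.0, r]) + distance_m * normal
            for r in rows for c in cols]

# 20 m wide, 10 m tall facade, shot from 5 m away with a 5 m interval
for wp in waypoints_on_shooting_plane((0, 0, 0), (20, 0, 0), 10, 5, 5):
    print(wp)
```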

 処理部は、撮像位置の設定において、所定の撮像位置間隔として、各撮像位置において撮像した撮像画像の一部が他と重複する撮像位置間隔を用いてよい。 In setting the imaging positions, the processing unit may use, as the predetermined imaging position interval, an interval at which the images captured at adjacent imaging positions partially overlap one another.

 処理部は、更に、オブジェクトの概略形状を囲む多面体を算出し、側面の抽出において、多面体における鉛直方向に沿う面、又は鉛直方向の所定角度範囲内に立った面を側面として抽出してよい。 The processing unit may further calculate a polyhedron surrounding the approximate shape of the object, and in extracting the side surface, a surface along the vertical direction of the polyhedron or a surface standing within a predetermined angle range in the vertical direction may be extracted as the side surface.

 処理部は、更に、オブジェクトの概略形状を簡略化した多面体を算出し、側面の抽出において、多面体における鉛直方向に沿う面、又は鉛直方向の所定角度範囲内に立った面を側面として抽出してよい。 The processing unit may further calculate a polyhedron that simplifies the approximate shape of the object, and, in extracting the side surfaces, may extract, as a side surface, a face of the polyhedron that lies along the vertical direction or that stands within a predetermined angle range of the vertical direction.

 処理部は、多面体の算出において、オブジェクトの複数の概略形状に対応する多面体をそれぞれ算出し、近接する複数の多面体を結合してよい。 In the calculation of the polyhedron, the processing unit may calculate a polyhedron corresponding to a plurality of approximate shapes of the object, and combine a plurality of adjacent polyhedrons.

 処理部は、飛行経路の生成において、一つの前記側面において撮像位置を通過する飛行経路を生成し、側面と隣接する次の側面において撮像位置を通過する飛行経路を生成してよい。 In the generation of the flight path, the processing unit may generate a flight path that passes through the imaging position on one of the side surfaces, and a flight path that passes through the imaging position on the next side surface adjacent to the side surface.

 処理部は、更に、オブジェクトを下向きに撮像した撮像画像を取得し、概略形状の取得において、撮像画像を用いてオブジェクトの概略形状の3次元形状データを取得してよい。 The processing unit may further acquire a captured image obtained by capturing the object downward, and acquire the three-dimensional shape data of the approximate shape of the object using the captured image in acquiring the approximate shape.

 一態様において、被写体を撮像する飛行体と、飛行体の飛行経路を生成する処理部と、を有する飛行経路生成システムであって、処理部は、被写体に含まれるオブジェクトの概略形状を取得し、概略形状における側面を抽出し、側面に対応する撮像位置を設定し、撮像位置を通過する飛行経路を生成し、飛行体は、飛行経路を取得して設定する、飛行経路生成システムである。 In one aspect, a flight path generation system includes a flying object that images a subject and a processing unit that generates a flight path of the flying object; the processing unit acquires an approximate shape of an object included in the subject, extracts side surfaces of the approximate shape, sets imaging positions corresponding to the side surfaces, and generates a flight path that passes through the imaging positions, and the flying object acquires and sets the flight path.

 一態様において、プログラムは、被写体を撮像する飛行体の飛行経路を生成するコンピュータに、被写体に含まれるオブジェクトの概略形状を取得するステップと、概略形状における側面を抽出するステップと、側面に対応する撮像位置を設定するステップと、撮像位置を通過する飛行経路を生成するステップと、を実行させるためのプログラムである。 In one aspect, a program causes a computer that generates a flight path of a flying object imaging a subject to execute a step of acquiring an approximate shape of an object included in the subject, a step of extracting side surfaces of the approximate shape, a step of setting imaging positions corresponding to the side surfaces, and a step of generating a flight path that passes through the imaging positions.

 一態様において、記録媒体は、被写体を撮像する飛行体の飛行経路を生成するコンピュータに、被写体に含まれるオブジェクトの概略形状を取得するステップと、概略形状における側面を抽出するステップと、側面に対応する撮像位置を設定するステップと、撮像位置を通過する飛行経路を生成するステップと、を実行させるためのプログラムを記録したコンピュータ読み取り可能な記録媒体である。 In one aspect, a recording medium is a computer-readable recording medium on which is recorded a program that causes a computer generating a flight path of a flying object imaging a subject to execute a step of acquiring an approximate shape of an object included in the subject, a step of extracting side surfaces of the approximate shape, a step of setting imaging positions corresponding to the side surfaces, and a step of generating a flight path that passes through the imaging positions.

 なお、上記の発明の概要は、本開示の特徴の全てを列挙したものではない。また、これらの特徴群のサブコンビネーションもまた、発明となりうる。 Note that the above summary of the invention does not enumerate all the features of the present disclosure. In addition, a sub-combination of these feature groups can also be an invention.

実施形態における飛行経路生成システムの第1構成例を示す模式図である。 A schematic diagram showing a first configuration example of the flight path generation system in the embodiment.
無人飛行体の外観の一例を示す図である。 A diagram showing an example of the appearance of the unmanned aerial vehicle.
無人飛行体の具体的な外観の一例を示す図である。 A diagram showing an example of a specific appearance of the unmanned aerial vehicle.
図1の飛行経路生成システムを構成する無人飛行体のハードウェア構成の一例を示すブロック図である。 A block diagram showing an example of the hardware configuration of the unmanned aerial vehicle constituting the flight path generation system of FIG. 1.
携帯端末が装着された送信機の外観の一例を示す図である。 A diagram showing an example of the appearance of the transmitter to which the mobile terminal is attached.
図1の飛行経路生成システムを構成する送信機のハードウェア構成の一例を示すブロック図である。 A block diagram showing an example of the hardware configuration of the transmitter constituting the flight path generation system of FIG. 1.
図1の飛行経路生成システムを構成する携帯端末のハードウェア構成の一例を示すブロック図である。 A block diagram showing an example of the hardware configuration of the mobile terminal constituting the flight path generation system of FIG. 1.
実施形態における飛行経路生成方法の処理手順の一例を示すフローチャートである。 A flowchart showing an example of the processing procedure of the flight path generation method in the embodiment.
飛行範囲の入力例を説明するための図である。 A diagram for explaining an input example of the flight range.
飛行経路での概略撮像を説明するための図である。 A diagram for explaining rough imaging along a flight path.
飛行経路により得られた概略撮像に基づく概略形状の3次元形状データの生成を説明するための図である。 A diagram for explaining generation of three-dimensional shape data of the approximate shape based on rough imaging obtained along the flight path.
実施形態における3次元形状推定方法の処理手順の一例を示すフローチャートである。 A flowchart showing an example of the processing procedure of the three-dimensional shape estimation method in the embodiment.
実施形態におけるオブジェクトの概略形状を用いた飛行経路生成の第1動作例を説明するための図である。 A diagram for explaining a first operation example of flight path generation using the approximate shape of the object in the embodiment.
撮影平面における複数の撮像位置の設定を説明するための図である。 A diagram for explaining setting of a plurality of imaging positions on the shooting plane.
実施形態におけるオブジェクトの概略形状を用いた飛行経路生成の第1動作例の処理手順を示すフローチャートである。 A flowchart showing the processing procedure of the first operation example of flight path generation using the approximate shape of the object in the embodiment.
実施形態におけるオブジェクトの概略形状を用いた飛行経路生成の第2動作例を説明するための図である。 A diagram for explaining a second operation example of flight path generation using the approximate shape of the object in the embodiment.
実施形態におけるオブジェクトの概略形状を用いた飛行経路生成の第2動作例の処理手順を示すフローチャートである。 A flowchart showing the processing procedure of the second operation example of flight path generation using the approximate shape of the object in the embodiment.
実施形態におけるオブジェクトの概略形状を用いた飛行経路生成の第3動作例を説明するための図である。 A diagram for explaining a third operation example of flight path generation using the approximate shape of the object in the embodiment.
実施形態におけるオブジェクトの概略形状を用いた飛行経路生成の第3動作例の処理手順を示すフローチャートである。 A flowchart showing the processing procedure of the third operation example of flight path generation using the approximate shape of the object in the embodiment.
実施形態における飛行経路生成システムの第2構成例を示す模式図である。 A schematic diagram showing a second configuration example of the flight path generation system in the embodiment.
実施形態における飛行経路生成システムの第3構成例に係る無人飛行体のハードウェア構成の一例を示すブロック図である。 A block diagram showing an example of the hardware configuration of the unmanned aerial vehicle according to a third configuration example of the flight path generation system in the embodiment.
実施形態における飛行経路生成システムの第4構成例に係る送信機のハードウェア構成の一例を示すブロック図である。 A block diagram showing an example of the hardware configuration of the transmitter according to a fourth configuration example of the flight path generation system in the embodiment.

 以下、発明の実施の形態を通じて本開示を説明するが、以下の実施の形態は特許請求の範囲に係る発明を限定するものではない。実施の形態の中で説明されている特徴の組み合わせの全てが発明の解決手段に必須とは限らない。 Hereinafter, the present disclosure will be described through embodiments of the invention, but the following embodiments do not limit the invention according to the claims. Not all combinations of features described in the embodiments are essential for the solution of the invention.

 特許請求の範囲、明細書、図面、及び要約書には、著作権による保護の対象となる事項が含まれる。著作権者は、これらの書類の何人による複製に対しても、特許庁のファイル又はレコードに表示される通りであれば異議を唱えない。但し、それ以外の場合、一切の著作権を留保する。 The claims, the description, the drawings, and the abstract contain matters subject to copyright protection. The copyright owner does not object to reproduction of these documents by anyone, as long as they appear as shown in the Patent Office files or records; in all other cases, all copyrights are reserved.

 本開示に係る飛行経路生成システムは、移動体の一例としての飛行体と、飛行体の動作又は処理を遠隔で制御するためのプラットフォームとを含む構成である。 The flight path generation system according to the present disclosure includes a flying object as an example of a moving object and a platform for remotely controlling the operation or processing of the flying object.

 本開示に係る情報処理装置は、プラットフォームと、飛行体との少なくとも一方に含まれるコンピュータであって、飛行体の動作に係る各種処理を実行するものである。 The information processing apparatus according to the present disclosure is a computer included in at least one of the platform and the flying object, and executes various processes related to the operation of the flying object.

 飛行体は、空中を移動する航空機(例えばドローン、ヘリコプター)を含む。飛行体は、撮像装置を有する無人飛行体(UAV:Unmanned Aerial Vehicle)であってもよい。飛行体は、撮像範囲における被写体(例えば一定の範囲内の建物、道路、公園等の地面形状)を撮像するために、あらかじめ設定された飛行経路に沿って飛行し、飛行経路上に設定されている複数の撮像位置において被写体を撮像する。被写体は、例えば建物、道路等のオブジェクトが含まれる。 The flying object includes an aircraft that moves in the air (for example, a drone or a helicopter). The flying object may be an unmanned aerial vehicle (UAV: Unmanned Aerial Vehicle) having an imaging device. The flying object flies along a preset flight path in order to image a subject in an imaging range (for example, the ground shapes of buildings, roads, parks, and the like within a certain range), and images the subject at a plurality of imaging positions set on the flight path. The subject includes objects such as buildings and roads, for example.

 プラットフォームは、コンピュータであって、例えば飛行体の移動を含む各種処理の遠隔制御を指示するための送信機、或いは送信機又は飛行体と情報やデータの入出力が可能に接続された通信端末である。通信端末は、例えば携帯端末、PC(Personal Computer)などであってよい。なお、飛行体自体がプラットフォームとして含まれてよい。 The platform is a computer, for example a transmitter for instructing remote control of various processes including movement of the flying object, or a communication terminal connected to the transmitter or the flying object so as to be able to input and output information and data. The communication terminal may be, for example, a mobile terminal or a PC (Personal Computer). Note that the flying object itself may be included as a platform.

 本開示に係る飛行経路生成方法は、情報処理装置(プラットフォーム、飛行体)、又は飛行経路生成システムにおける各種の処理(ステップ)が規定されたものである。 The flight path generation method according to the present disclosure defines various processes (steps) in an information processing apparatus (platform, flying object) or flight path generation system.

 本開示に係るプログラムは、情報処理装置(プラットフォーム、飛行体)、又は飛行経路生成システムに各種の処理(ステップ)を実行させるためのプログラムである。 The program according to the present disclosure is a program for causing an information processing device (platform, flying object) or a flight path generation system to execute various processes (steps).

 本開示に係る記録媒体は、プログラム(つまり、情報処理装置(プラットフォーム、飛行体)、又は飛行経路生成システムに各種の処理(ステップ)を実行させるためのプログラム)が記録されたものである。 The recording medium according to the present disclosure stores a program (that is, a program for causing the information processing apparatus (platform, flying object) or the flight path generation system to execute various processes (steps)).

 以下の実施形態では、飛行体として、無人飛行体(UAV)を例示する。本明細書に添付する図面では、無人飛行体を「UAV」と表記する。本実施形態では、無人飛行体は、オブジェクトの側面を撮像可能な撮像位置を含む飛行経路を設定する。 In the following embodiment, an unmanned aerial vehicle (UAV) is exemplified as the flying object. In the drawings attached to this specification, an unmanned air vehicle is denoted as “UAV”. In the present embodiment, the unmanned air vehicle sets a flight path including an imaging position at which the side surface of the object can be imaged.

[飛行経路生成システム、第1構成例]
 図1は、実施形態における飛行経路生成システム10の第1構成例を示す模式図である。飛行経路生成システム10は、無人飛行体100、送信機50、及び携帯端末80を含む。無人飛行体100、送信機50、及び携帯端末80は、有線通信又は無線通信(例えば無線LAN(Local Area Network)、又はBluetooth(登録商標))を用いて、互いに通信することが可能である。送信機50は、例えば送信機50を使用する人物(以下、「ユーザ」という)の両手で把持された状態で使用される。
[Flight path generation system, first configuration example]
FIG. 1 is a schematic diagram illustrating a first configuration example of a flight path generation system 10 according to the embodiment. The flight path generation system 10 includes an unmanned air vehicle 100, a transmitter 50, and a portable terminal 80. The unmanned air vehicle 100, the transmitter 50, and the portable terminal 80 can communicate with each other using wired communication or wireless communication (for example, a wireless local area network (LAN) or Bluetooth (registered trademark)). The transmitter 50 is used in a state of being held by both hands of a person who uses the transmitter 50 (hereinafter referred to as “user”), for example.

 図2は、無人飛行体100の外観の一例を示す図である。図3は、無人飛行体100の具体的な外観の一例を示す図である。無人飛行体100が移動方向STV0に飛行する時の側面図が図2に示され、無人飛行体100が移動方向STV0に飛行する時の斜視図が図3に示されている。無人飛行体100は、撮像部の一例としての撮像装置220,230を備えて移動する移動体の一例である。移動体とは、無人飛行体100の他、空中を移動する他の航空機、地上を移動する車両、水上を移動する船舶等を含む概念である。ここで、図2及び図3に示すように、地面と平行であって移動方向STV0に沿う方向にロール軸(図2及び図3のx軸参照)が定義されたとする。この場合、地面と平行であってロール軸に垂直な方向にピッチ軸(図2及び図3のy軸参照)が定められ、更に、地面に垂直であってロール軸及びピッチ軸に垂直な方向にヨー軸(図2及び図3のz軸)が定められる。 FIG. 2 is a diagram showing an example of the appearance of the unmanned aerial vehicle 100. FIG. 3 is a diagram showing an example of a specific appearance of the unmanned aerial vehicle 100. A side view of the unmanned aerial vehicle 100 flying in the movement direction STV0 is shown in FIG. 2, and a perspective view of the unmanned aerial vehicle 100 flying in the movement direction STV0 is shown in FIG. 3. The unmanned aerial vehicle 100 is an example of a moving body that moves while being provided with the imaging devices 220 and 230 as an example of an imaging unit. The moving body is a concept that includes, in addition to the unmanned aerial vehicle 100, other aircraft that move in the air, vehicles that move on the ground, ships that move on the water, and the like. Here, as shown in FIGS. 2 and 3, it is assumed that a roll axis (see the x-axis in FIGS. 2 and 3) is defined in a direction parallel to the ground and along the movement direction STV0. In this case, a pitch axis (see the y-axis in FIGS. 2 and 3) is defined in a direction parallel to the ground and perpendicular to the roll axis, and a yaw axis (the z-axis in FIGS. 2 and 3) is defined in a direction perpendicular to the ground and perpendicular to both the roll axis and the pitch axis.

 無人飛行体100は、UAV本体102と、ジンバル200と、撮像装置220と、複数の撮像装置230とを含む構成である。無人飛行体100は、プラットフォームの一例としての送信機50から送信される遠隔制御の指示を基に移動する。無人飛行体100の移動は、飛行を意味し、少なくとも上昇、降下、左旋回、右旋回、左水平移動、右水平移動の飛行が含まれる。 The unmanned air vehicle 100 includes a UAV main body 102, a gimbal 200, an imaging device 220, and a plurality of imaging devices 230. The unmanned aerial vehicle 100 moves based on a remote control instruction transmitted from a transmitter 50 as an example of a platform. The movement of the unmanned air vehicle 100 means a flight, and includes at least ascending, descending, left turning, right turning, left horizontal movement, and right horizontal movement.

 UAV本体102は、複数の回転翼(プロペラ)を備える。UAV本体102は、複数の回転翼の回転を制御することにより無人飛行体100を移動させる。UAV本体102は、例えば4つの回転翼を用いて無人飛行体100を移動させる。回転翼の数は、4つに限定されない。また、無人飛行体100は、回転翼を有さない固定翼機でよい。 The UAV main body 102 includes a plurality of rotor blades (propellers). The UAV main body 102 moves the unmanned air vehicle 100 by controlling the rotation of a plurality of rotor blades. The UAV main body 102 moves the unmanned aerial vehicle 100 using, for example, four rotary wings. The number of rotor blades is not limited to four. The unmanned air vehicle 100 may be a fixed wing aircraft that does not have rotating wings.

 撮像装置220は、所望の撮像範囲に含まれる被写体(例えば、地上の建物)を撮像する撮像用のカメラである。なお被写体は、例えば建物等のオブジェクトとともに、無人飛行体100の空撮対象となる上空の様子、山や川等の景色が含まれてよい。 The imaging device 220 is an imaging camera that images a subject (for example, a building on the ground) included in a desired imaging range. The subject may include, together with objects such as buildings, the state of the sky that is the aerial imaging target of the unmanned aerial vehicle 100, and scenery such as mountains and rivers.

 複数の撮像装置230は、無人飛行体100の移動を制御するために無人飛行体100の周囲を撮像するセンシング用のカメラである。2つの撮像装置230が、無人飛行体100の機首である正面に設けられてよい。更に、他の2つの撮像装置230が、無人飛行体100の底面に設けられてよい。正面側の2つの撮像装置230はペアとなり、いわゆるステレオカメラとして機能してよい。底面側の2つの撮像装置230もペアとなり、ステレオカメラとして機能してよい。複数の撮像装置230により撮像された画像に基づいて、無人飛行体100の周囲の3次元空間データが生成されてよい。なお、無人飛行体100が備える撮像装置230の数は4つに限定されない。無人飛行体100は、少なくとも1つの撮像装置230を備えていればよい。無人飛行体100は、無人飛行体100の機首、機尾、側面、底面、及び天井面のそれぞれに少なくとも1つの撮像装置230を備えてよい。撮像装置230で設定できる画角は、撮像装置220で設定できる画角より広くてよい。撮像装置230は、単焦点レンズ又は魚眼レンズを有してよい。 The plurality of imaging devices 230 are sensing cameras that image the surroundings of the unmanned air vehicle 100 in order to control the movement of the unmanned air vehicle 100. Two imaging devices 230 may be provided on the front surface that is the nose of the unmanned air vehicle 100. Furthermore, the other two imaging devices 230 may be provided on the bottom surface of the unmanned air vehicle 100. The two imaging devices 230 on the front side may be paired and function as a so-called stereo camera. The two imaging devices 230 on the bottom side may also be paired and function as a stereo camera. Three-dimensional spatial data around the unmanned air vehicle 100 may be generated based on images captured by the plurality of imaging devices 230. The number of imaging devices 230 included in the unmanned air vehicle 100 is not limited to four. The unmanned air vehicle 100 only needs to include at least one imaging device 230. The unmanned air vehicle 100 may include at least one imaging device 230 on each of the nose, the tail, the side surface, the bottom surface, and the ceiling surface of the unmanned air vehicle 100. The angle of view that can be set by the imaging device 230 may be wider than the angle of view that can be set by the imaging device 220. The imaging device 230 may have a single focus lens or a fisheye lens.

[無人飛行体の構成例]
 次に、無人飛行体100の構成例について説明する。
[Configuration example of unmanned air vehicle]
Next, a configuration example of the unmanned air vehicle 100 will be described.

 図4は、図1の飛行経路生成システム10を構成する無人飛行体100のハードウェア構成の一例を示すブロック図である。無人飛行体100は、処理部110と、通信インタフェース150と、メモリ160と、ストレージ170と、バッテリ190と、ジンバル200と、回転翼機構210と、撮像装置220と、撮像装置230と、GPS受信機240と、慣性計測装置(IMU:Inertial Measurement Unit)250と、磁気コンパス260と、気圧高度計270と、超音波センサ280と、レーザ測定器290とを含む構成である。通信インタフェース150は、通信部の一例である。 FIG. 4 is a block diagram showing an example of the hardware configuration of the unmanned aerial vehicle 100 constituting the flight path generation system 10 of FIG. 1. The unmanned aerial vehicle 100 includes a processing unit 110, a communication interface 150, a memory 160, a storage 170, a battery 190, a gimbal 200, a rotary wing mechanism 210, an imaging device 220, an imaging device 230, a GPS receiver 240, an inertial measurement unit (IMU) 250, a magnetic compass 260, a barometric altimeter 270, an ultrasonic sensor 280, and a laser measuring instrument 290. The communication interface 150 is an example of a communication unit.

 処理部110は、プロセッサ、例えばCPU(Central Processing Unit)、MPU(Micro Processing Unit)又はDSP(Digital Signal Processor)を用いて構成される。処理部110は、無人飛行体100の各部の動作を統括して制御するための信号処理、他の各部との間のデータの入出力処理、データの演算処理及びデータの記憶処理を行う。 The processing unit 110 is configured using a processor such as a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor). The processing unit 110 performs signal processing for overall control of operations of each unit of the unmanned air vehicle 100, data input / output processing with other units, data calculation processing, and data storage processing.

 処理部110は、メモリ160に格納されたプログラムに従って無人飛行体100の飛行を制御する。処理部110は、通信インタフェース150を介して遠隔の送信機50から受信した命令に従って、無人飛行体100の移動(つまり、飛行)を制御する。メモリ160は、無人飛行体100から取り外し可能であってよい。 The processing unit 110 controls the flight of the unmanned air vehicle 100 according to the program stored in the memory 160. The processing unit 110 controls the movement (that is, the flight) of the unmanned air vehicle 100 according to a command received from the remote transmitter 50 via the communication interface 150. The memory 160 may be removable from the unmanned air vehicle 100.

 処理部110は、撮像装置220及び撮像装置230により撮像された被写体の画像データ(以下、「撮像画像」と称する場合がある)を取得する。 The processing unit 110 acquires image data of a subject imaged by the imaging device 220 and the imaging device 230 (hereinafter sometimes referred to as “captured image”).

 処理部110は、ジンバル200、回転翼機構210、撮像装置220、及び撮像装置230を制御する。処理部110は、撮像装置220の撮像方向又は画角を変更することによって、撮像装置220の撮像範囲を制御する。処理部110は、ジンバル200の回転機構を制御することで、ジンバル200に支持されている撮像装置220の撮像範囲を制御する。 The processing unit 110 controls the gimbal 200, the rotary blade mechanism 210, the imaging device 220, and the imaging device 230. The processing unit 110 controls the imaging range of the imaging device 220 by changing the imaging direction or angle of view of the imaging device 220. The processing unit 110 controls the imaging range of the imaging device 220 supported by the gimbal 200 by controlling the rotation mechanism of the gimbal 200.

 本明細書では、撮像範囲は、撮像装置220又は撮像装置230により撮像される地理的な範囲をいう。撮像範囲は、緯度、経度、及び高度で定義される。撮像範囲は、緯度、経度、及び高度で定義される3次元空間データにおける範囲でよい。撮像範囲は、撮像装置220又は撮像装置230の画角及び撮像方向、並びに無人飛行体100が存在する位置に基づいて特定される。撮像装置220及び撮像装置230の撮像方向は、撮像装置220及び撮像装置230の撮像レンズが設けられた正面が向く方位と俯角とから定義される。撮像装置220の撮像方向は、無人飛行体100の機首の方位と、ジンバル200に対する撮像装置220の姿勢の状態とから特定される方向である。撮像装置230の撮像方向は、無人飛行体100の機首の方位と、撮像装置230が設けられた位置とから特定される方向である。 In this specification, the imaging range refers to a geographical range captured by the imaging device 220 or the imaging device 230. The imaging range is defined by latitude, longitude, and altitude. The imaging range may be a range in three-dimensional spatial data defined by latitude, longitude, and altitude. The imaging range is specified based on the angle of view and imaging direction of the imaging device 220 or the imaging device 230, and the position where the unmanned air vehicle 100 is present. The imaging directions of the imaging device 220 and the imaging device 230 are defined from the azimuth and the depression angle in which the front surface where the imaging lenses of the imaging device 220 and the imaging device 230 are provided is directed. The imaging direction of the imaging device 220 is a direction specified from the nose direction of the unmanned air vehicle 100 and the posture state of the imaging device 220 with respect to the gimbal 200. The imaging direction of the imaging device 230 is a direction specified from the nose direction of the unmanned air vehicle 100 and the position where the imaging device 230 is provided.
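 As a rough worked example of how such a geographical imaging range follows from the angle of view and the position, the sketch below assumes flat terrain and a camera pointed straight down; the field-of-view values are illustrative only and not taken from the disclosure.

```python
import math

def nadir_footprint(altitude_m: float, hfov_deg: float, vfov_deg: float):
    """Ground footprint (width, height) of a downward-facing camera over flat terrain."""
    width = 2.0 * altitude_m * math.tan(math.radians(hfov_deg) / 2.0)
    height = 2.0 * altitude_m * math.tan(math.radians(vfov_deg) / 2.0)
    return width, height

# at 100 m altitude with an 84 x 62 degree field of view (illustrative values)
print(nadir_footprint(100.0, 84.0, 62.0))  # -> roughly 180 m x 120 m on the ground
```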

 処理部110は、回転翼機構210を制御することで、無人飛行体100の飛行を制御する。つまり、処理部110は、回転翼機構210を制御することにより、無人飛行体100の緯度、経度、及び高度を含む位置を制御する。処理部110は、無人飛行体100の飛行を制御することにより、撮像装置220及び撮像装置230の撮像範囲を制御してよい。処理部110は、撮像装置220が備えるズームレンズを制御することで、撮像装置220の画角を制御してよい。処理部110は、撮像装置220のデジタルズーム機能を利用して、デジタルズームにより、撮像装置220の画角を制御してよい。処理部110は、設定した飛行経路の途中に存在する撮像位置(後述するウェイポイント(Waypoint))において、撮像装置220又は撮像装置230により被写体を水平方向、既定角度の方向、又は鉛直方向に撮像させる。既定角度の方向は、情報処理装置(無人飛行体又はプラットフォーム)が被写体の3次元形状の推定を行う上で適した既定値の角度の方向である。 The processing unit 110 controls the flight of the unmanned aerial vehicle 100 by controlling the rotary wing mechanism 210. That is, by controlling the rotary wing mechanism 210, the processing unit 110 controls the position of the unmanned aerial vehicle 100 including its latitude, longitude, and altitude. The processing unit 110 may control the imaging ranges of the imaging device 220 and the imaging device 230 by controlling the flight of the unmanned aerial vehicle 100. The processing unit 110 may control the angle of view of the imaging device 220 by controlling a zoom lens included in the imaging device 220. The processing unit 110 may control the angle of view of the imaging device 220 by digital zoom, using the digital zoom function of the imaging device 220. The processing unit 110 causes the imaging device 220 or the imaging device 230 to image the subject in the horizontal direction, in the direction of a predetermined angle, or in the vertical direction at an imaging position (a waypoint, described later) existing along the set flight path. The direction of the predetermined angle is a direction at a predetermined angle suitable for the information processing apparatus (the unmanned aerial vehicle or the platform) to estimate the three-dimensional shape of the subject.

 撮像装置220が無人飛行体100に固定され、撮像装置220を動かせない場合、処理部110は、特定の日時に特定の位置に無人飛行体100を移動させることにより、所望の環境下で所望の撮像範囲を撮像装置220に撮像させることができる。あるいは撮像装置220がズーム機能を有さず、撮像装置220の画角を変更できない場合でも、処理部110は、特定された日時に、特定の位置に無人飛行体100を移動させることで、所望の環境下で所望の撮像範囲を撮像装置220に撮像させることができる。 When the imaging device 220 is fixed to the unmanned aerial vehicle 100 and cannot be moved, the processing unit 110 can cause the imaging device 220 to image a desired imaging range under a desired environment by moving the unmanned aerial vehicle 100 to a specific position at a specific date and time. Even when the imaging device 220 does not have a zoom function and its angle of view cannot be changed, the processing unit 110 can likewise cause the imaging device 220 to image a desired imaging range under a desired environment by moving the unmanned aerial vehicle 100 to the specified position at the specified date and time.

 通信インタフェース150は、送信機50と通信する。通信インタフェース150は、遠隔の送信機50から処理部110に対する各種の命令を受信する。 The communication interface 150 communicates with the transmitter 50. The communication interface 150 receives various commands for the processing unit 110 from the remote transmitter 50.

 メモリ160は、記憶部の一例である。メモリ160は、処理部110がジンバル200、回転翼機構210、撮像装置220、撮像装置230、GPS受信機240、慣性計測装置250、磁気コンパス260、気圧高度計270、超音波センサ280、及びレーザ測定器290を制御するのに必要なプログラム等を格納する。メモリ160は、撮像装置220,230により撮像された撮像画像を格納する。メモリ160は、コンピュータ読み取り可能な記録媒体でよく、SRAM(Static Random Access Memory)、DRAM(Dynamic Random Access Memory)、EPROM(Erasable Programmable Read Only Memory)、EEPROM(Electrically Erasable Programmable Read-Only Memory)、及びUSBメモリ等のフラッシュメモリの少なくとも1つを含んでよい。メモリ160は、UAV本体102の内部に設けられてよい。UAV本体102から取り外し可能に設けられてよい。 The memory 160 is an example of a storage unit. The memory 160 stores programs and the like necessary for the processing unit 110 to control the gimbal 200, the rotary wing mechanism 210, the imaging device 220, the imaging device 230, the GPS receiver 240, the inertial measurement unit 250, the magnetic compass 260, the barometric altimeter 270, the ultrasonic sensor 280, and the laser measuring instrument 290. The memory 160 stores captured images captured by the imaging devices 220 and 230. The memory 160 may be a computer-readable recording medium and may include at least one of SRAM (Static Random Access Memory), DRAM (Dynamic Random Access Memory), EPROM (Erasable Programmable Read Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), and flash memory such as a USB memory. The memory 160 may be provided inside the UAV main body 102, or may be provided so as to be removable from the UAV main body 102.

 ストレージ170は、記憶部の一例である。ストレージ170は、各種データ、情報を蓄積し、保持する。ストレージ170は、HDD(Hard Disk Drive)、SSD(Solid State Drive)、メモリカード、USBメモリ、等でよい。ストレージ170は、UAV本体102の内部に設けられてよい。ストレージ170は、UAV本体102から取り外し可能に設けられてよい。 The storage 170 is an example of a storage unit. The storage 170 accumulates and holds various data and information. The storage 170 may be an HDD (Hard Disk Drive), an SSD (Solid State Drive), a memory card, a USB memory, or the like. The storage 170 may be provided inside the UAV main body 102. The storage 170 may be provided so as to be removable from the UAV main body 102.

 バッテリ190は、無人飛行体100の各部の駆動源としての機能を有し、無人飛行体100の各部に必要な電源を供給する。 The battery 190 has a function as a drive source of each part of the unmanned air vehicle 100 and supplies necessary power to each part of the unmanned air vehicle 100.

 ジンバル200は、少なくとも1つの軸を中心に撮像装置220を回転可能に支持する。ジンバル200は、ヨー軸、ピッチ軸、及びロール軸を中心に撮像装置220を回転可能に支持してよい。ジンバル200は、ヨー軸、ピッチ軸、及びロール軸の少なくとも1つを中心に撮像装置220を回転させることで、撮像装置220の撮像方向を変更してよい。 The gimbal 200 supports the imaging device 220 to be rotatable about at least one axis. The gimbal 200 may support the imaging device 220 rotatably about the yaw axis, pitch axis, and roll axis. The gimbal 200 may change the imaging direction of the imaging device 220 by rotating the imaging device 220 about at least one of the yaw axis, the pitch axis, and the roll axis.

 回転翼機構210は、複数の回転翼と、複数の回転翼を回転させる複数の駆動モータとを有する。 The rotary blade mechanism 210 includes a plurality of rotary blades and a plurality of drive motors that rotate the plurality of rotary blades.

 撮像装置220は、所望の撮像範囲の被写体を撮像して撮像画像のデータを生成する。撮像装置220の撮像により得られた画像データは、撮像装置220が有するメモリ、又はメモリ160に格納される。 The imaging device 220 captures a subject within a desired imaging range and generates captured image data. Image data obtained by imaging by the imaging device 220 is stored in a memory included in the imaging device 220 or the memory 160.

 撮像装置230は、無人飛行体100の周囲を撮像して撮像画像のデータを生成する。撮像装置230の画像データは、メモリ160に格納される。 The imaging device 230 captures the surroundings of the unmanned air vehicle 100 and generates captured image data. Image data of the imaging device 230 is stored in the memory 160.

 GPS受信機240は、複数の航法衛星(つまり、GPS衛星)から発信された時刻及び各GPS衛星の位置(座標)を示す複数の信号を受信する。GPS受信機240は、受信された複数の信号に基づいて、GPS受信機240の位置(つまり、無人飛行体100の位置)を算出する。GPS受信機240は、無人飛行体100の位置情報を処理部110に出力する。なお、GPS受信機240の位置情報の算出は、GPS受信機240の代わりに処理部110により行われてよい。この場合、処理部110には、GPS受信機240が受信した複数の信号に含まれる時刻及び各GPS衛星の位置を示す情報が入力される。 The GPS receiver 240 receives a plurality of signals indicating times and positions (coordinates) of each GPS satellite transmitted from a plurality of navigation satellites (that is, GPS satellites). The GPS receiver 240 calculates the position of the GPS receiver 240 (that is, the position of the unmanned air vehicle 100) based on the received signals. The GPS receiver 240 outputs position information of the unmanned air vehicle 100 to the processing unit 110. The calculation of the position information of the GPS receiver 240 may be performed by the processing unit 110 instead of the GPS receiver 240. In this case, information indicating the time and the position of each GPS satellite included in the plurality of signals received by the GPS receiver 240 is input to the processing unit 110.

 慣性計測装置250は、無人飛行体100の姿勢を検出し、検出結果を処理部110に出力する。慣性計測装置250は、無人飛行体100の姿勢として、無人飛行体100の前後、左右、及び上下の3軸方向の加速度と、ピッチ軸、ロール軸、及びヨー軸の3軸方向の角速度とを検出する。 The inertial measurement unit 250 detects the attitude of the unmanned aerial vehicle 100 and outputs the detection result to the processing unit 110. As the attitude of the unmanned aerial vehicle 100, the inertial measurement unit 250 detects accelerations in the three axial directions of front-rear, left-right, and up-down of the unmanned aerial vehicle 100, and angular velocities about the three axes of the pitch axis, the roll axis, and the yaw axis.

 磁気コンパス260は、無人飛行体100の機首の方位を検出し、検出結果を処理部110に出力する。 The magnetic compass 260 detects the nose direction of the unmanned air vehicle 100 and outputs the detection result to the processing unit 110.

 気圧高度計270は、無人飛行体100が飛行する高度を検出し、検出結果を処理部110に出力する。 The barometric altimeter 270 detects the altitude at which the unmanned air vehicle 100 flies, and outputs the detection result to the processing unit 110.

 超音波センサ280は、超音波を照射し、地面や物体により反射された超音波を検出し、検出結果を処理部110に出力する。検出結果は、例えば無人飛行体100から地面までの距離(つまり、高度)を示してよい。検出結果は、例えば無人飛行体100から物体までの距離を示してよい。 The ultrasonic sensor 280 emits ultrasonic waves, detects ultrasonic waves reflected by the ground or an object, and outputs the detection results to the processing unit 110. The detection result may indicate a distance (that is, altitude) from the unmanned air vehicle 100 to the ground, for example. The detection result may indicate a distance from the unmanned air vehicle 100 to the object, for example.

 レーザ測定器290は、物体に向けてレーザ光を照射し、物体で反射された反射光を受光し、反射光により無人飛行体100と物体との間の距離を測距する。測距結果は、処理部110に入力される。レーザ光による距離の測定方式は、一例として、タイムオブフライト方式でよい。 Laser measuring device 290 irradiates the object with laser light, receives the reflected light reflected by the object, and measures the distance between unmanned air vehicle 100 and the object using the reflected light. The distance measurement result is input to the processing unit 110. As an example, the distance measurement method using laser light may be a time-of-flight method.
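 For reference, the time-of-flight relation mentioned here reduces to halving the product of the speed of light and the measured round-trip time; a short worked example with an illustrative round-trip time follows.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_s: float) -> float:
    """Distance to the object from the round-trip time of the reflected laser pulse."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0

print(tof_distance(200e-9))  # a 200 ns round trip corresponds to roughly 30 m
```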

 次に、無人飛行体100の処理部110の機能の一例について説明する。 Next, an example of the function of the processing unit 110 of the unmanned air vehicle 100 will be described.

 処理部110は、複数の撮像装置230により撮像された複数の画像を解析することで、無人飛行体100の周囲の環境を特定してよい。処理部110は、無人飛行体100の周囲の環境に基づいて、例えば障害物を回避して飛行を制御する。処理部110は、複数の撮像装置230により撮像された複数の画像に基づいて無人飛行体100の周囲の3次元空間データを生成し、3次元空間データに基づいて飛行を制御してよい。 The processing unit 110 may specify the environment around the unmanned air vehicle 100 by analyzing a plurality of images captured by the plurality of imaging devices 230. Based on the environment around the unmanned air vehicle 100, the processing unit 110 controls flight while avoiding obstacles, for example. The processing unit 110 may generate three-dimensional spatial data around the unmanned air vehicle 100 based on a plurality of images captured by the plurality of imaging devices 230, and control the flight based on the three-dimensional spatial data.
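 The disclosure does not specify how depth is recovered from the stereo pairs; a common approach for a calibrated, rectified pair uses the disparity-to-depth relation Z = f * B / d, sketched below with illustrative numbers.

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point from a rectified stereo pair: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

# 700 px focal length, 10 cm baseline, 14 px disparity -> 5 m away
print(stereo_depth(700.0, 0.1, 14.0))
```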

 処理部110は、現在の日時を示す日時情報を取得する。処理部110は、GPS受信機240から現在の日時を示す日時情報を取得してよい。処理部110は、無人飛行体100に搭載されたタイマ(不図示)から現在の日時を示す日時情報を取得してよい。 The processing unit 110 acquires date / time information indicating the current date / time. The processing unit 110 may acquire date / time information indicating the current date / time from the GPS receiver 240. The processing unit 110 may acquire date / time information indicating the current date / time from a timer (not shown) mounted on the unmanned air vehicle 100.

 処理部110は、無人飛行体100の位置を示す位置情報を取得する。処理部110は、GPS受信機240から、無人飛行体100が存在する緯度、経度及び高度を示す位置情報を取得してよい。処理部110は、GPS受信機240から無人飛行体100が存在する緯度及び経度を示す緯度経度情報、並びに気圧高度計270又は超音波センサ280から無人飛行体100が存在する高度を示す高度情報をそれぞれ位置情報として取得してよい。 The processing unit 110 acquires position information indicating the position of the unmanned aerial vehicle 100. The processing unit 110 may acquire, from the GPS receiver 240, position information indicating the latitude, longitude, and altitude at which the unmanned aerial vehicle 100 exists. The processing unit 110 may acquire, as position information, latitude and longitude information indicating the latitude and longitude at which the unmanned aerial vehicle 100 exists from the GPS receiver 240, and altitude information indicating the altitude at which the unmanned aerial vehicle 100 exists from the barometric altimeter 270 or the ultrasonic sensor 280.

 処理部110は、磁気コンパス260から無人飛行体100の向きを示す向き情報を取得してよい。向き情報は、例えば無人飛行体100の機首の向きに対応する方位を示してよい。 The processing unit 110 may acquire orientation information indicating the orientation of the unmanned air vehicle 100 from the magnetic compass 260. The orientation information may indicate an orientation corresponding to the nose orientation of the unmanned air vehicle 100, for example.

 処理部110は、撮像装置220が撮像すべき撮像範囲を撮像する時に無人飛行体100が存在すべき位置を示す位置情報を取得してよい。処理部110は、無人飛行体100が存在すべき位置を示す位置情報をメモリ160から取得してよい。処理部110は、無人飛行体100が存在すべき位置を示す位置情報を、通信インタフェース150を介して送信機50等の他の装置から取得してよい。処理部110は、3次元地図データベースを参照して、撮像すべき撮像範囲を撮像するために、無人飛行体100が存在可能な位置を特定して、その位置を無人飛行体100が存在すべき位置を示す位置情報として取得してよい。 The processing unit 110 may acquire position information indicating the position where the unmanned aerial vehicle 100 should be present when the imaging device 220 captures the imaging range to be imaged. The processing unit 110 may acquire the position information indicating the position where the unmanned aerial vehicle 100 should be present from the memory 160. The processing unit 110 may acquire the position information indicating the position where the unmanned aerial vehicle 100 should be present from another device such as the transmitter 50 via the communication interface 150. The processing unit 110 may refer to a three-dimensional map database, identify a position where the unmanned aerial vehicle 100 can be present in order to capture the imaging range to be imaged, and acquire that position as the position information indicating the position where the unmanned aerial vehicle 100 should be present.

 処理部110は、撮像装置220及び撮像装置230のそれぞれの撮像範囲を示す撮像情報を取得してよい。処理部110は、撮像範囲を特定するためのパラメータとして、撮像装置220及び撮像装置230の画角を示す画角情報を撮像装置220及び撮像装置230から取得してよい。処理部110は、撮像範囲を特定するためのパラメータとして、撮像装置220及び撮像装置230の撮像方向を示す情報を取得してよい。処理部110は、例えば撮像装置220の撮像方向を示す情報として、ジンバル200から撮像装置220の姿勢の状態を示す姿勢情報を取得してよい。撮像装置220の姿勢の状態を示す情報は、例えばジンバル200のピッチ軸及びヨー軸の基準回転角度からの回転角度により示してよい。処理部110は、例えば撮像装置220の撮像方向を示す情報として、無人飛行体100の向きを示す情報を取得してよい。処理部110は、撮像範囲を特定するためのパラメータとして、無人飛行体100が存在する位置を示す位置情報を取得してよい。処理部110は、撮像装置220及び撮像装置230の画角及び撮像方向、並びに無人飛行体100が存在する位置に基づいて、撮像装置220が撮像する地理的な範囲を示す撮像範囲を画定し、撮像範囲を示す撮像情報を生成することで、撮像情報を取得してよい。 The processing unit 110 may acquire imaging information indicating the imaging ranges of the imaging device 220 and the imaging device 230, respectively. The processing unit 110 may acquire angle-of-view information indicating the angle of view of the imaging device 220 and the imaging device 230 from the imaging device 220 and the imaging device 230 as parameters for specifying the imaging range. The processing unit 110 may acquire information indicating the imaging direction of the imaging device 220 and the imaging device 230 as a parameter for specifying the imaging range. The processing unit 110 may acquire posture information indicating the posture state of the imaging device 220 from the gimbal 200 as information indicating the imaging direction of the imaging device 220, for example. The information indicating the posture state of the imaging device 220 may be indicated by, for example, a rotation angle from the reference rotation angle of the pitch axis and yaw axis of the gimbal 200. The processing unit 110 may acquire information indicating the orientation of the unmanned air vehicle 100 as information indicating the imaging direction of the imaging device 220, for example. The processing unit 110 may acquire position information indicating a position where the unmanned air vehicle 100 exists as a parameter for specifying the imaging range. The processing unit 110 defines an imaging range indicating a geographical range captured by the imaging device 220 based on the angle of view and the imaging direction of the imaging device 220 and the imaging device 230 and the position where the unmanned flying object 100 exists. The imaging information may be acquired by generating imaging information indicating the imaging range.

 処理部110は、撮像装置220が撮像すべき撮像範囲を示す撮像情報を取得してよい。処理部110は、メモリ160から撮像装置220が撮像すべき撮像情報を取得してよい。処理部110は、通信インタフェース150を介して送信機50等の他の装置から撮像装置220が撮像すべき撮像情報を取得してよい。 The processing unit 110 may acquire imaging information indicating an imaging range to be imaged by the imaging device 220. The processing unit 110 may acquire imaging information to be imaged by the imaging device 220 from the memory 160. The processing unit 110 may acquire imaging information to be imaged by the imaging device 220 from another device such as the transmitter 50 via the communication interface 150.

 処理部110は、無人飛行体100の周囲に存在するオブジェクトの立体形状(3次元形状)を示す立体情報(3次元情報)を取得する。オブジェクトは、例えば、建物、道路、車、木等の風景の一部である。立体情報は、例えば、3次元空間データである。処理部110は、複数の撮像装置230から得られたそれぞれの画像から、無人飛行体100の周囲に存在するオブジェクトの立体形状を示す立体情報を生成することで、立体情報を取得してよい。処理部110は、メモリ160に格納された3次元地図データベースを参照することにより、無人飛行体100の周囲に存在するオブジェクトの立体形状を示す立体情報を取得してよい。処理部110は、ネットワーク上に存在するサーバが管理する3次元地図データベースを参照することで、無人飛行体100の周囲に存在するオブジェクトの立体形状に関する立体情報を取得してよい。 The processing unit 110 acquires three-dimensional information (three-dimensional information) indicating a three-dimensional shape (three-dimensional shape) of an object existing around the unmanned air vehicle 100. The object is a part of a landscape such as a building, a road, a car, and a tree. The three-dimensional information is, for example, three-dimensional space data. The processing unit 110 may acquire the three-dimensional information by generating the three-dimensional information indicating the three-dimensional shape of the object existing around the unmanned air vehicle 100 from the respective images obtained from the plurality of imaging devices 230. The processing unit 110 may acquire the three-dimensional information indicating the three-dimensional shape of the object existing around the unmanned air vehicle 100 by referring to the three-dimensional map database stored in the memory 160. The processing unit 110 may acquire three-dimensional information related to a three-dimensional shape of an object existing around the unmanned air vehicle 100 by referring to a three-dimensional map database managed by a server existing on the network.

 次に、送信機50及び携帯端末80の構成例について説明する。 Next, configuration examples of the transmitter 50 and the portable terminal 80 will be described.

 図5は、携帯端末80が装着された送信機50の外観の一例を示す図である。図5では、携帯端末80の一例として、スマートフォンが示されている。携帯端末80は、例えばスマートフォン、タブレット端末等でよい。送信機50に対する上下前後左右の方向は、図5に示す矢印の方向にそれぞれ従うとする。送信機50は、例えば送信機50を使用するユーザの両手で把持された状態で使用される。 FIG. 5 is a diagram illustrating an example of the appearance of the transmitter 50 to which the mobile terminal 80 is attached. In FIG. 5, a smartphone is shown as an example of the mobile terminal 80. The mobile terminal 80 may be a smartphone, a tablet terminal, or the like, for example. The up / down / front / rear / left / right directions with respect to the transmitter 50 are assumed to follow the directions of arrows shown in FIG. The transmitter 50 is used in a state of being held by both hands of a user who uses the transmitter 50, for example.

 送信機50は、例えば略正方形状の底面を有し、かつ高さが底面の一辺より短い略直方体(言い換えると、略箱形)の形状をした樹脂製の筐体50Bを有する。送信機50の筐体表面の略中央には、左制御棒53Lと右制御棒53Rとが突設して配置される。 The transmitter 50 includes, for example, a resin casing 50B having a substantially rectangular parallelepiped shape (in other words, a substantially box shape) having a substantially square bottom surface and a height shorter than one side of the bottom surface. A left control rod 53L and a right control rod 53R are provided in a projecting manner at approximately the center of the housing surface of the transmitter 50.

 左制御棒53L、右制御棒53Rは、それぞれユーザによる無人飛行体100の移動を遠隔で制御(例えば、無人飛行体100の前後移動、左右移動、上下移動、向き変更)するための操作において使用される。図5では、左制御棒53L及び右制御棒53Rは、ユーザの両手からそれぞれ外力が印加されていない初期状態の位置が示されている。左制御棒53L及び右制御棒53Rは、ユーザにより印加された外力が解放された後、自動的に所定位置(例えば図5に示す初期位置)に復帰する。 The left control rod 53L and the right control rod 53R are each used in operations by the user for remotely controlling the movement of the unmanned aerial vehicle 100 (for example, forward/backward movement, left/right movement, up/down movement, and direction change of the unmanned aerial vehicle 100). In FIG. 5, the left control rod 53L and the right control rod 53R are shown in their initial positions, where no external force is applied by either of the user's hands. The left control rod 53L and the right control rod 53R automatically return to a predetermined position (for example, the initial position shown in FIG. 5) after the external force applied by the user is released.

 左制御棒53Lの手前側(言い換えると、ユーザ側)には、送信機50の電源ボタンB1が配置される。電源ボタンB1がユーザにより一度押下されると、例えば送信機50に内蔵されるバッテリの容量の残量がバッテリ残量表示部L2において表示される。電源ボタンB1がユーザによりもう一度押下されると、例えば送信機50の電源がオンとなり、送信機50の各部(図6参照)に電源が供給されて使用可能となる。 The power button B1 of the transmitter 50 is disposed on the front side (in other words, the user side) of the left control rod 53L. When the power button B1 is pressed once by the user, for example, the remaining capacity of the battery built in the transmitter 50 is displayed in the remaining battery capacity display portion L2. When the power button B1 is pressed again by the user, for example, the power of the transmitter 50 is turned on, and power is supplied to each part (see FIG. 6) of the transmitter 50 so that it can be used.

 右制御棒53Rの手前側(言い換えると、ユーザ側)には、RTH(Return To Home)ボタンB2が配置される。RTHボタンB2がユーザにより押下されると、送信機50は、無人飛行体100に所定の位置に自動復帰させるための信号を送信する。これにより、送信機50は、無人飛行体100を所定の位置(例えば無人飛行体100が記憶している離陸位置)に自動的に帰還させることができる。RTHボタンB2は、例えば屋外での無人飛行体100による空撮中にユーザが無人飛行体100の機体を見失った場合、又は電波干渉や予期せぬトラブルに遭遇して操作不能になった場合等に利用可能である。 An RTH (Return To Home) button B2 is arranged on the near side (in other words, the user side) of the right control rod 53R. When the RTH button B2 is pressed by the user, the transmitter 50 transmits a signal for causing the unmanned aerial vehicle 100 to automatically return to a predetermined position. Thus, the transmitter 50 can automatically return the unmanned aerial vehicle 100 to a predetermined position (for example, a take-off position stored in the unmanned aerial vehicle 100). The RTH button B2 can be used, for example, when the user loses sight of the airframe of the unmanned aerial vehicle 100 during outdoor aerial imaging, or when the unmanned aerial vehicle 100 becomes inoperable due to radio interference or unexpected trouble.

 電源ボタンB1及びRTHボタンB2の手前側(言い換えると、ユーザ側)には、リモートステータス表示部L1及びバッテリ残量表示部L2が配置される。リモートステータス表示部L1は、例えばLED(Light Emission Diode)を用いて構成され、送信機50と無人飛行体100との無線の接続状態を表示する。バッテリ残量表示部L2は、例えばLEDを用いて構成され、送信機50に内蔵されたバッテリの容量の残量を表示する。 A remote status display unit L1 and a battery remaining amount display unit L2 are arranged on the front side (in other words, the user side) of the power button B1 and the RTH button B2. The remote status display unit L1 is configured by using, for example, an LED (Light Emission Diode), and displays a wireless connection state between the transmitter 50 and the unmanned air vehicle 100. The battery remaining amount display unit L2 is configured using, for example, an LED, and displays the remaining amount of battery capacity built in the transmitter 50.

 左制御棒53L及び右制御棒53Rより後側であって、かつ送信機50の筐体50Bの後方側面から、2つのアンテナAN1,AN2が突設して配置される。アンテナAN1,AN2は、ユーザの左制御棒53L及び右制御棒53Rの操作に基づき、無人飛行体100の移動を制御するための信号を無人飛行体100に送信する。アンテナAN1,AN2は、例えば2kmの送受信範囲をカバーできる。また、アンテナAN1,AN2は、送信機50と無線接続中の無人飛行体100が有する撮像装置220,230により撮像された撮像画像、又は無人飛行体100が取得した各種データが無人飛行体100から送信された場合に、これらの画像又は各種データを受信できる。 Two antennas AN1 and AN2 project from the rear side of the housing 50B of the transmitter 50 and rearward from the left control rod 53L and the right control rod 53R. The antennas AN1 and AN2 transmit a signal for controlling the movement of the unmanned air vehicle 100 to the unmanned air vehicle 100 based on the user's operation of the left control rod 53L and the right control rod 53R. The antennas AN1 and AN2 can cover a transmission / reception range of 2 km, for example. The antennas AN1 and AN2 receive from the unmanned aerial vehicle 100 captured images captured by the imaging devices 220 and 230 included in the unmanned aerial vehicle 100 wirelessly connected to the transmitter 50, or various data acquired by the unmanned aerial vehicle 100. When transmitted, these images or various data can be received.

 図5では、送信機50が表示部を備えていないが、表示部を備えてもよい。 In FIG. 5, the transmitter 50 does not include a display unit, but may include a display unit.

 携帯端末80は、ホルダHLDに載置されて取り付けられてよい。ホルダHLDは、送信機50に接合されて取り付けられてよい。これにより、携帯端末80がホルダHLDを介して送信機50に装着される。携帯端末80と送信機50とは、有線ケーブル(例えばUSBケーブル)を介して接続されてよい。携帯端末80と送信機50とは、無線通信(例えばBluetooth(登録商標))によって接続されてよい。携帯端末80が送信機50に装着されず、携帯端末80と送信機50がそれぞれ独立して設けられてもよい。 The portable terminal 80 may be mounted on the holder HLD. The holder HLD may be bonded and attached to the transmitter 50. Thereby, the portable terminal 80 is attached to the transmitter 50 via the holder HLD. The portable terminal 80 and the transmitter 50 may be connected via a wired cable (for example, a USB cable). The portable terminal 80 and the transmitter 50 may be connected by wireless communication (for example, Bluetooth (registered trademark)). The portable terminal 80 may not be attached to the transmitter 50, and the portable terminal 80 and the transmitter 50 may be provided independently.

[送信機の構成例]
 図6は、図1の飛行経路生成システム10を構成する送信機50のハードウェア構成の一例を示すブロック図である。送信機50は、左制御棒53Lと、右制御棒53Rと、処理部61と、無線通信部63と、インタフェース部65と、メモリ67と、バッテリ69と、電源ボタンB1と、RTHボタンB2と、操作部セットOPSと、リモートステータス表示部L1と、バッテリ残量表示部L2とを含む構成である。送信機50は、無人飛行体100を遠隔制御するための操作端末の一例である。
[Example of transmitter configuration]
FIG. 6 is a block diagram illustrating an example of a hardware configuration of the transmitter 50 configuring the flight path generation system 10 of FIG. The transmitter 50 includes a left control rod 53L, a right control rod 53R, a processing unit 61, a wireless communication unit 63, an interface unit 65, a memory 67, a battery 69, a power button B1, and an RTH button B2. The operation unit set OPS, the remote status display unit L1, and the battery remaining amount display unit L2 are included. The transmitter 50 is an example of an operation terminal for remotely controlling the unmanned air vehicle 100.

 処理部61は、プロセッサ(例えばCPU、MPU又はDSP)を用いて構成される。処理部61は、送信機50の各部の動作を統括して制御するための信号処理、他の各部との間のデータの入出力処理、データの演算処理及びデータの記憶処理を行う。 The processing unit 61 is configured using a processor (for example, a CPU, MPU, or DSP). The processing unit 61 performs signal processing for overall control of operations of each unit of the transmitter 50, data input / output processing with other units, data calculation processing, and data storage processing.

 処理部61は、無人飛行体100の撮像装置220が撮像した撮像画像のデータを、無線通信部63を介して取得してメモリ67に保存し、インタフェース部65を介して携帯端末80に出力してよい。言い換えると、処理部61は、無人飛行体100の撮像装置220により撮像された撮像画像を携帯端末80に表示させてよい。これにより、無人飛行体100の撮像装置220により撮像された撮像画像は、携帯端末80において表示可能となる。 The processing unit 61 may acquire, via the wireless communication unit 63, data of captured images captured by the imaging device 220 of the unmanned aerial vehicle 100, store the data in the memory 67, and output the data to the mobile terminal 80 via the interface unit 65. In other words, the processing unit 61 may cause the mobile terminal 80 to display the captured images captured by the imaging device 220 of the unmanned aerial vehicle 100. Thus, the captured images captured by the imaging device 220 of the unmanned aerial vehicle 100 can be displayed on the mobile terminal 80.

 処理部61は、ユーザの左制御棒53L及び右制御棒53Rの操作により、その操作により指定された無人飛行体100の移動を制御するための信号を生成してよい。処理部61は、この生成した信号を、無線通信部63及びアンテナAN1,AN2を介して、無人飛行体100に送信して無人飛行体100を遠隔制御してよい。これにより、送信機50は、無人飛行体100の移動を遠隔で制御できる。処理部61は、無線通信部63を介して外部サーバ等が蓄積する地図データベースの地図情報を取得してよい。 The processing unit 61 may generate a signal for controlling the movement of the unmanned aerial vehicle 100 designated by the operation by the user's operation of the left control rod 53L and the right control rod 53R. The processing unit 61 may transmit the generated signal to the unmanned air vehicle 100 via the wireless communication unit 63 and the antennas AN1 and AN2 to remotely control the unmanned air vehicle 100. Thereby, the transmitter 50 can control the movement of the unmanned air vehicle 100 remotely. The processing unit 61 may acquire map information of a map database accumulated by an external server or the like via the wireless communication unit 63.

 左制御棒53Lは、例えばユーザの左手により、無人飛行体100の移動を遠隔で制御するための操作に使用される。右制御棒53Rは、例えばユーザの右手により、無人飛行体100の移動を遠隔で制御するための操作に使用される。無人飛行体100の移動は、例えば前進する方向の移動、後進する方向の移動、左方向の移動、右方向の移動、上昇する方向の移動、下降する方向の移動、左方向に無人飛行体100を回転する移動、右方向に無人飛行体100を回転する移動のうちいずれか又はこれらの組み合わせであり、以下同様である。 The left control rod 53L is used for operations for remotely controlling the movement of the unmanned aerial vehicle 100, for example with the user's left hand. The right control rod 53R is used for operations for remotely controlling the movement of the unmanned aerial vehicle 100, for example with the user's right hand. The movement of the unmanned aerial vehicle 100 is, for example, any one of a movement in the forward direction, a movement in the backward direction, a movement to the left, a movement to the right, a movement in the ascending direction, a movement in the descending direction, a movement rotating the unmanned aerial vehicle 100 to the left, and a movement rotating the unmanned aerial vehicle 100 to the right, or a combination thereof; the same applies hereinafter.

 バッテリ69は、送信機50の各部の駆動源としての機能を有し、送信機50の各部に必要な電源を供給する。 The battery 69 has a function as a drive source for each part of the transmitter 50 and supplies necessary power to each part of the transmitter 50.

 電源ボタンB1が一度押下されると、処理部61は、送信機50に内蔵されるバッテリ69の容量の残量をバッテリ残量表示部L2に表示する。これにより、ユーザは、送信機50に内蔵されるバッテリの容量の残量を簡単に確認できる。処理部61は、バッテリ残量表示部L2に、無人飛行体100に内蔵されたバッテリの容量の残量を表示してよい。また、電源ボタンB1が二度押下されると、処理部61は、送信機50に内蔵されるバッテリ69に対し、送信機50内の各部への電源供給を指示する。これにより、ユーザは、送信機50の電源がオンとなり、送信機50の使用を簡単に開始できる。 When the power button B1 is pressed once, the processing section 61 displays the remaining capacity of the battery 69 built in the transmitter 50 on the remaining battery capacity display section L2. Thereby, the user can easily check the remaining capacity of the battery capacity built in the transmitter 50. The processing unit 61 may display the remaining amount of the capacity of the battery built in the unmanned air vehicle 100 in the battery remaining amount display unit L2. When the power button B1 is pressed twice, the processing unit 61 instructs the battery 69 built in the transmitter 50 to supply power to each unit in the transmitter 50. As a result, the user can turn on the transmitter 50 and easily start using the transmitter 50.

 RTHボタンB2が押下されると、処理部61は、無人飛行体100に所定の位置(例えば無人飛行体100の離陸位置)に自動復帰させるための信号を生成し、無線通信部63及びアンテナAN1,AN2を介して無人飛行体100に送信する。これにより、ユーザは、送信機50に対する簡単な操作により、無人飛行体100を所定の位置に自動で復帰(帰還)させることができる。 When the RTH button B2 is pressed, the processing unit 61 generates a signal for automatically returning the unmanned air vehicle 100 to a predetermined position (for example, the take-off position of the unmanned air vehicle 100), and the wireless communication unit 63 and the antenna AN1. , Transmitted to the unmanned air vehicle 100 via AN2. Thus, the user can automatically return (return) the unmanned air vehicle 100 to a predetermined position by a simple operation on the transmitter 50.

 操作部セットOPSは、複数の操作部(例えば操作部OP1,…,操作部OPn)(n:2以上の整数)を用いて構成される。操作部セットOPSは、図5に示す左制御棒53L、右制御棒53R、電源ボタンB1及びRTHボタンB2を除く他の操作部(例えば、送信機50による無人飛行体100の遠隔制御を支援するための各種の操作部)により構成される。ここでいう各種の操作部とは、例えば、無人飛行体100の撮像装置220を用いた静止画の撮像を指示するボタン、無人飛行体100の撮像装置220を用いた動画の録画の開始及び終了を指示するボタン、無人飛行体100のジンバル200(図4参照)のチルト方向の傾きを調整するダイヤル、無人飛行体100のフライトモードを切り替えるボタン、無人飛行体100の撮像装置220の設定を行うダイヤルが該当する。 The operation unit set OPS is configured using a plurality of operation units (for example, operation units OP1, ..., OPn) (n: an integer of 2 or more). The operation unit set OPS is composed of operation units other than the left control rod 53L, the right control rod 53R, the power button B1, and the RTH button B2 shown in FIG. 5 (for example, various operation units for supporting remote control of the unmanned aerial vehicle 100 by the transmitter 50). The various operation units referred to here correspond to, for example, a button for instructing capture of a still image using the imaging device 220 of the unmanned aerial vehicle 100, a button for instructing the start and end of video recording using the imaging device 220 of the unmanned aerial vehicle 100, a dial for adjusting the tilt of the gimbal 200 (see FIG. 4) of the unmanned aerial vehicle 100 in the tilt direction, a button for switching the flight mode of the unmanned aerial vehicle 100, and a dial for configuring settings of the imaging device 220 of the unmanned aerial vehicle 100.

 リモートステータス表示部L1及びバッテリ残量表示部L2は、図5を参照して説明したので、ここでは説明を省略する。 The remote status display unit L1 and the remaining battery level display unit L2 have been described with reference to FIG.

 無線通信部63は、2つのアンテナAN1,AN2と接続される。無線通信部63は、2つのアンテナAN1,AN2を介して、無人飛行体100との間で所定の無線通信方式(例えばWifi(登録商標))を用いた情報やデータの送受信を行う。 The wireless communication unit 63 is connected to two antennas AN1 and AN2. The wireless communication unit 63 transmits / receives information and data to / from the unmanned air vehicle 100 via the two antennas AN1 and AN2 using a predetermined wireless communication method (for example, WiFi (registered trademark)).

 インタフェース部65は、送信機50と携帯端末80との間の情報やデータの入出力を行う。インタフェース部65は、例えば送信機50に設けられたUSBポート(不図示)でよい。インタフェース部65は、USBポート以外のインタフェースでもよい。 The interface unit 65 inputs and outputs information and data between the transmitter 50 and the portable terminal 80. The interface unit 65 may be a USB port (not shown) provided in the transmitter 50, for example. The interface unit 65 may be an interface other than the USB port.

 メモリ67は、記憶部の一例である。メモリ67は、例えば処理部61の動作を規定するプログラムや設定値のデータが格納されたROM(Read Only Memory)と、処理部61の処理時に使用される各種の情報やデータを一時的に保存するRAM(Random Access Memory)とを有する。メモリ64のROMに格納されたプログラムや設定値のデータは、所定の記録媒体(例えばCD-ROM、DVD-ROM)にコピーされてよい。メモリ64のRAMには、例えば無人飛行体100の撮像装置220,230により撮像された撮像画像のデータを保存してよい。 The memory 67 is an example of a storage unit. The memory 67 temporarily stores, for example, a ROM (Read Only Memory) in which a program that defines the operation of the processing unit 61 and data of set values are stored, and various types of information and data that are used when the processing unit 61 performs processing. RAM (Random Access Memory). The program and setting value data stored in the ROM of the memory 64 may be copied to a predetermined recording medium (for example, CD-ROM, DVD-ROM). In the RAM of the memory 64, for example, data of captured images captured by the imaging devices 220 and 230 of the unmanned air vehicle 100 may be stored.

[Configuration example of the portable terminal]
 FIG. 7 is a block diagram showing an example of the hardware configuration of the portable terminal 80 constituting the flight path generation system 10 of FIG. 1. The portable terminal 80 may include a processing unit 81, an interface unit 82, an operation unit 83, a wireless communication unit 85, a memory 87, a display unit 88, a storage 89, and a battery 99. The portable terminal 80 functions as an example of an information processing device, and the processing unit 81 of the portable terminal 80 is an example of a processing unit of the information processing device.

The processing unit 81 is configured using a processor (for example, a CPU, MPU, or DSP). The processing unit 81 performs signal processing for overall control of the operations of each unit of the portable terminal 80, data input/output processing with the other units, data calculation processing, and data storage processing.

The processing unit 81 may acquire data and information from the unmanned air vehicle 100 via the wireless communication unit 85. The processing unit 81 may acquire data and information from the transmitter 50 or another device via the interface unit 82. The processing unit 81 may acquire data and information input via the operation unit 83. The processing unit 81 may acquire data and information held in the memory 87. The processing unit 81 may send data and information to the display unit 88 and cause the display unit 88 to display display information based on them. The processing unit 81 may send data and information to the storage 89 and store them there. The processing unit 81 may acquire data and information stored in the storage 89.

The processing unit 81 may execute an application for instructing control of the unmanned air vehicle 100. The processing unit 81 may generate various data used by the application.

The interface unit 82 inputs and outputs information and data between the transmitter 50 or another device and the portable terminal 80. The interface unit 82 may be, for example, a USB connector (not shown) provided in the portable terminal 80. The interface unit 82 may be an interface other than a USB connector.

The operation unit 83 receives data and information input by the operator of the portable terminal 80. The operation unit 83 may include buttons, keys, a touch panel, a microphone, and the like. Here, the case where the operation unit 83 and the display unit 88 are configured mainly by a touch panel is described as an example. In this case, the operation unit 83 can accept a touch operation, a tap operation, a drag operation, and the like.

The wireless communication unit 85 communicates with the unmanned air vehicle 100 by various wireless communication methods. The wireless communication methods may include, for example, wireless LAN, Bluetooth (registered trademark), short-range wireless communication, or communication via a public wireless line. The wireless communication unit 85 may also communicate with other devices to transmit and receive data and information.

The memory 87 is an example of a storage unit. The memory 87 may include, for example, a ROM that stores a program defining the operation of the portable terminal 80 and setting value data, and a RAM that temporarily stores various kinds of information and data used during processing by the processing unit 81. The memory 87 may include memories other than the ROM and RAM. The memory 87 may be provided inside the portable terminal 80. The memory 87 may be provided so as to be removable from the portable terminal 80. The program may include an application program.

The display unit 88 is configured using, for example, an LCD (Liquid Crystal Display) or an organic EL (ElectroLuminescence) display, and displays various kinds of information and data output from the processing unit 81. The display unit 88 may display data of captured images captured by the imaging devices 220 and 230 of the unmanned air vehicle 100.

The storage 89 is an example of a storage unit. The storage 89 accumulates and holds various data and information. The storage 89 may be a flash memory, an SSD (Solid State Drive), a memory card, a USB memory, or the like. The storage 89 may be provided so as to be removable from the main body of the portable terminal 80.

The battery 99 functions as a drive source for each unit of the portable terminal 80 and supplies the necessary power to each unit of the portable terminal 80.

Next, an example of the functions of the processing unit 81 of the portable terminal 80 will be described.

The processing unit 81, as an example of the processing unit of the information processing device, includes a flight path processing unit 811 that performs processing related to the generation of the flight path of the unmanned air vehicle 100. The processing unit 81 also includes a shape data processing unit 812 that performs processing related to the estimation and generation of three-dimensional shape data of a subject.

The flight path processing unit 811 generates the flight path of the unmanned air vehicle 100 that images the subject. The flight path processing unit 811 may acquire input parameters. The flight path processing unit 811 may acquire the input parameters entered on the transmitter 50 by receiving them via the interface unit 82 or the wireless communication unit 85. The flight path processing unit 811 may also acquire at least some of the information included in the input parameters from another device rather than from the transmitter 50. The flight path processing unit 811 may acquire at least some of the information included in the input parameters from a server or the like on a network. The acquired input parameters may be held in the memory 87. The processing unit 81 of the portable terminal 80 may refer to the memory 87 as appropriate (for example, when generating the flight path or when generating the three-dimensional shape data).

The input parameters may include information on the approximate shape of the object, information on the flight range, information on the flight altitude, information on the imaging distance, and information on the imaging position interval. The input parameters may include information on a set resolution. The set resolution indicates the resolution of the captured images captured by the imaging devices 220 and 230 of the unmanned air vehicle 100 (that is, a resolution for obtaining captured images suitable for estimating the three-dimensional shape of the subject with high accuracy), and may be held in the memory 160 of the unmanned air vehicle 100 or the memory 67 of the transmitter 50.
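For illustration only, the input parameters listed above could be collected in a simple container such as the sketch below; every field name is hypothetical and not defined in the embodiment.

```python
# Hypothetical container for the input parameters described in the text.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class FlightPathInputParameters:
    rough_object_shape: object                       # approximate 3D shape of the object (e.g. polygon mesh)
    flight_range: Tuple[float, float, float, float]  # bounding rectangle of the area to fly over
    flight_altitude: float                           # flight altitude [m]
    imaging_distance: float                          # distance L to the object surface when imaging [m]
    imaging_position_interval: float                 # spacing d between adjacent waypoints [m]
    set_resolution: Optional[float] = None           # desired image resolution [m/pixel], if specified
```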

In addition to the parameters described above, the input parameters may include information on the imaging positions (that is, waypoints) on the flight path of the unmanned air vehicle 100 and various parameters for generating a flight path that passes through the imaging positions. An imaging position is a position in three-dimensional space.

The input parameters may also include, for example, information on the overlap rate of the imaging ranges when the unmanned air vehicle 100 images the subject at the imaging positions. The input parameters may include information on the interval between imaging positions on the flight path. The imaging position interval is the interval (distance) between two adjacent imaging positions among the plurality of imaging positions (waypoints) arranged on the flight path. The input parameters may also include information on the angle of view of the imaging device 220 or 230 of the unmanned air vehicle 100.

The flight path processing unit 811 may also receive and acquire identification information of the subject. Based on the identified subject identification information, the flight path processing unit 811 may communicate with an external server via the interface unit 82 or the wireless communication unit 85 and receive and acquire information on the shape and size of the subject corresponding to that identification information.

The overlap rate of imaging ranges indicates the rate at which two imaging ranges overlap when images are captured by the imaging device 220 or 230 of the unmanned air vehicle 100 at imaging positions that are adjacent in the horizontal or vertical direction. The information on the overlap rate may include at least one of the overlap rate of the imaging ranges in the horizontal direction (also called the horizontal overlap rate) and the overlap rate of the imaging ranges in the vertical direction (also called the vertical overlap rate). The horizontal overlap rate and the vertical overlap rate may be the same or different. When they are different values, both the horizontal overlap rate information and the vertical overlap rate information may be included in the input parameters. When they are the same value, information on that single overlap rate may be included in the input parameters.

The imaging position interval is a spatial imaging interval and is the distance between adjacent imaging positions among the plurality of imaging positions at which the unmanned air vehicle 100 should capture images on the flight path. The imaging position interval may include at least one of the interval between imaging positions in the horizontal direction (also called the horizontal imaging interval) and the interval between imaging positions in the vertical direction (also called the vertical imaging interval). The flight path processing unit 811 may calculate and acquire the imaging position interval, including the horizontal imaging interval and the vertical imaging interval, or may acquire it from the input parameters.

That is, the flight path processing unit 811 may arrange, on the flight path, imaging positions (waypoints) at which the imaging device 220 or 230 captures images. The imaging positions may be arranged, for example, at equal intervals. The imaging positions are arranged so that the imaging ranges of the captured images at adjacent imaging positions partially overlap. This is to enable estimation of the three-dimensional shape using a plurality of captured images. Since the imaging device 220 or 230 has a predetermined angle of view, shortening the imaging position interval causes parts of the two imaging ranges to overlap.

The flight path processing unit 811 may calculate the imaging position interval based on, for example, the altitude at which the imaging positions are arranged (the imaging altitude) and the resolution of the imaging device 220 or 230. The higher the imaging altitude or the longer the imaging distance, the larger the overlap rate of the imaging ranges, so the imaging position interval can be made longer (sparser). The lower the imaging altitude or the shorter the imaging distance, the smaller the overlap rate of the imaging ranges, so the imaging position interval is made shorter (denser). The flight path processing unit 811 may further calculate the imaging position interval based on the angle of view of the imaging device 220 or 230. The flight path processing unit 811 may also calculate the imaging position interval by other known methods.

The flight path processing unit 811 may acquire information on the angle of view of the imaging device 220 or the imaging device 230 from the imaging device 220 or the imaging device 230. The angle of view of the imaging device 220 or 230 may be the same or different in the horizontal direction and the vertical direction. The angle of view of the imaging device 220 or 230 in the horizontal direction is also called the horizontal angle of view. The angle of view of the imaging device 220 or 230 in the vertical direction is also called the vertical angle of view. When the horizontal angle of view and the vertical angle of view are the same value, the flight path processing unit 811 may acquire information on that single angle of view.

The flight path processing unit 811 determines the imaging positions (waypoints) at which the unmanned air vehicle 100 images the subject, based on the flight range and the imaging position interval. The imaging positions of the unmanned air vehicle 100 may be arranged at equal intervals in the horizontal direction, and the distance between the last imaging position and the first imaging position may be shorter than the imaging position interval. This interval is the horizontal imaging interval. The imaging positions of the unmanned air vehicle 100 may likewise be arranged at equal intervals in the vertical direction, and the distance between the last imaging position and the first imaging position may be shorter than the imaging position interval. This interval is the vertical imaging interval.

The flight path processing unit 811 generates, for each imaging plane corresponding to a side surface of the object, a flight path that passes through the determined imaging positions. The flight path processing unit 811 may generate a flight path that passes in order through the adjacent imaging positions on the flight course of one imaging plane and, after passing through all the imaging positions on that flight course, enters the flight course of the next imaging plane. Similarly, on the flight course of the next imaging plane, the flight path processing unit 811 may generate a flight path that passes in order through the adjacent imaging positions and, after passing through all the imaging positions on that flight course, enters the flight course of the imaging plane after that. The flight path may be formed so that the altitude decreases as the path proceeds, starting from the sky side. Alternatively, the flight path may be formed so that the altitude increases as the path proceeds, starting from the ground side.
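A minimal sketch of the course-by-course traversal described above, assuming each flight course is one row of waypoints at a single altitude; the function name and data layout are hypothetical, not taken from the embodiment.

```python
# Order waypoints course by course, alternating direction so that adjacent
# courses connect end to end (the vehicle never flies back to the row start).
def order_waypoints_by_course(rows):
    """rows: list of courses, each a list of (x, y, z) waypoints at one altitude,
    listed from the top course down (or bottom up). Returns one ordered path."""
    path = []
    for i, row in enumerate(rows):
        path.extend(row if i % 2 == 0 else list(reversed(row)))
    return path

# Example: two courses of three waypoints each
courses = [[(0, 0, 30), (5, 0, 30), (10, 0, 30)],
           [(0, 0, 25), (5, 0, 25), (10, 0, 25)]]
print(order_waypoints_by_course(courses))
```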

The processing unit 110 of the unmanned air vehicle 100 may control the flight of the unmanned air vehicle 100 according to the generated flight path. The processing unit 110 may cause the imaging device 220 or 230 to image the subject at the imaging positions along the flight path. Accordingly, the imaging device 220 or 230 may capture the side surfaces of the subject at the imaging positions on the flight path. The captured images captured by the imaging device 220 or 230 may be held in the memory 160 or the storage 170 of the unmanned air vehicle 100, or in the memory 87 or the storage 89 of the portable terminal 80. The processing unit 110 may refer to the memory 160 as appropriate (for example, when setting the flight path).

The shape data processing unit 812 may generate stereoscopic information (three-dimensional information, three-dimensional shape data) indicating the stereoscopic shape (three-dimensional shape) of the object (subject), based on a plurality of captured images captured at different imaging positions by either of the imaging devices 220 and 230. A captured image may therefore be used as one of the images for reconstructing the three-dimensional shape data. The captured images used for reconstructing the three-dimensional shape data may be still images. A known method may be used to generate three-dimensional shape data from a plurality of captured images. Known methods include, for example, MVS (Multi View Stereo), PMVS (Patch-based MVS), and SfM (Structure from Motion).

The captured images used for generating the three-dimensional shape data may be still images. The plurality of captured images used for generating the three-dimensional shape data includes two captured images whose imaging ranges partially overlap each other. The higher this overlap (that is, the overlap rate of the imaging ranges), the larger the number of captured images used for generating the three-dimensional shape data of a given range. The shape data processing unit 812 can therefore improve the reconstruction accuracy of the three-dimensional shape. Conversely, the lower the overlap rate of the imaging ranges, the smaller the number of captured images used for generating the three-dimensional shape data of a given range. The shape data processing unit 812 can therefore shorten the generation time of the three-dimensional shape data. Note that the plurality of captured images need not include two captured images whose imaging ranges partially overlap each other.

The shape data processing unit 812 may acquire, as the plurality of captured images, images that include captured images of the side surfaces of the subject. As a result, compared with the case of acquiring only captured images taken uniformly from the sky in the vertical direction, the shape data processing unit 812 can collect many image features of the side surfaces of the subject and can improve the reconstruction accuracy of the three-dimensional shape around the subject.

Next, an operation example of the flight path generation system 10 will be described.

[Flight path generation]
 FIG. 8 is a flowchart showing an example of the processing procedure of the flight path generation method in the embodiment. The illustrated example shows processing in which aerial images of the target area are captured to acquire the approximate shape of the object, and a flight path for three-dimensional shape estimation is generated based on the acquired approximate shape. In this example, it is assumed that the processing unit 81 of the portable terminal 80 mainly executes the processing.

The flight path processing unit 811 of the processing unit 81 generates the flight path of the unmanned air vehicle 100 for imaging the object when imaging for estimating the three-dimensional shape of the object is performed. The flight path processing unit 811 inputs the flight range of the unmanned air vehicle 100 and designates the range of the imaging target area (S11).

The processing unit 110 of the unmanned air vehicle 100 receives the information on the designated flight range, flies over the corresponding flight range, and performs aerial imaging at predetermined imaging positions while looking down vertically on the objects in the imaging target area (S12). In this case, the processing unit 110 roughly images the objects at a small number of imaging positions (hereinafter sometimes referred to as "rough imaging"). The processing unit 110 of the unmanned air vehicle 100 acquires the overhead captured image at each imaging position and records the captured images in the memory 160.

The flight path processing unit 811 of the processing unit 81 acquires the captured images obtained by the rough imaging of the imaging target area from vertically above (downward aerial imaging) and stores them in the memory 87 or the storage 89. The flight path processing unit 811 acquires the approximate shape of the objects (buildings, the ground, and so on) by estimating it from the acquired group of captured images using a known three-dimensional shape reconstruction technique (S13). The three-dimensional shape data of the approximate shape may include, for example, polygon data.

Instead of acquiring the approximate shape of the object by aerial imaging of the target area, a three-dimensional map database held by another device such as the portable terminal 80 or a server may be used, and the three-dimensional shape data of the approximate shape may be acquired from the three-dimensional information (for example, polygon data) of buildings, roads, and the like included in the map information of the three-dimensional map database.

The flight path processing unit 811 uses the acquired approximate shape of the object to generate a flight path for detailed imaging for estimating the three-dimensional shape of the object (S14). Several examples of the procedure for generating a flight path using the approximate shape of the object are described later.

With the above operation example, a flight path for estimating the three-dimensional shape of the object can be generated, and detailed imaging of the object can be automated. In addition, the setting of an appropriate flight path for the object can be automated.

[Approximate shape acquisition]
 Next, an example of a method of acquiring the approximate shape of the object in the flight path generation processing will be described.

FIG. 9 is a diagram for explaining an input example of the flight range A1.

For example, the processing unit 81 of the portable terminal 80 inputs information on the flight range A1 through the operation unit 83. The operation unit 83 may accept, as the flight range A1, a user input of a desired range, indicated on the map information M1, for which generation of the three-dimensional shape data is desired. The information on the flight range A1 is not limited to such a desired range and may be a predetermined flight range. The predetermined flight range may be, for example, one of the ranges for periodically generating three-dimensional shape data and measuring the three-dimensional shape.

FIG. 10 is a diagram for explaining rough imaging along the flight path FPA.

In the processing unit 81, the flight path processing unit 811 may set the interval between the imaging positions CP (the imaging position interval) on the flight path FPA to an interval d11. The interval d11 is a sparse interval (for example, an interval of several tens of meters) at which the size of an object (for example, a building) can still be estimated. The interval d11 is set to an interval at which at least the imaging ranges at adjacent imaging positions CP partially overlap. Imaging at each imaging position CP at the interval d11 of the flight path FPA may be referred to as rough imaging. By imaging at sparse intervals, the unmanned air vehicle 100 can shorten the imaging time compared with imaging at dense intervals. In the vertical direction of the flight path along which the unmanned air vehicle 100 flies (the direction toward the ground, the direction of gravity), scenery including the building BL and the mountain MT may spread out. Accordingly, the building BL and the mountain MT are within the imaging range and become imaging targets. The approximate shape of the object can be acquired from the captured images obtained by this rough imaging.

FIG. 11 is a diagram for explaining the generation of three-dimensional shape data of the approximate shape based on the rough imaging performed along the flight path FPA.

In the processing unit 81, the shape data processing unit 812 generates the three-dimensional shape data SD1 of the approximate shape of the objects based on the plurality of captured images CI1 obtained at the imaging positions CP by the rough imaging along the flight path FPA. By checking the three-dimensional shape data SD1 on a display or the like, the user can grasp the approximate shape of the ground that lies in the vertical direction of the flight path FPA. By checking the shape (approximate shape) obtained from the three-dimensional shape data SD1 based on the rough imaging, the user can confirm that the mountain MT exists, but cannot confirm the existence of the building BL. This is because the outline of the mountain MT is gentle, so even when it is imaged from the sky along the flight path FPA, the captured images CI1 contain enough of the image information needed to generate the three-dimensional shape data SD1. In contrast, the outline of the building BL is roughly parallel to the vertical direction, and it is difficult to sufficiently image the side surfaces of the building BL at the imaging positions CP of the flight path FPA, along which the unmanned air vehicle 100 travels horizontally above the building BL. In other words, around the building BL, the information necessary for three-dimensional shape estimation cannot be acquired from captured images taken facing downward.

Therefore, the flight path processing unit 811 uses the data of the approximate shape of the object to generate and set a flight path and imaging positions so that the side surfaces of the object, which are parallel to the vertical direction, are imaged from the side, that is, facing in the horizontal direction (the direction normal to the vertical direction). The shape data processing unit 812 generates the three-dimensional shape data of the object using captured images that include images of the sides of the object captured along the generated flight path. This improves the estimation accuracy of the three-dimensional shape of the object.

[Three-dimensional shape estimation]
 FIG. 12 is a flowchart showing an example of the processing procedure of the three-dimensional shape estimation method in the embodiment. In this example, it is assumed that the processing unit 81 of the portable terminal 80, as an example of the processing unit of the information processing device, mainly executes the processing.

The flight path processing unit 811 of the processing unit 81 sets the generated flight path for the unmanned air vehicle 100 (S21). The processing unit 110 of the unmanned air vehicle 100 flies over the flight range of the imaging target area according to the set flight path and aerially images the object sideways at predetermined imaging positions (S22). In this case, the processing unit 110 images the object in detail (hereinafter sometimes referred to as "detailed imaging") with the imaging ranges partially overlapping at each predetermined imaging position interval. The processing unit 110 of the unmanned air vehicle 100 acquires the captured image at each imaging position and records the captured images in the memory 160.

The shape data processing unit 812 of the processing unit 81 acquires the captured images obtained by the detailed imaging of the imaging target area (sideways aerial imaging) and stores them in the memory 87 or the storage 89. The shape data processing unit 812 generates three-dimensional shape data by estimating the stereoscopic shape of the objects (buildings, the ground, and so on) from the acquired group of captured images using a known three-dimensional shape reconstruction technique (S23).

As a result, three-dimensional shape data including the shape of the side surfaces of the object can be generated using detailed captured images obtained by imaging the object from the side. Therefore, the detailed shape of the side surfaces, which is difficult to reconstruct from captured images taken facing downward, can be estimated, and the accuracy of the three-dimensional shape data of the object can be improved.

[First operation example of flight path generation]
 FIG. 13 is a diagram for explaining a first operation example of flight path generation using the approximate shape of the object in the embodiment. The first operation example is an example in which a polyhedron such as a cube surrounding the object is calculated and imaging planes facing the sides of the object are generated.

The flight path processing unit 811 of the processing unit 81 uses the acquired approximate shape to calculate a polyhedron that encloses the outer shape of the object. This polyhedron is a solid that touches the approximate shape of the object from the outside or is slightly larger than it. The illustrated example shows a case in which a cube 301 is calculated as an example of the polyhedron. The flight path processing unit 811 extracts at least one side surface 303 of the polyhedron, i.e., the cube 301. The side surface 303 may be a surface of the polyhedron that runs along the vertical direction, or a surface that stands within a predetermined angle range from the vertical direction. The flight path processing unit 811 calculates, for the extracted side surface 303, a normal 304 pointing outward from the polyhedron. The normal 304 can be calculated from the cross product of two vectors lying in the side surface 303 (for example, vectors connecting its vertices). Using the obtained normal 304, the flight path processing unit 811 calculates an imaging plane 305 that is parallel to the side surface 303 at a predetermined imaging distance. The imaging plane 305 is located at the predetermined imaging distance from the side surface 303 and is perpendicular to the normal 304. On the generated imaging plane 305, the flight path processing unit 811 sets a plurality of imaging positions (waypoints) 306 with a predetermined imaging position interval within the plane and determines an imaging path 307 that passes through the imaging positions 306, thereby generating a flight path including the imaging path 307. The imaging direction at each imaging position 306 is the direction opposite to the normal 304, facing the side surface of the object. When the polyhedron enclosing the object is a cube, a rectangular parallelepiped, or a columnar body, the imaging plane is a vertical plane and the imaging direction is a horizontal direction perpendicular to the imaging plane.
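As a rough illustration of this geometry (with assumed coordinate conventions, not the embodiment's own code), the outward normal of one side face can be taken from the cross product of two edge vectors, and the imaging plane obtained by offsetting the face outward by the imaging distance L:

```python
import numpy as np

def imaging_plane_from_face(face_vertices, L, centroid_of_solid):
    """face_vertices: (4, 3) array of one side face's corners (counter-clockwise order).
    Returns the offset plane's corners and the unit outward normal."""
    v = np.asarray(face_vertices, dtype=float)
    n = np.cross(v[1] - v[0], v[2] - v[0])            # normal from two edge vectors
    n /= np.linalg.norm(n)
    if np.dot(n, v[0] - np.asarray(centroid_of_solid)) < 0:
        n = -n                                         # make the normal point outward from the solid
    return v + L * n, n                                # plane parallel to the face, distance L away

# Example: a vertical face lying in the x-z plane, imaged from 15 m away
face = [[0, 0, 0], [10, 0, 0], [10, 0, 20], [0, 0, 20]]
plane, normal = imaging_plane_from_face(face, L=15.0, centroid_of_solid=[5, 5, 10])
```

The imaging direction at each waypoint on the returned plane would be the opposite of `normal`, i.e. looking back at the face.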

FIG. 14 is a diagram for explaining the setting of the plurality of imaging positions 306 on the imaging plane 305. The flight path processing unit 811 sets a predetermined imaging distance L in the normal direction from the side surface 303 of the polyhedron and calculates the imaging plane 305 parallel to the side surface 303 at a position separated from the side surface 303 by the imaging distance L. The flight path processing unit 811 sets a predetermined imaging position interval d on the imaging plane 305 and determines an imaging position 306 at every imaging position interval d. The imaging distance L and the imaging position interval d may be set, for example, by the following methods.

[1] The user specifies the imaging distance L [m] and the imaging position interval d [m]. The processing unit 81 of the portable terminal 80 inputs the information on the imaging distance L and the imaging position interval d through the operation unit 83 in accordance with the user's operation input and stores it in the memory 87. This makes it possible to set the imaging positions for detailed imaging based on the imaging distance and imaging position interval specified by the user.

[2] The user specifies the imaging distance L [m] and the overlap rate r_side [%] of the imaging ranges, and the imaging position interval d [m] is calculated from the imaging distance L and the overlap rate r_side. The imaging position interval d can be calculated by the following equation (1) using the imaging distance L, the overlap rate r_side, and the angle of view FOV (Field of View) of the imaging device.

[Equation (1)]

The processing unit 81 of the portable terminal 80 inputs the information on the imaging distance L and the overlap rate r_side through the operation unit 83 in accordance with the user's operation input and stores it in the memory 87. The processing unit 81 acquires information on the angle of view FOV of the imaging device 220 from the unmanned air vehicle 100 through the interface unit 82 or the wireless communication unit 85 and stores it in the memory 87. The processing unit 81 calculates the imaging position interval d by the above equation (1). This makes it possible to calculate the imaging position interval based on the imaging distance and the overlap rate of the imaging ranges specified by the user, and to set the imaging positions for detailed imaging.
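Equation (1) itself appears above only as an image, so its exact form is not reproduced here. The sketch below assumes a commonly used relation consistent with the description: the footprint of one image at distance L is about 2·L·tan(FOV/2), and adjacent images share the fraction r_side of that footprint.

```python
import math

def imaging_position_interval(L, r_side, fov_deg):
    """Assumed relation for d: image footprint at distance L times (1 - overlap)."""
    footprint = 2.0 * L * math.tan(math.radians(fov_deg) / 2.0)
    return footprint * (1.0 - r_side)

d = imaging_position_interval(L=15.0, r_side=0.8, fov_deg=60.0)   # about 3.5 m
```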

[3] The user specifies the resolution r [m/pixel] of the captured image and the overlap rate r_side [%] of the imaging ranges, and the imaging position interval d [m] is calculated from the resolution r and the overlap rate r_side. The imaging distance L [m] is then calculated from the imaging position interval d. The imaging position interval d can be calculated by the following equation (2) using the resolution r, the width w of the captured image, and the overlap rate r_side.

[Equation (2)]

The imaging distance L can be calculated by the following equation (3) using the imaging position interval d, the overlap rate r_side, and the angle of view FOV of the imaging device.

[Equation (3)]

The processing unit 81 of the portable terminal 80 inputs the information on the resolution r and the overlap rate r_side through the operation unit 83 in accordance with the user's operation input and stores it in the memory 87. The processing unit 81 calculates the imaging position interval d by the above equation (2). The processing unit 81 acquires information on the angle of view FOV of the imaging device 220 from the unmanned air vehicle 100 through the interface unit 82 or the wireless communication unit 85 and stores it in the memory 87. The processing unit 81 calculates the imaging distance L by the above equation (3). This makes it possible to calculate the imaging position interval based on the captured image resolution and the overlap rate of the imaging ranges specified by the user, and to set the imaging positions for detailed imaging.
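Equations (2) and (3) likewise appear only as images above. The following sketch assumes the usual ground-sampling-distance relations: one image spans roughly r·w meters across, and L is recovered from d by inverting the footprint expression assumed for equation (1).

```python
import math

def interval_from_resolution(r, w_pixels, r_side):
    """Assumed form of equation (2): d from resolution r [m/pixel], image width w [pixels], overlap r_side."""
    return r * w_pixels * (1.0 - r_side)

def imaging_distance_from_interval(d, r_side, fov_deg):
    """Assumed form of equation (3): L such that a footprint of 2*L*tan(FOV/2) gives interval d at overlap r_side."""
    return d / (2.0 * math.tan(math.radians(fov_deg) / 2.0) * (1.0 - r_side))

d = interval_from_resolution(r=0.01, w_pixels=4000, r_side=0.8)        # 8.0 m
L = imaging_distance_from_interval(d, r_side=0.8, fov_deg=60.0)        # about 34.6 m
```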

On the imaging plane 305, the flight path processing unit 811 arranges the plurality of imaging positions 306 at equal intervals of the set imaging position interval d and determines the imaging path 307 that passes through these imaging positions 306. The imaging positions at the ends of the imaging plane 305, such as the start-point and end-point imaging positions on the imaging plane 305, may be set within a range of d/2 or less from the side edges of the imaging plane 305.
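As a purely illustrative sketch (the exact placement rule is an assumption of this example, not a rule stated in the embodiment), positions can be spread along one axis of the imaging plane so that the spacing does not exceed d and the first and last positions stay within d/2 of the edges:

```python
import math

def spaced_positions(start, end, d):
    """1-D positions between start and end with spacing <= d and end margins <= d/2."""
    length = end - start
    n = max(1, math.ceil(length / d))      # number of intervals, so each one is at most d wide
    step = length / n
    return [start + step / 2 + i * step for i in range(n)]

print(spaced_positions(0.0, 50.0, 8.0))    # 7 positions, with roughly 3.6 m margin at each edge
```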

FIG. 15 is a flowchart showing the processing procedure of the first operation example of flight path generation using the approximate shape of the object in the embodiment. The flight path processing unit 811 of the processing unit 81 uses the acquired approximate shape to calculate a polyhedron (the cube 301) that encloses the outer shape of the object (S31). The flight path processing unit 811 extracts, in order, at least one side surface 303 of the polyhedron, i.e., the cube 301 (four side surfaces in the case of a cube) (S32). The flight path processing unit 811 calculates, for one extracted side surface 303, a normal 304 pointing outward from the polyhedron (S33). Using the obtained normal 304, the flight path processing unit 811 calculates the imaging plane 305 parallel to that side surface 303 at a position separated from it by the predetermined imaging distance L (S34).

On the one calculated imaging plane 305, the flight path processing unit 811 sets a plurality of imaging positions (waypoints) 306 with the predetermined imaging position interval d and generates an imaging path 307 for imaging from each of these imaging positions in the direction facing the object (S35). The flight path processing unit 811 determines whether the generation of the imaging paths 307 for all the extracted side surfaces 303 of the object has been completed (S36). If the imaging path generation for all the side surfaces 303 has not been completed, the flight path processing unit 811 returns to the processing of step S32, extracts the next side surface 303, and repeats the same processing up to the generation of the imaging path 307 (S32 to S35). If, in step S36, the imaging path generation for all the side surfaces 303 has been completed, the flight path processing unit 811 combines the imaging paths 307 of the respective imaging planes 305 and generates the flight path (S37).
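The loop of steps S31 to S37 can be illustrated, for the simple case of an axis-aligned bounding box, by the following sketch; the face layout, coordinate conventions, and grid placement are assumptions of this example.

```python
import numpy as np

def box_side_flight_path(xmin, xmax, ymin, ymax, zmin, zmax, L, d):
    """Waypoints (position, viewing direction toward the object) around the four
    vertical faces of an axis-aligned box, each face offset outward by L."""
    waypoints = []
    faces = [  # (outward normal, plane offset along that normal, in-plane horizontal axis)
        (np.array([0, -1, 0]), ymin, 'x'), (np.array([0, 1, 0]), ymax, 'x'),
        (np.array([-1, 0, 0]), xmin, 'y'), (np.array([1, 0, 0]), xmax, 'y'),
    ]
    zs = np.arange(zmin + d / 2, zmax, d)
    for n, offset, axis in faces:
        lo, hi = (xmin, xmax) if axis == 'x' else (ymin, ymax)
        for z in zs:
            for t in np.arange(lo + d / 2, hi, d):
                p = np.array([t, offset, z]) if axis == 'x' else np.array([offset, t, z])
                waypoints.append((p + L * n, -n))   # stand off by L, look back at the face
    return waypoints

path = box_side_flight_path(0, 20, 0, 10, 0, 15, L=12.0, d=5.0)
```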

The processing unit 110 of the unmanned air vehicle 100 communicates with the portable terminal 80 through the communication interface 150, acquires the information on the flight path generated by the flight path processing unit 811, and sets the flight path of the unmanned air vehicle 100. The processing unit 110 flies around the object according to the set flight path and images the object with the imaging devices 220 and 230 at each of the plurality of imaging positions (waypoints). Following the flight path in which the imaging paths 307 of the imaging planes 305 are combined, the processing unit 110 captures images at the imaging positions 306 of each imaging plane 305 in order. That is, when imaging of an object included in the subject is completed at each imaging position 306 on the imaging plane 305 corresponding to one side surface 303 of the polyhedron of the approximate shape (the cube 301), the processing unit 110 moves to the imaging plane corresponding to the next side surface, for example the imaging plane of the side surface adjacent to the current one, and performs imaging at each imaging position of that imaging plane. In this way, the unmanned air vehicle 100 acquires sideways captured images, taken facing the side surfaces of the object, at the imaging positions of all the imaging planes set in the flight path.

The processing unit 81 of the portable terminal 80 communicates with the unmanned air vehicle 100 through the interface unit 82 or the wireless communication unit 85 and acquires the captured images captured by the unmanned air vehicle 100. The shape data processing unit 812 of the processing unit 81 generates the three-dimensional shape data of the objects (buildings, the ground, and so on) using the acquired sideways captured images of the object, which makes it possible to estimate a detailed three-dimensional shape including the shape of the side surfaces of the object.

The captured images may include, in addition to the sideways captured images, downward captured images obtained by detailed imaging of the object vertically downward at the imaging position interval for detailed imaging. In this case, the flight path processing unit 811 also sets, for the top surface of the object, an imaging path including a plurality of imaging positions in the same manner as for the side surfaces, and generates the flight path.

With the above operation example, a polyhedron enclosing the approximate shape of the object is calculated, and a surface of the polyhedron that runs along the vertical direction, or that stands within a predetermined angle range from the vertical direction, is extracted as a side surface, so that side surfaces of the approximate shape that can be imaged facing the sides of the object can be extracted. Therefore, using the approximate shape of the object, imaging positions can be set at which detailed imaging of the object viewed from the side is possible.

[Second operation example of flight path generation]
 FIG. 16 is a diagram for explaining a second operation example of flight path generation using the approximate shape of the object in the embodiment. The second operation example is an example in which the mesh representing the approximate shape of the object is simplified and imaging planes facing the sides of the object are generated.

The flight path processing unit 811 of the processing unit 81 uses the acquired approximate shape to simplify the mesh representing the approximate shape of the object. A known method may be used as the mesh simplification technique. Known methods include, for example, the vertex clustering method and the incremental decimation method. In the mesh simplification processing, smoothing is performed that simplifies the polygon data, reduces complex shapes to simpler ones, and decreases the number of polygons representing one surface. In the illustrated example, the flight path processing unit 811 applies the simplification processing to the approximate shape 311 and calculates a simplified polyhedron 312. The flight path processing unit 811 extracts at least one side surface 313 of the simplified polyhedron 312 and calculates, for the extracted side surface 313, a normal 314 pointing outward from the polyhedron. In this case, the direction of the normal is determined for each plane of the polyhedron 312, and when the absolute value of the vertical component Nz of the normalized normal is smaller than 0.1 (|Nz| < 0.1), that plane is extracted as a side surface 313. The side surface 313 may be a surface of the polyhedron that runs along the vertical direction, or a surface that stands within a predetermined angle range from the vertical direction. Using the obtained normal 314, the flight path processing unit 811 calculates an imaging plane 315 that is parallel to the side surface 313 at the predetermined imaging distance L. On the calculated imaging plane 315, the flight path processing unit 811 sets a plurality of imaging positions (waypoints) 316 with the predetermined imaging position interval d, determines an imaging path 317 that passes through the imaging positions 316, and generates a flight path including the imaging path 317. In this case, the imaging plane is a plane standing within a predetermined range from the vertical direction, and the imaging direction is a direction facing sideways in a substantially horizontal direction, that is, a direction facing the side surface of the object.
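A small sketch of the side-surface test described above, assuming a triangle mesh: a face is treated as a side surface when the z component of its normalized normal satisfies |Nz| < 0.1, i.e. the face stands nearly vertical.

```python
import numpy as np

def is_side_face(v0, v1, v2, threshold=0.1):
    """True when the triangle (v0, v1, v2) stands nearly vertical."""
    n = np.cross(np.asarray(v1) - v0, np.asarray(v2) - v0)
    n = n / np.linalg.norm(n)               # normalized face normal
    return abs(n[2]) < threshold             # |Nz| < 0.1 -> candidate side surface

print(is_side_face([0, 0, 0], [1, 0, 0], [1, 0, 2]))   # True: the face lies in a vertical plane
print(is_side_face([0, 0, 0], [1, 0, 0], [0, 1, 0]))   # False: the face is horizontal
```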

FIG. 17 is a flowchart showing the processing procedure of the second operation example of flight path generation using the approximate shape of the object in the embodiment. The flight path processing unit 811 of the processing unit 81 uses the acquired approximate shape to simplify the mesh of the approximate shape of the object and calculates the polyhedron 312 obtained by simplifying the approximate shape 311 (S41). The flight path processing unit 811 extracts at least one side surface 313 of the polyhedron 312 (S42). The flight path processing unit 811 calculates, for one extracted side surface 313, a normal 314 pointing outward from the polyhedron (S43). Using the calculated normal 314, the flight path processing unit 811 calculates the imaging plane 315 parallel to that side surface 313 at a position separated from it by the predetermined imaging distance L (S44).

On the one generated imaging plane 315, the flight path processing unit 811 sets a plurality of imaging positions (waypoints) 316 with the predetermined imaging position interval d and generates an imaging path 317 for imaging from each of these imaging positions in the direction facing the object (S45). The flight path processing unit 811 determines whether the generation of the imaging paths 317 for all the extracted side surfaces 313 of the object has been completed (S46). If the imaging path generation for all the side surfaces 313 has not been completed, the flight path processing unit 811 returns to the processing of step S42, extracts the next side surface 313 adjacent to the current side surface, and repeats the same processing up to the generation of the imaging path 317 (S42 to S45). If, in step S46, the imaging path generation for all the side surfaces 313 has been completed, the flight path processing unit 811 combines the imaging paths 317 of the respective imaging planes 315 and generates the flight path (S47).

According to the above operation example, a polyhedron obtained by simplifying the approximate shape of the object is calculated, and a face of the polyhedron along the vertical direction, or a face standing within a predetermined angle range of the vertical direction, is extracted as a side surface. This makes it possible to extract side surfaces of the approximate shape that can be imaged facing the side of the object. Therefore, using the approximate shape of the object, imaging positions can be set that enable detailed imaging of the object as seen from the side.

[Third example of flight path generation]
FIG. 18 is a diagram for explaining a third operation example of flight path generation using the approximate shape of objects in the embodiment. In the third operation example, polyhedrons, such as a plurality of cubes surrounding the objects, are combined to generate an imaging plane facing the sides of the objects.

The flight path processing unit 811 of the processing unit 81 uses the acquired approximate shapes to calculate, for a plurality of objects such as buildings, a plurality of polyhedrons each surrounding the approximate shape of an object. The illustrated example shows a case in which cubic or rectangular-parallelepiped polyhedrons 321A, 321B, and 321C located close to one another are calculated as an example of the plurality of polyhedrons. The flight path processing unit 811 combines the plurality of polyhedrons 321A, 321B, and 321C and calculates a combined polyhedron 322. By combining polyhedrons located close to one another, a collision of the unmanned aerial vehicle 100 with an object during detailed lateral imaging is avoided. The flight path processing unit 811 extracts at least one side surface 323 from the combined polyhedron 322. The side surface 323 may be a face of the polyhedron along the vertical direction, or a face standing within a predetermined angle range of the vertical direction. For the extracted side surface 323, the flight path processing unit 811 calculates a normal 324 pointing outward from the polyhedron. Using the calculated normal 324, the flight path processing unit 811 calculates an imaging plane 325 parallel to the side surface 323 at a predetermined imaging distance L. Inside the calculated imaging plane 325, the flight path processing unit 811 sets a plurality of imaging positions (waypoints) 326 at a predetermined imaging position interval d, determines an imaging path 327 that passes through each imaging position 326, and generates a flight path including this imaging path 327. The imaging direction at each imaging position 326 is opposite to the direction of the normal 324, that is, the direction facing the side surface of the object. When the polyhedrons surrounding the objects are cubes, rectangular parallelepipeds, or columns, the imaging plane is a vertical plane and the imaging direction is a horizontal direction perpendicular to the imaging plane.
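
One way to realize the combination of nearby polyhedrons, assuming axis-aligned bounding boxes and a simple proximity test, is sketched below. The gap threshold and the replacement of overlapping boxes by their common enclosing box are assumptions for illustration; the description does not prescribe a particular merging rule.

```python
def merge_nearby_boxes(boxes, gap=0.0):
    # boxes: list of (min_xyz, max_xyz) tuples for axis-aligned bounding boxes.
    # Two boxes are merged when they overlap or are closer than `gap`,
    # by replacing them with their common enclosing box.
    def near(a, b):
        return all(a[0][k] - gap <= b[1][k] and b[0][k] - gap <= a[1][k]
                   for k in range(3))

    def union(a, b):
        return (tuple(min(a[0][k], b[0][k]) for k in range(3)),
                tuple(max(a[1][k], b[1][k]) for k in range(3)))

    merged = list(boxes)
    changed = True
    while changed:
        changed = False
        for i in range(len(merged)):
            for j in range(i + 1, len(merged)):
                if near(merged[i], merged[j]):
                    merged[i] = union(merged[i], merged[j])
                    del merged[j]
                    changed = True
                    break
            if changed:
                break
    return merged
```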

FIG. 19 is a flowchart showing the processing procedure of the third operation example of flight path generation using the approximate shape of the object in the embodiment. The flight path processing unit 811 of the processing unit 81 uses the acquired approximate shapes to calculate, for a plurality of objects, a plurality of polyhedrons 321A, 321B, and 321C each surrounding the outer shape of an object (S51). The flight path processing unit 811 combines the polyhedrons 321A, 321B, and 321C and calculates a combined polyhedron 322 (S52). The flight path processing unit 811 extracts at least one side surface 323 in turn from the combined polyhedron 322 (S32). The processes performed by the flight path processing unit 811, namely extraction of the side surface 323 of the polyhedron (S32), calculation of the normal 324 (S33), calculation of the imaging plane 325 (S34), determination of the imaging positions 326 and generation of the imaging path 327 (S35), and determination of whether all imaging paths have been generated (S36), are the same as in the first operation example shown in FIG. 15, and their description is omitted here.

If the imaging path generation for all side surfaces 323 is not complete in step S36, the flight path processing unit 811 returns to step S32, extracts the next side surface 323 adjacent to the current side surface, and repeats the same processing up to the generation of the imaging path 327 (S32 to S35). When the imaging path generation for all side surfaces 323 is complete, the flight path processing unit 811 joins the imaging paths 327 of the respective imaging planes 325 of the combined polyhedron 322 to generate a flight path (S57).

The second operation example and the third operation example described above may also be combined: the flight path may be generated by simplifying the approximate shape of the object and combining a plurality of polyhedrons, and then extracting the side surfaces, setting the imaging plane and the imaging positions, and determining the imaging path.

According to the above operation example, polyhedrons corresponding to the plural approximate shapes of the objects are calculated, a plurality of polyhedrons located close to one another are combined, and a face of the combined polyhedron along the vertical direction, or a face standing within a predetermined angle range of the vertical direction, is extracted as a side surface. This makes it possible to extract side surfaces of the approximate shapes that can be imaged facing the side of the objects. Therefore, using the approximate shapes of the objects, imaging positions can be set that enable detailed imaging of the objects as seen from the side. By combining a plurality of polyhedrons and extracting side surfaces, a collision of the aerial vehicle with an object during detailed lateral imaging can be avoided when objects are located close to one another.

In the above configuration example, the mobile terminal 80 functioning as an information processing apparatus can use the approximate shape of the object to set imaging positions that enable detailed imaging of the object as seen from the side, and can set a flight path passing through the imaging positions.

According to the present embodiment, imaging positions that enable detailed imaging of the object as seen from the side can be set using the approximate shape of the object. By extracting the side surfaces of the approximate shape, imaging positions for detailed lateral imaging corresponding to the side surfaces of the object can be set. By generating a flight path that passes through the set imaging positions, a flight path for detailed imaging including the sides of the object can be set. By setting, for each extracted side surface, imaging positions facing that side surface, detailed imaging of the object in the horizontal direction as seen from the side becomes possible.

According to the present embodiment, by setting a plurality of imaging positions at a predetermined imaging position interval for each extracted side surface, captured images with an appropriate resolution and overlap rate can be obtained. By determining an imaging path that passes through the set plurality of imaging positions and generating a flight path including the imaging path, a flight path for detailed imaging including the sides of the object can be set. By generating an imaging plane parallel to an extracted side surface at a predetermined imaging distance, the imaging positions facing the side surface of the object can be determined easily. By calculating the normal to the side surface, an imaging plane parallel to the side surface at the predetermined imaging distance can be generated easily. By using, as the predetermined imaging position interval, an interval at which part of the image captured at each imaging position overlaps the images captured at the other positions, imaging positions that yield captured images with an appropriate overlap rate can be set. By generating a flight path that passes through the imaging positions on one side surface and then generating a flight path that passes through the imaging positions on the next side surface adjacent to it, the aerial vehicle can fly over the side surfaces of the object in order and image them efficiently. By acquiring a captured image of the object taken facing downward and acquiring three-dimensional shape data of the approximate shape of the object from this captured image, the approximate shape of a desired object can be obtained.
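
As a worked example of choosing the imaging position interval d from a target overlap rate, the standard photogrammetric relation below can be used. The field-of-view parameter and the 70% overlap value are assumptions for illustration and are not taken from the description.

```python
import math

def imaging_interval(distance_l, fov_deg, overlap):
    # Footprint of one image on the facade at imaging distance L for the
    # given horizontal field of view; adjacent images must overlap by
    # `overlap` (e.g. 0.7 for 70%), so waypoints are spaced by the
    # remaining fraction of the footprint.
    footprint = 2.0 * distance_l * math.tan(math.radians(fov_deg) / 2.0)
    return footprint * (1.0 - overlap)

# Example: L = 10 m, 60 degree field of view, 70% overlap -> d is about 3.5 m.
d = imaging_interval(10.0, 60.0, 0.7)
```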

[Flight path generation system, second configuration example]
FIG. 20 is a schematic diagram showing a second configuration example of the flight path generation system 10A in the embodiment. The flight path generation system 10A includes an unmanned aerial vehicle 100 and a PC (Personal Computer) 70. The unmanned aerial vehicle 100 and the PC 70 can communicate with each other by wired communication or wireless communication (for example, wireless LAN or Bluetooth (registered trademark)). The PC 70 may be a computer such as a desktop PC, a notebook PC, or a tablet terminal. The PC 70 may also be a computer system consisting of a server and client terminals connected via a network. The PC 70 is an example of an information processing apparatus.

The PC 70 may include a processor (for example, a CPU, MPU, or DSP) as an example of a processing unit, a memory as an example of a storage unit, a communication interface, a display, an input device, and a storage. The PC 70, as an example of the information processing apparatus, has the same functions as the processing unit 81, the flight path processing unit 811, and the shape data processing unit 812 of the mobile terminal 80 shown in FIG. 7.

In the above configuration example, the PC 70 functioning as an information processing apparatus can use the approximate shape of the object to set imaging positions that enable detailed imaging of the object as seen from the side, and can set a flight path passing through the imaging positions.

[Flight path generation system, third configuration example]
FIG. 21 is a block diagram showing an example of the hardware configuration of an unmanned aerial vehicle 100A according to a third configuration example of the flight path generation system 10B in the embodiment. Compared with the unmanned aerial vehicle 100 shown in FIG. 4, the unmanned aerial vehicle 100A of the flight path generation system 10B includes a processing unit 110A instead of the processing unit 110. The unmanned aerial vehicle 100A functions as an example of an information processing apparatus, and the processing unit 110A of the unmanned aerial vehicle 100A is an example of a processing unit of the information processing apparatus. In the unmanned aerial vehicle 100A of FIG. 21, the same components as those of the unmanned aerial vehicle 100 of FIG. 4 are denoted by the same reference numerals, and their description is omitted or simplified.

The processing unit 110A, as an example of the processing unit of the information processing apparatus, includes a flight path processing unit 111 and a shape data processing unit 112. The flight path processing unit 111 has the same functions as the flight path processing unit 811 of the mobile terminal 80 shown in FIG. 7. The shape data processing unit 112 has the same functions as the shape data processing unit 812 of the mobile terminal 80 shown in FIG. 7.

In the above configuration example, the processing unit 110A of the unmanned aerial vehicle 100A functioning as an information processing apparatus can use the approximate shape of the object to set imaging positions that enable detailed imaging of the object as seen from the side, and can set a flight path passing through the imaging positions.

[Flight path generation system, fourth configuration example]
FIG. 22 is a block diagram showing an example of the hardware configuration of a transmitter 50A according to a fourth configuration example of the flight path generation system 10C in the embodiment. Compared with the transmitter 50 shown in FIG. 6, the transmitter 50A includes a processing unit 61A instead of the processing unit 61. The transmitter 50A functions as an example of an information processing apparatus, and the processing unit 61A of the transmitter 50A is an example of a processing unit of the information processing apparatus. In the transmitter 50A of FIG. 22, the same components as those of the transmitter 50 of FIG. 6 are denoted by the same reference numerals, and their description is omitted or simplified.

The processing unit 61A, as an example of the processing unit of the information processing apparatus, includes a flight path processing unit 611 and a shape data processing unit 612. The flight path processing unit 611 has the same functions as the flight path processing unit 811 of the mobile terminal 80 shown in FIG. 7. The shape data processing unit 612 has the same functions as the shape data processing unit 812 of the mobile terminal 80 shown in FIG. 7.

With the above configuration example, the processing unit 61A of the transmitter 50A functioning as an information processing apparatus can use the approximate shape of the object to set imaging positions that enable detailed imaging of the object as seen from the side, and can set a flight path passing through the imaging positions.

In the embodiment described above, the generated flight path is set in the aerial vehicle, and the captured images acquired by the aerial vehicle performing imaging of the objects, including detailed lateral imaging, while flying over the imaging target area along the flight path may be used to generate three-dimensional shape data of the objects existing in the imaging target area. The captured images acquired by detailed lateral imaging may be used for inspection of the side surfaces of the objects.

In the embodiment described above, an example was shown in which the information processing apparatus that executes the steps of the flight path generation method is included in the mobile terminal 80, the unmanned aerial vehicle 100A, or the transmitter 50A; however, the information processing apparatus may be included in another platform and execute the steps of the flight path generation method there.

Although the present disclosure has been described above using an embodiment, the technical scope of the invention according to the present disclosure is not limited to the scope described in the embodiment above. It will be apparent to those skilled in the art that various modifications and improvements can be made to the embodiment described above. It is also apparent from the description of the claims that embodiments to which such modifications or improvements are added can be included in the technical scope of the present invention.

The order of execution of processes such as operations, procedures, steps, and stages in the apparatuses, systems, programs, and methods shown in the claims, the description, and the drawings may be realized in any order, unless explicitly indicated by terms such as "before" or "prior to", and unless the output of a preceding process is used in a subsequent process. Even where the operation flows in the claims, the description, and the drawings are described using terms such as "first" and "next" for convenience, this does not mean that it is essential to carry them out in that order.

10, 10B, 10C  Flight path generation system
50, 50A  Transmitter
61, 61A  Processing unit
63  Wireless communication unit
65  Interface unit
67  Memory
70  PC
80  Mobile terminal
81  Processing unit
82  Interface unit
83  Operation unit
85  Wireless communication unit
87  Memory
88  Display unit
89  Storage
100, 100A  Unmanned aerial vehicle (UAV)
110, 110A  Processing unit
111  Flight path processing unit
112  Shape data processing unit
150  Communication interface
160  Memory
170  Storage
200  Gimbal
210  Rotary wing mechanism
220, 230  Imaging device
301  Cube
303, 313, 323  Side surface
304, 314, 324  Normal
305, 315, 325  Imaging plane
306, 316, 326  Imaging position (waypoint)
307, 317, 327  Imaging path
311  Approximate shape
312, 321A, 321B, 321C  Polyhedron
322  Combined polyhedron
611  Flight path processing unit
612  Shape data processing unit
811  Flight path processing unit
812  Shape data processing unit
OPS  Operation unit set

Claims (25)

A flight path generation method for generating a flight path of an aerial vehicle that images a subject, the method comprising:
 acquiring an approximate shape of an object included in the subject;
 extracting a side surface of the approximate shape;
 setting an imaging position corresponding to the side surface; and
 generating a flight path that passes through the imaging position.

The flight path generation method according to claim 1, wherein setting the imaging position includes setting, for each extracted side surface, an imaging position facing that side surface.

The flight path generation method according to claim 1 or 2, wherein setting the imaging position includes setting a plurality of imaging positions at a predetermined imaging position interval corresponding to the side surface.

The flight path generation method according to claim 3, wherein generating the flight path includes determining an imaging path that passes through the plurality of imaging positions and generating a flight path including the imaging path.
The flight path generation method according to claim 1, further comprising generating an imaging plane parallel to the side surface at a predetermined imaging distance, wherein setting the imaging position includes setting a plurality of imaging positions at a predetermined imaging position interval on the imaging plane.

The flight path generation method according to claim 3 or 5, wherein setting the imaging position uses, as the predetermined imaging position interval, an interval at which part of the image captured at each imaging position overlaps the images captured at the other imaging positions.

The flight path generation method according to claim 1, further comprising calculating a polyhedron surrounding the approximate shape of the object, wherein extracting the side surface includes extracting, as the side surface, a face of the polyhedron along the vertical direction or a face standing within a predetermined angle range of the vertical direction.

The flight path generation method according to claim 1, further comprising calculating a polyhedron obtained by simplifying the approximate shape of the object, wherein extracting the side surface includes extracting, as the side surface, a face of the polyhedron along the vertical direction or a face standing within a predetermined angle range of the vertical direction.
The flight path generation method according to claim 7 or 8, wherein calculating the polyhedron includes calculating polyhedrons corresponding to a plurality of approximate shapes of objects and combining a plurality of polyhedrons located close to one another.

The flight path generation method according to claim 1, wherein generating the flight path includes generating a flight path that passes through the imaging positions on one side surface and then generating a flight path that passes through the imaging positions on a next side surface adjacent to that side surface.

The flight path generation method according to claim 1, further comprising acquiring a captured image of the object taken facing downward, wherein acquiring the approximate shape includes acquiring three-dimensional shape data of the approximate shape of the object using the captured image.
An information processing apparatus for generating a flight path of an aerial vehicle that images a subject, the apparatus comprising:
 a processing unit that executes processing related to the flight path,
 wherein the processing unit acquires an approximate shape of an object included in the subject, extracts a side surface of the approximate shape, sets an imaging position corresponding to the side surface, and generates a flight path that passes through the imaging position.

The information processing apparatus according to claim 12, wherein, in setting the imaging position, the processing unit sets, for each extracted side surface, an imaging position facing that side surface.

The information processing apparatus according to claim 12 or 13, wherein, in setting the imaging position, the processing unit sets a plurality of imaging positions at a predetermined imaging position interval corresponding to the side surface.

The information processing apparatus according to claim 14, wherein, in generating the flight path, the processing unit determines an imaging path that passes through the plurality of imaging positions and generates a flight path including the imaging path.
The information processing apparatus according to claim 12, wherein the processing unit further generates an imaging plane parallel to the side surface at a predetermined imaging distance and, in setting the imaging position, sets a plurality of imaging positions at a predetermined imaging position interval on the imaging plane.

The information processing apparatus according to claim 14 or 16, wherein, in setting the imaging position, the processing unit uses, as the predetermined imaging position interval, an interval at which part of the image captured at each imaging position overlaps the images captured at the other imaging positions.

The information processing apparatus according to claim 12, wherein the processing unit further calculates a polyhedron surrounding the approximate shape of the object and, in extracting the side surface, extracts, as the side surface, a face of the polyhedron along the vertical direction or a face standing within a predetermined angle range of the vertical direction.

The information processing apparatus according to claim 12, wherein the processing unit further calculates a polyhedron obtained by simplifying the approximate shape of the object and, in extracting the side surface, extracts, as the side surface, a face of the polyhedron along the vertical direction or a face standing within a predetermined angle range of the vertical direction.
The information processing apparatus according to claim 18 or 19, wherein, in calculating the polyhedron, the processing unit calculates polyhedrons corresponding to a plurality of approximate shapes of objects and combines a plurality of polyhedrons located close to one another.

The information processing apparatus according to claim 12, wherein, in generating the flight path, the processing unit generates a flight path that passes through the imaging positions on one side surface and then generates a flight path that passes through the imaging positions on a next side surface adjacent to that side surface.

The information processing apparatus according to claim 12, wherein the processing unit further acquires a captured image of the object taken facing downward and, in acquiring the approximate shape, acquires three-dimensional shape data of the approximate shape of the object using the captured image.
A flight path generation system comprising an aerial vehicle that images a subject and a processing unit that generates a flight path of the aerial vehicle,
 wherein the processing unit acquires an approximate shape of an object included in the subject, extracts a side surface of the approximate shape, sets an imaging position corresponding to the side surface, and generates a flight path that passes through the imaging position, and
 the aerial vehicle acquires and sets the flight path.
A program for causing a computer that generates a flight path of an aerial vehicle imaging a subject to execute the steps of:
 acquiring an approximate shape of an object included in the subject;
 extracting a side surface of the approximate shape;
 setting an imaging position corresponding to the side surface; and
 generating a flight path that passes through the imaging position.

A computer-readable recording medium recording a program for causing a computer that generates a flight path of an aerial vehicle imaging a subject to execute the steps of:
 acquiring an approximate shape of an object included in the subject;
 extracting a side surface of the approximate shape;
 setting an imaging position corresponding to the side surface; and
 generating a flight path that passes through the imaging position.
PCT/JP2017/015876 2017-04-20 2017-04-20 Flight path generation method, information processing device, flight path generation system, program and recording medium Ceased WO2018193574A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2017/015876 WO2018193574A1 (en) 2017-04-20 2017-04-20 Flight path generation method, information processing device, flight path generation system, program and recording medium
JP2019513156A JP6765512B2 (en) 2017-04-20 2017-04-20 Flight path generation method, information processing device, flight path generation system, program and recording medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/015876 WO2018193574A1 (en) 2017-04-20 2017-04-20 Flight path generation method, information processing device, flight path generation system, program and recording medium

Publications (1)

Publication Number Publication Date
WO2018193574A1 true WO2018193574A1 (en) 2018-10-25

Family

ID=63856531

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/015876 Ceased WO2018193574A1 (en) 2017-04-20 2017-04-20 Flight path generation method, information processing device, flight path generation system, program and recording medium

Country Status (2)

Country Link
JP (1) JP6765512B2 (en)
WO (1) WO2018193574A1 (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014185947A (en) * 2013-03-25 2014-10-02 Geo Technical Laboratory Co Ltd Image photographing method for three-dimensional restoration
US20160253808A1 (en) * 2015-02-26 2016-09-01 Hexagon Technology Center Gmbh Determination of object data by template-based uav control

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10983535B2 (en) * 2016-08-05 2021-04-20 SZ DJI Technology Co., Ltd. System and method for positioning a movable object
US10962650B2 (en) 2017-10-31 2021-03-30 United States Of America As Represented By The Administrator Of Nasa Polyhedral geofences
US12337965B2 (en) 2019-10-28 2025-06-24 Skydio, Inc. Structure scan using unmanned aerial vehicle
JP7731350B2 (en) 2019-10-28 2025-08-29 スカイディオ,インコーポレイテッド Structural scanning using unmanned aerial vehicles
US12379731B2 (en) 2019-10-28 2025-08-05 Skydio, Inc. Roof scan using unmanned aerial vehicle
JP2022554248A (en) * 2019-10-28 2022-12-28 スカイディオ,インコーポレイテッド Structural scanning using unmanned air vehicles
CN111179413A (en) * 2019-12-19 2020-05-19 中建科技有限公司深圳分公司 Three-dimensional reconstruction method and device, terminal equipment and readable storage medium
CN111179413B (en) * 2019-12-19 2023-10-31 中建科技有限公司深圳分公司 Three-dimensional reconstruction method, device, terminal equipment and readable storage medium
JP7643447B2 (en) 2020-03-06 2025-03-11 ソニーグループ株式会社 Information processing method, information processing device, and program
JPWO2021177139A1 (en) * 2020-03-06 2021-09-10
WO2021177139A1 (en) * 2020-03-06 2021-09-10 ソニーグループ株式会社 Information processing method, information processing device, and program
US12292742B2 (en) 2020-03-06 2025-05-06 Sony Group Corporation Information processing method and information processor
JP2021182177A (en) * 2020-05-18 2021-11-25 防衛装備庁長官 Vehicle maneuvering system and vehicle maneuvering method
JP2021075263A (en) * 2020-06-02 2021-05-20 株式会社センシンロボティクス Aerial vehicle, inspection method and inspection system
JP2021075262A (en) * 2020-06-02 2021-05-20 株式会社センシンロボティクス Aerial vehicle, inspection method and inspection system
JP2021066423A (en) * 2020-07-30 2021-04-30 株式会社センシンロボティクス Aerial vehicle, inspection method and inspection system
JP7420047B2 (en) 2020-10-21 2024-01-23 トヨタ自動車株式会社 robot system
JP2022067744A (en) * 2020-10-21 2022-05-09 トヨタ自動車株式会社 Robot systems, robot system control methods, and programs
US20220118625A1 (en) * 2020-10-21 2022-04-21 Toyota Jidosha Kabushiki Kaisha Robot system, method for controlling robot system, and program
US12350846B2 (en) * 2020-10-21 2025-07-08 Toyota Jidosha Kabushiki Kaisha Robot system, method for controlling robot system, and program
JP2025157288A (en) * 2021-11-10 2025-10-15 スカイディオ,インコーポレイテッド Contour scanning using unmanned aerial vehicles
JP7760088B2 (en) 2021-11-10 2025-10-24 スカイディオ,インコーポレイテッド Contour scanning using unmanned aerial vehicles
JP2023083134A (en) * 2021-12-03 2023-06-15 ソフトバンク株式会社 Information processing system, information processing device, program, and information processing method
JP7094432B1 (en) 2021-12-03 2022-07-01 ソフトバンク株式会社 Information processing system, information processing device, program, and information processing method

Also Published As

Publication number Publication date
JPWO2018193574A1 (en) 2020-02-27
JP6765512B2 (en) 2020-10-07

Similar Documents

Publication Publication Date Title
JP6765512B2 (en) Flight path generation method, information processing device, flight path generation system, program and recording medium
JP6803919B2 (en) Flight path generation methods, flight path generation systems, flying objects, programs, and recording media
JP6878567B2 (en) 3D shape estimation methods, flying objects, mobile platforms, programs and recording media
US11513514B2 (en) Location processing device, flight vehicle, location processing system, flight system, location processing method, flight control method, program and recording medium
US20190318636A1 (en) Flight route display method, mobile platform, flight system, recording medium and program
CN111213107B (en) Information processing device, imaging control method, program, and recording medium
CN115220475A (en) System and method for UAV flight control
US11122209B2 (en) Three-dimensional shape estimation method, three-dimensional shape estimation system, flying object, program and recording medium
CN111344650B (en) Information processing device, flight path generation method, program, and recording medium
JP6788094B2 (en) Image display methods, image display systems, flying objects, programs, and recording media
JP6329219B2 (en) Operation terminal and moving body
WO2019061859A1 (en) Mobile platform, image capture path generation method, program, and recording medium
CN114586335A (en) Image processing apparatus, image processing method, program, and recording medium
WO2020119572A1 (en) Shape inferring device, shape inferring method, program, and recording medium
CN112313942A (en) Control device for image processing and frame body control
WO2018188086A1 (en) Unmanned aerial vehicle and control method therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17906712

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019513156

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17906712

Country of ref document: EP

Kind code of ref document: A1