WO2018167893A1 - Shape generation method, image acquisition method, mobile platform, aerial vehicle, program, and recording medium
- Publication number
- WO2018167893A1 (PCT/JP2017/010515; application JP2017010515W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- imaging
- imaging device
- information
- positions
- selecting
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
  - G06—COMPUTING OR CALCULATING; COUNTING
    - G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
      - G06T1/00—General purpose image data processing
      - G06T7/00—Image analysis
- H—ELECTRICITY
  - H04—ELECTRIC COMMUNICATION TECHNIQUE
    - H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
      - H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
        - H04N23/60—Control of cameras or camera modules
Definitions
- the present disclosure relates to a shape generation method, an image acquisition method, a mobile platform, a flying object, a program, and a recording medium that generate a shape of a subject based on an image captured by the flying object.
- A platform (for example, an unmanned aerial vehicle) that includes an imaging device such as a camera and performs imaging while flying along a preset fixed path is known (for example, see Patent Document 1).
- This platform receives commands such as a flight route and shooting instructions from a ground base, flies in accordance with the commands, performs shooting, and sends the acquired images to the ground base.
- While flying along the set fixed path, the platform tilts its imaging device based on the positional relationship between the platform and the imaging target.
- To generate a ground shape within a fixed imaging range based on captured images such as aerial photographs taken by an unmanned aerial vehicle (UAV) flying in the air, a technique for generating the flight path of the unmanned aerial vehicle in advance is used.
- The unmanned aerial vehicle is made to fly along the previously generated flight path, and it is necessary to acquire a plurality of images captured at different imaging positions on the flight path.
- However, an imaging device mounted on an unmanned aerial vehicle has a limited range of light intensity that can be captured with a single exposure (hereinafter referred to as the "dynamic range"), and if the brightness of the subject exceeds the upper limit of the dynamic range, overexposure occurs (and underexposure occurs where it falls below the lower limit).
- High dynamic range (HDR) synthesis is known as a means of avoiding the overexposure and underexposure caused by dynamic range limitations.
- In HDR, a plurality of images with different exposure settings are captured and synthesized to generate an image having a wide dynamic range (a high dynamic range image) with little overexposure or underexposure.
- However, HDR requires acquiring a plurality of images from the same imaging position, and is therefore not suited to shooting while moving, as in imaging by a flying object.
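- For background illustration only (not part of the disclosure): conventional HDR-style synthesis of the kind described above can be sketched with OpenCV's exposure fusion. The sketch assumes three already-aligned frames shot from a single position at different exposures, with hypothetical file names; the need for several frames per position is exactly what makes this approach ill-suited to a moving flying object.

```python
# A minimal sketch of conventional HDR-style synthesis using Mertens
# exposure fusion in OpenCV. Assumes three aligned frames captured from
# the SAME position at different exposures (hypothetical file names).
import cv2

under = cv2.imread("exposure_low.jpg")
middle = cv2.imread("exposure_mid.jpg")
over = cv2.imread("exposure_high.jpg")

# Mertens fusion blends the frames without needing exposure times and
# yields an image with little overexposure or underexposure.
fused = cv2.createMergeMertens().process([under, middle, over])
cv2.imwrite("fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))
```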
- A shape generation method includes a step of acquiring information on a plurality of imaging positions of a flying object having a plurality of imaging devices, a step of selecting, from the plurality of imaging devices, an imaging device to be used for imaging at each of the plurality of imaging positions, a step of imaging with the imaging device selected at each imaging position, a step of restoring the shape of the subject based on the captured images for each imaging device, and a step of synthesizing the shapes restored for each imaging device.
- The step of selecting an imaging device includes selecting at least one imaging device from the plurality of imaging devices based on the proportion of the imaging region occupied by a portion at or below a predetermined light amount at each imaging position.
- The step of selecting at least one imaging device may include selecting a first imaging device when the proportion is equal to or less than a first threshold, selecting a second imaging device, for which a higher exposure parameter than that of the first imaging device is set, when the proportion exceeds a second threshold greater than the first threshold, and selecting both the first imaging device and the second imaging device when the proportion exceeds the first threshold and is equal to or less than the second threshold.
- A shape generation method in a mobile platform includes a step of acquiring information on a plurality of imaging positions of a flying object having a plurality of imaging devices, and a step of selecting, from the plurality of imaging devices, an imaging device to be used for imaging at each of the plurality of imaging positions.
- The step of selecting at least one imaging device may include selecting a first imaging device when the proportion of the imaging region occupied by a portion at or below a predetermined light amount is equal to or less than a first threshold, selecting a second imaging device, for which a higher exposure parameter than that of the first imaging device is set, when the proportion exceeds a second threshold greater than the first threshold, and selecting both the first imaging device and the second imaging device when the proportion exceeds the first threshold and is equal to or less than the second threshold.
- the step of selecting the imaging device may select a plurality of imaging devices at at least one imaging position among the plurality of imaging positions.
- The step of selecting the imaging device may include a step of acquiring information on the irradiation angle of the light source, a step of acquiring information on obstructions of the light source at each imaging position, and a step of estimating, based on the information on the irradiation angle of the light source and the information on the obstructions at each imaging position, the proportion of the imaging region occupied by a portion at or below the predetermined light amount at that imaging position.
- The step of acquiring information on the irradiation angle of the light source may include a step of acquiring time information and geographic information of the imaging position, and a step of estimating the irradiation angle of the light source using the time information and the geographic information.
- An image acquisition method in a flying object for shape generation includes a step of acquiring information on a plurality of imaging positions of a flying object having a plurality of imaging devices, a step of acquiring information on the imaging device, selected from the plurality of imaging devices, to be used for imaging at each of the plurality of imaging positions, and a step of imaging at each of the plurality of imaging positions using the imaging device corresponding to the acquired information. The information on the imaging device used for imaging is information generated by selecting at least one imaging device from the plurality of imaging devices based on the proportion of the imaging region occupied by a portion at or below a predetermined light amount at each imaging position.
- the step of acquiring information related to the imaging device used for imaging may include a step of receiving information related to the imaging device used for imaging from the mobile platform for each of a plurality of imaging positions.
- The step of acquiring information on the imaging device used for imaging may include a step of measuring, for each imaging position, the light amount of the imaging region at that position at the time of imaging, a step of estimating, based on the measured light amount, the proportion of the imaging region occupied by a portion at or below the predetermined light amount, and a step of selecting at least one imaging device from the plurality of imaging devices based on the proportion and generating the information on the imaging device used for imaging.
- The step of acquiring information on the imaging device used for imaging may include a step of acquiring, for each imaging position, information on the irradiation angle of the light source, a step of acquiring information on obstructions of the light source at the imaging position, a step of estimating, based on the information on the irradiation angle of the light source and the information on the obstructions at the imaging position, the proportion of the imaging region occupied by a portion at or below the predetermined light amount at the imaging position, and a step of selecting at least one imaging device from the plurality of imaging devices based on the proportion and generating the information on the imaging device used for imaging.
- The step of acquiring information on the irradiation angle of the light source may include a step of acquiring time information and geographic information of the imaging position, and a step of estimating the irradiation angle of the light source using the time information and the geographic information.
- The information on the imaging device used for imaging may be generated by selecting a first imaging device when the proportion of the imaging region occupied by a portion at or below the predetermined light amount at each imaging position is equal to or less than a first threshold, selecting a second imaging device, for which a higher exposure parameter than that of the first imaging device is set, when the proportion exceeds a second threshold greater than the first threshold, and selecting both the first imaging device and the second imaging device when the proportion exceeds the first threshold and is equal to or less than the second threshold.
- the information regarding the imaging device used for imaging may be generated by selecting a plurality of imaging devices at at least one of the plurality of imaging positions.
- The step of imaging using the imaging device may include imaging, during movement of the flying object, at each of the plurality of imaging positions using the imaging device specified by the acquired information on the imaging device used for imaging.
- the method may further include a step of transmitting images captured at a plurality of imaging positions to an information processing apparatus that performs shape generation.
- The mobile platform includes a storage unit, a communication unit that communicates with the flying object, and a processing unit. The processing unit acquires information on a plurality of imaging positions of a flying object having a plurality of imaging devices, selects an imaging device to be used for imaging at each of the plurality of imaging positions, and causes the communication unit to transmit the information on the plurality of imaging positions and the information on the imaging device selected for each imaging position to the flying object. The selection of the imaging device is realized by selecting, for each imaging position, at least one imaging device from the plurality of imaging devices based on the proportion of the imaging region occupied by a portion at or below a predetermined light amount at that imaging position.
- The processing unit may select the first imaging device when the proportion is equal to or less than the first threshold, select the second imaging device, for which a higher exposure parameter than that of the first imaging device is set, when the proportion exceeds the second threshold greater than the first threshold, and select both the first imaging device and the second imaging device when the proportion exceeds the first threshold and is equal to or less than the second threshold.
- The processing unit may further acquire from the flying object the images captured by the imaging device selected at each imaging position, restore the shape of the subject based on the captured images for each imaging device, and synthesize the shapes restored for each imaging device.
- the processing unit may select a plurality of imaging devices at at least one of the plurality of imaging positions.
- The processing unit may acquire information on the irradiation angle of the light source, acquire information on obstructions of the light source at each imaging position, and estimate, based on the information on the irradiation angle of the light source and the information on the obstructions at each imaging position, the proportion of the imaging region occupied by a portion at or below the predetermined light amount.
- the processing unit may acquire the time information and the geographical information of the imaging position, and use the time information and the geographical information to estimate information on the irradiation angle of the light source.
- The flying object includes a storage unit, a communication unit that communicates with the mobile platform, a processing unit, and a plurality of imaging devices.
- The processing unit acquires information on a plurality of imaging positions of the flying object, acquires information on the imaging device, selected from the plurality of imaging devices, to be used for imaging at each of the plurality of imaging positions, and performs imaging at each of the plurality of imaging positions using the imaging device indicated by the acquired information.
- The information on the imaging device used for imaging is information generated by selecting at least one imaging device from the plurality of imaging devices based on the proportion of the imaging region occupied by a portion at or below a predetermined light amount at each imaging position.
- the processing unit may receive information regarding the imaging device used for imaging from the mobile platform for each of the plurality of imaging positions.
- The flying object may further include an optical sensor, and the processing unit may measure, for each imaging position, the light amount of the imaging region at that position at the time of imaging, estimate from the measured light amount the proportion of the imaging region occupied by a portion at or below the predetermined light amount, select at least one imaging device from the plurality of imaging devices based on the proportion, and generate the information on the imaging device used for imaging.
- The processing unit may acquire, for each imaging position, information on the irradiation angle of the light source, acquire information on obstructions of the light source at the imaging position, estimate, based on the information on the irradiation angle of the light source and the information on the obstructions at the imaging position, the proportion of the imaging region occupied by a portion at or below the predetermined light amount, select at least one imaging device from the plurality of imaging devices based on the proportion, and generate the information on the imaging device used for imaging.
- the processing unit may acquire the time information and the geographical information of the imaging position, and use the time information and the geographical information to estimate information on the irradiation angle of the light source.
- The information on the imaging device used for imaging may be generated by selecting a first imaging device when the proportion of the imaging region occupied by a portion at or below the predetermined light amount at each imaging position is equal to or less than a first threshold, selecting a second imaging device, for which a higher exposure parameter than that of the first imaging device is set, when the proportion exceeds a second threshold greater than the first threshold, and selecting both the first imaging device and the second imaging device when the proportion exceeds the first threshold and is equal to or less than the second threshold.
- the information regarding the imaging device used for imaging may be generated by selecting a plurality of imaging devices at at least one of the plurality of imaging positions.
- The processing unit may perform imaging at each of the plurality of imaging positions, using the imaging device specified by the acquired information on the imaging device used for imaging, while the flying object is moving.
- the processing unit may transmit images captured at a plurality of imaging positions to an information processing apparatus that performs shape generation.
- A program according to the present disclosure causes a computer in a flying object having a plurality of imaging devices to execute a step of acquiring information on a plurality of imaging positions of the flying object, a step of acquiring information on the imaging device, selected from the plurality of imaging devices, to be used for imaging at each of the plurality of imaging positions, and a step of imaging at each of the plurality of imaging positions using the imaging device corresponding to the acquired information; the information on the imaging device used for imaging is information generated by selecting at least one imaging device from the plurality of imaging devices based on the proportion of the imaging region occupied by a portion at or below a predetermined light amount at each imaging position.
- A recording medium according to the present disclosure is a storage medium storing such a program.
- In the shape generation method according to the present disclosure, various processes (steps) are defined in a shape generation system that includes an unmanned aerial vehicle (UAV) as an example of a moving object, a mobile platform for remotely controlling the operation or processing of the unmanned aerial vehicle, and a computer (PC) that performs the restoration and synthesis of shapes from images.
- The image acquisition method in a flying object for shape generation according to the present disclosure defines the various processes (steps) performed by the unmanned aerial vehicle within the shape generation method according to the present disclosure.
- The shape generation method in the mobile platform according to the present disclosure defines the various processes (steps) performed by the transmitter within the shape generation method according to the present disclosure.
- The flying object according to the present disclosure includes an aircraft (for example, a drone or a helicopter) that moves in the air.
- The flying object may be an unmanned flying object having a plurality of imaging devices, set in advance to image a subject within an imaging range (for example, ground features such as buildings, roads, and parks within a certain area).
- The flying object flies along a flight path and takes images at a plurality of imaging positions (waypoints, described later) set on the flight path.
- The mobile platform is a computer: for example, a transmitter for instructing remote control of various processes including the movement of the unmanned aerial vehicle, a terminal connected to the transmitter so as to be able to input and output information and data, or an information processing apparatus such as a PC or a tablet connected to the transmitter or to the unmanned aerial vehicle so that information and data can be input and output.
- The unmanned aerial vehicle itself may also be included as a mobile platform.
- the program according to the present disclosure is a program for causing an unmanned air vehicle or a mobile platform to execute various processes (steps).
- the recording medium records a program (that is, a program for causing an unmanned air vehicle or a mobile platform to execute various processes (steps)).
- the unmanned air vehicle 100 flies along a flight path set in advance within the imaging range.
- The flight path may be set by any conventional method, for example by an existing algorithm that produces a path covering the set range in the shortest distance, in the shortest time, or with the least power consumption.
- the flight path includes information on a plurality of imaging positions (that is, waypoints).
- the unmanned aerial vehicle 100 sequentially moves along the set route and images at the waypoint.
- Each waypoint is set with spacing such that the images captured at adjacent waypoints overlap.
- FIG. 1 is a diagram illustrating a configuration example of a shape generation system according to each embodiment.
- the shape generation system 10 illustrated in FIG. 1 includes at least an unmanned air vehicle 100 and a transmitter 50.
- The shape generation system 10 may further include an information processing device (for example, a PC or a tablet).
- the unmanned air vehicle 100 and the transmitter 50 can communicate information and data with each other by using wired communication or wireless communication (for example, wireless LAN (Local Area Network) or Bluetooth (registered trademark)).
- In FIG. 1, illustration of the state in which a terminal device is attached to the casing of the transmitter 50 is omitted.
- the transmitter 50 as an example of the operation terminal is used in a state of being held by both hands of a person using the transmitter 50 (hereinafter referred to as “user”).
- FIG. 2 is a diagram showing an example of the appearance of the unmanned air vehicle 100.
- FIG. 3 is a diagram illustrating an example of a specific external appearance of the unmanned air vehicle 100.
- a side view when the unmanned air vehicle 100 flies in the moving direction STV0 is shown in FIG. 2, and a perspective view when the unmanned air vehicle 100 flies in the moving direction STV0 is shown in FIG.
- the unmanned aerial vehicle 100 is an example of a moving body that includes two imaging devices 220-1 and 220-2 and moves.
- the moving body is a concept including, in addition to the unmanned air vehicle 100, other aircraft that moves in the air, vehicles that move on the ground, ships that move on the water, and the like.
- A roll axis (see the x-axis in FIGS. 2 and 3) is defined in a direction parallel to the ground and along the movement direction STV0.
- A pitch axis (see the y-axis in FIGS. 2 and 3) is defined parallel to the ground and perpendicular to the roll axis, and a yaw axis (the z-axis in FIGS. 2 and 3) is defined perpendicular to the ground and perpendicular to both the roll axis and the pitch axis.
- the unmanned aerial vehicle 100 includes a UAV main body 102, a gimbal 200, a plurality of imaging devices 220-1 and 220-2, and a plurality of obstacle sensors 230.
- the unmanned air vehicle 100 can move based on a remote control instruction transmitted from a transmitter 50 as an example of a mobile platform according to the present disclosure.
- the movement of the unmanned air vehicle 100 means a flight, and includes at least ascending, descending, left turning, right turning, left horizontal movement, and right horizontal movement.
- the UAV main body 102 includes a plurality of rotor blades.
- the UAV main body 102 moves the unmanned air vehicle 100 by controlling the rotation of a plurality of rotor blades.
- the UAV main body 102 moves the unmanned aerial vehicle 100 using, for example, four rotary wings.
- the number of rotor blades is not limited to four.
- the unmanned air vehicle 100 may be a fixed wing aircraft that does not have rotating wings.
- the imaging device 220-1 and the imaging device 220-2 are cameras that image subjects (for example, the above-described ground shapes such as buildings, roads, and parks) included in a desired imaging range.
- an example is shown in which two imaging devices 220-1 and 220-2 are attached to one gimbal 200, but in actuality, each may be attached to a different gimbal 200 and controlled separately.
- the number of imaging devices is not limited to two, and more imaging devices may be provided.
- the plurality of obstacle sensors 230 can detect obstacles around the unmanned air vehicle.
- FIG. 4 is a block diagram showing an example of the hardware configuration of the unmanned air vehicle 100 constituting the shape generation system 10 of FIG.
- The unmanned air vehicle 100 includes a UAV processing unit 110, a communication interface 150, a memory 160, a gimbal 200, a rotary wing mechanism 210, an imaging device 220-1, an imaging device 220-2, an obstacle sensor 230, a GPS receiver 240, a battery 250, an optical sensor 260, and a timer 270.
- the UAV processing unit 110 is configured using, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor).
- the UAV processing unit 110 performs signal processing for overall control of operations of each unit of the unmanned air vehicle 100, data input / output processing with other units, data calculation processing, and data storage processing.
- the UAV processing unit 110 controls the flight of the unmanned air vehicle 100 in accordance with a program stored in the memory 160.
- the UAV processing unit 110 controls the movement (that is, flight) of the unmanned air vehicle 100 according to a command received from the remote transmitter 50 via the communication interface 150.
- the memory 160 may be removable from the unmanned air vehicle 100.
- the communication interface 150 communicates with the transmitter 50 (see FIG. 4).
- the communication interface 150 receives various commands for the UAV processing unit 110 from the remote transmitter 50.
- The memory 160 stores programs and the like necessary for the UAV processing unit 110 to control the gimbal 200, the rotary wing mechanism 210, the imaging device 220-1, the imaging device 220-2, the obstacle sensor 230, the GPS receiver 240, the battery 250, the optical sensor 260, and the timer 270.
- The memory 160 may be a computer-readable recording medium and may include at least one of SRAM (Static Random Access Memory), DRAM (Dynamic Random Access Memory), EPROM (Erasable Programmable Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), and flash memory such as a USB memory.
- The memory 160 may be provided inside the UAV main body 102, or may be provided so as to be removable from the UAV main body 102.
- the gimbal 200 supports the imaging device 220-1 and the imaging device 220-2 so as to be rotatable about at least one axis.
- the gimbal 200 may support the imaging device 220 rotatably about the yaw axis, pitch axis, and roll axis.
- the gimbal 200 may change the imaging direction of the imaging device 220 by rotating the imaging device 220-1 and the imaging device 220-2 around at least one of the yaw axis, the pitch axis, and the roll axis. Further, as described above, a configuration may be adopted in which one gimbal is provided for each of the imaging device 220-1 and the imaging device 220-2.
- the rotary blade mechanism 210 includes a plurality of rotary blades and a plurality of drive motors that rotate the plurality of rotary blades.
- the imaging device 220-1 and the imaging device 220-2 capture a subject within a desired imaging range and generate captured image data.
- the two imaging devices 220-1 and 220-2 preferably have the same angle of view and are set with different exposure parameters.
- Image data obtained by imaging with the imaging device 220-1 and the imaging device 220-2 is stored in a memory built into each imaging device or in the memory 160.
- The image data is stored separately for each imaging device.
- the obstacle sensor 230 is, for example, an infrared sensor, an imaging device, an ultrasonic sensor, and the like, detects an obstacle around the unmanned air vehicle 100, and outputs the information to the UAV processing unit 110.
- the GPS receiver 240 receives a plurality of signals indicating times and positions (coordinates) of each GPS satellite transmitted from a plurality of navigation satellites (that is, GPS satellites).
- the GPS receiver 240 calculates the position of the GPS receiver 240 (that is, the position of the unmanned air vehicle 100) based on the received signals.
- the GPS receiver 240 outputs the position information of the unmanned air vehicle 100 to the UAV processing unit 110.
- the location information of the GPS receiver 240 may be calculated by the UAV processing unit 110 instead of the GPS receiver 240.
- the UAV processing unit 110 receives information indicating the time and the position of each GPS satellite included in a plurality of signals received by the GPS receiver 240.
- the battery 250 has a function as a drive source of each part of the unmanned air vehicle 100 and supplies necessary power to each part of the unmanned air vehicle 100.
- the optical sensor 260 detects the amount of light in the imaging area and outputs the detection result to the UAV processing unit 110.
- the timer 270 manages time information and outputs it to the UAV processing unit 110.
- FIG. 5 is a perspective view showing an example of the appearance of the transmitter 50.
- the up / down / front / rear / left / right directions with respect to the transmitter 50 are assumed to follow the directions of arrows shown in FIG.
- the transmitter 50 is used in a state of being held by both hands of a user who uses the transmitter 50, for example.
- the transmitter 50 includes, for example, a resin casing 50B having a substantially rectangular parallelepiped shape (in other words, a substantially box shape) having a substantially square bottom surface and a height shorter than one side of the bottom surface.
- a specific configuration of the transmitter 50 will be described later with reference to FIG.
- a left control rod 53L and a right control rod 53R are provided in a projecting manner at approximately the center of the housing surface of the transmitter 50.
- the left control rod 53L and the right control rod 53R are used in operations for remotely controlling the movement of the unmanned air vehicle 100 by the user (for example, moving the unmanned air vehicle 100 back and forth, moving left and right, moving up and down, and changing the direction). Is done.
- In FIG. 5, the left control rod 53L and the right control rod 53R are shown at their positions in an initial state, where no external force is applied from the user's hands.
- the left control rod 53L and the right control rod 53R automatically return to a predetermined position (for example, the initial position shown in FIG. 5) after the external force applied by the user is released.
- the power button B1 of the transmitter 50 is disposed on the front side (in other words, the user side) of the left control rod 53L.
- When the power button B1 is pressed once by the user, for example, the remaining capacity of the battery (not shown) built into the transmitter 50 is displayed on the battery remaining amount display portion L2.
- When the power button B1 is pressed again by the user, for example, the power of the transmitter 50 is turned on, and power is supplied to each part of the transmitter 50 (see FIG. 6) so that it can be used.
- An RTH (Return To Home) button B2 is disposed on the front side (in other words, the user side) of the right control rod 53R.
- When the RTH button B2 is pressed, the transmitter 50 transmits a signal for automatically returning the unmanned air vehicle 100 to a predetermined position.
- the transmitter 50 can automatically return the unmanned air vehicle 100 to a predetermined position (for example, a take-off position stored in the unmanned air vehicle 100).
- The RTH button B2 can be used, for example, when the user loses sight of the airframe of the unmanned aerial vehicle 100 during outdoor aerial shooting, or when operation becomes impossible due to radio interference or unexpected trouble.
- a remote status display unit L1 and a battery remaining amount display unit L2 are arranged on the front side (in other words, the user side) of the power button B1 and the RTH button B2.
- The remote status display unit L1 is configured using, for example, an LED (Light Emitting Diode), and displays the wireless connection state between the transmitter 50 and the unmanned air vehicle 100.
- the battery remaining amount display unit L2 is configured using, for example, an LED, and displays the remaining amount of the capacity of a battery (not shown) built in the transmitter 50.
- Two antennas AN1 and AN2 project from the rear side of the housing 50B of the transmitter 50 and rearward from the left control rod 53L and the right control rod 53R.
- The antennas AN1 and AN2 transmit the signals generated by the transmitter processing unit 61 based on the user's operation of the left control rod 53L and the right control rod 53R (that is, signals for controlling the movement of the unmanned air vehicle 100) to the unmanned air vehicle 100.
- the antennas AN1 and AN2 can cover a transmission / reception range of 2 km, for example.
- The antennas AN1 and AN2 can also receive images captured by the imaging devices 220-1 and 220-2 of the unmanned air vehicle 100 wirelessly connected to the transmitter 50, as well as various data acquired by the unmanned air vehicle 100, when these are transmitted from the unmanned air vehicle 100.
- The touch panel display TPD1 is configured using, for example, an LCD (Liquid Crystal Display) or an organic EL (Electroluminescence) display.
- the shape, size, and arrangement position of the touch panel display TPD1 are arbitrary and are not limited to the example shown in FIG.
- FIG. 6 is a block diagram illustrating an example of a hardware configuration of the transmitter 50 configuring the shape generation system 10 of FIG.
- The transmitter 50 includes a left control rod 53L, a right control rod 53R, a transmitter processing unit 61, a wireless communication unit 63, a memory 64, a power button B1, an RTH button B2, an operation unit set OPS, a remote status display unit L1, a battery remaining amount display unit L2, and a touch panel display TPD1.
- the transmitter 50 is an example of an operation terminal for remotely controlling the unmanned air vehicle 100.
- the left control rod 53L is used for an operation for remotely controlling the movement of the unmanned air vehicle 100 by, for example, the user's left hand.
- the right control rod 53R is used for an operation for remotely controlling the movement of the unmanned air vehicle 100 by, for example, the user's right hand.
- The movement of the unmanned aerial vehicle 100 includes, for example, forward movement, backward movement, leftward movement, rightward movement, ascent, descent, left turn, right turn, or a combination thereof.
- When the power button B1 is pressed once, the transmitter processing unit 61 displays the remaining capacity of the battery (not shown) built into the transmitter 50 on the battery remaining amount display unit L2, so the user can easily check the remaining battery capacity of the transmitter 50.
- When the power button B1 is pressed twice, a signal indicating that it has been pressed twice is passed to the transmitter processing unit 61.
- In response, the transmitter processing unit 61 instructs the battery (not shown) built into the transmitter 50 to supply power to each unit of the transmitter 50, so the user can turn on the transmitter 50 and easily start using it.
- When the RTH button B2 is pressed, a signal indicating that it has been pressed is input to the transmitter processing unit 61.
- The transmitter processing unit 61 then generates a signal for automatically returning the unmanned air vehicle 100 to a predetermined position (for example, its take-off position) and transmits it to the unmanned air vehicle 100 via the wireless communication unit 63 and the antennas AN1 and AN2.
- The user can thus automatically return the unmanned air vehicle 100 to a predetermined position by a simple operation on the transmitter 50.
- the operation unit set OPS is configured using a plurality of operation units (for example, operation units OP1,..., Operation unit OPn) (n: an integer of 2 or more).
- The operation unit set OPS is configured of the other operation units (various operation units) supporting remote control of the unmanned air vehicle 100 by the transmitter 50, excluding the left control rod 53L, the right control rod 53R, the power button B1, and the RTH button B2 shown in FIG. 5.
- The various operation units referred to here include, for example, a button for instructing still-image capture by the imaging devices of the unmanned air vehicle 100, a button for instructing the start and end of video recording by those imaging devices, a dial for adjusting the tilt direction of the gimbal 200 (see FIG. 4) of the unmanned air vehicle 100, a button for switching the flight mode of the unmanned air vehicle 100, and a dial for making settings of the imaging devices of the unmanned air vehicle 100.
- the operation unit set OPS has a parameter operation unit OPA for inputting information on input parameters for generating waypoints of the unmanned air vehicle 100.
- the parameter operation unit OPA may be formed by a stick, a button, a key, a touch panel, or the like.
- the parameter operation unit OPA may be formed by the left control rod 53L and the right control rod 53R.
- the timing for inputting each parameter included in the input parameters by the parameter operation unit OPA may be the same or different.
- the remote status display unit L1 and the remaining battery level display unit L2 have been described with reference to FIG.
- the transmitter processing unit 61 is configured using a processor (for example, a CPU, MPU, or DSP).
- the transmitter processing unit 61 performs signal processing for overall control of operations of each unit of the transmitter 50, data input / output processing with other units, data calculation processing, and data storage processing.
- the wireless communication unit 63 is connected to two antennas AN1 and AN2.
- the wireless communication unit 63 transmits / receives information and data to / from the unmanned air vehicle 100 via the two antennas AN1 and AN2 using a predetermined wireless communication method (for example, WiFi (registered trademark)).
- the wireless communication unit 63 transmits the input parameter information from the transmitter processing unit 61 to the unmanned air vehicle 100.
- The memory 64 includes, for example, a ROM (Read Only Memory) that stores a program defining the operation of the transmitter processing unit 61 and setting-value data, and a RAM (Random Access Memory) that temporarily stores various information and data used during processing by the transmitter processing unit 61.
- The program and setting-value data stored in the ROM of the memory 64 may be copied to a predetermined recording medium (for example, a CD-ROM or DVD-ROM).
- aerial image data captured by the imaging device 220 of the unmanned air vehicle 100 is stored in the RAM of the memory 64.
- the touch panel display TPD1 may display various data processed by the transmitter processing unit 61.
- the touch panel display TPD1 displays information on input parameters that have been input. Therefore, the user of the transmitter 50 can confirm the contents of the input parameter by referring to the touch panel display TPD1.
- the transmitter 50 may be connected to the terminal device by wire or wireless instead of including the touch panel display TPD1.
- Information on input parameters may be displayed on the terminal device, similar to the touch panel display TPD1.
- The terminal device may be a smartphone, a tablet terminal, a PC (Personal Computer), or the like. Further, the terminal device may input at least one of the input parameters and send it to the transmitter 50 by wired or wireless communication, and the wireless communication unit 63 of the transmitter 50 may transmit the input parameter to the unmanned air vehicle 100.
- Besides operating based on the transmitter 50, the unmanned air vehicle 100 may, as described later, fly along a preset flight path and take images.
- FIG. 7 is a diagram illustrating an example of the imaging range of each embodiment.
- In each embodiment, the imaging range A is a rectangular ground area.
- The case where a circular shaded region, formed by a mountain or the like blocking the sunlight, lies to the left of the center of the imaging range A is taken as an example, but the disclosure is not limited to this.
- the present disclosure can be applied to any other situation such as a case where lighting is blocked by furniture, a figurine, or the like in imaging of an indoor environment.
- the imaging range may be an irregular and complex range other than a rectangle, and the number of shades included in the imaging range and the shape of each shade may be varied.
- FIG. 8 is a sequence diagram illustrating processing in the shape generation system according to the first embodiment.
- The shape generation system according to the present embodiment includes the transmitter 50, the unmanned air vehicle 100 including the imaging device 220-1 (imaging device 1) and the imaging device 220-2 (imaging device 2), and an information processing device (for example, a PC or a tablet).
- First, the transmitter 50 acquires information on the imaging positions (waypoints) (step S11).
- The information on the waypoints is coordinate information, on the map, of the positions at which the unmanned air vehicle 100 performs the imaging operation, determined based on the flight path set in advance within the imaging range A.
- The flight path is set in advance by a conventional method, using the transmitter 50, a terminal device (smartphone, tablet, or the like) connected to the transmitter 50, or another terminal device; for example, it is a path that covers the imaging range A in the shortest distance.
- Information on waypoints may include, for example, longitude, latitude, and altitude information.
- the transmitter 50 may acquire information on the waypoint recorded in the built-in memory, or may acquire information on the waypoint from the outside via the wireless communication unit 63.
- FIG. 9 is a diagram schematically showing a flight path and a plurality of imaging positions in the imaging range A.
- In this example, 24 waypoints are provided on the flight path. In the imaging process (step S13) described later, the unmanned air vehicle 100 takes the first image at waypoint 1, then flies in the direction of the arrow and takes the second image when it passes waypoint 2. It then flies on in the direction of the arrow and takes the third image when it passes waypoint 3. Flight and imaging are repeated in this way until the 24th image is taken at waypoint 24, after which the aerial imaging ends.
- the waypoints are preferably set at intervals at which the captured images at adjacent waypoints overlap.
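- As an illustrative sketch only (all names and parameters below are hypothetical, not from the disclosure), a serpentine flight path like the one in FIG. 9 could be generated from the corner coordinates of a rectangular imaging range, with the grid spacing chosen so that images at adjacent waypoints overlap:

```python
# Hypothetical sketch: generate serpentine ("lawnmower") waypoints over a
# rectangular imaging range, similar in spirit to FIG. 9.
from dataclasses import dataclass

@dataclass
class Waypoint:
    longitude: float
    latitude: float
    altitude: float  # metres

def serpentine_waypoints(lon0, lat0, lon1, lat1, altitude, cols, rows):
    """Return row-by-row waypoints, reversing direction on alternate rows."""
    waypoints = []
    for r in range(rows):
        lat = lat0 + (lat1 - lat0) * r / max(rows - 1, 1)
        col_order = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        for c in col_order:
            lon = lon0 + (lon1 - lon0) * c / max(cols - 1, 1)
            waypoints.append(Waypoint(lon, lat, altitude))
    return waypoints

# Example: an 8 x 3 grid gives the 24 waypoints of FIG. 9.
path = serpentine_waypoints(139.700, 35.650, 139.710, 35.655, 50.0, 8, 3)
assert len(path) == 24
```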
- The unmanned aerial vehicle 100 includes two imaging devices (imaging device 1 and imaging device 2) with different exposure parameters, and for each waypoint one of the following is selected: imaging using only the imaging device 1, imaging using only the imaging device 2, or imaging using the imaging device 1 and the imaging device 2 simultaneously.
- FIG. 10 is a flowchart showing an example of a step (step S12) of selecting an imaging device.
- In step S12, first, the proportion of the shadow portion in the imaging region at each waypoint is acquired (step S121).
- The imaging range A includes a bright portion exposed to direct sunlight and a dark portion shaded where buildings or the like block the sunlight, so the environment has a large difference in brightness. Here, a portion at or below a predetermined light amount is defined as a "shadow portion".
- the transmitter 50 can estimate the ratio of the shadow portion based on the information regarding the irradiation angle of the light source and the obstruction of the light source.
- the irradiation angle of the light source refers to the irradiation angle of the sun.
- the transmitter 50 can estimate the irradiation angle of the sun based on, for example, time information and position information of the waypoint.
- The transmitter 50 may acquire time information from a built-in timer, for example, or may acquire it from the outside via GPS, the Internet, or the like.
- the waypoint position information may be longitude, latitude, and altitude information included in the information about the waypoint acquired in step S11.
- For the obstructions of the light source, the transmitter 50 may acquire a rough shape of the imaging region (for example, an existing elevation map) by referring to a three-dimensional map database on the Internet.
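- To make the estimation above concrete: the sun's irradiation angle can be approximated from the date, time, and waypoint coordinates using standard astronomical formulas. The sketch below is an illustration under that assumption, not the disclosure's prescribed method, and it ignores refinements such as the equation of time.

```python
# Hedged sketch: approximate solar elevation (the sun's irradiation angle)
# from time information and geographic information, using textbook formulas
# accurate to roughly a degree.
import math
from datetime import datetime, timezone

def solar_elevation_deg(when: datetime, lat_deg: float, lon_deg: float) -> float:
    day_of_year = when.timetuple().tm_yday
    # Solar declination (Cooper's approximation), in degrees.
    decl = 23.44 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0))
    # Approximate local solar time from UTC and longitude.
    utc = when.astimezone(timezone.utc)
    solar_hours = (utc.hour + utc.minute / 60.0 + lon_deg / 15.0) % 24.0
    hour_angle = 15.0 * (solar_hours - 12.0)  # degrees from solar noon
    lat, d, h = map(math.radians, (lat_deg, decl, hour_angle))
    sin_elev = math.sin(lat) * math.sin(d) + math.cos(lat) * math.cos(d) * math.cos(h)
    return math.degrees(math.asin(sin_elev))

# Example: noon local time at a hypothetical waypoint near Tokyo.
print(solar_elevation_deg(datetime(2017, 3, 16, 3, 0, tzinfo=timezone.utc), 35.65, 139.70))
```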
- The transmitter 50 acquires the proportion of the shadow portion at each waypoint, and then selects the imaging device to be used based on that proportion.
- When the proportion of the shadow portion is 30% or less, the transmitter 50 selects the imaging device 1, which is set with an exposure parameter giving a relatively low exposure amount (step S123). This can be expected to yield an image in which the sunny portion of the imaging region is properly exposed.
- When the proportion of the shadow portion exceeds 70%, the transmitter 50 selects the imaging device 2, which is set with an exposure parameter giving a relatively high exposure amount (step S125). This can be expected to yield an image in which the shaded portion of the imaging region is properly exposed.
- When the proportion of the shadow portion exceeds 30% and is 70% or less, the transmitter 50 selects both the imaging device 1 and the imaging device 2 (step S126). This can be expected to yield both an image in which the shaded portion is properly exposed and an image in which the sunny portion is properly exposed.
- the above 30% and 70% are merely examples of the first threshold value and the second threshold value, respectively, and other numerical values may be set as required.
- The first threshold and the second threshold are preferably set so that the transmitter 50 selects both the imaging device 1 and the imaging device 2 at at least one waypoint.
- This ensures that the shape restored from the feature points of the images captured by the imaging device 1 (described later) and the shape restored from the feature points of the images captured by the imaging device 2 overlap at least in part, so that they can be synthesized.
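- The selection rule of steps S123 to S126 reduces to a two-threshold comparison; the sketch below (illustrative only, with hypothetical shadow ratios) applies it per waypoint and reproduces the pattern of FIG. 11.

```python
# Illustrative sketch of the per-waypoint selection in steps S123-S126:
# device 1 (lower exposure) for sunny scenes, device 2 (higher exposure)
# for shaded scenes, and both near the shade/sun boundary.
def select_imaging_devices(shadow_ratio: float,
                           t1: float = 0.30,
                           t2: float = 0.70) -> tuple:
    if shadow_ratio <= t1:
        return (1,)    # mostly sunny: low-exposure imaging device 1 (S123)
    if shadow_ratio > t2:
        return (2,)    # mostly shaded: high-exposure imaging device 2 (S125)
    return (1, 2)      # boundary region: use both imaging devices (S126)

# Hypothetical shadow ratios matching FIG. 11: deep shade at waypoints 9
# and 16, boundary values at 8, 10, 15 and 17, and sun elsewhere.
ratios = {9: 0.80, 16: 0.75, 8: 0.50, 10: 0.50, 15: 0.50, 17: 0.50}
for wp in range(1, 25):
    print(wp, select_imaging_devices(ratios.get(wp, 0.10)))
```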
- FIG. 11 is a diagram schematically showing a state in which an imaging device is selected at each waypoint.
- In FIG. 11, one mark indicates waypoints at which the imaging device 1 is selected, and another mark indicates waypoints at which the imaging device 2 is selected (both marks appear where both devices are selected).
- Waypoints 1 to 7, 11 to 14, and 18 to 24 are in the sun, where the shadow portion occupies 30% or less, so the imaging device 1 is selected. Waypoints 8, 10, 15, and 17 lie on the boundary between shade and sun, where the shadow portion exceeds 30% and is 70% or less, so both the imaging device 1 and the imaging device 2 are selected. Waypoints 9 and 16 are in the shade, where the shadow portion exceeds 70%, so the imaging device 2 is selected.
- After the imaging device to be used at each waypoint has been selected (step S12), the process returns to FIG. 8: the transmitter 50 generates information on the imaging device for each waypoint based on the selection result, and transmits the information on the waypoints and the information on the imaging device used at each waypoint to the unmanned air vehicle 100.
- Information on waypoints and information on imaging devices used in each waypoint may be transmitted to the unmanned air vehicle 100 by the transmitter 50 by a wireless or wired communication method.
- the transmitter 50 may record the information on a storage medium such as a memory card and transmit the information by any other method such as inserting the storage medium into an unmanned air vehicle.
- The unmanned air vehicle 100 receives the information on the waypoints and the information on the imaging devices used at each waypoint, then flies along the flight path and performs imaging using the imaging device selected at each waypoint (step S13).
- the unmanned air vehicle 100 performs imaging during its movement (that is, without stopping when it reaches the waypoint).
- This not only shortens the aerial shooting time but also saves the power that would be needed to stop and restart the unmanned aerial vehicle 100.
- The unmanned air vehicle 100 preferably stores the images captured in step S13, for each imaging device, in a memory built into the imaging device or in a memory built into the unmanned air vehicle 100.
- the unmanned air vehicle 100 transmits the captured image to the information processing apparatus when the aerial photography is finished.
- These images may be transmitted to the information processing apparatus by the unmanned air vehicle 100 by a wireless or wired communication method, or may be recorded by the unmanned air vehicle 100 on a storage medium such as a memory card and delivered by any other method, such as inserting the storage medium into the information processing apparatus.
- the information processing apparatus that has received the image restores the shape of the subject for each imaging apparatus by a conventional technique such as SFM (Structure from Motion) (step S14).
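- SFM itself is a conventional technique, as the text notes. Purely as an illustration of one of its building blocks, a two-view reconstruction with OpenCV might look as follows; the camera intrinsics K and the overlapping image pair are assumed given, and this is not the disclosure's specific pipeline.

```python
# Hedged sketch of a two-view SFM step: match features between two
# overlapping images from the same imaging device, recover relative pose,
# and triangulate a sparse 3D point cloud (up to scale).
import cv2
import numpy as np

def two_view_points(img1, img2, K):
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)
    matches = cv2.BFMatcher().knnMatch(des1, des2, k=2)
    good = [m for m, n in matches if m.distance < 0.75 * n.distance]  # ratio test
    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
    _, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])  # first camera at origin
    P2 = K @ np.hstack([R, t])                         # second camera pose
    pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
    return (pts4d[:3] / pts4d[3]).T  # Nx3 points in the first camera frame
```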
- FIG. 12 is a diagram schematically showing that the shape is restored and formed for each imaging device.
- In the shape restored for the imaging device 1, the sunny portion and the vicinity of the boundary between shade and sun are restored normally, while information on the deeply shaded portion (on the left side; see portion B) is missing.
- This is because the imaging device 1 does not capture images at waypoints where the shadow portion exceeds 70%, such as waypoints 9 and 16, so information there is insufficient.
- In the portions other than B, feature points are detected efficiently and the shape can be restored normally.
- The information processing apparatus then synthesizes the two restored shapes (step S15). As a result, the portions lacking information, such as B, are mutually supplemented, and the shape of the entire imaging range A is generated.
- The shape restored from the images captured by the imaging device 1 and the shape restored from the images captured by the imaging device 2 overlap near the boundary between shade and sun, which makes the synthesis possible.
- The synthesized shape therefore has no missing portion such as B in FIG. 12 and no shortage of information, so a highly accurate shape is obtained.
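- If each restored shape is represented as a point cloud in a shared coordinate frame (an assumption; the disclosure does not fix a representation), the synthesis of step S15 can be sketched as a union of the two clouds, with the overlap near the shade/sun boundary guaranteeing that they join:

```python
# Hedged sketch of step S15 under the assumption that both restored shapes
# are Nx3 point clouds already expressed in the same coordinate frame. The
# union fills the portions (like B in FIG. 12) that one device's shape is
# missing; a voxel grid removes duplicate points in the overlap.
import numpy as np

def synthesize(shape1: np.ndarray, shape2: np.ndarray, voxel: float = 0.5) -> np.ndarray:
    merged = np.vstack([shape1, shape2])
    keys = np.floor(merged / voxel).astype(np.int64)  # voxel index per point
    _, unique_idx = np.unique(keys, axis=0, return_index=True)
    return merged[np.sort(unique_idx)]
```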
- Alternatively, the unmanned air vehicle 100 may transmit the captured images to the transmitter 50, as indicated by the broken lines in FIG. 8.
- In that case, the transmitter 50 restores the shape of the subject for each imaging device (step S14') and synthesizes the restored shapes (step S15').
- the specific description regarding the restoration / combination at this time is the same as the respective processing (steps S14 and S15) when the above-described image is transmitted to the information processing apparatus, and is therefore omitted.
- In the first embodiment, the transmitter 50 selects the imaging device used for imaging at each waypoint.
- The second embodiment differs from the first in that the unmanned air vehicle 100 itself selects the imaging device. For convenience, description overlapping with the first embodiment is omitted.
- FIG. 13 is a sequence diagram showing processing in the shape generation system according to the second embodiment.
- First, the transmitter 50 acquires information on the imaging positions (waypoints) (step S21).
- the transmitter 50 transmits information on the waypoint to the unmanned air vehicle 100.
- The information on the waypoints may be transmitted directly to the unmanned air vehicle 100 by the transmitter 50 by a wireless or wired communication method, or may be recorded by the transmitter 50 on a storage medium such as a memory card and delivered by any other method, such as inserting the storage medium into the unmanned air vehicle 100.
- After receiving the information on the waypoints, the unmanned air vehicle 100 acquires the proportion of the shadow portion at each waypoint and selects the imaging device to be used based on that proportion (step S22).
- As in the first embodiment, the unmanned air vehicle 100 can estimate the proportion of the shadow portion at each waypoint based on information on the irradiation angle of the light source and the obstructions of the light source.
- the irradiation angle of the light source refers to the irradiation angle of the sun.
- The unmanned air vehicle 100 can estimate it from time information and the position information of the waypoint.
- the unmanned air vehicle 100 may obtain time information from the timer 270, or may obtain it from the outside through GPS, the Internet, or the like.
- the waypoint position information may be longitude, latitude, and altitude information included in the information about the waypoint acquired in step S21.
- the unmanned air vehicle 100 may acquire a rough shape of the imaging region (for example, an existing elevation map) by referring to a three-dimensional map database on the Internet.
- Alternatively, the unmanned air vehicle 100 may fly along the flight path and measure, with the optical sensor 260 such as an exposure meter, the light amount of the imaging region at each waypoint at the time of imaging.
- the unmanned air vehicle 100 may continuously measure the amount of light in the imaging area during flight and estimate the ratio of the shadow portion in the waypoint before imaging.
- the light quantity may be measured at the timing of arrival at each waypoint (which may be the timing just before arrival).
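- Assuming the metering yields a luminance map of the imaging region (an assumption; the disclosure only requires the resulting proportion), the measured light amounts can be turned into the shadow-portion ratio as in the following sketch.

```python
# Hedged sketch: estimate the shadow-portion proportion from a luminance
# map of the imaging region, with "shadow" meaning pixels at or below the
# predetermined light amount defined in the disclosure.
import numpy as np

def shadow_ratio(luminance: np.ndarray, predetermined_light_amount: float) -> float:
    """Fraction of the imaging region at or below the light threshold."""
    return float((luminance <= predetermined_light_amount).mean())

# Synthetic example: a region whose left third is in shade.
lum = np.full((90, 90), 200.0)
lum[:, :30] = 20.0
print(shadow_ratio(lum, 50.0))  # about 0.33, so both devices would be selected
```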
- the unmanned air vehicle 100 selects an imaging device to be used based on the ratio of the shadow portion in the waypoint (step S22), and then captures an image with the imaging device selected at each waypoint (step S23).
- The images captured in step S23 are stored, for each imaging device, in a memory built into the imaging device or in the memory 160 built into the unmanned air vehicle 100.
- the unmanned air vehicle 100 transmits the image stored in the imaging device to the information processing device.
- These images may be transmitted to the information processing apparatus by the unmanned air vehicle 100 by a wireless or wired communication method, or may be recorded by the unmanned air vehicle 100 on a storage medium such as a memory card and delivered by any other method, such as inserting the storage medium into the information processing apparatus.
- The information processing apparatus that has received the images restores the shape of the subject for each imaging device by a conventional method such as SFM (Structure from Motion) (step S24), and finally combines the restored shapes (step S25).
- In this combination, the portions B lacking information in one shape are supplemented by the other, and the shape of the entire imaging range A is generated.
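- As an illustration of step S25, the sketch below merges two geo-referenced point clouds by concatenation plus voxel de-duplication. `run_sfm` is a hypothetical stand-in for whatever conventional SFM toolchain is used (e.g. a COLMAP/OpenMVG-style pipeline), and the voxel size is an arbitrary example value:

```python
import numpy as np

def merge_shapes(points_a, points_b, voxel_m=0.5):
    """Combine two reconstructed point clouds (N x 3 arrays, same geo frame).

    Each SFM run (one per imaging device) yields one cloud; because both
    runs use the same GPS-tagged waypoints, the clouds are assumed to share
    a coordinate frame, so gaps B in one reconstruction are filled by the
    other simply by concatenating and keeping one point per voxel.
    """
    merged = np.vstack([points_a, points_b])
    keys = np.floor(merged / voxel_m).astype(np.int64)    # voxel index per point
    _, keep = np.unique(keys, axis=0, return_index=True)  # one point per voxel
    return merged[np.sort(keep)]

# cloud_1 = run_sfm(images_device_1)   # hypothetical SFM helper, per device
# cloud_2 = run_sfm(images_device_2)
# full_shape = merge_shapes(cloud_1, cloud_2)
```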
- The shape generation system including the transmitter 50, the unmanned air vehicle 100, and the information processing apparatus has been described above.
- The number of imaging devices provided in the unmanned air vehicle 100 is not limited to two; more may be provided.
- the process executed by the transmitter 50 may be executed by any other type of mobile platform or information processing apparatus. These processes may be executed by the unmanned air vehicle 100 itself.
- the processing executed by the unmanned air vehicle 100 in each of the above embodiments may be executed by a mobile body having any other imaging function.
- the process executed by the information processing apparatus may be executed by another information processing apparatus such as a smartphone or a tablet, or may be executed by the unmanned air vehicle 100 itself.
- The processing in the shape generation system according to each embodiment constitutes, as a whole, the shape generation method of the present disclosure.
- The processing executed by the transmitter 50 constitutes the shape generation method in the mobile platform according to the present disclosure.
- The processing executed by the unmanned air vehicle 100 constitutes the image acquisition method in the flying object for shape generation according to the present disclosure.
- The processes (steps) executed by the propo may be executed by the transmitter processing unit 61 of the transmitter 50, and the processes (steps) executed by the drone may be executed by the UAV processing unit 110 of the unmanned air vehicle 100.
- A program that causes the transmitter 50, which is a computer, to execute the processes (steps) performed by the propo in the shape generation method may be run by the transmitter processing unit 61. This program may be stored in the memory 64 or another storage medium.
- Likewise, the UAV processing unit 110 may execute a program that causes the unmanned air vehicle 100, which is a computer, to execute the processes (steps) performed by the drone in the shape generation method. This program may be stored in the memory 160 or another storage medium.
- According to the shape generation method, the image acquisition method, the mobile platform, the flying object, the program, and the recording medium of the present disclosure, a highly accurate shape can be generated even in an environment with strong contrast between light and dark, regardless of the dynamic range of the imaging device.
- In addition, the flying object can capture images while moving, without stopping on the flight path, which shortens the aerial shooting time and saves power.
- Furthermore, according to the shape generation method, the image acquisition method, the mobile platform, the flying object, the program, and the recording medium of the present disclosure, the shape can be generated efficiently even when the number of waypoints is large.
- Although the restoration operation is performed twice (once per imaging device), the synthesis operation needs to be performed only once; unlike conventional HDR, there is no need to synthesize images at every waypoint. For example, with 100 waypoints and two imaging devices, two restorations and one synthesis replace 100 per-waypoint HDR syntheses.
- shape generation system; 50 transmitter; 61 transmitter processing unit; 63 wireless communication unit; 64 memory; 100 unmanned air vehicle; 102 UAV main body; 110 UAV processing unit; 150 communication interface; 160 memory; 220-1 imaging device; 220-2 imaging device
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Studio Devices (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Details Of Cameras Including Film Mechanisms (AREA)
Abstract
According to the invention, information regarding a plurality of imaging positions of an aerial vehicle having a plurality of imaging devices is acquired. An imaging device to be used for imaging at each of the plurality of imaging positions is selected from among the plurality of imaging devices. An image is captured at each imaging position by the selected imaging device, the shape of a subject is restored for each imaging device on the basis of the captured images, and the restored shapes are combined. When selecting an imaging device, at least one imaging device is selected for each imaging position from among the plurality of imaging devices, on the basis of the proportion of a portion having a predetermined light quantity or less in the imaging region at that imaging position. As a result, the shape of a subject with high contrast can be acquired with high accuracy without stopping the aerial vehicle.
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2019505602A JP6997170B2 (ja) | 2017-03-15 | 2017-03-15 | Shape generation method, image acquisition method, mobile platform, flying object, program, and recording medium |
| PCT/JP2017/010515 WO2018167893A1 (fr) | 2017-03-15 | 2017-03-15 | Shape generation method, image acquisition method, mobile platform, aerial vehicle, program, and recording medium |
| CN201780088336.XA CN110402453A (zh) | 2017-03-15 | 2017-03-15 | Shape generation method, image acquisition method, mobile platform, flying object, program, and recording medium |
| JP2021139432A JP2022002391A (ja) | 2017-03-15 | 2021-08-27 | Shape generation method, image acquisition method, mobile platform, flying object, program, and recording medium |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2017/010515 WO2018167893A1 (fr) | 2017-03-15 | 2017-03-15 | Shape generation method, image acquisition method, mobile platform, aerial vehicle, program, and recording medium |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018167893A1 (fr) | 2018-09-20 |
Family
ID=63522886
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2017/010515 Ceased WO2018167893A1 (fr) | 2017-03-15 | 2017-03-15 | Procédé de génération de profil, procédé d'acquisition d'image, plate-forme mobile, véhicule aérien, programme et support d'enregistrement |
Country Status (3)
| Country | Link |
|---|---|
| JP (2) | JP6997170B2 (fr) |
| CN (1) | CN110402453A (fr) |
| WO (1) | WO2018167893A1 (fr) |
- 2017-03-15 JP JP2019505602A patent/JP6997170B2/ja not_active Expired - Fee Related
- 2017-03-15 CN CN201780088336.XA patent/CN110402453A/zh not_active Withdrawn
- 2017-03-15 WO PCT/JP2017/010515 patent/WO2018167893A1/fr not_active Ceased
- 2021-08-27 JP JP2021139432A patent/JP2022002391A/ja not_active Withdrawn
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2016106715A1 (fr) * | 2014-12-31 | 2016-07-07 | SZ DJI Technology Co., Ltd. | Selective processing of sensor data |
| JP2016119693A (ja) * | 2016-02-02 | 2016-06-30 | Sony Interactive Entertainment Inc. | Image capturing apparatus and image capturing method |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP7752370B1 (ja) * | 2025-03-14 | 2025-10-10 | 西部マリン・サービス株式会社 | Water surface inspection apparatus and water surface inspection method |
Also Published As
| Publication number | Publication date |
|---|---|
| CN110402453A (zh) | 2019-11-01 |
| JP6997170B2 (ja) | 2022-01-17 |
| JP2022002391A (ja) | 2022-01-06 |
| JPWO2018167893A1 (ja) | 2020-01-16 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11377211B2 (en) | | Flight path generation method, flight path generation system, flight vehicle, program, and storage medium |
| JP6878567B2 (ja) | | Three-dimensional shape estimation method, flying object, mobile platform, program, and recording medium |
| US20190318636A1 (en) | | Flight route display method, mobile platform, flight system, recording medium and program |
| JP6765512B2 (ja) | | Flight path generation method, information processing apparatus, flight path generation system, program, and recording medium |
| JP7251474B2 (ja) | | Information processing apparatus, information processing method, information processing program, image processing apparatus, and image processing system |
| US11122209B2 (en) | | Three-dimensional shape estimation method, three-dimensional shape estimation system, flying object, program and recording medium |
| JP6862477B2 (ja) | | Position processing apparatus, flying object, position processing system, flight system, position processing method, flight control method, program, and recording medium |
| WO2018150492A1 (fr) | | Image display method, image display system, flying object, program, and recording medium |
| JP2021096865A (ja) | | Information processing apparatus, flight control instruction method, program, and recording medium |
| JP2022002391A (ja) | | Shape generation method, image acquisition method, mobile platform, flying object, program, and recording medium |
| JP7081198B2 (ja) | | Imaging system and imaging control apparatus |
| JP6329219B2 (ja) | | Operation terminal and moving body |
| JP6949930B2 (ja) | | Control apparatus, moving body, and control method |
| WO2022188151A1 (fr) | | Image photographing method, control apparatus, mobile platform, and computer storage medium |
| CN110785724B (zh) | | Transmitter, flying object, flight control instruction method, flight control method, program, and storage medium |
| JP6856670B2 (ja) | | Flying object, operation control method, operation control system, program, and recording medium |
| JP2024021143A (ja) | | 3D data generation system and 3D data generation method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17900388 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2019505602 Country of ref document: JP Kind code of ref document: A |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 17900388 Country of ref document: EP Kind code of ref document: A1 |