WO2019235240A1 - Information processing device - Google Patents
Information processing device
- Publication number
- WO2019235240A1 (PCT/JP2019/020508)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- index
- area
- image
- sunny
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01G—HORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
- A01G7/00—Botany in general
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/25—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
- G01N21/27—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands using photo-electric detection ; circuits for computing concentration
Definitions
- the present invention relates to a technology that supports the determination of the work content related to crops.
- Patent Document 1 discloses a technique for supporting farm work, including fertilization management such as determining the amount of fertilizer, based on observation data such as converted leaf-color values calculated from image data obtained by photographing crops.
- An index indicating the growth state (for example, NDVI) is obtained using the output of a sensor (such as an image sensor) that measures the amount of light from crops in a field, and is used as a guide for work timing.
- An object of the present invention is to support appropriate judgment of the growth status in a field where sun and shade are mixed.
- The present invention provides an information processing apparatus including a determination unit that determines a sunny area of a field included in a captured field image, an index acquisition unit that acquires, as a sunny index, an index indicating the growth status of the crop in the determined sunny area, and an output unit that outputs the acquired sunny index as an index of the sunny area.
- Diagram showing the overall configuration of the agricultural support system according to an embodiment
- Diagram showing the hardware configuration of the server device
- Diagram showing the hardware configuration of the drone
- Diagram showing the functional configuration realized by the agricultural support system
- Diagram showing an example of the method of photographing a field
- Diagram showing an example of the determination result of the sunny area
- Diagram showing an example of the pixel-unit NDVI map
- Diagrams showing examples of the area-unit NDVI map
- Diagrams showing examples of the growth information search screen
- Diagram showing the functional configuration realized by a modification
- Diagram showing an example of an image for index correction
- Diagram showing the functional configuration realized by another modification
- Diagram showing an example of an input screen for shooting conditions
- FIG. 1 shows the overall configuration of the agricultural support system 1 according to an embodiment.
- the agricultural support system 1 is a system that supports a person who performs work in a farm (a place where crops such as rice, vegetables, and fruits are grown) by using an index that represents the growth status of the crop.
- The index representing the growth status represents one or both of the progress of the crop's growing stage (for example, whether it is ready for harvesting) and its condition (also called activity), such as the presence or absence of disease.
- NDVI: Normalized Difference Vegetation Index
- In this embodiment, an index representing the growth status of the crop in the field is calculated using an image of the field taken from above by a flying object.
- The flying object may be anything capable of photographing the field; in this embodiment, a drone is used.
- the agricultural support system 1 includes a network 2, a server device 10, a drone 20, and a user terminal 30.
- the network 2 is a communication system including a mobile communication network and the Internet, and relays data exchange between devices accessing the own system.
- The server device 10 is connected to the network 2 by wired communication (wireless communication is also possible), and the drone 20 and the user terminal 30 are connected by wireless communication (the user terminal 30 may instead use wired communication).
- the user terminal 30 is a terminal used by a user of the system (for example, a worker who performs work in a farm), and is, for example, a smartphone, a laptop computer, or a tablet terminal.
- the drone 20 is a rotorcraft type flying body that includes one or more rotor blades and flies by rotating the rotor blades.
- the drone 20 includes a photographing unit that photographs a farm field from above while flying.
- The drone 20 is carried to the field by, for example, a farm worker who is a user of the agricultural support system 1, and flies and shoots when the worker performs an operation to start a photographing flight.
- the server device 10 is an information processing device that performs processing related to worker support.
- the server device 10 performs, for example, a process of calculating the above-described NDVI from the field image captured by the drone 20.
- NDVI exploits the property that the green leaves of plants absorb much red visible light and reflect much light in the near-infrared region (0.7 μm to 2.5 μm); it is expressed as NDVI = (IR − R) / (IR + R), where R and IR are the red and near-infrared reflectances. The worker can determine the timing of watering, fertilizer application, pesticide application, and so on for the crops in the field where he or she works by referring to the growth status represented by the NDVI calculation results displayed on the user terminal 30.
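The NDVI calculation described above can be sketched in a few lines. This is an illustrative sketch only (the array-based helper below is not part of the embodiment), assuming per-pixel red and near-infrared reflectance values are available:

```python
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Compute NDVI = (IR - R) / (IR + R) per pixel.

    red, nir: arrays of red and near-infrared reflectance.
    Values fall in [-1, 1]; dense green vegetation approaches 1.
    """
    red = red.astype(float)
    nir = nir.astype(float)
    out = np.zeros_like(red)
    denom = nir + red
    # Leave 0 where both bands are 0 to avoid dividing by zero.
    np.divide(nir - red, denom, out=out, where=denom != 0)
    return out

# A healthy leaf reflects much NIR and absorbs red, so NDVI is high.
print(ndvi(np.array([0.1]), np.array([0.5])))  # → [0.66666667]
```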
- FIG. 2 shows the hardware configuration of the server device 10 and the user terminal 30.
- Each of the server device 10 and the user terminal 30 is a computer including each device such as a processor 11, a memory 12, a storage 13, a communication device 14, an input device 15, an output device 16, and a bus 17.
- the term “apparatus” here can be read as a circuit, a device, a unit, or the like. Each device may include one or a plurality of devices, or some of the devices may not be included.
- the processor 11 controls the entire computer by operating an operating system, for example.
- the processor 11 may include a central processing unit (CPU) including an interface with peripheral devices, a control device, an arithmetic device, a register, and the like. Further, the processor 11 reads programs (program codes), software modules, data, and the like from the storage 13 and / or the communication device 14 to the memory 12, and executes various processes according to these.
- the number of processors 11 that execute various processes may be one, two or more, and the two or more processors 11 may execute various processes simultaneously or sequentially. Further, the processor 11 may be implemented by one or more chips.
- the program may be transmitted from the network via a telecommunication line.
- The memory 12 is a computer-readable recording medium and may include, for example, at least one of ROM (Read Only Memory), EPROM (Erasable Programmable ROM), EEPROM (Electrically Erasable Programmable ROM), and RAM (Random Access Memory).
- the memory 12 may be called a register, a cache, a main memory (main storage device), or the like.
- the memory 12 can store the above-described program (program code), software module, data, and the like.
- The storage 13 is a computer-readable recording medium, such as an optical disk (for example, a CD-ROM (Compact Disc ROM)), a hard disk drive, a flexible disk, a magneto-optical disk (for example, a compact disk, a digital versatile disk, or a Blu-ray (registered trademark) disk), a smart card, a flash memory (for example, a card, stick, or key drive), a floppy (registered trademark) disk, or a magnetic strip.
- the storage 13 may be called an auxiliary storage device.
- the above-described storage medium may be, for example, a database including the memory 12 and / or the storage 13, a server, or other suitable medium.
- the communication device 14 is hardware (transmission / reception device) for performing communication between computers via a wired and / or wireless network, and is also referred to as, for example, a network device, a network controller, a network card, or a communication module.
- the input device 15 is an input device (for example, a keyboard, a mouse, a microphone, a switch, a button, a sensor, etc.) that accepts an input from the outside.
- the output device 16 is an output device (for example, a display, a speaker, or the like) that performs output to the outside. Note that the input device 15 and the output device 16 may have an integrated configuration (for example, a touch screen).
- the devices such as the processor 11 and the memory 12 are accessible to each other via a bus 17 for communicating information.
- the bus 17 may be composed of a single bus or may be composed of different buses between devices.
- FIG. 3 shows the hardware configuration of the drone 20.
- the drone 20 is a computer that includes a processor 21, a memory 22, a storage 23, a communication device 24, a flying device 25, a sensor device 26, a photographing device 27, and a bus 28.
- the term “apparatus” here can be read as a circuit, a device, a unit, or the like. Each device may include one or a plurality of devices, or some of the devices may not be included.
- the processor 21, the memory 22, the storage 23, the communication device 24, and the bus 28 are the same type of hardware as the device of the same name shown in FIG. 2 (performance and specifications may be different).
- the communication device 24 can also perform wireless communication between drones in addition to wireless communication with the network 2.
- the flying device 25 is a device that includes a motor, a rotor, and the like and causes the aircraft to fly. The flying device 25 can move the aircraft in all directions in the air, or can stop (hover) the aircraft.
- the sensor device 26 is a device having a sensor group that acquires information necessary for flight control.
- The sensor device 26 includes a position sensor that measures the position (latitude and longitude) of the aircraft, a direction sensor that measures the direction in which the aircraft is facing (the drone has a defined front direction), an altitude sensor that measures the altitude of the aircraft, a velocity sensor that measures the velocity of the aircraft, and an inertial measurement unit (IMU) that measures angular velocity about three axes and acceleration in three directions.
- the photographing device 27 is a so-called digital camera that has a lens, an image sensor, and the like and records an image photographed by the image sensor as digital data.
- This image sensor has sensitivity not only to visible light but also to light having a wavelength in the near infrared region necessary for calculating NDVI.
- the photographing device 27 is attached to the lower part of the casing of the own device (drone 20), has a fixed photographing direction, and photographs a vertically lower part during the flight of the own device.
- the photographing device 27 has an autofocus function, and can automatically focus and photograph even if the flight altitude changes.
- The server device 10 and the drone 20 may be configured to include hardware such as a microprocessor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a programmable logic device (PLD), and a field programmable gate array (FPGA), and a part or all of each functional block may be realized by that hardware. For example, the processor 11 may be implemented by at least one of these pieces of hardware.
- The server device 10 and the drone 20 included in the agricultural support system 1 store programs provided by this system, and the function group described below is realized when the processor of each device executes the program and controls each unit.
- FIG. 4 shows a functional configuration realized by the agricultural support system 1.
- the server device 10 includes an agricultural field image generation unit 101, a sunny area determination unit 102, an index calculation unit 103, a growth information generation unit 104, and a growth information recording unit 105.
- the drone 20 includes a flight control unit 201, a flight unit 202, a sensor measurement unit 203, and an imaging unit 204.
- the flight control unit 201 controls the flight of the own aircraft when photographing a farm field.
- The flight control unit 201 stores field range information indicating the geographical range of the field (for example, latitude and longitude indicating the outer edge of the field), registered in advance by the farmer who is the user, and based on this information performs control to fly the aircraft along a flight path that covers the entire field at a constant altitude.
- The flight path in this case is, for example, a path over a rectangular field that traces a wavy trajectory from one side of the field to the opposite side.
- Alternatively, it may be a route that flies along the outer edge of the field and, after each lap, shifts inward to trace a spiral trajectory; any route that covers the field is acceptable.
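As a concrete illustration of the wavy route described above (not taken from the embodiment; the rectangular-field geometry, the pass spacing, and the function name are assumptions), the turn-back waypoints could be generated like this:

```python
def wavy_path(width_m: float, length_m: float, pass_spacing_m: float):
    """Generate (x, y) waypoints for a back-and-forth sweep of a
    rectangular field: fly the full length, shift sideways by the
    pass spacing, fly back, and repeat until the width is covered."""
    waypoints = []
    x, direction = 0.0, 1
    while x <= width_m:
        start_y, end_y = (0.0, length_m) if direction > 0 else (length_m, 0.0)
        waypoints.append((x, start_y))
        waypoints.append((x, end_y))
        x += pass_spacing_m
        direction = -direction
    return waypoints

pts = wavy_path(30, 100, 10)
# Four passes at x = 0, 10, 20, 30, alternating direction.
```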
- the flying unit 202 has a function of flying the aircraft.
- The flying unit 202 flies the aircraft by operating the motors, rotors, and the like included in the flying device 25.
- the sensor measurement unit 203 performs measurement by each sensor (position sensor, direction sensor, altitude sensor, speed sensor, inertial measurement sensor) included in the sensor device 26, and calculates the position, direction, altitude, speed, angular velocity, and acceleration of the own device. Measure repeatedly at predetermined time intervals.
- the sensor measurement unit 203 supplies sensor information indicating the measured position, direction, altitude, speed, angular velocity, and acceleration to the flight control unit 201.
- the flight control unit 201 controls the flight unit 202 based on the supplied sensor information and causes the aircraft to fly along the above-described flight path.
- the sensor measurement unit 203 supplies sensor information indicating the measured position, direction, altitude, and speed to the imaging unit 204.
- the photographing unit 204 has a function of photographing a subject using the photographing device 27 and is an example of the “photographing unit” in the present invention.
- the imaging unit 204 captures the field as a subject.
- When capturing the field, the imaging unit 204 also captures the area in the field where the crop is growing (the crop area).
- Each pixel forming a still image captured by the imaging unit 204 is represented by pixel values (R, G, B) indicating visible red, green, and blue light and a pixel value (IR) indicating light with a wavelength in the near-infrared region.
- the imaging unit 204 captures a plurality of still images based on the supplied sensor information so that all areas in the field are included.
- FIG. 5 shows an example of a method for photographing a farm field.
- FIG. 5 shows a path B1 when the drone 20 flies over the field A1 with a wavy locus.
- The imaging unit 204 calculates the imaging range on the ground (the range of the field, at 0 m above the ground, included in the angle of view) from the altitude indicated by the sensor information and the angle of view of the imaging device 27. Then, from the speed and direction indicated by the sensor information, the imaging unit 204 performs the next shot when the ratio of the area where the current imaging range overlaps the previous imaging range (for example, expressed as a percentage where the area of the imaging range is 100%) falls below a threshold value.
- the photographing unit 204 first photographs the photographing region C1, and then photographs the photographing region C2 that slightly overlaps the photographing region C1.
- The imaging unit 204 notifies the flight control unit 201 of the calculated size of the imaging range when the aircraft (drone 20) turns back.
- The flight control unit 201 turns the route back, shifting it by a distance at which imaging ranges of the notified size overlap, as with the imaging areas C4 and C5 in FIG. 5.
- the imaging unit 204 captures still images obtained by imaging the imaging regions C1 to C32 shown in FIG. 5, that is, a plurality of still images whose imaging ranges are slightly overlapped, by repeating imaging using this method.
- In FIG. 5, the field A1 has a size and shape into which a whole number of imaging ranges fits, but this need not be the case. In that case, all areas of the field are still included in one of the still images by widening the overlapping portions of the imaging ranges or by shooting beyond the edge of the field.
- The photographing method of the imaging unit 204 is not limited to this. For example, if the flight speed and altitude at the time of shooting are determined in advance, the time interval that produces the overlap shown in FIG. 5 can be calculated in advance, so shooting may simply be performed at that interval. Also, if a map of the field and shooting positions are determined in advance, the imaging unit 204 may shoot when flying over the determined positions. Further, the imaging unit 204 may capture a moving image, which produces larger data, as long as the storage capacity and communication speed of the aircraft are sufficient.
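The overlap-triggered shooting described above might be sketched as follows. The square ground footprint derived from altitude and angle of view, the 20% default overlap, and all names are simplifying assumptions rather than the embodiment's exact method:

```python
import math

def footprint_m(altitude_m: float, fov_deg: float) -> float:
    """Side length of the square ground area covered at a given
    altitude by a camera with the given (full) angle of view."""
    return 2 * altitude_m * math.tan(math.radians(fov_deg) / 2)

def should_shoot(dist_since_last_m: float, altitude_m: float,
                 fov_deg: float, min_overlap: float = 0.2) -> bool:
    """Shoot again once the overlap ratio with the previous frame
    (along the direction of travel) would drop below min_overlap."""
    side = footprint_m(altitude_m, fov_deg)
    overlap = max(0.0, (side - dist_since_last_m) / side)
    return overlap <= min_overlap

# At 50 m altitude with a 60-degree FOV the footprint is ~57.7 m
# across, so the overlap reaches 20% after ~46.2 m of travel.
```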
- In any case, a known method for photographing the ground using a drone may be used.
Operation
- The operation of each unit of the drone 20 is started when the farm worker mentioned above performs the operation to start the flight.
- the drone 20 flies over the set flight path over the field, and the imaging unit 204 repeatedly performs imaging as described above.
- Image data indicating the captured still image, together with shooting information related to the shot (the position, orientation, altitude, and time at the moment of shooting, and the angle of view of the imaging device 27), is transmitted to the server device 10.
- the farm field image generation unit 101 of the server device 10 receives the transmitted image data, and acquires a still image indicated by the image data as a farm field image captured by the drone 20.
- the farm field image generation unit 101 is an example of the “first image acquisition unit” in the present invention.
- The field image generation unit 101 also acquires the shooting information indicated by the received image data, and generates an image of the entire field from the images of the individual imaging regions.
- The field image generation unit 101 calculates the overlapping portions of the images using the position, orientation, altitude, and angle of view indicated by the acquired shooting information, and for each overlapping portion adopts, for example, the pixels of one of the images, thereby generating an image of the entire field.
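The "adopt the pixels of one image for the overlap" rule could be sketched as pasting tiles into a field-wide canvas. The pixel coordinates and the first-wins policy below are illustrative assumptions, not the embodiment's exact stitching method:

```python
import numpy as np

def mosaic(canvas_shape, tiles):
    """Build an entire-field image from (row, col, image) tiles.
    Where tiles overlap, the pixels of the first-pasted image are
    kept (a 'first wins' version of adopting one image's pixels)."""
    canvas = np.full(canvas_shape, np.nan)
    for row, col, img in tiles:
        h, w = img.shape
        region = canvas[row:row + h, col:col + w]
        # Only fill pixels not already covered by an earlier tile.
        mask = np.isnan(region)
        region[mask] = img[mask]
    return canvas

tile_a = np.ones((2, 3))
tile_b = np.full((2, 3), 2.0)
m = mosaic((2, 5), [(0, 0, tile_a), (0, 2, tile_b)])
# Column 2 overlaps; tile_a's pixel is kept there.
```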
- The field image generation unit 101 assigns a pixel ID to each pixel of the generated entire-field image.
- The field image generation unit 101 then supplies entire-field image data indicating the pixel IDs and the entire-field image to the sunny area determination unit 102 and the index calculation unit 103.
- the sunny area determination unit 102 determines the sunny area of the farm field included in the photographed field image.
- the sunny area determination unit 102 is an example of the “determination unit” in the present invention.
- the sunny area determination unit 102 determines the sunny area included in the field image captured by the imaging unit 204 of the drone 20 as described above.
- the sunny area determination unit 102 determines the sunny area based on the pixel value of each pixel of the entire field image indicated by the supplied entire field image data.
- The sunny area determination unit 102 calculates, for example, the HSV color space values (hue H, saturation S, brightness V) of each pixel from the R, G, and B components of its pixel values.
- From the calculated HSV values of the pixels, the sunny area determination unit 102 determines, for example, areas of objects of the same color (for example, a crop area and a soil area) in which the differences in hue H and saturation S fall within predetermined ranges.
- Within each same-color area determined in this way, the sunny area determination unit 102 extracts pixels at which the brightness V changes by a difference threshold or more.
- Pixels extracted in this way are likely to indicate a boundary between sun and shade.
- Of the areas separated by such a boundary, the area whose brightness V is equal to or greater than a brightness threshold is determined to consist of sunlit pixels and is determined to be a sunny area; the sunny area determination unit 102 determines any area not determined to be sunny as a shaded area.
- The sunny area determination unit 102 may determine the sunny and shaded areas using, for example, a difference threshold and a brightness threshold that depend on the hue H and saturation S of each area (in a leaf area, both thresholds are increased compared with a soil area).
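As a minimal sketch of the brightness-based classification, a single global threshold stands in below for the boundary extraction and per-color thresholds described above; the threshold value and all names are assumptions:

```python
import colorsys

def classify_sunny(pixels_rgb, brightness_threshold=0.6):
    """Label each (R, G, B) pixel (components 0-255) as 'sunny' or
    'shade' by converting to HSV and comparing brightness V to a
    threshold. A per-area threshold (e.g. higher for leaf areas
    than for soil) could replace the single constant used here."""
    labels = []
    for r, g, b in pixels_rgb:
        _h, _s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        labels.append("sunny" if v >= brightness_threshold else "shade")
    return labels

# A brightly lit leaf pixel vs. the same leaf in shadow.
print(classify_sunny([(200, 220, 120), (40, 60, 30)]))  # → ['sunny', 'shade']
```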
- FIG. 6 shows an example of the determination result of the sunny area.
- In FIG. 6, the shaded area G1 caused by the forest R1 appears on the southwest side, and the remaining area is shown as the sunny area F1.
- When the sunny area determination unit 102 has made the determination as described above, it generates sunny information indicating the pixel IDs of the pixels included in the area determined to be sunny and shade information indicating the pixel IDs of the pixels included in the area determined to be shaded.
- the sunny area determination unit 102 supplies the generated sunny information and shade information to the growth information generation unit 104 together with the entire field image data.
- the index calculation unit 103 calculates an index representing the growth status of the crop shown in the image from the field image acquired by the field image generation unit 101.
- the index calculation unit 103 is an example of the “calculation unit” in the present invention.
- the index calculation unit 103 calculates the above-described NDVI as an index indicating the growth status.
- the image capturing unit 204 of the drone 20 captures an image of the farm field, and the index calculation unit 103 calculates NDVI as described above, thereby measuring the state of crop cultivation in the field.
- The index calculation unit 103 generates a pixel-unit NDVI map representing the NDVI at the position on the field corresponding to each pixel.
- FIG. 7 shows an example of an NDVI map in pixel units. In the example of FIG. 7, the NDVI map M1 in pixel units of the field A1 shown in FIG. 5 is represented.
- the index calculation unit 103 supplies the NDVI map M1 representing the calculated NDVI to the growth information generation unit 104 together with the entire field image data as index information representing the calculated crop growth status.
- the growth information generation unit 104 generates growth information indicating the growth status of the crop in the field using the supplied sunny information, shade information, index information (for example, the NDVI map M1) and the entire field image data.
- the growth information generation unit 104 generates, in particular, sunny growth information indicating the growth status of the crop in the sun and shade growth information indicating the growth status of the crop in the shade.
- the growth information generation unit 104 generates such information as follows, for example.
- The growth information generation unit 104 acquires, as a sunny index, the NDVI in the sunny area determined by the sunny area determination unit 102 (the NDVI at each position included in the sunny area) from among the NDVIs of the pixels indicated by the NDVI map M1 supplied from the index calculation unit 103.
- Here, a position may range from the location represented by one pixel of the entire-field image to a location represented by a plurality of pixels; in either case it represents a region with a certain extent.
- The sunny index refers to an index representing the growth status of a crop, measured from an image of the crop in the sunny area.
- The growth information generation unit 104 acquires, as the sunny index, the NDVI associated with the pixel IDs indicated by the sunny information from among the NDVIs indicated by the index information.
- The growth information generation unit 104 acquires, as a shade index, the index of the area not determined to be sunny (the index at each position not determined to be in the sunny area).
- the shade index refers to an index representing the growth status of a crop measured from a crop image in the shaded area.
- the growth information generation unit 104 acquires, as a shade index, the NDVI associated with the pixel ID indicated by the shade information among the NDVIs indicated by the index information.
- the growth information generation unit 104 is an example of the “index acquisition unit” in the present invention.
- In other words, the growth information generation unit 104 acquires, as the sunny index (the NDVI at positions included in the sunny area of the field), the NDVI calculated for the sunny area from among the NDVI calculated by the index calculation unit 103 for the entire-field image. Likewise, the growth information generation unit 104 acquires, as the shade index (the NDVI at positions included in the shaded area of the field), the NDVI calculated for the shaded area.
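The selection of sunny and shade indices by pixel ID might look like the following sketch; the dictionary-based data layout and function name are assumptions:

```python
def split_indices(ndvi_by_pixel, sunny_ids):
    """Split a pixel-ID -> NDVI mapping into a sunny index and a
    shade index: NDVI values whose pixel ID appears in the sunny
    information become the sunny index, the rest the shade index."""
    sunny = {pid: v for pid, v in ndvi_by_pixel.items() if pid in sunny_ids}
    shade = {pid: v for pid, v in ndvi_by_pixel.items() if pid not in sunny_ids}
    return sunny, shade

ndvi_map = {1: 0.8, 2: 0.7, 3: 0.4, 4: 0.3}
sunny, shade = split_indices(ndvi_map, sunny_ids={1, 2})
# sunny holds the sunny index, shade the shade index.
```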
- Using the acquired sunny index and shade index, the growth information generation unit 104 generates, for each of the sunny area and the shaded area, an area-unit NDVI map representing the growth status of the crop in each of a plurality of areas dividing the field A1.
- FIGS. 8A and 8B show examples of area-unit NDVI maps. FIG. 8A shows the area-unit NDVI map Mf1 for the sunny area F1 shown in FIG. 6, and FIG. 8B shows the area-unit NDVI map Mg1 for the shaded area G1 shown in FIG. 6.
- each segmented area is represented by an eight-level pattern (Lv1 is the smallest and Lv8 is the largest) according to the average value of NDVI.
- The NDVI map Mf1 indicates that the NDVI is larger, and the growth state better, toward the northeast side of the field A1; the NDVI map Mg1 shows the same tendency. However, in the NDVI map Mg1 the NDVI is smaller even in segmented areas representing the same imaging areas as in the NDVI map Mf1, because those areas are shaded.
- the NDVI of the segmented area Hc14 representing the imaging area C14 is Lv6 (third from the largest) in the NDVI map Mf1, but Lv4 (fifth from the largest) in the NDVI map Mg1.
- the NDVI of the segmented area Hc19 representing the imaging area C19 is Lv7 (second from the largest) in the NDVI map Mf1, but is Lv5 (fourth from the largest) in the NDVI map Mg1.
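Averaging NDVI per segmented area and mapping the mean to the eight display levels could be sketched as follows; the linear mapping of a [0, 1] mean onto Lv1..Lv8 is an assumption, since the embodiment does not specify the level boundaries:

```python
def area_level(ndvi_values, n_levels=8):
    """Average the NDVI of the pixels in one segmented area and map
    the mean (assumed to lie in [0, 1]) to a display level Lv1..Lv8
    (Lv1 smallest, Lv8 largest)."""
    mean = sum(ndvi_values) / len(ndvi_values)
    level = min(n_levels, max(1, int(mean * n_levels) + 1))
    return mean, f"Lv{level}"

# The same imaging area scores higher in the sun than in the shade:
print(area_level([0.8, 0.7, 0.9])[1])  # → Lv7
print(area_level([0.4, 0.5, 0.3])[1])  # → Lv4
```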
- The growth information generation unit 104 generates, as the above-mentioned sunny growth information, the generated NDVI map Mf1 associated with a symbol (a sunny identification symbol) indicating that it was generated from the acquired sunny index, the shooting information of the underlying entire-field image, and the shooting date and time. Likewise, it generates, as the above-mentioned shade growth information, the generated NDVI map Mg1 associated with a symbol (a shade identification symbol) indicating that it was generated from the acquired shade index, together with the shooting information and shooting date and time of the original entire-field image. The growth information generation unit 104 supplies the generated sunny growth information and shade growth information to the growth information recording unit 105.
- The growth information recording unit 105 records the sunny growth information and the shade growth information generated by the growth information generation unit 104 as growth information of the crop in the field (information indicating the growth status of the crop).
- The growth information recording unit 105 holds the recorded growth information in a state where the user (farmer) can browse it (for example, on a web page accessible via a URL (Uniform Resource Locator) transmitted to the user).
- The growth information recording unit 105 provides, for example, a screen for searching the held growth information.
- FIGS. 9A to 9D show an example of the growth information search screen.
- the user terminal 30 displays a growth information search screen including input fields for time, field name, and sunshine conditions.
- In the figure, the search conditions "sunny", "field A1", and "2017/5/15" have been entered.
- When the user performs an operation of pressing the search button H1, the user terminal 30 transmits to the server device 10 request data requesting the sunny growth information generated from the captured image of the field A1 whose shooting date and time is "2017/5/15".
- the growth information recording unit 105 reads the requested sunny growth information from the recorded growth information, and transmits the read sunny growth information to the user terminal 30.
- the user terminal 30 displays the NDVI map Mf1 indicated by the transmitted sunny growth information as shown in FIG. 9B.
- When "shade" is specified as the sunshine condition, the user terminal 30 transmits to the server device 10 request data requesting the shade growth information generated from the captured image of the field A1 whose shooting date and time is "2017/5/15".
- the growth information recording unit 105 reads the requested shade growth information from the recorded growth information, and transmits the read shade growth information to the user terminal 30.
- the user terminal 30 displays the NDVI map Mg1 indicated by the transmitted shade growth information as shown in FIG. 9D.
- the sunny growth information is information in which the NDVI map Mf1 representing the sunny index (NDVI of the sunny area) acquired by the growth information generating unit 104 is associated with the sunny identification symbol indicating that it is an index of the sunny area.
- by outputting (transmitting) the sunny growth information to the user terminal 30, the growth information recording unit 105 outputs the acquired sunny index so that it can be identified as the index of the sunny area.
- the shade growth information is information in which the NDVI map Mg1 representing the shade index (NDVI of the shaded area) acquired by the growth information generation unit 104 is associated with the shade identification symbol indicating that it is an index of the shaded area.
- by outputting (transmitting) the shade growth information to the user terminal 30, the growth information recording unit 105 outputs the acquired shade index so that it can be identified as the index of the shaded area.
- the growth information recording unit 105 is an example of the “output unit” in the present invention.
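The association described above, an NDVI map tied to an identification symbol, shooting information, and a shooting date and time, could be represented as a simple record; this is a minimal sketch in which all field names are purely illustrative and not from the source:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class GrowthInfo:
    """One piece of growth information held by the recording unit (illustrative)."""
    ndvi_map: list          # region-unit NDVI values (e.g., NDVI map Mf1 or Mg1)
    identification: str     # "sunny" or "shade" identification symbol
    field_name: str         # shooting information: which field was photographed
    shot_at: datetime       # shooting date and time

# One sunny record and one shade record for the same field and date.
sunny_info = GrowthInfo([[0.5, 0.6]], "sunny", "A1", datetime(2017, 5, 15))
shade_info = GrowthInfo([[0.3, 0.2]], "shade", "A1", datetime(2017, 5, 15))
print(sunny_info.identification, shade_info.identification)
```

A search like the one in FIG. 9A would then amount to filtering held records by `field_name`, `shot_at`, and `identification`.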
- FIG. 10 shows an example of the operation procedure of each apparatus in the recording process. This operation procedure is started when a farmer who is a user takes the drone 20 to the field and performs an operation for starting a shooting flight. First, the drone 20 (flight control unit 201, flight unit 202, and sensor measurement unit 203) starts flying over the field based on the stored field range information (step S11).
- the drone 20 starts photographing each photographing region from above the field (step S12), and every time photographing is performed, generates image data indicating the photographed still image and the photographing information (position, azimuth, and altitude at the time of photographing) and transmits it to the server device 10 (step S13).
- the server device 10 (field image generation unit 101) acquires the image of the field indicated by the transmitted image data (step S14).
- the server device 10 (the farm field image generation unit 101) generates an image of the entire farm field by combining the acquired farm field images (step S21). Subsequently, the server device 10 (the sunny area determination unit 102) determines the sunny area and the shade area of the field included in the generated image of the entire field (Step S22). Next, the server device 10 (index calculation unit 103) calculates an index (NDVI) indicating the growth status of the crop reflected in the image from the generated image of the entire field (step S23).
- steps S22 and S23 may be performed in the reverse order or in parallel.
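The per-pixel index calculation of step S23 can be sketched with the standard NDVI formula, NDVI = (IR − R) / (IR + R), computed from the red and near-infrared pixel values; the function name and the zero-denominator handling are illustrative assumptions, not taken from the source:

```python
import numpy as np

def compute_ndvi(red, nir):
    """Compute NDVI = (IR - R) / (IR + R) for each pixel.

    red, nir: arrays of red and near-infrared pixel values.
    Returns an array of NDVI values in [-1.0, 1.0].
    """
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    denom = nir + red
    # Avoid division by zero for completely dark pixels; define NDVI as 0 there.
    ndvi = np.where(denom > 0, (nir - red) / np.where(denom > 0, denom, 1), 0.0)
    return ndvi

# A healthy crop pixel reflects more near-infrared than red light.
print(compute_ndvi([10.0], [30.0]))  # (30-10)/(30+10) = 0.5
```

Applying this over every pixel of the whole-field image yields the per-pixel index from which the later maps are built.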
- the server device 10 (growth information generation unit 104) generates the sunny growth information from the index calculated for the determined sunny area (step S24). Next, the server device 10 (growth information generation unit 104) generates the shade growth information from the index calculated for the determined shaded area (step S25).
- the operations in steps S24 and S25 may be performed in the reverse order or in parallel.
- the server device 10 (growth information recording unit 105) records the growth information of the sun and the shade growth information generated in steps S24 and S25 as the growth information of the crop in the field (step S26). Then, when there is an access (request) from the user terminal 30, the server device 10 (growth information recording unit 105) outputs the growth information recorded in step S26 to the user terminal 30 (step S31).
- in the shaded area, the reflected light is weaker and the pixel values are smaller than in the sunny area, so even for the same pixel-value error, the resulting NDVI error is larger. The accuracy of NDVI therefore tends to be lower in the shaded area than in the sunny area.
- in the present embodiment, by acquiring the NDVI only for the sunny area, it is possible to support an appropriate determination of the growth status in a field where sunlit and shaded areas are mixed, compared with the case where the sunny and shaded areas are not distinguished.
- moreover, in the present embodiment, the NDVI for the shaded area alone is also acquired, and the NDVI map for the shaded area is also output.
- if the NDVI of the shaded area is compared with that of the sunny area, the shaded area tends to be judged as growing poorly; but by comparing NDVI values within the shaded area alone, places where the growth status is good or bad within the shaded area can be identified (for example, in the NDVI map Mg1 of FIG. 8B, the northeast side can be said to have a better growth status).
- modifications that use different parameters for obtaining a common value may also be combined, and the common value or the like may be obtained using those parameters together.
- in the embodiment, the growth information generation unit 104 acquires both the sunny index and the shade index.
- the present invention is not limited to this.
- for example, only the sunny index may be acquired.
- in this case, the growth information generation unit 104 generates only the NDVI map Mf1 representing the acquired sunny index, and outputs the sunny growth information indicating the NDVI map Mf1 to the user terminal 30. Even in this case, since NDVI values within the sunny area alone can be compared, it is possible to support appropriate judgment of the growth status of crops in the sunny areas of a field where shade is mixed.
- NDVI is calculated for each pixel of the entire field, but the present invention is not limited to this.
- the sunny area determination unit 102 determines only the sunny area and supplies the generated sunny information to the index calculation unit 103 together with the entire field image data.
- the index calculation unit 103 calculates NDVI only for the pixels in the sunny area indicated by the supplied sunny information, and supplies a map representing the calculated NDVI of the sunny area, together with the entire field image data, to the growth information generation unit 104 as index information.
- the growth information generation unit 104 acquires the NDVI supplied in this way, that is, the NDVI calculated only for the sunny area in the field image as the sunny index. Thereby, the load of the process (NDVI calculation process) of the server device 10 can be reduced as compared with the case where the NDVI of the shaded area is also calculated.
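The load reduction described here, computing NDVI only for pixels inside the sunny area, might be sketched as follows; the boolean sunny mask and the use of NaN for skipped pixels are illustrative assumptions:

```python
import numpy as np

def ndvi_sunny_only(red, nir, sunny_mask):
    """Compute NDVI only where sunny_mask is True; other pixels are left as NaN."""
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    ndvi = np.full(red.shape, np.nan)   # shaded pixels are never computed
    r, n = red[sunny_mask], nir[sunny_mask]
    ndvi[sunny_mask] = (n - r) / (n + r)
    return ndvi

red  = np.array([[10.0, 5.0], [8.0, 4.0]])
nir  = np.array([[30.0, 6.0], [24.0, 5.0]])
mask = np.array([[True, False], [True, False]])  # left column is the sunny area
print(ndvi_sunny_only(red, nir, mask))
```

Restricting the division to the masked pixels is what saves the computation that would otherwise be spent on the shaded area.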
- the growth information generation unit 104 generates an NDVI map in units of areas using the area corresponding to the imaging range as the segmented area in the embodiment, but the segmented area is not limited to this.
- a plurality of shooting ranges may be used as one segmented region, or a region corresponding to a divided region obtained by dividing one shooting region into a plurality of segments may be used as the segmented region.
- the shape and size of the segmented regions may be uniform or non-uniform.
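Generating an NDVI map in units of segmented regions, as the growth information generation unit 104 does, amounts to averaging the per-pixel NDVI over each region; this is a minimal sketch assuming a uniform grid of regions (the grid layout and function name are assumptions):

```python
import numpy as np

def ndvi_region_map(ndvi, rows, cols):
    """Average per-pixel NDVI over a rows x cols grid of segmented regions.

    ndvi: 2-D array whose height and width are divisible by rows and cols.
    Returns a rows x cols array of region-mean NDVI values.
    """
    h, w = ndvi.shape
    blocks = ndvi.reshape(rows, h // rows, cols, w // cols)
    return blocks.mean(axis=(1, 3))

ndvi = np.array([[0.2, 0.4, 0.6, 0.8],
                 [0.2, 0.4, 0.6, 0.8]])
print(ndvi_region_map(ndvi, 1, 2))  # [[0.3, 0.7]]
```

Non-uniform regions, such as one region per shooting range, would replace the reshape with a per-region mask, but the averaging step is the same.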
- in the embodiment, a rotary-wing aircraft was used as the autonomously flying vehicle, but the vehicle is not limited to this.
- it may be an airplane type aircraft or a helicopter type aircraft.
- the function of autonomous flight is not essential; as long as the aircraft can fly the assigned flight airspace in the assigned flight permission period, a radio-controlled (wirelessly operated) aircraft operated by a pilot from a remote location may be used, for example.
- in the embodiment, the determination of the sunny area and the calculation of NDVI are performed based on the image taken by the drone 20 during flight, but the present invention is not limited to this. These determinations and calculations may be performed based on, for example, an image manually captured by an operator using a digital camera, an image captured by a fixed digital camera installed on a field, or an image captured from a satellite.
- the NDVI is calculated using the measured value of the image sensor of the imaging device 27 of the drone 20, but the present invention is not limited to this.
- for example, the NDVI may be calculated using the measured values of an infrared sensor of a handy-type NDVI measuring instrument. Since the handy type can calculate NDVI from the reflected light of the crop alone, its accuracy tends to be high. Which method to use may be determined in consideration of labor, cost, and required accuracy.
- NDVI was used as an index indicating the growth status, but the present invention is not limited to this.
- for example, a leaf color value (a value indicating the color of a leaf), a planting rate (the occupation rate of the planted region per unit area), SPAD (chlorophyll content), plant height, the number of stems, or the like may be used.
- any value may be used as an index representing the growth status as long as it represents the growth status of the crop and can be calculated from the captured crop region image.
- the intensity of the reflected light from the crops that reaches the image sensor of the photographing device 27 can change not only with sun and shade but also with the surrounding environment, such as the intensity of sunlight in each season, the amount of clouds, and atmospheric conditions. It may also change with the state of the lens of the photographing device 27, which is affected by temperature, humidity, and the like. Correction may therefore be performed to eliminate the change in the index (NDVI) caused by the change in the intensity of the reflected light reaching the image sensor due to those factors.
- FIG. 11 shows a functional configuration realized in this modification.
- a server device 10 a including a corrected image acquisition unit 106 in addition to the units illustrated in FIG. 4 is illustrated.
- the corrected image acquisition unit 106 acquires an image for index correction taken by the drone 20.
- the corrected image acquisition unit 106 is an example of the “correction acquisition unit” in the present invention.
- the index correction image is, for example, an image obtained by photographing a panel having a plurality of regions in which the reflectance of light having a specific wavelength is known in advance.
- FIG. 12 shows an example of an index correction image.
- an image obtained by photographing the panel J1 in which the specific reflectance regions J11, J12, J13, and J14 are represented on the surface is represented as an index correction image.
- each specific reflectance region is, for example, a region on the panel where the reflectance of red light and infrared rays changes stepwise (for example, the red-light and infrared reflectances of J11, J12, J13, and J14 are both 20%, 40%, 60%, and 80%, respectively).
- the farmer causes the drone 20 to photograph the panel J1 before the operation for starting the photographing flight or after the completion of the photographing flight.
- when photographing the panel, the drone 20 may be flying, or the user may lift the drone and photograph it.
- the imaging unit 204 of the drone 20 transmits image data indicating the captured image of the panel J1 to the server device 10a.
- the corrected image acquisition unit 106 acquires the image of the panel J1 indicated by the image data thus transmitted as an index correction image.
- the corrected image acquisition unit 106 supplies the acquired index correction image to the index calculation unit 103.
- the index calculation unit 103 calculates a corrected index based on the supplied image, that is, the index correction image acquired by the corrected image acquisition unit 106.
- the index calculation unit 103 reads out, as measurement values, the red pixel values (r11, r12, r13, r14) of the specific reflectance regions J11, J12, J13, and J14 included in the acquired index correction image and the pixel values (ir11, ir12, ir13, ir14) of light with wavelengths in the near-infrared region.
- the red pixel values (R11, R12, R13, R14) and the near-infrared pixel values (IR11, IR12, IR13, IR14) obtained when the specific reflectance regions J11, J12, J13, and J14 are photographed in an environment where NDVI can be measured satisfactorily are stored in the server device 10a as reference values. If the measured values differ from the reference values, the index calculation unit 103 determines a correction formula for correcting the measured values to the reference values.
- having determined the correction formulas for R and IR as described above, the index calculation unit 103 corrects the pixel value of each pixel using the determined correction formulas.
- the index calculation unit 103 uses the corrected R and IR pixel values of each pixel to calculate NDVI (corrected NDVI in this modification) as in the embodiment.
- the correction of the pixel value may be performed by a function other than the index calculation unit 103.
- the pixel value of each pixel may be corrected when the agricultural field image generation unit 101 generates an entire agricultural field image.
- the NDVI correction method is not limited to the above-described method, and other known methods may be used.
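One simple way to determine a correction formula from the panel's measured and reference values is a least-squares linear fit; this is an illustrative sketch, not the patent's prescribed method, and the four sample values are invented for the example:

```python
import numpy as np

# Measured red pixel values of regions J11..J14 on this flight, and the
# stored reference values from an environment where NDVI measures well.
measured_r  = np.array([40.0, 85.0, 130.0, 175.0])
reference_r = np.array([50.0, 100.0, 150.0, 200.0])

# Fit a linear correction formula: reference ≈ a * measured + b.
a, b = np.polyfit(measured_r, reference_r, 1)

def correct_red(pixel_value):
    """Correct a measured red pixel value toward the reference scale."""
    return a * pixel_value + b

print(correct_red(85.0))  # ≈ 100.0 for this example data
```

The same fit would be repeated for the near-infrared channel, and the corrected R and IR values then feed the usual NDVI calculation.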
- FIG. 13 shows a functional configuration realized in this modification.
- a server device 10 b including a sunshine condition specifying unit 107 in addition to the units illustrated in FIG. 11 is illustrated.
- the sunshine condition specifying unit 107 specifies whether the shooting condition of the index correction image is sunny or shaded.
- the sunshine condition specifying unit 107 is an example of the “condition specifying unit” in the present invention.
- a screen for inputting shooting conditions is displayed on the user terminal 30.
- FIG. 14 shows an example of an imaging condition input screen.
- the user terminal 30 displays an agricultural support system screen including input fields for a shooting date, a field name, and a sunshine condition.
- in this example, the photographing conditions "2018/5/15", "field A1", and sunshine condition "shade" are input.
- the user terminal 30 transmits to the server device 10b photographing condition data indicating that the photographing date and time is "2018/5/15", the photographing place is "field A1", and the sunshine condition is "shade".
- the sunshine condition specifying unit 107 specifies the sunshine condition ("shade" in the example of FIG. 14) indicated by the transmitted photographing condition data as the shooting condition of the index correction image captured at the date and time and in the field indicated by that data.
- the sunshine condition specifying unit 107 notifies the index calculation unit 103 of the specified shooting conditions together with the shooting date and time and the farm field.
- when "sunny" is specified as the shooting condition, the index calculation unit 103 calculates, based on the index correction image acquired by the corrected image acquisition unit 106, a corrected index for the portion of the field image determined to be a sunny area (the index of the portion determined to be a shaded area is not corrected).
- the sunny area determination unit 102 supplies the generated sunny information and shade information to the index calculation unit 103 together with the entire field image data.
- the index calculation unit 103 corrects the pixel value of the pixel in the sunny area indicated by the supplied sunny information among the pixels indicated by the entire field image data as described in the description of FIG.
- the index calculation unit 103 calculates the NDVI in the same manner as in the embodiment, using the pixel value of each pixel indicating the corrected sunny area.
- when "shade" is specified as the shooting condition, the index calculation unit 103 calculates, based on the index correction image acquired by the corrected image acquisition unit 106, a corrected index for the portion of the field image not determined to be a sunny area (the portion determined to be a shaded area); the index of the portion determined to be the sunny area is not corrected. In this case, the index calculation unit 103 corrects the pixel values of the pixels in the shaded area indicated by the supplied shade information, among the pixels indicated by the entire field image data, in the same manner as described above, and calculates the NDVI.
- in this way, the NDVI of the sunny area is corrected when the panel J1 was photographed in the sun, and the NDVI of the shaded area is corrected when the panel J1 was photographed in the shade.
- this improves the accuracy of the correction that uses the image of the panel J1 (the index correction image).
- the NDVI of the area whose sunshine condition differs from the specified one is not corrected here, but another correction may be performed for that area; such a correction will be described later.
- FIG. 15 shows a functional configuration realized in this modification.
- a server device 10 c including a flight instruction unit 108 in addition to the units illustrated in FIG. 4 is illustrated.
- the flight instruction unit 108 instructs the drone 20 on the shooting method for the position that is not determined to be the sunny area, that is, the position that is determined to be the shaded area.
- the flight instruction unit 108 is an example of the “instruction unit” in the present invention.
- for example, when a shaded area is determined from the field image captured after the shooting flight is completed, the flight instruction unit 108 instructs the drone 20 to re-photograph the positions determined to be the shaded area.
- the flight instruction unit 108 instructs the re-imaging so that photographing starts at a timing such that the photographing flight of the shaded area is completed before the scheduled end time.
- specifically, the flight instruction unit 108 calculates the time required for shooting the shaded area (the shooting time) based on the area of the shaded area and the distance from the shooting start position, and instructs the drone to start re-shooting at a time that precedes the scheduled end time by at least the calculated shooting time.
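The start-time computation just described, backing off from the scheduled end time by the estimated shooting duration, is simple arithmetic; this sketch assumes illustrative coverage and travel rates that are not from the source:

```python
from datetime import datetime, timedelta

def latest_reshoot_start(scheduled_end, shaded_area_m2, distance_m,
                         coverage_m2_per_min=50.0, speed_m_per_min=200.0):
    """Latest time re-shooting can start and still finish by scheduled_end.

    The shooting time is estimated from the shaded area's size and the travel
    distance from the shooting start position (both rates are assumptions).
    """
    shoot_minutes = shaded_area_m2 / coverage_m2_per_min + distance_m / speed_m_per_min
    return scheduled_end - timedelta(minutes=shoot_minutes)

end = datetime(2018, 5, 15, 16, 0)  # scheduled end of the flight: 4:00 pm
print(latest_reshoot_start(end, shaded_area_m2=1000.0, distance_m=400.0))
```

With these example rates the shaded area takes 20 minutes plus 2 minutes of travel, so re-shooting must begin by 3:38 pm.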
- the flight instructing unit 108 may instruct the drone 20 to shoot at a different shooting time on another day.
- specifically, when a shaded area is determined from the field image captured after the shooting flight is completed, the flight instruction unit 108 stores the imaging time in association with the field ID (information identifying the field) and the device ID (information identifying the drone 20).
- the farmer who is the user takes the drone 20 to the field but, instead of performing the shooting-flight start operation himself, performs, for example, a flight standby operation.
- the drone 20 transmits to the server device 10c state data indicating that it is in the flight standby state, the field ID (stored in advance by the farm worker), and the device ID.
- the flight instruction unit 108 instructs the drone 20 to use, as the shooting start time, a time different from the shooting time stored in association with the field ID and the device ID indicated by the state data.
- as the shooting start time, for example, a time separated from the past shooting time by a predetermined time (a time by which the shaded area changes sufficiently) is used. For example, if the past shooting time is 10:00 am and the predetermined time is 5 hours, the flight instruction unit 108 instructs 3:00 pm as the shooting start time.
- the flight instruction unit 108 gives an instruction with 11:00 am as the shooting start time.
- the flight control unit 201 of the drone 20 starts the shooting flight when the shooting start time indicated by the received instruction data arrives. In this case as well, since part of the positions that were the shaded area at the time of the past shooting can be photographed as the sunny area, the number of sunny-area pixels from which NDVI is calculated can be increased compared with the case where no shooting-time change instruction is given.
- the index calculation unit 103 may correct an index calculated from a pixel in the shaded area (an index of the shaded area) to the index that would be expected if the pixel were in the sunny area.
- the index calculation unit 103 performs this correction by comparing, for example, the pixel value of an image taken when the same crop is in the sunny area with the pixel value of an image taken when it is in the shaded area.
- FIG. 16 shows an example of the photographing range of the photographing means installed in this modification.
- an entire field image E1 of the field A1 is shown.
- the entire farm field image E1 is an image taken at an early time in the afternoon in which the shadow (shade region G1) of the forest R1 shown in FIG. 5 extends north.
- the fixed camera K1 is installed at a position where it photographs the photographing region C41, which becomes shaded within the shaded region G1 but hardly includes the shaded region G2.
- the fixed camera K1 repeatedly photographs at a predetermined time in the morning (a time when the photographing region C41 becomes a sunny area) and at a predetermined time during the day (a time when the photographing region C41 becomes a shaded area), for example every day or every week.
- FIG. 17 shows a functional configuration realized in this modification.
- a server device 10 d that includes an image acquisition unit 109 in addition to the units illustrated in FIG. 4 is illustrated.
- the fixed camera K1 has a communication function, and transmits image data indicating a captured image to the server device 10d.
- the image acquisition unit 109 acquires an image indicated by the transmitted image data, that is, an image of a fixed area (a fixed area in which the sunny area and the shaded area in the field are switched) captured by the fixed camera K1.
- the image acquisition unit 109 is an example of the “second image acquisition unit” in the present invention.
- the image acquisition unit 109 supplies the acquired image of the fixed area to the index calculation unit 103.
- the index calculation unit 103 calculates a corrected index for the portion of the field image not determined to be the sunny area (the portion determined to be the shaded area), based on the correlation between the sunny-area index and the shaded-area index obtained from the image of the fixed area acquired by the image acquisition unit 109. For example, the index calculation unit 103 calculates the ratio (index ratio) between the NDVI calculated for a pixel that captures a specific part of a crop in the field while it is in the sunny area and the NDVI calculated for the same pixel while it is in the shaded area.
- the index calculation unit 103 converts the index values, for example, into the range from 0.0 to 2.0, and then calculates the index ratio (a value from 0 to 1.0). For example, the index calculation unit 103 obtains an expression representing the correlation between the pixel value in the shaded area and the index ratio.
- FIG. 18 shows an example of the correlation of the index ratio.
- FIG. 18 shows a graph in which the pixel value in the shaded area is shown on the horizontal axis and the index ratio is shown on the vertical axis.
- a correlation is shown in which the index ratio decreases as the pixel value increases.
- the index calculation unit 103 obtains an approximate expression indicating this correlation using a known method.
- the correlation is represented linearly, but may be represented by a quadratic curve or may be represented by a cubic or higher curve.
- the index calculation unit 103 calculates the corrected NDVI for each pixel in the shaded area, using an expression representing the correlation thus obtained.
- that is, for each pixel in the shaded area, the index calculation unit 103 divides the index by the index ratio given by this expression for the pixel's value, thereby correcting it to the index that would be expected if the pixel were in the sunny area.
- by correcting the index of the shaded area as described above, even for an area that hardly becomes a sunny area throughout the day, such as the shaded area G2 shown in FIG. 16, the growth status can be shown with higher accuracy than when the correction of this modification is not performed.
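The index-ratio correction of this modification can be sketched as a linear fit of the index ratio against the shaded pixel value, followed by a division; the sample data below are invented to mimic the decreasing trend of FIG. 18, and the names are assumptions:

```python
import numpy as np

# Pairs observed by the fixed camera: shaded pixel value vs. index ratio
# (shade index / sunny index, with NDVI shifted into the 0.0-2.0 range).
pixel_values = np.array([20.0, 40.0, 60.0, 80.0])
index_ratios = np.array([0.95, 0.90, 0.85, 0.80])  # decreases as value grows

# Approximate the correlation with a straight line, as in FIG. 18.
slope, intercept = np.polyfit(pixel_values, index_ratios, 1)

def corrected_shade_index(shifted_ndvi, pixel_value):
    """Correct a shaded pixel's shifted NDVI (0.0-2.0) to its sunny-equivalent."""
    ratio = slope * pixel_value + intercept
    return shifted_ndvi / ratio

print(corrected_shade_index(1.4, 40.0))  # 1.4 / 0.90 ≈ 1.556
```

A quadratic or higher-order fit, as the text allows, would only change the `polyfit` degree; the division step is unchanged.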
- the index calculation unit 103 may correct the index of the shaded area by a method different from the above modification.
- the index calculation unit 103 performs the correction focusing on the boundary portion between the portion determined to be a sunny area and the portion not determined to be a sunny area (the portion determined to be a shaded area) in the field image acquired by the field image generation unit 101.
- in this modification, too, the index calculation unit 103 calculates a corrected index for the portion of the field image not determined to be the sunny area (the portion determined to be the shaded area). Specifically, for example, the index calculation unit 103 identifies, as boundary pixels, the pixels on the boundary line between the sunny area and the shaded area determined by the sunny area determination unit 102, and compares the NDVI of a sunny pixel adjacent to a boundary pixel on the sunny-area side (an example of a sunny area within a predetermined range) with that of a shaded pixel adjacent to the boundary pixel on the shaded-area side (an example of a shaded area within a predetermined range).
- FIG. 19 shows an example of a sunny pixel and a shaded pixel.
- the sunny pixel Df11 on the sunny area F1 side and the shaded pixel Dg11 on the shaded area G1 side are represented.
- a plurality of sunny pixels and shaded pixels are represented along the boundary line between the sunny region F1 and the shaded region G1.
- in FIG. 19, not all sunny pixels and shaded pixels are labeled; the pixels located between the labeled sunny pixels and shaded pixels are also sunny pixels and shaded pixels, respectively.
- the index calculation unit 103 calculates NDVI for each of these sunlit pixels and shaded pixels, and obtains an expression indicating the correlation between the index ratio and the pixel value of the shaded pixel, as in the example of FIG. After that, the index calculation unit 103 calculates the corrected NDVI of the shaded area in the same manner as in the modified example.
- in the above example, the pixels adjacent to the boundary pixels are used as the sunny area and the shaded area (pixels) within the predetermined range of the boundary pixels, but the present invention is not limited to this. Pixels located one or more pixels away from the boundary pixel may be used as the sunny area and the shaded area. In short, any range in which the growth status appears to be substantially uniform in the entire field image may be used as the predetermined range.
- in this modification as well, the index of a pixel in the shaded area can be corrected to the index that would be expected if the pixel were in the sunny area.
- moreover, in this modification, the correction can be performed without installing a fixed photographing means such as the fixed camera K1.
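Identifying the adjacent sunny/shaded pixel pairs along the boundary from a sunny mask could be sketched as follows; this hypothetical illustration works on a one-dimensional row of pixels for simplicity:

```python
import numpy as np

def boundary_pairs(sunny_mask):
    """Find adjacent (sunny, shaded) pixel-index pairs along a 1-D mask.

    sunny_mask: 1-D boolean array (True = sunny area).
    Returns a list of (sunny_index, shaded_index) pairs at each boundary.
    """
    pairs = []
    for i in range(len(sunny_mask) - 1):
        if sunny_mask[i] and not sunny_mask[i + 1]:
            pairs.append((i, i + 1))      # sunny pixel left of the boundary
        elif not sunny_mask[i] and sunny_mask[i + 1]:
            pairs.append((i + 1, i))      # sunny pixel right of the boundary
    return pairs

mask = np.array([True, True, False, False, True])
print(boundary_pairs(mask))  # [(1, 2), (4, 3)]
```

The NDVI values of each pair would then feed the same index-ratio fit as in the fixed-camera modification; a 2-D version would scan both rows and columns of the mask.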
- the imaging device provided in the drone 20 is not limited to the above.
- it may have a zoom function (the resolution can be increased to improve the accuracy of NDVI), or it may have sensitivity specialized for red and infrared.
- the pixel values of light (blue, green) of other wavelengths may be restored by correcting the spectrum.
- the growth information recording unit 105 may output the sunny index and the shade index by a method different from the embodiment.
- the growth information recording unit 105 may, for example, output the sunny index to a folder or database prepared for sunny indexes and output the shade index to a folder or database prepared for shade indexes.
- the output destination of the growth information recording unit 105 is not limited to the user terminal 30; it may output, for example, to a storage device that accumulates sunny indexes and shade indexes, to an analysis device that analyzes the growth status and predicts the future from the sunny and shade indexes, or to a visualization device that performs visualization processing (such as generating graphs and maps) that facilitates comparison of the growth status.
- the growth information recording unit 105 may output the growth information to any output destination as long as it leads to supporting the person who performs the work in the field.
- the apparatus that implements each function shown in FIG. 4 and the like may differ from the apparatus shown in those figures.
- the drone may have all or some of the functions of the server device.
- in that case, the processor of the drone is an example of the "information processing apparatus" of the present invention.
- the user terminal 30 may realize the function of the server device.
- the user terminal 30 is an example of the “information processing apparatus” of the present invention.
- each function may be performed by another function or may be performed by a new function.
- the growth information generation unit 104 may perform the operation performed by the index calculation unit 103 (index calculation operation).
- alternatively, a newly provided output unit may perform the output of the sunny index and the shade index that the growth information recording unit 105 performs.
- Two or more devices may realize each function provided in the server device. In short, as long as these functions are realized as the entire agricultural support system, the agricultural support system may include any number of devices.
- the present invention can also be understood as an information processing system such as an agricultural support system equipped with a flying object.
- the present invention can be understood as an information processing method for realizing processing performed by each device, or as a program for causing a computer that controls each device to function.
- this program may be provided in the form of a recording medium such as an optical disk storing it, or may be provided by being downloaded to a computer via a network such as the Internet and installed so that it can be used.
- Input / output information, etc. may be stored in a specific location (for example, a memory) or managed by a management table. Input / output information and the like can be overwritten, updated, or additionally written. The output information or the like may be deleted. The input information or the like may be transmitted to another device.
- software, whether referred to as software, firmware, middleware, microcode, hardware description language, or another name, should be interpreted broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executable files, execution threads, procedures, functions, and the like.
- software, instructions, etc. may be transmitted / received via a transmission medium.
- for example, when software is transmitted from a website, server, or other remote source using wired technology such as coaxial cable, fiber optic cable, twisted pair, and digital subscriber line (DSL) and/or wireless technology such as infrared, radio, and microwave, these wired and/or wireless technologies are included within the definition of transmission media.
- notification of predetermined information is not limited to being performed explicitly; it may also be performed implicitly (for example, by not performing the notification of the predetermined information).
Abstract
An object of the present invention is to support appropriate evaluation of a growth state in an agricultural field with mixed shading. A field image generation unit (101) acquires an image of a field captured by a drone (20). A sunny area determination unit (102) determines a sunny area of the field included in the acquired field image. An index calculation unit (103) calculates, from the acquired field image, an index representing the growth state of the crops appearing in the image. A growth information generation unit (104) acquires, as a sunny index, an NDVI index at a position included in the determined sunny area among the calculated NDVI indices. The growth information generation unit (104) acquires, as a shade index, an index at a position not determined to be a sunny area. Using the acquired sunny and shade indices, the growth information generation unit (104) generates, separately for sunny and shaded areas, an area-unit NDVI index map representing the growth state of the crops in each of a plurality of areas into which the field (A1) is divided.
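The pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the patented implementation: it assumes the standard NDVI formula, NDVI = (NIR − Red) / (NIR + Red), a simple brightness threshold for the sunny/shade split, and an integer label image for the field subdivision. All function names are hypothetical.

```python
import numpy as np

def ndvi_map(red, nir, eps=1e-9):
    """Per-pixel NDVI from red and near-infrared bands (eps avoids division by zero)."""
    red = red.astype(float)
    nir = nir.astype(float)
    return (nir - red) / (nir + red + eps)

def split_sunny_shade(brightness, threshold):
    """Boolean mask: True where a pixel is bright enough to be treated as sunny."""
    return brightness >= threshold

def region_ndvi(ndvi, sunny_mask, regions):
    """Mean NDVI per region label, computed separately for sunny and shaded pixels.

    Returns {label: (sunny_mean, shade_mean)}; a mean is None when the region
    has no pixels of that kind.
    """
    result = {}
    for label in np.unique(regions):
        in_region = regions == label
        sunny = ndvi[in_region & sunny_mask]
        shade = ndvi[in_region & ~sunny_mask]
        result[int(label)] = (
            float(sunny.mean()) if sunny.size else None,
            float(shade.mean()) if shade.size else None,
        )
    return result
```

Keeping the sunny and shade averages separate mirrors the abstract's point: in a field with mixed shading, shaded pixels systematically depress reflectance-based indices, so pooling them with sunlit pixels would misrepresent the crop's growth state.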
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2020523619A JP7218365B2 (ja) | 2018-06-06 | 2019-05-23 | 情報処理装置 |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2018108475 | 2018-06-06 | ||
| JP2018-108475 | 2018-06-06 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019235240A1 true WO2019235240A1 (fr) | 2019-12-12 |
Family
ID=68770226
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2019/020508 Ceased WO2019235240A1 (fr) | 2018-06-06 | 2019-05-23 | Dispositif de traitement d'informations |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP7218365B2 (fr) |
| WO (1) | WO2019235240A1 (fr) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP7657751B2 (ja) * | 2022-03-11 | 2025-04-07 | ヤンマーホールディングス株式会社 | 圃場情報管理方法、圃場情報管理システム、及び圃場情報管理プログラム |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2006129492A1 (fr) * | 2005-06-03 | 2006-12-07 | Honda Motor Co., Ltd. | Dispositif de reconnaissance de vehicule et panneau routier |
| WO2009116613A1 (fr) * | 2008-03-21 | 2009-09-24 | 株式会社 伊藤園 | Procédé, dispositif et système pour évaluer l'aptitude à la cueillette de feuilles de thé et support utilisable par ordinateur |
| JP2012183021A (ja) * | 2011-03-04 | 2012-09-27 | Hitachi Ltd | 植生制御装置、植物育成システム |
| WO2018034166A1 (fr) * | 2016-08-17 | 2018-02-22 | ソニー株式会社 | Dispositif de traitement d'un signal et procédé de traitement d'un signal, et programme |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPWO2019082519A1 (ja) * | 2017-10-26 | 2020-11-19 | ソニー株式会社 | 情報処理装置、情報処理方法、プログラム、情報処理システム |
2019
- 2019-05-23 WO PCT/JP2019/020508 patent/WO2019235240A1/fr not_active Ceased
- 2019-05-23 JP JP2020523619A patent/JP7218365B2/ja active Active
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2021108650A (ja) * | 2020-01-15 | 2021-08-02 | 国立研究開発法人農業・食品産業技術総合研究機構 | 農作物関連値導出装置および農作物関連値導出方法 |
| JP2021171057A (ja) * | 2020-04-20 | 2021-11-01 | 国立研究開発法人農業・食品産業技術総合研究機構 | 農作物関連値導出装置および農作物関連値導出方法 |
| JP7044285B2 (ja) | 2020-04-20 | 2022-03-30 | 国立研究開発法人農業・食品産業技術総合研究機構 | 農作物関連値導出装置および農作物関連値導出方法 |
| WO2024190391A1 (fr) * | 2023-03-13 | 2024-09-19 | コニカミノルタ株式会社 | Dispositif d'identification, procédé d'identification et programme |
| CN118759987A (zh) * | 2024-07-25 | 2024-10-11 | 深圳英莱能源科技有限公司 | 一种育苗用种植环境智能调控系统 |
Also Published As
| Publication number | Publication date |
|---|---|
| JPWO2019235240A1 (ja) | 2021-07-08 |
| JP7218365B2 (ja) | 2023-02-06 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11651478B2 (en) | Methods for agronomic and agricultural monitoring using unmanned aerial systems | |
| US11763441B2 (en) | Information processing apparatus | |
| JP7218365B2 (ja) | 情報処理装置 | |
| US12111251B2 (en) | Information processing apparatus, information processing method, program, and sensing system | |
| US11823447B2 (en) | Information processing apparatus, information processing method, program, and information processing system | |
| JP2018046787A (ja) | 農業管理予測システム、農業管理予測方法、及びサーバ装置 | |
| JP2020149201A (ja) | 作物の倒伏リスク診断に用いる生育パラメータの測定推奨スポット提示方法、倒伏リスク診断方法、および情報提供装置 | |
| JP7643341B2 (ja) | 情報処理装置、情報処理方法、プログラム | |
| US20220414362A1 (en) | Method and system for optimizing image data for generating orthorectified image | |
| AU2016339031A1 (en) | Forestry information management systems and methods streamlined by automatic biometric data prioritization | |
| JP7366887B2 (ja) | 情報処理装置 | |
| JP2008136411A (ja) | リモートセンシングにおける補正方法 | |
| Latif et al. | Mapping wheat response to variations in N, P, Zn, and irrigation using an unmanned aerial vehicle | |
| JP2025089328A (ja) | プログラム、方法、情報処理装置、システム | |
| JP7587282B2 (ja) | 方法、プログラム及び情報処理装置 | |
| CN116128953B (zh) | 用于确定作物叶面积指数的方法、装置及处理器 | |
| US20250256815A1 (en) | Mobile Agricultural Holding Management | |
| Sorenson | Evaluation of unmanned aerial vehicles and analytical software for creation of a crop consulting business | |
| KR20250120086A (ko) | 팜맵을 기반으로 하는 재배작물 판독 결과 표출 시스템 및 그 방법 | |
| KR20240086048A (ko) | 드론을 활용한 생육이미지 데이터 가공 방법 및 시스템 | |
| WO2021149355A1 (fr) | Dispositif de traitement d'informations, procédé de traitement d'informations et programme |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19815402; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2020523619; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 19815402; Country of ref document: EP; Kind code of ref document: A1 |