WO2015139091A1 - System for detecting target animals in a protected area
- Publication number: WO2015139091A1 (PCT/AU2015/050119)
- Authority: WIPO (PCT)
- Prior art keywords: uav, tracking path, data representing, protected area, target
- Legal status: Ceased
Classifications
- A01M29/06 — Scaring or repelling devices, e.g. bird-scaring apparatus, using visual means, e.g. scarecrows, moving elements, specific shapes, patterns or the like
- A01M29/16 — Scaring or repelling devices, e.g. bird-scaring apparatus, using sound waves
- A01M31/002 — Hunting appliances; detecting animals in a given area
- G05D1/0094 — Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots, involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
- G05D1/106 — Simultaneous control of position or course in three dimensions, specially adapted for aircraft; change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
- G06V20/13 — Terrestrial scenes; satellite images
- G06V20/17 — Terrestrial scenes taken from planes or by drones
Abstract
A system for detecting target animals in a protected area, said system for performing the steps of: converting visual light into a sequence of digital images; identifying objects in the sequence of digital images; for each one of said objects, generating object data representing at least one of position, velocity and size; and detecting target animals by comparing object data generated for each one of said objects with known object data for target animals.
Description
SYSTEM FOR DETECTING TARGET
ANIMALS IN A PROTECTED AREA
Technical Field of the Invention
The present invention relates to a system for detecting target animals in a protected area.
Background of the Invention
There are a wide variety of protective and deterrent systems available for inhibiting animal damage to assets in agricultural and industrial settings. These systems are generally classed as either active or passive deterrent systems. Active deterrent systems include use of bioacoustic devices, gas cannons and scarecrows. Such systems include use of auditory and/or visual alerts to deter animals from protected areas. These systems have previously been susceptible to habituation and animals can quickly become accustomed to their presence. This results in animals ignoring or avoiding the deterrents and roaming around in protected areas.
Passive deterrent systems include netting for fruit crops, bird spikes and spring wires, for example. Passive deterrents are used to protect areas by physically limiting animal access. The passive deterrents are often considered much more effective as they minimise the impact of animal habituation. However, they are typically much more time consuming and labour intensive to implement, maintain and repair. Invariably, passive deterrents offer improved protection when compared to active deterrents. However, they are still susceptible to behavioural adaptations. Inquisitive animal nature and repeated interaction with deterrents leads to animals overcoming static deterrents, resulting in animals entering protected areas. A combined lack of responsiveness and a lack of physical repercussions, like those from natural predators, may not have previously promoted instinctual self-preservation or avoidance beyond an initial uncertainty about a deterrent's appearance in the area.
Falconry and culling offer a physical threat and produce a much more immediate form of animal control. However, these approaches are often considered inhumane and impractical due to the scale, type and duration of animal control required. It is generally desirable to provide an easily installed, low maintenance, adaptive animal observation and deterrent approach for agricultural and industrial applications.
It is generally desirable to overcome or ameliorate one or more of the above mentioned difficulties, or at least provide a useful alternative.
Summary of the Invention
In accordance with the invention, there is provided a system for detecting target animals in a protected area, said system for performing the steps of:
(a) converting light into a sequence of digital images;
(b) identifying objects in the sequence of digital images;
(c) for each one of said objects, generating object data representing at least one of:
(i) position;
(ii) velocity; and
(iii) size; and
(d) detecting target animals by comparing object data generated for each one of said objects with known object data for target animals.
Preferably, including the step of generating tracking data for detected target animals including one or more of:
(a) speed;
(b) size; and
(c) direction.
Preferably, including the step of generating terrain data representing a map of the protected area including height and position of identified permanent obstacles.
In accordance with the invention, there is also provided a system for protecting an asset from target animals in a protected area, said system for performing the steps
of:
(a) detecting a target animal in the protected area by performing the above described steps;
(b) generating data representing a tracking path for an unmanned aerial vehicle (UAV) to intercept the target animal based on:
(i) a known location of the UAV;
(ii) said terrain data representing a map of the protected area; and
(iii) data representing target animal position;
(c) sending data representing the tracking path for the target animal to the UAV; and
(d) the UAV flying from said known location along the tracking path to intercept the target animal,
wherein the target animals are deterred from entering the protected area by the UAV.
Preferably, the asset is a crop at an agricultural site.
Preferably, including the steps of:
(a) monitoring position and velocity of the UAV during flight along the tracking path;
(b) monitoring position and velocity of the target animal;
(c) amending the tracking path if the target animal has moved;
(d) amending the tracking path if the UAV has deviated from the tracking path; and
(e) if the tracking path has been amended, sending data representing the amended tracking path to the UAV.
In accordance with the invention, there is also provided a system for intercepting a target animal with an unmanned aerial vehicle (UAV) in a protected area, said system for performing the steps of:
(a) receiving data representing a tracking path for the target animal; and
(b) launching the UAV and flying from a known location along the tracking path to intercept the target animal,
wherein the target animals are deterred from entering the protected area by the UAV.
In accordance with the invention, there is also provided a computer program for detecting target animals in a protected area, said program for performing the steps of:
(a) converting data representing light into a sequence of digital images;
(b) identifying objects in the sequence of digital images;
(c) for each one of said objects, generating object data representing at least one of:
(i) position;
(ii) velocity; and
(iii) size; and
(d) detecting target animals by comparing object data generated for each one of said objects with known object data for target animals.
Preferably, including the step of generating tracking data for detected target animals including one or more of:
(a) speed;
(b) size; and
(c) direction.
Preferably, including the step of generating terrain data representing a map of the protected area including height and position of identified permanent obstacles.
In accordance with the invention, there is also provided a computer program for protecting an asset from target animals in a protected area, said program for performing the steps of:
(a) detecting a target animal in the protected area by performing the above described steps;
(b) generating data representing a tracking path for an unmanned aerial vehicle (UAV) to intercept the target animal based on:
(i) a known location of the UAV;
(ii) said terrain data representing a map of the protected area; and
(iii) data representing target animal position;
(c) sending data representing the tracking path for the target animal to the UAV; and
(d) the UAV flying from said known location along the tracking path to intercept the target animal,
wherein the target animals are deterred from entering the protected area by the UAV.
In accordance with the invention, there is also provided a computer program for
intercepting a target animal with an unmanned aerial vehicle (UAV) in a protected area, said program for performing the steps of:
(a) receiving data representing a tracking path for the target animal; and
(b) launching the UAV and flying from a known location along the tracking path to intercept the target animal,
wherein the target animals are deterred from entering the protected area by the UAV.
In accordance with the invention, there is also provided a non-transitory computer readable data storage including the above described computer program stored thereon.
Brief Description of the Drawings
Preferred embodiments of the present invention are hereafter described, by way of non-limiting example only, with reference to the accompanying drawings in which:
Figure 1 is a schematic diagram of a system for detecting target animals in a protected area;
Figure 2 is another schematic diagram of the system shown in Figure 1;
Figure 3 is a block diagram showing the components of the system shown in Figure 1;
Figure 4 is a block diagram showing data flow within a camera unit of the system shown in Figure 1;
Figure 5 is a block diagram showing data flow between component parts of an unmanned aerial vehicle of the system shown in Figure 1;
Figure 6 is a block diagram showing data flow between component parts of a base station of the system shown in Figure 1;
Figure 7 is a flow diagram showing processes performed by the system shown in Figure 1;
Figure 8 is a flow diagram showing processes performed by the system shown in Figure 1;
Figure 9 is a flow diagram showing processes performed by the system shown in Figure 1; and
Figure 10 is a diagrammatic illustration of a base station of the system shown in Figure 1.
Detailed Description of Preferred Embodiments of the Invention
The system 10 shown in Figures 1 and 2 is used to detect target animals in a protected area 15. The system is used to perform the steps of:
(a) converting visual light into a sequence of digital images;
(b) identifying objects in the sequence of digital images;
(c) for each one of the objects, generating object data representing at least one of:
(i) position;
(ii) velocity; and
(iii) size; and
(d) detecting target animals by comparing object data generated for each one of said objects with known object data for target animals.
Preferably, the system 10 also performs the step of generating tracking data for detected target animals including speed, size, and direction. Preferably, the system 10 also performs the step of generating terrain data representing a map of the protected area including height and position of identified permanent obstacles.
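By way of illustration only, the comparison of step (d) could be sketched as follows. The data structure, class names and profile values below are assumptions made for the example and form no part of the disclosure.

```python
# Hypothetical sketch: matching generated object data against known object
# data for target animals. All profile values are invented placeholders.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ObjectData:
    position: tuple   # (x, y) coordinates within the observable area
    velocity: float   # metres per second
    size: float       # approximate area in square metres

# Known object data for target animals (illustrative ranges only)
TARGET_PROFILES = {
    "bird":     {"velocity": (2.0, 25.0), "size": (0.01, 0.5)},
    "kangaroo": {"velocity": (0.0, 15.0), "size": (0.5, 2.0)},
}

def detect_target(obj: ObjectData) -> Optional[str]:
    """Return the matching target class, or None if no profile matches."""
    for name, profile in TARGET_PROFILES.items():
        v_lo, v_hi = profile["velocity"]
        s_lo, s_hi = profile["size"]
        if v_lo <= obj.velocity <= v_hi and s_lo <= obj.size <= s_hi:
            return name
    return None
```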
The system 10 is also used to protect an asset 11 from ta rget animals in the protected area 15. The system 10 performs the following steps in that regard : detecting a target animal in the protected area 15 by performing the above described steps;
generating data representing a tracking path for an unmanned aerial vehicle (UAV) 14 to intercept the target animal based on :
(i) a known location of the UAV;
(ii) the terrain data representing a map of the protected area; and
(iii) data representing target animal position;
sending data representing the tracking path for the target animal to the UAV 14; and
the UAV 14 flying from said known location along the tracking path to intercept the target animal.
The detect and deter system 10 is for use in observing and deterring animals from agricultural and industrial sites. Camera units 12 use specialised algorithms to identify, track and record animal movements, while mobile units 14 engage and deter animals from the area. As shown, the camera unit 12 is separate from the UAV 14. However, in an alternative embodiment, the camera unit 12 is coupled to (or forms part of) the UAV 14.
Mobile units 14 move freely within the air space surrounding the site 15. A high decibel auditory response mounted on each mobile unit 14 acts as a physical deterrent, making it unbearable for the animal/s to remain within the area 15. This approach provides a greater resistance to stubborn pests and animal habituation than static deterrent methods, as the approach is highly variable and the physical effects are unavoidable. Each unit 14 can function independently, as part of a group, or via a remote system. Internal and external sensor data evaluates local environmental conditions, aids object identification, assists flight control and optimises unit performance and life expectancy.
System Overview
With reference to Figures 1 to 3, the system 10 is used to protect assets 11, such as grapes growing in a vineyard. The system 10 includes:
1. a camera unit 12 which houses optics and processing electronics to detect animals and objects within the observable area 13 and the protected area 15;
2. one or more unmanned aerial vehicles 14 (UAV) (also referred to throughout this specification as mobile deterrents 14, mobile units 14 or mobile devices 14), each being a device capable of unaided flight, for use in approaching, harassing and deterring animals from the protected area 15;
3. a base station 16 including a protective container 18 for storing and recharging the mobile deterrents 14;
4. a power generation station 20; and
5. a computer device 22 operating as a remote server for offsite data storage of events, system errors and operation information and for acquiring software updates.
Again, as shown, the camera unit 12 is separate from the UAV 14. However, in an alternative embodiment, the camera unit 12 is coupled to (or forms part of) the UAV 14.
Camera Unit 12
With reference to Figure 4, the camera unit 12 is used to identify, track and record animal movements within the observable area 13 and includes:
1. a mounting pole 100 anchored to a floor or ground surface 102 using standard anchoring techniques;
2. a head unit 104 coupled to a distal end of the pole 100, including :
a. a lens 106 which acts as an initial point of data/image acquisition;
b. an image sensor 108 for converting light entering the lens 106 into digital signals (specifically sequential images and pixel data), whereby the lens 106 controls light entering the image sensor 108;
c. memory 110 for temporary/short term storage of image data during initial conversion/image processing;
d. a converter/image processor 112 for conducting initial processing of pixel data (may include data manipulation, comparisons and compression);
e. system memory 114 which is used by the processor 112 to store processed image data such as object locations, terrain schematics and system information;
f. a processor 116 for final data/image processing. The processor 116 computes object tracking, pattern matching, terrain layouts, best approach scenarios and device communications;
g. a battery 118 which functions as a 12 volt power supply linked to base unit supply;
h. a wireless transmitter/receiver 120 for effecting wireless communications link to external devices. The transmitter/ receiver 120 transmits pertinent sensor and system information between devices, as well as for offsite data storage; and
i. a communications link 122 to the base station for receiving system information and controlling base station peripherals; and
3. a computer device 22 operating as a remote server for offsite data storage of events, system errors and operation information and for acquiring software updates.
The mounting pole 100 is of suitable height to provide the head unit 104 with an optimal view of the protected area 15. Again, in an alternative embodiment, the camera unit 12 is coupled to (or forms part of) the UAV 14. In this embodiment, the camera unit 12 is positioned on the UAV 14 to achieve a visual vantage over the protected area 15.
In this embodiment, the camera unit 12 functions in an analogous way to when it is attached to the post 100. However, some resources would be shared. For example, the image sensor and lens apparatus are housed within the drone unit 14 while processing power and wireless communications could be shared.
The image data is above described as being processed within the camera unit 12 (as pre-processing) before being sent to a primary processor for the algorithms. In this embodiment, this is transferred directly into the drone hardware and completed within the drone 14 itself.
As a further alternative, the drone 14 remains a dummy unit, sending/streaming data from its sensors to a main/hub processor situated in the base station 16. This would mean the camera unit 12 section of the drone 14 would complete initial digitisation (some image processing). The images (pixel data) would then be communicated wirelessly to the base station 16, be processed and animal/tracking information determined, and the necessary details/tracking/flight paths sent back to control the drone 14.
Mobile Deterrent 14
Mobile units 14 deter animals from entering the protected area 15. Mobile units 14 move freely within the air space surrounding the assets 11. Preferably, a high decibel auditory generator (not shown) is mounted on each mobile unit 14 which makes it unbearable for the animals to remain within the protected area 15. Advantageously, this approach provides a greater resistance to stubborn pests and animal habituation than static deterrent methods, as the approach is highly variable and the physical effects are unavoidable.
Each unit 14 can function independently or as part of a group. Internal and external sensor data evaluates local environmental conditions, aids object identification, assists flight control and optimises unit performance and life expectancy.
With reference to Figure 5, the mobile deterrent 14 includes the following components:
1. a visual identifier 200 including light emitting diodes (LEDs) used to illuminate and clearly identify a mobile device when in flight;
2. a flight controller 202 for controlling flight stability and propulsion based on sensor and directional data;
3. a propulsion system 204 for controlling flight and directional movement;
4. flight control sensors 206 for providing on-going data as to the movement of the mobile unit 14 (for flight control feedback) such as: pitch, roll, yaw, velocity, acceleration, altitude etc.;
5. memory 208 including data storage for device information, internal and external sensor data and communications information;
6. a processor 210 for managing wireless communications and data requests, and for computing and relaying internal and external sensor information (computing data as necessary);
7. a wireless transmitter/receiver 209 for effecting wireless communications link to external devices;
8. a deterrent generator 212 including a high decibel auditory device;
9. a data converter 214 for converting/processing of sensor data, including image processing, analogue/digital data conversion and compression;
10. internal sensors 216 for confirming component operation and functionality, such as operating temperatures, mechanical/motion, and electrical conductivity;
11. external sensors 218 for acquiring information from the surrounding environment as the device moves through the area. Logged sensor data can be used for system/environment observation and to assist flight control;
12. a battery monitor 220 for monitoring power consumption;
13. a battery 222 including a small lightweight battery to supply power to the device; and
14. recharge plates 224 for enabling recharging of device.
With regard to item 6, in the embodiment where the camera unit 12 is coupled to, or part of, the drone 14, the "sensor information" includes images forming camera data sent to the base station.
Base Station 16
The base station is used to house the mobile deterrents 14, recharge their batteries and to pass on flight instructions. With reference to Figure 6, the base station 16 includes:
1. a processor 300 for linking to camera unit 12 for data requests and controls devices/peripherals;
2. a communications link 302 to the camera unit 12;
3. enclosure actuators 304 which operate mechanical aspects of the base station such as door motors, used to protect or enclose the device;
4. charging circuits 304 for controlling the recharging process of mobile units 14;
5. recharge plates 306 for enabling recharging of device 14;
6. a data converter 308 for converting/processing of sensor data, including image processing, analogue/digital data conversion and compression;
7. external sensors 310 for acquiring information from the surrounding environment as the device moves through the area. Logged sensor data can be used for system/environment observation and to assist flight control;
8. a battery monitor 312 for monitoring power consumption;
9. a battery 314; and
10. power generation station 18 including:
a. regulator circuits 316 for converting external power supply to usable levels; and
b. a power source 318 which acts as an external source of power and may include solar, wind, convection, etc.
Camera Unit 12 - Imaging Algorithms
The camera unit 12 performs the following steps 500 shown in Figure 7:
1. image acquisition 502 which is the process of converting visual light into sequential digital images (frames);
2. initial image processing 504 which simplifies and separates frames into usable form for task specific processing;
3. object detection 506 for detecting motion and speed of objects and determining object types (animal, deterrent unit, other);
4. image analysis 508 whereby additional information sought from the acquired images is used in identifying the condition of plants, weather or other objects;
5. mapping of terrain 510 such as permanent obstacles, including height and position;
6. location and tracking data 512 for acquiring target object information including speed, size, direction;
7. outputting 514 processed data into external data storage; and
8. further comparing and evaluating 516 the data acquired to improve accuracy and reduce false identification and system errors.
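The steps 502 to 516 can be pictured as one repeating loop, as in the sketch below. The stage helpers are placeholders for the processing the patent assigns to the converter 112 and processor 116; none of the names constitute a defined API.

```python
# Hypothetical top-level loop for the Figure 7 imaging chain (steps 502-516).
# 'sensor', 'stages' and 'storage' are placeholder objects supplied by the
# caller; each call is commented with the step it stands in for.
def imaging_loop(sensor, stages, storage, terrain_map):
    while True:
        frame = sensor.acquire()                       # step 502: acquisition
        simplified = stages.preprocess(frame)          # step 504: initial processing
        objects = stages.detect_objects(simplified)    # step 506: motion, speed, type
        extras = stages.analyse(simplified)            # step 508: plants, weather, etc.
        stages.map_terrain(terrain_map, simplified)    # step 510: obstacle height/position
        tracks = stages.track(objects)                 # step 512: speed, size, direction
        storage.write(objects, extras, tracks)         # step 514: external data storage
        stages.evaluate(objects, tracks)               # step 516: reduce false identification
```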
Camera Unit 12 - Deterrent Algorithms
The camera unit 12 performs the following steps 600 shown in Figure 8:
1. detecting 602 an object/animal which requires deterring;
2. determining 604 a best effort deterrent approach path using terrain information and object location information;
3. communicating 606 tracking information to mobile unit 14 for launching the mobile unit 14;
4. monitoring 608 flight status and health of mobile unit 14 during operation;
5. communicating 610 updated location information to mobile unit 14;
6. activating 612 the deterrent generator when in close proximity to animal;
7. communicating 614 return to base station information to mobile unit 14; and
8. enabling charging 616 of mobile unit 14 at the base station 16.
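In the same spirit, the deterrent routine of Figure 8 reduces to the control flow sketched below; the method names are placeholders for the wireless commands described above, not part of the disclosure.

```python
# Hypothetical control flow for steps 602-616. The camera unit drives the
# sequence; 'unit' stands for the selected mobile deterrent 14.
def deter(camera, unit, base_station):
    animal = camera.detect_animal()                # step 602
    path = camera.best_approach_path(animal)       # step 604: terrain + location
    unit.launch(path)                              # step 606
    while camera.inside_protected_area(animal):
        camera.monitor_health(unit)                # step 608
        unit.update_path(camera.locate(animal))    # step 610
        if camera.near(unit, animal):
            unit.activate_deterrent()              # step 612
    unit.return_to(base_station)                   # step 614
    base_station.charge(unit)                      # step 616
```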
Mobile Unit 14 - Flight / Movement Control Algorithm
The mobile unit 14 performs the following steps 700 shown in Figure 9:
1. receiving, at step 702, flight path data from the camera unit 12;
2. receiving, at step 704, data from external sensors providing proximity, location and object detection information;
3. altering, at step 708, the predetermined flight path based on the data from the external sensors (see the sketch following this list);
4. supplying proximity and object detection information to the flight control system;
5. computing necessary flight control sensor, external and flight path data to manoeuvre the mobile unit throughout the protected area;
6. flight control sensors transmitting data (feedback), at step 710, to the flight control system relating to the physical movements of the device; and
7. the internal and external sensor data being communicated, at step 712, to the camera unit 12 to enable safe, on-going operation of mobile units 14.
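The path alteration of step 708 can be illustrated with a minimal sketch; the safety threshold and the climb-step response are assumptions rather than values from the specification.

```python
# Illustrative step 708: deviating from the received flight path when an
# external range finder reports an obstacle inside a safety margin.
SAFE_DISTANCE_M = 2.0      # assumed clearance threshold

def adjust_waypoint(waypoint, obstacle_distance_m, climb_step_m=1.0):
    """waypoint is (x, y, z). Climb if a proximity sensor reports an object
    within the safety margin; otherwise keep the camera unit's path. Any
    deviation is reported back at step 712 so the path can be amended."""
    x, y, z = waypoint
    if obstacle_distance_m < SAFE_DISTANCE_M:
        return (x, y, z + climb_step_m)
    return waypoint
```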
System 10 Processes (Functionality)
The system 10 generally includes the following processes:
1. Mobile Unit 14 Wakeup / Start Up (Device Initialisation);
2. Camera Unit 12
a. Image Analysis (Mapping/Analysis/Object Detection) Operation;
b. Event Logging and Identification; and
3. Animal Deterring by the Camera Unit 12 and the Mobile Unit 14.
A description of these processes is set out below in detail with reference to the accompanying drawings.
1. System Wake/Start-up (Device Initialisation)
When initially connected to a power source (i.e. when switched on), each mobile device 14 starts in a low power/sleep mode. All peripherals, sensors and components (that have this capability) will be kept this way during inactive periods to minimise power consumption. The mobile devices 14 will then be:
a. enabled/woken by the processor 116 (main system processor) of the camera unit 12;
b. initialised;
c. used as necessary; and
d. returned to the sleep state, as deemed necessary by the main system processor 116 of the camera unit.
Mobile Device 14 On / Off Switch
Each mobile device 14 includes an on/off power switch (not shown). The power switch connects a latching circuit (not shown) to the power terminal of the main processor 210 of the mobile device 14. The latch locks a second power circuit connection (not shown) to the processor 210 that provides power to the processor 210 and simultaneously triggers an interrupt pin waking the main processor 210 from its sleep mode. When the switch is later turned off, the processor 210 of the mobile device 14 does not lose power immediately, but rather is again triggered by an interrupt circuit to complete its shutdown procedure and (using an IO pin on the processor) release the latch (see shutdown procedure below).
The processes performed by the processor of each mobile device 14 are summarised below:
a. Initialising the Main Processor;
b. Device Activation;
c. Systems Checking;
d. Wireless Connectivity; and
e. Error Checking.
A description of each one of these processes is set out below in further detail.
a. Initialising the Main Processor
Once power is supplied to the main processor 210 in the above described manner, and it is in sleep mode, it can be woken in two ways:
i. via a sensor trip circuit; or
ii. via an internal timer (i.e. a countdown, or specific clock time).
An internal timer (not shown) will provide the processor 210 with a time stamp for all events that are logged or require approximate timed intervals. Each device 14 may also be set to operate during set periods of time based on the time of day. Events triggered by the timer will cause interrupts to wake the processor 210 and/or put the processor 210 into sleep mode (see below).
Alternatively, external sensors 218 may be used in a similar fashion to trigger a wake circuit externally. A wake or interrupt pin on the processor 210 can be used to identify additional needs to wake the device 14 and signal that the device 14 should be operating under current conditions regardless of time restraints (i.e. a light sensor indicating that it is light).
The sensor circuitry is used in conjunction with the timer and either may be prioritised based on the intended application of the unit. During standard operation the sensor circuit may change, but will only trigger an initialisation process when the unit is in sleep/low power mode.
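A minimal sketch of this prioritisation is given below, assuming a daylight operating window and a light sensor as the trip source; both are illustrative choices, and in the device this logic sits behind hardware interrupt circuits rather than software polling.

```python
# Hypothetical wake decision combining the internal timer with a sensor trip.
WAKE_HOUR, SLEEP_HOUR = 6, 20          # assumed operating window (time of day)

def should_wake(hour_of_day: int, light_sensor_tripped: bool) -> bool:
    """Wake on the scheduled timer, or when a sensor trip (e.g. a light
    sensor indicating that it is light) overrides the time restraints."""
    in_window = WAKE_HOUR <= hour_of_day < SLEEP_HOUR
    return in_window or light_sensor_tripped
```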
b. Device Activation
The main processor 210, upon waking, will activate (pull high) a similar wake circuit initialising all applicable components (that were in a sleep/low power mode).
Each processor 210 will initialise peripherals using an IO processor pin connected to sleep/wake pins on each component. Pulling this circuit high (or low) will initialise/enable the mobile device's image processors 200, wireless devices 209, sensors 216, 218, error checking signals/sensors, power management circuits 220, etc.
The mobile devices 14 will be woken via a short pulse of power being supplied to the charge plates 224 of each unit 14. The charge plates 224 are primarily used for recharging. However, they are also connected to a sleep mode latching circuit, similar to the main processor 210.
Some components, such as image sensors 200, image processors and other multi stage peripherals, may require individual power on/initialise periods. These occur in line with manufacturer specifications for chipsets and similar components.
c. Systems Checking
Once all peripherals and components of the mobile devices 14 have completed their necessary power up sequences, the controlling processor 210 of the devices 14 will begin requesting sensor data and initial readings from all sensors and circuits. Sensor flags and data readings will be used to confirm correct functionality and device operation. For example, this will be done by comparing data to a lookup table of values pre-programmed into memory. Sensor data is requested and transmitted in standard communications form using standard communication protocols. This is dependent on the sensor type and communication capabilities of each peripheral. These may include I2C, USART, processor pin flags (pulled high or low) etc. Where applicable the main processors 210 will act as master, and sensors and stage two processors as slaves responding to data requests.
High data rate sensors, such as those used for flight control in the mobile devices 14, will begin actively streaming data to the processor 210. For these devices, initial readings will be averaged over multiple data packets (outliers removed) to better approximate data values.
Data readings will be stored in RAM memory 208 as an initialisation report, which will be communicated to the main processor 210 to establish the status, functionality and errors/faults of the device 14 prior to commencing operating procedures.
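A compact sketch of such a systems check is given below; the sensor names, acceptable ranges and outlier rule are invented for illustration, not taken from the specification.

```python
# Illustrative systems check: average multiple data packets with outliers
# removed, then compare against a pre-programmed lookup table of ranges.
SENSOR_LIMITS = {
    "battery_voltage": (11.0, 13.0),    # volts (placeholder range)
    "gyro_drift":      (-0.5, 0.5),     # degrees/second (placeholder range)
}

def average_without_outliers(samples, k=1.5):
    """Average readings after discarding values more than k standard
    deviations from the mean (one simple outlier-removal rule)."""
    mean = sum(samples) / len(samples)
    std = (sum((s - mean) ** 2 for s in samples) / len(samples)) ** 0.5
    kept = [s for s in samples if abs(s - mean) <= k * std] or samples
    return sum(kept) / len(kept)

def check_sensor(name, samples):
    """Return (ok, value) for inclusion in the initialisation report."""
    value = average_without_outliers(samples)
    lo, hi = SENSOR_LIMITS[name]
    return lo <= value <= hi, value
```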
d. Wireless connectivity
Wireless devices 14 will contain a static addressing system. Having completed initialisation processes, each wireless device 14 will begin polling each address for confirmation of availability and correct functionality. In doing so, the main processor 210 will request initialisation reports from each device 14.
Reports may be sent directly (via I2C to the base station 16) or wirelessly (mobile unit) to be stored in flash memory 114 linked to the camera unit processor 116.
After internal peripherals and data have been sourced, wireless devices 14 will begin locating local systems via a series of network broadcasts. Each unit will in turn determine its GPS location and provide the GPS location and contact network address to determine local neighbours and positioning.
e. Error Checking
By comparing values read directly from sensors and those within a look up table (stored in static memory), the device 14 performs basic fault finding/error checking. Data within the above-mentioned lookup tables will be pre-programmed into memory 208 and will consist of a range variables, max/min values and similar conditions with which the device 14 must operate.
In the event a device 14 sensor provides incomplete or incorrect data, the main processor 210 will be capable of requesting new readings from the specific device (via hardwired or wireless communication). If this reading again fails to meet the expectations of the lookup table it may be ignored as a non-essential element or cause a fault flag and be passed on to the user (via wireless link). Variables established during the initialisation process will provide a complete overview of the operating device 14. These variables will be used in conditional programming to allow non-essential aspects of each device 14 to be excluded from further calculations, down to a bare essentials level. The variables will include links to all sensors, active components and circuits. If a device is unable to confirm the functionality or operation of a low level component, it will be deemed unfit for operation and the system will exclude it from operation. This will produce a system fault report, which will alert the user or service crew via a TCP/IP wireless connection.
Numerous failures may be the result of extreme conditions within the area around the system 10. Requests can be made to local neighbour devices 14 to provide sensor data to establish/check sensor readings. These checks will be limited primarily to weather conditions and the like.
Some faults may be checked via standard device operation (i.e. a motor may produce a change in more than one sensor, such as the current it draws and vibrations in the device's accelerometer). These comparisons may also be used to fault find and denote any system 10 issues. These approaches will be based on pre-programmed variables and functions incorporated into each device.
2. Camera Unit 12
The camera unit's 12 primary functions are to observe the protected area 15 and to provide control operation of mobile units 14 during flight. Secondary to this is control of secondary processors (base station 16 and grounded mobile devices 14) and thirdly is the observation of sensory systems and power management.
a. Image Analysis (Mapping/Analysis/Object Detection)
A wide angle lens 106 directs reflected light from the surrounding environment through to a high-definition CMOS/CCD image sensor 108. The image sensor 108 converts detected light into a streaming sequence of images at 30 to 50 frames per second, with pixel value data of 1920x1080 in the two dimensional plane at 16-bit resolution, for example. The sensor 108 is controlled via hardwired circuitry linked directly to the main processor 116 IO pins. The streaming output from the sensor 108 is connected directly to an image converter 112 via I2C/USART (dependent on component specifications). Pixel data is streamed continuously into short-term memory/registers within the image converter 112.
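For context, the stated configuration implies a substantial raw data rate, which is why pixel data is streamed straight into converter registers and reduced early. This is a back-of-envelope calculation from the figures above, not a figure given in the patent:

```python
# Raw bandwidth for 1920x1080 frames at 16 bits per pixel, 30-50 fps.
width, height, bits = 1920, 1080, 16
for fps in (30, 50):
    bytes_per_second = width * height * (bits // 8) * fps
    print(f"{fps} fps: {bytes_per_second / 1e6:.0f} MB/s")
# 30 fps: 124 MB/s; 50 fps: 207 MB/s of raw pixel data before reduction.
```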
The image converter 112 is preferably a Field Programmable Gate Array (FPGA) or a Graphics Processing Unit (GPU). The converter 112 performs some image processing tasks such as:
i. sequential comparisons;
ii. image stabilisation and correction;
iii. data transforms; and
iv. compression conversions.
Some pixel data will be written and stored in RAM memory 110 for reuse in successive image operations.
After initial processing, image data (in a compressed form) is streamed to the main processor 116. The protocol depends on the manufacturer and components used.
For sequential image processing (up to ten frames), still frame references are created and stored within RAM memory 110 (specific to the image converter). These references will continually vary over time and be used to improve object classification, identification and tracking algorithms.
The main (camera) processor 116 completes a simplified level of image processing, with the aim of reducing significant pixel data to point and vector data. The point data produced will be used further to identify events and will be stored in RAM memory 114.
Specific event vectors are used to denote animal position, size, movements and other pertinent information useful in identifying, tracking and deterring invasive pests. This also provides the basis for building a 3D map approximating local terrain 102 and obstacles (both permanent and temporary) within the visible area 13. This information creates the basis for flight path data and best approach calculations. Event data such as times, number of identified objects and other statistical data created during standard operation will be written to flash memory (or uploaded to the server 22 via TCP/IP wireless connection). This data will be used to improve visual detection and object classification algorithms. It will also be used to review system 10 performance and potential system 10 issues.
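One minimal way to reduce pixel data to point and vector data of this kind is frame differencing, sketched below under the assumption of greyscale numpy frames; the threshold is illustrative and the patent does not commit to a particular method.

```python
import numpy as np

def frames_to_point(prev: np.ndarray, curr: np.ndarray, threshold: int = 30):
    """Reduce two successive greyscale frames to point data: the centroid and
    pixel area of the changed region (None when nothing moved)."""
    diff = np.abs(curr.astype(np.int32) - prev.astype(np.int32))
    ys, xs = np.nonzero(diff > threshold)
    if xs.size == 0:
        return None
    return {"centroid": (xs.mean(), ys.mean()), "area": int(xs.size)}

def points_to_vector(p_prev, p_curr, dt):
    """Combine successive point observations into an event vector of the kind
    stored in RAM memory 114: position, velocity and size."""
    (x0, y0), (x1, y1) = p_prev["centroid"], p_curr["centroid"]
    return {"position": (x1, y1),
            "velocity": ((x1 - x0) / dt, (y1 - y0) / dt),
            "size": p_curr["area"]}
```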
b. Event Logging and Identification
Anomalies in pixel data will alert the camera unit 12 and cause specific areas and sequences of images to be further scrutinised. Objects causing the anomalies are identified in terms of: i. size;
ii. shape; and
iii. movement (speed, direction etc).
This information is used to classify specific types of objects.
This processing will occur within the main processor 116 where pattern recognition and similar algorithms can select small areas to evaluate quickly, with more flexibility than in the image processing unit.
Features are identified within the pixel data. That is, outlines and object edges can be determined by abrupt changes in pixel colour data. Similarly, the movement of a consistent array of data values across an image can denote a moving object, with speed, location and size.
An identified object will create a time stamped log containing information such as that described above. It will also contain progressive tracking data and information on entry and exit points.
Objects correctly fitting the classification of an animal and or unwanted pest initiates a deter function, activating one or more mobile units.
Pixel values are cross-matched with values from previously detected objects. Size can distinguish between a person and a bird. Colour can distinguish between a natural and an unnatural object such as a fluorescent jacket, a piece of machinery, or similar.
3. Animal Deterring by the Camera Unit 12 and the Mobile Unit 14
While the system 10 is active, mobile units 14 will remain on standby (sleep mode, or being recharged) in preparation for being launched. Detected animals within the protected area will flag the need to launch a deterrent 14, in line with the animal/s location. (The mobile devices 14 will be woken as described above). The processes performed by each mobile device 14 are broken up into the following routines:
a. Start Up;
b. Take Off;
c. Basic Flight;
d. Deterrent;
e. Return to Base;
f. Landing;
g. Recharging; and
h. Data Logging.
Each one of these processes is below described in more detail.
a. Start Up
In the event an animal is detected, the camera unit processor 116 will:
i. flag the object detection event;
ii. forward its current location and recent movements into a new event tracking function; and
iii. initiate a best approach function, determining the optimal means of reaching the last known location and moving the animal from the area.
Processed animal location and tracking data will be combined with existing information within the system, such as terrain maps or similar, to plan out and optimise the drone's movements to deter the animal.
The camera unit processor 116 will utilise terrain map data points from memory 114 and complete an initialisation process (described above) of the best available mobile deterrent 14 (as judged by previous data logs of battery levels, flight time, etc., all requested from the device at the end of its last use).
In doing so, the current status of available system components will be considered: that is, the availability of the drones (as determined by battery levels), and the shortest and most effective path (maximising battery life, flight time and avoidance of obstacles).
The camera unit 12 then generates a series of way points that define a path to reach the animal's location. The camera unit 12 then wirelessly sends the way points to the appropriate mobile unit 14 (static addressing discussed earlier). This is based on the best path for animal interception. They form a set of movements, way points, in which the drone will move to reach its target location. These may change during the flight in response to movement of the animal. Once the mobile device 14 has completed its initialisation process the camera unit 12 will send it an instruction for it to enter an active state (enabling motors, lights and other on-board sensory equipment). The camera unit 12 will then visually identify the mobile device 14 and begin tracking it. The mobile device 14 preferably determines its position using a GPS receiver and wirelessly communicates this data back to the camera unit 12.
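For illustration, way point generation of this kind might look as follows; the straight-line sampling, clearance margin and terrain-lookup callback are assumptions standing in for the best approach calculation described above.

```python
# Hypothetical way point generator: sample a straight line from the drone's
# known location to the animal's last known location, lifting each point
# above the camera unit's terrain map by a clearance margin.
def generate_waypoints(start, target, terrain_height, n=8, clearance_m=2.0):
    """start/target are (x, y) tuples; terrain_height(x, y) returns the mapped
    obstacle/terrain height at that point (from the 3D terrain map)."""
    waypoints = []
    for i in range(1, n + 1):
        t = i / n
        x = start[0] + t * (target[0] - start[0])
        y = start[1] + t * (target[1] - start[1])
        waypoints.append((x, y, terrain_height(x, y) + clearance_m))
    return waypoints

# Example: base at the origin, animal 40 m east, flat 1 m high canopy
path = generate_waypoints((0, 0), (40, 0), lambda x, y: 1.0)
```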
Time stamped logs are then initialised in both the camera unit 12 and mobile device 14 to establish flight event logs, for storage in memory 114, 208 as well as to update statistical information on the device 14 including:
i. flight time;
ii. performance characteristics;
iii. safety parameters; and
iv. other issues.
b. Take Off
Once active, the selected mobile unit 14 will be instructed (wirelessly by the camera unit 12) to effect take off. This process consists of:
i. the mobile unit 14 starting and progressively increasing motor 204 speeds until the device 14 lifts off; and
ii. the on-board sensors 216, 218 and flight telemetry 206 providing feedback as to the movement of the device 14 with a view to maintaining a steady heading and altitude.
Using I/O pins on the processor 210, motors 204 are linked directly to the device's lightweight power supply 222. Internal start-up sequences within the motors 204 and motor driver 202 will begin, awaiting a control signal from the processor 210 to determine motor speed. The motor controller 202 will determine motor thrust by variations of a 1 to 2 millisecond pulse (sent once during a period of 20 milliseconds) - this is a standard remote transmitter form for RC aircraft transmission.
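This pulse scheme maps thrust linearly onto pulse width within a 20 millisecond frame. A minimal sketch of the mapping (helper name assumed):

```python
# Standard RC framing: one 1-2 ms pulse per 20 ms period (50 Hz).
PERIOD_MS = 20.0

def thrust_to_pulse_ms(thrust: float) -> float:
    """Map a thrust fraction in [0.0, 1.0] to a pulse width in milliseconds:
    1.0 ms at minimum thrust, 2.0 ms at maximum."""
    thrust = max(0.0, min(1.0, thrust))
    return 1.0 + thrust

duty_cycle = thrust_to_pulse_ms(0.5) / PERIOD_MS   # 1.5 ms pulse = 7.5% duty
```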
Voltage and current sensors (in line with motor supply circuitry) oversee individual motor operation and feed data to the processor 210 upon request. Proximity/distance approximations from external range finders 218 confirm device 14 movements (specifically altitude, 0 to 4 metres) and assist with object detection and avoidance, position correction and altitude. Data requests are sent to these sensors 218 repeatedly during flight and data is sent in response to the processor 210 as a digital signal (again using standard I2C or similar protocols).
At minimum motor thrust (spinning but not yet lifted off), range/height sensor readings will not alter; only once the motors are spinning within a lift/hover range will the data from the range finders 218 increase and vary. Observing and comparing numerous sensor readings can be used to confirm the correct operation of the device 14 (again using a lookup or range table as required).
The position of the device (in relation to objects/terrain) is confirmed and validated by combining camera position data (observed and determined via tracking algorithms) and range finder data on board the device 14.
By combining video position data and feedback from the sensors on the drone unit 14 (measuring proximity distances from the ground via reflectance of light/audio), accurate records and calculations can be made to control flight and ensure correct, safe positioning amongst obstacles.
Proximity data will be progressively communicated to the camera unit 12 during the flight to improve mapping data and details.
c. Basic Flight
The on-board sensors 216 of the mobile deterrent 14 include:
i. a compass (heading/forward direction);
ii. an accelerometer (changes in speed); and
iii. a gyroscope (angle of attack/rotation).
These sensors 216 provide the processor 210 with data readings to aid stable flight control.
In conjunction with control system calculations and equations, sensor data can be used to provide electronic control for safe, level flight. This calculated data is used to change motor speeds, enabling specific corrections and movements in any direction. Data (determined by manufacturer) is transmitted via standard protocols (I2C/USART/other). This data is continually updated and is easily accessed by the processor 210 to maintain stability and flight control. The data read from each sensor 216 will contribute to one parameter within a PID (or similar) flight control algorithm. This algorithm runs repetitively with the highest processor priority to maintain stable/controllable flight.
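A PID controller of the kind referred to is conventional; one minimal form is sketched below, with one instance per axis (pitch, roll, yaw) and placeholder gains, since the patent does not specify tuning.

```python
# Minimal PID controller for one flight axis. Gains are placeholders; in
# practice each device would be tuned individually.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pitch = PID(kp=1.2, ki=0.05, kd=0.3)                 # placeholder gains
correction = pitch.update(setpoint=0.0, measurement=2.5, dt=0.01)
```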
In conjunction with flight movement data (wirelessly received from the camera unit 12), these algorithms control the motors and in turn manoeuvre the device in three dimensional space. The camera unit 12 simultaneously monitors the active mobile units 14 and animal movements. Data in memory 114 is updated to track the appropriate unit 14 and manoeuvre the mobile unit 14 into position near to the detected animal. Multiple units 14 may be used simultaneously to deter multiple animals or improve deterrent approach effectiveness for one animal. In this situation the camera unit 12 coordinates device 14 movements to avoid collisions and eliminate any redundancies.
d. Deterrent
When in close proximity to a located animal, as judged by either the on-board sensors 218 within the mobile unit 14 or by the camera unit's observation data, the mobile unit 14 initiates its audible deterrent generator 212. Sensors 218 on board the mobile unit 14 include imaging or infrared equipment (using algorithms and pattern matching similar to that within the camera unit) capable of pinpointing animal locations more accurately than the camera unit can (at distance). These capabilities will be used in conjunction with camera location data to improve deterrent targeting.
The audible deterrent generator 212 consists of a frequency based (555 timer or equivalent) circuit which boosts the unit's 12 V DC power to a high frequency (4500 Hz to 7000 Hz) oscillating output. This output will be enabled by a single processor IO pin and used to drive several piezoelectric speakers mounted on the outside of the unit 14. The sound generated by the generator 212 continues until the animal is driven from the position (as observed by the camera unit 12) or moves outside the range of the mobile device 14. Smaller, faster moving animals may require multiple attempts to be removed (i.e. repeating stages c and d). In this case the mobile unit 14 receives new commands from the camera unit 12 to pursue the animal to its new location (within the bounds of the protected area 15). Larger animals may be more tolerant of the frequency and decibel level and seek to progressively move from the area 15. To increase the impact of the deterrent, multiple units 14, variable frequencies, or periodic starting and stopping of the auditory tone may be used.
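For a standard 555 timer in astable mode the output frequency is f = 1.44 / ((R1 + 2*R2) * C). The component values below are illustrative picks that land inside the stated 4500 to 7000 Hz band; they are not values from the patent.

```python
# Classic 555 astable frequency formula with example component values.
def astable_frequency(r1_ohms, r2_ohms, c_farads):
    return 1.44 / ((r1_ohms + 2 * r2_ohms) * c_farads)

f = astable_frequency(r1_ohms=1_000, r2_ohms=13_000, c_farads=10e-9)
print(f"{f:.0f} Hz")   # ~5333 Hz, within the 4500-7000 Hz deterrent band
```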
e. Return to Base
The camera unit 12 monitors animal movements and determines when they have exited the protected area 15. Data related to the detection and tracking of the animals is written into memory 114 (RAM/flash). During non-busy periods some processor 116 power may be used to improve algorithms and probability/assumptions about future detections. Algorithms, which assess common entry and exit points, as well as common resting sites and avoidance patterns, will be calculated. This aims to improve the rate of detection and hasten animal deterrence. On detection of a new animal, the camera unit 12 may divert a mobile unit 14, and generate a new set of flight data and distribute it to the active device 14.
As above described, the optimal set of available drones 14 is selected to pursue each animal. This includes drones 14 already in use, which may be diverted or redirected to persistently deter animals until another, more optimal drone 14 takes precedence.
Each device 14 will continue to pursue and deter an animal until it is necessary to return to base 16. For example, the mobile unit 14 may return to base when:
i. low battery life is flagged by on-board voltage regulators; or
ii. all animals have been removed from the protected area 15.
When it is necessary for the device 14 to return to base 16, a new set of flight commands (directions) is transmitted to the mobile unit 14 to aid its movements back to the base station 16 (a reverse of approaching the animal).
When a drone 14 is no longer the optimal choice, or no further animals are to be deterred, a new flight path is created (as was previously done to reach the animal), only this time the path will be calculated to return to the base station.
f. Landing
When the mobile device 14 is visually detected to be within 1 to 2 metres of the landing pads 320 of the base station 16, the camera unit 12 will implement the landing sequence. This requires the mobile unit 14 to position itself directly above the landing pad 320 and progressively reduce altitude until it rests on top of the pad 320.
As with take off, the mobile device 14 will combine wireless data (position, x and y axes) from the camera unit 12 and range finder data (altitude/height, z axis) from sensors 216.
The landing plate 320 preferably includes a clearly distinguishable pattern to assist in producing accurate camera data (in terms of movement commands) to direct the mobile device 14. The mobile unit 14 preferably utilises other sensors 218 (imaging, light seeking etc.) to aid its approach during the landing sequence. These sensors 218 may also be used during flight or can be activated by command from the camera unit 12. It is important that sensor readings be read quickly and repetitively (are a priority) during this process to maintain maximum accuracy and up-to-date processor information.
The landing pad 320 includes a concave shaped floor which allows the unit 14 to reach an approximate centre and be guided into place. The base of the pad will consist of two charge plates 224 (terminals) used to electrically connect the two devices and enable charging. A low power circuit 220 will flag a flowing (power) connection between the two devices 14, 16, indicating that the unit 14 is contacting the terminals correctly. Similarly, on board the mobile device 14 a similar circuit will confirm (along with proximity sensor readings) that the device 14 has landed. At this point both the base station 16 and mobile device 14 will confirm via wireless/I2C transmissions to the camera unit 12 processor 116 that they are ready for recharging. The device 14 may require repositioning if sensor data, visual tracking or the terminals are not correctly positioned. The camera unit 12 will continue to position the device (via new commands) until these confirmations are received.
Once the mobile unit 14 is resting (and an electrical connection is established on the landing plates 224), recharging can be initiated and the motor controllers and all flight control peripherals and sensors shut down (sleep mode).
g. Recharging
The camera unit's processor 116 requests the base station processor 300 to commence charging the appropriate landing pad 320 (via I2C). The base station processor 300 (via IO pins) will enable a charging circuit 304 to manage the recharge as necessary (a specific charge routine with variable currents and voltages is necessary to prolong battery life). The base station 16 notifies the camera unit 12 (via I2C) that it has begun recharging. The camera unit 12 will then request a confirmation from the mobile unit 14 (wirelessly). The base station 16 is preferably the primary manager of the recharge process. Each landing pad 320 is individually monitored, and the base station 16 is capable of recharging all units 14 simultaneously. The recharge will cease when the battery of the mobile device 14 is seen to have reached 12 to 13 volts. Once the recharge is complete, the camera unit 12 is notified by the base station 16 and in turn will wirelessly command the mobile unit 14 to shut down and enter sleep mode.
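By way of illustration only, the recharge cut-off may be sketched as follows. The 12 to 13 volt band comes from the description above; the polling interval and callable names are assumptions for the example.

```python
# Sketch of the recharge cut-off: charging ceases once the mobile unit's
# battery is read within the 12-13 V band stated in the text.
import time

def recharge(read_battery_volts, set_charging, cutoff_volts=12.5, poll_s=5.0):
    """Charge a single landing pad until the cut-off voltage is seen."""
    set_charging(True)
    try:
        while read_battery_volts() < cutoff_volts:
            time.sleep(poll_s)             # periodic voltage monitoring
    finally:
        set_charging(False)                # always disable the charger
```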
Interrupt pins (linked to the charge plates) on the mobile unit's processor 210 will indicate power being transmitted to and from the device. A change in the power supplied to this pin will initiate the device's 14 wake sequence, as well as indicate the beginning and end of the recharging sequence.
h. Data Logging
During recharge, the mobile unit 14 transmits any additional information about the previous flight to the camera unit 12. Data is primarily transmitted during flight; however, some additional data/sensor variables may still be awaiting transmission when the device 14 returns to base 16. It is unlikely that this task will take longer than recharging; in any case, the unit 14 will not be shut down until all data is transmitted. Preferably, no data is stored long term in the mobile device 14, and during shutdown (sleep mode) all previous variables and stored data are cleared.
The camera unit 12 will process the information provided to aid terrain mapping and to update other long-term information stored in memory.
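By way of illustration only, the post-flight data flush may be sketched as below; the queue-based structure and callable names are assumptions for the example.

```python
# Sketch of the post-flight data flush: records still queued on the unit
# are transmitted before shutdown, then local storage is cleared so no
# data is kept long term on the mobile device.
from collections import deque

def flush_and_sleep(flight_log: deque, transmit, enter_sleep):
    """The unit is not shut down until every record has been transmitted."""
    while flight_log:
        transmit(flight_log.popleft())     # send remaining flight data
    flight_log.clear()                     # clear all stored variables
    enter_sleep()                          # then enter sleep mode
```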
Memory Read & Write (FLASH - EEPROM, RAM, etc)
Memory is a storage medium used to house variables and system data during and after processing. The camera unit 12 uses short-term RAM 110 for variables, functions and algorithm data. It will also store data more permanently in flash memory 114. Flash memory data will be kept for analysis over an extended period of time (days, weeks, or even after the system has been shut down).
Flash (microSD)
Within the camera unit 12 there will be a microSD card slot (not shown) connected via an SPI serial interface. The simple master-out/slave-in, master-in/slave-out (MOSI/MISO) protocol provides the processors with the ability to read and write data as required. Memory addressing will depend on the type and construction of the cards used.
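By way of illustration only, a full-duplex SPI-style exchange can be sketched as below. The command framing is purely illustrative; real SD-card addressing, as noted above, depends on the card used, and the driver function stands in for actual hardware.

```python
# Sketch of a full-duplex SPI-style exchange with the microSD card.
# The framing and driver function are illustrative assumptions.

def spi_exchange(tx: bytes, transfer_fn):
    """Master shifts tx out (MOSI) and reads the same number of bytes
    back (MISO) via a supplied hardware driver function."""
    rx = transfer_fn(tx)
    assert len(rx) == len(tx), "SPI is full duplex: bytes in == bytes out"
    return rx

# Loopback stand-in for the real driver, for demonstration only:
if __name__ == "__main__":
    print(spi_exchange(bytes([0x40, 0x00]), lambda tx: tx))
```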
RAM
Each processor 116, 210, 300 will contain a limited amount of RAM (temporary memory). This memory is usually contained within the processor chipset.
Initial Programming
Each mobile device 14 includes a standard USB communications port through which it will receive its primary programming. Each device 14 will have a main application program uploaded using an external device such as a PC or laptop.
Communications Protocols
Hardwired communications will utilise standard component protocols including I2C, USART or SPI. Wireless communications may include one or more of the following: Bluetooth, ZigBee, RF (radio) transmissions or WiFi.
Power monitoring system 220, 312 (base station and mobile unit)
Power management components 220, 312 act as slave devices, responding to periodic processor requests for sensor readings. Sensor data will be converted from the sensor output to a comparable value within the processor. (Sensor output will vary depending on component construction and manufacturer.)
A flag pin linked to a processor interrupt circuit will be enabled (pulled low/high) in the event the power system reaches a low power level. This is only likely to occur after an extended period without a functioning external power source. A low power indicator on a mobile unit will cause the current operation to be exited and the return-to-base routine to begin (see above). In this case the camera unit 12 will receive a wireless notification and simultaneously assign a new device to replace the depleted unit, as in the sketch below. The camera unit 12 is capable of reducing power consumption by shutting down non-essential aspects of the system. In the event of a complete loss of power, the camera unit 12 (powered via a small battery) will shut down all units (active or inactive) and report the fault (via wireless communication) to a neighbouring system, requesting user assistance (the report will include a GPS location or unit number).
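By way of illustration only, the low-power interrupt handling may be sketched as below; all of the handler and method names are assumptions for the example.

```python
# Sketch of the low-power interrupt handling: the flagged unit aborts its
# current operation and returns to base while the camera unit assigns a
# replacement. All names are illustrative assumptions.

def on_low_power_flag(unit, camera_unit):
    unit.abort_current_operation()        # exit the current deterrence task
    unit.begin_return_to_base()           # see the return-to-base routine above
    camera_unit.notify_low_power(unit)    # wireless notification to camera unit
    camera_unit.assign_replacement(unit)  # a new drone takes over the animal
```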
These sensors will provide approximate energy consumption and storage levels; they will also be used to conduct comparisons and identify faulty units (those drawing too much power, drawing no power, or not responding).
Power Generation, Monitoring and Consumption
Each system 10 includes an external power source 18. The specific type of power source 18 will vary depending on local conditions, source access and availability, and may include solar, wind, hydro or mains. The primary source 318 is connected to a regulator circuit 316 within the base station 16. The regulator circuit 316 stabilises the incoming power and provides a consistent supply suitable for charging a 12 V battery system 314 (the supply is required to be approximately 13-15 volts).
Both the incoming power source 318 and the base station battery 314 will include inline voltage/current monitoring sensors. These provide up-to-date information on incoming and consumed power (see power monitoring above).
Sensors measuring changes in voltage will enable battery levels to be included in optimal-use calculations, and will inform decisions to shut down specific peripherals within a unit to ensure continued functionality.
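By way of illustration only, folding battery level into the optimal-use calculation might be sketched as below; the scoring weights, thresholds and data layout are assumptions for the example.

```python
# Sketch of a battery-aware "optimal use" selection: depleted units are
# excluded and, among the rest, proximity to the target is traded off
# against remaining charge. Weights and thresholds are illustrative only.

def pick_optimal_unit(units, distance_to_target, min_volts=11.0, volt_weight=0.5):
    """units: list of dicts like {"id": 1, "volts": 12.4}."""
    candidates = [u for u in units if u["volts"] > min_volts]
    if not candidates:
        return None                        # nothing flight-worthy available
    # Lower distance and higher voltage both improve the score.
    return min(candidates,
               key=lambda u: distance_to_target(u) - volt_weight * u["volts"])
```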
Sensor Reading
Each sensor 113, 216, 218 will provide a different aspect of system performance to its local processor 116, 210. There are a number of different forms of output, including data streams (in the case of high-output sensors) and request/respond sensors, each of which may respond in either analogue or digital form. In many cases digital data 113 will need to be adjusted, or put through an algorithm, to convert it into a usable form (output styles and data formats are dependent on the manufacturer and component). Primarily, sensors will respond to processor requests via a trigger circuit being pulled high/low, or via a communications bus (master/slave) I2C, USART or serial data request. The sensor will in turn assemble an up-to-date response and reply to the processor's request. This data is then processed and utilised by the processor (via IO pins or a communications bus). High-power sensors may house a sleep trigger IO pin, which will be linked to the sleep mode circuitry controlled by the local processor (see start-up above).
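By way of illustration only, the request/respond sensor cycle may be sketched as below; the conversion constants are manufacturer-dependent assumptions, as noted above.

```python
# Sketch of the request/respond sensor cycle: the processor triggers a
# reading, the sensor replies, and the raw output is converted into a
# usable value. Names and constants are illustrative assumptions.

def poll_sensor(request_reading, raw_to_units):
    raw = request_reading()          # trigger pin or I2C/USART/serial request
    return raw_to_units(raw)         # e.g. algorithmic conversion of raw data

# Example: a 10-bit ADC count referenced to 5 V (illustrative only).
if __name__ == "__main__":
    volts = poll_sensor(lambda: 512, lambda raw: raw * 5.0 / 1023)
    print(f"{volts:.2f} V")          # ~2.50 V
```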
Opening/Closing Base Station Cover
As shown in Figure 10, the base station 16 consists of four landing pads 320, each with terminal plates 306 situated in the centre. The main purpose of these landing pads 320 is to house the mobile units 14 when inactive and to provide recharging capabilities. To protect the mobile units 14, a cover (not shown) can be moved into place (a lid which splits in half and slides down either side of the base station). The lid is opened and closed by the base station processor 300 via motor controllers and two internal motors. An encoder on each motor provides confirmation of lid movement and lid position. The lid can be closed only when all mobile units 14 are stationed within the base station 16 (as determined via variables within the camera unit software), but may be opened at any time (the camera unit will send a request to the base station processor). The lid is to be left open during standard operation; however, it may be closed during extreme weather to protect the devices.
Shut Down Sequence
When the system 10 is being removed, or in the event of a complete loss of power, the system will return itself to sleep mode before disconnecting its power source 318 (releasing any remaining closed latches).
Turning off the power switch (not shown) on the base station will not directly disable the system 10. Rather, it will remove power from the latched on/off circuit connecting the camera unit processor 116 directly to a power source 318. A secondary trigger circuit (an interrupt connected to the main processor) will signal the system to begin the shut down process.
The shut down process begins by returning all mobile units 14 to the base station 16. The base station lid is then closed and (via wireless connection or I2C) all processors are requested to enter sleep mode. Sleep mode consists (as previously discussed) of the device's processor 210 shutting down all compatible system peripherals (sensors, motor controllers, etc.) via an IO pin on the processor connected by circuit to a specific sleep IO pin on each component.
Each processor will confirm (with the camera unit processor) that its components have been shut down, and the camera unit 12 will in turn enable the sleep circuit, shutting down each processor.
Mobile units 14 will instead receive a wireless command to remove the latch connected to their sleep circuit. This will not shut down the device completely; it will remain in sleep mode (relying on its internal battery) until woken via power being supplied to the recharge plates (see start-up).
The camera unit processor 116 will then release the latch maintaining its connection to the system power (via IO pin circuitry). This removes power from the entire system 10, including all peripherals and processors. In effect, the camera unit processor 116 cuts power to all devices and circuits (including its own) to shut down the system 10.
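By way of illustration only, the shut down ordering described above may be sketched as below; all of the method names are assumptions for the example.

```python
# Sketch of the shut down ordering: recall all units, close the lid,
# sleep each device's peripherals, collect confirmations, then the
# camera unit cuts system power last. All names are illustrative only.

def shut_down_system(mobile_units, base_station, camera_unit):
    for unit in mobile_units:
        unit.return_to_base()                    # every drone docks first
    base_station.close_lid()
    devices = [base_station, *mobile_units]
    for device in devices:
        device.sleep_peripherals()               # sensors, motor controllers etc.
    if all(device.confirm_asleep() for device in devices):
        # Removing this latch cuts power to everything, including the
        # camera unit processor itself.
        camera_unit.release_power_latch()
```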
Sleep Mode (power saving mode, inactive mode)
The system 10 enters sleep mode when it is not required to be in operation (determined by the application, i.e. during set times, day/night or other criteria). A small battery will maintain an internal clock within the camera unit 12. Entering sleep mode, like the shutdown sequence, involves the camera unit requesting all devices to enter sleep mode, but it does not remove power from the system or its components. Instead, it enables the wake sensors and the internal timer (like an alarm aimed at waking the device at a specific time or after a specific period) and removes the latch on its own sleep IO pin (see start-up for the wake process).
Definitions:
Components: electronic parts, resistors, chips, electrical connections/circuitry.
Peripherals: a select group of components which operate together to complete a specific function, i.e. a power circuit (includes battery, circuitry, regulators).
Devices: individual units, i.e. a camera unit, a mobile unit, a base station.
Protected Asset: that which the system is implemented to protect; may include crops, equipment, buildings, land, water, livestock.
Many modifications will be apparent to those skilled in the art without departing from the scope of the present invention.
Throughout this specification, unless the context requires otherwise, the word "comprise", and variations such as "comprises" and "comprising", will be understood to imply the inclusion of a stated integer or step or group of integers or steps but not the exclusion of any other integer or step or group of integers or steps.
The reference to any prior art in this specification is not, and should not be taken as, an acknowledgment or any form of suggestion that the prior art forms part of the common general knowledge in Australia.
Claims
1. A system for detecting target animals in a protected area, said system for performing the steps of:
(a) converting light into a sequence of digital images;
(b) identifying objects in the sequence of digital images;
(c) for each one of said objects, generating object data representing at least one of:
(i) position;
(ii) velocity; and
(iii) size; and
(d) detecting target animals by comparing object data generated for each one of said objects with known object data for target animals.
2. The system claimed in claim 1, including the step of generating tracking data for detected target animals including one or more of:
(a) speed;
(b) size; and
(c) direction.
3. The system claimed in claim 2, including the step of generating terrain data representing a map of the protected area including height and position of identified permanent obstacles.
4. A system for protecting an asset from target animals in a protected area, said system for performing the steps of:
(a) detecting a target animal in the protected area by performing the steps claimed in claim 3;
(b) generating data representing a tracking path for an unmanned aerial vehicle (UAV) to intercept the target animal based on:
(i) a known location of the UAV;
(ii) said terrain data representing a map of the protected area; and
(iii) data representing target animal position;
(c) sending data representing the tracking path for the target animal to the UAV; and
(d) the UAV flying from said known location along the tracking path to intercept the target animal,
wherein the target animals are deterred from entering the protected area by the UAV.
5. The system claimed in claim 4, wherein the step of sending data representing the tracking path for the target animal to the UAV is effected by a camera unit coupled to the UAV.
6. The system claimed in claim 4 or claim 5, wherein the asset is a crop at an agricultural site.
7. The system claimed in any one of claims 4 to 6, including the steps of:
(a) monitoring position and velocity of the UAV during flight along the tracking path;
(b) monitoring position and velocity of the target animal;
(c) amending the tracking path if the target animal has moved;
(d) amending the tracking path if the UAV has deviated from the tracking path; and
(e) if the tracking path has been amended, sending data representing the amended tracking path to the UAV.
8. The system claimed in any one of claims 4 to 7, including the steps of:
(a) monitoring position and velocity of the UAV during flight along the tracking path;
(b) monitoring position and velocity of the target animal; and
(c) activating an audible sound generator on the UAV when the UAV is within audible range of the target animal,
wherein the audible sound generator emits a high decibel sound which makes it undesirable for an animal to remain in the protected area.
9. The system claimed in any one of claims 4 to 8, wherein the step of generating data representing a tracking path for the UAV includes the step of generating data representing tracking paths for multiple UAVs to intercept the target animal.
10. A system for intercepting a target animal with an unmanned aerial vehicle (UAV) in a protected area, said system for performing the steps of:
(a) receiving data representing a tracking path for the target animal; and
(b) launching the UAV and flying from a known location along the tracking path to intercept the target animal,
wherein the target animals are deterred from entering the protected area by the UAV.
11. The system claimed in claim 10, including the step of receiving an amended tracking path whilst in flight and adjusting a flight path of the UAV accordingly.
12. The system claimed in claim 10 or claim 11, including the steps of:
(a) receiving data indicating that the target animal is within audible range of the UAV; and
(b) activating an audible sound generator on the UAV,
wherein the audible sound generator emits a high decibel sound which makes it undesirable for an animal to remain in the protected area.
13. The system claimed in any one of claims 10 to 12, wherein said step of receiving data representing the tracking path is effected by a camera unit coupled to the UAV.
14. A computer program for detecting target animals in a protected area, said program for performing the steps of:
(a) converting data representing light into a sequence of digital images;
(b) identifying objects in the sequence of digital images;
(c) for each one of said objects, generating object data representing at least one of:
(i) position;
(ii) velocity; and
(iii) size; and
(d) detecting target animals by comparing object data generated for each one of said objects with known object data for target animals.
15. The program claimed in claim 14, including the step of generating tracking data for detected target animals including one or more of:
(a) speed;
(b) size; and
(c) direction.
16. The program claimed in claim 15, including the step of generating terrain data representing a map of the protected area including height and position of identified permanent obstacles.
17. A computer program for protecting an asset from target animals in a protected area, said program for performing the steps of:
(a) detecting a target animal in the protected area by performing the steps claimed in claim 14;
(b) generating data representing a tracking path for an unmanned aerial vehicle (UAV) to intercept the target animal based on:
(i) a known location of the UAV;
(ii) said terrain data representing a map of the protected area; and
(iii) data representing target animal position;
(c) sending data representing the tracking path for the target animal to the UAV; and
(d) the UAV flying from said known location along the tracking path to intercept the target animal,
wherein the target animals are deterred from entering the protected area by the UAV.
18. The program claimed in claim 17, wherein the asset is crop at an agricultural site.
19. The program claimed in claim 17 or claim 18, including the steps of:
(a) monitoring position and velocity of the UAV during flight along the tracking path;
(b) monitoring position and velocity of the target animal;
(c) amending the tracking path if the target animal has moved;
(d) amending the tracking path if the UAV has deviated from the tracking path; and
(e) if the tracking path has been amended, sending data representing the amended tracking path to the UAV.
20. The program claimed in any one of claims 17 to 19, including the steps of:
(a) monitoring position and velocity of the UAV during flight along the tracking path;
(b) monitoring position and velocity of the target animal; and
(c) activating an audible sound generator on the UAV when the UAV is within audible range of the target animal,
wherein the audible sound generator emits a high decibel sound which makes it undesirable for an animal to remain in the protected area.
21. The program claimed in any one of claims 17 to 20, wherein the step of generating data representing a tracking path for the UAV includes the step of generating data representing tracking paths for multiple UAVs to intercept the target animal.
22. A computer program for intercepting a target animal with an unmanned aerial vehicle (UAV) in a protected area, said program for performing the steps of:
(a) receiving data representing a tracking path for the target animal; and
(b) launching the UAV and flying from a known location along the tracking path to intercept the target animal,
wherein the target animals are deterred from entering the protected area by the UAV.
23. The program claimed in claim 22, wherein said step of receiving data representing the tracking path is effected by a camera unit coupled to the UAV.
24. The program claimed in claim 22 or claim 23, including the step of receiving an amended tracking path whilst in flight and adjusting a flight path of the UAV accordingly.
25. The program claimed in any one of claims 22 to 24, including the steps of:
(a) receiving data indicating that the target animal is within audible range of the UAV; and
(b) activating an audible sound generator on the UAV,
wherein the audible sound generator emits a high decibel sound which makes it undesirable for an animal to remain in the protected area.
26. Non-transitory computer-readable data storage including the computer program claimed in any one of claims 14 to 25 stored thereon.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| AU2014900962 | 2014-03-19 | | |
| AU2014900962A0 (en) | 2014-03-19 | | System for detecting target animals in a protected area |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2015139091A1 (en) | 2015-09-24 |
Family
ID=54143553
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/AU2015/050119 (WO2015139091A1, ceased) | System for detecting target animals in a protected area | 2014-03-19 | 2015-03-19 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2015139091A1 (en) |
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8026842B2 (en) * | 2006-06-08 | 2011-09-27 | Vista Research, Inc. | Method for surveillance to detect a land target |
| US20100201525A1 (en) * | 2007-07-13 | 2010-08-12 | Birdsvision Ltd. | Method and system for detecting and deterring animal intruders |
Non-Patent Citations (1)
| Title |
|---|
| KRISHNAMOORTHY, K. ET AL.: "UAV Search and Capture of a Moving Ground Target under Delayed Information", 51ST IEEE CONFERENCE ON DECISION AND CONTROL, Maui, Hawaii, USA, XP032324713 * |
Cited By (20)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11140326B2 (en) | 2015-05-22 | 2021-10-05 | The United States Of America, As Represented By The Secretary Of The Navy | Aerial video based point, distance, and velocity real-time measurement system |
| EP3361853A4 (en) * | 2015-10-12 | 2019-06-19 | Drone Seed Co. | SYSTEMS AND METHODS FOR MANAGING FOREST INFORMATION RATIONALIZED BY PRIORITIZING AUTOMATIC BIOMETRIC DATA |
| DK201670824A1 (en) * | 2016-10-19 | 2018-05-07 | Animal Control Aps | Method of scaring birds and mammals |
| WO2018087354A1 (en) * | 2016-11-11 | 2018-05-17 | Bioseco Sp Z O. O. | Systems and methods for detecting flying animals |
| CN110167344A (en) * | 2016-11-11 | 2019-08-23 | 生物生态服务提供商 | System and method for detecting flying animals |
| US11093738B2 (en) | 2016-11-11 | 2021-08-17 | Bioseco Sp Z.O.O | Systems and methods for detecting flying animals |
| EP3537875B1 (en) * | 2016-11-11 | 2023-12-27 | Bioseco Spólka Akcyjna | System and method for detecting flying animals |
| CN107239078A (en) * | 2017-06-26 | 2017-10-10 | 中国人民解放军国防科学技术大学 | A kind of unmanned plane base station selection and patrol method for optimizing route and device |
| CN107239078B (en) * | 2017-06-26 | 2020-03-27 | 中国人民解放军国防科学技术大学 | Unmanned aerial vehicle base station site selection and patrol path optimization method and device |
| US10856542B2 (en) | 2017-11-30 | 2020-12-08 | Florida Power & Light Company | Unmanned aerial vehicle system for deterring avian species from sensitive areas |
| CN110859176A (en) * | 2018-08-06 | 2020-03-06 | 通用电气航空系统有限公司 | Wild animal interception system and operation method |
| GB2576306A (en) * | 2018-08-06 | 2020-02-19 | Ge Aviat Systems Ltd | Wildlife intercept system and method of operating |
| EP3608746A1 (en) * | 2018-08-06 | 2020-02-12 | GE Aviation Systems Limited | Wildlife intercept system and method of operating |
| EP3945802A4 (en) * | 2019-04-03 | 2023-01-11 | Dow Agrosciences LLC | ADAPTIVE ACTIVE INFRARED SENSOR HARDWARE AND SOFTWARE FOR PEST DETECTION WITH A PEST DETECTION STATION |
| US12171211B2 (en) | 2019-04-03 | 2024-12-24 | Ecolab Usa Inc. | Adaptive active infrared sensor hardware and software in the detection of pests with pest detection station |
| GB2591432A (en) * | 2019-10-11 | 2021-08-04 | Brandenburg Connect Ltd | Animal detection |
| WO2021070153A1 (en) | 2019-10-11 | 2021-04-15 | Brandenburg Connect Limited | Animal detection |
| GB2591432B (en) * | 2019-10-11 | 2024-04-10 | Caucus Connect Ltd | Animal detection |
| US12121324B2 (en) | 2019-10-11 | 2024-10-22 | Caucus Connect Limited | Animal detection |
| GB2596512A (en) * | 2020-05-26 | 2022-01-05 | Bangor Univ | Improvements in and relating to drone control |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2015139091A1 (en) | System for detecting target animals in a protected area | |
| US20230294827A1 (en) | Method and system for controlling an unmanned aerial vehicle | |
| US20180068164A1 (en) | Systems and methods for identifying pests in crop-containing areas via unmanned vehicles | |
| US20190110461A1 (en) | Method and apparatus for identifying, locating and scaring away birds | |
| US20190152595A1 (en) | Apparatus for Sustained Surveillance and Deterrence with Unmanned Aerial Vehicles (UAV) | |
| US10045523B2 (en) | Baiting method and apparatus for pest control | |
| EP3563204B1 (en) | Control device, monitoring device and control program | |
| US20190246623A1 (en) | Pest deterrent system | |
| US20180068165A1 (en) | Systems and methods for identifying pests in crop-containing areas via unmanned vehicles based on crop damage detection | |
| KR101710329B1 (en) | Surveillance system using drone | |
| US20180233007A1 (en) | Drone based security system | |
| CN110832195A (en) | Monitoring system for wind farms and related method | |
| KR20150086118A (en) | System and method for repelling birds and beasts using a flying robot | |
| WO2018048666A1 (en) | Systems and methods for defending crops from crop-damaging pests via unmanned vehicles | |
| KR102512529B1 (en) | Method and apparatus of operating and managing unmanned golf course | |
| TW201805906A (en) | Security eviction system with unmanned aerial vehicles | |
| KR101692018B1 (en) | Drone with self camera and photographer chasing function | |
| Ju et al. | Investigation of an autonomous tracking system for localization of radio-tagged flying insects | |
| US20200037603A1 (en) | Pest deterrent and/or alert assistant device | |
| JP2020092643A (en) | Unmanned flight device, unmanned flight system, and unmanned flight device control system | |
| JP2025083351A (en) | Flight type robot | |
| US10959058B1 (en) | Object tracking systems and methods | |
| CN114027288A (en) | Multi-information-source comprehensive processing bird protection device and method based on wind power plant | |
| WO2012127424A1 (en) | Threat control system for fish ponds | |
| JP7650485B2 (en) | Flying robot, flying robot control program, and flying robot control method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15764261; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 15764261; Country of ref document: EP; Kind code of ref document: A1 |