
WO2020025320A1 - Controller for detecting animals with physiological conditions - Google Patents

Info

Publication number
WO2020025320A1
Authority
WO
WIPO (PCT)
Prior art keywords
controller
environment
animal
infrared
infrared image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2019/069225
Other languages
French (fr)
Inventor
Marc Andre De Samber
Harry Broers
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Signify Holding BV
Original Assignee
Signify Holding BV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Signify Holding BV filed Critical Signify Holding BV
Publication of WO2020025320A1 publication Critical patent/WO2020025320A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00: ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60: ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices, for the operation of medical equipment or devices
    • G16H 40/67: ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices, for the operation of medical equipment or devices for remote operation
    • A: HUMAN NECESSITIES
    • A01: AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K: ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K 29/00: Other apparatus for animal husbandry
    • A01K 29/005: Monitoring or measuring activity
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00: ICT specially adapted for the handling or processing of medical images
    • G16H 30/40: ICT specially adapted for the handling or processing of medical images, for processing medical images, e.g. editing
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/20: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics, for computer-aided diagnosis, e.g. based on medical expert systems

Definitions

  • the present disclosure relates to a controller and computer program for determining whether or not an animal has a physiological condition.
  • US 5,474,085 A discloses a method and apparatus for remote sensing of livestock, using a thermographic image sensing system, in order to determine one or more of a number, weight, location, temperature, carcass pH, etc., of animals in a surveillance area.
  • WO 2015/030611 A1 discloses a method and apparatus for determining at least one characteristic of respiration of a non-human animal. A plurality of thermal infrared images of the animal are obtained using at least one thermographic camera. The images capture a region of interest relating to respiration of the animal. Each image is analysed to obtain data relating to respiration of the animal. At least one characteristic of respiration of the animal is determined based at least in part on the data obtained from the images.
  • a controller is provided for determining whether one or more animals within an environment have a physiological condition.
  • the controller has an interface for receiving, from one or more infrared cameras, a first infrared image of the environment and a second infrared image of the environment.
  • the controller is configured so as, after receiving the first infrared image and before receiving the second infrared image, to control one or more illumination sources to illuminate the environment for a predetermined time period.
  • the controller is further configured to identify a first region within the first infrared image, wherein the identified first region comprises an infrared image of at least part of a first animal; and to identify a second, corresponding region within the second infrared image.
  • the controller is configured to then determine a change in position and/or thermal characteristic between the identified first and second regions in the first and second infrared images; and determine whether or not the first animal has a physiological condition based on the determined change in position and/or thermal characteristic between the identified first and second regions in the first and second infrared images.
  • this may be performed using a computer vision algorithm like object tracking, where one or more of the location, pose, shape, and visual appearance of an object may be tracked in successive infrared images. From these features, information including motion, shape deformations, appearance changes, and orientations (of parts) of an object (i.e. an animal) may be obtained.
  • a change in position of a region in the first and second infrared images includes a change in location of the region, a change in pose of the region, and a change in shape of the region.
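As a rough illustration of the tracking idea above, the sketch below follows the centroid of a warm region between two infrared frames represented as plain 2-D temperature grids. The function names, the simple thresholding rule and the 35 °C cut-off are illustrative assumptions, not the algorithm the disclosure itself specifies.

```python
# Hypothetical sketch: track a warm region between two infrared frames.
# Frames are 2-D grids of temperatures in degrees Celsius.

def warm_region_centroid(frame, threshold=35.0):
    """Centroid (row, col) of all pixels warmer than `threshold`."""
    pixels = [(r, c) for r, row in enumerate(frame)
              for c, t in enumerate(row) if t > threshold]
    if not pixels:
        return None  # no warm region (no animal) detected
    n = len(pixels)
    return (sum(p[0] for p in pixels) / n, sum(p[1] for p in pixels) / n)

def displacement(first_frame, second_frame, threshold=35.0):
    """Euclidean distance the warm region's centroid moved between frames."""
    a = warm_region_centroid(first_frame, threshold)
    b = warm_region_centroid(second_frame, threshold)
    if a is None or b is None:
        return None
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
```

A real implementation would use a proper object-tracking algorithm that also follows pose, shape and appearance, as the text notes; a centroid is only the simplest position feature.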
  • the illumination provided by the illumination sources acts as a stimulus to the animals, causing the animals to move (e.g. the illumination may cause the animals to wake up).
  • a determination can be made as to whether the animal does indeed have a physiological condition (e.g. a health problem such, as, for example, an illness, injury or death, or balding, shedding of feathers, etc. which may indicate a health problem).
  • animals showing a small degree of movement (e.g. those that are lethargic) after the illumination stimulus may be deemed likely to have a physiological condition; zero movement of an animal between the time that the first and second infrared images are taken may indicate that the animal has died.
  • animals who are identified in the first image (e.g. with a region indicating that the animal may have a physiological condition) but which show a large degree of movement after the illumination stimulus may be deemed to be healthy.
  • This advantageously prevents healthy animals from being diagnosed with a physiological condition, allowing, for example, a farmer to focus on those animals deemed to have a physiological condition, e.g. a disease.
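The movement-based triage described in the preceding bullets could be sketched as below; the displacement threshold of 5.0 and the category labels are assumed values for illustration only, not figures from the disclosure.

```python
# Illustrative triage of an animal from how far it moved after the
# light stimulus. Threshold and labels are assumptions.

def triage(displacement, threshold=5.0):
    """Map a post-stimulus displacement to a coarse health category."""
    if displacement == 0:
        return "possibly dead"  # zero movement between the two frames
    if displacement < threshold:
        return "possible physiological condition"  # small movement: lethargic
    return "healthy"  # large movement after the stimulus
```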
  • a change in thermal characteristic (such as, for example, temperature) between the identified regions may also be determined. For example, a change in temperature may be caused by a change in orientation of the wings of a winged animal, as the feathers would shield more of the animal’s body heat.
  • skin temperature in hens may be measured following a mild or more severe acute stressor (i.e. the illumination): the temperature of thermoregulatory tissues can temporarily drop under acute stress, and the magnitude of this skin temperature change may reflect the intensity of the acute stressor.
  • the measured temperature increase or decrease may be used to identify a physiological condition.
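A minimal sketch of this thermal comparison, assuming frames are 2-D temperature grids and a region is a set of pixel coordinates: the mean temperature of the region is compared before and after the stimulus, and a drop larger than a chosen limit is flagged. The 1.5 °C limit is an illustrative assumption.

```python
# Hedged sketch: mean temperature change of a region between the
# pre-stimulus and post-stimulus infrared frames.

def mean_temperature(frame, region):
    """Mean of frame values over a set of (row, col) pixels."""
    return sum(frame[r][c] for r, c in region) / len(region)

def stress_response(first, second, region, drop_limit=1.5):
    """Return (temperature change, flag) for `region` across two frames.

    The flag is True when the region cooled by more than `drop_limit`,
    the kind of acute-stress temperature drop described in the text.
    """
    delta = mean_temperature(second, region) - mean_temperature(first, region)
    return delta, (-delta) > drop_limit
```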
  • the identified regions may be thermal (e.g. temperature) regions.
  • the identified regions may each correspond to at least one of: a predefined thermal pattern, a predefined thermal shape or predefined thermal area.
  • the controller may be configured to identify that the identified region corresponds to at least part of a first one of the animals within the first infrared image.
  • the first and second infrared images may be thermal images.
  • the one or more animals may be poultry.
  • said identified region may correspond to a region of either:
  • the controller may be configured to perform said controlling of the one or more illumination sources to illuminate the environment by: controlling the one or more illumination sources to illuminate the environment with illumination having one, more, or all of the following characteristics: (i) a predetermined brightness level, (ii) a predetermined colour, (iii) a predetermined temperature, (iv) a predetermined spatial pattern, and/or (v) a predetermined duration.
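The illumination characteristics listed above could be grouped into a single command object, as sketched below. `LightCommand`, `stimulus_command` and all the parameter values are hypothetical; a real controller would map these onto the actual lighting system's API.

```python
# Hypothetical sketch of an illumination stimulus command carrying the
# characteristics (i)-(v) named in the text.

from dataclasses import dataclass

@dataclass
class LightCommand:
    brightness: float          # (i) predetermined brightness level, 0..1
    colour: str                # (ii) predetermined colour
    colour_temperature_k: int  # (iii) predetermined temperature
    pattern: str               # (iv) predetermined spatial pattern
    duration_s: float          # (v) predetermined duration

def stimulus_command():
    """Build an example illumination stimulus (values are assumptions)."""
    return LightCommand(brightness=0.8, colour="white",
                        colour_temperature_k=4000,
                        pattern="uniform", duration_s=10.0)
```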
  • the controller may be configured to control the one or more infrared cameras to capture the second infrared image of the environment within a predetermined time period of said illumination of the environment.
  • the controller may be configured to determine whether the change in position between the identified first and second regions is less than a threshold amount and, if the change in position is determined to be less than the threshold amount, control one or more infrared cameras to capture a third infrared image of the environment.
  • a (more detailed) infrared image may be captured of those animals who are deemed likely to be suffering from a physiological condition such as, for example, a health problem.
  • This third image may therefore offer a better insight into the particular type of physiological condition that the animal is experiencing. For example, an animal having a condition such as, for example, balding may be detected.
  • the third infrared image may provide a more accurate image of this animal. From this third image, a farmer, for example, may be able to determine if the balding is due to or indicative of a more serious problem such as an underlying health issue.
  • the controller may be configured to perform said controlling of the one or more infrared cameras to capture the third infrared image of the environment by controlling the one or more infrared cameras to capture an infrared image of the head of the first animal.
  • the controller may be configured to: identify one or more regions within the third infrared image; and determine whether or not the first animal has a physiological condition based on the identified one or more regions within the third infrared image. In embodiments, the controller may be configured to perform said identifying of the one or more regions within the third infrared image by identifying one or more regions on the head of the first animal.
  • the controller may be configured to perform said determining of whether or not the first animal has a physiological condition based on the identified one or more regions within the third infrared image by: determining if the one or more regions coincide with at least one of the (i) nose, (ii) mouth, and (iii) eyes of the first animal in the third infrared image.
  • the controller may determine from the third infrared image, based on regions around, say, the nose (or beak if applicable) of the animal, that the animal’s nose is leaking nasal mucus. This is an indicator that the animal may have a respiratory infection.
  • the controller may be configured to: control one or more microphones to monitor for sounds indicative of an animal within the environment having a physiological condition, and control one or more infrared cameras to capture said first infrared image of the environment in response to detecting a sound indicative of an animal within the environment having a physiological condition.
  • the controller may be configured to: control one or more illumination sources to illuminate the environment for multiple sequential time periods, each time period of illumination having a different characteristic from the previous time period; receive an infrared image of the environment after one or more of the multiple sequential time periods of illumination; determine a change in position of the identified region across the one or more infrared images; and determine whether or not the first animal has a physiological condition based on that determined change in position.
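The multi-period stimulus could be driven by a small loop like the one below, with the lighting and camera passed in as callables. The function name, the settings format and the idea of capturing one frame after every period are illustrative assumptions.

```python
# Sketch of the multi-period stimulus: illuminate with each period's
# settings in turn, capturing an infrared frame after each period.

def run_stimulus_sequence(periods, illuminate, capture):
    """Apply each illumination period and collect one frame per period."""
    frames = []
    for settings in periods:
        illuminate(settings)      # one time period, with its own characteristic
        frames.append(capture())  # infrared image after that period
    return frames
```

Because the dependencies are injected, the sequencing logic can be exercised with stand-in lights and cameras.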
  • the controller may be configured to cause an alert to be output to a user if the first animal is determined to have a physiological condition.
  • the controller may be configured to cause a visual indication to be displayed, on a user interface of a user device, if the first animal is determined to have a physiological condition, wherein the visual indication identifies the first animal.
  • said causing an alert to be output to the user may comprise at least one of: (i) causing a visual alert to be displayed on a user interface of the user device, and (ii) causing an audio alert to be output from a speaker within the environment.
  • the controller may be configured to control the one or more luminaires to illuminate an area in the environment that is free from the first animal (or any animal identified as having a physiological condition).
  • a computer program product for determining whether one or more animals within an environment have a physiological condition
  • the computer program product comprising code embodied on one or more computer-readable storage media and configured so as when executed on one or more processors to perform operations of: receiving, from one or more infrared cameras, a first infrared image of the environment; controlling one or more illumination sources to illuminate the environment for a predetermined time period; receiving, from one or more infrared cameras, a second infrared image of the environment; identifying a first region within the first infrared image, wherein the identified first region comprises an infrared image of at least part of a first animal; identifying a second, corresponding region within the second infrared image; determining a change in position and/or thermal characteristic between the identified first and second regions in the first and second infrared images; and
  • determining whether or not the first animal has a physiological condition based on the determined change in position and/or thermal characteristic between the identified first and second regions in the first and second infrared images.
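The sequence of operations above can be orchestrated as a single function with the camera, lights and analysis steps passed in as callables. Everything here is a structural sketch: the names are invented, and the comparison and decision details are deliberately left to the injected functions.

```python
# End-to-end sketch of the capture -> stimulus -> capture -> compare ->
# decide pipeline described by the computer program product.

def assess_animal(capture, illuminate, find_region, compare, is_condition):
    """Run one assessment cycle and return the condition determination."""
    first = capture()        # first infrared image of the environment
    illuminate()             # light stimulus for the predetermined period
    second = capture()       # second infrared image of the environment
    r1 = find_region(first)  # region containing (part of) the first animal
    r2 = find_region(second) # second, corresponding region
    change = compare(r1, r2) # change in position and/or thermal characteristic
    return is_condition(change)
```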
  • a system for determining whether one or more animals within an environment have a physiological condition comprising: one or more infrared cameras; one or more illumination sources; and a controller, wherein at least one infrared camera is configured to capture a first infrared image of the environment; wherein at least one illumination source is configured to illuminate the environment for a predetermined time period; wherein at least one infrared camera is configured to capture a second infrared image of the environment; and wherein the controller is configured to: identify a first region within the first infrared image, wherein the identified first region comprises an infrared image of at least part of a first animal; identify a second, corresponding region within the second infrared image; determine a change in position and/or thermal characteristic between the identified first and second regions in the first and second infrared images; and determine whether or not the first animal has a physiological condition based on the determined change in position and/or thermal characteristic between the identified first and second regions in the first and second infrared images.
  • the system may comprise a dispenser configured to dispense one or more substances, and wherein the controller is configured to control the dispenser to automatically dispense at least one substance in response to determining that the animal has a physiological condition.
  • the at least one substance that the dispenser is configured to release may comprise a medicine.
  • Fig. 1 shows schematically an environment comprising a system for determining whether one or more animals within an environment have a physiological condition
  • Fig. 2 shows schematically an example block diagram of a control system for controlling the system
  • Figs. 3A and 3B show examples of infrared images of poultry showing several regions corresponding to features of the poultry.
  • Previous systems do not allow for physiological conditions (e.g. health problems) in large, dense animal populations, like those found in chicken farm stables for example, to be easily detected at the individual animal level.
  • Embodiments of the present invention provide an automated, efficient and selective system for determining whether or not an animal within an environment has a physiological condition based on the level of movement of regions between infrared images captured at different points in time, whereby the animal is subjected to a light stimulus between the captured images.
  • a fast response to the physiological condition can be realised. This may allow for the disease to be contained more quickly and easily by, for example, the administration of medicine at the individual or population level, or by changing the living condition in the animal population.
  • the automatic identification and treatment or removal of sick and/or dead animals helps to prevent the spread of disease to healthy neighbouring animals.
  • infrared images can readily reveal liquids and moisture. This information can be used to assess the state of living conditions such as, for example, flooring or bedding in the environment.
  • the infrared images may detect animals with a physiological condition of wet skin, fur, feathers, etc. (e.g. on their body). If the animals are too wet because their bedding is moist, for example, this can lead to an increased health risk. By identifying this problem, the living conditions can be improved.
  • whilst the invention will be described in examples relating to poultry, and chickens in particular, the invention has wider usage to a variety of animals.
  • the invention may be used to identify physiological conditions in, for example, cattle, sheep, pigs, etc.
  • example techniques for determining if an animal has a health problem equally apply to techniques for determining if an animal has a physiological condition, such as, for example, molting, balding, etc.
  • FIG. 1 illustrates an example environment 100 in which embodiments disclosed herein may be employed.
  • the environment 100 is a space that may be occupied by one or more users 102 and/or one or more animals.
  • the one or more animals may be poultry, that is, they may be any domesticated bird kept by humans for their eggs, their meat or their feathers, including chicken, duck, goose, partridge, pigeon, quail, turkey, Guinea fowl, pheasant, etc.
  • the one or more animals may be farm animals such as, for example, goats, horses, donkeys, pigs, cows, etc.
  • the one or more animals may in general be any vertebrate other than a human. That is, the animals are non-human animals.
  • the environment 100 may take the form of an indoor space such as one or more rooms of a home, office or other building such as a barn or farmhouse; an outdoor space such as a garden or park; a partially covered space such as a gazebo; or a combination of such spaces such as a farm comprising both indoor and outdoor spaces.
  • the environment 100 is equipped with a plurality of illumination sources (or light emitting elements, light sources, luminaires) 106 installed or otherwise disposed at different locations throughout the environment 100.
  • An illumination source 106 may refer to any kind of light emitting device for illuminating an environment or part of the environment occupied by an animal 104, whether providing ambient lighting or task lighting.
  • a human user 102 may also be present in the environment 100, or elsewhere outside the environment 100 in question.
  • Each of the illumination sources 106 may take any of a variety of possible forms, such as a ceiling or wall mounted luminaire, a free-standing floor or table light source 106, a light source 106 mounted on a pole, gantry or rigging; or a less traditional form such as a light source 106 embedded in a surface or item of furniture (and the different light sources 106 in the environment 100 need not take the same form as one another).
  • each light emitting element 106 comprises at least one lamp (light element) and any associated housing, socket and/or support.
  • suitable lamps include LED-based lamps, or traditional filament bulbs or gas discharge lamps.
  • the environment 100 may be divided into a plurality of different zones or localities (not shown), such as different rooms, each illuminated by a different respective subset of one or more of the illumination sources 106.
  • the different zones may relate to different sections of a farmhouse or stable, e.g. a chicken coop.
  • the environment 100 may also be equipped with one or more user devices 108.
  • each zone or locality may comprise a single respective user device 108.
  • each zone or locality may comprise more than one respective user device 108.
  • the user device 108 may be, for example, a mobile device including mobile or cell phones (including so-called "smart phones"), personal digital assistants, pagers, tablet and laptop computers and wearable communication devices (including so-called "smart watches").
  • One or more of the user devices 108 may comprise a lighting control device.
  • Each of the lighting control devices may take the form of a stand-alone lighting control device such as a smart light switch, a dimming switch, etc. or alternatively a lighting control device integrated in another user device 108 such as a mobile user terminal such as a smartphone or tablet, or even a wearable device that can be worn about the user’s person.
  • the user terminal may be installed with a suitable lighting control app.
  • the lighting control device can be mains powered, battery powered, or use energy-harvesting techniques to supply its energy.
  • the lighting control device is configured to be able to control the light emitted by one or more light illumination sources 106 in the environment 100.
  • This may include switching the illumination sources 106 on/off, controlling the colour of the light, controlling the dimming level, controlling a time-varying effect of the light, controlling a spatially-varying effect of the light or adjusting any other aspects of the light that may be applicable to the illumination sources 106 within the environment 100.
  • the environment 100 may also be equipped with a central lighting bridge 110 or server.
  • the environment is also equipped with one or more infrared cameras 112.
  • An infrared camera 112 (also known in the art as a thermographic camera or thermal imaging camera) is a device that forms an image using infrared radiation. Images formed using infrared radiation are known as thermograms. Infrared cameras operate at wavelengths between approximately 750 nm (0.75 μm) and 14,000 nm (14 μm). The wavelength region from 0.75 to 3 μm is known as the near infrared region; the region between 3 and 6 μm is known as the mid-infrared; and infrared radiation with a wavelength greater than 6 μm is known as far infrared. Each infrared camera may operate in a particular wavelength region. The one or more infrared cameras may operate using passive thermography. That is, an infrared camera is pointed at a region of the environment and a temperature map is constructed from the captured thermal image.
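The wavelength bands above can be restated as a small helper. The band names and boundaries come directly from the text; the handling of the exact boundary values (3 μm and 6 μm) is an arbitrary choice here, since the text does not say which band owns them.

```python
# Restates the infrared bands given in the text:
# near: 0.75-3 um, mid: 3-6 um, far: > 6 um.

def infrared_band(wavelength_um):
    """Name the infrared band a wavelength (in micrometres) falls in."""
    if 0.75 <= wavelength_um < 3:
        return "near infrared"
    if 3 <= wavelength_um < 6:
        return "mid-infrared"
    if wavelength_um >= 6:
        return "far infrared"
    return "not infrared"  # below the ~750 nm lower bound
```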
  • the one or more infrared cameras may operate using active thermography. That is, an energy source is first used to produce a thermal contrast between a feature of interest (e.g. an animal or a part of an animal) and the background.
  • Figure 2 illustrates a system 200 for determining whether one or more animals within an environment have a health problem.
  • the user device 108 may comprise a user interface 202 arranged to receive an input from the user 102 and operatively coupled to a controller 204.
  • the user interface 202 may comprise a display in the form of a screen and means for receiving inputs from the user.
  • the user interface 202 may comprise a touch screen, or a point-and-click user interface comprising a mouse, track pad, or tracker ball or the like.
  • the user interface 202 may comprise a microphone for receiving a voice command from the user.
  • the user interface 202 may comprise a camera, infrared detector or the like for detecting gesture commands from the user 102.
  • the user device 108 may also comprise one or more speakers for outputting audio from the user device.
  • the user device 108 comprises the controller 204 coupled to the user interface 202 in order to receive an indication of the user’s commands.
  • the controller 204 is also operatively coupled to a lighting system comprising the one or more illumination sources 106 discussed in relation to Figure 1 via a wireless transceiver 206.
  • the controller 204 is also operatively coupled to an infrared imaging system comprising the one or more infrared cameras 112 discussed in relation to Figure 1 via the wireless transceiver 206.
  • the controller 204 can thereby control the one or more infrared cameras via the wireless transceiver 206.
  • the controller has an interface for receiving infrared images from the one or more infrared cameras.
  • the interface may be internal to the controller, i.e. located within the same housing as the controller. Alternatively, the interface may be external to the controller, i.e. not located within the same housing as the controller, and operatively coupled to the controller via a wired or wireless connection. In some examples, the interface is the wireless transceiver 206.
  • the controller 204 is implemented in the form of software stored in memory and arranged for execution on a processor (the memory on which the software is stored comprising one or more memory units employing one or more storage media, e.g. EEPROM or a magnetic drive, and the processor on which the software is run comprising one or more processing units).
  • the controller 204 could be implemented in dedicated hardware circuitry, or configurable or reconfigurable hardware circuitry such as a PGA or FPGA.
  • the controller 204 may be implemented internally in a single user device 108 along with the user interface 202 and the wireless transceiver 206, i.e. in the same housing.
  • the controller 204 could, partially or wholly, be implemented externally such as on a central lighting bridge 110 or server comprising one or more server units at one or more geographic sites (not shown).
  • the controller 204 may be configured to perform some or all of the actions of the user device 108 disclosed herein.
  • the controller 204 is configured to receive the user commands via the user interface 202.
  • the controller 204 is also configured to communicate with one or more illumination sources 106 and the one or more infrared cameras 112 within the environment 100 via the wireless transceiver 206 and/or where applicable, the controller 204 is also configured to communicate with the central lighting bridge 110 or server via the wireless transceiver 208.
  • the user device 108 comprises the wireless transceiver 206 for communicating via any suitable wireless medium, e.g. a radio transceiver for communicating via a radio channel (though other forms are not excluded, e.g. an ultrasound or infrared transceiver).
  • the wireless transceiver 206 may comprise a Wi-Fi, ZigBee, Bluetooth, Thread etc. interface for communicating with the infrared cameras 112 and/or illumination sources 106. Each illumination source 106 and infrared camera 112 is configured to be able to communicate via a wireless channel, preferably a radio channel (though the possibility of other media such as visible light communications, ultrasound or infrared is not excluded). The radio channel may be based on a radio access technology such as ZigBee, Bluetooth, Wi-Fi, Thread, etc.
  • a wired connection could alternatively, or additionally, be provided between the user device 108 and the light emitting elements 106 and/or infrared cameras 112 for control purposes, e.g. an Ethernet or DMX connection.
  • the wireless transceiver 206 may communicate with the illumination sources 106 and/or infrared cameras 112 via the central lighting bridge 110 or a server, for example, over a local area network or a wide area network such as the internet.
  • the functionality of the central lighting bridge 110 or server is implemented in the form of software stored in memory and arranged for execution on a processor (the memory on which the software is stored comprising one or more memory units employing one or more storage media, e.g. EEPROM or a magnetic drive, and the processor on which the software is run comprising one or more processing units).
  • some or all of the functionality of the central lighting bridge 110 or server could be implemented in dedicated hardware circuitry, or configurable or reconfigurable hardware circuitry such as a PGA or FPGA.
  • the central lighting bridge 110 or server may be implemented locally within the environment 100 or at a remote location, and may comprise one or more physical units at one or more geographic sites.
  • the central lighting bridge 110 may comprise a wireless transceiver.
  • the wireless transceiver may comprise a Wi-Fi, ZigBee, Bluetooth, Thread etc. interface for communicating with the illumination sources 106, user device 108 and/or infrared cameras 112 over a local and/or wide area network.
  • a radio channel may be based on a radio access technology such as ZigBee, Bluetooth, Wi-Fi, Thread, etc.
  • the central lighting bridge 110 may comprise a wired connection for communicating with the illumination sources 106, user device 108 and/or infrared cameras 112.
  • the controller 204 may instead be located, in whole or in part, on the central lighting bridge 110 or server.
  • the controller 204 on the user device 108, the central bridge 110 or the server may be configured to control the illumination sources 106 and infrared cameras 112 in accordance with the following.
  • the controller 204 is configured to receive a first infrared image from one or more infrared cameras 112.
  • each infrared camera 112 may capture a respective infrared image and transmit said infrared images to the controller 204.
  • a single infrared image may be captured by a single infrared camera 112 and transmitted to the controller 204.
  • multiple infrared cameras 112 may capture a respective infrared image and those infrared images may be combined (e.g. merged, overlaid, superimposed, etc.), by one or more of the infrared cameras 112 or by the controller 204, into a single infrared image.
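The combining step described above can be sketched as follows. This is a minimal illustration rather than the patented implementation: it assumes the cameras' frames are already spatially aligned and supplied as equally sized grids of per-pixel temperature readings, and merges them by keeping the warmest reading for each pixel.

```python
def combine_infrared_frames(frames):
    """Merge aligned infrared frames (lists of rows of per-pixel
    temperature readings) into one image by keeping, for each pixel,
    the warmest reading seen by any camera."""
    rows, cols = len(frames[0]), len(frames[0][0])
    return [[max(frame[r][c] for frame in frames) for c in range(cols)]
            for r in range(rows)]

# two hypothetical 2x2 frames from different cameras (values in degrees C)
frame_a = [[21.0, 38.5], [20.0, 20.5]]
frame_b = [[22.0, 37.0], [20.0, 39.0]]
combined = combine_infrared_frames([frame_a, frame_b])
# combined == [[22.0, 38.5], [20.0, 39.0]]
```

Per-pixel maximum is only one possible merging strategy; averaging or stitching non-overlapping fields of view would serve equally well for the "combined (e.g. merged, overlaid, superimposed)" step.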
  • the infrared cameras 112 may capture an image of the environment 100 in its entirety.
  • the environment 100 may be a barn and the infrared image may capture the entire floor space of the barn.
  • the infrared cameras 112 may capture an image of only part of the environment 100.
  • the part of the environment 100 may be an area of the barn, stable or other farm building such as, for example, a coop, a pen, a drive bay, a milk house, a corral, etc.
  • the controller 204 may be configured to control the one or more infrared cameras 112 to capture the first infrared image, e.g. in response to a trigger such as, for example, a user command.
  • the infrared cameras 112 may capture a first infrared image periodically (e.g. every minute, few minutes, hour, etc.) or at predefined points in time.
  • the controller 204 is configured to control one or more illumination sources 106 to illuminate the environment 100 for a predetermined time period.
  • the illumination sources 106 may be controlled to illuminate the environment 100 in its entirety.
  • the illumination sources 106 may be controlled to illuminate only a part of the environment 100.
  • the one or more illumination sources 106 may act as a spotlight to illuminate the part of the environment 100 captured in the first infrared image, or only a portion of that part.
  • the predetermined time period may, for example, be one or more seconds. The predetermined time period may be dependent on the received first infrared image, as discussed below.
  • the controller 204 is also configured to receive a second infrared image of the environment 100 from the one or more infrared cameras 112. Similarly to the first infrared image, the second infrared image may capture all or only part of the environment 100. The controller 204 may be configured to control the infrared cameras 112 to capture the second infrared image.
  • the controller 204 comprises an image recognition algorithm configured to identify a region within the first infrared image, the region corresponding to at least part of one of the animals 104 (referred to hereinafter as “the first animal”) within the first infrared image (and therefore within the environment 100). For example, the controller may be configured to identify a temperature region and/or a thermal region corresponding to at least part of one of the animals.
  • the region may, in some examples, not correspond to part of an animal. Instead, the region may correspond to part of the environment 100. Since the amount of radiation emitted by an object increases with temperature, thermography allows one to see variations in temperature. The region may therefore be thought of as a “heat region”, wherein the infrared image shows regions emitting different amounts of heat. For example, an animal 104 may be warmer than the background environment 100, e.g. the floor of a stable, and therefore the animal 104 will stand out against the background as a region. Different parts of the animal 104 (e.g. head, body, limbs, eyes, and feet) may also appear as different regions.
  • the identified regions may correspond to a thermal shape and/or pattern. That is, the controller 204 may identify shapes or patterns in the infrared images. The shapes and/or patterns may be predetermined. For instance, the controller 204 may identify shapes and/or patterns that are known to be indicative of an animal having a physiological condition.
  • the controller 204 may be configured to determine (or identify) that the region corresponds to at least part of one of the animal(s) captured within the first infrared image. That is, the identified region may be assumed to correspond to part of an animal 104 by the very fact that a region is identified. Alternatively, the controller 204 may determine that the region has a temperature indicative of an animal, is above (greater than) or below (less than) the background temperature of the environment 100, or is at least part of an animal 104 shaped region.
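One simple way to realise the region identification described above is to threshold the thermal image against the background temperature and group the resulting "hot" pixels into connected regions. The sketch below is an illustrative assumption, not the patent's algorithm: it uses a plain 4-connected flood fill, whereas the patent leaves the image recognition algorithm unspecified.

```python
from collections import deque

def find_warm_regions(image, background_temp, margin=2.0):
    """Find 4-connected pixel regions warmer than the background by more
    than `margin` degrees; each region (a list of (row, col) coordinates)
    is assumed to correspond to at least part of an animal."""
    rows, cols = len(image), len(image[0])
    hot = [[image[r][c] > background_temp + margin for c in range(cols)]
           for r in range(rows)]
    seen = [[False] * cols for _ in range(rows)]
    regions = []
    for r in range(rows):
        for c in range(cols):
            if hot[r][c] and not seen[r][c]:
                region, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    region.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and hot[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                regions.append(region)
    return regions

# a toy 5x5 thermal image: a 20 degree floor with two warm "animals"
floor = [[20.0] * 5 for _ in range(5)]
for r, c in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    floor[r][c] = 38.0
for r, c in [(3, 3), (3, 4), (4, 3), (4, 4)]:
    floor[r][c] = 39.0
regions = find_warm_regions(floor, background_temp=20.0)
# two separate warm regions of four pixels each
```

The same routine, with the comparison inverted, would identify regions colder than the background (e.g. a dead animal that has cooled below the environment temperature).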
  • the controller identifies a region in the second infrared image that corresponds to the region identified in the first infrared image.
  • the regions may correspond in that they share a thermal characteristic, e.g. they are at the same temperature or their respective temperatures differ by less than a threshold amount.
  • the regions may correspond in that they are at least one of the same size, shape and/or pattern, or at least substantially the same size, shape and/or pattern.
  • the regions may correspond in that they are determined to correspond to at least part of the same animal.
  • the controller 204 can then determine a change in position and/or thermal characteristic of the identified region between the first and second infrared images.
  • the controller 204 determines by how much (in real or arbitrary units) the region has moved from the first infrared image to the second infrared image and/or how much a thermal characteristic has changed from the first infrared image to the second infrared image.
  • the thermal characteristic may represent a measured temperature or heat difference between the infrared images, i.e. a difference in how much infrared light is captured from one image to the next.
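The positional change between the corresponding regions can be quantified, for example, as the distance between their centroids. The sketch below is one plausible measure (in pixels, given regions as lists of pixel coordinates); the patent does not prescribe a particular distance metric.

```python
def centroid(region):
    """Mean (row, col) position of a region's pixel coordinates."""
    ys, xs = zip(*region)
    return sum(ys) / len(ys), sum(xs) / len(xs)

def position_change(region_a, region_b):
    """Euclidean distance, in pixels, between the centroids of the
    corresponding regions in the first and second infrared images."""
    (y1, x1), (y2, x2) = centroid(region_a), centroid(region_b)
    return ((y2 - y1) ** 2 + (x2 - x1) ** 2) ** 0.5

# a two-pixel region that has shifted 3 rows down and 4 columns right
change = position_change([(0, 0), (0, 1)], [(3, 4), (3, 5)])
# change == 5.0
```

A thermal change could be measured analogously, e.g. as the difference between the mean pixel temperatures of the two regions.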
  • the controller 204 determines whether or not the first animal 104 has a health problem. For example, the controller 204 may determine that the first animal 104 has a health problem if the positional change of the identified region is zero and/or the thermal change is greater than a predetermined minimum value. The minimum value may be greater than zero. From this information the controller 204 may determine that the animal 104 is dead or severely sick or injured such that it cannot move. As another example, if the positional change is small, the controller 204 may determine that the animal 104 has a health problem such as, for example, an illness or disease. As a further example, if the positional change is large, the controller 204 may determine that the animal 104 does not have a health problem.
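The decision logic just described can be sketched as a simple rule, where the threshold values are illustrative assumptions (the patent speaks only of a "predetermined minimum value" and of "small" versus "large" positional changes):

```python
def classify_animal(position_change, thermal_change,
                    move_threshold=1.0, thermal_min=0.5):
    """Decision rule sketch: zero movement (or a marked thermal change)
    suggests the animal is dead or too sick/injured to move; a small
    movement suggests a possible illness or disease; a large movement
    suggests no health problem. Thresholds are illustrative, not
    values taken from the patent."""
    if position_change == 0 or thermal_change > thermal_min:
        return "dead or severely sick/injured"
    if position_change < move_threshold:
        return "possible health problem"
    return "no health problem detected"
```

For instance, `classify_animal(0, 0)` flags an immobile animal, while an animal that moved well beyond the threshold is classified as healthy.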
  • the controller 204 may identify regions above or below a threshold temperature. For example, the controller 204 may look for regions that are above an animal’s typical body temperature. This information may be stored in memory of the user device 108 and/or accessed from the central bridge or server. In some examples, the user device 108 is configured to allow the user 102 to configure the threshold temperature.
  • the identified region may correspond to a region of a temperature above or below the average temperature within the environment 100.
  • the controller 204 may identify a region that is above the average environment 100 temperature. That is, the controller 204 may assume that a region above the average environment 100 temperature corresponds to an animal. Similarly, the controller 204 may assume that a region below the average environment temperature corresponds to an animal.
  • the identified region may correspond to a region of temperature above or below the average temperature of the one or more animals 104 within the environment 100.
  • the body temperature of the animal 104 is strongly correlated to the animal’s core temperature.
  • thermal imaging can be used to detect a status of increased or decreased temperature, thereby detecting a possible fever or other health problem(s) of the animal.
  • the controller 204 may control one or more of the illumination sources 106 within the environment 100 to emit illumination having one or more predetermined characteristics.
  • the controller 204 may control a particular subset of the illumination sources 106 (e.g. a single illumination source) to emit illumination at a given time, or the controller 204 may control all of the illumination sources 106 to emit illumination at the same time.
  • the illumination sources 106 may be controlled to emit illumination having a predetermined brightness level, luminous intensity, luminous flux, and/or luminance.
  • the illumination sources 106 may additionally or alternatively be controlled to emit illumination having a predetermined colour. That is, the illumination may have a predetermined frequency, wavelength or spectrum.
  • each illumination source may have a multi-colour LED (e.g. an RGB LED).
  • the illumination sources 106 may be controlled to emit illumination having a predetermined temperature (or colour temperature).
  • the illumination may have a colour temperature over 5000 K (so-called “cool colours”, like bluish white), or a colour temperature between 2400 and 3000 K (so-called “warm colours”, like yellowish white through to red).
  • the illumination may have a colour temperature ranging from 1700 to 6000 K.
  • the controller may control the illumination sources 106 to emit illumination having a predetermined static or dynamically varying spatial pattern. That is, the illumination may have different characteristics at different points in space and/or time. Furthermore, the illumination may be emitted for a predetermined duration.
  • the duration may be less than the time period between the first and second infrared images being captured. That is, the first infrared image may be captured at time T0, and the second infrared image may be captured at time T1.
  • the controller 204 may control the one or more infrared cameras 112 to capture the second infrared image within a predetermined time period of the illumination of the environment 100. That is, within a predetermined time period of the start of the illumination (i.e. the point in time at which the illumination sources 106 are controlled to illuminate the environment 100) or within a predetermined time period of the end of the illumination (i.e. the point in time at which the illumination sources 106 stop illuminating the environment 100, either completely or with the illumination described above).
  • the second infrared image may be time-linked to the initiation of the illumination of the environment 100 (e.g. captured one second after the initiation, or within three seconds of the initiation).
  • the second infrared image may be captured one second after the end of the predetermined time period of illumination.
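The capture sequence described above (first image, illumination period, second image time-linked to the end of the illumination) can be sketched as follows. The `camera` and `lights` objects with `capture()`, `on()` and `off()` methods are assumptions for illustration; the patent does not define a device API.

```python
import time

def capture_sequence(camera, lights, illumination_s=2.0, delay_s=1.0):
    """Sketch of the sequence above: capture a first infrared image,
    illuminate the environment for a predetermined period, then capture
    a second image a fixed delay after the illumination ends, so the
    second image is time-linked to the end of the illumination."""
    first = camera.capture()
    lights.on()
    time.sleep(illumination_s)   # predetermined illumination period
    lights.off()
    time.sleep(delay_s)          # e.g. one second after illumination ends
    second = camera.capture()
    return first, second
```

Linking the delay to the start of the illumination instead would only require moving the second `capture()` call relative to `lights.off()`.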
  • determining a change in position of the identified region between the first and second infrared images may comprise the controller 204 determining whether the change in position of the identified region is less than a threshold amount. If the change in position is less than the threshold amount, the controller 204 may control one or more infrared cameras 112 to capture a third infrared image of the environment 100. For example, if the change in position is zero units (e.g. zero pixels, zero centimetres, zero metres, etc.) and the threshold is any number above zero, a third infrared image may be captured. Similarly, if the threshold is one metre and the change in position is less than one metre, the controller 204 may trigger a third infrared image to be captured. These examples may correspond to a health problem such as, for example, an animal 104 being dead or sick. In some examples, if the change in position is zero, the animal 104 is presumed to be dead and a third image is not captured.
  • the third infrared image may have a higher level of detail than the first and/or second infrared images.
  • a different type of infrared camera 112 may be used to capture the third infrared image compared to the type of infrared camera 112 used to capture the first and/or second infrared images.
  • the third infrared image may have a greater resolution and/or a greater image sharpness.
  • the one or more infrared cameras 112 may be controlled to capture an infrared image (the third infrared image) of the head of an animal. That is, the controller 204 may determine where the head of the animal 104 is within the environment 100 and control the infrared camera(s) to capture an image at the corresponding position. Alternatively, the infrared camera(s) may capture an infrared image of the animal 104 in general, and the portion of the image corresponding to the head of the animal 104 may be enlarged, focused on, etc.
  • the controller 204 may identify one or more regions within the third infrared image. For example, a region may correspond to the head of an animal, the body of an animal, the limbs of an animal, etc. The regions may correspond to regions of temperature above or below the average temperature within the third infrared image, and/or above or below the average temperature of the animal 104 captured within the third infrared image. The controller 204 may also identify regions above or below a threshold temperature. The controller 204 may then determine whether or not an animal 104 has a health problem based on the one or more regions that are identified in the third infrared image. Identifying one or more regions within the third infrared image may comprise identifying one or more regions on the head of the first animal.
  • the controller 204 may first identify a region within the third infrared image corresponding to the head of the animal 104 and then identify regions on the identified head of the animal. For example, basic image recognition techniques may be used to identify an object in an image that corresponds to a (animal) head.
  • the controller 204 may determine whether or not an animal 104 has a health problem based on the one or more regions that are identified on the head of the animal 104 in the third infrared image. For example, the controller 204 may determine that the animal 104 has a health problem if a region is identified that coincides with one, more, or all of the nose, mouth, eyes, and beak (if applicable). For example, a low temperature region may be identified on or around the nose or beak of an animal. Here, “low temperature” may mean below the average temperature of the animal’s head, or below a threshold temperature. A (low) temperature region that coincides with being on or around the nose, for example, may be caused by mucus outflow from an orifice in the animal’s head, e.g. the nose or beak.
  • Figure 3A is an example of infrared images taken (middle and bottom) of chickens using an infrared camera.
  • the top image is a visible light image of the same chickens. Even with a low resolution camera, the images show that hot spots (i.e. temperature regions) are easily detected, e.g. on the head and body. The images also reveal that it is easy to detect chickens having different amounts of feathers.
  • the chicken on the left hand side of the images in Figure 3A (marked by reference numeral 302) is a healthy chicken with many feathers and thus appears darker in the images.
  • the chicken on the right hand side of the images in Figure 3A (marked by reference numeral 304) is in a molting phase and thus appears lighter in the images, with several different (temperature) regions captured.
  • This information may be used to deduce, e.g. in a stable or coop, if the chicken has a physiological condition that causes molting.
  • Figure 3B is another example of infrared images taken (middle and bottom) of chickens using an infrared camera.
  • the top image is a visible light image of the same chickens. Again, even with a low resolution camera, the images show that hot spots (i.e. temperature regions) are easily detected, e.g. on the head 308 and legs 306.
  • the controller 204 may control one or more microphones to monitor for sounds indicative of an animal 104 suffering from a health problem.
  • a microphone array may be used to listen in on an animal’s location, e.g. using beam forming.
  • the microphone(s) may monitor for wheezing, sneezing, coughing or gurgling. This may provide an auditory indication and/or verification of an animal 104 having a health problem such as, for example, a respiratory disease or infection.
  • the controller 204 comprises a sound recognition algorithm arranged to detect sounds, such as those mentioned above, via the one or more microphones.
  • the detection of sounds indicative of an animal 104 suffering from a health problem may be used to trigger the infrared camera(s) to capture the first infrared image of the environment 100. For example, if a wheezing sound is detected, the controller 204 may control the infrared camera(s) to capture the first infrared image. The infrared image may be captured by at least one infrared camera 112 directed towards the source of the wheezing sound. Additionally or alternatively, the detection of such sounds may be used by the controller 204 to verify (or at least increase the confidence of) a determination of an animal 104 having a health problem. For example, if an animal 104 is determined to have a health problem (e.g. based on the infrared images), the controller 204 may control the microphone(s) to monitor for sounds indicative of a health problem. If those sounds are detected, the controller 204 can be confident that the determination is correct (this may feed into a confidence score displayed to a user 102, as described below).
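The confidence score mentioned above could combine the thermal and acoustic evidence along the following lines. The weights and thresholds are purely illustrative assumptions; the patent only states that detected sounds may increase confidence in the determination.

```python
def condition_confidence(position_change, sound_detected,
                         move_threshold=1.0):
    """Illustrative confidence score: low movement after the light
    stimulus is the primary indicator, and a detected respiratory
    sound (wheezing, gurgling, etc.) raises confidence. The weights
    here are assumptions, not values from the patent."""
    score = 0.0
    if position_change == 0:
        score += 0.6          # no movement at all: strong indicator
    elif position_change < move_threshold:
        score += 0.4          # little movement: weaker indicator
    if sound_detected:
        score += 0.3          # auditory verification
    return min(score, 1.0)
```

A user interface could then display, say, `condition_confidence(0, True)` as a high-confidence alert, while a moving, silent animal scores zero.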
  • the controller 204 may also control the one or more infrared cameras 112 to take one or more additional infrared images of the environment 100, each after a subsequent time period of illumination. That is, the one or more illumination sources 106 may be controlled to illuminate the environment 100 for sequential time periods and an infrared image is captured after each time period of illumination.
  • Each time period of illumination may have a different characteristic from the immediately previous time period of illumination.
  • the characteristic may be one or more of a predetermined brightness level, a predetermined colour, and a predetermined temperature.
  • the characteristic may also be the presence or absence of illumination.
  • the first and third time periods may correspond to the one or more illumination sources 106 emitting illumination, whilst the second and fourth time periods may correspond to the one or more illumination sources 106 not emitting illumination.
  • every other period of illumination may comprise bright illumination, whilst the periods of illumination in between may comprise softer illumination.
  • the controller 204 may determine a change in position of the identified region across the multiple captured infrared images. This may help to prevent incorrectly determining that an animal 104 is sick. For example, if an animal 104 does not move (much or at all) after one time period of illumination, the animal 104 may simply still be in the process of waking up from its sleep. However, if the animal 104 does not move (much or at all) after several short bursts of bright light, the animal 104 is likely to have health problem.
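The false-positive guard described above can be sketched as requiring sustained immobility across several bursts. The burst count and movement threshold are illustrative assumptions:

```python
def is_likely_sick(position_changes, move_threshold=1.0, min_bursts=3):
    """Flag an animal only if it stays (nearly) stationary across
    several successive illumination bursts, so an animal that is merely
    slow to wake after the first burst is not misclassified. Threshold
    and burst count are illustrative assumptions."""
    if len(position_changes) < min_bursts:
        return False
    return all(change < move_threshold
               for change in position_changes[-min_bursts:])
```

For example, an animal that barely moved after three bursts would be flagged, while one that moved strongly after any of the last bursts would not.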
  • the controller 204 may cause an alert to be output to a user 102.
  • the alert may be a visual alert displayed on a user interface 102 (or display screen) of the user device 108.
  • the alert may be an audio alert played out by a speaker of the user device 108, and/or by an external speaker, e.g. one connected to the user device 108 via a wired or wireless connection.
  • the controller 204 may cause the animal 104 to be visually identified on the user interface 102 (or display screen) of the user device 108. That is, the user interface 102 may show a (live) image of the environment 100, with the sick animal 104 identified on the image.
  • one or more illumination sources 106 may be used to visually identify the animal 104 with the health problem. That is, an illumination source may illuminate the animal, or part of the environment 100 in which the animal 104 is situated.
  • the animal 104 may be placed under a bright spot light, (or at least illuminated with different illumination compared to the rest of the environment 100, e.g. red light) so that the user 102 can easily locate the animal.
  • Alerting the user 102 and/or visually identifying the animal 104 allows the user 102 to take action to treat the animal 104 and/or to remove the animal 104 from the environment 100.
  • Early identification of a health problem allows for the early treatment of the health problem. This is not only beneficial for the identified animal 104 but also helps to prevent the health problem (e.g. a disease) from spreading to the other animals 104 in the population.
  • the controller 204 may, in response to determining that one or more animals have a physiological condition, control one or more luminaires to illuminate a first area within the environment with illumination being different to that illuminating a second area within the environment that contains the animal having the condition. For example, a spotlight may be put on an area away from a sick animal (i.e. an area more than a predetermined minimum distance from the sick animal may be illuminated with predetermined lighting characteristics).
  • the illumination may have a predetermined brightness level, a predetermined colour, a predetermined temperature, a predetermined spatial pattern, and/or a predetermined duration.
  • Illuminating an area or areas away from the sick animal may result in healthy animals avoiding the location of the sick animal so that the chance of potential infections is minimized and/or the user (e.g. farmer or caretaker) can easily identify and remove the animal from the environment without disturbing the healthy population.
  • the environment 100 may contain one or more dispensers 114, as shown in Figure 1.
  • the dispenser 114 may dispense medicine, water, and/or food.
  • the controller 204 may be operatively coupled to the dispenser 114 such that it can control the dispenser 114 to release whatever substance it contains, or even a particular substance if the dispenser 114 contains multiple types of substance.
  • the controller 204 may control the dispenser 114 to release a substance in response to determining that an animal 104 has a health problem.
  • the controller 204 may, upon determining that an animal 104 has a health problem, release a (particular) medicine into the environment 100. The medicine released may be based on the assumed health problem, e.g. based on the presence or absence of mucus in the third infrared image.
  • the controller 204 may be configured to adapt one or more environmental conditions in the environment 100.
  • the controller 204 may adapt the temperature, humidity and/or ventilation within the environment 100 in order to help treat an animal’s health problem and/or to prevent the spread of a disease or infection.
  • the controller 204 may adapt the one or more environmental conditions in response to determining that an animal 104 has a health problem.
  • the controller 204 may be operatively coupled to the necessary components for adapting a given environmental condition (e.g. to one or more heating elements, cooling elements, air conditioning units, ventilation systems, etc.).
  • the present invention uses a thermal imaging system which may cover the total environment 100 area and is able to capture and analyse, to a sufficient level of detail, a thermal image of an animal 104 such as a chicken.
  • a lighting system is controlled to project (intense) illumination onto the animals 104.
  • a control system may, automatically, cause a sequence of thermal imaging and light actuation actions to be performed in order to detect and/or diagnose a medical condition of one or more animals 104 within the environment 100.
  • a user interface 102 may provide a user 102 (e.g. a stable manager) data on sick animals 104, allowing the user 102 to act (e.g. remove the sick animal 104 from the population, provide medicine therapy to prevent outbreak of an epidemic as originating from an initial individual disease case, etc.).
  • the invention allows for the detection of specific disease types, e.g. respiratory infections, which are typical, cumbersome and contagious, especially in chickens.
  • respiratory infections can not only have a high impact on the wellbeing of the animals 104 and the productivity of the breeding, but are also difficult to diagnose in a dense stable population.
  • manual visual inspection of all animals 104 in a dense population is practically impossible.
  • One symptom of respiratory infections is mucus outflow (or “snot”) from the beak.
  • infection systemic responses will also lead to a raised body temperature (body temperature is typically strongly correlated to an animal’s core temperature).
  • Infrared thermal imaging can be used to detect a status of increased temperature which is an indication of a fever (as a possible symptom of infection) in birds.
  • the search for sick (e.g. respiratory disease suffering) birds can be done in a sequence of actions.
  • thermal imaging is used for detecting birds with a raised temperature. Such a scan is also useful to detect birds with lowered body temperature (indicating a dying or dead bird). Sick (and of course also dead) birds will also show low levels of movement (therefore dynamic thermal imaging can give a first insight into activity levels and hence the status of homeostasis or disruption thereof).
  • acoustic monitoring can be used to check for deviating behaviour (checking for gurgling and wheezing) to trigger the search for sick individuals.
  • an actuation step is performed to check the alertness of the bird.
  • a sick bird will show lethargy and will be less responsive to triggers.
  • Triggering the bird with a short burst of high intensity light, as provided by the connected and controllable light system will alert and wake up birds and lead to agitation and movement in a healthy bird.
  • thermal image capturing of nose, beak and eyes areas of the bird will allow for the detection of snot (i.e. mucus outflow).
  • a microphone array can be used to listen in on the bird location via beam forming to enable auditory verification (checking for gurgling and wheezing in individual birds).
  • a manual or automated response action can be taken, either by a stable operator or by an automated system, such as e.g. the removal of the bird and/or by starting population- level medicinal treatment to prevent outbreak of the disease to the flock.
  • a computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems. Any reference signs in the claims should not be construed as limiting the scope.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Primary Health Care (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Epidemiology (AREA)
  • Environmental Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Animal Husbandry (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Housing For Livestock And Birds (AREA)

Abstract

A controller for determining whether an animal has a physiological condition, wherein the controller has an interface for receiving, from infrared cameras, first and second infrared images of the environment, wherein the controller is configured to: after receiving the first infrared image and before receiving the second infrared image, control illumination sources to illuminate the environment for a predetermined time period; identify a first region within the first infrared image, wherein the identified first region comprises an infrared image of at least part of an animal; identify a second, corresponding region within the second infrared image; determine a change in position and/or temperature between the identified first and second regions in the first and second infrared images; and determine whether or not the animal has a physiological condition based on the determined change in position and/or temperature between the identified regions in the first and second infrared images.

Description

CONTROLLER FOR DETECTING ANIMALS WITH PHYSIOLOGICAL CONDITIONS
TECHNICAL FIELD
The present disclosure relates to a controller and computer program for determining whether or not an animal has a physiological condition.

BACKGROUND
In general, agriculture is moving more and more towards highly artificial settings. In particular, the cultivation and breeding of animals often involves the use of large size stables (or barns, coops, pens, etc.). The density and numbers of animals in these stables leads to increased issues with animal epidemiology, with a higher likelihood of fast-spreading diseases amongst those animals. Poultry, especially chickens, are quite susceptible to respiratory infections. Dense populations of chickens can lead to a fast spread of an infection throughout the population of chickens, e.g. within a stable. This is further worsened by certain conditions such as, for example, high stress levels which often occur in large, dense groups of animals.
As compared to former methods, in which farmers could easily track the health status of their animals, the use of industrial methods leads to highly populated stables that do not allow for the easy or fast monitoring of the animals. For example, a single chicken farm alone may have several hundred, or sometimes thousand, chickens. For a farmer to check the condition of each chicken individually each day would be laborious and time-consuming, if not impossible. Furthermore, the monitoring of individual animals with an on-body sensing system would be very expensive (costs arising from the purchase of new sensors and the repair or replacement of damaged sensors) and arduous in practice (for example, a farmer would be required to fit a sensor to each individual animal).
Therefore more automated, connected systems are required, in order to safeguard the health, wellbeing and productivity of the animal population.
US 5,474,085 A discloses a method and apparatus for remote sensing of livestock, using a thermographic image sensing system, in order to determine one or more of a number, weight, location, temperature, carcass pH, etc., of animals in a surveillance area. WO 2015/030611 A1 discloses a method and apparatus for determining at least one characteristic of respiration of a non-human animal. A plurality of thermal infrared images of the animal are obtained using at least one thermographic camera. The images capture a region of interest relating to respiration of the animal. Each image is analysed to obtain data relating to respiration of the animal. At least one characteristic of respiration of the animal is determined based at least in part on the data obtained from the images.
SUMMARY
According to a first aspect disclosed herein, there is provided a controller for determining whether one or more animals within an environment have a physiological condition. The controller has an interface for receiving, from one or more infrared cameras, a first infrared image of the environment and a second infrared image of the environment. The controller is configured so as, after receiving the first infrared image and before receiving the second infrared image, to control one or more illumination sources to illuminate the environment for a predetermined time period. The controller is further configured to identify a first region within the first infrared image, wherein the identified first region comprises an infrared image of at least part of a first animal; and to identify a second, corresponding region within the second infrared image. The controller is configured to then determine a change in position and/or thermal characteristic between the identified first and second regions in the first and second infrared images; and determine whether or not the first animal has a physiological condition based on the determined change in position and/or thermal characteristic between the identified first and second regions in the first and second infrared images.
For example, this may be performed using a computer vision algorithm like object tracking, where one or more of the location, pose, shape, and visual appearance of an object may be tracked in successive infrared images. From these features, information including motion, shape deformations, appearance changes, and orientations (of parts) of an object (i.e. an animal) may be obtained. Here, a change in position of a region in the first and second infrared images includes a change in location of the region, a change in pose of the region, and a change in shape of the region.
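As an illustrative sketch only (not the claimed implementation), the change in location of a tracked region between two thermal frames can be estimated by thresholding the warm pixels and comparing region centroids. The frame representation (2D lists of temperature values) and the threshold value are assumptions for illustration:

```python
def region_centroid(frame, threshold):
    """Centroid (row, col) of all pixels warmer than `threshold`.

    `frame` is a 2D list of temperature values; returns None if no
    pixel exceeds the threshold.
    """
    rows = cols = count = 0
    for r, row in enumerate(frame):
        for c, temp in enumerate(row):
            if temp > threshold:
                rows += r
                cols += c
                count += 1
    if count == 0:
        return None
    return (rows / count, cols / count)

def position_change(first, second, threshold=30.0):
    """Euclidean distance (in pixels) between the region centroids
    of two successive infrared frames, or None if no region is found."""
    a = region_centroid(first, threshold)
    b = region_centroid(second, threshold)
    if a is None or b is None:
        return None
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
```

A real system would track pose and shape as well as location (e.g. via connected-component labelling or a trained tracker); the centroid shift shown here captures only the simplest of the position changes the text describes.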
An advantage of this is that the illumination provided by the illumination sources acts as a stimulus to the animals, causing the animals to move (e.g. the illumination may cause the animals to wake up). Based on the movement of the animal between the first and second infrared images, a determination can be made as to whether the animal does indeed have a physiological condition (e.g. a health problem such as, for example, an illness, injury or death, or balding, shedding of feathers, etc., which may indicate a health problem). For example, animals showing a small degree of movement (e.g. those that are lethargic) are more likely to be ill. In addition, zero movement of an animal between the time that the first and second infrared images are taken may indicate that the animal has died. Moreover, animals who are identified in the first image (e.g. with a region indicating that the animal may have a physiological condition) and show a large degree of movement after the illumination stimulus may be deemed healthy. This advantageously prevents healthy animals from being diagnosed with a physiological condition, allowing, for example, a farmer to focus on those animals deemed to have a physiological condition, e.g. a disease.
As another example, a change in thermal characteristic (or thermal signature) such as, for example, temperature, can also be used to detect physiological conditions. That is, a change in e.g. temperature between the corresponding regions in the first and second images may indicate that the first animal is suffering from a physiological condition. For example, a change in temperature may be caused by a change in orientation of the wings of a winged animal as the feathers would shield more of the animal’s body heat. As another example, skin temperature in hens measured following a mild or more severe acute stressor (i.e. the illumination) may be detected. The temperature of thermoregulatory tissues can temporarily drop under acute stress, and the magnitude of this skin temperature change may reflect acute stressor intensity. The measured temperature increase or decrease may be used to identify a physiological condition.
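The temperature-change variant can be sketched similarly: compare the mean temperature of the corresponding regions in the two images and flag a change whose magnitude exceeds a threshold. The 1.5 °C figure and the coordinate-list region representation are illustrative assumptions, not values from this disclosure:

```python
def mean_region_temperature(frame, region):
    """Mean temperature over `region`, a list of (row, col) pixel coordinates."""
    temps = [frame[r][c] for r, c in region]
    return sum(temps) / len(temps)

def thermal_change(first, second, region, magnitude_threshold=1.5):
    """Return (delta, flagged): the signed temperature change of the region
    between the two frames, and whether its magnitude exceeds the threshold.

    A drop can reflect acute stress in thermoregulatory tissue; a rise or
    fall beyond the threshold may indicate a physiological condition.
    """
    delta = (mean_region_temperature(second, region)
             - mean_region_temperature(first, region))
    return delta, abs(delta) > magnitude_threshold
```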
In embodiments, the identified regions may be thermal (e.g. temperature) regions.
In embodiments, the identified regions may each correspond to at least one of: a predefined thermal pattern, a predefined thermal shape or predefined thermal area.
In embodiments, the controller may be configured to identify that the identified region corresponds to at least part of a first one of the animals within the first infrared image.
In embodiments, the first and second infrared images may be thermal images.
In embodiments, the one or more animals may be poultry.
In embodiments, said identified region may correspond to a region of either:
(i) a temperature above (i.e. greater than) the average temperature within the environment, or (ii) a temperature below (i.e. less than) the average temperature within the environment. In embodiments, said identified region may correspond to a region of either:
(i) a temperature above (i.e. greater than) the average temperature of the one or more animals within the environment, or (ii) a temperature below (i.e. less than) the average temperature of the one or more animals within the environment.
In embodiments, the controller may be configured to perform said controlling of the one or more illumination sources to illuminate the environment by: controlling the one or more illumination sources to illuminate the environment with illumination having one, more, or all of the following characteristics: (i) a predetermined brightness level, (ii) a predetermined colour, (iii) a predetermined temperature, (iv) a predetermined spatial pattern, and/or (v) a predetermined duration.
In embodiments, the controller may be configured to control the one or more infrared cameras to capture the second infrared image of the environment within a predetermined time period of said illumination of the environment.
In embodiments, the controller may be configured to determine whether the change in position between the identified first and second regions is less than a threshold amount and, if the change in position is determined to be less than the threshold amount, control one or more infrared cameras to capture a third infrared image of the environment.
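A hypothetical control flow for this embodiment could look like the following; the movement threshold and the camera-capture callback are assumptions for illustration, not part of the disclosed interface:

```python
def assess_and_escalate(change_in_position, movement_threshold, capture_third_image):
    """If the animal moved less than the threshold after the light stimulus,
    request a third (e.g. close-up) infrared image via the supplied
    capture callback.

    Returns the third image, or None when no escalation is needed.
    """
    if change_in_position < movement_threshold:
        return capture_third_image()
    return None
```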
An advantage of this is that a (more detailed) infrared image may be captured of those animals who are deemed likely to be suffering from a physiological condition such as, for example, a health problem. This third image may therefore offer a better insight into the particular type of physiological condition that the animal is experiencing. For example, an animal having a condition such as, for example, balding may be detected. The third infrared image may provide a more accurate image of this animal. From this third image, a farmer, for example, may be able to determine if the balding is due to or indicative of a more serious problem such as an underlying health issue.
In embodiments, the controller may be configured to perform said controlling of the one or more infrared cameras to capture the third infrared image of the environment by controlling the one or more infrared cameras to capture an infrared image of the head of the first animal.
In embodiments, the controller may be configured to: identify one or more regions within the third infrared image; and determine whether or not the first animal has a physiological condition based on the identified one or more regions within the third infrared image. In embodiments, the controller may be configured to perform said identifying of the one or more regions within the third infrared image by identifying one or more regions on the head of the first animal.
In embodiments, the controller may be configured to perform said determining of whether or not the first animal has a physiological condition based on the identified one or more regions within the third infrared image by: determining if the one or more regions coincide with at least one of the (i) nose, (ii) mouth, and (iii) eyes of the first animal in the third infrared image.
For example, the controller may determine from the third infrared image, based on regions around, say, the nose (or beak if applicable) of the animal, that the animal’s nose is leaking nasal mucus. This is an indicator that the animal may have a respiratory infection.
In embodiments, the controller may be configured to: control one or more microphones to monitor for sounds indicative of an animal within the environment having a physiological condition, and control one or more infrared cameras to capture said first infrared image of the environment in response to detecting a sound indicative of an animal within the environment having a physiological condition.
In embodiments, the controller may be configured to: control one or more illumination sources to illuminate the environment for multiple sequential time periods, each time period of illumination having a different characteristic from the previous time period; receive an infrared image of the environment after one or more of the multiple sequential time periods of illumination; determine a change in position between the identified regions between the one or more infrared images; and determine whether or not the first animal has a physiological condition based on the determined change in position between the identified region between the one or more infrared images.
In embodiments, the controller may be configured to cause an alert to be output to a user if the first animal is determined to have a physiological condition.
In embodiments, the controller may be configured to cause a visual indication to be displayed, on a user interface of a user device, if the first animal is determined to have a physiological condition, wherein the visual indication identifies the first animal.
Advantageously, this allows the farmer to easily find the animal (e.g. from amongst a large number of near identical animals) experiencing the physiological condition in order to treat that particular animal, e.g. by providing medicine. In embodiments, said causing an alert to be output to the user may comprise at least one of: (i) causing a visual alert to be displayed on a user interface of the user device, and (ii) causing an audio alert to be output from a speaker within the environment.
In embodiments, if the first animal is determined to have a physiological condition, the controller may be configured to control the one or more luminaires to illuminate an area in the environment that is free from the first animal (or any animal identified as having a physiological condition). An advantage of this is that the healthy animals are directed away from the sick animal to minimize the chance of e.g. an infection spreading.
According to a second aspect disclosed herein, there is provided a computer program product for determining whether one or more animals within an environment have a physiological condition, the computer program product comprising code embodied on one or more computer-readable storage media and configured so as when executed on one or more processors to perform operations of: receiving, from one or more infrared cameras, a first infrared image of the environment; controlling one or more illumination sources to illuminate the environment for a predetermined time period; receiving, from one or more infrared cameras, a second infrared image of the environment; identifying a first region within the first infrared image, wherein the identified first region comprises an infrared image of at least part of a first animal; identifying a second, corresponding region within the second infrared image; determining a change in position and/or thermal characteristic between the identified first and second regions in the first and second infrared images; and
determining whether or not the first animal has a physiological condition based on the determined change in position and/or thermal characteristic between the identified first and second regions in the first and second infrared images.
According to a third aspect disclosed herein, there is provided a system for determining whether one or more animals within an environment have a physiological condition, wherein the system comprises: one or more infrared cameras; one or more illumination sources; and a controller, wherein at least one infrared camera is configured to capture a first infrared image of the environment; wherein at least one illumination source is configured to illuminate the environment for a predetermined time period; wherein at least one infrared camera is configured to capture a second infrared image of the environment; and wherein the controller is configured to: identify a first region within the first infrared image, wherein the identified first region comprises an infrared image of at least part of a first animal; identify a second, corresponding region within the second infrared image; determine a change in position and/or thermal characteristic between the identified first and second regions in the first and second infrared images; and determine whether or not the first animal has a physiological condition based on the determined change in position and/or thermal characteristic between the identified first and second regions in the first and second infrared images.
In embodiments, the system may comprise a dispenser configured to dispense one or more substances, and wherein the controller is configured to control the dispenser to automatically dispense at least one substance in response to determining that the animal has a physiological condition.
In embodiments, the at least one substance that the dispenser is configured to release may comprise a medicine.
BRIEF DESCRIPTION OF THE DRAWINGS
To assist understanding of the present disclosure and to show how embodiments may be put into effect, reference is made by way of example to the
accompanying drawings in which:
Fig. 1 shows schematically an environment comprising a system for determining whether one or more animals within an environment have a physiological condition, and
Fig. 2 shows schematically an example block diagram of a control system for controlling the system, and
Figs. 3A and 3B show examples of infrared images of poultry showing several regions corresponding to features of the poultry.
DETAILED DESCRIPTION
Modern cultivation and breeding of animals, in particular industrial-scale agriculture, requires automated, accurate systems to detect unhealthy animals. Previous systems do not allow for physiological conditions (e.g. health problems) in large, dense animal populations, like those found in chicken farm stables for example, to be easily detected at the individual animal level.
Embodiments of the present invention provide an automated, efficient and selective system for determining whether or not an animal within an environment has a physiological condition based on the level of movement of regions between infrared images captured at different points in time, whereby the animal is subjected to a light stimulus between the captured images. By having an automated system for detecting the occurrence of physiological conditions such as, for example, respiratory diseases, a fast response to the physiological condition (or disease) can be realised. This may allow for the disease to be contained more quickly and easily by, for example, the administration of medicine at the individual or population level, or by changing the living conditions in the animal population. Furthermore, the automatic identification and treatment or removal of sick and/or dead animals helps to prevent the spread of disease to healthy neighbouring animals. As another example, infrared images can readily detect liquids and moisture. This information can be used to assess the state of living conditions such as, for example, flooring or bedding in the environment. For example, the infrared images may detect animals with a physiological condition of wet skin, fur, feathers, etc. (e.g. on their body). If the animals are too wet because their bedding is moist, for example, this can lead to an increased health risk. By identifying this problem, the living conditions can be improved.
Whilst the invention will be described in examples relating to poultry and chickens in particular, the invention has wider usage to a variety of animals. The invention may be used to identify physiological conditions in, for example, cattle, sheep, pigs, etc. Herein, example techniques for determining if an animal has a health problem equally apply to techniques for determining if an animal has a physiological condition, such as, for example, molting, balding, etc.
Figure 1 illustrates an example environment 100 in which embodiments disclosed herein may be employed. The environment 100 is a space that may be occupied by one or more users 102 and/or one or more animals. The one or more animals may be poultry, that is, they may be any domesticated bird kept by humans for their eggs, their meat or their feathers, including chicken, duck, goose, partridge, pigeon, quail, turkey, Guinea fowl, pheasant, etc. Alternatively, the one or more animals may be farm animals such as, for example, goats, horses, donkeys, pigs, cows, etc. The one or more animals may in general be any vertebrate other than a human. That is, the animals are non-human animals.
The environment 100 may take the form of an indoor space such as one or more rooms of a home, office or other building such as a barn or farmhouse; an outdoor space such as a garden or park; a partially covered space such as a gazebo; or a combination of such spaces such as a farm comprising both indoor and outdoor spaces.
The environment 100 is equipped with a plurality of illumination sources (or light emitting elements, light sources, luminaires) 106 installed or otherwise disposed at different locations throughout the environment 100. An illumination source 106 may refer to any kind of light emitting device for illuminating an environment or part of the environment occupied by an animal 104, whether providing ambient lighting or task lighting. A human user 102 may also be present in the environment 100, or elsewhere outside the environment 100 in question. Each of the illumination sources 106 may take any of a variety of possible forms, such as a ceiling or wall mounted luminaire, a free-standing floor or table light source 106, a light source 106 mounted on a pole, gantry or rigging; or a less traditional form such as a light source 106 embedded in a surface or item of furniture (and the different light sources 106 in the environment 100 need not take the same form as one another).
Alternatively, the illumination sources may take the form of head mounted glasses or light emitting contact glasses. Whatever form it takes, each light emitting element 106 comprises at least one lamp (light element) and any associated housing, socket and/or support.
Examples of suitable lamps include LED-based lamps, or traditional filament bulbs or gas discharge lamps.
In some scenarios the environment 100 may be divided into a plurality of different zones or localities (not shown), such as different rooms, each illuminated by a different respective subset of one or more of the illumination sources 106. For example, the different zones may relate to different sections of a farmhouse or stable, e.g. a chicken coop.
The environment 100 may also be equipped with one or more user devices 108. For example, each zone or locality may comprise a single respective user device 108. Alternatively, each zone or locality may comprise more than one respective user device 108. The user device 108 may be, for example, a mobile device including mobile or cell phones (including so-called "smart phones"), personal digital assistants, pagers, tablet and laptop computers and wearable communication devices (including so-called "smart watches").
One or more of the user devices 108 may comprise a lighting control device. Each of the lighting control devices may take the form of a stand-alone lighting control device such as a smart light switch, a dimming switch, etc. or alternatively a lighting control device integrated in another user device 108 such as a mobile user terminal such as a smartphone or tablet, or even a wearable device that can be worn about the user's person. For example, the user terminal may be installed with a suitable lighting control app. The lighting control device can be mains powered, battery powered, or use energy-harvesting techniques to supply its energy. The lighting control device is configured to be able to control the light emitted by one or more illumination sources 106 in the environment 100. This may include switching the illumination sources 106 on/off, controlling the colour of the light, controlling the dimming level, controlling a time-varying effect of the light, controlling a spatially-varying effect of the light or adjusting any other aspects of the light that may be applicable to the illumination sources 106 within the environment 100. The environment 100 may also be equipped with a central lighting bridge 110 or server.
The environment is also equipped with one or more infrared cameras 112. An infrared camera 112 (also known in the art as a thermographic camera or thermal imaging camera) is a device that forms an image using infrared radiation. Images formed using infrared radiation are known as thermograms. Infrared cameras operate in wavelengths of between approximately 750 nm (0.75 µm) and 14,000 nm (14 µm). The wavelength region which ranges from 0.75 to 3 µm is known as the near infrared region. The region between 3 and 6 µm is known as the mid-infrared, and infrared radiation which has a wavelength greater than 6 µm is known as far infrared. Each infrared camera may operate in a particular wavelength region. The one or more infrared cameras may operate using passive thermography. That is, an infrared camera is pointed at a region of the environment and from the captured thermal image a temperature map is constructed. Alternatively, the one or more infrared cameras may operate using active thermography. That is, an energy source is first used to produce a thermal contrast between a feature of interest (e.g. an animal or a part of an animal) and the background.
Figure 2 illustrates a system 200 for determining whether one or more animals within an environment have a health problem. As shown in Figure 2, the user device 108 may comprise a user interface 202 arranged to receive an input from the user 102 and operatively coupled to a controller 204. The user interface 202 may comprise a display in the form of a screen and means for receiving inputs from the user. For example, the user interface 202 may comprise a touch screen, or a point-and-click user interface comprising a mouse, track pad, or tracker ball or the like. Alternatively or additionally, the user interface 202 may comprise a microphone for receiving a voice command from the user. In another example, the user interface 202 may comprise a camera, infrared detector or the like for detecting gesture commands from the user 102. The user device 108 may also comprise one or more speakers for outputting audio from the user device.
The user device 108 comprises the controller 204 coupled to the user interface 202 in order to receive an indication of the user’s commands. The controller 204 is also operatively coupled to a lighting system comprising the one or more illumination sources 106 discussed in relation to Figure 1 via a wireless transceiver 206. The controller 204 is also operatively coupled to an infrared imaging system comprising the one or more infrared cameras 112 discussed in relation to Figure 1 via the wireless transceiver 206. The controller 204 can thereby control the one or more infrared cameras via the wireless transceiver 206. The controller has an interface for receiving infrared images from the one or more infrared cameras. The interface may be internal to the controller, i.e. located within the same housing as the controller. Alternatively, the interface may be external to the controller, i.e. not located within the same housing as the controller, and operatively coupled to the controller via a wired or wireless connection. In some examples, the interface is the wireless transceiver 206.
In embodiments, the controller 204 is implemented in the form of software stored in memory and arranged for execution on a processor (the memory on which the software is stored comprising one or more memory units employing one or more storage media, e.g. EEPROM or a magnetic drive, and the processor on which the software is run comprising one or more processing units). Alternatively it is not excluded that some or all of the controller 204 could be implemented in dedicated hardware circuitry, or configurable or reconfigurable hardware circuitry such as a PGA or FPGA. Whatever form it takes, in embodiments the controller 204 may be implemented internally in a single user device 108 along with the user interface 202 and the wireless transceiver 206, i.e. in the same housing. Alternatively the controller 204 could, partially or wholly, be implemented externally such as on a central lighting bridge 110 or server comprising one or more server units at one or more geographic sites (not shown).
The controller 204 may be configured to perform some or all of the actions of the user device 108 disclosed herein. For example, the controller 204 is configured to receive the user commands via the user interface 202. The controller 204 is also configured to communicate with one or more illumination sources 106 and the one or more infrared cameras 112 within the environment 100 via the wireless transceiver 206 and/or where applicable, the controller 204 is also configured to communicate with the central lighting bridge 110 or server via the wireless transceiver 208.
The user device 108 comprises the wireless transceiver 206 for communicating via any suitable wireless medium, e.g. a radio transceiver for communicating via a radio channel (though other forms are not excluded, e.g. an ultrasound or infrared transceiver). The wireless transceiver 206 may comprise a Wi-Fi, ZigBee, Bluetooth, Thread etc. interface for communicating with the infrared cameras 112 and/or illumination sources 106. Each illumination source 106 and infrared camera is configured to be able to communicate over a wireless channel in order to perform the respective control operations disclosed herein, preferably a radio channel (though the possibility of other media such as visual light communications, ultrasound or infrared are not excluded). For instance the radio channel may be based on a radio access technology such as ZigBee, Bluetooth, Wi-Fi, Thread, JupiterMesh, Wi-SUN, 6LoWPAN, etc. It is also not excluded that a wired connection could alternatively, or additionally, be provided between the user device 108 and the light emitting elements 106 and/or infrared cameras 112 for control purposes, e.g. an Ethernet or DMX connection.
Alternatively, the wireless transceiver 206 may communicate with the illumination sources 106 and/or infrared cameras 112 via the central lighting bridge 110 or a server, for example, over a local area network or a wide area network such as the internet.
In embodiments, the functionality of the central lighting bridge 110 or server is implemented in the form of software stored in memory and arranged for execution on a processor (the memory on which the software is stored comprising one or more memory units employing one or more storage media, e.g. EEPROM or a magnetic drive, and the processor on which the software is run comprising one or more processing units). Alternatively it is not excluded that some or all of the functionality of the central lighting bridge 110 or server could be implemented in dedicated hardware circuitry, or configurable or reconfigurable hardware circuitry such as a PGA or FPGA. Also note again that the central lighting bridge 110 or server may be implemented locally within the environment 100 or at a remote location, and may comprise one or more physical units at one or more geographic sites.
The central lighting bridge 110 may comprise a wireless transceiver. The wireless transceiver may comprise a Wi-Fi, ZigBee, Bluetooth, Thread etc. interface for communicating with the illumination sources 106, user device 108 and/or infrared cameras 112 over a local and/or wide area network. For instance a radio channel may be based on a radio access technology such as ZigBee, Bluetooth, Wi-Fi, Thread, JupiterMesh, Wi-SUN, 6LoWPAN, etc. Alternatively or additionally, in embodiments the central lighting bridge 110 may comprise a wired connection for communicating with the illumination sources 106, user device 108 and/or infrared cameras 112.
In examples, instead of being located on the user device 108, the controller 204 may instead be located, in whole or in part, on the central lighting bridge 110 or server. Whatever implementation in terms of physical infrastructure, the controller 204 on the user device 108, the central bridge 110 or the server may be configured to control the illumination sources 106 and infrared cameras 112 in accordance with the following.
The following describes a controller 204 and system 200 for determining whether one or more animals 104, e.g. poultry, within an environment, e.g. a farm building, have a health problem. The controller 204 is configured to receive a first infrared image from one or more infrared cameras 112. For example, each infrared camera 112 may capture a respective infrared image and transmit said infrared images to the controller 204. As another example, a single infrared image may be captured by a single infrared camera 112 and transmitted to the controller 204. As a further example, multiple infrared cameras 112 may capture a respective infrared image and those infrared images may be combined (e.g. merged, overlaid, superimposed, etc.), by one or more of the infrared cameras 112 or by the controller 204, into a single infrared image.
The infrared cameras 112 may capture an image of the environment 100 in its entirety. For example, the environment 100 may be a barn and the infrared image may capture the entire floor space of the barn. Alternatively, the infrared cameras 112 may capture an image of only part of the environment 100. For example, the part of the environment 100 may be an area of the barn, stable or other farm building such as, for example, a coop, a pen, a drive bay, a milk house, a corral, etc.
The controller 204 may be configured to control the one or more infrared cameras 112 to capture the first infrared image, e.g. in response to a trigger such as, for example, a user command. Alternatively, the infrared cameras 112 may capture a first infrared image periodically (e.g. every minute, few minutes, hour, etc.) or at predefined points in time.
The controller 204 is configured to control one or more illumination sources 106 to illuminate the environment 100 for a predetermined time period. The illumination sources 106 may be controlled to illuminate the environment 100 in its entirety.
Alternatively, the illumination sources 106 may be controlled to illuminate only a part of the environment 100. For example, the one or more illumination sources 106 may act as a spotlight to illuminate only the part of the environment 100 captured in the first infrared image, or only a portion of that part. The predetermined time period may be, for example, one or more seconds. The predetermined time period may be dependent on the received first infrared image, as discussed below.
The controller 204 is also configured to receive a second infrared image of the environment 100 from the one or more infrared cameras 112. Similarly to the first infrared image, the second infrared image may capture all or only part of the environment 100. The controller 204 may be configured to control the infrared cameras 112 to capture the second infrared image. The controller 204 comprises an image recognition algorithm configured to identify a region within the first infrared image, the region corresponding to at least part of one of the animals 104 (referred to hereinafter as "the first animal") within the first infrared image (and therefore within the environment 100). For example, the controller may be configured to identify a temperature region and/or a thermal region corresponding to at least part of one of the animals. The region may, in some examples, not correspond to part of an animal. Instead, the region may correspond to part of the environment 100. Since the amount of radiation emitted by an object increases with temperature, thermography allows one to see variations in temperature. The region may therefore be thought of as a "heat region", wherein the infrared image shows regions emitting different amounts of heat. For example, an animal 104 may be warmer than the background environment 100, e.g. the floor of a stable, and therefore the animal 104 will stand out against the background as a region. Different parts of the animal 104 (e.g. head, body, limbs, eyes, and feet) may also have different regions.
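One simple way to realise such a "heat region" segmentation is sketched below, under the assumption that animals are warmer than the background by some margin; the margin value is illustrative, and a real system would typically follow this with connected-component labelling or a trained model:

```python
def warm_regions_mask(frame, background_margin=5.0):
    """Mark pixels sufficiently warmer than the frame's mean temperature
    as animal candidates.

    `frame` is a 2D list of temperature values; returns a 2D list of
    booleans the same shape as `frame`.
    """
    flat = [t for row in frame for t in row]
    background = sum(flat) / len(flat)
    return [[t > background + background_margin for t in row] for row in frame]
```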
In some embodiments, the identified regions may correspond to a thermal shape and/or pattern. That is, the controller 204 may identify shapes or patterns in the infrared images. The shapes and/or patterns may be predetermined. For instance, the controller 204 may identity shapes and/or patterns that are known to be indicative of an animal having a physiological condition.
As an optional feature, the controller 204 may be configured to determine (or identify) that the region corresponds to at least part of one of the animal(s) captured within the first infrared image. That is, the identified region may be assumed to correspond to part of an animal 104 by the very fact that a region is identified. Alternatively, the controller 204 may determine that the region has a temperature indicative of an animal, is above (greater than) or below (less than) the background temperature of the environment 100, or forms at least part of an animal-shaped region.
The controller identifies a region in the second infrared image that corresponds to the region in the first infrared image. The regions may correspond in that they share a thermal characteristic, e.g. they are at the same temperature or their respective temperatures differ by less than a threshold amount. As another example, the regions may correspond in that they are at least one of the same size, shape and/or pattern, or at least substantially the same size, shape and/or pattern. The regions may correspond in that they are determined to correspond to at least part of the same animal. The controller 204 can then determine a change in position and/or thermal characteristic of the identified region between the first and second infrared images. That is, the controller 204 determines by how much (in real or arbitrary units) the region has moved from the first infrared image to the second infrared image and/or how much a thermal characteristic has changed from the first infrared image to the second infrared image. The thermal characteristic may represent a measured temperature or heat difference between the infrared images, i.e. a difference in how much infrared light is captured from one image to the next.
From the determined change in position and/or thermal characteristic, the controller 204 determines whether or not the first animal 104 has a health problem. For example, the controller 204 may determine that the first animal 104 has a health problem if the positional change of the identified region is zero and/or the thermal change is greater than a predetermined minimum value. The minimum value may be greater than zero. From this information the controller 204 may determine that the animal 104 is dead or severely sick or injured such that it cannot move. As another example, if the positional change is small, the controller 204 may determine that the animal 104 has a health problem such as, for example, an illness or disease. As a further example, if the positional change is large, the controller 204 may determine that the animal 104 does not have a health problem.
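By way of a non-limiting illustration, the comparison of the identified regions between the first and second infrared images, and the resulting health determination, may be sketched as follows. This is a minimal Python sketch; the function names, the region representation and the threshold values are illustrative assumptions and not part of the described controller 204.

```python
import math

def region_centroid(pixels):
    """Centroid (row, col) of a region given as a list of (row, col) pixels."""
    n = len(pixels)
    return (sum(r for r, _ in pixels) / n, sum(c for _, c in pixels) / n)

def assess_animal(region_t0, region_t1, move_threshold=2.0, thermal_threshold=1.5):
    """Compare an identified region between two thermal frames.

    Each region is a dict with 'pixels' (list of (row, col) coordinates)
    and 'mean_temp' (the region's mean temperature in that frame).
    As described in the text, near-zero movement and/or a large thermal
    change between the two images suggests a possible health problem.
    """
    r0, c0 = region_centroid(region_t0["pixels"])
    r1, c1 = region_centroid(region_t1["pixels"])
    displacement = math.hypot(r1 - r0, c1 - c0)
    thermal_change = region_t1["mean_temp"] - region_t0["mean_temp"]
    suspect = displacement < move_threshold or abs(thermal_change) > thermal_threshold
    return {"displacement": displacement,
            "thermal_change": thermal_change,
            "possible_health_problem": suspect}
```

For example, a region that has moved several pixels between the two frames with a near-constant temperature would not be flagged, while a region that has not moved would be.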
The controller 204 may identify regions above or below a threshold temperature. For example, the controller 204 may look for regions that are above an animal’s typical body temperature. This information may be stored in memory of the user device 108 and/or accessed from the central bridge or server. In some examples, the user device 108 is configured to allow the user 102 to configure the threshold temperature.
The identified region may correspond to a region of a temperature above or below the average temperature within the environment 100. For example, the controller 204 may identify a region that is above the average environment 100 temperature. That is, the controller 204 may assume that a region above the average environment 100 temperature corresponds to an animal. Similarly, the controller 204 may assume that a region below the average environment temperature corresponds to an animal.
Additionally or alternatively, the identified region may correspond to a region of temperature above or below the average temperature of the one or more animals 104 within the environment 100. In most animals 104 and in poultry in particular, the body temperature of the animal 104 is strongly correlated to the animal’s core temperature.
Therefore identifying an animal 104 with a body temperature above or below that of the other animals 104 within the environment 100 may indicate that there is a problem with that animal, and that that animal 104 should be investigated further (i.e. by monitoring the change in position of the region between the first and second infrared images). That is, thermal imaging can be used to detect a status of increased or decreased temperature, thereby detecting a possible fever or other health problem(s) of the animal.
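By way of a non-limiting illustration, identifying pixels that deviate from the average temperature within the environment 100 may be sketched as follows. The helper name and the deviation offset are illustrative assumptions; a practical implementation would typically group the flagged pixels into connected regions.

```python
def candidate_pixels(frame, offset=3.0):
    """Return (row, col) pixels whose temperature deviates from the
    frame-wide average by more than `offset` degrees (above or below).

    `frame` is a 2D list of temperatures, e.g. one per thermal pixel.
    Pixels above the average may correspond to warm animals (or fevers);
    pixels below it may correspond to cooling carcasses or cold spots.
    """
    flat = [t for row in frame for t in row]
    avg = sum(flat) / len(flat)
    return [(r, c)
            for r, row in enumerate(frame)
            for c, t in enumerate(row)
            if abs(t - avg) > offset]
```

A single warm pixel against a uniform cool background would thus be returned as a candidate animal location.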
The controller 204 may control one or more of the illumination sources 106 within the environment 100 to emit illumination having one or more predetermined characteristics. The controller 204 may control a particular subset of the illumination sources 106 (e.g. a single illumination source) to emit illumination at a given time, or the controller 204 may control all of the illumination sources 106 to emit illumination at the same time. The illumination sources 106 may be controlled to emit illumination having a predetermined brightness level, luminous intensity, luminous flux, and/or luminance. The illumination sources 106 may additionally or alternatively be controlled to emit illumination having a predetermined colour. That is, the illumination may have a predetermined frequency, wavelength or spectrum. For example, each illumination source may have a multi-colour LED (e.g. an RGB LED) capable of emitting red, green, blue, cyan, magenta and yellow. Furthermore, the illumination sources 106 may be controlled to emit illumination having a predetermined temperature (or colour temperature). For example, the illumination may have a colour temperature over 5000 K (so called "cool colours" like bluish white), or a colour temperature between 2400 and 3000 K (so called "warm colours", like yellowish white through to red). In general, the illumination may have a colour temperature ranging from 1700 to 6000 K. As another example, the controller may emit illumination having a predetermined static or dynamically varying spatial pattern. That is, the illumination may have different characteristics at different points in space and/or time. Furthermore, the illumination may be emitted for a predetermined duration. The duration may be less than the time period between the first and second infrared images being captured. That is, the first infrared image may be captured at time T0, and the second infrared image may be captured at time T1. The illumination may last for a duration of time dT = T1 - T0, or dT < T1 - T0.
As an optional feature, the controller 204 may control the one or more infrared cameras 112 to capture the second infrared image within a predetermined time period of the illumination of the environment 100. That is, within a predetermined time period of the start of the illumination (i.e. the point in time at which the illumination sources 106 are controlled to illuminate the environment 100) or within a predetermined time period of the end of the illumination (i.e. the point in time at which the illumination sources 106 stop illuminating the environment 100, either completely or with the illumination described above). For example, the second infrared image may be time-linked to the initiation of the illumination of the environment 100 (e.g. captured one second after the initiation, or within three seconds of the initiation). As another example, the second infrared image may be captured one second after the end of the predetermined time period of illumination.
In some embodiments, determining a change in position of the identified region between the first and second infrared images may comprise the controller 204 determining whether the change in position of the identified temperature is less than a threshold amount. If the change in position is less than the threshold amount, the controller 204 may control one or more infrared cameras 112 to capture a third infrared image of the environment 100. For example, if the change in position is zero units (e.g. zero pixels, zero centimetres, zero metres, etc.) and the threshold is any number above zero, a third infrared image may be captured. Similarly, if the threshold is one metre and the change in position is less than one metre, the controller 204 may trigger a third infrared image to be captured. These examples may correspond to a health problem such as, for example, an animal 104 being dead or sick. In some examples, if the change in position is zero, the animal 104 is presumed to be dead and a third image is not captured.
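By way of a non-limiting illustration, the decision logic for triggering the third, more detailed infrared image may be sketched as follows. The function name, return structure and default threshold are illustrative assumptions; the sketch follows the example in which zero movement leads to a presumption of death with no third image.

```python
def needs_closeup(displacement, threshold=10.0):
    """Decide whether to trigger a third, higher-detail infrared image.

    `displacement` is the positional change of the identified region
    between the first and second infrared images, in arbitrary units
    (e.g. pixels). Movement below the threshold but above zero warrants
    a closer look; zero movement is treated here as the example case
    where the animal is presumed dead and no third image is captured.
    """
    if displacement == 0:
        return {"capture_third": False, "presumed": "dead or immobile"}
    if displacement < threshold:
        return {"capture_third": True, "presumed": "possibly sick"}
    return {"capture_third": False, "presumed": "healthy"}
```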
The third infrared image may have a higher level of detail than the first and/or second infrared images. For example, a different type of infrared camera 112 may be used to capture the third infrared image compared to the type of infrared camera 112 used to capture the first and/or second infrared images. For example, the third infrared image may have a greater resolution and/or a greater image sharpness.
The one or more infrared cameras 112 may be controlled to capture an infrared image (the third infrared image) of the head of an animal. That is, the controller 204 may determine where the head of the animal 104 is within the environment 100 and control the infrared camera(s) to capture an image at the corresponding position. Alternatively, the infrared camera(s) may capture an infrared image of the animal 104 in general, and the portion of the image corresponding to the head of the animal 104 may be enlarged, focused on, etc.
The controller 204 may identify one or more regions within the third infrared image. For example, a region may correspond to the head of an animal, the body of an animal, the limbs of an animal, etc. The regions may correspond to regions of temperature above or below the average temperature within the third infrared image, and/or above or below the average temperature of the animal 104 captured within the third infrared image. The controller 204 may also identify regions above or below a threshold temperature. The controller 204 may then determine whether or not an animal 104 has a health problem based on the one or more regions that are identified in the third infrared image. Identifying one or more regions within the third infrared image may comprise identifying one or more regions on the head of the first animal. That is, the controller 204 may first identify a region within the third infrared image corresponding to the head of the animal 104 and then identify regions on the identified head of the animal. For example, basic image recognition techniques may be used to identify an object in an image that corresponds to a (animal) head.
The controller 204 may determine whether or not an animal 104 has a health problem based on the one or more regions that are identified on the head of the animal 104 in the third infrared image. For example, the controller 204 may determine that the animal 104 has a health problem if a region is identified that coincides with one, more, or all of the nose, mouth, eyes, and beak (if applicable). For example, a low temperature region may be identified on or around the nose or beak of an animal. Here, "low temperature" may mean below the average temperature of the animal's head, or below a threshold temperature. A (low) temperature region that coincides with being on or around the nose, for example, may be caused by mucus outflow from an orifice in the animal's head, e.g. the nose. For example, outflow of mucus will lead to a local cold spot at that location as the mucus cools down rapidly. Such spots stand out visibly against the warmer surface of the head, e.g. the skin. In poultry in particular, cold spots are easily detectable as the area around the nares on the beak is highly perfused, resulting in a high contrast between the beak and mucus.
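By way of a non-limiting illustration, detecting such cold spots on an identified head region may be sketched as follows. The function name, data representation and margin value are illustrative assumptions; mapping flagged pixels to the nose, beak or eye locations would require the additional image recognition step described above.

```python
def cold_spots_on_head(head_pixels, margin=2.0):
    """Find 'cold' pixels on an animal's head region.

    `head_pixels` maps (row, col) -> temperature for the pixels already
    identified as belonging to the head. A pixel colder than the head's
    average temperature by more than `margin` degrees is flagged; per
    the text, cold spots coinciding with the nose or beak area may
    indicate mucus outflow.
    """
    avg = sum(head_pixels.values()) / len(head_pixels)
    return sorted(p for p, t in head_pixels.items() if t < avg - margin)
```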
Figure 3A is an example of infrared images taken (middle and bottom) of chickens using an infrared camera. The top image is a visible light image of the same chickens. Even with a low resolution camera, the images show that hot spots (i.e. temperature regions) are easily detected, e.g. on the head and body. The images also reveal that it is easy to detect chickens having different amounts of feathers. For example, the chicken on the left hand side of the images in Figure 3A (marked by reference numeral 302) is a healthy chicken with many feathers and thus appears darker in the images. On the other hand, the chicken on the right hand side of the images in Figure 3A (marked by reference numeral 304) is in a molting phase and thus appears lighter in the images, with several different (temperature) regions captured. This information may be used to deduce, e.g. in a stable or coop, if the chicken has a physiological condition that causes molting.
Figure 3B is another example of infrared images taken (middle and bottom) of chickens using an infrared camera. The top image is a visible light image of the same chickens. Again, even with a low resolution camera, the images show that hot spots (i.e. temperature regions) are easily detected, e.g. on the head 308 and legs 306.
As an optional feature, the controller 204 may control one or more microphones (or acoustic sensors) to monitor for sounds indicative of an animal 104 suffering from a health problem. For example, a microphone array may be used to listen in on an animal's location, e.g. using beam forming. For example, the microphone(s) may monitor for wheezing, sneezing, coughing or gurgling. This may provide an auditory indication and/or verification of an animal 104 having a health problem such as, for example, a respiratory disease or infection. In such embodiments the controller 204 comprises a sound recognition algorithm arranged to detect sounds, such as those mentioned above, via the one or more microphones.
The detection of sounds indicative of an animal 104 suffering from a health problem may be used to trigger the infrared camera(s) to capture the first infrared image of the environment 100. For example, if a wheezing sound is detected, the controller 204 may control the infrared camera(s) to capture the first infrared image. The infrared image may be captured by at least one infrared camera 112 directed towards the source of the wheezing sound. Additionally or alternatively, the detection of such sounds may be used by the controller 204 to verify (or at least increase the confidence of) a determination of an animal 104 having a health problem. For example, if an animal 104 is determined to have a health problem (e.g. due to the determined change in position of the identified region or due to one or more identified regions in the third infrared image), the controller 204 may control the microphone(s) to monitor for sounds indicative of a health problem. If those sounds are detected, the controller 204 can be confident that the determination is correct (this may feed into a confidence score displayed to a user 102, as described below).
As well as capturing a second infrared image of the environment 100 after the (first) time period of illumination, the controller 204 may also control the one or more infrared cameras 112 to take one or more additional infrared images of the environment 100, each after a subsequent time period of illumination. That is, the one or more illumination sources 106 may be controlled to illuminate the environment 100 for sequential time periods and an infrared image is captured after each time period of illumination.
Each time period of illumination may have a different characteristic from the immediately previous time period of illumination. The characteristic may be one or more of a predetermined brightness level, a predetermined colour, and a predetermined temperature.
The characteristic may also be the presence or absence of illumination. For example, if there are four time periods of illumination, the first and third time periods may correspond to the one or more illumination sources 106 emitting illumination, whilst the second and fourth time periods may correspond to the one or more illumination sources 106 not emitting illumination. In another example, every other period of illumination may comprise bright illumination, whilst the periods of illumination in between may comprise softer illumination.
The controller 204 may determine a change in position of the identified region across the multiple captured infrared images. This may help to prevent incorrectly determining that an animal 104 is sick. For example, if an animal 104 does not move (much or at all) after one time period of illumination, the animal 104 may simply still be in the process of waking up from its sleep. However, if the animal 104 does not move (much or at all) after several short bursts of bright light, the animal 104 is likely to have a health problem.
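By way of a non-limiting illustration, the alternating illumination schedule and the check across multiple bursts may be sketched as follows. The function names, the (label, intensity) representation of a characteristic and the movement threshold are illustrative assumptions.

```python
import itertools

def illumination_schedule(n_periods, bright=("bright", 1.0), soft=("soft", 0.3)):
    """Alternating illumination characteristics for sequential periods,
    as in the example where every other period is bright and the
    periods in between are softer."""
    cycle = itertools.cycle([bright, soft])
    return [next(cycle) for _ in range(n_periods)]

def still_after_all_bursts(displacements, move_threshold=2.0):
    """True when the identified region barely moved after every
    illumination burst - the case the text treats as a likely health
    problem, as opposed to an animal merely waking up after one burst."""
    return all(d < move_threshold for d in displacements)
```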
If the controller 204 determines, at any stage, that an animal 104 has a health problem, the controller 204 may cause an alert to be output to a user 102. For example, the alert may be a visual alert displayed on a user interface 202 (or display screen) of the user device 108. Additionally or alternatively, the alert may be an audio alert played out by a speaker of the user device 108, and/or by an external speaker, e.g. one connected to the user device 108 via a wired or wireless connection.
Additionally or alternatively, if the controller 204 determines, at any stage, that an animal 104 has a health problem, the controller 204 may cause the animal 104 to be visually identified on the user interface 202 (or display screen) of the user device 108. That is, the user interface 202 may show a (live) image of the environment 100, with the sick animal 104 identified on the image. As another example, one or more illumination sources 106 may be used to visually identify the animal 104 with the health problem. That is, an illumination source may illuminate the animal, or part of the environment 100 in which the animal 104 is situated. For example, the animal 104 may be placed under a bright spot light (or at least illuminated with different illumination compared to the rest of the environment 100, e.g. red light) so that the user 102 can easily locate the animal.
Alerting the user 102 and/or visually identifying the animal 104 allows the user 102 to take action to treat the animal 104 and/or to remove the animal 104 from the environment 100. Early identification of a health problem allows for the early treatment of the health problem. This is not only beneficial for the identified animal 104 but also helps to prevent the health problem (e.g. a disease) from spreading to the other animals 104 in the population.
The controller 204 may, in response to determining that one or more animals have a physiological condition, control one or more luminaires to illuminate a first area within the environment with illumination being different to that illuminating a second area within the environment that contains the animal having the condition. For example, a spotlight may be put on an area away from a sick animal (i.e. an area more than a predetermined minimum distance away may be illuminated with predetermined lighting characteristics). The illumination may have a predetermined brightness level, a predetermined colour, a predetermined temperature, a predetermined spatial pattern, and/or a predetermined duration. Illuminating an area or areas away from the sick animal may result in healthy animals avoiding the location of the sick animal so that the chance of potential infections is minimized and/or the user (e.g. farmer or caretaker) can easily identify and remove the animal from the environment without disturbing the healthy population.
In some embodiments, the environment 100 may contain one or more dispensers 114, as shown in Figure 1. For example, the dispenser 114 may dispense medicine, water, and/or food. The controller 204 may be operatively coupled to the dispenser 114 such that it can control the dispenser 114 to release whatever substance it contains, or even a particular substance if the dispenser 114 contains multiple types of substance. For example, the controller 204 may control the dispenser 114 to release a substance in response to determining that an animal 104 has a health problem. As an example, the controller 204 may, upon determining that an animal 104 has a health problem, release a (particular) medicine into the environment 100. The medicine released may be based on the assumed health problem, e.g. based on the presence or absence of mucus in the third infrared image.
As another optional feature, the controller 204 may be configured to adapt one or more environmental conditions in the environment 100. For example, the controller 204 may adapt the temperature, humidity and/or ventilation within the environment 100 in order to help treat an animal's health problem and/or to prevent the spread of a disease or infection. The controller 204 may adapt the one or more environmental conditions in response to determining that an animal 104 has a health problem. The controller 204 may be operatively coupled to the necessary components for adapting a given environmental condition (e.g. to one or more heating elements, cooling elements, air conditioning units, ventilation systems, etc.).
The present invention uses a thermal imaging system which may cover the total environment 100 area and is able to capture and analyse, to a sufficient level of detail, a thermal image of an animal 104 such as a chicken. A lighting system is controlled to project (intense) illumination onto the animals 104. A control system may, automatically, cause a sequence of thermal imaging and light actuation actions to be performed in order to detect and/or diagnose a medical condition of one or more animals 104 within the environment 100. Optionally, a user interface 202 may provide a user 102 (e.g. a stable manager) with data on sick animals 104, allowing the user 102 to act (e.g. remove the sick animal 104 from the population, provide medicine therapy to prevent outbreak of an epidemic as originating from an initial individual disease case, etc.).
The invention allows for the detection of specific disease types (e.g. respiratory infections, which are typical, cumbersome and contagious, especially in chickens) that can not only have a high impact on the wellbeing of the animals 104 and the productivity of the breeding, but that are also difficult to diagnose in a dense stable population. For example, manual visual inspection of all animals 104 in a dense population is practically impossible. One symptom of respiratory infections is mucus outflow (or "snot") from the beak. Also, as with most viral or bacterial diseases, systemic infection responses will also lead to a raised body temperature (body temperature is typically strongly correlated to an animal's core temperature).
Infrared thermal imaging can be used to detect a status of increased temperature which is an indication of a fever (as a possible symptom of infection) in birds. The search for sick (e.g. respiratory disease suffering) birds can be done in a sequence of actions. First, thermal imaging is used for detecting birds with a raised temperature. Such a scan is also useful to detect birds with lowered body temperature (indicating a dying or dead bird). Sick (and of course also dead) birds will also show low levels of movement (therefore dynamic thermal imaging can give a first insight into activity levels and hence the status of homeostasis or disruption thereof). Also, acoustic monitoring can be used to check for deviating behaviour (checking for gurgling and wheezing) to trigger the search for sick individuals.
For a first triage to check if birds with raised temperature are actually suffering from a disease, an actuation step is performed to check the alertness of the bird. Typically, a sick bird will show lethargy and will be less responsive to triggers. Triggering the bird with a short burst of high intensity light, as provided by the connected and controllable light system, will alert and wake up birds and lead to agitation and movement in a healthy bird.
Immediately after (or at least time-linked to the light trigger), a new thermal image is captured of the zone of interest or environment 100. Sick birds will show no response, or a much lower response to the administered light trigger, and therefore the next thermal imaging will show that the 'hot spot' of a specific bird has not moved, or has hardly moved, position. Instead of the comparison of two thermal images, video tracking in thermal image sequences can be used to identify birds with deviating motion patterns or behaviour. A further refinement on the type of disease (in this case the respiratory disease 'snot') can now be done, to make a more detailed diagnosis by applying a more accurate thermal image of the target bird. For the case of snot, thermal image capturing of the nose, beak and eye areas of the bird will allow for the detection of snot (i.e. by the local cold spots on these body locations). The prior triage will also help to prevent false alerts from, e.g., a bird that has cold spots just after drinking, as it will first be detected that the bird is largely immobile and not near a watering place in the stable. Furthermore, the appearance of water in the thermal image sequence will differ from that of snot: the water will dry up quickly and disappear, while the mucus will stay visible. Additionally, a microphone array can be used to listen in on the bird location via beam forming to enable auditory verification (checking for gurgling and wheezing in individual birds).
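By way of a non-limiting illustration, the overall triage sequence described above (temperature scan, light-trigger alertness check, detailed head scan, and optional acoustic verification) may be sketched as follows. All names, inputs and threshold values are illustrative assumptions, not a prescribed implementation.

```python
def triage(region_temp, flock_avg_temp, displacement_after_light,
           cold_spot_on_beak, wheezing_detected,
           fever_margin=1.0, move_threshold=2.0):
    """Coarse assessment following the triage sequence in the text.

    1. Compare the bird's temperature to the flock average.
    2. If deviating, check responsiveness after the light trigger.
    3. If unresponsive, check for cold spots (possible mucus) on the
       beak, optionally verified acoustically.
    """
    if abs(region_temp - flock_avg_temp) <= fever_margin:
        return "no anomaly"
    if displacement_after_light >= move_threshold:
        return "responsive - monitor"
    if cold_spot_on_beak and wheezing_detected:
        return "suspected respiratory infection (verified acoustically)"
    if cold_spot_on_beak:
        return "suspected respiratory infection"
    return "lethargic - investigate"
```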
Now either a manual or automated response action can be taken, either by a stable operator or by an automated system, such as, e.g., the removal of the bird and/or by starting population-level medicinal treatment to prevent outbreak of the disease to the flock.
Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. A single processor or other unit may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems. Any reference signs in the claims should not be construed as limiting the scope.


CLAIMS:
1. A controller (204) for determining whether one or more animals (104) within an environment (100) have a physiological condition, wherein the controller (204) has an interface (206) for receiving, from one or more infrared cameras (112), a first infrared image of the environment (100) and a second infrared image of the environment (100), wherein the controller is configured to:
after receiving the first infrared image and before receiving the second infrared image, control one or more illumination sources (106) to illuminate the environment (100) for a predetermined time period;
identify a first region within the first infrared image, wherein the identified first region comprises an infrared image of at least part of a first animal (104);
identify a second, corresponding region within the second infrared image; determine a change in position and/or thermal characteristic between the identified first and second regions in the first and second infrared images; and
determine whether or not the first animal (104) has a physiological condition based on the determined change in position and/or thermal characteristic between the identified first and second regions in the first and second infrared images.
2. A controller (204) according to claim 1, wherein said identified regions correspond to regions of either: (i) a temperature above the average temperature within the environment (100), or (ii) a temperature below the average temperature within the environment (100).
3. A controller (204) according to claim 1 or claim 2, wherein said identified regions correspond to regions of either: (i) a temperature above the average temperature of the one or more animals (104) within the environment (100), or (ii) a temperature below the average temperature of the one or more animals (104) within the environment (100).
4. A controller (204) according to any preceding claim, wherein the controller (204) is configured to perform said controlling of the one or more illumination sources (106) to illuminate the environment (100) by: controlling the one or more illumination sources (106) to illuminate the environment (100) with illumination having one, more, or all of the following characteristics: (i) a predetermined brightness level, (ii) a predetermined colour, (iii) a predetermined temperature, (iv) a predetermined spatial pattern, and/or (v) a predetermined duration.
5. A controller (204) according to any preceding claim, wherein the controller (204) is configured to control the one or more infrared cameras (112) to capture the second infrared image of the environment (100) within a predetermined time period of said illumination of the environment (100).
6. A controller (204) according to any preceding claim, wherein the controller (204) is configured to determine whether the change in position between the identified first and second regions is less than a threshold amount and, if the change in position is determined to be less than the threshold amount, control one or more infrared cameras (112) to capture a third infrared image of the environment (100).
7. A controller (204) according to claim 6, wherein the controller (204) is configured to perform said controlling of the one or more infrared cameras (112) to capture the third infrared image of the environment (100) by controlling the one or more infrared cameras (112) to capture an infrared image of the head of the first animal.
8. A controller (204) according to claim 6 or claim 7, wherein the controller (204) is configured to:
identify one or more regions within the third infrared image; and determine whether or not the first animal (104) has a physiological condition based on the identified one or more regions within the third infrared image.
9. A controller (204) according to claim 8, wherein the controller (204) is configured to perform said identifying of the one or more regions within the third infrared image by identifying one or more regions on the head of the first animal.
10. A controller (204) according to claim 9, wherein the controller (204) is configured to perform said determining of whether or not the first animal (104) has a physiological condition based on the identified one or more regions within the third infrared image by: determining if the one or more regions coincide with at least one of the (i) nose,
(ii) mouth, and (iii) eyes of the first animal (104) in the third infrared image.
11. A controller (204) according to any preceding claim, wherein the controller (204) is configured to:
control one or more microphones to monitor for sounds indicative of an animal (104) within the environment (100) having a physiological condition, and
control one or more infrared cameras (112) to capture said first infrared image of the environment (100) in response to detecting a sound indicative of an animal (104) within the environment (100) having a physiological condition.
12. A controller (204) according to any preceding claim, wherein the controller (204) is configured to:
control one or more illumination sources (106) to illuminate the environment (100) for multiple sequential time periods, each time period of illumination having a different characteristic from the previous time period;
receive an infrared image of the environment (100) after one or more of the multiple sequential time periods of illumination;
determine a change in position and/or thermal characteristic of the identified regions between the one or more infrared images; and
determine whether or not the first animal (104) has a physiological condition based on the determined change in position and/or thermal characteristic of the identified regions between the one or more infrared images.
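The control sequence of claim 12 can be sketched as a minimal loop, assuming injected `illuminate` and `capture` callables as hypothetical stand-ins for the actual illumination-source and infrared-camera interfaces; the per-period characteristic is modelled here as a wavelength, which is one possible assumption.

```python
from dataclasses import dataclass

@dataclass
class IlluminationPeriod:
    duration_s: float      # length of this illumination time period
    wavelength_nm: int     # the characteristic that differs per period (assumed)

def run_illumination_sequence(periods, illuminate, capture):
    """Illuminate the environment for multiple sequential time periods,
    each with a different characteristic, and capture an infrared image
    after each period (illustrative sketch, not the claimed implementation)."""
    images = []
    for period in periods:
        illuminate(period)        # drive the illumination source for this period
        images.append(capture())  # infrared frame after this period
    return images
```

The captured frames would then be compared region by region for changes in position and/or thermal characteristic, as the claim recites.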
13. A controller (204) according to any preceding claim, wherein the controller (204) is configured to cause an alert to be output to a user (102) if the first animal (104) is determined to have a physiological condition.
14. A controller (204) according to any preceding claim, wherein the controller (204) is configured to cause a visual indication to be displayed, on a user interface (202) of a user device (108), if the first animal (104) is determined to have a physiological condition, wherein the visual indication identifies the first animal.
15. A computer program product for determining whether one or more animals
(104) within an environment (100) have a physiological condition, the computer program product comprising code embodied on one or more computer-readable storage media and configured so as when executed on one or more processors to perform operations of:
receiving, from one or more infrared cameras (112), a first infrared image of the environment (100);
controlling one or more illumination sources (106) to illuminate the environment (100) for a predetermined time period;
receiving, from one or more infrared cameras (112), a second infrared image of the environment (100);
identifying a first region within the first infrared image, wherein the identified first region comprises an infrared image of at least part of a first animal (104);
identifying a second, corresponding region within the second infrared image;
determining a change in position and/or thermal characteristic between the identified first and second regions in the first and second infrared images; and
determining whether or not the first animal (104) has a physiological condition based on the determined change in position and/or thermal characteristic between the identified first and second regions in the first and second infrared images.
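The image-differencing pipeline of claim 15 can be sketched as below. The thresholding rule for "animal-like" pixels, the centroid-based position measure, and the decision thresholds are all illustrative assumptions, not taken from the application.

```python
import numpy as np

def assess_physiological_condition(frame_before, frame_after,
                                   motion_threshold=2.0, temp_threshold=1.5):
    """Compare infrared frames captured before and after illumination and
    decide, from the change in region position and thermal characteristic,
    whether a condition should be flagged (illustrative sketch only)."""
    # Identify a region in each frame: pixels notably warmer than the scene
    # (a stand-in for "at least part of a first animal").
    mask_before = frame_before > frame_before.mean() + frame_before.std()
    mask_after = frame_after > frame_after.mean() + frame_after.std()
    if not mask_before.any() or not mask_after.any():
        return None  # no animal-like region found in one of the frames

    # Position of each region: the centroid of its pixels.
    centroid_before = np.argwhere(mask_before).mean(axis=0)
    centroid_after = np.argwhere(mask_after).mean(axis=0)
    position_change = float(np.linalg.norm(centroid_after - centroid_before))

    # Thermal characteristic of each region: its mean temperature.
    thermal_change = float(frame_after[mask_after].mean()
                           - frame_before[mask_before].mean())

    # Example decision rule (placeholder thresholds): flag an animal that
    # barely moved under illumination yet changed temperature markedly.
    flagged = (position_change < motion_threshold
               and abs(thermal_change) > temp_threshold)
    return {"position_change": position_change,
            "thermal_change": thermal_change,
            "flagged": flagged}
```

On two 10×10 frames where a warm 2×2 blob stays in place but rises from 30 to 35 units, the sketch reports zero position change, a +5 thermal change, and flags the animal.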
PCT/EP2019/069225 2018-07-31 2019-07-17 Controller for detecting animals with physiological conditions Ceased WO2020025320A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP18186593.2 2018-07-31
EP18186593 2018-07-31

Publications (1)

Publication Number Publication Date
WO2020025320A1 true WO2020025320A1 (en) 2020-02-06

Family

ID=63452346

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2019/069225 Ceased WO2020025320A1 (en) 2018-07-31 2019-07-17 Controller for detecting animals with physiological conditions

Country Status (1)

Country Link
WO (1) WO2020025320A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5474085A (en) 1994-02-24 1995-12-12 University Of Prince Edward Island Remote thermographic sensing of livestock
US20050005309A1 (en) * 2002-10-29 2005-01-06 Cantor Glenn H. Method of using animal models to predict adverse drug reactions
US20060225668A1 (en) * 2005-04-11 2006-10-12 Ross Brian C Animal deterrent apparatus for mounting to a culvert
US20110021928A1 (en) * 2009-07-23 2011-01-27 The Boards Of Trustees Of The Leland Stanford Junior University Methods and system of determining cardio-respiratory parameters
US8915215B1 (en) * 2012-06-21 2014-12-23 Scott A. Helgeson Method and apparatus for monitoring poultry in barns
WO2015030611A1 (en) 2013-09-02 2015-03-05 Interag Method and apparatus for determining respiratory characteristics of an animal
US9697599B2 (en) * 2015-06-17 2017-07-04 Xerox Corporation Determining a respiratory pattern from a video of a subject

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115397235A (en) * 2020-06-02 2022-11-25 艾酷帕克株式会社 Animal product control system, animal product control server, animal product control method, and animal product control program
EP4118957A4 (en) * 2020-06-02 2023-08-09 Eco-Pork Co., Ltd. Livestock control system, livestock control server, livestock control method, and livestock control program
WO2022070128A1 (en) * 2020-09-30 2022-04-07 Pal Singh Mahender System and method for automatic mortality detection and management
WO2022190050A1 (en) * 2021-03-11 2022-09-15 Lely Patent N.V. Animal husbandry system and illumination unit suitable for the system
NL2027742B1 (en) * 2021-03-11 2022-09-27 Lely Patent Nv Animal husbandry system and illumination unit suitable for the system
WO2022200062A1 (en) * 2021-03-23 2022-09-29 Signify Holding B.V. System and method for supplying sustenance to an animal species
JP2024508997A (en) * 2021-03-23 2024-02-28 シグニファイ ホールディング ビー ヴィ Systems and methods for providing sustenance to animal species
JP7546787B2 (en) 2021-03-23 2024-09-06 シグニファイ ホールディング ビー ヴィ Systems and methods for providing sustainment to animal species
US12382931B2 (en) 2021-03-23 2025-08-12 Signify Holding B.V. System and method for supplying sustenance to an animal species
CN113972006A (en) * 2021-10-22 2022-01-25 中冶赛迪重庆信息技术有限公司 Live animal health detection method and system based on infrared temperature measurement and image recognition
CN113972006B (en) * 2021-10-22 2024-06-11 中冶赛迪信息技术(重庆)有限公司 Live animal health detection method and system based on infrared temperature measurement and image recognition

Similar Documents

Publication Publication Date Title
WO2020025320A1 (en) Controller for detecting animals with physiological conditions
Nourizonoz et al. EthoLoop: automated closed-loop neuroethology in naturalistic environments
ES2867895T3 (en) System for the analysis of images of animal excrement
ES2792678T3 (en) An animal husbandry method and an animal shed
Kittawornrat et al. Toward a better understanding of pig behavior and pig welfare
US20180260645A1 (en) Devices and methods for analyzing animal behavior
Hixson et al. Behavioral changes in group-housed dairy calves infected with Mannheimia haemolytica
US20190261596A1 (en) Devices and methods for analyzing rodent behavior
WO2014118788A2 (en) Early warning system and/or optical monitoring of livestock including poultry
US9485966B1 (en) Device and method of animal reward
US20150359200A1 (en) Infrared thermography and behaviour information for identification of biologically important states in animals
Long et al. Effect of light-emitting diode vs. fluorescent lighting on laying hens in aviary hen houses: Part 1–Operational characteristics of lights and production traits of hens
KR20190046163A (en) Farm management apparatus and method
Liu et al. Effects of light-emitting diode light v. fluorescent light on growing performance, activity levels and well-being of non-beak-trimmed W-36 pullets
US12274240B2 (en) Laser enrichment device, system, and method for poultry
CN114009365A (en) Intelligent breeding method, system, equipment and storage medium of Internet of things
Cadena et al. Evaporative respiratory cooling augments pit organ thermal detection in rattlesnakes
KR101931263B1 (en) A system of a stable for pig breeding using near field communication
KR20230100198A (en) Wearable device for poultry management and poultry management system using the same
KR101682606B1 (en) Lighting control system of porltry farm
KR20230086891A (en) Cattle management method in stable
CA2854344A1 (en) Infrared thermography and behaviour information for identification of biolically important states in animals
KR20150098024A (en) System and method for rearing environment control of animal using image data and bio-signal data
Patel et al. Role of computer science (Artificial Intelligence) in poultry management
Dawkins et al. Poultry welfare monitoring: group-level technologies

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19740569

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19740569

Country of ref document: EP

Kind code of ref document: A1