WO2022266705A1 - A system and apparatus for animal management - Google Patents
- Publication number
- WO2022266705A1 (PCT/AU2022/050627)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- animal
- target animal
- power mode
- processor
- environment
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
- G06F1/3206—Monitoring of events, devices or parameters that trigger a change in power modality
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01K—ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
- A01K29/00—Other apparatus for animal husbandry
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01M—CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
- A01M31/00—Hunting appliances
- A01M31/002—Detecting animals in a given area
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
- G06F1/3206—Monitoring of events, devices or parameters that trigger a change in power modality
- G06F1/3231—Monitoring the presence, absence or movement of users
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
- G06F1/3234—Power saving characterised by the action undertaken
- G06F1/325—Power saving in peripheral device
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/96—Management of image or video recognition tasks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
Definitions
- the present application relates to an apparatus adapted to detect and identify animals and/or animal species.
- Embodiments of the present invention are particularly adapted for the detection, identification, monitoring and management of target animal populations and detection of individual animals within an environment being monitored.
- the invention is applicable in broader contexts and other applications.
- systems for detecting animal species include passive infrared (PIR) cameras, which detect heat signatures but are not well adapted to distinguish animals from other, non-animal sources of heat such as moving shadows.
- Other systems for detecting animals involve physical tags or physical mechanisms such as trip-wires or pressure plates and other traps.
- These systems suffer from a number of deficiencies, including but not limited to: the need to capture and tag animals; the inability to precisely and quickly identify a target animal species; and, in cases where the animal is identified, unwanted delays in identification, resulting in the target animal moving out of the area in which the detection occurs. This can be problematic if, for instance, an action such as administering a compound onto the target animal is required.
- Other problems associated with present systems include a generally low level of accuracy in distinguishing a target animal species, potentially resulting in an action being taken on the wrong species of animal.
- Sensors such as passive infra-red sensors and optical sensors, which have been employed to monitor animal species, have the advantage of remote sensing over wide areas.
- these systems suffer from a large number of false triggers due to moving objects such as wind-blown vegetation, and due to ambient temperatures approaching animal core temperatures.
- false triggers, or missed triggers when the ambient temperature is close to body temperature, give rise to incorrect tracking and administration of control actions, as well as increased overall power consumption of the monitoring devices.
- the present inventors have identified a need for improvement to monitoring devices to be able to quickly and accurately identify target animals, potentially down to the individual animal, while being robust and energy efficient to be situated in the field for long periods of time.
- an apparatus configured to identify a target animal in an environment in real-time, the apparatus including: a housing; one or more image sensors mounted on or within the housing and configured to capture images of a region of the environment; and a processor contained within the housing and configured to process the images to identify a target animal in real-time; wherein the apparatus is configured to operate in a low power mode and a higher power mode; wherein, in the low power mode, the one or more image sensors are controlled to capture images in a low power mode and the processor is configured to detect a change in the environment from the images and, in response, trigger the higher power mode; and wherein, in the higher power mode, the one or more image sensors are controlled to capture images in a higher power mode than in the low power mode and the processor is configured to identify a target animal in the environment.
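The two-mode behaviour above can be sketched as a small state machine. This is an illustrative sketch, not the patented implementation: the frame-difference test, threshold value and `classify()` stub are all assumptions introduced for the example.

```python
# Illustrative sketch (assumptions, not the patented implementation):
# a two-mode capture loop in which a cheap frame-difference test in the
# low power mode triggers a switch to the higher power mode, where a
# classifier attempts to identify the target animal.

LOW, HIGH = "low", "high"
DIFF_THRESHOLD = 10  # mean per-pixel change counted as "a change in the environment"

def mean_abs_diff(frame_a, frame_b):
    """Mean absolute per-pixel difference between two equal-sized frames."""
    return sum(abs(a - b) for a, b in zip(frame_a, frame_b)) / len(frame_a)

def step(mode, prev_frame, frame, classify):
    """Advance the power-mode state machine by one captured frame."""
    if mode == LOW:
        # Low power: low-resolution / low-frame-rate capture; only look for change.
        if prev_frame is not None and mean_abs_diff(prev_frame, frame) > DIFF_THRESHOLD:
            return HIGH, None  # change detected -> escalate to higher power mode
        return LOW, None
    # Higher power: run the (stubbed) classifier on full-quality frames.
    label = classify(frame)
    return (LOW if label is None else HIGH), label
```

A static scene keeps the device in the low power mode; a sudden change escalates it, after which the classifier output decides whether to stay escalated.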
- In the low power mode, the one or more image sensors are configured to capture images at a lower resolution and/or a lower frame rate than in the higher power mode.
- the processor in the higher power mode, is configured to classify a detected animal as a target animal.
- the apparatus is configured to operate in a second higher power mode in which the processor performs a classification to determine if a detected animal is a target animal.
- the processor is preferably configured to classify the animal as an individual target animal.
- the apparatus includes a communications module configured to transmit one or more of the captured images to a remote server and wherein the remote server performs a classification to identify if a detected animal is a target animal.
- switching between the low power mode and higher power mode occurs within a period of less than 100 milliseconds.
- an apparatus configured to perform real-time identification of an individual target animal in an environment, the apparatus including: a housing; one or more image sensors mounted on or within the housing and configured to capture one or more 2D or 3D images of the environment; a processor contained within the housing and configured to process the one or more images to identify the individual target animal in real time based on one or more detected visual characteristics in the one or more images and, in response, generate an identification data packet; and a communications module configured to communicate the identification data packet to a remote server which is in communication with other apparatus.
- the apparatus includes a GPS device configured to determine a location of the apparatus.
- At least one of the one or more image sensors is a thermal image sensor capable of imaging in the infrared wavelength range to detect thermal characteristics.
- the thermal image sensor may be calibrated to detect a temperature of the target animal such as an average temperature.
- the processor is configured to use the temperature of the target animal and background to determine a health status of the target animal.
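A minimal sketch of such a thermal health check follows: average the calibrated temperatures of the pixels inside an animal mask, compare against the background, and flag the reading. The 35-41 °C "normal" band and the field names are illustrative assumptions, not values from the application.

```python
# Hypothetical sketch of the thermal health check described above.
# NORMAL_RANGE_C is an illustrative assumption, not a patented value.

NORMAL_RANGE_C = (35.0, 41.0)

def health_status(temps, animal_mask):
    """temps: flat list of per-pixel temperatures (degC);
    animal_mask: parallel list of booleans marking animal pixels."""
    animal = [t for t, m in zip(temps, animal_mask) if m]
    background = [t for t, m in zip(temps, animal_mask) if not m]
    if not animal or not background:
        return None  # cannot form both averages
    avg_animal = sum(animal) / len(animal)
    avg_background = sum(background) / len(background)
    lo, hi = NORMAL_RANGE_C
    return {
        "animal_temp": avg_animal,
        "background_temp": avg_background,
        "contrast": avg_animal - avg_background,  # useful sanity check vs ambient
        "status": "normal" if lo <= avg_animal <= hi else "abnormal",
    }
```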
- the apparatus includes an acoustic sensor to detect acoustic characteristics within the environment.
- the processor is configured to detect a presence of or identify the target animal at least in part by one or more acoustic characteristics that match or closely match an acoustic characteristic indicative of the target animal.
- the apparatus includes a particle detector or chemical analyser configured to detect scent characteristics within the environment.
- the processor is preferably configured to detect a presence of or identify the target animal at least in part by determining one or more scent signatures of the target animal based on the detected scent characteristics.
- the apparatus includes a battery and the one or more image sensors and the processor are powered locally by the battery.
- the processor is implemented on a system-on-chip (SoC) device.
- SoC system-on-chip
- the processor and/or other hardware are incorporated into an embedded hardware system.
- the one or more image sensors are also implemented on an SoC device.
- the processor includes a clock to determine the current time at the location.
- In some embodiments, the processor is configured to execute a machine learned classifier to identify the target animal. In some embodiments, the processor is configured to execute a neural network classifier algorithm.
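The neural network classifier mentioned above could, at its simplest, be a small feed-forward inference pass. The sketch below is purely illustrative: the one-hidden-layer architecture, the parameter layout and any weights are assumptions, not the trained model of the application.

```python
import math

# Illustrative feed-forward inference pass of the kind an on-board
# neural-network classifier might run. Architecture and weights are
# assumptions for the example, not the application's trained model.

def relu(x):
    return [max(0.0, v) for v in x]

def linear(x, weights, bias):
    """weights: list of rows; returns weights @ x + bias."""
    return [sum(w * v for w, v in zip(row, x)) + b
            for row, b in zip(weights, bias)]

def softmax(x):
    exps = [math.exp(v - max(x)) for v in x]
    total = sum(exps)
    return [e / total for e in exps]

def classify(features, params, labels):
    """features -> hidden ReLU layer -> softmax output -> best label."""
    hidden = relu(linear(features, params["w1"], params["b1"]))
    probs = softmax(linear(hidden, params["w2"], params["b2"]))
    best = max(range(len(labels)), key=lambda i: probs[i])
    return labels[best], probs[best]
```

In practice the weights would come from off-line training on labelled images of the target species.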
- the detected visual characteristics include a shape of the target animal.
- In some embodiments, the detected visual characteristics include one or more predefined movements or behavioural characteristics of the target animal over a plurality of images.
- In some embodiments, the detected visual characteristics include thermal characteristics of the target animal.
- In some embodiments, the detected visual characteristics include a brightness or reflectivity or absorbance characteristic of the target animal.
- In some embodiments, the detected visual characteristics include a distinct colour and/or marking of the target animal.
- the target animal includes a predefined animal species.
- the target animal is an individual animal.
- the individual animal is a unique individual that can be distinguished from all other animals.
- the apparatus includes a wireless identifier configured to detect pre-stored data from a wireless tag associated with one or more animals.
- the apparatus is configured to operate in one of a low power mode or a high power mode.
- In the low power mode, the one or more image sensors are configured to capture images at a lower frame rate and/or a lower resolution.
- the high power mode is preferably activated by a trigger signal.
- the trigger signal may be based on a mechanical or electromechanical trigger associated with the apparatus.
- the trigger signal is based on detection of one or more trigger visual characteristics in the one or more images.
- the one or more trigger visual characteristics include movement characteristics in the images.
- the one or more trigger visual characteristics include detection of a predefined shape or shapes in the images.
- the one or more trigger visual characteristics include detection of a predefined temperature or brightness in the images.
- the trigger signal is based on detection of one or more trigger acoustic characteristics detected by an acoustic sensor.
- In the low power mode, the one or more image sensors are deactivated.
- an acoustic sensor is configured to sense acoustic signals and the processor is configured to process the acoustic signals to identify one or more acoustic sounds indicative of an animal or a target animal.
- the apparatus includes an illumination device configured to selectively illuminate at least a part of the environment.
- the apparatus includes one or more of a light level sensor, humidity sensor and/or a temperature sensor.
- the apparatus includes an actuation device responsive to a sensor signal generated from the processor for initiating an action in response to identification of the target animal.
- the actuation device includes a dispenser for dispensing a compound onto the target animal.
- the actuation device includes a visual or acoustic stimulus actuated in response to the identification of the target animal.
- the apparatus includes a sound generator adapted to generate sounds to lure the target animal.
- the apparatus includes a visual lure to lure the target animal toward the apparatus.
- the communications module is further adapted to communicate the characteristics and/or the presence of a target animal to other nearby apparatus for detecting or identifying animals.
- the communications device is adapted to send and receive data to a remote database at predetermined time periods.
- the apparatus is incorporated into a drone or UAV which is controllably moveable around the environment.
- an actuation device including one or more actuators responsive to a sensor signal generated from an apparatus of the first or second aspect.
- a system configured to detect a presence of a target animal in an environment, the system including a plurality of apparatuses of the second aspect, wherein the communications module is adapted to communicate the identification of a target animal to one or more others of the apparatuses either directly or via a remote server.
- the communications module is adapted to communicate the identification of a target animal to one or more others of the apparatuses either directly or via a remote server.
- one or more other apparatus are controlled from a lower power mode into one or more high power modes.
- an apparatus configured to detect a presence of a target animal in an environment in real-time, the apparatus including: a housing; one or more sensors mounted on or within the housing and configured to sense one or more characteristics of the environment; a processor contained within the housing and configured to process the one or more images and environment characteristics to detect a presence of a target animal in real time; and a communication device adapted to communicate the environment characteristics and/or the presence of a target animal to other nearby apparatus to switch the other nearby apparatus into a monitoring mode.
- the one or more sensors include one or more of image sensors, acoustic sensors, temperature sensors, particle sensors and/or chemical analysers.
- an animal monitoring system for monitoring animals within an environment, the system including a plurality of animal monitoring apparatus positioned at spatially separated locations within the environment, each animal monitoring apparatus including: one or more image sensors mounted on or within the housing and configured to capture images of a region of the environment; and a processor contained within the housing and configured to process the images to identify a target animal in real-time; and a communications module for communicating the identification of a target animal to one or more others of the animal monitoring apparatus.
- the system of the sixth aspect may include a server configured to communicate with the communications modules of the animal monitoring apparatus to receive the identification of a target animal from one of the animal monitoring apparatus.
- the server upon identification of a target animal from one of the animal monitoring apparatus, is configured to receive images from the animal monitoring apparatus and process those images to classify the identified target animal as a subgroup of the target animals or an individual target animal.
- the communications module of that apparatus upon identification of a target animal by one of the animal monitoring apparatus, sends a signal to one or more other animal monitoring apparatus to switch that apparatus from a low power mode into a higher power mode.
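The escalation signal could be relayed server-side by waking only nearby devices. The sketch below is an assumption-laden illustration: the function name, the device registry layout and the 500 m radius are all hypothetical, introduced only to show the idea.

```python
import math

# Hypothetical sketch of the server-side relay: when one apparatus
# reports a target animal, neighbouring apparatus within range that are
# still in the low power mode are told to escalate. Names, registry
# layout and the default radius are assumptions for illustration.

def neighbours_to_wake(reporting_id, devices, radius_m=500.0):
    """devices: {device_id: (x_m, y_m, mode)}. Returns ids to escalate."""
    rx, ry, _ = devices[reporting_id]
    wake = []
    for dev_id, (x, y, mode) in devices.items():
        if dev_id == reporting_id or mode != "low":
            continue  # skip the reporter and already-escalated devices
        if math.hypot(x - rx, y - ry) <= radius_m:
            wake.append(dev_id)
    return wake
```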
- one or more of the animal monitoring apparatus are incorporated onto a drone or UAV device.
- an apparatus configured to detect a presence of a target animal in an environment in real-time, the apparatus including: a housing; one or more image sensors mounted on or within the housing and configured to capture images of a region of the environment; a processor contained within the housing and configured to process the images to detect a presence of an animal and/or classify an animal in real-time; and a communications device configured to transmit at least a subset of the captured images to a remote server upon detection of an animal by the processor, wherein the remote server is configured to process the images to classify the animal as a target animal.
- an apparatus configured to identify a target animal in an environment in real-time, the apparatus including: a housing; one or more image sensors mounted on or within the housing and configured to capture images of a region of the environment; an acoustic sensor configured to sense acoustic signals; and a processor contained within the housing and configured to process the acoustic signals to detect acoustic signals indicative of an animal and process the images to identify a target animal in real-time; wherein the apparatus is configured to operate in a low power mode and a higher power mode; wherein, in the low power mode, the one or more image sensors are deactivated and the processor is configured to detect acoustic signals indicative of an animal and, in response, trigger the higher power mode; and wherein, in the higher power mode, the one or more image sensors are activated and controlled to capture images and the processor is configured to identify a target animal in the environment.
- Figure 1 shows a schematic view of an apparatus configured to detect a presence of a target animal in an environment
- Figure 2 shows the apparatus in use in the field
- Figure 3 shows a system of the apparatus of Figure 1 and Figure 2 in use
- Figure 4 shows a dispensing device in accordance with an embodiment of the invention
- Figure 5 exemplifies a neural network for detecting the target animal
- Figure 6 shows a flow chart depicting the process of detecting a target animal in accordance with an embodiment of the invention.
- Referring to FIG. 1, there is illustrated schematically an apparatus 1000 configured to detect the presence of a target animal (shown in Figure 2 as 3000) in an environment 2000.
- the apparatus 1000 allows for local and/or remote processing of data inputs such as image and acoustic data with the use of sensors 102, 103, 105.
- the primary sensors include visible image sensors 102 and infrared image sensors 103 to image a region of an environment 2000 proximal to apparatus 1000.
- additional sensors 105 may be used to augment the primary sensors and these additional sensors include particle detectors/chemical analyser, acoustic sensors, optical or ultrasonic rangefinder sensors and temperature sensors.
- Image sensors 102 and 103 may include conventional two dimensional image sensors such as CMOS or CCD arrays, or may include more sophisticated sensors such as phase detect pixel sensors, stereo imaging systems, LIDAR systems, hyperspectral imagers, time of flight cameras, structured light systems and other systems capable of imaging a scene in three dimensions. These more sophisticated imaging devices are capable of extracting depth information from an imaged scene. Depth information allows the determination of distance to an object which, in turn, can more accurately allow determination of an animal’s size and speed of movement.
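The size and speed determination described above follows from the standard pinhole-camera relation, real size = pixel extent × depth / focal length (focal length in pixel units). The functions below are an illustrative sketch under that model; all names and values are assumptions, not from the application.

```python
# Pinhole-camera sketch (illustrative assumptions) of how depth
# information enables size and speed estimates for a detected animal.

def real_size_m(pixel_extent_px, depth_m, focal_length_px):
    """Physical extent of an object spanning pixel_extent_px at depth_m.
    real_size = pixel_extent * depth / focal_length (focal length in px)."""
    return pixel_extent_px * depth_m / focal_length_px

def speed_m_per_s(pos_a, pos_b, dt_s):
    """Ground speed from two 3D positions (metres) dt_s seconds apart."""
    dist = sum((a - b) ** 2 for a, b in zip(pos_a, pos_b)) ** 0.5
    return dist / dt_s
```

For example, an animal spanning 200 pixels at 5 m depth through a 1000 px focal length lens is about 1 m long.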
- Throughout this description, the terms "individual animal" and "individual target animal" will be used. These terms are intended to refer to a single unique individual animal that can be distinguished from all other animals via distinct characteristics such as physical markings, tags, gait and/or behaviour.
- the ability to process the data locally allows for real time or near real time processing and evaluation of the data, which may otherwise not be possible if the processing were performed remotely.
- the local processing may also minimize the use of network bandwidth for communications which tends to be limited (and expensive) in remote areas in which the apparatus 1000 is likely to be used.
- communication with a cloud server or other remote device to perform remote processing may be performed in some instances where higher processing power is required.
- Referring to FIG. 3, there is illustrated a system 3500 of apparatus 1000 which communicate wirelessly via a central remote server 306. Based on the outputs from the sensors (102, 103, 105), and in particular from the image sensors 102, one or more images are processed to identify the target animal 3000 and, in response to that identification, an identification data packet is generated and sent via a communications module 303 to the remote server 306.
- Server 306 may be a physical server or a cloud-based server in communication with one, many or all of the apparatus 1000 within system 3500.
- the server 306 is in communications with a network of other apparatus 1000, which may be distributed at spatially separate locations across a geographic region being monitored. In this manner, a network of apparatus 1000 are able to communicate with each other via server 306.
- the identification data packet is a small set of data capable of alerting server 306 of a potential detection of a target animal. The small data size allows the bandwidth and power demands on communications module 303 to be minimised.
- the identification data packet may contain information such as:
- Device specific data such as a unique identifier, location and current battery power level
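A compact serialisation of such a packet might look like the sketch below. Only the device identifier, location and battery level fields come from the description above; the species/confidence/timestamp fields and all field names are hypothetical, and the compact JSON encoding is one assumed way to keep the payload small.

```python
import json
import time

# Hypothetical sketch of the small identification data packet. Field
# names are assumptions; only device id, location and battery level are
# taken from the description, the rest is illustrative.

def build_packet(device_id, lat, lon, battery_pct, species, confidence):
    packet = {
        "dev": device_id,                       # unique device identifier
        "loc": [round(lat, 5), round(lon, 5)],  # device location
        "bat": battery_pct,                     # current battery power level
        "sp": species,                          # suspected target (assumed field)
        "cf": round(confidence, 2),             # classifier confidence (assumed)
        "ts": int(time.time()),                 # detection timestamp (assumed)
    }
    # Compact separators keep the payload small for constrained radio links.
    return json.dumps(packet, separators=(",", ":"))
```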
- apparatus 1000 is capable of detecting a target animal 3000 in the form of an individual animal, an animal species, sub species, cohort, or genus or group of animals or species, sub-species or genus.
- References to a "target animal" herein are intended to cover these different options.
- the apparatus 1000 includes a protective housing 100 where the one or more sensors 102, 103, 105 are mounted on or within the housing 100.
- Each of the sensors 102, 103, 105 are operably connected to a memory 107 and processor 104 for processing of the data acquired from the sensors 102, 103, 105.
- memory 107 may include random access memory (RAM), read-only memory (ROM) and/or electrically erasable programmable read-only memory (EEPROM).
- RAM random access memory
- ROM read-only memory
- EEPROM electrically erasable programmable read-only memory
- other equivalent memory or storage systems may be used as should be readily apparent to those skilled in the art.
- Each sensor 102, 103, 105 is adapted to detect various characteristics of the target animal 3000 such as physical appearance and/or behaviour, the size of the target animal, walking patterns, gait, movement speed and characteristics such as the texture, skin/fur patterns and colour of the target animal 3000.
- Where audio sensors are used, audio patterns, animal call sounds and signatures may be captured.
- Each of the image sensors 102 and 103 is configured to capture one or more images (in either 2D or 3D) of a region of the environment 2000.
- Each image sensor 102 and 103 may take a variety of forms ranging from a conventional CCD or CMOS visual spectrum image sensor, IR sensor, 3D camera or LIDAR sensor.
- a near-infrared 2D image sensor 103 with a global shutter paired with a high intensity pulsed near-infrared illumination source may be used.
- the illumination source is then "flashed" at the same time as the image sensor 102 global shutter in order to illuminate and capture images from a broad area of the local environment. This allows for a low power draw (around 10 mW or less) minimizing battery usage.
- a passive system is used in which no illumination device is implemented to further reduce power consumption.
- each image sensor (102, 103) is adapted to capture two or three dimensional images which, when processed by processor 104, can locally discriminate living creatures from non-living creatures.
- the housing 100 contains a processor 104 which is operably connected to a memory 107 for storage of instructions and data.
- the processor 104 is configured, inter alia, to process one or more images and inputs from the sensors (102, 103) to detect the presence of the target animal 3000 in real time.
- Figure 2 exemplifies the use of a single apparatus 1000 in use with a target animal 3000. However, it will be understood that more than one apparatus 1000 would be typically used to aid in the detection of a target animal 3000. A system of such apparatus 1000 is described below.
- Processor 104 may be implemented as any form of computer processing device or portion of a device that processes electronic data, e.g., from registers and/or memory to transform that electronic data into other electronic data that, e.g., may be stored in registers and/or memory.
- the processor 104 takes the form of an embedded system or system-on-chip (SoC) device.
- SoC device provides the benefit of a single platform where the entire computing system can be integrated onto the SoC device.
- the SoC device may further include one or more of the image sensors 102 and 103 and a clock for precisely determining a current time.
- the clock can provide valuable information input when determining whether a detected animal has been correctly identified.
- data from the clock can be accessed to register timestamps of detected animal events.
- a separate clock device may be included in apparatus 1000.
- Information from the local clock on apparatus 1000 may be used by processor 104 in the classification of animals.
- information from the local clock may be used by processor 104 as a measure of confidence or as an input to a classifier algorithm to avoid false positive detections that may otherwise occur.
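Using clock data as a confidence input can be sketched as a simple time-of-day prior; the hours, weights and function names below are assumptions for illustration, down-weighting detections of a nocturnal species during daylight.

```python
def time_of_day_weight(hour, nocturnal=True):
    """Hypothetical confidence weight based on the local clock:
    sightings of a nocturnal species in daylight are down-weighted."""
    is_night = hour < 6 or hour >= 19  # illustrative night-time window
    if nocturnal:
        return 1.0 if is_night else 0.4
    return 0.4 if is_night else 1.0

def adjusted_confidence(raw_confidence, hour, nocturnal=True):
    # Combine the raw classifier score with the clock-based prior,
    # reducing false positives at implausible times of day.
    return raw_confidence * time_of_day_weight(hour, nocturnal)
```

In practice the prior could instead be fed in as an extra classifier input rather than a post-hoc multiplier; the multiplicative form is simply the easiest to show.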
- processor 104 may be divided into different modules such as a vision processor functional module for performing computer vision processes and a device controller functional module for performing device control.
- the functions of the vision processor functional module and device controller functional module may be realized as separate hardware such as microprocessors in conjunction with custom or specialized circuitry.
- processor 104 is collectively realised as a heterogeneous computing system such as a big.LITTLE architecture in which a smaller, low power processor performs low level processing while a larger, more power intensive processor performs higher level processing.
- initial animal detection may be performed by the smaller processor, which may include a small image signal processor (ISP) engine and subsequent high level animal classification may be performed by the larger processor, which includes a larger ISP engine.
- the vision processor functional module of processor 104 is configured to process the captured images to perform target animal detection; for example to perform classification based on shape, movement (e.g. speed), colour, size, temperature, thermal signature and other characteristics of objects detected within the imaged environment.
- the vision processor module may utilize one or more image processing algorithms such as object tracking, edge detection, shape recognition, contour detection and spectral analysis.
- the device controller functional module of processor 104 is configured to control the various devices of apparatus 1000 such as sensors 102, 103 and 105.
- the device controller may control a timing, resolution and/or frame rate of image sensors 102 and 103 and may also control the selective illumination of one or more light sources to illuminate the environment being imaged (if the apparatus is fitted with active lighting).
- the processor 104 is capable of executing a machine learning algorithm such as a neural network classifier 4000 to detect the presence of the target animal.
- the algorithm 4000 may take a virtually unlimited number of inputs such as data from the imaging device, acoustic inputs, scent data, time data and GPS location data among other possibilities.
- Figure 5 illustrates exemplary inputs (e.g. 4001) in the form of image sensor input, acoustic sensor input, scent data from a particle or chemical sensor, time data from a clock and GPS location data. Other information such as a time of day may also be input to the algorithm 4000. It will be understood that a greater number of inputs will typically result in a higher probability of accurately detecting the target animal species.
- the inputs are fed to nodes (e.g. 4003) of one or more hidden layers (a single hidden layer 4005 is shown in Figure 5).
- the nodes represent weighted functions of the inputs wherein the weights of the functions can be varied based on training of the algorithm.
- the outputs of the nodes are combined at an output 4007 to produce a determination of a presence and/or classification of an animal species (or a determination that no target animal is present).
- Algorithm 4000 has preferably been trained on a dataset of images, sounds, videos and/or other characteristic data of the target animal or animals.
- the algorithm 4000 may be static or dynamic to be able to be further trained to improve the classification.
- the learning of algorithm 4000 may be supervised, semi-supervised or unsupervised.
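The single-hidden-layer arrangement of Figure 5 can be sketched as follows; the layer sizes, randomly initialised weights and the `classify` helper are illustrative stand-ins for a trained network, not the disclosed classifier 4000 itself.

```python
import math
import random

random.seed(0)  # deterministic toy weights for the sketch

# Toy single-hidden-layer classifier mirroring Figure 5: several sensor
# inputs feed weighted hidden nodes whose outputs combine at one output.
N_INPUTS, N_HIDDEN = 5, 4  # e.g. image, acoustic, scent, time, GPS features
W_hidden = [[random.uniform(-1, 1) for _ in range(N_INPUTS)]
            for _ in range(N_HIDDEN)]
W_out = [random.uniform(-1, 1) for _ in range(N_HIDDEN)]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def classify(inputs):
    """Return a probability-like score that the target animal is present."""
    hidden = [sigmoid(sum(w * x for w, x in zip(row, inputs)))
              for row in W_hidden]
    return sigmoid(sum(w * h for w, h in zip(W_out, hidden)))

p = classify([0.8, 0.3, 0.1, 0.5, 0.9])
```

A trained deployment would learn `W_hidden` and `W_out` from the dataset of images, sounds and other characteristic data described above rather than drawing them at random.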
- the housing 100 may be fabricated from a number of materials suitable for use in an outdoor setting.
- the housing 100 may be formed from a rigid polymer material that includes UV stabilized polymers to withstand the sun.
- a metallic material may be used for the housing 100 where greater longevity is required as it would be resistant to UV degradation.
- a metallic housing 100 made from a material resistant to corrosion such as stainless steel or aluminium would be preferred.
- a polymer housing is preferred if the housing is adapted to contain wireless communications hardware.
- the housing 100 may be formed of other rigid or semi-rigid materials such as wood.
- the apparatus 1000 includes one or more sensors 102, 103, 105 contained within the housing 100 and positioned to monitor environment 2000.
- the apparatus 1000 is powered by a battery 110.
- the battery 110 may be of the rechargeable variety in which case a combination of a solar panel array or small wind turbine can be used to charge the battery. Alternatively, single use batteries may be used where they are periodically replaced when the apparatus is maintained.
- apparatus 1000 may be connected to mains power and powered by the electricity grid.
- the solar panel array is configured to convert light incident upon the solar panel array into electrical power, and to store the electrical power in the battery 110. For instance, the solar panel array converts sunshine during daytime into power in order to recharge the battery 110.
- the solar panel array is located on an outer or top surface of the apparatus 1000. In other embodiments, the solar panel array is positioned remote from the apparatus 1000.
- the battery 110 can be recharged by a generator or an external power source, can be a replaceable power source (e.g., a replaceable battery that is swapped out periodically), or can be itself located remotely from the apparatus 1000 (for instance, via power lines electrically coupling the battery 110 to the apparatus 1000).
- Thermal image sensor 103 may utilize IR sensitive pixels to detect thermal characteristics of the target animal species 3000.
- the thermal image sensors 103 are preferably calibrated to have a high sensitivity at temperatures corresponding to the target animal’s core temperature so as to accurately detect the temperature of the target animal 3000 and/or thermal characteristics of regions of the target animal (e.g. a heat map of the animal).
- certain target animals 3000 may have well defined body temperatures and, as such, the detected temperature of the detected animal may be used as an input to processor 104 to classify whether the detected animal is a target animal species or not based at least in part on the temperature of the animal.
- the detected temperature may be used to determine a health status of the target animal 3000. For instance, if a visual classification confirms the identity of the target animal 3000 as a given animal species but the detected temperature is outside the expected range for the animal species, this may be indicative that the target animal is unwell. Such health information can also be useful ecological data to obtain.
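A temperature check of this kind might be sketched as below; the species names and temperature ranges are assumptions for illustration, not values from the disclosure.

```python
# Illustrative expected core-temperature ranges in degrees Celsius;
# the figures here are assumed placeholders, not from the patent.
EXPECTED_TEMP_C = {"feral_cat": (38.0, 39.5), "fox": (38.5, 40.0)}

def assess_temperature(species, detected_temp_c):
    """Use the IR-detected temperature both as a classification
    cross-check and as a rough health indicator."""
    low, high = EXPECTED_TEMP_C[species]
    if low <= detected_temp_c <= high:
        return "consistent"
    # Possible misclassification, or an unwell animal if the visual
    # classification has already confirmed the species.
    return "outside_expected_range"
```

The "outside_expected_range" result would be combined with the visual classification outcome to decide between a misidentification and a health flag.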
- visual characteristics of an animal are used to make a determination as to whether the detected animal is the target animal. These visual characteristics include but are not limited to; a shape of the target animal 3000, colour and/or marking, size, one or more predefined movement characteristics such as gait of the target animal 3000 over more than one image, a core or average temperature of the target animal 3000 or a temperature distribution across the target animal 3000 as detected by the IR sensor 103, the brightness or reflectivity of the target animal and any distinct marking that the target animal 3000 may have.
- the apparatus 1000 includes a GPS device 106 to allow for the determination of the location of the apparatus 1000.
- the location of the apparatus 1000 may be used to provide important information about the location of the detected target species and the location of the apparatus to assist when it requires service or replacement among other things.
- In addition to sensors that are sensitive to the visual and IR ranges, other embodiments include the use of an acoustic sensor 108.
- the acoustic sensor 108 is calibrated to detect the acoustic characteristics of the environment and more specifically, any characteristic noises that the target animal species may make including mating calls and other characteristic sounds of the particular animal species of interest.
- light level sensors may be used.
- a humidity sensor may be used.
- ambient temperature sensors may be used.
- the apparatus includes a particle detector/chemical analyser 105 which is adapted to detect in its broadest sense, scent characteristics of the environment and more specifically, signature characteristics of the target animal 3000 such as pheromones indicative of a certain target animal. Other characteristics such as the animal's sex, health status or pregnancy status may be determined using the detection of scent.
- the apparatus 1000 includes a wireless identifier configured to detect wireless signals and pre-stored identification data associated with one or more animals.
- the wireless identifier may take the form of a Zigbee® based RF tag or Wireless Sensor Network (WSN) technology, which may provide long range low power wireless tracking of the target animals 3000.
- the wireless identifier may take the form of a Radio Frequency Identification (RFID) device.
- the RFID device may be used to detect RFID signals in the environment such as those that may be present in the vicinity of an animal with an embedded RFID chip.
- the RFID device makes use of active RFID tags allowing a range of tracking in the vicinity of hundreds of meters.
- Local memory 107 storage on the apparatus 1000 allows images and other data related to a detection event to be stored and later retrieved either manually or via a network for analysis.
- the memory 107 may take the form of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), and other equivalent memory or storage systems as should be readily apparent to those skilled in the art.
- the software/firmware stored on each of the apparatus 1000 may be updated either manually by an operator in the field or more preferably, over the air (OTA) or generally via a network where multiple units can be updated at one time.
- the apparatus 1000 is configured to operate in a plurality of different power modes.
- apparatus 1000 may be configured to operate in either a low power mode, allowing for long term monitoring in a low power state, or a higher power active mode, which may be triggered when greater computing power is required.
- the low power mode operates when the device is in its quiescent state awaiting the detection of a change in the environment such as the presence of a target animal 3000. Once a change in the environment is detected, the device may then operate in the high power mode where greater computing power and additional functions are utilized to determine whether the change in environment is a detection of the target animal 3000 or not. Control of which power mode the apparatus 1000 operates in is performed by processor 104 as described below.
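The two power modes and the transitions between them can be sketched as a simple state machine under processor 104's control; the class and method names below are illustrative assumptions.

```python
from enum import Enum

class PowerMode(Enum):
    LOW = "low"    # quiescent monitoring state
    HIGH = "high"  # full sensing and classification

class ApparatusController:
    """Illustrative sketch of the power-mode control described above."""

    def __init__(self):
        self.mode = PowerMode.LOW  # default state on initialisation

    def on_environment_change(self):
        # A trigger (motion, sound, weight sensor) wakes the device
        # so that greater computing power can be applied.
        self.mode = PowerMode.HIGH

    def on_no_target_found(self):
        # No target animal confirmed: return to the quiescent state
        # to conserve battery.
        self.mode = PowerMode.LOW

ctrl = ApparatusController()
ctrl.on_environment_change()
mode_after_trigger = ctrl.mode
ctrl.on_no_target_found()
mode_after_reset = ctrl.mode
```

A fuller implementation would add the intermediate Stage 1/Stage 2 states described later in the method of Figure 6.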
- the one or more image sensors may operate at a lower frame rate and/or resolution, consuming less processing power and memory and in turn consuming less electrical power.
- the deactivation of multiple image sensors may be performed in the low power mode further reducing the power consumption in this mode.
- the transition from low power to high power mode may be instigated by the use of a trigger signal which may take a number of forms such as a visual trigger such as the detection of a shape or movement in the images or a mechanical or electromechanical trigger associated with the apparatus.
- Other triggers may include the detection of a predefined brightness or temperature in the images.
- trigger signal is an acoustic trigger, where for instance, the trigger may activate when a particular sound indicative of the target animal species 3000 is detected.
- one or more illumination devices are included in association with the apparatus 1000. These illumination devices may include one or more LEDs operating in the visible and/or infrared wavelength range. The illumination devices may be controlled by processor 104 to selectively illuminate parts of the environment to modify an animal's behaviour among other things.
- the apparatus 1000 includes a communications module 303, which is adapted to communicate with other apparatus 1000 positioned in the environment and alternatively to send and receive data to a remote server 306 at predefined time periods for the collection and retrieval of data among other things.
- the communications module 303 is adapted to communicate information such as the visual characteristics and/or presence of the target animal to server 306, which can, in turn, communicate with communications modules of other apparatus 1000.
- the communications module 303 is further adapted to communicate environmental characteristics and/or the presence of a target animal 3000 to other apparatus 1000 via server 306. In some embodiments, communications module 303 is able to communicate directly with other communications devices of other apparatus directly and bypass central server 306.
- the communications module 303 can include a wireless receiver, transmitter, or transceiver hardware.
- the communications module 303 may also be adapted to transmit other status data such as an energy level of the battery 110 (e.g. state of battery charge) or diagnostic codes in the event of a malfunction.
- Firmware updates may be performed Over the Air (OTA) using the communications module 303.
- a variety of wireless protocols may be used, with low power wireless protocols such as Bluetooth, BLE, ZigBee, 2G or 3G being some examples.
- communications module 303 includes hardware for communicating between devices over a wired network such as USB, Ethernet, twisted pair or coaxial cables.
- the communications module 303 is adapted to communicate the detected presence of a target animal 3000 to any of the other apparatuses 1000 allowing for the tracking of the target animal and the collection of information as to the target animal's 3000 movements.
- the detected presence of the target animal 3000 at one of the apparatus 1000 may be communicated to other apparatus within the vicinity causing them to alter from a lower power mode to a high power mode in anticipation of the target animal 3000 approaching other apparatus 1000 in the vicinity of the apparatus 1000 that initially detected the target animal 3000.
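Propagating a detection to apparatus in the vicinity might be sketched as a radius query over known device locations; the 2 km radius and the flat-earth distance approximation below are assumptions for illustration.

```python
import math

def nearby_devices(devices, origin, radius_km=2.0):
    """devices: dict of id -> (lat, lon) in degrees. Returns the ids of
    devices within radius_km of origin, using a simple equirectangular
    approximation (adequate over short ranges)."""
    lat0, lon0 = origin
    result = []
    for dev_id, (lat, lon) in devices.items():
        dx = (lon - lon0) * 111.32 * math.cos(math.radians(lat0))  # km east
        dy = (lat - lat0) * 110.57                                 # km north
        if math.hypot(dx, dy) <= radius_km:
            result.append(dev_id)
    return result

# Hypothetical deployment: A detects the animal; A and B should be woken.
devices = {"A": (-35.280, 149.130), "B": (-35.282, 149.132),
           "C": (-35.500, 149.400)}
woken = nearby_devices(devices, origin=(-35.280, 149.130))
```

Server 306 would then send each returned device a trigger to switch from the low power mode to a higher power mode in anticipation of the animal's approach.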
- apparatus 1000 are stationary devices deployed at specific locations throughout environment 2000.
- apparatus 1000 may be in the form of a drone or unmanned aerial vehicle (UAV), which can be controlled to move around environment 2000.
- While apparatus 1000 can be used simply for detection of animals within the environment 2000, in some embodiments, apparatus 1000 is advantageously capable of performing various actions in response to detection of a target animal. These responsive actions include:
- Dispensing a compound such as a poison or medicament from a dispenser on the apparatus 1000
- Dispensing a poison compound onto the target animal (for pest animals)
- Dispensing food or water from a food dispenser on the apparatus 1000.
- the apparatus further includes an actuation device 402 which is responsive to a sensor signal generated from the processor 104 for initiating a response based on the detection of the target animal 3000.
- the actuation device 402 further includes a dispenser 404 which is adapted to dispense a compound 406 onto the target animal 3000.
- the compound is a pharmaceutical substance for medicating the target animal in response to a disease state.
- the pharmaceutical is a toxin.
- the dosage of toxin supplied in the pharmaceutical is at least sufficient to incapacitate the target animal, and may cause death instantaneously or delayed relative to the ejection of the pharmaceutical, for instance via toxin-induced anoxia or other physiological effect.
- the pharmaceutical is sodium fluoroacetate ("1080") which is a well-known poison in Australia, and an effective dosage of 1080 may be between 5 mg and 12 mg, which may be supplied as 0.4 ml of 30 g/L concentrate 1080.
- 1080 has the advantage that it is present in a number of Australian native flora species and as such, Australian native animals tend to have a higher tolerance to it than introduced species such as feral cats. For instance, a predetermined dose of 1080 can euthanize a feral cat without harming other Australian native species.
- the pharmaceutical is PAPP and the dosage of the pharmaceutical is between 100 mg and 300 mg.
- the toxin supplied in the pharmaceutical is delivered within a volume of fluid between approximately 1 ml and 5 ml.
- the pharmaceutical may be supplied in a viscous form.
- the pharmaceutical includes a gel formulation.
- the gel formulation has a consistent viscosity at a range of temperatures and pressures, and can beneficially improve the reliability of the speed, direction, and precision of the ejection of the pharmaceutical.
- the pharmaceutical includes a grease formulation, or is administered as a spray.
- a syringe or similar vessel is adapted (e.g., as part of the dispenser 404) to provide separate measured doses of the pharmaceutical for each ejection of the pharmaceutical by the targeting system.
- In other embodiments, a larger vessel (e.g., a canister or tank) provides a constant supply of the pharmaceutical to be applied or ejected in amounts corresponding to single doses per application/ejection.
- different target animals may receive different doses of the pharmaceutical.
- the dose of the pharmaceutical applied to a given target animal can be selected, for instance based on the type of the target animal 3000 detected by the apparatus 1000.
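Selecting a dose based on the detected animal type might be sketched as a simple lookup; the species-to-dose mapping below is an assumption for illustration only, not a recommended dosing table.

```python
# Hypothetical per-species unit doses in milligrams; values are
# illustrative placeholders, not figures from the disclosure.
DOSE_MG = {"feral_cat": 6.0, "fox": 8.0}

def select_dose(animal_type):
    """Return a measured dose (mg) for the detected target animal type,
    or None if no dose is configured (i.e. do not dispense)."""
    return DOSE_MG.get(animal_type)
```

Returning `None` for unconfigured types gives a fail-safe default: the dispenser 404 only ejects when the classification matches a configured target animal.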
- the pharmaceutical is enclosed within a frangible membrane designed to rupture upon contact with the target animal 3000.
- the frangible membrane contains the pharmaceutical and each membrane, which may be in the form of a capsule, pellet, ball, or the like, contains a distinct unit dose of the pharmaceutical.
- the dispenser 404 can shoot or eject the frangible membrane at the target animal 3000, which ruptures upon contact with the target animal 3000, causing the pharmaceutical enclosed within to contact and/or stick to the coat of the target animal 3000.
- the frangible membrane is shot at the target animal 3000 at a speed fast enough to ensure the frangible membrane ruptures, but at a speed slow enough to prevent significant pain from being caused to the target animal 3000.
- the actuation device 402 includes a visual or acoustic (audio lure) stimulus which may be actuated in response to the detection of the target animal 3000.
- the visual stimulus may include a LED light arrangement which is adapted to generate a pattern of light that is likely to attract or lure the target animal or alternatively scare non-target animals away from the apparatus.
- An acoustic stimulus may be produced by a sound generator and fed through a loudspeaker mimicking noises characteristic of or attractive to the target animal 3000 which may aid in luring the target animal to a vicinity of the apparatus 1000.
- the acoustic stimulus is programmable enabling realistic sounds including cat prey and mating calls for cats and foxes to be broadcast at variable volumes and intervals to optimize the luring capacity of the apparatus 1000.
- the acoustic stimulus can be configured to play at certain times of the day further improving the ability to lure target species.
- Referring to Figure 6, there is illustrated a flow chart depicting an exemplary method 6000 of operating apparatus 1000 for the detection of a target animal 3000.
- the apparatus 1000 enters a low power or quiescent state where the environment 2000 is monitored at a basic level.
- the low power state may be the default state that apparatus 1000 enters upon initialisation or after a predetermined period of no activity.
- apparatus 1000 performs basic monitoring of the environment 2000 and only certain functions of apparatus 1000 will be activated.
- some of the sensors may be deactivated, one or more of the image sensors may operate at a lower frame rate and/or resolution, illumination devices (if installed) may be deactivated or turned down and other functions deactivated.
- processor 104 may be configured to perform only basic image processing algorithms such as low resolution object detection, brightness variations or edge detection that draw relatively low power.
- sensors 102 and 103 may be deactivated and a low power sensor such as a motion sensor or acoustic sensor may be used to simply detect motion or sounds within the environment 2000.
- a basic image classifier may be trained and executed on processor 104 to detect the normal background of environment 2000 being imaged by the image sensors. Minor changes across images, such as movement of branches or sunlight changes throughout the day, may be taken into account in this classifier to reduce the instance of false triggers. This basic classifier can more accurately determine when an animal enters the environment scene being imaged, which would substantially change the normal background that is imaged.
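A low-power background check of this kind can be approximated by a running background estimate with a change threshold; the flat-list frame representation, the update rate and the threshold value below are illustrative assumptions.

```python
def mean_abs_diff(frame, background):
    """Mean absolute pixel difference between a frame and the background."""
    return sum(abs(a - b) for a, b in zip(frame, background)) / len(frame)

def update_background(background, frame, alpha=0.05):
    # A slow exponential update absorbs gradual lighting changes,
    # reducing false triggers from sunlight drift through the day.
    return [(1 - alpha) * b + alpha * f for b, f in zip(background, frame)]

def is_triggered(frame, background, threshold=20.0):
    """True when the scene differs enough from the learned background."""
    return mean_abs_diff(frame, background) > threshold

background = [100.0] * 16                  # toy flat "scene"
quiet_frame = [102.0] * 16                 # minor lighting variation only
animal_frame = [100.0] * 8 + [180.0] * 8   # large localised change

quiet_triggered = is_triggered(quiet_frame, background)
animal_triggered = is_triggered(animal_frame, background)
```

A trained classifier as described above would replace the fixed threshold with a learned model of the normal scene, but the trigger/no-trigger structure is the same.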
- Where processor 104 includes multi-stage processor hardware such as in a big.LITTLE architecture, the smaller processor and ISP engine may be used to perform the low power mode processes of step 6001.
- the low power mode remains active until, at step 6002, a trigger signal is detected by processor 104.
- the trigger signal may include a trigger from a mechanical device such as a weight sensor, a detection of a shape or motion by one of the image sensors 102 and 103 or detection of a sound by acoustic sensor 105 or a separate motion sensor. Where a basic classifier is implemented by processor 104, the trigger may be upon detection of a change in the normal background that is imaged by image sensor 102 or 103.
- the trigger signal may also be received from another nearby apparatus or server 306 via communications module 303. For example, if the nearby apparatus has detected a target animal in the environment, it may transmit a trigger signal to apparatus 1000 via server 306 and other nearby apparatus to wake them from their low power state.
- processor 104 switches apparatus 1000 into a first higher power mode in which additional functionality is activated.
- This mode, termed "Stage 1" in Figure 6, is configured to allow apparatus 1000 to detect whether an animal is present.
- the level of processing and power consumption is somewhat higher than that of the low power mode in order to perform the detection.
- the image sensors may be configured to image the environment at a higher resolution and/or higher frame rate, deactivated sensors may be activated and illumination devices (if installed) may be activated.
- processor 104 may be configured to implement more comprehensive image processing algorithms such as shape recognition, spectral analysis and a machine learned classifier in order to determine whether or not an animal is present.
- Stage 1 may include capturing one or more still frame (non video) images and performing image processing on those images. In other embodiments, Stage 1 may include capturing a low frame rate video sequence and performing image processing on that sequence of images.
- image sensors 102 or 103 may be activated.
- If the processor 104 registers that the current time of day is within daylight hours, visible image sensor 102 may be activated and IR sensor 103 deactivated.
- If processor 104 registers the current time of day as being night time, visible image sensor 102 may be deactivated and IR sensor 103 activated.
- During Stage 1, maintaining low power consumption remains of primary importance, as many false triggers may switch apparatus 1000 into Stage 1.
- The determination of whether an animal is present may include processor 104 determining a confidence measure, and, if the confidence measure is above a threshold value, a designation that an animal has been detected is made.
- a threshold confidence value might be 70%, 80%, 90% or 95%. This confidence value may be determined by the detection of known characteristics of animals in comparison to characteristics of general movement within the environment. In many cases, the detected movement will not be due to an animal but rather motion within the environment. If no animal has been detected, the system returns to step 6001 and apparatus 1000 re-enters the low power mode. If an animal is detected, then system operation proceeds to step 6005, which includes a classification operation to classify the detected animal.
- Animal detection at step 6004 may occur via a number of techniques including matching recorded data with data stored in a database. This includes basic shape or pattern recognition and comparison with a database of stored animal shapes, acoustic recognition of a stored animal call, thermal signature detection from IR images detected by IR sensor 103, and movement or motion detection amongst others.
- Where processor 104 includes multi-stage processor hardware such as in a big.LITTLE architecture, the larger processor and ISP engine may be used to perform the higher level processing of step 6004.
- the period from commencement of Stage 1 (upon triggering) to the determination at step 6004 is preferably only a period of milliseconds, such as less than 100 milliseconds. In some embodiments, the period is less than 50 milliseconds or less than 10 milliseconds. This rapid detection is preferable so as to be able to quickly detect a fast moving animal within the field of view of image sensors 102 and 103.
- At this stage, apparatus 1000 does not know what type of animal has been detected; simply that an animal has been detected.
- This form of low-end processing allows apparatus 1000 to operate at low power and to return to the low power mode if some trigger other than an animal, such as movement of a tree, switches the device into the Stage 1 mode.
- apparatus 1000 may send a signal via communications module 303 to server 306 or directly to other nearby apparatus to alert them of the animal detection.
- This alert may, for example, trigger those devices to switch from the low power mode into the Stage 1 or a higher power mode for detecting the animal.
- apparatus 1000 is switched into a Stage 2 “classification” mode of operation in which a target animal classification process is performed by processor 104.
- Stage 2 represents a more processor-intensive mode in which a higher level of processing occurs to classify the animal based on input received from the various sensors.
- Additional devices such as sensors and illumination devices may be activated or switched into a different mode.
- image sensors 102 and 103 may be activated into higher frame rate and/or higher resolution modes to better capture characteristics of the animal such as shape, physical appearance and/or behaviour, the size of the animal, walking patterns, gait, movement speed and characteristics such as the texture, skin/fur patterns and colour of the animal.
- Stage 2 represents a higher power mode than Stage 1, which, in turn, draws a higher power than the low power mode of step 6001.
- Stage 1 detection performs analysis on only a single image frame or small number of images while Stage 2 classification includes performing analysis on a sequence of video images. Analysis of video allows processor 104 to determine temporal characteristics of the animal such as movement gait and behaviour.
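Extracting a temporal characteristic such as movement speed from a video sequence can be sketched as below; the per-frame centroid positions are assumed to come from an upstream object tracker, and the frame rate is illustrative.

```python
def movement_speed(positions, fps=10.0):
    """positions: list of (x, y) animal centroids in metres, one per frame.
    Returns the average speed in metres per second across the sequence."""
    if len(positions) < 2:
        return 0.0
    total = 0.0
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        total += ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5  # per-frame distance
    # Average distance per frame, converted to distance per second.
    return total * fps / (len(positions) - 1)

# An animal moving 0.1 m per frame at 10 fps travels 1 m/s.
speed = movement_speed([(0, 0), (0.1, 0), (0.2, 0), (0.3, 0)])
```

Derived quantities like this (speed, stride period, direction changes) are the kind of temporal features that can feed the Stage 2 classifier alongside static shape and colour features.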
- Stage 2 classification includes a more comprehensive analysis of the data by processor 104.
- Stage 2 classification includes performing a classifier algorithm such as that described above and illustrated in Figure 5.
- At step 6006, processor 104 determines whether or not a target animal has been detected in the classification of step 6005. As with step 6004, this decision may be based on a confidence measure produced with the classification. If the confidence of the detected animal being a target animal is greater than a threshold confidence value, then a decision is made that a target animal has been detected.
- For example, the threshold confidence value might be 70%, 80%, 90% or 95%.
- The detection of a target animal at step 6006 may not be limited to a single animal but may include a group of target animals (e.g. fox, feral cat, endangered pygmy possum) that are of interest to be monitored.
- A plurality of target animals may be stored in memory 107 and accessed by processor 104 for classification.
- The target animal classification at step 6005 may simply detect a species of target animal, or it may detect a subset (e.g. male or adult animals only), cohort or individual target animal.
- If, at step 6006, a target animal is not detected, then the process returns to step 6001 and apparatus 1000 again enters the low power mode. If, at step 6006, a target animal is detected, then the process proceeds to optional step 6007, in which further processing is performed to identify a specific individual target animal.
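The threshold decision at step 6006 can be sketched as follows. This is an illustrative sketch only: the target set, the 80% threshold and the classifier output format are assumptions, not the actual implementation.

```python
# Illustrative sketch of the step 6006 decision described above.
# The target set, threshold and classifier output are assumptions.

TARGET_ANIMALS = {"fox", "feral cat", "pygmy possum"}  # e.g. loaded from memory 107
CONFIDENCE_THRESHOLD = 0.80                            # could be 70%, 80%, 90% or 95%

def target_detected(label: str, confidence: float) -> bool:
    """Decide that a target animal has been detected when the classified
    species is in the stored target set and the classifier's confidence
    exceeds the threshold confidence value."""
    return label in TARGET_ANIMALS and confidence > CONFIDENCE_THRESHOLD
```

Under these assumptions, a fox classified at 91% confidence would count as a detection, while the same fox at 65% confidence would not, returning the apparatus to the low power mode.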
- This step is designated as Stage 3 and is optional as Stage 2 may be sufficient to detect an individual target animal.
- For example, an individual animal of interest may have known physical markings that can be detected in the Stage 2 classification. However, in some embodiments, particularly where an individual animal is difficult to distinguish from other animals of a species, Stage 3 classification can provide a further classification to determine whether the target animal is the individual animal of interest.
- Stage 3 classification may include running a more advanced classifier algorithm, such as a machine-learnt algorithm trained on images of the individual animal.
- Stage 3 classification may be executed solely by processor 104 within apparatus 1000 (e.g. by the larger processor and ISP engine of a two-stage or heterogeneous processor system such as the big.LITTLE architecture), or may be executed wholly or in part by server 306 or another cloud server with higher processing power.
- Stage 3 classification may also include receiving inputs from other nearby apparatus which may have detected the individual animal to consider movement patterns of the animal.
- The Stage 3 classification at step 6007 determines a confidence value indicating the confidence that the individual animal has been detected. If the confidence value is greater than a confidence threshold, then, at step 6008, processor 104 determines that the individual target animal has been detected.
- Again, the threshold confidence value might be 70%, 80%, 90% or 95%. If, at step 6008, the confidence value is lower than the threshold confidence value, then apparatus 1000 is returned to a lower power state such as the low power mode of step 6001.
- At step 6009, one or more responsive actions are taken by apparatus 1000, depending on the animal detected.
- For example, the responsive action might be to dispense a poison from a dispenser in apparatus 1000, as described above and also in Australian patent 2016302398 entitled “Device for Proximal Targeting of Animals”.
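The staged flow of steps 6001 to 6009 can be summarised as a simple gating loop. In this sketch the stage functions are placeholders supplied by the caller; only the gating and fall-back structure is taken from the process described above.

```python
# Sketch of the staged detection flow (steps 6001-6009) described above.
# The stage callables are placeholders; only the gating/fall-back
# structure reflects the process in the text.

def detection_cycle(stage1_detect, stage2_classify, stage3_identify=None):
    """Run one cycle: Stage 1 low-cost presence check, Stage 2 target
    classification, and optional Stage 3 individual identification.
    Returns the resulting state of the apparatus."""
    if not stage1_detect():
        return "low_power"          # step 6004 failed: remain in low power mode
    if not stage2_classify():
        return "low_power"          # step 6006 failed: re-enter low power mode
    if stage3_identify is not None and not stage3_identify():
        return "low_power"          # step 6008 failed: return to low power mode
    return "responsive_action"      # step 6009: take responsive action(s)
```

Each stage draws more power than the one before it, so any failed check immediately drops the apparatus back to the low power mode rather than continuing to the more expensive stages.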
- Example responsive actions taken at step 6009 include:
- Triggering a trap; and/or
- Activating one or more additional sensors such as acoustic sensors or RFID tag sensors to capture more data about the individual animal.
- Apparatus 1000 may optionally perform a verification that the action was successful. This may include processing a short sequence of images captured after the action to observe the outcome.
- For example, image sensor 102 is controlled to capture a short sequence of images and processor 104 processes the images to visually identify that the poison was administered to the animal (e.g. poison observed to land on the animal’s body).
- Processor 104 may issue a verification to server 306 and/or store the verification in memory 107.
- Communications module 303 may be controlled to transmit the short sequence of images captured after the action to server 306 for analysis and verification by a human operator.
- Processor 104 may also store relevant information in memory 107 and/or transmit information to server 306. This may include the detection of a particular animal or animal species for counting in a study, markings of the detected animal, movement patterns, direction of travel, gait, behavioural characteristics (e.g. injured), as well as biometric information such as age, size and gender. This information is valuable for the ongoing study of the ecosystem within environment 2000.
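The stored or transmitted information might be organised as a record such as the following. The field names are hypothetical, drawn from the characteristics listed above; the patent does not specify a data format.

```python
# Hypothetical layout of a detection record stored in memory 107 or sent
# to server 306. Field names are assumptions based on the text above.

import json
from datetime import datetime, timezone

def make_detection_record(species, markings=None, direction=None,
                          gait=None, behaviour=None, biometrics=None):
    """Bundle a detection event with a timestamp so the server can later
    correlate events from multiple apparatus."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "species": species,
        "markings": markings,
        "direction_of_travel": direction,
        "gait": gait,
        "behaviour": behaviour,              # e.g. "injured"
        "biometrics": biometrics or {},      # e.g. age, size, gender
    }

# Serialised form suitable for storage or transmission to server 306.
payload = json.dumps(make_detection_record("fox", markings="dark tail tip"))
```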
- Apparatus 1000 is able to operate in more than or fewer than four power modes.
- System 3500 is able to collectively monitor a large area of environment 2000 that extends significantly beyond the field of view of a single sensor apparatus 1000.
- Apparatus 1000 may be deployed at spatially separated locations around environment 2000, particularly in locations where target animals are known or likely to be present.
- At least a subset of the apparatus 1000 may be fitted to drones so that they are mobile and can controllably move around environment 2000.
- The apparatus 1000 are each preferably locally powered by on-board batteries, optionally supplemented with solar and/or wind turbine installations. However, in some embodiments, some or all of the apparatus 1000 may be powered by mains power. Further, each apparatus 1000 preferably communicates wirelessly with remote server 306 via communications module 303, and communicates only small amounts of data over short periods of time and at predetermined time periods so as to minimise power consumption of the apparatus 1000. However, in some embodiments, communications module 303 includes a wired connection such as USB, Ethernet or twisted pair cable to connect each apparatus 1000 with remote server 306 and/or to connect different apparatus to each other. The transmitted data may be compressed and encrypted by various data encryption algorithms known in the art. In other embodiments, the various apparatus 1000 are able to communicate directly with each other without communicating with server 306, such as in a mesh network.
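The low-duty-cycle reporting described above, in which small compressed payloads are sent only at predetermined times, might be sketched as follows. The use of zlib and the one-hour period are illustrative assumptions; the text does not name a specific compression or encryption algorithm or schedule.

```python
# Sketch of low-duty-cycle reporting: batch, compress and transmit
# detection data only at predetermined intervals to minimise power use.
# zlib and the one-hour period are assumptions, not from the patent.

import json
import zlib

REPORT_PERIOD_S = 3600.0   # assumed schedule: transmit once per hour

def due_for_transmission(now_s: float, last_sent_s: float) -> bool:
    """Transmit only at predetermined time periods to conserve battery."""
    return (now_s - last_sent_s) >= REPORT_PERIOD_S

def prepare_payload(detections: list) -> bytes:
    """Compress a batch of detection events before transmission
    to remote server 306."""
    return zlib.compress(json.dumps(detections).encode("utf-8"))
```

Encryption would be applied on top of the compressed payload using whichever algorithm the deployment requires; that step is omitted here.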
- Server 306 acts as a central hub for collating data from each apparatus 1000 and performs system-level decision making, such as determining which apparatus to switch into higher or lower power modes, which apparatus are malfunctioning, and whether any apparatus needs to be relocated, serviced or replaced.
- Server 306 may also intermittently issue software updates to the various apparatus to, for example, update the classification algorithms to more accurately classify the current target animals or change the target animals to be classified.
- Server 306 may also monitor power levels of the respective batteries of each apparatus 1000 and issue alerts if a battery needs replacement or if an apparatus is offline.
- Server 306 may also perform a higher level classification than that performed by apparatus 1000, such as the Stage 3 classification described above.
- For example, each apparatus may employ one or more static classifier algorithms which perform classification of target animals and feed these classifications and associated images, acoustic data and other data to server 306 for further processing.
- Server 306 may employ a dynamic machine learning algorithm which continues to learn based on an updated training dataset fed by the data received from each apparatus 1000. This may be particularly useful when the system 3500 is aiming to detect individual animals having distinct markings or the like. In this situation, the apparatus 1000 may be used to detect the particular species of animal and server 306 performs a higher level analysis of the data to determine if the particular animal has been detected or not.
- Using server 306 to perform a higher level classification allows the classification software and hardware used in the apparatus 1000 to be kept relatively simple. This leads to reduced cost and power consumption by the apparatus 1000 and a longer field operating lifetime.
- Server 306 may be powered by mains power due to its higher power consumption.
- Server 306 may periodically issue software updates to the apparatus 1000 as the dynamic classification system used at the server is able to classify target animals more accurately and/or the target animals to be identified and monitored changes.
- Here, real-time or near real-time means a time frame in the order of milliseconds, and preferably less than 100 milliseconds, in order to be able to detect, image and classify a swiftly moving animal and perform an appropriate responsive action before the animal moves out of a responsive action zone proximal to the apparatus.
- This rapid timing is important as animal management can be very time-sensitive.
- Efficient animal management requires identification of animals at multiple locations to be time-stamped, retrieved and centralised into a single database, where correlations between the detections can be made in order to understand the prevalence, movements and other statistics of the animal population.
- System 3500 is able to achieve this using the centralised server 306 as the central management engine.
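The central correlation described above can be sketched as grouping time-stamped detection events by animal identity. The event layout below is an assumption for illustration.

```python
# Sketch of server-side collation: merge time-stamped detection events
# from multiple apparatus into per-animal tracks so that movements
# across the environment can be inferred. Event layout is assumed.

from collections import defaultdict

def collate_tracks(events):
    """Group events by animal identity, ordered by timestamp, yielding
    the sequence of apparatus locations at which each animal was seen."""
    tracks = defaultdict(list)
    for e in sorted(events, key=lambda e: e["t"]):
        tracks[e["animal_id"]].append((e["t"], e["apparatus_id"]))
    return dict(tracks)
```

Two detections of the same individual at different apparatus then reveal a direction of travel through the environment, which no single sensor apparatus could determine on its own.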
- Server 306 may be cloud-based or include or communicate with a database that is cloud-based to alleviate the very real risks of data loss through transfer from local devices to more permanent data storage devices and during/following classification.
- Server 306 may host or enable an interface or dashboard that is accessible by ecologists or other personnel to access and analyse the data gathered by system 3500.
- The system may provide an efficient interface able to serve the needs of several key user roles.
- Server 306 may also host management software capable of accessing a database to collate detection event data from groups of apparatus in the same geographic area.
- The management software may also include the capability to upload and update sensor software and configurations using "over the air" updates.
- The management software may provide the capability to remotely alert users of a particular category of animal detection event (for example, a feral or domestic cat, or a particular species of wildlife, being detected).
- The management software may also provide the capability for members of the public to observe or be notified about the identity of the animals being detected (for example, to notify them that their cat or dog has been detected).
- When there are undefined numbers of animals over a large geographic area, and potentially multiple user roles in the management of the animals, system 3500 described above provides various benefits and advantages, including:
- A plurality of sensors spatially dispersed in the environment in order to detect more animals in a wider variety of locations, habitats and times.
- Sensors that are both sensitive and specific (or accurate) in order to avoid false triggering/classifications cluttering the system, creating unnecessary costs, consuming battery life/power and memory storage, and triggering erroneous actions.
- Sensors or remote management software that are equipped to recognize individual animals that have been previously identified (for example detecting a specific individual tiger in a population of tigers).
- Management software that includes a database to collate detection event data from groups of sensors in the same geographic area.
- Management software that includes the means to upload and update sensor software and configurations using “over the air updates”.
- Management software that offers the means to remotely alert users of a particular category of animal detection event (for example a feral or domestic cat, or a particular species of wildlife, detected).
- Management software that offers the means for members of the public to observe or be notified about the identity of the animals being detected (for example to notify them that their cat or dog is detected).
- The system-level operation of system 3500 of apparatus 1000 provides for the identification, monitoring and management of animals within a potentially wide geographic area with minimal manual intervention by field personnel.
- The processes of system 3500 are largely automated, and various management processes can be instigated automatically based on predefined rules and processes. Data can be centrally managed and monitored, with the capability for field personnel to modify the rules and processes implemented by system 3500, such as to modify the management processes and/or vary the target animals being monitored.
- The collation of the spatial information from the system of apparatus allows for accurate determination of whether there is one, or more than one, problematic animal within the environment. This level of information is not possible from a single sensor apparatus in the field.
- The term "infrared" refers to the general infrared region of the electromagnetic spectrum, which includes near infrared, infrared and far infrared frequencies or light waves.
- The term "controller" or "processor" may refer to any device or portion of a device that processes electronic data, e.g., from registers and/or memory, to transform that electronic data into other electronic data that, e.g., may be stored in registers and/or memory.
- A "computer" or a "computing machine" or a "computing platform" may include one or more processors.
- Any one of the terms "comprising", "comprised of" or "which comprises" is an open term that means including at least the elements/features that follow, but not excluding others.
- Thus, the term "comprising", when used in the claims, should not be interpreted as being limitative to the means or elements or steps listed thereafter.
- For example, the scope of the expression "a device comprising A and B" should not be limited to devices consisting only of elements A and B.
- Any one of the terms including or which includes or that includes as used herein is also an open term that also means including at least the elements/features that follow the term, but not excluding others. Thus, including is synonymous with and means comprising.
- The term "coupled", when used in the claims, should not be interpreted as being limited to direct connections only.
- The terms "coupled" and "connected", along with their derivatives, may be used. It should be understood that these terms are not intended as synonyms for each other.
- Thus, the scope of the expression "a device A coupled to a device B" should not be limited to devices or systems wherein an output of device A is directly connected to an input of device B. It means that there exists a path between an output of A and an input of B, which may be a path including other devices or means.
- Coupled may mean that two or more elements are either in direct physical, electrical or optical contact, or that two or more elements are not in direct contact with each other but yet still co-operate or interact with each other.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- General Engineering & Computer Science (AREA)
- Environmental Sciences (AREA)
- Multimedia (AREA)
- Pest Control & Pesticides (AREA)
- Wood Science & Technology (AREA)
- Zoology (AREA)
- Insects & Arthropods (AREA)
- Animal Husbandry (AREA)
- Biodiversity & Conservation Biology (AREA)
- Human Computer Interaction (AREA)
- Catching Or Destruction (AREA)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2022297009A AU2022297009A1 (en) | 2021-06-21 | 2022-06-21 | A system and apparatus for animal management |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2021901865A AU2021901865A0 (en) | 2021-06-21 | A System and Apparatus for Animal Management | |
AU2021901865 | 2021-06-21 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022266705A1 true WO2022266705A1 (en) | 2022-12-29 |
Family
ID=84543784
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/AU2022/050627 WO2022266705A1 (en) | 2021-06-21 | 2022-06-21 | A system and apparatus for animal management |
Country Status (2)
Country | Link |
---|---|
AU (1) | AU2022297009A1 (en) |
WO (1) | WO2022266705A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118196839A (en) * | 2024-05-14 | 2024-06-14 | 湖南嘉原农业科技集团有限公司 | Hypoxia state prediction method and system based on supervised learning |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160026895A1 (en) * | 2014-07-23 | 2016-01-28 | Metecs, LLC | Alerting system for automatically detecting, categorizing, and locating animals using computer aided image comparisons |
US20160277688A1 (en) * | 2015-03-18 | 2016-09-22 | The Samuel Roberts Noble Foundation, Inc. | Low-light trail camera |
US20170079260A1 (en) * | 2014-04-18 | 2017-03-23 | Hogman-Outdoors, Llc | Game Alert System |
US20180000575A1 (en) * | 2016-06-29 | 2018-01-04 | International Business Machines Corporation | Unmanned aerial vehicle-based system for livestock parasite amelioration |
WO2020037377A1 (en) * | 2018-08-24 | 2020-02-27 | OutofBox Solutions Tech Pty Ltd | A detection system |
US20210176982A1 (en) * | 2019-12-11 | 2021-06-17 | Plano Molding Company, Llc | Camera system and method for monitoring animal activity |
US20210259235A1 (en) * | 2020-02-24 | 2021-08-26 | Sony Corporation | Detection of animal intrusions and control of a repellent mechanism for detected animal intrusions |
WO2022040366A1 (en) * | 2020-08-18 | 2022-02-24 | IntelliShot Holdings, Inc. | Automated threat detection and deterrence apparatus |
2022
- 2022-06-21 AU AU2022297009A patent/AU2022297009A1/en active Pending
- 2022-06-21 WO PCT/AU2022/050627 patent/WO2022266705A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
AU2022297009A1 (en) | 2023-12-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2016302398B2 (en) | Device for proximal targeting of animals | |
US10512260B2 (en) | Method and apparatus for automated animal trapping | |
EP3466256B1 (en) | Machine for capturing, counting and monitoring of insects | |
US11659826B2 (en) | Detection of arthropods | |
Hadjur et al. | Toward an intelligent and efficient beehive: A survey of precision beekeeping systems and services | |
US11617353B2 (en) | Animal sensing system | |
US20190166823A1 (en) | Selective Action Animal Trap | |
US20160277688A1 (en) | Low-light trail camera | |
AU2017293656A1 (en) | Pest deterrent system | |
US20210329891A1 (en) | Dynamic laser system reconfiguration for parasite control | |
WO2022266705A1 (en) | A system and apparatus for animal management | |
US20230102968A1 (en) | Selective Predator Incapacitation Device & Methods (SPID) | |
Janani et al. | Human-Animal Conflict Analysis and Management-A Critical Survey | |
Sundaram et al. | Integrated animal monitoring system with animal detection and classification capabilities: a review on image modality, techniques, applications, and challenges | |
KR100688243B1 (en) | System for capturing wild animal using by remote control | |
Pillewan et al. | Review on design of smart domestic farming based on Internet of Things (IoT) | |
Darras et al. | Eyes on nature: Embedded vision cameras for multidisciplinary biodiversity monitoring | |
US20210315186A1 (en) | Intelligent dual sensory species-specific recognition trigger system | |
Wang | Intelligent UAVs for Pest Bird Control in Vineyards | |
Lathesparan et al. | Real-time animal detection and prevention system for crop fields | |
Bello | An overview of animal behavioral adaptive frightening system | |
Dadhich | Increasing the accuracy of rodent detection and estimation of the population with sensor fusion | |
Bhusal | Unmanned Aerial System (UAS) for Bird Damage Control in Wine Grapes | |
Kumar et al. | Animal Repellent System for Smart Farming using AI and Edge Computing | |
Dadhich | Increasing the accuracy of rodent detection and estimation of the population with emerging sensor technology |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22826880 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 806371 Country of ref document: NZ Ref document number: 2022297009 Country of ref document: AU Ref document number: AU2022297009 Country of ref document: AU |
|
ENP | Entry into the national phase |
Ref document number: 2022297009 Country of ref document: AU Date of ref document: 20220621 Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 22826880 Country of ref document: EP Kind code of ref document: A1 |