WO2022043187A1 - Identifying poultry associated with eggs of a quality - Google Patents

Identifying poultry associated with eggs of a quality

Info

Publication number
WO2022043187A1
Authority
WO
WIPO (PCT)
Prior art keywords
egg
hen
data
poultry
camera
Application number
PCT/EP2021/073060
Other languages
French (fr)
Inventor
Abhishek MURTHY
Mathan Kumar GOPAL SAMY
Peter Deixler
Tharak VANGALAPAT
Original Assignee
Signify Holding B.V.
Application filed by Signify Holding B.V.
Publication of WO2022043187A1

Classifications

    • A: HUMAN NECESSITIES
    • A01: AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K: ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K 29/00: Other apparatus for animal husbandry
    • A01K 29/005: Monitoring or measuring activity, e.g. detecting heat or mating

Definitions

  • the present application relates generally to the field of data processing, and more specifically to using camera data associated with livestock (e.g., poultry) and eggs, captured at hatcheries, to identify livestock associated with eggs of a quality.
  • US2020/170219A discloses an unmanned aerial vehicle for determining geolocation exclusion zones of animals.
  • the unmanned aerial vehicle includes a processor-based monitoring device to track geolocation information associated with an animal from the unmanned aerial vehicle, an identification device mounted on the unmanned aerial vehicle to identify the animal and to track a position of the animal over time, and a mapping device coupled to the monitoring device to determine locations where the animal has traversed and to identify where an encounter with the animal is reduced.
  • the geolocation information associated with an animal may include identifying birds and their nests.
  • According to a first aspect of the invention, this and other objects are achieved by a data processing device according to claim 1. According to a second aspect of the invention, this and other objects are achieved by a data processing method according to claim 9.
  • Fig. 1 is a diagram illustrating an example hatchery environment in which example devices operate, in accordance with various aspects and embodiments of the subject disclosure.
  • Fig. 2 is a diagram illustrating an example system for identifying a hen, in accordance with various aspects and embodiments of the subject disclosure.
  • Fig. 3 is a graph illustrating a pattern of movement indicative of a pre-laying behavior engaged in by a poultry ready to lay an egg, in accordance with various aspects and embodiments of the subject disclosure.
  • Fig. 4 illustrates a flow diagram relating to example operations that can be performed by a poultry and egg analyzer device, in accordance with various aspects and embodiments of the subject disclosure.
  • Fig. 5 illustrates a flow diagram relating to other example operations that can be performed by a poultry and egg analyzer device, in accordance with various aspects and embodiments of the subject disclosure.
  • Fig. 6 illustrates a flow diagram relating to example operations that can be performed by a luminaire, in accordance with various aspects and embodiments of the subject disclosure.
  • Fig. 7 illustrates an example block diagram of a computer that can be operable to execute processes and methods in accordance with various aspects and embodiments of the subject disclosure.
  • the one or more processors can be any processor device known to those of ordinary skill, for example processors offered for sale by Intel (e.g., branded Pentium processors), Advanced Micro Devices (AMD), International Business Machines (IBM) and the like. It is also contemplated that processors of other brands can be suitable. Additionally, future processors, as they are developed and branded, are contemplated to be within the scope of the present invention.
  • the term processor is further elaborated upon below.
  • the memories can comprise any suitable computer-readable storage medium, including, for example, on-chip memory, read only memory (ROM), random access memory (RAM), hard disks, compact disks, DVDs, optical data stores, and/or magnetic data stores.
  • the one or more devices can also comprise circuitry and hardware components as described below with respect to FIG. 7.
  • Example embodiments of the devices may take the form of entirely hardware embodiments, entirely software embodiments, and embodiments combining both software and hardware aspects.
  • the poultry identification systems and methods autonomously monitor poultry to detect and tag layers in a poultry cell that are responsible for laying a poor grade of eggs, and subsequently direct and enable farm workers to separate and take corrective measures for underperforming poultry in the observed cell.
  • Underperforming laying can be a sign of diseased poultry.
  • Poultry in the context of this application, can include any livestock that can lay eggs for human consumption, including but not limited to, chickens, turkeys, quails, and ostriches. Hen, in the context of this application, refers to egg laying poultry.
  • Systems in accordance with example embodiments of the present application can employ a grid of sensor(s) (e.g., a visual camera, thermal camera, etc.). Also associated with the system can be connected lighting devices (e.g., luminaires).
  • a machine learning engine can identify poultry uniquely among its neighbors in the cell by utilizing poultry facial recognition techniques and algorithms.
  • the example embodiments of systems can perform the functions of automatically identifying, using cameras and sensors, a hen that lays particular eggs, and in particular a hen that lays poorer-grade eggs, due to sickness or other factors (e.g., feed, lack of lighting, lack of exercise, lack of medication, etc.).
  • the under-performing hens can be rehabilitated, for example by providing lighting interventions, appropriate feed, medicine, etc., to improve the well-being of the underperforming hens so that the hens can improve the grade of the eggs they lay.
  • the example poultry identification systems can identify hens that perform less satisfactorily than other hens, which includes laying lower-quality, or poorer-grade, eggs (e.g., smaller, irregularly shaped, discolored, cracked, broken), or laying eggs at a lower frequency (e.g., fewer eggs per day).
  • a regular shaped egg is typically oval, with one end being larger than the other.
  • Shell quality of the eggs can comprise whether the egg has any cracks or is broken, and also comprises the egg color. Weak-shelled eggs show a different color, which can be due to calcium deficiency and poor calcium sources.
  • Early identification, including by using machine learning techniques, of poor quality egg-laying poultry can allow poultry owners to treat them separately.
  • a first camera(s) can be used to identify a hen
  • a second camera(s) can be used to trace the hen’s pre-laying behavior, and subsequently to recognize whether the farm worker has selected the right, underperforming hen.
  • the second camera can be top-facing, mounted above the poultry cage cells and pointed at the tops of the poultry cages, and the first camera can be side-facing, mounted so that the camera points toward the side faces of the poultry cage cells.
  • the first camera can also be used to capture images of the egg, which can be analyzed to derive egg data relating to the dimensions of the egg.
  • one or more devices in the system can analyze visual data associated with captured images (e.g., captured by a camera) of poultry (e.g., hens) in a cage cell.
  • the poultry and egg analysis system can utilize facial recognition, using computer vision or machine learning techniques, to identify the egg-laying poultry; a minimal matching sketch follows.
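  • The application does not specify the recognition algorithm beyond computer vision or machine learning; the following is a minimal Python sketch of biometric matching, assuming hens are represented by hypothetical feature vectors (comb size, body length, mean plumage hue) of the kind that could be stored with poultry identification data 235. All IDs and numbers are illustrative.

```python
import math

# Hypothetical biometric feature vectors (comb size in cm, body length in m,
# mean plumage hue), keyed by hen ID, as might be stored alongside poultry
# identification data 235. All values are illustrative.
KNOWN_HENS = {
    "hen_01": (4.2, 0.38, 0.31),
    "hen_02": (2.9, 0.35, 0.27),
    "hen_03": (3.5, 0.41, 0.35),
}

def identify_hen(observed_features, known=KNOWN_HENS):
    """Return the hen ID whose stored biometrics are nearest (by Euclidean
    distance) to the feature vector extracted from the current frame."""
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(known, key=lambda hen_id: distance(known[hen_id], observed_features))

print(identify_hen((3.4, 0.40, 0.34)))  # -> hen_03
```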
  • the poultry and egg analysis system can also analyze data, such as time data, location data, and egg data to correlate this data with a particular hen.
  • dynamic, or modulated, light effects from one or more luminaires can be used to advise a farm worker to pick the correct, underperforming hen from a multitude of hens in the cell/cage.
  • the system can actuate lighting devices (e.g., luminaires) that generate a visual alert.
  • direct lighting emanating from one or more luminaires that are part of the system can shine upon the hen that has been identified.
  • the luminaires can receive positional data from the system indicating the location of the hen, which is updated depending on the movement of the hen. The positional data can be used by the luminaires to illuminate the hen.
  • the luminaire comprises a camera so that it can use images to determine the position to which to move the luminaire so as to direct a beam of light upon the hen.
  • These light effects can guide a farm worker to the correct cage.
  • the systems can also generate a report of the prediction to farmers (e.g., a report comprising an image of the underperforming hen, egg data comprising the dimensions of eggs laid by the hen, egg data comprising egg laying rate of the hen, cage location, etc.).
  • a light effect can indicate whether the appropriate, identified poultry has been selected. The light can differ from the light that identified the poultry.
  • example embodiments of the present application use different cameras to keep track of hens and eggs, complemented by light effects from the farm lights aimed at guiding the workflow in the farm.
  • the camera and the luminaire can be integrated to form one device.
  • the present application can be operable to use an iterative process to capture data, analyze the captured data by combining it with previously analyzed data, and produce a new set of refined analyses.
  • Systems and methods herein can be used to detect and tag layers in a poultry cell to identify hens that lay above-standard-grade eggs, or hens that lay poor-grade eggs, with a higher degree of confidence, guiding farm workers to the above-standard-performing hen, or the underperforming hen.
  • FIG. 1 is a diagram illustrating an example of an environment 100 depicting various devices that can be used in the example embodiments of poultry identification systems.
  • the example systems can operate in a poultry hatching environment.
  • the environment 100 can comprise one or more cages 105 (cage 105 in the singular, cages 105 in the plural). Multiple cages can comprise a cage cell, and the environment can comprise multiple cage cells. One or more cages can contain poultry 110, and in particular, hens that lay eggs 115 (egg 115 in the singular, eggs 115 in the plural).
  • the example poultry identification systems operating in the environment 100 can comprise several cameras.
  • the example systems can comprise one or more top-facing cameras (top-facing camera 120 in the singular, top-facing cameras 120 in the plural).
  • the example poultry identification systems can also comprise one or more side-facing cameras (side-facing camera 125 in the singular, side-facing cameras 125 in the plural) that are pointed at the sides of the cages 105. Because the cages might be arranged to be adjacent to each other, not every angle of the cage might be exposed to the side-facing cameras. For example, a cage that is positioned in between two other cages might only have a camera pointed at the back and front of the cage, and not the two sides on account of the two sides being adjacent to other cages.
  • one or more of the cameras can comprise a housing, and can comprise a processor and memory, as explained above, and have the same or similar components as described below with respect to FIG. 7.
  • One or more of the cameras can comprise actuators and motors, so as to be operable to respond to control signals, including to move so as to point at different angles with respect to the ceiling.
  • One or more of the cameras can communicate using one or more wireless communication protocols (e.g., cellular protocols such as LTE, 5G, etc., or other wireless protocols such as Wi-Fi or Bluetooth, or radio frequency (RF)). If wired, the one or more cameras can communicate using one or more wired protocols, such as Ethernet.
  • the cameras can be operable to take a picture, take video, or in some example embodiment systems, take thermographic pictures or videos.
  • the cameras in some example embodiment systems can also be equipped with night-vision technology so as to capture movement, shapes, and other detail in low light conditions.
  • the cameras (top or side facing) can also comprise a clock, which in some embodiments can be time synced to a networked clock, and in other embodiments be an internal non-networked clock.
  • the clock can be used to stamp the time (e.g., create a time stamp) of any visual pictures or frames of videos sent by the cameras.
  • the system can also comprise one or more lighting devices.
  • the lighting devices can be luminaires (e.g., luminaire 130 in the singular, luminaires 130 in the plural), which can be mounted on a ceiling.
  • the one or more luminaires 130 can comprise one or more light sources, together with the parts designed to distribute the light, to position and protect the light sources, and to connect the light sources to the power supply.
  • the light sources can be incandescent, fluorescent, halogen-based, or a light emitting diode (LED).
  • the luminaires 130 can be placed above the cages, and can emanate light that has a beam width such that an individual hen can be illuminated.
  • an aperture of the luminaire and/or mirrors can be used to control the beam width.
  • the beam width can also be adjusted by raising or lowering the position of the luminaire with respect to the cage.
  • the luminaires 130 can have a motor at the base, and actuators that control the motors, so that the luminaires can move to illuminate at different angles with respect to the luminaire and the ceiling.
  • the light sources can emit light of different colors, such as red, blue, or green.
  • the one or more luminaires can comprise a processor and memory, and can have one or more of the components described below with respect to FIG. 7.
  • the one or more luminaires have the ability to be independently and simultaneously controlled (e.g., controlled or operated at the direction of a poultry and egg analyzer, described with respect to FIG. 2).
  • the luminaires can be controlled to turn off, or turn on, or to emit light of different intensities or colors.
  • the luminaires can be controlled wirelessly (e.g., cellular, Wi-Fi, Bluetooth, RF, etc.), or by wire (RF, Ethernet, etc.).
  • FIG. 2 shows example embodiments of poultry identification systems 200.
  • the example poultry identification systems can comprise the one or more top-facing cameras 120, the one or more side facing cameras 125, and the one or more luminaires 130 introduced in FIG. 1.
  • the example systems can also comprise a poultry and egg analyzer 215, which can be one or more devices (or a system comprising one or more devices) having a processor and memory, as mentioned above.
  • the poultry and egg analyzer 215 can have, or be connected to, a repository 220 (which can be a local storage device, or a storage device accessible via a networked device).
  • the repository can comprise one or more storage devices (e.g., hard drive, solid state drive, flash drive, etc.) capable of storing digital data and information.
  • the poultry and egg analyzer can function as a server device. In some example embodiments, the poultry and egg analyzer can serve up user interface pages (e.g., webpages) via a browser, which can be used to report, monitor, list, or access certain data.
  • the user interface pages can be accessible by one or more user equipment 225 (UE 225).
  • UE 225 can comprise a desktop computer, laptop computer, tablet, smartphone, smartwatch, or the like.
  • the various devices of FIG. 2 can be connected to each other via one or more communications networks.
  • the one or more communications networks can include any of a variety of types of wired or wireless computer networks such as the Internet, a private intranet, cellular network, satellite network, data over cable network (e.g., operating under one or more data over cable service interface specification “DOCSIS”), or any other type of computer or communications network.
  • the communications networks can also comprise, for example, a Local Area Network (LAN), such as an office or Wi-Fi network.
  • the communications network can also comprise, for example, a Bluetooth network that allows one communications device to be connected to other communications devices via the Bluetooth network.
  • the communications network can also comprise an RFID network that can receive RFID signals from active and passive RFID devices. The devices can communicate with each other either directly or indirectly.
  • identifiable biometric characteristics of the poultry can be evaluated in conjunction with the behaviors inferred from trajectory tracking (e.g., pattern of movement) to identify with high confidence level that a particular hen laid a particular egg.
  • the one or more top-facing cameras 120 and the one or more side-facing cameras 125 can capture images (e.g., pictures, video, video frames, thermographic video, etc.) and transmit data signals representative of the captured images (e.g., top-facing camera data 205, side-facing camera data 210).
  • the poultry and egg analyzer 215 can receive the top-facing camera data 205 and the side-facing camera data 210 and determine that a particular egg was laid in a particular cage.
  • the poultry and egg analyzer can also create data associated with an image of an egg (e.g., egg data 230).
  • Egg data 230 can be stored in a repository (e.g., repository 220).
  • the egg data 230 can comprise, for an egg 115 that was laid, the dimensions of the egg 115, which can account for differences in the distance of each camera from each captured egg (e.g., the closer a camera is to an egg, the larger the captured image of the egg will be). As such, the dimensions of each egg 115 are normalized by the poultry and egg analyzer 215 based on the distance of the camera from the egg, which can be determined based on the sum of the distance of the side-facing camera 125 from the cage 105 and the distance of the cage 105 from the egg 115. Egg data 230 can also comprise a time at which the egg was detected to be laid, which can be based on top-facing camera data. A minimal sketch of this normalization follows.
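  • This sketch assumes a pinhole-camera model; the function name and the focal-length calibration constant are hypothetical, not values from the application.

```python
def normalized_egg_width(pixel_width, camera_to_cage_m, cage_to_egg_m,
                         focal_length_px=1400.0):
    """Estimate the physical egg width from its width in pixels.

    Under a pinhole-camera assumption, apparent size scales inversely with
    distance, so width_m = pixel_width * distance / focal_length. Per the
    application, the camera-to-egg distance is the sum of the camera-to-cage
    and cage-to-egg distances."""
    distance_m = camera_to_cage_m + cage_to_egg_m
    return pixel_width * distance_m / focal_length_px

# The same egg imaged from half the distance appears twice as wide in
# pixels, but both observations normalize to the same physical width:
print(normalized_egg_width(84, 0.8, 0.2))   # 0.06 m
print(normalized_egg_width(42, 1.6, 0.4))   # 0.06 m
```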
  • N = 4 means there are 4 hens in the cage.
  • once the poultry and egg analyzer detects and identifies an egg 115 that was laid in a cage 105, having determined the cage responsible for a specific laid egg, the task of identifying the hen that laid the egg is reduced to a search among the N poultry 110 in that cage 105.
  • the poultry and egg analyzer 215 can analyze the biometrics of each poultry 110 to identify a particular hen. The identifiable biometric characteristics (or features) that can be used to identify a hen can comprise the overall size, the color, and the comb characteristics.
  • the comb of a hen is strongly correlated with the size of the eggs laid by her.
  • in wild-derived populations, the comb is used by both males and females to base mating decisions on. In males, the comb is an indicator of social rank, with females actively soliciting mating from males with larger combs; comb size also correlates with bone mass. In females, the comb is indicative of greater reproductive potential, through an increase in egg production.
  • the biometric data for one or more poultry 110 can be stored in repository 220 and associated with poultry identification data 235 for a particular poultry. Additionally, in most cases, the size of the hen is also strongly correlated with the size of the egg. Similar identifiable biometrics can be used by the poultry and egg analyzer 215 to separate male and female poultry. Using identifiable biometrics such as comb size, as well as the higher activity levels of male poultry and other behaviors observed of male poultry, the poultry and egg analyzer 215 can identify one sex of poultry from another.
  • Images sent to the poultry and egg analyzer 215 can be used by the poultry and egg analyzer 215 to identify a pattern of movement, wherein the trajectories can be expressed in three dimensions x, y, and z, with time t denoting the timestamp.
  • the pattern of movement data and time t data can be stored with poultry identification data 235.
  • the pattern of movement of a hen at time prior to the laying of an egg at another time can be used to correlate whether that hen laid the egg. Hens typically engage in pre-laying behavior (e.g., a pattern of movement) before oviposition, consisting of a search phase, selection of a nest site, and formation of a nest hollow.
  • Different breeds of poultry may emphasize some aspects of pre-laying behavior more than others. For example, white leghorn hens can have pronounced search and nest selection behavior, during which they visit and investigate a number of potential nest sites before choosing one. Medium-weight hybrid brown egg layer breeds tend to sit longer in nests and perform nest building activities, such as gathering litter around the hen to form the nest hollow. Pre-laying behavior might occur during a certain period on any given day because it is triggered by hormones associated with the last ovulation, and not by the mere presence of an egg in the shell gland. Normally, pre-laying behavior begins an hour or two before the egg is ready to be laid, and culminates in the hen settling in a nest and laying an egg.
  • typical pre-laying behavior includes hens puffing up (expanding) their wings, hence the volume of the hen increases, followed by the post-laying behavior of decreasing this volume within about 30 seconds of the egg being laid.
  • as post-laying behavior, hens will usually stay for an extended time at the location or place at which they laid their eggs before vacating the area. When a hen vacates the egg-laying place can also depend on the presence of others, which might frighten the hen away.
  • a pattern of movement, such as pre-laying and post-laying behavior, can also be used to compute the likelihood of a hen having laid an egg; a minimal detection sketch follows.
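  • A minimal sketch of detecting this puff-up/deflate pattern from a time-stamped hen-volume series; the ratio thresholds are assumed tuning parameters, not values from the application.

```python
def detect_laying_time(volume_series, puff_ratio=1.2, settle_ratio=1.05):
    """Scan a series of (t_seconds, volume) samples for the behavior above:
    the hen puffs up (volume rises) before oviposition and deflates shortly
    after the egg is laid, so the time at which the volume falls back toward
    baseline approximates the laying time."""
    baseline = volume_series[0][1]
    puffed = False
    for t, volume in volume_series:
        if not puffed and volume >= baseline * puff_ratio:
            puffed = True           # pre-laying puff-up detected
        elif puffed and volume <= baseline * settle_ratio:
            return t                # post-laying deflation: egg laid about now
    return None                     # no laying pattern observed

series = [(0, 1.00), (60, 1.01), (120, 1.24), (150, 1.30), (170, 1.02)]
print(detect_laying_time(series))  # -> 170
```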
  • the likelihood that a particular hen laid a given egg can be expressed as P(chicken_i laid | features of i) ∝ P(features of i | egg size) × P(chicken_i laid). This equation accounts for two types of features: biometrics and trajectories.
  • biometrics, like the comb size and the overall size of the hen, are the first set of data that can be used.
  • the likelihood of the hen i laying the egg based on the biometrics can be computed by fitting probability densities to data.
  • Trajectories are the second set of data that can be evaluated. As shown in FIG. 3, different trajectories correspond to different behaviors of the poultry.
  • the term P(features of i | egg size) is the probability of hen i generating the trajectories given the fact that an egg of said size was laid.
  • the term P(chicken_i laid) is the prior bias of hen i having laid the egg, based on the biometrics. For example, if there is a linear relationship between the plume size and the egg size, then the poultry and egg analyzer 215 can estimate the probability that one of the N hens laid that egg.
  • the prior probability is combined with the likelihood to get a posterior estimate of the probabilities of each hen laying the said egg.
  • the poultry and egg analyzer 215 can assign the egg to the hen with the maximum posterior probability, as sketched below.
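  • A minimal sketch of this posterior combination and maximum-posterior assignment; the likelihood and prior numbers are purely illustrative.

```python
def posterior_hen_probabilities(trajectory_likelihoods, biometric_priors):
    """For each hen i, multiply the trajectory likelihood
    P(features of i | egg size) by the biometric prior P(chicken_i laid),
    then normalize, so that P(chicken_i laid | features) is proportional
    to likelihood_i * prior_i. Inputs are dicts keyed by hen ID."""
    unnormalized = {h: trajectory_likelihoods[h] * biometric_priors[h]
                    for h in trajectory_likelihoods}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

likelihoods = {"hen_01": 0.70, "hen_02": 0.10, "hen_03": 0.20}
priors      = {"hen_01": 0.50, "hen_02": 0.30, "hen_03": 0.20}

posterior = posterior_hen_probabilities(likelihoods, priors)
# Assign the egg to the hen with the maximum posterior probability.
print(max(posterior, key=posterior.get))  # -> hen_01
```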
  • Pre-set thresholds, which can be adjusted, can be input (e.g., input by a user equipment 225) and used to determine which hens are underperforming; for example, thresholds related to an egg-laying rate (e.g., number of eggs laid per day, or some other time frame), as well as the size and color of the egg that was laid. A minimal thresholding sketch follows.
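  • This sketch assumes hypothetical default threshold values of the kind a user equipment 225 could supply; the function and its record format are illustrative.

```python
def is_underperforming(egg_records, days_observed,
                       min_rate_per_day=0.8, min_mean_size_m=0.055,
                       max_cracked_fraction=0.10):
    """Apply adjustable pre-set thresholds to a hen's accumulated egg data;
    egg_records is a list of dicts with 'size_m' and 'cracked' keys."""
    if not egg_records:
        return True
    laying_rate = len(egg_records) / days_observed
    mean_size = sum(r["size_m"] for r in egg_records) / len(egg_records)
    cracked = sum(r["cracked"] for r in egg_records) / len(egg_records)
    return (laying_rate < min_rate_per_day
            or mean_size < min_mean_size_m
            or cracked > max_cracked_fraction)

records = [{"size_m": 0.052, "cracked": False},
           {"size_m": 0.048, "cracked": True}]
print(is_underperforming(records, days_observed=5))  # -> True (low rate, small eggs)
```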
  • underperforming poultry can be identified through an iterative process using machine learning techniques by the poultry and egg analyzer 215.
  • the poultry and egg analyzer 215 can send a signal to a luminaire 130, preferably one closest to the cage of the identified hen.
  • the poultry and egg analyzer 215 can transmit a signal (e.g., via a communication network) to actuate lighting devices (e.g., luminaires 130), preferably a luminaire close to the cage containing the underperforming poultry.
  • the luminaire 130 can shine a light upon the poultry that has been identified.
  • the beam width of the light can be narrow enough to shine a light on a specific hen.
  • the resulting light can be a spotlight. In some embodiments, the emanated lights might be of different colors, and can also flash. These light effects can guide a farm worker to the correct cage.
  • the system can also generate a report of the prediction to farmers (e.g., via an email showing an image of the underperforming hen), so the farm worker knows the results of any analysis regarding the hen’s underperformance.
  • a decision can be made as to whether to separate an underperforming hen, or provide lighting interventions, medical treatments, etc., to boost the size of the egg.
  • one or more of the example methods and operations can be performed as described in FIGS. 4-6.
  • the methods and operations can be performed by one or more devices comprising a processor and a memory.
  • the device can have some or all of the components as described below with respect to FIG. 7.
  • Machine-readable storage media comprising executable instructions that, when executed by a processor, can also facilitate performance of the methods and operations described in FIGS. 4-6.
  • steps or aspects described in one operation can be substituted or combined with steps and aspects with respect to the other operations, as well as features described, unless context warrants that such combinations or substitutions are not possible.
  • if a feature, step, or aspect is not described with respect to example operations, this does not mean that said feature, step, or aspect is incompatible or impossible with respect to those operations.
  • the example operations of the present application described above (e.g., with respect to FIGS. 1-3) and below are not necessarily limited to the steps, features, or aspects that are described with respect to those example operations.
  • steps, features, or aspects are not limited to those described in FIGS. 4-6, and can be combined or substituted with other steps, features or aspects relating to a farm animal operation monitoring system(s) in accordance with example implementations as described in this disclosure above and below.
  • the poultry and egg analyzer 215 can take the form of a device (or one or more devices, which may be networked and can have one or more of the structural components as described in FIG. 7 below) that comprises a processor and a memory that stores executable instructions that, when executed by the processor, facilitate performance of operations as described in FIG. 4.
  • the operations 400 can comprise, at block 405, receiving camera data representative of images of a poultry located in a poultry cage, wherein the poultry is one of a plurality of poultry.
  • the poultry has identifiable biometric characteristics (e.g., comb of the poultry, overall size of the poultry, etc.), and one of the identifiable biometric characteristics is correlated with egg laying performance (e.g., comb of the poultry).
  • the camera data can comprise top-facing camera data (e.g., top-facing camera data 205) transmitted by a camera that faces a top of the poultry cage (e.g., top facing cameras 120).
  • the camera data can also comprise side-facing camera data (e.g., side-facing camera data 210) transmitted by a camera that faces a side of the poultry cage (e.g., side facing cameras 125).
  • the operations 400 can comprise, at block 410, analyzing the camera data to determine the identifiable biometric characteristics of the poultry.
  • the operations 400 can comprise, based on the identifiable biometric characteristics, identifying the poultry and storing poultry identification data (e.g., poultry identification data 235) associated with the poultry in a repository (e.g., repository 220).
  • the operations 400 can comprise, determining, based on the camera data, whether the poultry has engaged in a pattern of movement indicative of a laying behavior.
  • the laying behavior can comprise pre-laying behavior, as well as post-laying behavior.
  • the operations 400 can comprise, determining, based on the camera data, a first time during which the pattern of movement occurs.
  • time may refer to a (discrete) point in time, or to a period in time during which an event (such as the pattern of movement) occurs.
  • the operations 400 can comprise, determining, based on the camera data, a presence of an egg in a space associated with the poultry cage.
  • the space associated with the poultry cage can be an area inside the cage, or the space can be a collection area nearby the cage (e.g., a collection bin, conveyor belt, etc.).
  • the operations 400 can comprise, at block 435, storing in the repository egg data representative of the egg (e.g., egg data 230), wherein the egg data can also comprise quality data related to a quality, or grade, of the egg.
  • the quality data can comprise information relating to a dimension of the egg (e.g., size, width, whether it is oval shaped and slightly larger at one end, etc.).
  • the quality data can also comprise shell quality data (e.g., color of the shell, for example on a whiteness scale, or whether it is above or below a threshold of whiteness, whether the shell is cracked or broken).
  • the egg data can also comprise a second time at which the egg appears in a space around the poultry cage, including in the poultry cage.
  • the operations 400 can comprise, at block 440, identifying the poultry as a hen that laid the egg, based on an analysis considering the one of the identifiable biometric characteristics of the poultry, the egg data, and a proximity in time between the first time and the second time.
  • a proximity in time may refer to the time interval between the first time and the second time, where the event that occurs at the second time immediately follows the event that occurs at the first time.
  • the event of the first time and the event of the second time may be, for example, one or more of engagement in a pattern of movement indicative of a laying behavior and the appearance of an egg in a space.
  • the proximity in time may refer to the time interval between the end of the first period in time and the start of the second period in time; a minimal check is sketched below.
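  • A minimal sketch of such a proximity check; the two-hour maximum gap is an assumed parameter that mirrors the pre-laying window described earlier.

```python
def within_time_proximity(first_period_end_s, second_time_s,
                          max_gap_s=2 * 60 * 60):
    """Return True if the egg's appearance (the second time) immediately
    follows the end of the laying-behavior pattern (the first period)."""
    gap = second_time_s - first_period_end_s
    return 0 <= gap <= max_gap_s

print(within_time_proximity(first_period_end_s=1000, second_time_s=1600))  # -> True
```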
  • the operations 400 can comprise, at block 450, associating the egg data with the poultry identification data.
  • the operations 400 can further comprise performing an iterative process, wherein the iterative process comprises examining the egg data and further egg data associated with the hen, wherein the further egg data was collected subsequent to the collection of the egg data.
  • the operations 400 can further comprise determining, based on the iterative process, whether the hen lays eggs of a level of quality, wherein the level of quality is based on one or more of a size of the eggs laid by the hen, a color of the eggs (e.g., whiteness scale), shell quality (e.g., whether the eggs comprise broken shells or cracked shells) and a frequency at which the eggs are laid by the hen (e.g., how many eggs are laid by the hen per day).
  • the level of quality can be indicative of egg laying underperformance by the hen (e.g., eggs laid are smaller size, or the number of eggs per day is low).
  • the level of quality can be indicative of an egg laying overperformance by the hen.
  • the operations 400 can further comprise, in response to a determination that the hen lays eggs of the level of quality, sending positional data of the hen to a luminaire (e.g., luminaire 130) positioned above the poultry cage.
  • the operations 400 can further comprise, in response to a determination that the hen has been removed from the poultry cage, transmitting a signal to the luminaire indicating that the poultry has been removed.
  • the poultry and egg analyzer 215 can take the form of a device (or one or more devices, which may be networked and can have components as described in FIG. 7 below) that comprises a processor and a memory that stores executable instructions that, when executed by the processor, facilitate performance of data processing operations (e.g., methods) as described in FIG. 5.
  • the operations 500 can comprise, at block 505, receiving camera data representative of images of a hen located in a cage, wherein the hen is one of a plurality of poultry, and wherein the images comprise identifiable biometric characteristics of the hen, and wherein one of the identifiable biometric characteristics is correlated with egg laying performance (e.g., a comb of the poultry).
  • the camera data can comprise top-facing camera data (e.g., top-facing camera data 205) transmitted by a camera that faces the top of the cage (e.g., top-facing camera 120), and can also comprise side-facing camera data (e.g., side-facing camera data 210) transmitted by a camera that faces a side of the cage (e.g., side-facing camera 125).
  • the operations 500 can comprise, at block 510, based on the identifiable biometric characteristics (e.g., comb, overall size) in the camera data, identifying the hen and storing identification data (e.g., poultry identification data 235) associated with the hen in a repository (e.g., repository 220).
  • the operations 500 can comprise, determining, based on the camera data, whether the hen has engaged in a pattern of movement indicative of a laying behavior (e.g., pre-laying behavior, post-laying behavior).
  • the operations 500 can comprise, determining, based on the camera data, a first time during which the pattern of movement occurs.
  • the operations 500 can comprise, detecting, based on the camera data, a presence of an egg laid by one of the plurality of poultry.
  • the operations 500 can comprise, at block 530, storing in the repository egg data representative of the egg, the egg data comprising quality data for the egg and a second time at which the egg is detected to be present.
  • the operations 500 can comprise, at block 535, identifying the hen as the hen that laid the egg, based on an analysis that considers the one of the identifiable biometric characteristics of the hen, the egg data, and a proximity in time between the first time and the second time.
  • the operations 500 can comprise, associating the egg data with the identification data.
  • the operations 500 can further comprise, performing an iterative process, wherein the iterative process comprises examining the egg data and further egg data associated with the hen, wherein the further egg data was collected subsequent to the collection of the egg data.
  • the operations 500 can further comprise determining, based on the iterative process, whether the hen laid eggs of a level of quality indicative of an egg-laying performance of the hen, wherein the level is based on a size of eggs laid by the hen, a color of the eggs, a shell quality (e.g., whether the shell is cracked, whether the shell is broken), and a frequency at which the eggs are laid by the hen (e.g., a rate such as how many eggs are laid per day).
  • the operations 500 can further comprise, in response to a determination that the hen lays eggs of the threshold level of quality, sending positional data of the hen to a luminaire positioned above the cage (e.g., luminaire 130).
  • the operations 500 can further comprise, in response to a determination that the hen has been removed from the cage, transmitting a signal to the luminaire indicating that the hen has been removed.
  • the luminaire (e.g., luminaire 130), which can be mounted above a poultry cage, can comprise a light source, a camera, a network interface, and a processor and a memory that stores executable instructions that, when executed by the processor, facilitate performance of the operations 600 described in FIG. 6.
  • the operations 600 can comprise, at block 605, receiving positional data via the network interface related to a location of a hen in a cage, wherein the hen has been identified as performing at a level of performance based on an analysis of egg data associated with the hen.
  • the operations 600 can comprise, using an image captured by the camera and the positional data, directing the light source to emanate a light beam to the location of the hen, wherein a beam width of the light beam is narrow enough to illuminate at least a portion of the hen.
  • the operations 600 can further comprise, receiving a signal (e.g., from the poultry and egg analyzer 215) to generate a light, wherein the signal is indicative of the removal of the hen from the cage.
  • the light can be of a different color than the light beam.
  • the luminaire with a camera can also serve as a top-facing camera operable to transmit signals representative of captured video of a plurality of poultry residing in the cage.
  • each cage can be mapped on a plane (e.g., an x-y plane, wherein a location on the plane can comprise an x-y coordinate).
  • Top-facing camera data can be used by the poultry and egg analyzer 215, to determine the coordinate of a particular hen.
  • the poultry and egg analyzer 215 can transmit the coordinates of the underperforming hen to the luminaire.
  • the luminaire can, for example using actuators, position itself (or position a light source of the luminaire) so as to direct light to the coordinates where the hen is located; the geometry is sketched below.
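  • The application leaves the actuator geometry unspecified; the following sketch, assuming the luminaire sits a known height above the cage plane, converts the hen's x-y coordinate into pan/tilt angles for the actuators.

```python
import math

def aim_luminaire(hen_xy, luminaire_xy, mount_height_m):
    """Convert a hen's x-y cage-plane coordinate into pan/tilt angles for a
    ceiling-mounted luminaire located at luminaire_xy on the same plane,
    mount_height_m above it."""
    dx = hen_xy[0] - luminaire_xy[0]
    dy = hen_xy[1] - luminaire_xy[1]
    pan_deg = math.degrees(math.atan2(dy, dx))            # rotation about the vertical
    tilt_deg = math.degrees(math.atan2(math.hypot(dx, dy), mount_height_m))
    return pan_deg, tilt_deg                              # tilt measured from straight down

# Hen 1.5 m east and 0.5 m north of a luminaire mounted 2 m above the cages:
print(aim_luminaire((1.5, 0.5), (0.0, 0.0), mount_height_m=2.0))  # ~ (18.4, 38.3)
```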
  • the luminaire does not need to have a camera.
  • the poultry and egg analyzer 215, top-facing cameras 120, side-facing cameras 125, luminaires 130, and user equipment 225 can comprise one or more of the structural components described in FIG. 7.
  • Fig. 7 schematically shows an example implementation of a computing system 700.
  • various devices used in the systems described above (e.g., the devices of the poultry identification systems operating in environment 100) can be implemented using the computing system 700.
  • the computing system 700 can include a computer 702.
  • the computer 702 can be an edge computing device.
  • Computer 702 can comprise a processor 704, memory 706, various interfaces, and various adapters, each of which can be coupled via a local interface, such as system bus 708.
  • the system bus 708 can be any of several types of bus structures that can interconnect to a memory bus (with or without a memory controller), a peripheral bus, and a local bus using any of a variety of commercially available bus architectures.
  • the processor 704 is capable of processing computer-executable instructions that, when executed by the processor 704, facilitate performance of operations, including methods, operations, functions, or steps described in this disclosure.
  • the processor 704 can comprise one or more devices that can process the instructions.
  • the computer-executable instructions can comprise a program file, software, software module, program module, software application, etc., that is in a form that can ultimately be run by the processor 704.
  • the computer-executable instructions can be, for example: a compiled program that can be translated into machine code in a format that can be loaded into a random access memory 712 of memory 706 and run by the processor 704; source code that may be expressed in proper format such as object code that is capable of being loaded into a random access memory 712 and executed by the processor 704; or source code that may be interpreted by another executable program to generate instructions in a random access memory 712 to be executed by the processor 704, etc.
  • although the software applications described herein may be embodied in software or code executed by hardware as discussed in FIG. 7, as an alternative the same may also be embodied in dedicated hardware or a combination of software/general-purpose hardware and dedicated hardware.
  • the computer-executable instructions can be stored on a machine-readable storage media (i.e., computer-readable storage media, also referred to as machine-readable storage medium, or as computer-readable storage medium).
  • the computer-readable storage media can comprise memory 706, as well as storage device 714.
  • the memory 706 can represent multiple memories that operate in parallel processing circuits, and memory 706 can comprise both nonvolatile memory (e.g., read-only memory (ROM)) and volatile memory (e.g., random access memory (RAM)), illustrated by way of example as ROM 710 and RAM 712.
  • the computer 702 can further comprise a storage device 714 (or additional storage devices) that can store data or software program modules.
  • Storage device 714 can comprise, for example, an internal hard disk drive (HDD) (e.g., EIDE, SATA), a solid state drive (SSD), one or more external storage devices (e.g., a magnetic floppy disk drive (FDD), a memory stick or flash drive reader, a memory card reader, etc.), or an optical disk drive 720 (e.g., which can read or write from a compact disc (CD), a digital versatile disk (DVD), a Blu-ray Disc (BD), etc.).
  • while storage device 714 is illustrated as located within the computer 702, the storage device 714 can also be of the variety configured for external, or peripheral, location and use (e.g., external to the housing of the computer 702).
  • the storage device can be connected to the system bus 708 by storage interface 724, which can be an HDD interface, an external storage interface, an optical drive interface, a Universal Serial Bus (USB) interface, or any other internal or external drive interface.
  • ROM 710 can provide nonvolatile storage of data, data structures, databases, software program modules (e.g., computer-executable instructions), etc., which can be, for example, a basic input/output system (BIOS) 728, an operating system 730, one or more application programs 732, other program modules 734, and application program data 736.
  • if any component discussed herein is implemented in the form of software, any one of a number of programming languages can be employed, such as, for example, C, C++, C#, Objective C, Java®, JavaScript®, Perl, PHP, Visual Basic®, Python®, Ruby, Flash®, or other programming languages.
  • Data can be stored in a suitable digital format.
  • the processor 704 can also comprise on-chip memory to facilitate processing of the instructions.
  • the systems and methods described herein can be implemented utilizing various commercially available operating systems or combinations of operating systems.
  • although the description of computer-readable storage media above refers to respective types of storage devices, it should be appreciated by those skilled in the art that other types of storage media which are readable by a computer, whether presently existing or developed in the future, could also be used in the example operating environment.
  • Computer 702 can optionally comprise emulation technologies.
  • a hypervisor (not shown) or other intermediary can emulate a hardware environment for operating system 730, and the emulated hardware can optionally be different from the hardware illustrated in FIG. 7.
  • operating system 730 can comprise one virtual machine (VM) of multiple VMs hosted at computer 702.
  • operating system 730 can provide runtime environments, such as the Java runtime environment or the .NET framework, for applications 732. Runtime environments are consistent execution environments that allow applications 732 to run on any operating system that includes the runtime environment.
  • operating system 730 can support containers, and applications 732 can be in the form of containers, which are lightweight, standalone, executable packages of software that include, e.g., code, runtime, system tools, system libraries and settings for an application.
  • computer 702 can be enabled with a security module, such as a trusted processing module (TPM).
  • in a TPM, boot components hash next-in-time boot components, and wait for a match of results to secured values, before loading a next boot component.
  • This process can take place at any layer in the code execution stack of computer 702, e.g., applied at the application execution level or at the operating system (OS) kernel level, thereby enabling security at any level of code execution.
  • a user can enter commands and information into the computer 702 using one or more wired/wireless input devices, such as a keyboard 738, a touch screen 740, or a cursor control device 742, such as a mouse, touchpad, or trackball.
  • Other input devices can comprise a microphone, an infrared (IR) remote control, a radio frequency (RF) remote control, or other remote control, a joystick, a control pad, a virtual reality controller and/or virtual reality headset, a game pad, a stylus pen, an image input device (e.g., camera(s)), a gesture sensor input device, a vision movement sensor input device, an emotion or facial detection device, a biometric input device (e.g., fingerprint or iris scanner), or the like.
  • input devices are often connected to the processing unit 704 through an input device interface 744 that can be coupled to the system bus 708, but can be connected by other interfaces, such as a parallel port, an IEEE 1394 serial port, a game port, a USB port, an audio port, an IR interface, a BLUETOOTH® interface, etc.
  • a display device 746 such as a monitor, television, or other type of display device, can be also connected to the system bus 708 via an interface, such as a video adapter 748.
  • a computer 702 can also connect with other output devices (not shown), such as speakers, printers, etc.
  • the computer 702 can operate in a networked environment using wired or wireless communications to one or more remote computers, such as a remote computer 750.
  • the remote computer 750 can be a workstation, a server computer, a router, a personal computer, a tablet, a cellular phone, a portable computer, a microprocessor-based entertainment appliance, a peer device, a network node, an Internet of Things (IoT) device, and the like, and typically includes many or all of the elements described relative to the computer 702, although, for purposes of brevity, only a memory/storage device 752 is illustrated.
  • a remote computer 750 can comprise a computing device that is primarily used for storage, such as a network attached storage (NAS) device, a redundant array of independent disks (RAID), or a device that is part of a storage area network (SAN), wherein the storage device comprises memory/storage 752.
  • program modules depicted relative to the computer 702, or portions thereof, can be stored in the remote memory/storage device 752 (some refer to this as “cloud storage” or “storage in the cloud”).
  • data and information can also be stored remotely at the remote memory/storage device 752.
  • a remote computer 750 that is a server device can facilitate storage and retrieval of information to a networked memory/storage device 752.
  • the computer 702 can manage storage provided by the cloud storage system as it would other types of external storage. For instance, access to cloud storage sources can be provided as if those sources were stored locally on the computer 702.
  • a connection between the computer 702 and a cloud storage system can be established, either via wired or wireless connectivity, over a network 754.
  • the network can be, for example, a wireless fidelity (Wi-Fi) network, a local area network (LAN), a wireless LAN, a larger network (e.g., a wide area network (WAN)), a cable-based communication network (e.g., a communication network implementing the data over cable service interface specification (DOCSIS)), an asynchronous transfer mode (ATM) network, a digital subscriber line (DSL) network, an asymmetric digital subscriber line (ADSL) network, a cellular network (e.g., 4G Long Term Evolution (LTE), 5G, etc.), and other typical fixed and mobile broadband communications networks, and can comprise components (e.g., headend equipment, local serving office equipment, Digital Subscriber Line Access Multiplexers (DSLAMs), Cable Modem Termination Systems (CMTSs), cellular nodes, etc.) related to each of these types of networks.
  • the network 754 can facilitate connections to a cloud storage system.
  • the computer 702 can be connected to the network 754 through a wired or wireless communications component 758.
  • the communications component 758 can comprise, for example, a network interface adapter (e.g., a network interface card) or a wireless access point (WAP) adapter.
  • the communications component 758 can also comprise cellular receivers, cellular transmitters, and cellular transceivers that enable cellular communications.
  • the communications component 758 can facilitate wired or wireless communication to the network 754, which can include facilitating communications through a gateway device, such as a cable modem, DSL modem, ADSL modem, cable telephony modem, wireless router, or other devices that can be used to facilitate establishment of communications.
  • the gateway device, which can be internal or external and a wired or wireless device, can be connected to the system bus 708 via the communications component 758. It will be appreciated that the network connections and components shown are examples, and other methods of establishing a communications link between the computer 702 and a remote computer 750 can be used.
  • a component can be, but is not limited to being, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer.
  • an application running on a server and the server can be a component.
  • One or more components can reside within a process and/or thread of execution and a component can be localized on one computer and/or distributed between two or more computers.
  • components also can execute from various computer readable storage media comprising various data structures stored thereon.
  • the components can communicate via local and/or remote processes such as in accordance with a signal comprising one or more data packets (e.g., data from one component interacting with another component in a local system, distributed system, and/or across a network such as the Internet with other systems via the signal).
  • a component can be an apparatus with specific functionality provided by mechanical parts operated by electric or electronic circuitry that is operated by software or firmware application(s) executed by a processor, wherein the processor can be internal or external to the apparatus and executes at least a part of the software or firmware application.
  • a component can be an apparatus that provides specific functionality through electronic components without mechanical parts, the electronic components can comprise a processor therein to execute software or firmware that confers at least in part the functionality of the electronic components.
  • An interface can comprise input/output (I/O) components as well as associated processor, application, and/or API components.
  • processor can refer to substantially any computing processing unit or device comprising single-core processors; single-processors with software multithread execution capability; multi-core processors; multi-core processors with software multithread execution capability; multi-core processors with hardware multithread technology; parallel platforms; and parallel platforms with distributed shared memory.
  • a processor can refer to an integrated circuit, a central processing unit (CPU), an application specific integrated circuit (ASIC), a digital signal processor (DSP), a field programmable gate array (FPGA), a programmable logic controller (PLC), a complex programmable logic device (CPLD), a discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein.
  • CPU central processing unit
  • ASIC application specific integrated circuit
  • DSP digital signal processor
  • FPGA field programmable gate array
  • PLC programmable logic controller
  • CPLD complex programmable logic device
  • Processors can exploit nano-scale architectures such as, but not limited to, molecular and quantum-dot based transistors, switches and gates, in order to optimize space usage or enhance performance of UE.
  • a processor also can be implemented as a combination of computing processing units.
  • the disclosed subject matter can be implemented as a method, apparatus, or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof to control a computer to implement the disclosed subject matter.
  • Computer-readable storage media can be any available storage media that can be accessed by the computer, and can comprise various forms of memory, as will be elaborated further below.
  • Memory can be of various types, such as hard-disk drives (HDD), floppy disks, zip disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, flash memory devices (cards, sticks, key drives, thumb drives), cartridges, optical discs (e.g., compact discs (CD), digital versatile disk (DVD), Blu-ray Disc (BD)), a virtual device that emulates a storage device, and other tangible and/or non-transitory media which can be used to store desired information.
  • Memory can also comprise volatile memory as well as nonvolatile memory, whereby volatile memory components are those that do not retain data values upon loss of power and nonvolatile components are those that retain data upon a loss of power.
  • nonvolatile memory can comprise read only memory (ROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
  • Volatile memory can comprise random access memory (RAM), which acts as external cache memory.
  • RAM is available in many forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), magnetic random access memory (MRAM), and direct Rambus RAM (DRRAM). Additionally, the disclosed memory components of systems or methods herein are intended to comprise these and any other suitable types of memory.
  • the term “facilitate,” as used herein, is in the context of a system, device, or component “facilitating” one or more actions, methods, or example operations, in respect of the nature of complex computing environments in which multiple components and/or multiple devices can be involved in some computing operations.
  • Non-limiting examples of actions that may or may not involve multiple components and/or multiple devices comprise the methods described herein, including but not limited to transmitting or receiving data, establishing a connection between devices, determining intermediate results toward obtaining a result, etc.
  • a computing device or component can facilitate an operation by playing any part in accomplishing the operation (e.g., directing, controlling, enabling, etc.).
  • the terms (comprising a reference to a “means”) used to describe such components are intended to correspond, unless otherwise indicated, to any component which performs the specified function of the described component (e.g., a functional equivalent), even though not structurally equivalent to the disclosed structure, which performs the function in the herein illustrated example aspects of the embodiments.
  • the embodiments comprise a system as well as computer-readable storage media comprising computer-executable instructions for performing the acts or events of the various methods.
  • unless claim language specifically recites “means for,” the claim is intended to encompass a recited claim structure, and not to invoke means-plus-function language.
  • the terms “user,” “subscriber,” “customer,” “consumer,” and the like are employed interchangeably throughout the subject specification, unless context warrants particular distinction(s) among the terms. It should be appreciated that such terms can refer to human entities, associated devices, or automated components supported through artificial intelligence (e.g., a capacity to make inference based on complex mathematical formalisms) which can provide simulated vision, sound recognition and so forth.
  • the terms “wireless network” and “network” are used interchangeably in the present application; when the context in which the term is utilized warrants distinction for clarity purposes, such distinction is made explicit.
  • the word “exemplary,” where used, is used herein to mean serving as an example, instance, or illustration. Any aspect or design described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects or designs. Rather, use of the word exemplary is intended to present concepts in a concrete fashion. Wherever the phrases “for example,” “such as,” “including” and the like are used herein, the phrase “and without limitation” is understood to follow unless explicitly stated otherwise.
  • the term “or” is intended to mean an inclusive “or” rather than an exclusive “or”. That is, unless specified otherwise, or clear from context, “X employs A or B” is intended to mean any of the natural inclusive permutations. That is, if X employs A; X employs B; or X employs both A and B, then “X employs A or B” is satisfied under any of the foregoing instances.
  • references to singular components or items are intended, unless otherwise specified, to encompass two or more such components or items.
  • the articles “a” and “an” as used in this application and the appended claims should generally be construed to mean “one or more” unless specified otherwise or clear from context to be directed to a singular form.
  • disclosed systems and apparatuses and components or subsets thereof should neither be presumed to be exclusive of other disclosed systems and apparatuses, nor should an apparatus be presumed to be exclusive to its depicted components in an example embodiment or embodiments of this disclosure, unless where clear from context to the contrary.
  • steps or blocks as shown in example methods or operations can be interchangeable with steps or blocks as shown in other example methods or operations.


Abstract

Example systems and methods receive camera data representative of images of a hen located in a cage. The hen can be identified based on identifiable biometric characteristics. When an egg is laid, the hen that laid the egg can be identified, based on an analysis that considers one of the identifiable biometric characteristics of the hen, egg data, and a proximity in time between a pattern of movement indicative of a laying behavior by the hen and the time the egg was detected being laid. An iterative process can be performed wherein the egg data and subsequently collected egg data can be used to determine a level of quality of eggs laid by the hen.

Description

IDENTIFYING POULTRY ASSOCIATED WITH EGGS OF A QUALITY
TECHNICAL FIELD
The present application relates generally to the field of data processing, and more specifically to using camera data associated with livestock (e.g., poultry) and eggs, captured at hatcheries, to identify livestock associated with eggs of a given quality.
BACKGROUND
In livestock farms, particularly poultry farms comprising hatcheries, farmers strive to maximize the growth of egg-laying poultry, decrease mortality rate, and increase egg counts (e.g., quantity). Owing to the high costs of labor and the nature of having poultry lay eggs in cages, poultry owners struggle to take precautionary measures that address underperforming poultry, reduced egg counts, reduced egg sizes, and poultry growth, for better investment returns.
In a hatchery, it can often be difficult to determine which hen laid which egg, and in particular, to determine which hen lays lower grade eggs. Especially with many hens in a cage, the chance that one or more hens obscure another from observation can contribute to this difficulty.
US2020/170219A discloses an unmanned aerial vehicle for determining geolocation exclusion zones of animals. The unmanned aerial vehicle includes a processor-based monitoring device to track geolocation information associated with an animal from the unmanned aerial vehicle, an identification device mounted on the unmanned aerial vehicle to identify the animal and to track a position of the animal over time, and a mapping device coupled to the monitoring device to determine locations where the animal has traversed and to identify where an encounter with the animal is reduced. The geolocation information associated with an animal may include identifying birds and their nests.
SUMMARY OF THE INVENTION
It is an object of the present invention to at least partly mitigate this problem.
According to a first aspect of the invention, this and other objects are achieved by a data processing device according to claim 1. According to a second aspect of the invention, this and other objects are achieved by a data processing method according to claim 9.
According to a third aspect of the invention, this and other objects are achieved by a luminaire according to claim 13.
BRIEF DESCRIPTION OF THE DRAWINGS
Non-limiting and non-exhaustive embodiments of the subject disclosure are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified.
Fig. 1 is a diagram illustrating an example hatchery environment in which example devices operate, in accordance with various aspects and embodiments of the subject disclosure.
Fig. 2 is a diagram illustrating an example system for identifying a hen, in accordance with various aspects and embodiments of the subject disclosure.
Fig. 3 is a graph illustrating a pattern of movement indicative of a pre-laying behavior engaged in by a poultry ready to lay an egg, in accordance with various aspects and embodiments of the subject disclosure.
Fig. 4 illustrates a flow diagram relating to example operations that can be performed by a poultry and egg analyzer device, in accordance with various aspects and embodiments of the subject disclosure.
Fig. 5 illustrates a flow diagram relating to other example operations that can be performed by a poultry and egg analyzer device, in accordance with various aspects and embodiments of the subject disclosure.
Fig. 6 illustrates a flow diagram relating to example operations that can be performed by a luminaire, in accordance with various aspects and embodiments of the subject disclosure.
Fig. 7 illustrates an example block diagram of a computer that can be operable to execute processes and methods in accordance with various aspects and embodiments of the subject disclosure.
DETAILED DESCRIPTION
The following description and the annexed drawings set forth in detail certain illustrative aspects of the subject matter. However, these aspects are indicative of but a few of the various ways in which the principles of the subject matter can be employed. Other aspects, advantages, and novel features of the disclosed subject matter will become apparent from the following detailed description when considered in conjunction with the provided drawings. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide an understanding of the subject disclosure. It may be evident, however, that the subject disclosure can be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to facilitate describing the subject disclosure.
The subject disclosure of the present application describes example embodiments of systems (referred to herein as the “poultry identification systems,” or for convenience “example systems,” or “systems”) and methods, and example embodiments of the poultry identification systems are described below with reference to block diagrams and flowchart illustrations of methods, functions, apparatuses, and computer program products and modules. Steps of block diagrams and flowchart illustrations support combinations of mechanisms for performing the specified functions, combinations of steps for performing the specified functions, and program instructions for performing the specified functions. It should be understood that each step of the block diagrams and flowchart illustrations, combinations of steps in the block diagrams and flowchart illustrations, or any operations, functions, methods, and processes described herein, can be implemented, in accordance with example embodiments of the present invention, by computer processing systems comprising devices (e.g., top-facing cameras, side-facing cameras, poultry and egg analyzer, and luminaires, as introduced in FIG. 1 and FIG. 2), having one or more processors and one or more memories that store executable instructions (e.g., computer program product, computer-readable instructions, software, software programs, software applications, etc.) that, when executed by the one or more processors, facilitate (e.g., perform, control, command, direct, order, etc.) performance of the operations, functions, methods, and processes described below in accordance with example embodiments of the present invention. The one or more processors (e.g., microprocessors, processors, central processing unit (CPU), system on a chip (SoC), combinations of these, or other programmable data processing apparatus) can be any processor device known to those of ordinary skill, for example processors offered for sale by Intel (e.g., branded Pentium processors), Advanced Micro Devices (AMD), International Business Machines (IBM) and the like. It is also contemplated that processors of other brands can be suitable. Additionally, future processors, as they are developed and branded, are contemplated to be within the scope of the present invention. The term processor is further elaborated upon below. The memories can comprise any suitable computer-readable storage medium, including, for example, on-chip memory, read only memory (ROM), random access memory (RAM), hard disks, compact disks, DVDs, optical data stores, and/or magnetic data stores. In addition to processors and memories, the one or more devices can also comprise circuitry and hardware components as described below with respect to FIG. 7. Example embodiments of the devices may take the form of entirely hardware embodiments, entirely software embodiments, and embodiments combining both software and hardware aspects.
The poultry identification systems and methods autonomously monitor poultry to detect and tag layers in a poultry cell which are responsible for laying a poor grade of eggs, and subsequently direct and enable farm workers to separate and take corrective measures for underperforming poultry in the observed cell. Underperforming laying can be a sign of diseased poultry. Poultry, in the context of this application, can include any livestock that can lay eggs for human consumption, including but not limited to, chickens, turkeys, quails, and ostriches. Hen, in the context of this application, refers to egg-laying poultry. Systems in accordance with example embodiments of the present application can employ a grid of sensors (e.g., a visual camera, thermal camera, etc.). Also associated with the system can be connected lighting devices (e.g., luminaires). By applying a combination of deep learning and machine learning techniques, a machine learning engine can identify poultry uniquely among its neighbors in the cell by utilizing poultry facial recognition techniques and algorithms.
The example embodiments of systems can perform the functions of automatically identifying, using cameras and sensors, a hen that lays particular eggs, and in particular a hen that lays poorer-grade eggs, due to sickness or other factors (e.g., feed, lack of lighting, lack of exercise, lack of medication, etc.). Once separated, the underperforming hens can be rehabilitated, for example by providing lighting interventions, appropriate feed, medicine, etc., to improve the well-being of the underperforming hens so that the hens can improve the grade of the eggs they lay.
The example poultry identification systems can identify hens that perform less satisfactorily than other hens, which includes laying lower quality, or poorer grade, eggs (e.g., smaller, irregularly shaped, discolored, cracked, broken), or laying eggs at a lower frequency (e.g., fewer eggs per day). A regularly shaped egg is typically oval, with one end being larger than the other. Shell quality of the eggs can comprise whether the eggs have any cracks or are broken, and can also comprise the egg color. Weak-shelled eggs show a different color, which can be due to calcium deficiency and poor calcium sources. Early identification, including by using machine learning techniques, of poor quality egg-laying poultry can allow poultry owners to treat them separately.
In example embodiments, a first camera(s) can be used to identify a hen, and a second camera(s) can be used to trace the hen’s pre-laying behavior, and subsequently to recognize whether the farm worker has selected the right, underperforming hen. The second camera can be top facing, mounted above the poultry cage cells and pointed at the tops of the poultry cages, and the first camera can be side facing, mounted so that the camera points toward the side faces of the poultry cage cells. The first camera can also be used to capture images of the egg, which can be analyzed to derive egg data relating to the dimensions of the egg. In accordance with example embodiments of the present invention, one or more devices in the system, such as a poultry and egg analysis system, can analyze visual data associated with captured images (e.g., captured by a camera) of poultry (e.g., hens) in a cage cell. The poultry and egg analysis system can utilize facial recognition techniques using computer vision or machine learning techniques to identify the egg-laying poultry. The poultry and egg analysis system can also analyze data, such as time data, location data, and egg data, to correlate this data with a particular hen.
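By way of illustration only, the following minimal Python sketch shows one way such biometric identification could be implemented: an embedding vector computed upstream (e.g., by a convolutional network over a camera crop of a hen's head and comb) is matched against a gallery of stored reference embeddings. The function name, gallery structure, and similarity threshold are hypothetical assumptions of this sketch, not details taken from the disclosure.

    import numpy as np

    def identify_hen(frame_embedding, gallery, threshold=0.8):
        """Match a biometric embedding from a camera frame against known hens.

        gallery maps hen identifiers to reference embeddings previously
        stored as poultry identification data. Returns the best-matching
        hen identifier, or None if no candidate clears the similarity
        threshold. All names and the threshold are illustrative.
        """
        best_id, best_score = None, threshold
        for hen_id, ref in gallery.items():
            # Cosine similarity between the frame and reference embeddings.
            score = float(np.dot(frame_embedding, ref)
                          / (np.linalg.norm(frame_embedding) * np.linalg.norm(ref)))
            if score > best_score:
                best_id, best_score = hen_id, score
        return best_id

In such a scheme, identification degrades gracefully: an occluded hen simply fails to clear the threshold and can be re-evaluated on later frames.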
After analysis identifies the most likely hen that laid a lower grade egg, dynamic, or modulated, light effects from one or more luminaires can be used to advise a farm worker to pick the correct, underperforming hen from a multitude of hens in the cell/cage. The system can actuate lighting devices (e.g., luminaires) that generate a visual alert. In example embodiments, direct lighting, emanating from one or more luminaires that are part of the system, can shine upon the hen that has been identified. The luminaires can receive positional data from the system indicating the location of the hen, which is updated depending on the movement of the hen. The positional data can be used by the luminaires to illuminate the hen. In some example embodiments, the luminaire comprises a camera so that it can use images to determine to what position to move the luminaire so as to be able to direct a beam of light upon the hen. These light effects can guide a farm worker to the correct cage. The systems can also generate a report of the prediction to farmers (e.g., a report comprising an image of the underperforming hen, egg data comprising the dimensions of eggs laid by the hen, egg data comprising the egg laying rate of the hen, cage location, etc.). When a farm worker touches the hen, a light effect can indicate whether the appropriate, identified poultry has been selected. The light can differ from the light that identified the poultry.
Hence, example embodiments of the present application use different cameras to keep track of hens and eggs, complemented by light effects from the farm lights aimed at guiding the workflow in the farm. In some example embodiments, the camera and the luminaire can be integrated to form one device.
The present application can be operable to use an iterative process to capture data, analyze the captured data by combining it with previously analyzed data, and produce a new set of refined analysis. Systems and methods herein can be used to detect and tag layers in a poultry cell so as to identify hens that lay above-standard grade eggs, or to identify hens that lay poor grade eggs, with a higher degree of confidence, guiding farm workers to the above-standard performing hen, or to the underperforming hen.
FIG. 1 is a diagram illustrating an example of an environment 100 depicting various devices that can be used in the example embodiments of poultry identification systems. The example systems can operate in a poultry hatching environment.
The environment 100 can comprise one or more cages 105 (cage 105 in the singular, cages 105 in the plural). Multiple cages can comprise a cage cell, and multiple cage cells can be grouped together. One or more cages can contain poultry 110, and in particular, hens that lay eggs 115 (egg 115 in the singular, eggs 115 in the plural).
The example poultry identification systems operating in the environment 100 can comprise several cameras. The example systems can comprise one or more top-facing cameras (top-facing camera 120 in the singular, top-facing cameras 120 in the plural).
The example poultry identification systems can also comprise one or more side-facing cameras (side-facing camera 125 in the singular, side-facing cameras 125 in the plural) that are pointed at the sides of the cages 105. Because the cages might be arranged to be adjacent to each other, not every angle of the cage might be exposed to the side-facing cameras. For example, a cage that is positioned in between two other cages might only have a camera pointed at the back and front of the cage, and not the two sides on account of the two sides being adjacent to other cages.
The one or more of the cameras (top-facing cameras 120, or side-facing cameras 125) can comprise a housing, and can comprise a processor and memory, as explained above, and have the same or similar components as described below with respect to FIG. 7. One or more of the cameras can comprise actuators and motors, so as to be operable to respond to control signals, including to move so as to point at different angles with respect to the ceiling. One or more of the cameras can communicate using one or more wireless communication protocols (e.g., cellular protocols such as LTE, 5G, etc., or other wireless protocols such as Wi-Fi or Bluetooth, or radio frequency (RF)). If wired, the one or more cameras can communicate using one or more wired protocols, such as Ethernet. The cameras can be operable to take a picture, take video, or in some example embodiment systems, take thermographic pictures or videos. The cameras in some example embodiment systems can also be equipped with night-vision technology so as to capture movement, shapes, and other detail in low light conditions. The cameras (top or side facing) can also comprise a clock, which in some embodiments can be time synced to a networked clock, and in other embodiments be an internal non-networked clock. The clock can be used to stamp the time (e.g., create a time stamp) of any visual pictures or frames of videos sent by the cameras.
The system can also comprise one or more lighting devices. The lighting devices can be luminaires (e.g., luminaire 130 in the singular, luminaires 130 in the plural), which can be mounted on a ceiling. The one or more luminaires 130 can comprise one or more light sources, together with the parts designed to distribute the light, to position and protect the light sources, and to connect the light sources to the power supply. The light sources can be incandescent, fluorescent, halogen-based, or a light emitting diode (LED). The luminaires 130 can be placed above the cages, and can emanate light that has a beam width such that an individual hen can be illuminated. In some example embodiments, an aperture of the luminaire and/or mirrors can be used to control the beam width. In some example embodiments, the beam width can also be adjusted by elevating or de-elevating the position of the luminaire with respect to the cage. As with the cameras, the luminaires 130 can have a motor at the base, and actuators that control the motors, so that the luminaires can move to illuminate at different angles with respect to the luminaire and the ceiling. In some example embodiments, the light sources can emit light of different colors, such as red, blue, or green. In example embodiments, the one or more luminaires can comprise a processor and memory, and can have one or more of the components described below with respect to FIG. 7. The one or more luminaires have the ability to be independently and simultaneously controlled (e.g., controlled or operated at the direction of a poultry and egg analyzer, described with respect to FIG. 2 below). The luminaires can be controlled to turn off, or turn on, or to emit light of different intensities or colors. In example embodiments, the luminaires can be controlled wirelessly (e.g., cellular, Wi-Fi, Bluetooth, RF, etc.), or by wire (e.g., Ethernet, etc.).
FIG. 2 shows example embodiments of poultry identification systems 200. The example poultry identification systems can comprise the one or more top-facing cameras 120, the one or more side-facing cameras 125, and the one or more luminaires 130 introduced in FIG. 1. The example systems can also comprise a poultry and egg analyzer 215, which can be one or more devices (or a system comprising one or more devices) having a processor and memory, as mentioned above. The poultry and egg analyzer 215 can have, or be connected to, a repository 220 (which can be a local storage device, or a storage device accessible via a networked device). The repository can comprise one or more storage devices (e.g., hard drive, solid state drive, flash drive, etc.) capable of storing digital data and information. In example embodiments, the poultry and egg analyzer can function as a server device. In some example embodiments, the poultry and egg analyzer can serve up user interface pages (e.g., webpages) via a browser, which can be used to report, monitor, list, or access certain data. The user interface pages can be accessible by one or more user equipment 225 (UE 225). UE 225 can comprise a desktop computer, laptop computer, tablet, smartphone, smartwatch, or the like.
The various devices of FIG. 2 can be connected to each other via one or more communications networks. The one or more communications networks can include any of a variety of types of wired or wireless computer networks such as the Internet, a private intranet, cellular network, satellite network, data over cable network (e.g., operating under one or more data over cable service interface specification “DOCSIS”), or any other type of computer or communications network. The communications networks can also comprise, for example, a Local Area Network (LAN), such as an office or Wi-Fi network. The communications network can also comprise, for example, a Bluetooth network that allows one communications device 140 to be connected to other communications devices via the Bluetooth network. The communications network can also comprise an RFID network that can receive RFID signals from active and passive RFID devices. The devices can communicate with each other either directly or indirectly.
In example embodiments of the poultry identification systems, identifiable biometric characteristics of the poultry can be evaluated in conjunction with the behaviors inferred from trajectory tracking (e.g., pattern of movement) to identify with high confidence level that a particular hen laid a particular egg.
The one or more top-facing cameras 120 and the one or more side-facing cameras 125 can capture images (e.g., pictures, video, video frames, thermographic video, etc.) and transmit data signals representative of the captured images (e.g., top-facing camera data 205, side-facing camera data 210). The poultry and egg analyzer 215 can receive the top-facing camera data 205 and the side-facing camera data 210 and determine that a particular egg was laid in a particular cage. The poultry and egg analyzer can also create data associated with an image of an egg (e.g., egg data 230). Egg data 230 can be stored in a repository (e.g., repository 220). The egg data 230 can comprise, for an egg 115 that was laid, the dimensions of the egg 115, which can account for differences in the distance of each camera from each captured egg (e.g., the closer a camera is to an egg, the larger the captured image of the egg will be). As such, the dimensions of each egg 115 are normalized by the poultry and egg analyzer 215 based on the distance of the camera from the egg, which can be determined based on the sum of the distance of the side-facing camera 125 from the cage 105 and the distance of the cage 105 from the egg 115. Egg data 230 can also comprise a time at which the egg was detected to be laid, which can be based on the top-facing camera data.
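As a hedged illustration of the normalization step described above, the following Python sketch applies a pinhole-camera model, under which the apparent (pixel) size of an egg scales inversely with its distance from the camera; the focal-length parameter and the function name are assumptions of this sketch rather than details of the disclosure.

    def normalized_egg_width_mm(pixel_width, camera_to_cage_m,
                                cage_to_egg_m, focal_length_px):
        """Estimate the physical egg width from its apparent width in pixels.

        Under a pinhole-camera model, physical size = pixel size * distance
        / focal length. The camera-to-egg distance is taken, as described
        above, as the sum of the camera-to-cage and cage-to-egg distances.
        """
        distance_m = camera_to_cage_m + cage_to_egg_m
        return pixel_width * distance_m / focal_length_px * 1000.0  # metres to millimetres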
For each cage, given that there are N hens in the cage (e.g., N = 4 means there are 4 hens in the cage), and given that the poultry and egg analyzer can detect and identify an egg 115 that was laid in a cage 105, then, having determined the cage responsible for a specific laid egg, the task of identifying the hen that laid the egg is reduced to a search among the N poultry 110 in that cage 105.
The poultry and egg analyzer 215 can analyze the biometrics of each poultry 110 to identify a particular hen. The identifiable biometric characteristics (or features) that can be used to identify a hen can comprise the overall size, the color, and the comb characteristics. The comb of a hen is strongly correlated with the size of the eggs laid by her. The comb is used in wild-derived populations to base mating decisions on by both males and females. In males, the comb is an indicator of social rank, with females actively soliciting mating from males with larger combs; comb size also correlates with bone mass. In females, the comb is indicative of greater reproductive potential, through an increase in egg production. In turn, egg production is highly dependent on bone morphology in the hen, with one of the principal limitations to egg production being calcium deposition. The biometric data for one or more poultry 110 can be stored in repository 220, as associated poultry identification data 235 for a particular poultry. Additionally, in most cases, the size of the hen is also strongly correlated with the size of the egg. Similar identifiable biometrics can be used by the poultry and egg analyzer 215 to separate male and female poultry. Using identifiable biometrics such as comb size, as well as the higher activity levels of male poultry and other behaviors observed of male poultry, the poultry and egg analyzer 215 can identify one sex of poultry from another.
Once a particular hen has been identified, the movements of that hen can be tracked. Images sent to the poultry and egg analyzer 215 can be used by the poultry and egg analyzer 215 to identify a pattern of movement, wherein the trajectories can be expressed in three dimensions x, y, and z, with time t denoting the timestamp. The pattern of movement data and time t data can be stored with poultry identification data 235. The pattern of movement of a hen at a time prior to the laying of an egg at another time can be used to correlate whether that hen laid the egg. Hens typically engage in pre-laying behavior (e.g., a pattern of movement) before oviposition, consisting of a search phase, selection of a nest site, and formation of a nest hollow. Different breeds of poultry may emphasize some aspects of pre-laying behavior more than others. For example, white leghorn hens can have pronounced search and nest selection behavior during which they visit and investigate a number of potential nest sites before choosing one. Medium-weight hybrid brown egg layer breeds tend to sit longer in nests and perform nest building activities, such as gathering litter around the hen to form the nest hollow. Pre-laying behavior might occur during a certain period on any given day because it is triggered by hormones associated with the last ovulation, and not by the mere presence of an egg in the shell gland. Normally, pre-laying behavior begins an hour or two before the egg is ready to be laid, and culminates in the hen settling in a nest and laying an egg. If egg laying is delayed for some reason, the period for pre-laying behavior will pass and the hen will no longer be motivated to seek a nest. The egg will be laid outside the nest while the hen goes about other activities. Too much competition for nest boxes can cause subordinate hens to learn to use alternate nest sites or to delay egg laying beyond the critical period for pre-laying behavior, in either case leading them to lay floor eggs, because they are prevented from entering nests by dominant hens. It is advisable to provide at least one nest space for every five hens to ensure that all hens can access nests when needed. Additionally, hens also exhibit post-laying behaviors. They will often squawk and cackle after they have laid their eggs. This can be picked up by sensor bundles of the system that have microphones in them, wherein the audible sensor signals can be sent to the connected poultry and egg analyzer 215 for analysis. Also, typical pre-laying behavior includes hens puffing up (expanding) their wings (hence the volume of the hen increases as it puffs up), followed by the post-laying behavior of decreasing this volume within about 30 seconds of the egg being laid. As another example of post-laying behavior, hens will usually stay for an extended time at the location or place at which they laid their eggs before vacating the area. Hens vacating the egg-laying place would also depend on the presence of others, which might frighten them away. Thus, patterns of movement, such as pre-laying and post-laying behavior, can also be used to compute the likelihood of a hen having laid an egg.
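One simplified way to detect the search-then-settle pattern described above from a tracked (x, y, z, t) trajectory is sketched below in Python; the speed thresholds and the required settling duration are illustrative assumptions only, and a deployed system would presumably tune or learn such parameters per breed.

    import numpy as np

    def detect_settling_time(trajectory, timestamps,
                             search_speed=0.15, settle_speed=0.02, settle_secs=120.0):
        """Return the time at which a hen settles after a search phase.

        trajectory: array of shape (T, 3) of x, y, z positions in metres;
        timestamps: array of shape (T,) of times in seconds. Returns the
        candidate settling time (a first time for the laying behavior),
        or None if no search-then-settle pattern is observed.
        """
        speeds = np.linalg.norm(np.diff(trajectory, axis=0), axis=1) / np.diff(timestamps)
        searching, settle_start = False, None
        for i, v in enumerate(speeds):
            if v > search_speed:
                searching, settle_start = True, None   # active search phase
            elif searching and v < settle_speed:
                if settle_start is None:
                    settle_start = timestamps[i]       # hen begins to settle
                elif timestamps[i] - settle_start >= settle_secs:
                    return float(settle_start)         # settled long enough
            else:
                settle_start = None                    # intermediate movement
        return None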
Once an egg is laid, which can be detected by the imaging sensor, the trajectories and the biometrics become features in a Bayesian algorithm that works as follows: P(chicken_i laid egg | features) ∝ P(features of i | egg size) * P(chicken_i laid)
This equation accounts for two types of features: biometrics and trajectories. When an egg is laid, the poultry and egg analyzer 215 estimates the likelihood of each specific hen having laid this egg. The identifiable biometrics, like the comb size and the overall size of the hen, are data that can be used. The likelihood of hen i laying the egg based on the biometrics can be computed by fitting probability densities to data.
Trajectories (e.g., patterns of movement, such as pre-laying or post-laying behavior) are the second set of data that can be evaluated. As shown in FIG. 3, different trajectories correspond to different behaviors of the poultry. The term P(features of i | egg size) is the probability of hen i generating the trajectories given the fact that an egg of said size was laid. The term P(chicken_i laid) is the prior bias of hen i having laid the egg, based on the biometrics. For example, if there is a linear relationship between the plume size and the egg size, then the poultry and egg analyzer 215 can estimate the probability that one of the N hens laid that egg.
The prior probability is combined with the likelihood to get a posterior estimate of the probabilities of each hen having laid said egg. The poultry and egg analyzer 215 can assign the egg to the hen with the maximum posterior probability. Pre-set thresholds, which can be adjusted, can be input (e.g., input by a user equipment 225) and used to determine which hens are underperforming, for example, thresholds related to an egg laying rate (e.g., number of eggs laid per day, or some other time frame), as well as the size and color of the eggs that were laid. In example embodiments, after several monitoring cycles, underperforming poultry can be identified through an iterative process using machine learning techniques by the poultry and egg analyzer 215.
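A minimal numerical sketch of this maximum-a-posteriori assignment, in Python, follows. Purely for illustration, it assumes the prior P(chicken_i laid) derives from the linear comb-size/egg-size relationship mentioned above with Gaussian noise, and that per-hen trajectory likelihoods P(features of i | egg size) have been computed separately; the parameter names and the minimum-posterior cutoff are hypothetical.

    import numpy as np

    def assign_egg(egg_size_mm, trajectory_likelihood, comb_size_mm,
                   slope, intercept, sigma, min_posterior=0.5):
        """Assign a laid egg to the hen with the maximum posterior probability.

        Implements P(hen_i laid egg | features) proportional to
        P(features of i | egg size) * P(hen_i laid). The prior is modeled
        from each hen's comb size via an assumed linear relationship with
        Gaussian noise; trajectory_likelihood maps hen identifiers to
        P(features of i | egg size).
        """
        hens = list(trajectory_likelihood)
        predicted = np.array([slope * comb_size_mm[h] + intercept for h in hens])
        prior = np.exp(-0.5 * ((egg_size_mm - predicted) / sigma) ** 2)
        likelihood = np.array([trajectory_likelihood[h] for h in hens])
        posterior = prior * likelihood
        total = posterior.sum()
        if total == 0.0:
            return None                      # no hen is plausible
        posterior /= total
        best = int(np.argmax(posterior))
        return hens[best] if posterior[best] >= min_posterior else None

Returning None below a posterior cutoff, rather than always taking the argmax, is one way to defer ambiguous assignments to a later monitoring cycle.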
Once an underperforming hen has been identified, for example via the probabilities and correlations and the iterative machine learning process, the poultry and egg analyzer 215 can send a signal to a luminaire 130, preferably one closest to the cage of the identified hen. After analysis identifies the most likely hen that laid a lower grade egg, dynamic, or modulated, light effects can be used to advise a farm worker to pick the correct, underperforming hen from a multitude of hens in the cell/cage. The poultry and egg analyzer 215 can transmit a signal (e.g., via a communication network) to actuate lighting devices (e.g., luminaires 130), preferably a luminaire close to the cage containing the underperforming poultry. Once the signal is received, the luminaire 130 can shine a light upon the poultry that has been identified. As mentioned above, the beam width of the light can be narrow enough to shine a light on a specific hen. The resulting light can be a spotlight. The emanated lights might be of different colors, and can also flash. These light effects can guide a farm worker to the correct cage. In some example embodiments, when a farm worker touches the hen, a light effect (e.g., a change in color) can indicate whether the appropriate, identified poultry has been selected.
The system can also generate a report of the prediction to farmers (e.g., via an email showing an image of the underperforming hen), so the farm worker knows the results of any analysis regarding the hen’s underperformance. A decision can be made as to whether to separate an underperforming hen, or provide lighting interventions, medical treatments, etc., to boost the size of the egg.
In accordance with example embodiments, one or more of the example methods and operations, as described above, can be performed as described in FIGS. 4-6. The methods and operations can be performed by one or more devices comprising a processor and a memory. The device can have some or all of the components as described below with respect to FIG. 7. Machine-readable storage media comprising executable instructions that, when executed by a processor, can also facilitate performance of the methods and operations described in FIGS. 4-6. In each of these operations, steps or aspects described in one operation can be substituted or combined with steps and aspects with respect to the other operations, as well as features described, unless context warrants that such combinations or substitutions are not possible. Further, if a feature, step, or aspect is not described with respect to example operations, this does not mean that said feature, step, or aspect is incompatible or impossible with respect to those operations. As such, the example operations of the present application described above (e.g., with respect to FIGS. 1-3) and below are not necessarily limited to the steps, features, or aspects that are described with respect to those example operations. Further, steps, features, or aspects are not limited to those described in FIGS. 4-6, and can be combined or substituted with other steps, features, or aspects relating to the poultry identification system(s) in accordance with example implementations as described in this disclosure above and below.
In example embodiments, the poultry and egg analyzer 215 can take the form of a device (or one or more devices, which may be networked and can have one or more of the structural components as described in FIG. 7 below) that comprises a processor and a memory that stores executable instructions that, when executed by the processor, facilitate performance of operations as described in FIG. 4.
Referring to FIG. 4, the operations 400 can comprise, at block 405, receiving camera data representative of images of a poultry located in a poultry cage, wherein the poultry is one of a plurality of poultry. The poultry has identifiable biometric characteristics (e.g., comb of the poultry, overall size of the poultry, etc.), and one of the identifiable biometric characteristics is correlated with egg laying performance (e.g., comb of the poultry). The camera data can comprise top-facing camera data (e.g., top-facing camera data 205) transmitted by a camera that faces a top of the poultry cage (e.g., top facing cameras 120). The camera data can also comprise side-facing camera data (e.g., side-facing camera data 210) transmitted by a camera that faces a side of the poultry cage (e.g., side facing cameras 125).
The operations 400 can comprise, at block 410, analyzing the camera data to determine the identifiable biometric characteristics of the poultry.
At block 415, the operations 400 can comprise, based on the identifiable biometric characteristics, identifying the poultry and storing poultry identification data (e.g., poultry identification data 235) associated with the poultry in a repository (e.g., repository 220).
At block 420, the operations 400 can comprise, determining, based on the camera data, whether the poultry has engaged in a pattern of movement indicative of a laying behavior. The laying behavior can comprise pre-laying behavior, as well as post-laying behavior.
At block 425, the operations 400 can comprise, determining, based on the camera data, a first time during which the pattern of movement occurs. In the present application, the wording “time” may refer to a (discrete) point in time or to a period in time during which an event (such as the pattern of movement) occurs.
At block 430, the operations 400 can comprise, determining, based on the camera data, a presence of an egg in a space associated with the poultry cage. The space associated with the poultry cage can be an area inside the cage, or the space can be a collection area nearby the cage (e.g., a collection bin, conveyor belt, etc.).
The operations 400 can comprise, at block 435, storing in the repository egg data representative of the egg (e.g., egg data 230); the egg data can also comprise quality data related to a quality, or grade, of the egg. The quality data can comprise information relating to a dimension of the egg (e.g., size, width, whether it is oval shaped and slightly larger at one end, etc.). The quality data can also comprise shell quality data (e.g., color of the shell, for example on a whiteness scale, or whether it is above or below a threshold of whiteness, or whether the shell is cracked or broken). The egg data can also comprise a second time at which the egg appears in a space around the poultry cage, including in the poultry cage.
The operations 400 can comprise, at block 440, identifying the poultry as a hen that laid the egg, based on an analysis considering the one of the identifiable biometric characteristics of the poultry, the egg data, and a proximity in time between the first time and the second time. In the present application, the wording “a proximity in time” may refer to the time interval between the first time and the second time, where the event that occurs at the second time immediately follows the event that occurs at the first time. The event of the first time and the event of the second time may be, for example, one or more of engagement in a pattern of movement indicative of a laying behavior and the appearance of an egg in a space. In the case of a first period in time and a second period in time, the proximity in time may refer to the time interval between the end of the first period in time and the start of the second period in time.
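For illustration only, the proximity in time so defined reduces to a simple difference; the following hypothetical Python helper makes the convention explicit (for a point event, the same value serves as both the start and the end of the period).

    def proximity_in_time(first_end_s, second_start_s):
        """Interval, in seconds, between the end of the first event (the
        pattern of movement) and the start of the second event (the
        detected appearance of the egg), per the definition above."""
        return second_start_s - first_end_s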
The operations 400 can comprise, at block 450, associating the egg data with the poultry identification data.
The operations 400 can further comprise performing an iterative process, wherein the iterative process comprises examining the egg data and further egg data associated with the hen, wherein the further egg data was collected subsequent to the collection of the egg data.
The operations 400 can further comprise determining, based on the iterative process, whether the hen lays eggs of a level of quality, wherein the level of quality is based on one or more of a size of the eggs laid by the hen, a color of the eggs (e.g., on a whiteness scale), a shell quality (e.g., whether the eggs comprise broken shells or cracked shells), and a frequency at which the eggs are laid by the hen (e.g., how many eggs are laid by the hen per day). The level of quality can be indicative of egg laying underperformance by the hen (e.g., the eggs laid are of smaller size, or the number of eggs per day is low). The level of quality can also be indicative of egg laying overperformance by the hen.
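The following Python sketch illustrates how such criteria might be applied to a hen's accumulated egg data to flag underperformance; every threshold value, field name, and pass fraction here is an assumption made for this example, standing in for the adjustable pre-set thresholds described above.

    from dataclasses import dataclass

    @dataclass
    class EggRecord:
        width_mm: float       # normalized egg width
        whiteness: float      # shell color on a 0..1 whiteness scale
        shell_intact: bool    # False if the shell is cracked or broken

    def hen_is_underperforming(eggs, days_observed, min_width_mm=40.0,
                               min_whiteness=0.6, min_rate_per_day=0.7):
        """Apply adjustable thresholds for size, color, shell quality,
        and laying rate to a hen's accumulated egg data."""
        if not eggs:
            return True
        rate_ok = len(eggs) / days_observed >= min_rate_per_day
        size_ok = sum(e.width_mm >= min_width_mm for e in eggs) / len(eggs) >= 0.8
        color_ok = sum(e.whiteness >= min_whiteness for e in eggs) / len(eggs) >= 0.8
        shell_ok = sum(e.shell_intact for e in eggs) / len(eggs) >= 0.9
        return not (rate_ok and size_ok and color_ok and shell_ok)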
The operations 400 can further comprise, in response to a determination that the hen lays eggs of the level of quality, sending positional data of the hen to a luminaire (e.g., luminaire 130) positioned above the poultry cage. The operations 400 can further comprise, in response to a determination that the hen has been removed from the poultry cage, transmitting a signal to the luminaire indicating that the poultry has been removed.
In example embodiments, the poultry and egg analyzer 215 can take the form of a device (or one or more devices, which may be networked and can have components as described in FIG. 7 below) that comprises a processor and a memory that stores executable instructions that, when executed by the processor, facilitate performance of data processing operations (e.g., methods) as described in FIG. 5.
The operations 500 can comprise, at block 505, receiving camera data representative of images of a hen located in a cage, wherein the hen is one of a plurality of poultry, and wherein the images comprise identifiable biometric characteristics of the hen, and wherein one of the identifiable biometric characteristics is correlated with egg laying performance (e.g., a comb of the poultry). The camera data can comprise top-facing camera data (e.g., top-facing camera data 205) transmitted by a camera that faces the top of the cage (e.g., top-facing camera 120), and can also comprise side-facing camera data (e.g., side-facing camera data 210) transmitted by a camera that faces a side of the cage (e.g., side-facing camera 125).
The operations 500 can comprise, at block 510, based on the identifiable biometric characteristics (e.g., comb, overall size) in the camera data, identifying the hen and storing identification data (e.g., poultry identification data 235) associated with the hen in a repository (e.g., repository 220).
At block 515, the operations 500 can comprise, determining, based on the camera data, whether the hen has engaged in a pattern of movement indicative of a laying behavior (e.g., pre-laying behavior, post-laying behavior).
At block 520, the operations 500 can comprise, determining, based on the camera data, a first time during which the pattern of movement occurs.
At block 525, the operations 500 can comprise, detecting, based on the camera data, a presence of an egg laid by one of the plurality of poultry.
The operations 500 can comprise, at block 530, storing in the repository egg data representative of the egg, the egg data comprising quality data for the egg and a second time at which the egg is detected to be present.
The operations 500 can comprise, at block 535, identifying the hen as the hen that laid the egg, based on an analysis that considers the one of the identifiable biometric characteristics of the hen, the egg data, and a proximity in time between the first time and the second time.
At block 540, the operations 500 can comprise, associating the egg data with the identification data.
The operations 500 can further comprise, performing an iterative process, wherein the iterative process comprises examining the egg data and further egg data associated with the hen, wherein the further egg data was collected subsequent to the collection of the egg data.
The operations 500 can further comprise, determining, based on the iterative process, whether the hen laid eggs of a level of quality indicative of an egg-laying performance of the hen, wherein the level is based on a size of eggs laid by the hen, a color of the eggs, a shell quality (e.g., whether the shell is cracked, whether the shell is broken), and a frequency at which the eggs are laid by the hen (e.g., a rate such as how many eggs are laid per day).
The operations 500 can further comprise, in response to a determination that the hen lays eggs of the level of quality, sending positional data of the hen to a luminaire positioned above the cage (e.g., luminaire 130).
The operations 500 can further comprise, in response to a determination that the hen has been removed from the cage, transmitting a signal to the luminaire indicating that the hen has been removed.
In example embodiments, the luminaire (e.g., luminaire 130), which can be mounted above a poultry cage, can take the form of a device (or one or more devices, which may be networked and can have components as described in FIG. 7 below) that comprises a network interface device through which the luminaire can receive communication signals, a light source (e.g., one or more LEDs), a camera, a processor and a memory that stores executable instructions that, when executed by the processor, facilitate performance of operations (e.g., methods) as described in FIG. 6.
Referring to FIG. 6, the operations 600 can comprise, at block 605, receiving positional data via the network interface related to a location of a hen in a cage, wherein the hen has been identified as performing at a level of performance based on an analysis of egg data associated with the hen.
At block 610, the operations 600 can comprise, using an image captured by the camera and the positional data, directing the light source to emanate a light beam to the location of the hen, wherein a beam width of the light beam is narrow enough to illuminate at least a portion of the hen.
The operations 600 can further comprise, receiving a signal (e.g., from the poultry and egg analyzer 215) to generate a light, wherein the signal is indicative of the removal of the hen from the cage. The light can be of a different color than the light beam. In some example embodiments, the luminaire with a camera can also serve as a top-facing camera operable to transmit signals representative of captured video of a plurality of poultry residing in the cage. In some example embodiments, each cage can be mapped on a plane (e.g., an x-y plane, wherein a location on the plane can comprise an x-y coordinate). Top-facing camera data can be used by the poultry and egg analyzer 215 to determine the coordinate of a particular hen. To facilitate removal or isolation of an underperforming hen, the poultry and egg analyzer 215 can transmit the coordinates of the underperforming hen to the luminaire. When the coordinates are received by the appropriate luminaire (e.g., the luminaire that is most nearly above the cage of the underperforming hen), the luminaire can, for example using actuators, position itself (or position a light source of the luminaire) so as to direct light to the coordinates where the hen is located. As such, in these embodiments the luminaire does not need to have a camera.
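As a simplified geometric illustration of aiming a luminaire at received coordinates, the Python sketch below converts a hen's cage-plane (x, y) coordinate into pan and tilt angles for a ceiling-mounted luminaire; the coordinate conventions (cage plane at z = 0) and the function name are assumptions of this sketch.

    import math

    def pan_tilt_for(hen_xy, luminaire_xyz):
        """Convert a hen's cage-plane (x, y) coordinate into pan and
        tilt angles in degrees for a ceiling-mounted luminaire at
        position luminaire_xyz, with the cage plane taken as z = 0."""
        dx = hen_xy[0] - luminaire_xyz[0]
        dy = hen_xy[1] - luminaire_xyz[1]
        pan = math.degrees(math.atan2(dy, dx))  # rotation about the vertical axis
        # Down-angle from horizontal toward the target point.
        tilt = math.degrees(math.atan2(luminaire_xyz[2], math.hypot(dx, dy)))
        return pan, tilt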
The poultry and egg analyzer 215, top-facing cameras 120, side-facing cameras 125, luminaires 130, and user equipment 225 can comprise one or more of the structural components described in FIG. 7.
Fig. 7 schematically shows an example implementation of a computing system 700. In example implementations, various devices used in the systems described above (e.g., the poultry identification systems 200 operating in environment 100) in accordance with this disclosure can comprise one or more components as described in FIG. 7. The computing system 700 can include a computer 702. Computer 702 can comprise a processor 704, memory 706, various interfaces, and various adapters, each of which can be coupled via a local interface, such as system bus 708. The system bus 708 can be any of several types of bus structures that can interconnect to a memory bus (with or without a memory controller), a peripheral bus, and a local bus using any of a variety of commercially available bus architectures.
The processor 704 is capable of processing computer-executable instructions that, when executed by the processor 704, facilitate performance of operations, including methods, operations, functions, or steps described in this disclosure. The processor 704 can comprise one or more devices that can process the instructions. The computer-executable instructions can comprise a program file, software, software module, program module, software application, etc., that is in a form that can ultimately be run by the processor 704. The computer-executable instructions can be, for example: a compiled program that can be translated into machine code in a format that can be loaded into a random access memory 712 of memory 706 and run by the processor 704; source code that may be expressed in a proper format such as object code that is capable of being loaded into a random access memory 712 and executed by the processor 704; or source code that may be interpreted by another executable program to generate instructions in a random access memory 712 to be executed by the processor 704, etc. Although the software applications as described herein may be embodied in software or code executed by hardware as discussed in FIG. 7, as an alternative the same may also be embodied in dedicated hardware or a combination of software/general purpose hardware and dedicated hardware.
The computer-executable instructions can be stored on machine-readable storage media (i.e., computer-readable storage media, also referred to as machine-readable storage medium, or as computer-readable storage medium). The computer-readable storage media can comprise memory 706, as well as storage device 714. The memory 706 can represent multiple memories that operate in parallel processing circuits, and memory 706 can comprise both nonvolatile memory (e.g., read-only memory (ROM)) and volatile memory (e.g., random access memory (RAM)), illustrated by way of example as ROM 710 and RAM 712.
The computer 702 can further comprise a storage device 714 (or additional storage devices) that can store data or software program modules. Storage device 714 can comprise, for example, an internal hard disk drive (HDD) (e.g., EIDE, SATA), a solid state drive (SSD), one or more external storage devices (e.g., a magnetic floppy disk drive (FDD), a memory stick or flash drive reader, a memory card reader, etc.), or an optical disk drive 720 (e.g., which can read or write from a compact disc (CD), a digital versatile disk (DVD), a Blu-ray Disc (BD), etc.). While storage device 714 is illustrated as located within the computer 702, the storage device 714 can also be of the variety configured for external, or peripheral, location and use (e.g., external to the housing of the computer 702). The storage device can be connected to the system bus 708 by storage interface 724, which can be an HDD interface, an external storage interface, an optical drive interface, a Universal Serial Bus (USB) interface, or any other internal or external drive interface. ROM 710, and also storage device 714, can provide nonvolatile storage of data, data structures, databases, software program modules (e.g., computer-executable instructions), etc., which can be, for example, a basic input/output system (BIOS) 728, an operating system 730, one or more application programs 732, other program modules 734, and application program data 736. Where any component discussed herein is implemented in the form of software, any one of a number of programming languages can be employed, such as, for example, C, C++, C#, Objective C, Java®, JavaScript®, Perl, PHP, Visual Basic®, Python®, Ruby, Flash®, or other programming languages. Data can be stored in a suitable digital format. All or portions of the operating system, applications, modules, or data can also be cached in the RAM 712. The processor 704 can also comprise on-chip memory to facilitate processing of the instructions. The systems and methods described herein can be implemented utilizing various commercially available operating systems or combinations of operating systems. Although the description of computer-readable storage media above refers to respective types of storage devices, it should be appreciated by those skilled in the art that other types of storage media which are readable by a computer, whether presently existing or developed in the future, could also be used in the example operating environment.
Computer 702 can optionally comprise emulation technologies. For example, a hypervisor (not shown) or other intermediary can emulate a hardware environment for operating system 730, and the emulated hardware can optionally be different from the hardware illustrated in FIG. 7. In such an embodiment, operating system 730 can comprise one virtual machine (VM) of multiple VMs hosted at computer 702. Furthermore, operating system 730 can provide runtime environments, such as the Java runtime environment or the .NET framework, for applications 732. Runtime environments are consistent execution environments that allow applications 732 to run on any operating system that includes the runtime environment. Similarly, operating system 730 can support containers, and applications 732 can be in the form of containers, which are lightweight, standalone, executable packages of software that include, e.g., code, runtime, system tools, system libraries and settings for an application.
Further, computer 702 can be enabled with a security module, such as a trusted processing module (TPM). For instance, with a TPM, boot components hash next-in-time boot components and wait for a match of the results to secured values before loading a next boot component. This process can take place at any layer in the code execution stack of computer 702, e.g., applied at the application execution level or at the operating system (OS) kernel level, thereby enabling security at any level of code execution. To the extent that certain user inputs are desirable (as indicated by the dotted line), a user can enter commands and information into the computer 702 using one or more wired/wireless input devices, such as a keyboard 738, a touch screen 740, or a cursor control device 742, such as a mouse, touchpad, or trackball. Other input devices (not shown) can comprise a microphone, an infrared (IR) remote control, a radio frequency (RF) remote control, or other remote control, a joystick, a control pad, a virtual reality controller and/or virtual reality headset, a game pad, a stylus pen, an image input device (e.g., camera(s)), a gesture sensor input device, a vision movement sensor input device, an emotion or facial detection device, a biometric input device (e.g., a fingerprint or iris scanner), or the like. These and other input devices are often connected to the processing unit 704 through an input device interface 744 that can be coupled to the system bus 708, but can be connected by other interfaces, such as a parallel port, an IEEE 1394 serial port, a game port, a USB port, an audio port, an IR interface, a BLUETOOTH® interface, etc.
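By way of illustration and not limitation, the measured-boot verification described above (hashing each next-in-time boot component and comparing against secured values before loading it) can be sketched in Python; the stage images, digest algorithm, and reference values below are hypothetical assumptions for illustration, not a TPM API:

    # Minimal sketch of a measured-boot hash chain: each boot component is
    # hashed and compared against a pre-provisioned secured value before the
    # next component is allowed to load. All names here are illustrative.
    import hashlib

    def verify_boot_chain(stages, secured_values):
        for stage_image, expected_digest in zip(stages, secured_values):
            measured = hashlib.sha256(stage_image).hexdigest()
            if measured != expected_digest:
                return False  # mismatch: halt rather than load the next stage
        return True

    # Hypothetical example: two boot components and their reference digests.
    stages = [b"bootloader-image", b"os-kernel-image"]
    secured = [hashlib.sha256(s).hexdigest() for s in stages]
    assert verify_boot_chain(stages, secured)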
To the extent desired (as noted by the dotted line), a display device 746, such as a monitor, television, or other type of display device, can be also connected to the system bus 708 via an interface, such as a video adapter 748. In addition to the display device 746, a computer 702 can also connect with other output devices (not shown), such as speakers, printers, etc.
The computer 702 can operate in a networked environment using wired or wireless communications to one or more remote computers, such as a remote computer 750 (e.g., one or more remote computers). The remote computer 750 can be a workstation, a server computer, a router, a personal computer, a tablet, a cellular phone, a portable computer, a microprocessor-based entertainment appliance, a peer device, a network node, an internet of things (IoT) device, and the like, and typically includes many or all of the elements described relative to the computer 702, although, for purposes of brevity, only a memory/storage device 752 is illustrated. When used in a networked environment, the computer 702 can access cloud storage systems or other network-based storage systems in addition to, or in place of, the storage device 714 as described above. For example, as part of the cloud storage or network-based storage system, a remote computer 750 can comprise a computing device that is primarily used for storage, such as a network attached storage (NAS) device, a redundant array of independent disks (RAID), or a device that is part of a SAN (storage area network), wherein the storage device comprises memory/storage 752. In a networked environment, program modules depicted relative to the computer 702, or portions thereof, can be stored in the remote memory/storage device 752 (some refer to this as “cloud storage” or “storage in the cloud”). Likewise, data and information, including data associated with applications or program modules, can also be stored remotely at the remote memory/storage device 752. A remote computer 750 that is a server device can facilitate storage and retrieval of information to a networked memory/storage device 752. Upon connecting the computer 702 to an associated cloud storage system, the computer 702 can manage storage provided by the cloud storage system as it would other types of external storage. For instance, access to cloud storage sources can be provided as if those sources were stored locally on the computer 702.
Generally, a connection between the computer 702 and a cloud storage system can be established, either via wired or wireless connectivity, over a network 754. The network can be, for example, a wireless fidelity (Wi-Fi) network, a local area network (LAN), a wireless LAN, a larger network (e.g., a wide area network (WAN)), a cable-based communication network (e.g., a communication network implementing the data over cable service interface specification (DOCSIS)), an asynchronous transfer mode (ATM) network, a digital subscriber line (DSL) network, an asymmetric digital subscriber line (ADSL) network, a cellular network (e.g., 4G Long Term Evolution (LTE), 5G, etc.), or another typical fixed or mobile broadband communications network, and can comprise components (e.g., headend equipment, local serving office equipment, Digital Subscriber Line Access Multiplexers (DSLAMs), Cable Modem Termination Systems (CMTSs), cellular nodes, etc.) related to each of these types of networks. The network 754 can facilitate connections to a global communications network (e.g., the Internet).
When used in a networking environment, the computer 702 can be connected to the network 754 through a wired or wireless communications component 758. The communications component 758 can comprise, for example, a network interface adapter (e.g., a network interface card) or a wireless access point (WAP) adapter. The communications component 758 can also comprise cellular receivers, cellular transmitters, and cellular transceivers that enable cellular communications. The communications component 758 can facilitate wired or wireless communication to the network 754, which can include facilitating communications through a gateway device, such as a cable modem, DSL modem, ADSL modem, cable telephony modem, wireless router, or other devices that can be used to facilitate the establishment of communications. The gateway device, which can be internal or external and a wired or wireless device, can be connected to the system bus 708 via the communications component 758. It will be appreciated that the network connections and components shown are examples, and that other methods of establishing a communications link between the computer 702 and a remote computer 750 can be used.
As used in this application, the terms “system,” “component,” “interface,” and the like are generally intended to refer to a computer-related entity or an entity related to an operational machine with one or more specific functionalities. The entities disclosed herein can be either hardware, a combination of hardware and software, software, or software in execution. For example, a component can be, but is not limited to being, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a server and the server can be a component. One or more components can reside within a process and/or thread of execution and a component can be localized on one computer and/or distributed between two or more computers. These components also can execute from various computer readable storage media comprising various data structures stored thereon. The components can communicate via local and/or remote processes such as in accordance with a signal comprising one or more data packets (e.g., data from one component interacting with another component in a local system, distributed system, and/or across a network such as the Internet with other systems via the signal). As another example, a component can be an apparatus with specific functionality provided by mechanical parts operated by electric or electronic circuitry that is operated by software or firmware application(s) executed by a processor, wherein the processor can be internal or external to the apparatus and executes at least a part of the software or firmware application. As yet another example, a component can be an apparatus that provides specific functionality through electronic components without mechanical parts, the electronic components can comprise a processor therein to execute software or firmware that confers at least in part the functionality of the electronic components. An interface can comprise input/output (I/O) components as well as associated processor, application, and/or API components.
As it is used in the subject specification, the term “processor” can refer to substantially any computing processing unit or device comprising single-core processors; single-processors with software multithread execution capability; multi-core processors; multi-core processors with software multithread execution capability; multi-core processors with hardware multithread technology; parallel platforms; and parallel platforms with distributed shared memory. Additionally, a processor can refer to an integrated circuit, a central processing unit (CPU), an application specific integrated circuit (ASIC), a digital signal processor (DSP), a field programmable gate array (FPGA), a programmable logic controller (PLC), a complex programmable logic device (CPLD), a discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. Processors can exploit nano-scale architectures such as, but not limited to, molecular and quantum-dot based transistors, switches and gates, in order to optimize space usage or enhance the performance of user equipment. A processor also can be implemented as a combination of computing processing units.
Furthermore, the disclosed subject matter can be implemented as a method, apparatus, or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof to control a computer to implement the disclosed subject matter.
The term “article of manufacture” as used herein is intended to encompass any computer-readable device, computer-readable carrier, or computer-readable storage media having stored thereon computer-executable instructions. Computing devices typically comprise a variety of media, which can comprise computer-readable storage media, which can be implemented in connection with any method or technology for storage of information such as computer-readable instructions, program modules, structured data, or unstructured data. Computer-readable storage media can be any available storage media that can be accessed by the computer, and can comprise various forms of memory, as will be elaborated further below.
In the subject specification, terms such as “store,” “data store,” “data storage,” “database,” “repository,” “queue”, and substantially any other information storage component relevant to operation and functionality of a component, refer to “memory components,” or entities embodied in a “memory” or components comprising the memory. Memory can be of various types, such as hard-disk drives (HDD), floppy disks, zip disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, flash memory devices (cards, sticks, key drives, thumb drives), cartridges, optical discs (e.g., compact discs (CD), digital versatile disk (DVD), Blu-ray Disc (BD)), a virtual device that emulates a storage device, and other tangible and/or non-transitory media which can be used to store desired information. It will be appreciated that the memory components or memory elements described herein can be removable or stationary. Moreover, memory can be internal or external to a device or component. Memory can also comprise volatile memory as well as nonvolatile memory, whereby volatile memory components are those that do not retain data values upon loss of power and nonvolatile components are those that retain data upon a loss of power. By way of illustration, and not limitation, nonvolatile memory can comprise read only memory (ROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can comprise random access memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in many forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), magnetic random access memory (MRAM), and direct Rambus RAM (DRRAM). Additionally, the disclosed memory components of systems or methods herein are intended to comprise these and any other suitable types of memory.
The term “facilitate,” as used herein in the context of a system, device, or component “facilitating” one or more actions, methods, or operations, reflects the nature of complex computing environments in which multiple components and/or multiple devices can be involved in some computing operations. Non-limiting examples of actions that may or may not involve multiple components and/or multiple devices comprise the methods described herein, including but not limited to transmitting or receiving data, establishing a connection between devices, determining intermediate results toward obtaining a result, etc. In this regard, a computing device or component can facilitate an operation by playing any part in accomplishing the operation (e.g., directing, controlling, enabling, etc.). When operations of a component are described herein, it is thus to be understood that where the operations are described as facilitated by the component, the operations can be optionally completed with the cooperation of one or more other computing devices or components, such as, but not limited to, processors, application specific integrated circuits (ASICs), sensors, antennae, audio and/or visual output devices, other devices, etc.
In particular and in regard to the various functions performed by the above described components, devices, circuits, systems, and the like, the terms (comprising a reference to a “means”) used to describe such components are intended to correspond, unless otherwise indicated, to any component which performs the specified function of the described component (e.g., a functional equivalent), even though not structurally equivalent to the disclosed structure, which performs the function in the herein illustrated example aspects of the embodiments. In this regard, it will also be recognized that the embodiments comprise a system as well as computer-readable storage media comprising computer-executable instructions for performing the acts or events of the various methods. Additionally, unless claim language specifically recites “means for,” the claim is intended to encompass a recited claim structure, and not to invoke means-plus-function language.
Furthermore, the terms “user,” “subscriber,” “customer,” “consumer,” and the like are employed interchangeably throughout the subject specification, unless context warrants particular distinction(s) among the terms. It should be appreciated that such terms can refer to human entities, associated devices, or automated components supported through artificial intelligence (e.g., a capacity to make inference based on complex mathematical formalisms), which can provide simulated vision, sound recognition, and so forth. In addition, the terms “wireless network” and “network” are used interchangeably in the present application; when the context in which a term is utilized warrants a distinction for clarity purposes, such distinction is made explicit.
Moreover, the word “exemplary,” where used, is used herein to mean serving as an example, instance, or illustration. Any aspect or design described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects or designs. Rather, use of the word exemplary is intended to present concepts in a concrete fashion. Wherever the phrases "for example," "such as," "including" and the like are used herein, the phrase "and without limitation" is understood to follow unless explicitly stated otherwise.
As used in this application, the term “or” is intended to mean an inclusive “or” rather than an exclusive “or”. That is, unless specified otherwise, or clear from context, “X employs A or B” is intended to mean any of the natural inclusive permutations. That is, if X employs A; X employs B; or X employs both A and B, then “X employs A or B” is satisfied under any of the foregoing instances.
Furthermore, references to singular components or items are intended, unless otherwise specified, to encompass two or more such components or items. For example, the articles “a” and “an” as used in this application and the appended claims should generally be construed to mean “one or more” unless specified otherwise or clear from context to be directed to a singular form.
The term "about" is meant to account for variations due to experimental error. All measurements or numbers are implicitly understood to be modified by the word about, even if the measurement or number is not explicitly modified by the word about.
The term "substantially" (or alternatively "effectively") is meant to permit deviations from the descriptive term that do not negatively impact the intended purpose. Descriptive terms are implicitly understood to be modified by the word substantially, even if the term is not explicitly modified by the word “substantially.” In addition, while a particular feature may have been disclosed with respect to only one of several implementations, such feature can be combined with one or more other features of the other implementations as may be desired and advantageous for any given or particular application. To the extent that the terms “has,” “have”, “having”, “comprising” and “including” and “involving” and variants thereof (e.g., “comprises,” “includes,” and “involves”) are used interchangeably and mean the same thing - these terms are defined consistent with the common patent law definition of "comprising" and is therefore interpreted to be an open term meaning “at least the following but is not limited to,” and as such is not to be interpreted to exclude additional features, limitations, aspects, etc.
The above descriptions of various example embodiments and example implementations of the subject disclosure, corresponding figures, and what is described in the Abstract, are described herein for illustrative purposes, and are not intended to be exhaustive or to limit the disclosed embodiments to the precise forms disclosed. It is to be understood that one of ordinary skill in the art can recognize that other embodiments comprising modifications, permutations, combinations, and additions can be implemented for performing the same, similar, alternative, or substitute functions of the disclosed subject matter, and are therefore considered within the scope of this disclosure.
For example, disclosed systems and apparatuses and components or subsets thereof (referred to hereinafter as components) should neither be presumed to be exclusive of other disclosed systems and apparatuses, nor should an apparatus be presumed to be exclusive to its depicted components in an example embodiment or embodiments of this disclosure, unless where clear from context to the contrary. Additionally, steps or blocks as shown in example methods, or operations, can be interchangeable with steps or blocks as shown in other example methods or operations. The scope of the disclosure is generally intended to encompass modifications of depicted embodiments with additions from other depicted embodiments, where suitable; interoperability among or between depicted embodiments, where suitable; addition of a component(s) from one embodiment(s) within another, or subtraction of a component(s) from any depicted embodiment, where suitable; aggregation of components (or embodiments) into a single component achieving aggregate functionality, where suitable; or distribution of functionality of a single system or component into multiple systems or components, where suitable. In addition, incorporation, combination, or modification of systems or components depicted herein, or modified as stated above, with systems, apparatuses, components, or subsets thereof not explicitly depicted herein but known in the art or made evident to one with ordinary skill in the art through the context disclosed herein are also considered within the scope of the present disclosure. As such, although a particular feature of the present invention may have been illustrated or described with respect to only one of several implementations, any such feature may be combined with one or more other features of the other implementations as may be desired and advantageous for any given or particular application.
Therefore, the disclosed subject matter should not be limited to any single embodiment described herein, but rather should be construed in breadth and scope in accordance with the claims, including all equivalents, that are listed below.

Claims

CLAIMS:
1. A data processing device (215), comprising:
   a processor; and
   a memory that stores executable instructions that, when executed by the processor, cause the data processing device to perform operations, comprising:
      receiving camera data representative of images of a poultry located in a poultry cage (105), wherein: the poultry is one of a plurality of poultry, the poultry has identifiable biometric characteristics, and one of the identifiable biometric characteristics is correlated with egg laying performance (405),
      analyzing the camera data to determine the identifiable biometric characteristics of the poultry (410),
      based on the identifiable biometric characteristics, identifying the poultry and storing poultry identification data (235) associated with the poultry in a repository (415),
      determining, based on the camera data, whether the poultry has engaged in a pattern of movement indicative of a laying behavior (420),
      determining, based on the camera data, a first time during which the pattern of movement occurs (425),
      determining, based on the camera data, a presence of an egg (115) in a space associated with the poultry cage (430),
      storing in the repository (220) egg data (230) representative of the egg (115), the egg data (230) comprising quality data related to the egg and a second time at which the egg appears in the space (435),
      identifying the poultry as a hen that laid the egg (115), based on an analysis considering: the one of the identifiable biometric characteristics of the poultry, the egg data, and a proximity in time between the first time and the second time (440), and
      associating the egg data with the poultry identification data (445),
   wherein the operations further comprise:
      performing an iterative process, wherein the iterative process comprises examining the egg data and further egg data associated with the hen, wherein the further egg data was collected subsequent to the collection of the egg data (230); and
      determining, based on the iterative process, whether the hen lays eggs (115) of a level of quality, wherein the level of quality is based on one or more of: a size of the eggs laid by the hen, a color of the eggs, whether the eggs comprise shell cracks, whether the eggs are broken, and a frequency at which the eggs are laid by the hen.
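By way of illustration and not limitation, the time-proximity matching recited in claim 1 can be sketched as follows; the data structures, field names, and the ten-minute window below are hypothetical assumptions for illustration and do not limit the claim:

    # Illustrative sketch: associate an observed egg with the hen whose
    # pre-laying movement pattern (first time) occurred closest in time to
    # the egg's appearance (second time). All structures are assumptions.
    from dataclasses import dataclass

    @dataclass
    class LayingEvent:
        hen_id: str        # from the stored poultry identification data
        first_time: float  # seconds; when the movement pattern occurred

    @dataclass
    class EggObservation:
        egg_id: str
        second_time: float  # seconds; when the egg appeared in the space

    def match_egg_to_hen(egg, events, window=600.0):
        """Return the hen_id nearest in time to the egg, or None if no
        laying event falls within the assumed ten-minute window."""
        candidates = [e for e in events
                      if abs(egg.second_time - e.first_time) <= window]
        if not candidates:
            return None
        best = min(candidates,
                   key=lambda e: abs(egg.second_time - e.first_time))
        return best.hen_id

    # Hypothetical usage:
    events = [LayingEvent("hen-07", first_time=1000.0),
              LayingEvent("hen-12", first_time=1500.0)]
    egg = EggObservation("egg-001", second_time=1540.0)
    print(match_egg_to_hen(egg, events))  # -> "hen-12"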
2. The device of claim 1, wherein the level of quality is indicative of egg laying underperformance by the hen.
3. The device of claim 1, wherein the level of quality is indicative of an egg laying overperformance by the hen.
4. The device of claim 1, wherein the operations further comprise: in response to a determination that the hen lays eggs of the level of quality, sending positional data of the hen to a luminaire positioned above the poultry cage.
5. The device of claim 4, wherein the operations further comprise: in response to a determination that the hen has been removed from the poultry cage, transmitting a signal to the luminaire indicating that the poultry has been removed.
6. The device of claim 1, wherein the camera data comprises: top-facing camera data transmitted by a camera (120) that faces a top of the poultry cage; and side-facing camera data transmitted by a camera (125) that faces a side of the poultry cage.
7. The device of claim 1, wherein the laying behavior comprises pre-laying behavior and post-laying behavior.
8. The device of claim 1, wherein the one of the identifiable biometric characteristics relates to one or more of a comb of the poultry, and an overall size of the poultry.
9. A data processing method performed by a device (215) comprising a processor and memory, the method comprising:
   receiving camera data representative of images of a hen located in a cage (105), wherein the hen is one of a plurality of poultry, wherein the images comprise identifiable biometric characteristics of the hen, and wherein one of the identifiable biometric characteristics is correlated with egg laying performance (505);
   based on the identifiable biometric characteristics in the camera data, identifying the hen and storing identification data associated with the hen in a repository (510);
   determining, based on the camera data, whether the hen has engaged in a pattern of movement indicative of a laying behavior (515);
   determining, based on the camera data, a first time during which the pattern of movement occurs (520);
   detecting, based on the camera data, a presence of an egg (115) laid by one of the plurality of poultry (525);
   storing in the repository (220) egg data (230) representative of the egg (115), the egg data comprising quality data for the egg and a second time at which the egg is detected to be present (530);
   identifying the hen as the hen that laid the egg, based on an analysis that considers: the one of the identifiable biometric characteristics of the hen, the egg data, and a proximity in time between the first time and the second time (535); and
   associating the egg data with the identification data (540),
   the method further comprising:
      performing an iterative process, wherein the iterative process comprises examining the egg data and further egg data associated with the hen, wherein the further egg data was collected subsequent to the collection of the egg data; and
      determining, based on the iterative process, whether the hen laid eggs of a level of quality indicative of an egg-laying performance of the hen, wherein the level is based on one or more of: a size of the eggs laid by the hen, a color of the eggs, a shell quality, and a frequency at which the eggs are laid by the hen.
10. The method of claim 9, further comprising:
   in response to a determination that the hen lays eggs of the level of quality, sending positional data of the hen to a luminaire positioned above the cage; and
   in response to a determination that the hen has been removed from the cage, transmitting a signal to the luminaire indicating that the hen has been removed.
11. The method of claim 9, wherein the camera data comprises: top-facing camera data transmitted by a camera that faces the top of the cage; and side-facing camera data transmitted by a camera that faces a side of the cage.
12. The method of claim 9, wherein the one of the identifiable biometric characteristics relates to one or more of a comb of the poultry and an overall size of the poultry.
13. A luminaire (130), comprising:
   a network interface device through which the luminaire can receive communication signals from the data processing device according to claim 1;
   a light source;
   a camera;
   a processor; and
   a memory that stores executable instructions that, when executed by the processor, cause the luminaire to perform operations upon reception of a signal from the data processing device according to claim 1, comprising:
      receiving positional data related to a location of a hen in a cage, wherein the hen has been identified as performing at a level of performance based on an analysis of egg data associated with the hen (605), and
      using an image captured by the camera and the positional data, directing the light source to emanate a light beam to the location of the hen, wherein a beam width of the light beam is narrow enough to illuminate at least a portion of the hen (610).
14. The luminaire according to claim 13, wherein the light source is an incandescent light source, a fluorescent light source, a halogen-based light source, or a light emitting diode (LED).
15. The luminaire according to claim 13 or 14, comprising a plurality of light sources arranged to emit light of different colors.
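By way of illustration and not limitation, the beam-directing operation of claim 13 can be sketched as follows; the coordinate frame, mounting height, and function names below are assumptions for illustration only:

    # Illustrative sketch: convert a hen's position in the cage plane into
    # pan/tilt angles for a light source mounted overhead at the origin.
    import math

    def aim_light(hen_x, hen_y, mount_height=2.5):
        """Return (pan, tilt) in degrees; a tilt of 0 points straight down."""
        pan = math.degrees(math.atan2(hen_y, hen_x))      # heading in plane
        horizontal = math.hypot(hen_x, hen_y)             # offset from nadir
        tilt = math.degrees(math.atan2(horizontal, mount_height))
        return pan, tilt

    # Hypothetical example: hen 1.2 m and 0.5 m from the luminaire's nadir.
    pan, tilt = aim_light(1.2, 0.5)
    print(f"pan={pan:.1f} deg, tilt={tilt:.1f} deg")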
PCT/EP2021/073060 2020-08-25 2021-08-19 Identifying poultry associated with eggs of a quality WO2022043187A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202063070039P 2020-08-25 2020-08-25
US63/070039 2020-08-25
EP20197462.3 2020-09-22
EP20197462 2020-09-22

Publications (1)

Publication Number Publication Date
WO2022043187A1 true WO2022043187A1 (en) 2022-03-03

Family

ID=77543519

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2021/073060 WO2022043187A1 (en) 2020-08-25 2021-08-19 Identifying poultry associated with eggs of a quality

Country Status (1)

Country Link
WO (1) WO2022043187A1 (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200170219A1 (en) 2016-02-19 2020-06-04 International Business Machines Corporation Unmanned aerial vehicle for generating geolocation exclusion zones
WO2020009578A1 (en) * 2018-07-05 2020-01-09 Stichting Wageningen Research Method and system for grading hens in a flock

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114831047B (en) * 2022-05-25 2024-01-30 田东前位畜牧科技有限公司 Efficient egg picking method for henhouse
CN115979339A (en) * 2022-12-07 2023-04-18 吉林农业科技学院 Laying hen breeding environment intelligent supervision system based on big data analysis
CN115979339B (en) * 2022-12-07 2023-08-15 吉林农业科技学院 Intelligent monitoring system for laying hen breeding environment based on big data analysis
CN116584409A (en) * 2023-05-23 2023-08-15 江苏省农业科学院 Automatic collection and intelligent marking equipment for eggs
CN116584409B (en) * 2023-05-23 2024-05-24 江苏省农业科学院 Automatic collection and intelligent marking equipment for eggs

Similar Documents

Publication Publication Date Title
WO2022043187A1 (en) Identifying poultry associated with eggs of a quality
Wurtz et al. Recording behaviour of indoor-housed farm animals automatically using machine vision technology: A systematic review
Zhuang et al. Development of an early warning algorithm to detect sick broilers
Zhuang et al. Detection of sick broilers by digital image processing and deep learning
Okinda et al. A machine vision system for early detection and prediction of sick birds: A broiler chicken model
Mortensen et al. Weight prediction of broiler chickens using 3D computer vision
Fang et al. Pose estimation and behavior classification of broiler chickens based on deep neural networks
Neethirajan ChickTrack–a quantitative tracking tool for measuring chicken activity
US10058076B2 (en) Method of monitoring infectious disease, system using the same, and recording medium for performing the same
Wario et al. Automatic methods for long-term tracking and the detection and decoding of communication dances in honeybees
Ojo et al. Internet of Things and Machine Learning techniques in poultry health and welfare management: A systematic literature review
Subedi et al. Tracking pecking behaviors and damages of cage-free laying hens with machine vision technologies
Li et al. Analysis of feeding and drinking behaviors of group-reared broilers via image processing
EP3284011A1 (en) Two-dimensional infrared depth sensing
BR112015011892B1 (en) METHOD IN A LEARNING MACHINE SYSTEM AND LEARNING MACHINE APPARATUS
Abd Aziz et al. A review on computer vision technology for monitoring poultry Farm—Application, hardware, and software
Jukan et al. Fog-to-cloud computing for farming: low-cost technologies, data exchange, and animal welfare
Subedi et al. Tracking floor eggs with machine vision in cage-free hen houses
Balachandar et al. Internet of Things based reliable real-time disease monitoring of poultry farming imagery analytics
Wang et al. Evaluation of a laying-hen tracking algorithm based on a hybrid support vector machine
de Alencar Nääs et al. Lameness prediction in broiler chicken using a machine learning technique
Lamping et al. ChickenNet-an end-to-end approach for plumage condition assessment of laying hens in commercial farms using computer vision
US20220343621A1 (en) Tracking system for identification of subjects
Witte et al. Evaluation of deep learning instance segmentation models for pig precision livestock farming
Gourisaria et al. Chicken Disease Multiclass Classification Using Deep Learning

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21762714; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 21762714; Country of ref document: EP; Kind code of ref document: A1)