CN116451046B - Pet state analysis method, device, medium and equipment based on image recognition - Google Patents
- Publication number: CN116451046B (application CN202310734138.3A)
- Authority: CN (China)
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06F18/20 — Pattern recognition; Analysing
- A01K29/00 — Animal husbandry; Other apparatus for animal husbandry
- G06F18/253 — Pattern recognition; Fusion techniques of extracted features
- G06V10/774 — Image or video recognition or understanding using machine learning; Generating sets of training patterns, e.g. bagging or boosting
- G06V40/10 — Recognition of animal-related patterns; Human or animal bodies or body parts
- G16H50/30 — Healthcare informatics; ICT for calculating health indices or individual health risk assessment
Abstract
The invention relates to a pet state analysis method, device, medium and equipment based on image recognition, belonging to the technical field of intelligent pet feeding. The technical scheme mainly comprises the following steps: collecting a first pet moving image and performing appeal annotation to construct a first training data set, and performing model training based on the first training data set; collecting a second pet moving image, pet information and body temperature information after the pet appeal has been responded to, and performing health state labeling to construct a second training data set, and performing model training based on the second training data set; registering the pet's information, and acquiring the pet moving image and body temperature information in real time; inputting the first pet moving image into the appeal analysis model to obtain the pet appeal, and controlling pet equipment to act in response to the pet appeal; and inputting the second pet moving image, the pet information, the pet appeal and the body temperature information into the health state analysis model to output a pet health state analysis result.
Description
Technical Field
The invention belongs to the technical field of intelligent pet feeding, and particularly relates to a pet state analysis method, device, medium and equipment based on image recognition.
Background
In the past, a pet was simply an animal kept by a person to relieve boredom or for entertainment. Nowadays a pet is defined as an animal or plant kept for non-economic, that is, emotional, purposes. Pets are usually kept for companionship or amusement, and have traditionally been mammals or birds, because these animals have well-developed brains and communicate easily with people. Today there are also many virtual pets in addition to the familiar living ones. Among all animal pets, dogs and cats are the most popular.
At present, in the raising of pets, existing pet activity analysis systems can only work with an electric-shock collar on the pet's neck to confine the pet's range of motion, limit its barking volume and detect its body temperature. Such systems are single-purpose and crude; they cannot derive different treatment measures from the habits of different pets, so they are poorly suited to private keeping in high-rise buildings and to pet stores, and their range of application is narrow.
The invention aims to solve the prior-art problem that, when analysis is performed according to the pet's habits, the analysis of the pet's appeals and state is inaccurate.
Disclosure of Invention
In view of the above analysis, the embodiments of the present invention aim to provide a pet status analysis method, device, medium and apparatus based on image recognition, so as to solve the prior-art problem that, when analysis is performed according to the pet's habits, the analysis of the pet's appeals and state is inaccurate.
An embodiment of a first aspect of the present invention provides a pet status analysis method based on image recognition, including the steps of:
collecting a first pet moving image and performing appeal annotation to construct a first training data set, and performing model training based on the first training data set to obtain an appeal analysis model, wherein the first pet moving image is a pet moving image captured before the pet appeal is responded to;
collecting a second pet moving image, pet information and body temperature information after the pet appeal has been responded to, and performing health state labeling to construct a second training data set, and performing model training based on the second training data set to obtain a health state analysis model;
registering pet information of a pet, and acquiring the pet moving image and the body temperature information in real time, wherein the pet information comprises pet age groups and pet social relationship information;
inputting the first pet moving image into the appeal analysis model to obtain the pet appeal, and controlling pet equipment to act in response to the pet appeal;
and inputting the second pet moving image after the response, the pet information, the pet appeal and the body temperature information into the health state analysis model to output a pet health state analysis result.
In some embodiments, the collecting of the first pet moving image and the appeal annotation to construct a first training data set, and the model training based on the first training data set to obtain an appeal analysis model, comprise:
constructing a first training data set, wherein the first training data set comprises a first pet moving image and a corresponding appeal label;
inputting the first pet moving image into the appeal analysis model for appeal analysis, and determining a first loss value based on the difference between the appeal label and the appeal analysis result;
training the appeal analysis model according to the first loss value, wherein the trained appeal analysis model is used for judging the pet appeal from the first pet moving image.
In some embodiments, the constructing the first training data set includes:
collecting a first pet moving image, wherein the first pet moving image is obtained by a shooting device capturing the pet's actions before the pet appeal is issued, and the pet appeal comprises at least one of feeding, drinking water, toilet or temperature adjustment;
labeling the first pet moving image according to the pet appeal that correspondingly occurs in it, to form an appeal label;
and constructing the first training data set according to the first pet moving image and the appeal label.
In some embodiments, the health state analysis model includes an image feature extraction module, an information feature extraction module, a body temperature feature extraction module, and a feature fusion module;
the collecting of the second pet moving image, pet information and body temperature information after the pet appeal is responded to, and the health state labeling to construct a second training data set, and the model training based on the second training data set to obtain a health state analysis model, include:
constructing a second training data set, wherein the second training data set comprises input data and health status labels, and the input data comprises a second pet moving image, corresponding pet appeal and pet information and body temperature information of a pet displayed in the second pet moving image;
inputting the second pet moving image into the image feature extraction module to extract image features, inputting the pet information and the pet appeal into the information feature extraction module to extract information features, and inputting the body temperature information into the body temperature feature extraction module to extract body temperature features;
inputting the image features, the information features and the body temperature features into the feature fusion module to obtain fusion features;
and carrying out health state analysis based on the fusion feature, and determining a second loss value based on the difference between the health state analysis result and the health state label.
It should be understood that, in this embodiment, the feature extraction methods for the image, information and body temperature features and the feature fusion method all adopt mature prior-art methods, which are not described here again.
Training the health state analysis model according to the second loss value, wherein the trained health state analysis model is used for outputting the probability that the pet is healthy from the second pet moving image, the pet information, the pet appeal and the body temperature information.
In some embodiments, the constructing the second training data set comprises:
collecting a second pet moving image and the corresponding pet appeal, wherein the second pet moving image is obtained by a shooting device capturing the pet's behavior after the pet appeal has been responded to, and the pet appeal comprises at least one of feeding, drinking water, toilet or temperature adjustment;
acquiring the pet information, including the pet's age group and its social relationship network;
acquiring the body temperature information, wherein the body temperature information comprises a temperature curve obtained by infrared detection equipment measuring the pet's body temperature after the pet appeal has been responded to;
And packaging the second pet moving image, the pet appeal, the pet information and the body temperature information by taking the affiliated pet as a unit to generate a training data packet, and labeling the health state of the training data packet to obtain a health state label.
And constructing the second training data set according to the training data packets and the corresponding health state labels.
In some embodiments, the pet equipment includes one or more of a feeder, a water dispenser, a pet litter box and an air conditioner.
In some embodiments, the method further comprises: comparing the health probability with a preset threshold, and if the probability is smaller than the preset threshold, reminding the owner to seek medical attention for the pet.
An embodiment of the second aspect of the present invention provides a pet status analysis device based on image recognition, including:
the appeal model training module, used for collecting a first pet moving image and performing appeal labeling to construct a first training data set, and performing model training based on the first training data set to obtain an appeal analysis model, wherein the first pet moving image is a pet moving image captured before the pet appeal is responded to;
the state model training module is used for acquiring the second pet moving image, pet information and body temperature information after responding to the pet appeal, and carrying out health state labeling to construct a second training data set, and carrying out model training based on the second training data set to obtain a health state analysis model;
the information registration module is used for registering pet information of the pet and acquiring the pet moving image and the body temperature information in real time, wherein the pet information comprises pet age groups and pet social relationship information;
an appeal analysis module, used for inputting the first pet moving image into the appeal analysis model to obtain the pet appeal and controlling pet equipment to act in response to the pet appeal;
and a state analysis module, used for inputting the second pet moving image after the response, the pet information, the pet appeal and the body temperature information into the health state analysis model to output a pet health state analysis result.
An embodiment of a third aspect of the present invention provides an electronic device, including a memory and a processor, the memory storing a computer program which, when executed by the processor, implements the pet state analysis method based on image recognition as described in any of the embodiments above.
An embodiment of a fourth aspect of the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the image recognition-based pet state analysis method of any of the embodiments above.
The above embodiment of the invention has at least the following advantages:
1. The first training data set, constructed from images of the pet's behavior when it issues an appeal, enables the appeal analysis model to learn that behavior; the health state analysis model then judges the pet's current health state from its behavior after the response, combined with its own temperature and information.
2. By extracting features of the age group and the social relationship circle, both of which can influence the pet's health, and fusing them with the features of the pet's behavior and body temperature information after the pet appeal has been responded to, the method and device can predict changes in the pet's health state.
Drawings
In order to more clearly illustrate the embodiments of the present description or the technical solutions in the prior art, the drawings required in the embodiments or in the description of the prior art are briefly introduced below. Obviously, the drawings described below are only some of the embodiments of the present description; a person of ordinary skill in the art could obtain other drawings from them.
Fig. 1 is a schematic flow chart of a pet state analysis method based on image recognition according to an embodiment of the first aspect of the present invention;
FIG. 2 is a flowchart of a training method for the appeal analysis model according to an embodiment of the first aspect of the present invention;
FIG. 3 is a flowchart of a training method of a health status analysis model according to an embodiment of the first aspect of the present invention;
fig. 4 is a schematic diagram of a pet status analysis device based on image recognition according to an embodiment of the second aspect of the present invention;
fig. 5 is a schematic diagram of an electronic device architecture according to an embodiment of the third aspect of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. It should be noted that embodiments and features of embodiments in the present disclosure may be combined, separated, interchanged, and/or rearranged with one another without conflict. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
The pet state analysis method based on image recognition provided by the embodiment of the first aspect of the present invention is described below by way of specific embodiments. Referring to fig. 1, a pet status analysis method based on image recognition according to an embodiment of the first aspect of the present invention includes:
step one, collecting a first pet moving image and performing appeal annotation to construct a first training data set, and performing model training based on the first training data set to obtain a appeal analysis model, wherein the first pet moving image is the pet moving image before responding to pet appeal.
It should be understood that the first pet moving image may be a pet behavior image captured by a camera, a monitoring device or the like. Each appeal behavior the pet exhibits when it wants to eat, drink, use the toilet, or when its body temperature rises is captured continuously and from multiple angles to form an appeal image set, and this continuously ordered image set serves as the first pet moving image, so that it contains the information of the appeal behavior, for example the pet moving back and forth in a particular area and opening its mouth when it wants to eat or use the toilet. Training the appeal analysis model with the first training data set enables the model to learn the appeal contained in the pet's behavior.
Preferably, as shown in fig. 2, in some embodiments, the collecting of the first pet moving image and the appeal annotation to construct a first training data set, and the model training based on the first training data set to obtain the appeal analysis model, include:
and constructing the first training data set, wherein the first training data set comprises a first pet moving image and a corresponding appeal label.
Inputting the first pet moving image into the appeal analysis model for appeal analysis, and determining a first loss value based on the difference between the appeal label and the appeal analysis result.
Training the appeal analysis model according to the first loss value, wherein the trained appeal analysis model is used for judging the pet appeal from the first pet moving image.
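The patent describes the first loss value only as the difference between the appeal label and the appeal analysis result; it does not name a loss function. As a hedged sketch, a standard cross-entropy over the model's predicted appeal probabilities is one plausible choice:

```python
import math

def first_loss_value(predicted_probs, label_index):
    """Cross-entropy between a one-hot appeal label and predicted class
    probabilities -- an assumed stand-in for the patent's 'first loss value'."""
    return -math.log(predicted_probs[label_index])

# A confident correct prediction gives a smaller loss than a wrong one,
# which is what drives training toward the labelled appeal.
probs = [0.05, 0.90, 0.05]                      # model output over three appeal classes
loss_correct = first_loss_value(probs, label_index=1)
loss_wrong = first_loss_value(probs, label_index=0)
```

Minimising this value over the first training data set is what "training the appeal analysis model according to the first loss value" amounts to under this assumption.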
Wherein, in relation to said constructing said first training data set, in some embodiments, comprises:
the method comprises the steps of collecting a first pet moving image, wherein the first pet moving image is obtained by shooting a pet action before issuing a pet appeal through shooting equipment, and the pet appeal comprises at least one of feeding, drinking, toilet or temperature adjustment.
And labeling the first pet moving image according to the pet appeal which occurs corresponding to the obtained first pet moving image to form a appeal label.
And constructing the first training data set according to the first pet moving image and the appeal label.
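The construction above can be sketched as pairing each pet moving image (a continuously ordered list of frame references) with its appeal label. All function and field names below are illustrative assumptions, not terms from the patent:

```python
# The four appeal classes named in the text.
APPEALS = ("feeding", "drinking water", "toilet", "temperature adjustment")

def build_first_training_set(annotated_clips):
    """annotated_clips: iterable of (frame_list, appeal_name) pairs."""
    dataset = []
    for frames, appeal in annotated_clips:
        if appeal not in APPEALS:
            raise ValueError(f"unknown appeal label: {appeal!r}")
        # store the appeal label as a class index for model training
        dataset.append({"frames": list(frames), "label": APPEALS.index(appeal)})
    return dataset

# Hypothetical frame file names for two annotated clips.
clips = [(["dog01_f000.jpg", "dog01_f001.jpg"], "feeding"),
         (["dog01_f100.jpg"], "drinking water")]
first_training_set = build_first_training_set(clips)
```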
And secondly, acquiring a second pet moving image, pet information and body temperature information after responding to the pet appeal, and marking the health state to construct a second training data set, and performing model training based on the second training data set to obtain a health state analysis model.
It should be understood that in the prior art the health status of a pet is often judged directly from information such as its body temperature. On the one hand, this judgment of the pet's health status is not accurate enough; on the other hand, a problem can only be discovered once the pet's health has already been damaged, so it cannot be caught early for prevention.
In the embodiment of the invention, after the pet issues an appeal, the appeal is responded to; for example, food is dispensed to the pet when it is judged that the pet needs to eat, or the water dispenser is opened when it needs to drink. The pet's behavior after the response is then collected: if, for example, the pet clearly issued an eating appeal but does not eat or drink after the food or drinking water has been provided, this indicates some other hidden problem with the pet's body. By further combining the pet's information and its temperature information, and judging the characteristics of the temperature curve, the pet's health state can be predicted and analyzed, helping the pet owner take care of the pet or send it for medical attention more promptly.
Preferably, as shown in fig. 3, in some embodiments, the health status analysis model includes an image feature extraction module, an information feature extraction module, a body temperature feature extraction module, and a feature fusion module;
the collecting of the second pet moving image, pet information and body temperature information after the pet appeal is responded to, and the health state labeling to construct a second training data set, and the model training based on the second training data set to obtain a health state analysis model, include:
constructing a second training data set, wherein the second training data set comprises input data and health status labels, and the input data comprises a second pet moving image, corresponding pet appeal and pet information and body temperature information of a pet displayed in the second pet moving image;
inputting the second pet moving image into the image feature extraction module to extract image features, inputting the pet information and the pet appeal into the information feature extraction module to extract information features, and inputting the body temperature information into the body temperature feature extraction module to extract body temperature features;
inputting the image features, the information features and the body temperature features into the feature fusion module to obtain fusion features;
performing a health state analysis based on the fusion feature, determining a second loss value based on a difference between the health state analysis result and the health state label;
training the health state analysis model according to the second loss value, wherein the trained health state analysis model is used for outputting the probability that the pet is healthy from the second pet moving image, the pet information, the pet appeal and the body temperature information.
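The data flow through the four modules can be sketched as follows. The real extraction and fusion modules are, per the text, mature prior-art methods (e.g. learned networks); the stubs below only fix the shape of the data passing between modules and are entirely illustrative:

```python
def image_features(frames):
    # stand-in for the image feature extraction module
    return [float(len(frames))]

def info_features(pet_info, appeal_index):
    # stand-in for the information feature extraction module
    return [float(pet_info["age_group"]), float(appeal_index)]

def temperature_features(curve):
    # stand-in for the body temperature feature extraction module:
    # summarises the temperature curve by its mean and range
    return [sum(curve) / len(curve), max(curve) - min(curve)]

def fuse(*vectors):
    # feature fusion by concatenation, one common prior-art choice
    fused = []
    for v in vectors:
        fused.extend(v)
    return fused

fused = fuse(image_features(["f0", "f1", "f2"]),
             info_features({"age_group": 1}, appeal_index=0),
             temperature_features([38.2, 38.6, 39.0]))
```

A health-state classifier would then map the fused vector to the probability that the pet is healthy.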
In some embodiments, the constructing the second training data set comprises:
collecting a second pet moving image and the corresponding pet appeal, wherein the second pet moving image is obtained by a shooting device capturing the pet's behavior after the pet appeal has been responded to, and the pet appeal comprises at least one of feeding, drinking water, toilet or temperature adjustment;
acquiring the pet information, including the pet's age group and its social relationship network;
acquiring the body temperature information, wherein the body temperature information comprises a temperature curve obtained by infrared detection equipment measuring the pet's body temperature after the pet appeal has been responded to;
packaging the second pet moving image, the pet appeal, the pet information and the body temperature information by taking the affiliated pet as a unit to generate a training data packet, and marking the health state of the training data packet to obtain a health state label;
and constructing the second training data set according to the training data packets and the corresponding health state labels.
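Packaging per pet, as described above, can be sketched as bundling the post-response inputs with the health state label. Field names and label values are assumptions for illustration:

```python
def make_training_packet(pet_id, moving_image, appeal, pet_info,
                         temperature_curve, health_label):
    """One training data packet for one pet, per the second training data set."""
    return {
        "pet_id": pet_id,
        "inputs": {
            "moving_image": moving_image,
            "appeal": appeal,
            "pet_info": pet_info,
            "temperature_curve": temperature_curve,
        },
        "health_label": health_label,   # assumed encoding: 1 = healthy, 0 = unhealthy
    }

packet = make_training_packet(
    pet_id="cat_007",
    moving_image=["cat007_f000.jpg", "cat007_f001.jpg"],
    appeal="drinking water",
    pet_info={"age_group": "young", "social_circle": ["cat_003"]},
    temperature_curve=[38.4, 38.5, 38.5],
    health_label=1,
)
```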
In some embodiments, the pet equipment includes one or more of a feeder, a water dispenser, a pet litter box and an air conditioner. For example, if the pet needs the room temperature adjusted, its behavior before issuing the appeal is linked to a temperature-adjustment appeal, and indoor equipment such as an air conditioner can accordingly be controlled to adjust the room temperature.
And thirdly, registering pet information of the pet, and acquiring the pet moving image and the body temperature information in real time, wherein the pet information comprises pet age groups and pet social relationship information.
Preferably, the pet information comprises the pet's age group, divided into young, middle-aged and old, with the specific age ranges set according to the type of pet. The pet information also comprises pet social relationship information: through the pet's social relationships, other pets that it frequently contacts in its living environment, together with their health information, which may influence the pet's own health, can be obtained. By extracting features of the age group and the social relationship circle, both of which can influence the pet's health, and fusing them with the features of the pet's behavior and body temperature information after the pet appeal has been responded to, changes in the pet's health state can be predicted.
For example, in some embodiments, the pet population of a family or community contains several animal species, such as cats and dogs, and several social groups within each species, divided for example by living area, owner relationship or family relationship between pets. In this embodiment, the health states of other pets with a high probability of contact are considered to be strongly related, so the health state of a pet is judged comprehensively by extracting health features of the pets in its social relationship circle.
And step four, inputting the first pet moving image into the appeal analysis model to obtain the pet appeal, and controlling the pet equipment to act so as to respond to the pet appeal.
Preferably, in some embodiments, the pet equipment includes one or more of a feeder, a water dispenser, a pet litter box and an air conditioner. The pet equipment is communicatively connected to the appeal analysis model and is thereby controlled to respond to the pet appeal.
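A hypothetical dispatch from a recognised appeal to the equipment that answers it, following the device list in the text; the returned string is a placeholder for a real control command sent over the communication link:

```python
DEVICE_FOR_APPEAL = {
    "feeding": "feeder",
    "drinking water": "water dispenser",
    "toilet": "pet litter box",
    "temperature adjustment": "air conditioner",
}

def respond_to_appeal(appeal):
    """Map the appeal analysis result to a (placeholder) equipment command."""
    device = DEVICE_FOR_APPEAL.get(appeal)
    if device is None:
        raise ValueError(f"no equipment registered for appeal: {appeal!r}")
    return f"activate: {device}"

command = respond_to_appeal("feeding")
```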
Step five, inputting the responded second pet moving image, the pet information, the pet appeal and the body temperature information into the health state analysis model to output a pet health state analysis result.
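The feature fusion described in the patent (image, information and body temperature features combined into one representation that yields a health probability) can be sketched minimally as concatenation followed by a logistic layer. In practice the weights would be learned from the second training data set; the fixed weights here are purely illustrative.

```python
import math

def fuse_features(image_feat, info_feat, temp_feat, weights, bias=0.0):
    """Concatenate the three feature vectors (the patent's fusion module)
    and map them to a health probability with a logistic layer.
    All inputs are plain Python lists of floats."""
    fused = image_feat + info_feat + temp_feat  # list concatenation
    z = sum(w * x for w, x in zip(weights, fused)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # probability that the pet is healthy
```

A real implementation would replace the hand-set weights with parameters trained against the second loss value, but the data flow (extract per-modality features, fuse, score) is the same.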
Preferably, in some embodiments, the method further comprises: comparing the output probability that the pet is healthy with a preset threshold, and if the probability is smaller than the preset threshold, issuing a reminder to take the pet to a veterinarian.
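The threshold comparison above is a one-line check; the default threshold value here is an illustrative assumption, since the patent only specifies that it is preset.

```python
def needs_vet_visit(health_prob, threshold=0.5):
    """Flag a veterinary visit when the model's health probability falls
    below the preset threshold, as in the reminder step above."""
    return health_prob < threshold
```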
An embodiment of the second aspect of the present invention provides a pet state analysis device based on image recognition, as shown in fig. 4, including:
the appeal model training module is used for collecting a first pet moving image and performing appeal labeling to construct a first training data set, and performing model training based on the first training data set to obtain an appeal analysis model, wherein the first pet moving image is a pet moving image before the pet appeal is responded to;
the state model training module is used for acquiring the second pet moving image, pet information and body temperature information after responding to the pet appeal, and carrying out health state labeling to construct a second training data set, and carrying out model training based on the second training data set to obtain a health state analysis model;
the information registration module is used for registering pet information of the pet and acquiring the pet moving image and the body temperature information in real time, wherein the pet information comprises pet age groups and pet social relationship information;
an appeal analysis module for inputting the first pet moving image into the appeal analysis model to obtain the pet appeal and controlling the pet equipment to act in response to the pet appeal;
and a state analysis module for inputting the responded second pet moving image, the pet information, the pet appeal and the body temperature information into the health state analysis model to output a pet health state analysis result.
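The five modules above can be sketched as a minimal orchestration pipeline. Every callable here is a placeholder standing in for a trained model or a device/communication layer; the function and parameter names are assumptions for the sketch, not names from the patent.

```python
# Hypothetical end-to-end flow of the device: appeal analysis -> equipment
# response -> post-response capture -> health state analysis.

def analyze_pet_state(first_image, registry, appeal_model, health_model,
                      control_device, capture_after_response):
    """Run the appeal -> response -> health analysis pipeline for one pet."""
    appeal = appeal_model(first_image)             # appeal analysis module
    control_device(appeal)                         # respond via pet equipment
    second_image, temp = capture_after_response()  # post-response image + temperature
    info = registry                                # registered pet information
    return health_model(second_image, info, appeal, temp)  # state analysis module
```

Splitting the pipeline this way lets each module (model inference, device control, sensing) be tested and swapped independently.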
An embodiment of a third aspect of the present invention provides an electronic device, as shown in fig. 5, including a memory and a processor, the memory storing a computer program which, when executed by the processor, implements the pet state analysis method based on image recognition according to any of the embodiments above.
An embodiment of a fourth aspect of the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the image recognition-based pet state analysis method of any of the embodiments above.
Computer-readable storage media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media does not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
Those skilled in the art will further appreciate that the various illustrative units and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or a combination of both. To clearly illustrate this interchangeability of hardware and software, the illustrative units and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the technical solution.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. The software module may reside in random access memory (RAM), flash memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The foregoing description of the embodiments illustrates the general principles of the invention and is not intended to limit the invention to the particular embodiments disclosed; any modifications, equivalent substitutions, improvements, and the like made within the spirit and principles of the invention are intended to be included within the scope of the invention.
Claims (8)
1. A pet state analysis method based on image recognition, characterized by comprising the following steps:
collecting a first pet moving image and performing appeal labeling to construct a first training data set, and performing model training based on the first training data set to obtain an appeal analysis model, wherein the first pet moving image is a pet moving image before the pet appeal is responded to;
collecting a second pet moving image, pet information and body temperature information after responding to the pet appeal, and marking the health state to construct a second training data set, and performing model training based on the second training data set to obtain a health state analysis model; the health state analysis model comprises an image feature extraction module, an information feature extraction module, a body temperature feature extraction module and a feature fusion module;
the steps of collecting the second pet moving image, the pet information and the body temperature information after the pet appeal is responded to, performing health state labeling to construct a second training data set, and performing model training based on the second training data set to obtain the health state analysis model comprise:
constructing a second training data set, wherein the second training data set comprises input data and health state labels, and the input data comprises a second pet moving image, the corresponding pet appeal, and the pet information and body temperature information of the pet displayed in the second pet moving image;
inputting the second pet moving image into the image feature extraction module to extract image features, inputting the pet information and the pet appeal into the information feature extraction module to extract information features, and inputting the body temperature information into the body temperature feature extraction module to extract body temperature features;
inputting the image features, the information features and the body temperature features into the feature fusion module to obtain fusion features;
performing a health state analysis based on the fusion feature, determining a second loss value based on a difference between the health state analysis result and the health state label;
training the health state analysis model according to the second loss value, wherein the trained health state analysis model is used for outputting the probability that the pet is healthy according to the second pet moving image, the pet information, the pet appeal and the body temperature information;
the constructing the second training data set includes:
collecting a second pet moving image and a pet appeal, wherein the second pet moving image is obtained by shooting the behavior of the pet through a shooting device after the pet appeal is responded, and the pet appeal comprises at least one of feeding, drinking water, toilet or temperature adjustment;
acquiring the pet information, including acquiring a pet age bracket and a pet social relationship network;
acquiring the body temperature information, wherein the body temperature information comprises a temperature curve obtained by detecting the pet's body temperature through infrared detection equipment after the pet appeal is responded to;
packaging the second pet moving image, the pet appeal, the pet information and the body temperature information on a per-pet basis to generate a training data packet, and performing health state labeling on the training data packet to obtain a health state label;
constructing the second training data set according to a plurality of the training data packets and the corresponding health state labels;
registering pet information of a pet, and acquiring the pet moving image and the body temperature information in real time, wherein the pet information comprises pet age groups and pet social relationship information, and the pet social relationship information comprises health information of other pets;
inputting the first pet moving image into the appeal analysis model to obtain the pet appeal and controlling pet equipment action to respond to the pet appeal;
and inputting the responded second pet moving image, the pet information, the pet appeal and the body temperature information into the health state analysis model to output a pet health state analysis result.
2. The pet state analysis method based on image recognition according to claim 1, wherein the steps of collecting a first pet moving image and performing appeal labeling to construct a first training data set, and performing model training based on the first training data set to obtain an appeal analysis model comprise:
constructing a first training data set, wherein the first training data set comprises a first pet moving image and a corresponding appeal label;
inputting the first pet moving image into the appeal analysis model for appeal analysis, and determining a first loss value based on the difference between the appeal label and the appeal analysis result;
training the appeal analysis model according to the first loss value, wherein the trained appeal analysis model is used for judging the pet appeal according to the first pet moving image.
3. The pet state analysis method based on image recognition according to claim 2, wherein: said constructing said first training data set comprises:
collecting a first pet moving image, wherein the first pet moving image is obtained by shooting a pet action by shooting equipment before a pet appeal is sent out, and the pet appeal comprises at least one of feeding, drinking water, toilet or temperature adjustment;
labeling the first pet moving image according to the pet appeal which correspondingly occurs in the first pet moving image to form a appeal label;
and constructing the first training data set according to the first pet moving image and the appeal label.
4. The pet state analysis method based on image recognition according to claim 1, wherein the pet equipment comprises one or a combination of more of a feeder, a water dispenser, a pet toilet and an air conditioner.
5. The pet state analysis method based on image recognition according to claim 1, further comprising: comparing the probability with a preset threshold, and if the probability is smaller than the preset threshold, issuing a reminder to take the pet for medical attention.
6. A pet state analysis device based on image recognition, comprising:
the appeal model training module is used for collecting a first pet moving image and performing appeal labeling to construct a first training data set, and performing model training based on the first training data set to obtain an appeal analysis model, wherein the first pet moving image is a pet moving image before the pet appeal is responded to;
the state model training module is used for acquiring the second pet moving image, pet information and body temperature information after responding to the pet appeal, and carrying out health state labeling to construct a second training data set, and carrying out model training based on the second training data set to obtain a health state analysis model; the health state analysis model comprises an image feature extraction module, an information feature extraction module, a body temperature feature extraction module and a feature fusion module;
the steps of collecting the second pet moving image, the pet information and the body temperature information after the pet appeal is responded to, performing health state labeling to construct a second training data set, and performing model training based on the second training data set to obtain the health state analysis model comprise:
constructing a second training data set, wherein the second training data set comprises input data and health state labels, and the input data comprises a second pet moving image, the corresponding pet appeal, and the pet information and body temperature information of the pet displayed in the second pet moving image;
inputting the second pet moving image into the image feature extraction module to extract image features, inputting the pet information and the pet appeal into the information feature extraction module to extract information features, and inputting the body temperature information into the body temperature feature extraction module to extract body temperature features;
inputting the image features, the information features and the body temperature features into the feature fusion module to obtain fusion features;
performing a health state analysis based on the fusion feature, determining a second loss value based on a difference between the health state analysis result and the health state label;
training the health state analysis model according to the second loss value, wherein the trained health state analysis model is used for outputting the probability that the pet is healthy according to the second pet moving image, the pet information, the pet appeal and the body temperature information;
the constructing the second training data set includes:
collecting a second pet moving image and a pet appeal, wherein the second pet moving image is obtained by shooting the behavior of the pet through a shooting device after the pet appeal is responded, and the pet appeal comprises at least one of feeding, drinking water, toilet or temperature adjustment;
acquiring the pet information, including acquiring a pet age bracket and a pet social relationship network;
acquiring the body temperature information, wherein the body temperature information comprises a temperature curve obtained by detecting the pet's body temperature through infrared detection equipment after the pet appeal is responded to;
packaging the second pet moving image, the pet appeal, the pet information and the body temperature information on a per-pet basis to generate a training data packet, and performing health state labeling on the training data packet to obtain a health state label;
constructing the second training data set according to a plurality of the training data packets and the corresponding health state labels;
the information registration module is used for registering pet information of the pets, acquiring the pet moving images and the body temperature information in real time, wherein the pet information comprises pet age groups and pet social relationship information, and the pet social relationship information comprises health information of other pets;
an appeal analysis module for inputting the first pet moving image into the appeal analysis model to obtain the pet appeal and controlling pet equipment to act in response to the pet appeal;
and a state analysis module for inputting the responded second pet moving image, the pet information, the pet appeal and the body temperature information into the health state analysis model to output a pet health state analysis result.
7. An electronic device comprising a memory and a processor, the memory storing a computer program that when executed by the processor implements the image recognition-based pet state analysis method of any one of claims 1-5.
8. A computer readable storage medium, having stored thereon a computer program which when executed by a processor implements the image recognition based pet state analysis method of any of claims 1-5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310734138.3A CN116451046B (en) | 2023-06-20 | 2023-06-20 | Pet state analysis method, device, medium and equipment based on image recognition |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116451046A CN116451046A (en) | 2023-07-18 |
CN116451046B true CN116451046B (en) | 2023-09-05 |
Family
ID=87136065
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310734138.3A Active CN116451046B (en) | 2023-06-20 | 2023-06-20 | Pet state analysis method, device, medium and equipment based on image recognition |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116451046B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117952230A (en) * | 2024-03-14 | 2024-04-30 | 广州佳可电子科技股份有限公司 | Pet behavior analysis method, device, equipment and medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA2634452A1 (en) * | 2005-12-20 | 2007-07-05 | Mars, Incorporated | Method and system for determining and providing a comprehensive pet health and nutrition feeding plan |
CN109314762A (en) * | 2016-06-13 | 2019-02-05 | 深圳市智美达科技股份有限公司 | Pet monitoring system and method |
CN112134949A (en) * | 2020-09-22 | 2020-12-25 | 珠海格力电器股份有限公司 | Pet hosting method, device and system |
CN114097645A (en) * | 2020-08-26 | 2022-03-01 | 中移动信息技术有限公司 | Training method, device, equipment and storage medium of pet health model |
CN114270448A (en) * | 2019-06-26 | 2022-04-01 | 马斯公司 | System and method for pet health assessment |
CN114694842A (en) * | 2022-03-29 | 2022-07-01 | 深圳市优必选科技股份有限公司 | Pet health monitoring method and device, computer equipment and readable storage medium |
2023-06-20 CN CN202310734138.3A patent/CN116451046B/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN116451046A (en) | 2023-07-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Van Hertem et al. | Appropriate data visualisation is key to Precision Livestock Farming acceptance | |
CN105554482B (en) | It is a kind of to reinforce the comprehensive management apparatus and method that owner contacts with pet | |
Berckmans | Precision livestock farming technologies for welfare management in intensive livestock systems | |
CN116451046B (en) | Pet state analysis method, device, medium and equipment based on image recognition | |
US8305220B2 (en) | Monitoring and displaying activities | |
CN103475736B (en) | Sensor communication system and method for conducting monitoring through same | |
CN111275911B (en) | Danger prompting method, equipment and computer readable storage medium | |
CN103488148A (en) | Intelligent livestock behavior monitoring system based on internet of things and computer vision | |
KR102396999B1 (en) | Cattle behavior automatic recognition and the monitoring system using the deep learning and method thereof | |
CN109951363A (en) | Data processing method, apparatus and system | |
Chen et al. | Monitoring the behaviours of pet cat based on YOLO model and raspberry Pi | |
CN110896871A (en) | Method and device for putting food and intelligent food throwing machine | |
Kim et al. | An approach for recognition of human's daily living patterns using intention ontology and event calculus | |
CN114258870B (en) | Unattended pet care method, unattended pet care system, storage medium and terminal | |
CN114667948A (en) | Intelligent pet feeding and accompanying system based on Internet of things | |
Pretto et al. | A novel low-cost visual ear tag based identification system for precision beef cattle livestock farming | |
Shu et al. | Determining the onset of heat stress in a dairy herd based on automated behaviour recognition | |
KR20190067616A (en) | Method and system for caring pets and plants | |
CN113728941B (en) | Intelligent pet dog domestication method and system | |
Eagan et al. | Behaviour Real-Time spatial tracking identification (BeRSTID) used for cat behaviour monitoring in an animal shelter | |
CN115423641A (en) | Live pig breeding remote detection and AI control equipment | |
Heseker et al. | Detecting tail biters by monitoring pig screams in weaning pigs | |
US10893243B1 (en) | Lawn violation detection | |
Hung et al. | Pet cat behavior recognition based on YOLO model | |
CN115731428A (en) | Intelligent park monitoring method, device and system and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
PE01 | Entry into force of the registration of the contract for pledge of patent right |
Denomination of invention: Pet state analysis method, device, medium, and equipment based on image recognition Effective date of registration: 20231117 Granted publication date: 20230905 Pledgee: Bank of Nanjing Limited by Share Ltd. Beijing branch Pledgor: KITTEN&PUPPY Co.,Ltd. Registration number: Y2023980066361 |