CN114283943A - Health prediction method and device, electronic equipment and storage medium - Google Patents

Health prediction method and device, electronic equipment and storage medium Download PDF

Info

Publication number
CN114283943A
Authority
CN
China
Prior art keywords
pet
health
nasal
nose
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111648180.0A
Other languages
Chinese (zh)
Inventor
彭永鹤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
New Ruipeng Pet Healthcare Group Co Ltd
Original Assignee
New Ruipeng Pet Healthcare Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by New Ruipeng Pet Healthcare Group Co Ltd filed Critical New Ruipeng Pet Healthcare Group Co Ltd
Priority to CN202111648180.0A priority Critical patent/CN114283943A/en
Publication of CN114283943A publication Critical patent/CN114283943A/en
Pending legal-status Critical Current


Landscapes

  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

Embodiments of the present application disclose a health prediction method and apparatus, an electronic device, and a storage medium. The method includes: acquiring basic identity information of a pet; determining target data corresponding to the pet according to the basic identity information; acquiring nose print information of the pet; and determining the health condition of the pet according to the nose print information and the target data. According to the embodiments of the present application, the health condition of the pet can be predicted from its nose print information.

Description

Health prediction method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of medical technology, and in particular, to a health prediction method, apparatus, electronic device, and storage medium.
Background
Pets may suffer from diseases due to environmental factors and the pet's own resistance. However, because the onset of a disease is usually a gradual process, the symptoms may not be obvious in the short term.
Therefore, how to predict the health condition of a pet so that it can receive timely medical care is an urgent problem to be solved.
Disclosure of Invention
Embodiments of the present application provide a health prediction method and apparatus, an electronic device, and a storage medium. The health prediction method can predict the health condition of a pet.
In a first aspect, an embodiment of the present application provides a health prediction method, including:
acquiring basic identity information of a pet;
determining target data corresponding to the pet according to the basic identity information;
acquiring nose print information of the pet;
and determining the health condition of the pet according to the nose print information and the target data.
In a second aspect, an embodiment of the present application provides a health prediction apparatus, including:
a first acquisition module, configured to acquire basic identity information of a pet;
a first determination module, configured to determine target data corresponding to the pet according to the basic identity information;
a second acquisition module, configured to acquire nose print information of the pet;
and a second determination module, configured to determine the health condition of the pet according to the nose print information and the target data.
In a third aspect, an embodiment of the present application provides an electronic device, including a memory storing executable program code and a processor coupled to the memory; the processor calls the executable program code stored in the memory to perform the steps of the health prediction method provided in the embodiments of the present application.
In a fourth aspect, an embodiment of the present application provides a storage medium storing a plurality of instructions, the instructions being adapted to be loaded by a processor to perform the steps of the health prediction method provided in the embodiments of the present application.
In the embodiments of the present application, the electronic device acquires basic identity information of a pet, determines target data corresponding to the pet according to the basic identity information, acquires nose print information of the pet, and determines the health condition of the pet according to the nose print information and the target data. Thus, the health condition of the pet can be predicted from its nose print information.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings used in the description of the embodiments are briefly introduced below. The following drawings show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a first flowchart of a health prediction method according to an embodiment of the present application.
Fig. 2 is a scene schematic diagram of acquiring nasal print information according to an embodiment of the present application.
Fig. 3 is a second flowchart of the health prediction method according to an embodiment of the present application.
Fig. 4 is a schematic structural diagram of a health prediction apparatus according to an embodiment of the present application.
Fig. 5 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments herein without creative effort shall fall within the protection scope of the present application.
Pets may suffer from diseases due to environmental factors and the pet's own resistance. However, because the onset of a disease is usually a gradual process, the symptoms may not be obvious in the short term.
Therefore, how to predict the health condition of a pet so that it can receive timely medical care is an urgent problem to be solved.
To solve the above technical problem, embodiments of the present application provide a health prediction method, a health prediction apparatus, an electronic device, and a storage medium. The health prediction method can predict the physical health condition of a pet.
The health prediction method can be applied to common electronic devices such as computers, mobile phones, and tablet computers, and is also suitable for wearable electronic devices such as smart glasses, smart watches, and smart rings. This is not limited herein.
Referring to fig. 1, fig. 1 is a first flowchart of a health prediction method according to an embodiment of the present disclosure. The health prediction method may include the steps of:
110. Acquire basic identity information of the pet.
In some embodiments, the electronic device may obtain basic identity information of the pet, the basic identity information including breed information, gender information, age information, and the like.
The electronic device can scan the pet with a camera to obtain a plurality of feature images of the pet, and then identify features of the pet from these images to determine its breed information. For example, whether the pet is a cat or a dog can be identified from its body shape, and the specific breed can then be identified from its coat color and pattern.
The electronic device can also determine the gender information of the pet from the feature images. For example, gender can be judged from the shape and size of the pet's head: a male cat typically has prominent jowls, a broad face, and a larger head, whereas a female cat lacks prominent jowls and has a smaller head.
The electronic device can also determine the age information of the pet from the feature images, for example, by evaluating the pet's tooth length, body height, and the like.
In some embodiments, the user may also manually input the basic identity information, such as breed, gender, and age.
120. Determine target data corresponding to the pet according to the basic identity information.
In some embodiments, after the basic identity information is determined, target data corresponding to the pet may be retrieved from a server or a database. The target data includes, for example, a target health database or health prediction model data.
For example, the target data and the basic identity information have a mapping relationship; after the electronic device acquires the basic identity information of the pet, it can look up the target data corresponding to that identity information in the database according to the mapping relationship.
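Purely as an illustrative sketch (the identity fields, key format, file names, and records below are hypothetical and not part of this application), the mapping relationship between basic identity information and target data can be viewed as a keyed lookup:
```python
# Minimal sketch of the identity-to-target-data mapping described above.
# All keys, record fields, and values are hypothetical examples.

def make_identity_key(breed: str, gender: str, age_years: float) -> tuple:
    """Normalize basic identity information into a lookup key."""
    # Bucket age into coarse ranges so that, e.g., 0-1 year olds share one model.
    age_bucket = "0-1" if age_years < 1 else "1-7" if age_years < 7 else "7+"
    return (breed.lower(), gender.lower(), age_bucket)

# Hypothetical mapping from identity key to target data: a reference to a slice
# of the target health database and to health prediction model data.
TARGET_DATA_INDEX = {
    ("ragdoll", "female", "0-1"): {"health_db": "ragdoll_f_0_1.db", "model_data": "ragdoll_f_0_1.params"},
    ("exotic shorthair", "male", "0-1"): {"health_db": "exotic_m_0_1.db", "model_data": "exotic_m_0_1.params"},
}

def lookup_target_data(breed: str, gender: str, age_years: float) -> dict | None:
    """Return the target data record for a pet, or None if no mapping exists."""
    return TARGET_DATA_INDEX.get(make_identity_key(breed, gender, age_years))
```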
In some embodiments, the physiological characteristics of a pet can reflect its physical health. For pets such as cats and dogs, the physiological condition of the nose reflects the animal's health; for example, whether the nose is dry or moist changes with the pet's physical condition.
When the pet's nose becomes dry or moist, the physiological features of the nose change slightly, such as the number and depth of nose print ridges in a given area of the nose. The electronic device can therefore acquire the pet's nose print information and determine the pet's health condition from it.
The target health database in the target data includes a first data mapping relationship between nose print information and health conditions. For example, when the depth of the pet's nose prints is within the normal range, the corresponding health condition is healthy; when the depth is outside the normal range, the corresponding health condition is poor.
In some embodiments, pets with different basic identity information correspond to different health prediction models and therefore to different health prediction model data. The health prediction model data and the basic identity information are related by a second data mapping relationship, and the electronic device can find the health prediction model data corresponding to given basic identity information according to this second mapping relationship.
For example, first health prediction model data may correspond to Garfield (Exotic Shorthair) cats aged 0-1 years, and second health prediction model data may correspond to Ragdoll cats aged 0-1 years; that is, pets with different basic identity information correspond to different health prediction model data.
130. Acquire nose print information of the pet.
Referring to fig. 2, fig. 2 is a schematic view of a scene for acquiring nose print information according to an embodiment of the present disclosure.
The electronic device S10 is provided with a sensor S11 that can scan the pet's nose to obtain its nose print information. The sensor S11 may be a laser sensor, an ultrasonic sensor, an image sensor, or the like, and the electronic device can scan the pet's nose from within a certain distance to obtain the nose print information.
Specifically, when scanning with laser light, the electronic device can scan the pet's nose with a ToF (time-of-flight) sensor. After the laser illuminates the nose, because different nose print ridges lie at different depths, the time for the laser to reach each ridge differs, and so does the time for the reflected signal to return to the laser sensor. A phase difference therefore arises between each emitted laser signal and its reflected signal, and the electronic device constructs a three-dimensional nose print image, i.e. a 3D nose print image, of the pet's nose prints from these phase differences.
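As a rough illustration only (the application does not specify the sensor model; the formula below is the common continuous-wave ToF relation d = c·Δφ/(4π·f), and the numbers are made up), a per-pixel phase difference can be converted into a depth map from which the 3D nose print image is built:
```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def tof_phase_to_depth(phase_diff: np.ndarray, mod_freq_hz: float) -> np.ndarray:
    """Convert per-pixel phase differences (radians) from a continuous-wave ToF
    sensor into distances in metres: d = c * phase / (4 * pi * f)."""
    return C * phase_diff / (4.0 * np.pi * mod_freq_hz)

# Example: a 2x2 patch of phase differences measured at 20 MHz modulation.
phases = np.array([[0.050, 0.052], [0.051, 0.049]])
depth_map = tof_phase_to_depth(phases, mod_freq_hz=20e6)  # metres from the sensor
# Subtracting the minimum gives the relative relief used for the 3D nose print image.
relief = depth_map - depth_map.min()
```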
The electronic device can also scan the pet's nose with ultrasonic waves. Because different nose print ridges lie at different depths, the propagation time of the ultrasonic waves to each ridge differs, and so does the time for each reflected signal to return to the ultrasonic sensor. A phase difference arises between each transmitted ultrasonic signal and its reflected signal, and the electronic device can construct a three-dimensional nose print image of the pet's nose prints from these phase differences.
The electronic device can determine the number and depth of the nose print ridges from the three-dimensional nose print image. For example, the image contains peaks and valleys, and the alternating peaks and valleys form the nose print. The number of nose print ridges can be determined from the number of valleys, i.e. the number of valleys may be taken as the number of ridges, and the depth from a peak to the adjacent valley is the depth of the corresponding ridge.
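The following is a minimal sketch (the toy height profile and the peak-to-valley pairing rule are illustrative assumptions) of counting valleys and measuring peak-to-valley depth along one scan line of the 3D nose print image:
```python
import numpy as np
from scipy.signal import find_peaks

def count_ridges_and_depths(height_profile: np.ndarray):
    """Count nose print ridges along one scan line of the 3D nose print image.
    Valleys are local minima of the height profile; the number of valleys is
    taken as the ridge count, and each depth is measured from the preceding peak."""
    peaks, _ = find_peaks(height_profile)        # local maxima (crests)
    valleys, _ = find_peaks(-height_profile)     # local minima (troughs)
    depths = []
    for v in valleys:
        prior_peaks = peaks[peaks < v]
        if prior_peaks.size:                     # depth = peak height - valley height
            depths.append(height_profile[prior_peaks[-1]] - height_profile[v])
    return len(valleys), depths

profile = np.array([0.20, 0.50, 0.10, 0.60, 0.15, 0.55, 0.20])  # toy heights (mm)
n_ridges, ridge_depths = count_ridges_and_depths(profile)
```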
The electronic device can also capture images of the pet's nose with a camera, optimize these images to obtain a plurality of optimized images, derive a nose print image of the pet from the optimized images, and then analyze the nose print image to determine the number of the pet's nose print ridges and their distribution.
For example, the electronic device may capture multiple images of the pet's nose at the same shooting position, convert them to grayscale to obtain multiple grayscale images, and apply processing such as brightness adjustment, contrast adjustment, and sharpening to the grayscale images to obtain multiple optimized images. The electronic device then performs image fusion on the optimized images to obtain the nose print image.
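The OpenCV sketch below illustrates one possible version of these preprocessing and fusion steps (the file names, adjustment amounts, and the simple averaging fusion are assumptions made for illustration, not details given in the application):
```python
import cv2
import numpy as np

def optimize(gray: np.ndarray) -> np.ndarray:
    """One example of brightness/contrast adjustment followed by sharpening."""
    adjusted = cv2.convertScaleAbs(gray, alpha=1.2, beta=10)   # contrast, brightness
    blur = cv2.GaussianBlur(adjusted, (0, 0), sigmaX=3)
    return cv2.addWeighted(adjusted, 1.5, blur, -0.5, 0)       # unsharp masking

# Multiple shots of the nose taken from the same position (paths are hypothetical).
frames = [cv2.imread(p, cv2.IMREAD_GRAYSCALE) for p in ["nose_0.png", "nose_1.png", "nose_2.png"]]
optimized = [optimize(f) for f in frames]

# Simple fusion by averaging the optimized images to suppress noise.
nose_print_image = np.mean(np.stack(optimized).astype(np.float32), axis=0).astype(np.uint8)
```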
The electronic device can then analyze the nose print image to determine the final number of the pet's nose print ridges and their distribution.
For example, because the peaks and valleys of the nose print appear differently in the nose print image, their pixel information also differs; in particular, their gray values differ, with pixels in peak regions having lower gray values than pixels in valley regions. The electronic device can classify pixels by gray value to locate the peaks and valleys, determine the number of nose print ridges from the number of valleys, and determine the distribution of the nose print from the distribution of the peaks and valleys.
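A minimal sketch of this gray-value analysis follows (the Otsu threshold and the rule of counting one valley per bright connected component are illustrative assumptions; the application only states that valleys can be located by gray value):
```python
import cv2
import numpy as np

def count_ridges_from_gray(nose_print_image: np.ndarray) -> int:
    """Classify pixels into peak/valley regions by gray value and count valleys.
    Per the description, peak-region pixels are darker (lower gray value) than
    valley-region pixels, so the brighter class is treated as valleys here."""
    _, valley_mask = cv2.threshold(nose_print_image, 0, 255,
                                   cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Each connected bright component is treated as one valley.
    n_labels, _ = cv2.connectedComponents(valley_mask)
    return n_labels - 1  # subtract the background label
```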
After acquiring the nose image of the pet, the electronic device can also input it into a neural network model for image segmentation. For example, segmentation models such as a U-Net model or an encoder-decoder model can be used to segment the nose image into the regions corresponding to peaks and the regions corresponding to valleys, yielding a peak image and/or a valley image of the nose print; the number of nose print ridges and their distribution are then determined from the peak image and/or the valley image.
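As a hedged sketch only (the checkpoint file, the output layout of the network, and the counting post-processing are hypothetical and not specified by the application), segmenting the nose image and counting valley components might look like this:
```python
import cv2
import numpy as np
import torch

# Hypothetical trained segmentation network (e.g. a U-Net); the checkpoint is an assumption.
model = torch.load("nose_print_unet.pt", map_location="cpu")
model.eval()

def segment_valleys(nose_image_gray: np.ndarray) -> np.ndarray:
    """Run the segmentation model and return a binary valley mask."""
    x = torch.from_numpy(nose_image_gray).float().div(255.0)[None, None]  # 1x1xHxW
    with torch.no_grad():
        logits = model(x)                      # assumed output: 1x1xHxW valley logits
    return (torch.sigmoid(logits)[0, 0].numpy() > 0.5).astype(np.uint8) * 255

def ridge_count_from_mask(valley_mask: np.ndarray) -> int:
    """The number of valley components is taken as the number of nose print ridges."""
    n_labels, _ = cv2.connectedComponents(valley_mask)
    return n_labels - 1
```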
It should be noted that the nose prints on a pet's nose are formed by alternating peaks and valleys; once the peak image or the valley image is obtained, the number of nose print ridges can be determined from the number of peaks or the number of valleys.
In some embodiments, after obtaining the number, depth, and distribution of the pet's nose print ridges, the electronic device may generate the pet's corresponding nose print information from this data.
Specifically, the electronic device may determine at least one target area on the pet's nose and obtain the nose print information corresponding to the at least one target area.
Nose print features are more pronounced in some areas of the pet's nose, such as the middle of the nose, while other areas, such as the edges of the nose, have no obvious nose print features. Therefore, at least one target area can be determined in the middle of the pet's nose, and the nose print information corresponding to the target area is then obtained.
The electronic device may also identify the nostril area and the nose edge area, remove them, take the remaining area of the nose as the area from which nose prints can be acquired, and then determine at least one target area within that area.
In some embodiments, the electronic device may obtain the number and/or depth of the nose print ridges corresponding to the at least one target area, and then determine the pet's nose print information from the number and/or depth of the ridges.
For example, the electronic device may determine the depth distribution of each nose print ridge to obtain a nose print depth distribution map, and then determine the pet's nose print information from this map. As described above, the depth of each ridge in the target area can be obtained by laser scanning, ultrasonic scanning, or the like, and the nose print depth distribution map is then constructed from these depths.
In some embodiments, the electronic device may further determine a plurality of target peak points and a plurality of target valley points within the target area of the three-dimensional nose print image (the nose print depth distribution map), where a target peak point is a peak point whose height is greater than a first preset height and a target valley point is a valley point whose height is less than a second preset height. The electronic device then determines the vector distance between each target peak point and its nearest target valley point, and generates the pet's nose print information from these vector distances.
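A small sketch of this peak/valley pairing follows (the preset heights, the (row, col, height) point layout, and the nearest-neighbour search by image position are illustrative assumptions):
```python
import numpy as np
from scipy.spatial import cKDTree

def peak_valley_vectors(depth_map: np.ndarray, h_peak: float, h_valley: float) -> np.ndarray:
    """For each target peak point (height > h_peak), return the 3D vector to its
    nearest target valley point (height < h_valley)."""
    rows, cols = np.indices(depth_map.shape)
    pts = np.stack([rows.ravel(), cols.ravel(), depth_map.ravel()], axis=1).astype(float)
    peaks = pts[pts[:, 2] > h_peak]
    valleys = pts[pts[:, 2] < h_valley]
    tree = cKDTree(valleys[:, :2])          # search by image position
    _, idx = tree.query(peaks[:, :2])       # nearest valley for each peak
    return valleys[idx] - peaks             # one vector distance per target peak point
```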
In some embodiments, the electronic device may further divide the target area into a plurality of target sub-areas, determine the number of nose print ridges in each target sub-area, and then determine the pet's nose print information from the per-sub-area counts.
For example, after determining the target area on the pet's nose, the electronic device may divide it into a plurality of target sub-areas, count the nose print ridges in each sub-area, and generate the pet's nose print information from the counts for each target sub-area.
The target area may be divided according to a preset division rule. For example, the shape of the pet's nose is determined first, and the shape of each target sub-area is then determined from the shape of the nose. The number of target sub-areas is determined from the size of the target area: the larger the target area, the more sub-areas it is divided into; the smaller the target area, the fewer sub-areas.
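An illustrative sketch of such a division rule follows (the grid-based split and the area-to-count factor are assumptions; the application only states that a larger target area yields more sub-areas):
```python
import numpy as np

def split_target_area(target_area: np.ndarray, cells_per_pixel: float = 1e-4) -> list:
    """Split a target area (a 2D array of depth or gray values) into a grid of
    sub-areas; the number of sub-areas grows with the size of the target area."""
    h, w = target_area.shape
    n_cells = max(1, int(h * w * cells_per_pixel))   # larger area -> more sub-areas
    grid = max(1, int(round(np.sqrt(n_cells))))      # roughly square grid
    step_r, step_c = max(1, h // grid), max(1, w // grid)
    return [target_area[r:r + step_r, c:c + step_c]
            for r in range(0, h, step_r)
            for c in range(0, w, step_c)]

# Per-sub-area ridge counts could then feed the nose print information, e.g.
# counts = [count_ridges_from_gray(sub) for sub in split_target_area(area)]
```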
In some embodiments, after determining at least one target area on the pet's nose, the electronic device may obtain both the number of nose print ridges in the target area and the depth of each ridge, and then generate the pet's nose print information from the ridge count and the per-ridge depths.
It should be noted that the above is only an example of acquiring the pet's nose print information; in practice, other ways of determining the nose print information may also be used.
140. Determine the health condition of the pet according to the nose print information and the target data.
In some embodiments, the electronic device may match the nose print information of the pet with the target health database to obtain the health condition of the pet.
For example, the electronic device may obtain a plurality of entries of target nose print information from the target health database and match the pet's nose print information against each entry; if the match with a certain entry succeeds, the health condition corresponding to that entry is taken as the health condition of the pet.
For example, if the target nose print information includes a target ridge count and a target ridge depth, the pet's ridge count is compared with the target count and the pet's ridge depth with the target depth; if all comparisons succeed, the health condition corresponding to that target nose print information is taken as the health condition of the pet.
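The following is a minimal sketch of this matching step (the tolerances, record fields, and condition labels are hypothetical; the application does not define what counts as a successful comparison):
```python
def match_health_condition(ridge_count: int, ridge_depth_mm: float, target_db: list,
                           count_tol: int = 2, depth_tol_mm: float = 0.05):
    """Compare the pet's nose print information against each target entry; if both
    the ridge count and the ridge depth match within tolerance, return that entry's
    health condition, otherwise None."""
    for entry in target_db:   # e.g. {"count": 38, "depth_mm": 0.42, "condition": "healthy"}
        if (abs(ridge_count - entry["count"]) <= count_tol
                and abs(ridge_depth_mm - entry["depth_mm"]) <= depth_tol_mm):
            return entry["condition"]
    return None

target_health_db = [
    {"count": 38, "depth_mm": 0.42, "condition": "healthy"},
    {"count": 30, "depth_mm": 0.20, "condition": "poor"},
]
condition = match_health_condition(37, 0.40, target_health_db)  # -> "healthy"
```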
In some embodiments, the electronic device may further determine a health prediction model according to the health prediction model data, and input the nose print information into the health prediction model to obtain the health condition of the pet.
For example, after the corresponding health prediction model data is determined from the pet's basic identity information, the model data can be used directly to set the parameters of a base model, thereby obtaining the health prediction model; the nose print information is then input into the health prediction model to obtain the health condition of the pet.
In some embodiments, before determining the health condition of the pet according to the nose print information and the target data, the electronic device also acquires humidity information of the pet's nose, and then determines the health condition of the pet according to the nose print information, the humidity information, and the target data.
For example, the mapping relationships in the target data can be established with both the nose print information and the humidity information. The electronic device can then use the nose print information and the humidity information, together with the target health database or the health prediction model data in the target data, to determine the health condition of the pet.
In the embodiments of the present application, the electronic device acquires basic identity information of a pet, determines target data corresponding to the pet according to the basic identity information, acquires nose print information of the pet, and determines the health condition of the pet according to the nose print information and the target data. Thus, the health condition of the pet can be predicted from its nose print information.
For a more detailed understanding of the health prediction method provided in the present application, refer to fig. 3, which is a second flowchart of the health prediction method. The method may include the following steps:
201. Acquire basic identity information of the pet.
In some embodiments, the electronic device may obtain basic identity information of the pet, the basic identity information including breed information, gender information, age information, and the like.
The electronic device can scan the pet with a camera to obtain a plurality of feature images of the pet, and then identify features of the pet from these images to determine its breed information. For example, whether the pet is a cat or a dog can be identified from its body shape, and the specific breed can then be identified from its coat color and pattern.
The electronic device can also determine the gender information of the pet from the feature images. For example, gender can be judged from the shape and size of the pet's head: a male cat typically has prominent jowls, a broad face, and a larger head, whereas a female cat lacks prominent jowls and has a smaller head.
The electronic device can also determine the age information of the pet from the feature images, for example, by evaluating the pet's tooth length, body height, and the like.
In some embodiments, the user may also manually input the basic identity information, such as breed, gender, and age.
202. Determine a target health database or health prediction model data corresponding to the pet according to the breed information, gender information, and age information in the basic identity information.
The target data and the basic identity information have a mapping relationship; after the electronic device acquires the basic identity information of the pet, it can look up the target data corresponding to that identity information in the database according to the mapping relationship.
In some embodiments, the electronic device may first determine a first database from the database based on the breed information, then determine a second database from the first database based on the gender information, and finally determine the target data from the second database based on the age information. The target data includes a target health database and health prediction model data.
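A minimal sketch of this hierarchical lookup follows (the nested dictionary layout, keys, and file names are assumptions used only for illustration):
```python
# Hypothetical nested index: breed -> gender -> age bucket -> target data.
DATABASE = {
    "ragdoll": {
        "female": {"0-1": {"health_db": "ragdoll_f_0_1.db",
                           "model_data": "ragdoll_f_0_1.params"}},
    },
}

def find_target_data(breed: str, gender: str, age_bucket: str) -> dict | None:
    """Narrow the database by breed, then gender, then age, as in step 202."""
    first_db = DATABASE.get(breed.lower(), {})      # first database, by breed
    second_db = first_db.get(gender.lower(), {})    # second database, by gender
    return second_db.get(age_bucket)                # target data, by age
```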
In some embodiments, the target health database in the target data includes a first data mapping relationship between nose print information and health conditions. For example, when the depth of the pet's nose prints is within the normal range, the corresponding health condition is healthy; when the depth is outside the normal range, the corresponding health condition is poor.
In some embodiments, pets with different basic identity information correspond to different health prediction models and therefore to different health prediction model data. The health prediction model data and the basic identity information are related by a second data mapping relationship, and the electronic device can find the health prediction model data corresponding to given basic identity information according to this second mapping relationship.
For example, first health prediction model data may correspond to Garfield (Exotic Shorthair) cats aged 0-1 years, and second health prediction model data may correspond to Ragdoll cats aged 0-1 years; that is, pets with different basic identity information correspond to different health prediction model data.
203. Determine at least one target area on the nose of the pet.
Nose print features are more pronounced in some areas of the pet's nose, such as the middle of the nose, while other areas, such as the edges of the nose, have no obvious nose print features. Therefore, at least one target area can be determined in the middle of the pet's nose, and the nose print information corresponding to the target area is then obtained.
The electronic device may also identify the nostril area and the nose edge area, remove them, take the remaining area of the nose as the area from which nose prints can be acquired, and then determine at least one target area within that area.
204. Acquire the number and/or depth of nose print ridges corresponding to the at least one target area.
Specifically, when scanning with laser light, the electronic device can scan the pet's nose with a ToF (time-of-flight) sensor. After the laser illuminates the nose, because different nose print ridges lie at different depths, the time for the laser to reach each ridge differs, and so does the time for the reflected signal to return to the laser sensor. A phase difference therefore arises between each emitted laser signal and its reflected signal, and the electronic device constructs a three-dimensional nose print image, i.e. a 3D nose print image, of the pet's nose prints from these phase differences.
The electronic device can also scan the pet's nose with ultrasonic waves. Because different nose print ridges lie at different depths, the propagation time of the ultrasonic waves to each ridge differs, and so does the time for each reflected signal to return to the ultrasonic sensor. A phase difference arises between each transmitted ultrasonic signal and its reflected signal, and the electronic device can construct a three-dimensional nose print image of the pet's nose prints from these phase differences.
The electronic device can determine the number and depth of the nose print ridges from the three-dimensional nose print image. For example, the image contains peaks and valleys, and the alternating peaks and valleys form the nose print. The number of nose print ridges can be determined from the number of valleys, i.e. the number of valleys may be taken as the number of ridges, and the depth from a peak to the adjacent valley is the depth of the corresponding ridge.
The electronic device can also capture images of the pet's nose with a camera, optimize these images to obtain a plurality of optimized images, derive a nose print image of the pet from the optimized images, and then analyze the nose print image to determine the number of the pet's nose print ridges and their distribution.
For example, the electronic device may capture multiple images of the pet's nose at the same shooting position, convert them to grayscale to obtain multiple grayscale images, and apply processing such as brightness adjustment, contrast adjustment, and sharpening to the grayscale images to obtain multiple optimized images. The electronic device then performs image fusion on the optimized images to obtain the nose print image.
The electronic device can then analyze the nose print image to determine the final number of the pet's nose print ridges and their distribution.
For example, because the peaks and valleys of the nose print appear differently in the nose print image, their pixel information also differs; in particular, their gray values differ, with pixels in peak regions having lower gray values than pixels in valley regions. The electronic device can classify pixels by gray value to locate the peaks and valleys, determine the number of nose print ridges from the number of valleys, and determine the distribution of the nose print from the distribution of the peaks and valleys.
After acquiring the nose image of the pet, the electronic device can also input it into a neural network model for image segmentation. For example, segmentation models such as a U-Net model or an encoder-decoder model can be used to segment the nose image into the regions corresponding to peaks and the regions corresponding to valleys, yielding a peak image and/or a valley image of the nose print; the number of nose print ridges and their distribution are then determined from the peak image and/or the valley image.
It should be noted that the nose prints on a pet's nose are formed by alternating peaks and valleys; once the peak image or the valley image is obtained, the number of nose print ridges can be determined from the number of peaks or the number of valleys.
205. Determine the nose print information of the pet according to the number and/or depth of the nose print ridges.
In some embodiments, the electronic device may determine the depth distribution of each nose print ridge to obtain a nose print depth distribution map, and then determine the pet's nose print information from this map.
For example, the electronic device may further determine a plurality of target peak points and a plurality of target valley points within the target area of the nose print depth distribution map, where a target peak point is a peak point whose height is greater than a first preset height and a target valley point is a valley point whose height is less than a second preset height. The electronic device then determines the vector distance between each target peak point and its nearest target valley point, and generates the pet's nose print information from these vector distances.
In some embodiments, the electronic device may further divide the target area into a plurality of target sub-areas, determine the number of nose print ridges in each target sub-area, and then determine the pet's nose print information from the per-sub-area counts.
For example, after determining the target area on the pet's nose, the electronic device may divide it into a plurality of target sub-areas, count the nose print ridges in each sub-area, and generate the pet's nose print information from the counts for each target sub-area.
In some embodiments, after determining at least one target area on the pet's nose, the electronic device may obtain both the number of nose print ridges in the target area and the depth of each ridge, and then generate the pet's nose print information from the ridge count and the per-ridge depths.
206. Determine a health prediction model according to the health prediction model data.
In some embodiments, after determining the corresponding health prediction model data from the pet's basic identity information, the electronic device may use the model data directly to set the parameters of a base model, thereby obtaining the health prediction model.
For example, the health prediction model is obtained by loading all of the data in the health prediction model data, such as loss function data and threshold data, into the base model.
207. Input the nose print information into the health prediction model to obtain the health condition of the pet.
The electronic device inputs the pet's nose print information into the health prediction model, and the model processes the nose print information to produce a prediction result, which is the health condition of the pet. For example, the prediction result may be a numerical score quantifying the health condition, such as a value in the range of 0-100; if the value is 40, the pet's health condition is poor at that time.
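A hedged end-to-end sketch of steps 206 and 207 follows (the base model architecture, parameter file format, feature layout, and the score cut-off are all illustrative assumptions; the description only gives 0-100 as an example scale, with 40 indicating poor health):
```python
import numpy as np
import torch
import torch.nn as nn

# Hypothetical base model: a small regressor over nose print features
# (e.g. ridge count, mean depth, per-sub-area counts), producing a 0-100 score.
base_model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))

# "Parameter setting of the base model": load the health prediction model data
# (here assumed to be a torch state_dict selected by the pet's identity information).
state_dict = torch.load("ragdoll_0_1.params", map_location="cpu")
base_model.load_state_dict(state_dict)
base_model.eval()

def predict_health(nose_print_features: np.ndarray) -> str:
    """Run the health prediction model and map the quantified score to a condition."""
    x = torch.from_numpy(nose_print_features.astype(np.float32))[None]
    with torch.no_grad():
        score = float(base_model(x).clamp(0, 100))
    # The 60 cut-off below is an assumed threshold, not one given in the application.
    return "healthy" if score >= 60 else "poor"
```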
In this way, the health condition of the pet can be predicted.
In the embodiments of the present application, the basic identity information of the pet is acquired, and the target health database or health prediction model data corresponding to the pet is then determined according to the breed information, gender information, and age information in the basic identity information. At least one target area is determined on the pet's nose, the number and/or depth of the nose print ridges in the at least one target area is acquired, and the pet's nose print information is determined from the ridge count and/or depth. A health prediction model is determined from the health prediction model data, and the nose print information is input into the health prediction model to obtain the health condition of the pet.
Correspondingly, an embodiment of the present application further provides a health prediction apparatus. As shown in fig. 4, fig. 4 is a schematic structural diagram of the health prediction apparatus provided in an embodiment of the present application. The health prediction apparatus includes:
the first obtaining module 310 is configured to obtain basic identity information of the pet.
The first determining module 320 is configured to determine target data corresponding to the pet according to the basic identity information.
The first determining module 320 is further configured to determine a target health database or health prediction model data corresponding to the pet according to the breed information, the gender information, and the age information in the basic identity information, where the target data includes the target health database and the health prediction model data.
And a second obtaining module 330, configured to obtain nasal print information of the pet.
The second obtaining module 330 is further configured to determine at least one target area on the nose of the pet; and acquiring the corresponding nose print information of at least one target area.
The second obtaining module 330 is further configured to obtain a number of nasal wrinkles and/or a depth of the nasal wrinkles corresponding to at least one target region; and determining the nasal print information of the pet according to the number of the nasal prints and/or the depth of the nasal prints.
The second obtaining module 330 is further configured to obtain humidity information of the nose of the pet;
and a second determining module 340 for determining the health condition of the pet according to the nasal print information and the target data.
The second determining module 340 is further configured to match the nasal print information with the target health database to obtain the health status of the pet.
The second determining module 340 is further configured to determine a health prediction model according to the health prediction model data; and inputting the nasal print information into a health prediction model to obtain the health condition of the pet.
In the embodiments of the present application, the electronic device acquires basic identity information of a pet, determines target data corresponding to the pet according to the basic identity information, acquires nose print information of the pet, and determines the health condition of the pet according to the nose print information and the target data. Thus, the health condition of the pet can be predicted from its nose print information.
Accordingly, an embodiment of the present application further provides an electronic device. As shown in fig. 5, the electronic device may include a memory 401 including one or more computer-readable storage media, an input unit 402, a display unit 403, a sensor 404, a processor 405 including one or more processing cores, and a power supply 406. Those skilled in the art will appreciate that the configuration shown in fig. 5 does not limit the electronic device, which may include more or fewer components than shown, combine some components, or arrange the components differently. Specifically:
the memory 401 may be used to store software programs and modules, and the processor 405 executes various functional applications and data processing by operating the software programs and modules stored in the memory 401. The memory 401 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the electronic device, and the like. Further, the memory 401 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. Accordingly, the memory 401 may further include a memory controller to provide the processor 405 and the input unit 402 with access to the memory 401.
The input unit 402 may be used to receive input numeric or character information and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control. In a specific embodiment, the input unit 402 may include a touch-sensitive surface and other input devices. The touch-sensitive surface, also referred to as a touch display screen or a touch pad, can collect touch operations performed by the user on or near it (for example, operations performed with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connected device according to a preset program. Optionally, the touch-sensitive surface may include two parts: a touch detection apparatus and a touch controller. The touch detection apparatus detects the user's touch position, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection apparatus, converts it into touch point coordinates, and sends the coordinates to the processor 405, and it can also receive and execute commands sent by the processor 405. In addition, the touch-sensitive surface may be implemented as a resistive, capacitive, infrared, or surface acoustic wave type. Besides the touch-sensitive surface, the input unit 402 may include other input devices, which may include but are not limited to one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick.
The display unit 403 may be used to display information input by the user or provided to the user, as well as various graphical user interfaces of the electronic device; these graphical user interfaces may be composed of graphics, text, icons, video, and any combination thereof. The display unit 403 may include a display panel, which may optionally be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like. Further, the touch-sensitive surface may cover the display panel; when the touch-sensitive surface detects a touch operation on or near it, the operation is transmitted to the processor 405 to determine the type of the touch event, and the processor 405 then provides a corresponding visual output on the display panel according to the type of the touch event. Although in fig. 5 the touch-sensitive surface and the display panel are shown as two separate components to implement input and output functions, in some embodiments the touch-sensitive surface may be integrated with the display panel to implement the input and output functions.
The electronic device may also include at least one sensor 404, such as a light sensor, a motion sensor, or another sensor. Specifically, the light sensor may include an ambient light sensor, which can adjust the brightness of the display panel according to the brightness of the ambient light, and a proximity sensor, which can turn off the display panel and/or the backlight when the electronic device is moved to the ear. As one kind of motion sensor, a gravity acceleration sensor can detect the magnitude of acceleration in all directions (generally three axes) and the magnitude and direction of gravity when at rest, and can be used for applications that recognize the attitude of the electronic device (such as switching between landscape and portrait modes, related games, and magnetometer attitude calibration), vibration-recognition functions (such as a pedometer and tapping), and the like. Other sensors that may also be configured in the electronic device, such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, are not described in detail here.
The processor 405 is the control center of the electronic device. It connects the various parts of the electronic device through various interfaces and lines, and performs the various functions of the electronic device and processes data by running or executing the software programs and/or modules stored in the memory 401 and calling the data stored in the memory 401, thereby monitoring the electronic device as a whole. Optionally, the processor 405 may include one or more processing cores; preferably, the processor 405 may integrate an application processor, which mainly handles the operating system, user interfaces, application programs, and the like, and a modem processor, which mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 405.
The electronic device also includes a power supply 406 (such as a battery) for supplying power to the components. Preferably, the power supply may be logically connected to the processor 405 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system. The power supply 406 may also include one or more direct-current or alternating-current power sources, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and any other components.
Although not shown, the electronic device may further include a camera, a Bluetooth module, and the like, which are not described in detail herein. Specifically, in this embodiment, the processor 405 in the electronic device loads and runs the computer program stored in the memory 401, thereby implementing the following functions:
acquiring basic identity information of a pet;
determining target data corresponding to the pet according to the basic identity information;
acquiring nose print information of the pet;
and determining the health condition of the pet according to the nose print information and the target data.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present application provide a computer-readable storage medium, in which a plurality of instructions are stored, and the instructions can be loaded by a processor to execute the steps in any one of the health prediction methods provided by the embodiments of the present application. For example, the instructions may perform the steps of:
acquiring basic identity information of a pet;
determining target data corresponding to the pet according to the basic identity information;
acquiring nose print information of the pet;
and determining the health condition of the pet according to the nose print information and the target data.
For the specific implementation of the above operations, refer to the foregoing embodiments; details are not repeated here.
The storage medium may include a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or the like.
Since the instructions stored in the storage medium can perform the steps of any health prediction method provided in the embodiments of the present application, they can achieve the beneficial effects of any such method, as detailed in the foregoing embodiments and not repeated here.
The health prediction method and apparatus, electronic device, and storage medium provided in the embodiments of the present application are described in detail above. Specific examples are used herein to explain the principles and implementation of the present application, and the above description of the embodiments is only intended to help in understanding the method and its core concept. Meanwhile, those skilled in the art may make changes to the specific embodiments and the application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (10)

1. A health prediction method, comprising:
acquiring basic identity information of a pet;
determining target data corresponding to the pet according to the basic identity information;
acquiring nose print information of the pet;
and determining the health condition of the pet according to the nose print information and the target data.
2. The health prediction method of claim 1, wherein the acquiring nose print information of the pet comprises:
determining at least one target area on the nose of the pet;
and acquiring nose print information corresponding to the at least one target area.
3. The health prediction method of claim 2, wherein the acquiring nose print information corresponding to the at least one target area comprises:
acquiring the number and/or depth of nose print ridges corresponding to the at least one target area;
and determining the nose print information of the pet according to the number and/or depth of the nose print ridges.
4. The health prediction method of claim 1, wherein the determining target data corresponding to the pet according to the basic identity information comprises:
determining a target health database or health prediction model data corresponding to the pet according to breed information, gender information, and age information in the basic identity information, wherein the target data comprises the target health database and the health prediction model data.
5. The health prediction method of claim 4, wherein the determining the health condition of the pet based on the nose print information and the target data comprises:
and matching the nose print information with the target health database to obtain the health condition of the pet.
6. The health prediction method of claim 4, wherein the determining the health condition of the pet based on the nose print information and the target data comprises:
determining a health prediction model according to the health prediction model data;
and inputting the nose print information into the health prediction model to obtain the health condition of the pet.
7. The health prediction method of any one of claims 1 to 6, wherein before the determining the health condition of the pet according to the nose print information and the target data, the method further comprises:
acquiring humidity information of the nose of the pet;
the determining the health condition of the pet according to the nose print information and the target data comprises:
determining the health condition of the pet according to the nose print information, the humidity information, and the target data.
8. A health prediction device, comprising:
a first acquisition module, configured to acquire basic identity information of a pet;
a first determination module, configured to determine target data corresponding to the pet according to the basic identity information;
a second acquisition module, configured to acquire nose print information of the pet;
and a second determination module, configured to determine the health condition of the pet according to the nose print information and the target data.
9. An electronic device, comprising:
a memory storing executable program code, a processor coupled with the memory;
the processor calls the executable program code stored in the memory to perform the steps in the health prediction method of any one of claims 1 to 7.
10. A storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the steps of the health prediction method of any one of claims 1 to 7.
CN202111648180.0A 2021-12-30 2021-12-30 Health prediction method and device, electronic equipment and storage medium Pending CN114283943A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111648180.0A CN114283943A (en) 2021-12-30 2021-12-30 Health prediction method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111648180.0A CN114283943A (en) 2021-12-30 2021-12-30 Health prediction method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114283943A true CN114283943A (en) 2022-04-05

Family

ID=80878571

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111648180.0A Pending CN114283943A (en) 2021-12-30 2021-12-30 Health prediction method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114283943A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109948458A (en) * 2019-02-25 2019-06-28 广东智媒云图科技股份有限公司 Pet personal identification method, device, equipment and storage medium based on noseprint
CN112784742A (en) * 2021-01-21 2021-05-11 宠爱王国(北京)网络科技有限公司 Extraction method and device of nose print features and nonvolatile storage medium
CN113270197A (en) * 2021-06-03 2021-08-17 苏州立威新谱生物科技有限公司 Health prediction method, system and storage medium based on artificial intelligence

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Sohu.com: "Dog: my nose is no longer moist and slippery and has changed color, am I sick?", pages 1 - 3, Retrieved from the Internet <URL:https://www.sohu.com/a/442718090_120715195> *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115176727A (en) * 2022-07-14 2022-10-14 宠步科技(武汉)有限公司 Short kiss dog identification method and pet collar for short kiss dog identification

Similar Documents

Publication Publication Date Title
US11181985B2 (en) Dynamic user interactions for display control
CN111989537B (en) System and method for detecting human gaze and gestures in an unconstrained environment
KR102246777B1 (en) Method for displaying content in the expandable screen area and electronic device supporting the same
US10832039B2 (en) Facial expression detection method, device and system, facial expression driving method, device and system, and storage medium
CN112926423B (en) Pinch gesture detection and recognition method, device and system
JP2016502721A (en) Gesture detection management for electronic devices
CN110942479B (en) Virtual object control method, storage medium and electronic device
KR20190030140A (en) Method for eye-tracking and user terminal for executing the same
KR20190110690A (en) Method for providing information mapped between plurality inputs and electronic device supporting the same
CA3158012A1 (en) Automatic pressure ulcer measurement
RU2671990C1 (en) Method of displaying three-dimensional face of the object and device for it
CN114283943A (en) Health prediction method and device, electronic equipment and storage medium
CN114332932A (en) Health prediction method and device, electronic equipment and storage medium
CN114287887A (en) Disease diagnosis method, disease diagnosis device, electronic apparatus, and storage medium
CN112818733B (en) Information processing method, device, storage medium and terminal
CN113986093A (en) Interaction method and related device
KR20180036205A (en) Smart table apparatus for simulation
CN114283452A (en) Animal character analysis method, device, electronic equipment and storage medium
CN114242224A (en) Doctor recommendation method and device, electronic equipment and storage medium
CN114283936A (en) Disease diagnosis method, disease diagnosis device, electronic apparatus, and storage medium
CN111796980B (en) Data processing method and device, electronic equipment and storage medium
KR102251076B1 (en) Method to estimate blueprint using indoor image
CN115187988A (en) Regional text recognition method and device, electronic equipment and storage medium
JP6264003B2 (en) Coordinate input system, coordinate instruction unit, coordinate input unit, control method of coordinate input system, and program
JP2021517314A (en) Electronic device determination method, system, computer system and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination