CN116682163A - Emotion recognition-based medicine curative effect prediction method, device and equipment - Google Patents


Info

Publication number
CN116682163A
CN116682163A (application CN202310732804.XA)
Authority
CN
China
Prior art keywords
curative effect
medication
patient
parameter data
index parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310732804.XA
Other languages
Chinese (zh)
Inventor
秦永生
申东范
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Elm Technology Co ltd
Original Assignee
Shenzhen Elm Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Elm Technology Co ltd filed Critical Shenzhen Elm Technology Co ltd
Priority to CN202310732804.XA priority Critical patent/CN116682163A/en
Publication of CN116682163A publication Critical patent/CN116682163A/en
Pending legal-status Critical Current

Links

Classifications

    • G06V40/174 Facial expression recognition (G PHYSICS › G06 COMPUTING; CALCULATING OR COUNTING › G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING › G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data › G06V40/10 Human or animal bodies › G06V40/16 Human faces)
    • G06V40/171 Local features and components; facial parts; occluding parts, e.g. glasses; geometrical relationships (under G06V40/168 Feature extraction; face representation)
    • G06V40/172 Classification, e.g. identification (under G06V40/16 Human faces)
    • G16H20/10 ICT specially adapted for therapies or health-improving plans relating to drugs or medications, e.g. for ensuring correct administration to patients (G PHYSICS › G16 ICT FOR SPECIFIC APPLICATION FIELDS › G16H HEALTHCARE INFORMATICS › G16H20/00)
    • Y02A90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation (Y GENERAL TAGGING › Y02 CLIMATE CHANGE MITIGATION OR ADAPTATION › Y02A › Y02A90/00)

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Chemical & Material Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medicinal Chemistry (AREA)
  • Epidemiology (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

The invention relates to a method, device and equipment for predicting the curative effect of a medicine based on emotion recognition. The method comprises: acquiring facial image data of a patient uploaded by a patient terminal; performing index parameter recognition on the facial image data with a preset emotion recognition model to generate index parameter data, where the index parameter data comprise one or more of brain fatigue and physical characteristic parameters; comparing the index parameter data with standard parameter data to generate a medication difference value; and importing the index parameter data and the medication difference value into a preset period prediction model, constructing a curative effect progress bar through the period prediction model, generating curative effect prediction information from the progress bar, and outputting the curative effect prediction information to the patient terminal.

Description

Emotion recognition-based medicine curative effect prediction method, device and equipment
Technical Field
The invention relates to the technical field of image processing, and in particular to a method, device, computer equipment and storage medium for predicting the curative effect of a medicine based on emotion recognition.
Background
Fever is a pathological increase in body temperature, caused when pyrogens acting on the human body shift the set point of the thermoregulation center upwards. It is the most common clinical symptom and is generally diagnosed with a thermometer: after the patient takes a medicine, whether the body temperature has subsided is again judged by thermometer. Because the curative effect of a medicine takes a period of time to appear, the following problems exist:
(1) The time at which the medicine takes effect is uncertain, so patients use the thermometer frequently and the process is cumbersome;
(2) The result of the drug treatment is uncertain: the temperature acquired by the thermometer is only the patient's current body temperature, which does not mean the fever has been completely cured.
In existing related medical equipment, a patient monitor installed in an intelligent ward can capture images of a patient in real time and perform temperature analysis on the captured image data, so a patient who has taken medicine can be monitored in real time. In practice, however, the following problems remain:
(1) By collecting the patient's body temperature alone, the curative effect of the medicine cannot be accurately judged; even if the body temperature has dropped, the fever may not be cured;
(2) In practice, illnesses such as fever are classified as severe, moderate or mild. Generally only moderate or more serious cases are treated in hospital, while most mild fevers are treated with self-administered medicine at home, so the coverage of the patient monitor is not comprehensive.
Disclosure of Invention
The main purpose of the invention is to provide a method, device, computer equipment and storage medium for predicting the curative effect of a medicine based on emotion recognition. A mobile terminal is used as the patient data acquisition device to collect the patient's facial data; computer equipment analyzes the facial data with an emotion recognition model to obtain indexes such as the patient's brain fatigue, disease parameters and physical characteristic parameters; and after these indexes are compared against a standard, the curative effect of the medicine taken by the patient is predicted.
In order to achieve the above purpose, the invention provides a drug efficacy prediction method based on emotion recognition, which comprises the following steps:
acquiring facial image data of a patient uploaded by a patient terminal;
carrying out index parameter identification on the facial image data by adopting a preset emotion identification model, and generating index parameter data, wherein the index parameter data comprises one or more of brain fatigue and physical characteristic parameters;
comparing the index parameter data with standard parameter data to generate a medication difference value;
and importing the index parameter data and the medication difference value into a preset period prediction model, constructing a curative effect progress bar through the period prediction model, generating curative effect prediction information according to the curative effect progress bar, and outputting the curative effect prediction information to the patient terminal.
Further, the performing index parameter identification on the facial image data by using a preset emotion recognition model, and generating index parameter data includes:
the preset emotion recognition model recognizes the facial image data by adopting a facial image recognition method, wherein the facial image recognition method comprises the following steps:
importing the face image data into a preset rectangular-block virtual image, wherein the size of the rectangular-block virtual image is adjusted to the face size in the face image data during importing, and the rectangular-block grid is 2×2, 3×3 or 4×4;
injecting a label into each minutiae image of the rectangular-block virtual image carrying the face image data, to generate a number of face classification minutiae images carrying the face image data;
and executing a recognition program according to the labels set on the face classification minutiae images, and generating, after recognition, the index parameter data corresponding to each face classification minutiae image.
Further, the step of executing the identification program according to the labels set by the face classification minutiae images includes:
classifying the face classification minutiae images by their labels, the minutiae images including two or more forehead minutiae images, two or more eye minutiae images, and two or more nose-and-mouth minutiae images;
the classified forehead minutiae images, eye minutiae images and nose minutiae images are imported into an identification channel matched with the tag for corresponding identification,
comparing a preset skin graph with a forehead detail graph under the same temperature environment, and identifying and judging the first skin sweat humidity in the forehead detail graph;
comparing a preset eye standard chart with an eye detail chart, identifying and judging a first eye ball fatigue degree and a first eyebrow sagging tension degree in the eye detail chart, and judging brain fatigue degree through the first eye ball fatigue degree and the first eyebrow sagging tension degree;
and recognizing a blood flow velocity spectrum in the nose-and-mouth minutiae image from the image, and importing the blood flow velocity spectrum into a non-contact heart rate spectrum table to confirm the first patient heart rate data.
Further, the step of executing the identification program according to the labels set by the face classification minutiae figures further includes:
after the first skin sweat humidity, first eye fatigue, first eyebrow sagging tension and first patient heart rate data have all been obtained by recognizing one face classification minutiae image, the classified forehead, eye and nose-and-mouth minutiae images are imported again into the recognition channel matched with the tag for corresponding recognition, and the face classification minutiae images of the remaining parts are recognized to generate a second skin sweat humidity, second eye fatigue, second eyebrow sagging tension and second patient heart rate data;
the averages of the first and second skin sweat humidity, the first and second eye fatigue, the first and second eyebrow sagging tension, and the first and second patient heart rate data are then calculated for data output:
the average eye fatigue and average eyebrow sagging tension are evaluated and output as the brain fatigue;
the average skin sweat humidity, the patient heart rate data and a pre-entered body temperature value are output as the physical characteristic parameters.
Further, the step of comparing the index parameter data with standard parameter data to generate a medication difference value includes:
the brain fatigue degree and the physical characteristic parameters included in the index parameter data are imported into a medication progress bar, and the medication progress matched with the index parameter data is judged;
confirming medication data from standard parameter data according to the efficacy progress, wherein the medication data comprises medication type, medication amount and post-medication duration;
and comparing the medication data with standard constants in the standard parameter data to output the drug-effect difference between the patient and standard human data, and outputting the patient medication coefficient matched with the drug-effect difference together with the medication difference value, wherein the standard constants are the preset standard human medication type, medication amount and medication time.
Further, the step of importing the index parameter data and the medication difference value into a preset period prediction model, constructing a curative effect progress bar through the period prediction model, generating curative effect prediction information according to the curative effect progress bar, and outputting the curative effect prediction information to the patient terminal includes:
constructing a first curative effect progress bar matched with index parameter data diseases through a preset period prediction model, wherein the first curative effect progress bar is obtained based on standard parameter data;
importing the index parameter data into a period prediction model to form a second curative effect progress bar matched with a patient in the first curative effect progress bar;
correcting the second curative effect progress bar a first time according to the medication difference value to obtain a third curative effect bar, wherein the third curative effect bar is corrected according to the difference between the patient's medication amount and the standard medication amount;
correcting the third curative effect bar a second time through the patient medication coefficient to obtain a fourth curative effect bar, wherein the fourth curative effect bar is corrected according to the amount of drug effect absorbed by the patient;
and finally, dividing the fourth curative effect bar by the first curative effect bar and comparing them to obtain the patient's curative effect progress.
Further, the step of importing the index parameter data and the medication difference value into a preset period prediction model, constructing a curative effect progress bar through the period prediction model, generating curative effect prediction information according to the curative effect progress bar, and outputting the curative effect prediction information to the patient terminal, further comprises the steps of:
deriving the corresponding time from the curative effect progress, thereby confirming the remaining duration for the drug treatment to complete;
and taking the remaining duration as the curative effect prediction information, and outputting the curative effect prediction information to the patient terminal.
The invention also provides a medicine curative effect prediction device based on emotion recognition, which comprises the following steps:
the acquisition unit is used for acquiring the facial image data of the patient uploaded by the patient terminal;
the recognition unit is used for carrying out index parameter recognition on the facial image data by adopting a preset emotion recognition model and generating index parameter data, wherein the index parameter data comprises one or more of brain fatigue and physical characteristic parameters;
the comparison unit is used for comparing the index parameter data with the standard parameter data to generate a medication difference value;
the prediction unit is used for importing the index parameter data and the medication difference value into a preset period prediction model, constructing a curative effect progress bar through the period prediction model, generating curative effect prediction information according to the curative effect progress bar, and outputting the curative effect prediction information to the patient terminal.
The invention also provides a computer device, which comprises a memory and a processor, wherein the memory stores a computer program, and the processor realizes the steps of the drug curative effect prediction method based on emotion recognition when executing the computer program.
The present invention also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the emotion recognition-based drug efficacy prediction method of any of the above.
The invention has the following beneficial effects:
(1) By acquiring the facial image data of the patient uploaded by the patient terminal, the user can diagnose a bodily fever outside a hospital.
(2) The patient's face can be analyzed by the emotion recognition model to obtain index parameter data; such data cannot be collected by the patient monitors or thermometers of an ordinary intelligent ward, so follow-up on the patient's drug effect is more timely.
(3) By comparing the index parameter data with the standard parameter data, the medication difference value and patient medication coefficient by which the patient differs from a standard human body can be confirmed, so medication can be adjusted for the individual patient.
Drawings
FIG. 1 is a schematic diagram showing steps of a method for predicting therapeutic effects of drugs based on emotion recognition according to an embodiment of the present invention;
FIG. 2 is a block diagram showing a device for predicting therapeutic effects of drugs based on emotion recognition according to an embodiment of the present invention;
fig. 3 is a block diagram schematically illustrating a structure of a computer device according to an embodiment of the present invention.
The achievement of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Referring to fig. 1, a flow chart of a method for predicting therapeutic effect of a drug based on emotion recognition according to the present invention specifically includes the following steps:
s1, acquiring facial image data of a patient uploaded by a patient terminal;
s2, carrying out index parameter identification on the facial image data by adopting a preset emotion identification model, and generating index parameter data, wherein the index parameter data comprises one or more of brain fatigue and physical characteristic parameters;
s3, comparing the index parameter data with standard parameter data to generate a medication difference value;
s4, importing the index parameter data and the medication difference value into a preset period prediction model, constructing a curative effect progress bar through the period prediction model, generating curative effect prediction information according to the curative effect progress bar, and outputting the curative effect prediction information to the patient terminal.
By acquiring the facial image data of the patient uploaded by the patient terminal, the user can diagnose a bodily fever outside a hospital. The patient's face can be analyzed by the emotion recognition model to obtain index parameter data, which the patient monitors or thermometers of an ordinary intelligent ward cannot collect, so follow-up on the patient's drug effect is more timely. By comparing the index parameter data with the standard parameter data, the medication difference value and the patient medication coefficient by which the patient differs from a standard human body can be confirmed, so medication for the patient can be adjusted more accurately and individually.
In one embodiment, the performing index parameter recognition on the facial image data using a preset emotion recognition model, and generating index parameter data includes:
the preset emotion recognition model recognizes the facial image data by adopting a facial image recognition method, wherein the facial image recognition method comprises the following steps:
importing the face image data into a preset rectangular-block virtual image, wherein the size of the rectangular-block virtual image is adjusted to the face size in the face image data during importing, and the rectangular-block grid is 2×2, 3×3 or 4×4;
injecting a label into each minutiae image of the rectangular-block virtual image carrying the face image data, to generate a number of face classification minutiae images carrying the face image data;
and executing a recognition program according to the labels set by the face classification item graphs, and generating index parameter data corresponding to the face classification item graphs after recognition.
When the face image data is embedded into the rectangular-block virtual image, the size of the virtual image is adjusted correspondingly to the size of the patient's face image;
the face classification minutiae images are then classified: a prior-art face recognition model is applied so that each facial part in the minutiae images can be classified by label;
finally, each classified face classification minutiae image is recognized through its corresponding channel.
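The partitioning and labelling described above can be sketched as follows. This is a minimal illustration only: the function name, the row-band-to-region mapping and the list representation of the image are assumptions, not taken from the patent.

```python
def split_into_minutiae(face_image, grid=3):
    """Split a face image (2-D list of pixel rows) into grid x grid
    minutiae images, each tagged with a label naming a facial region.

    The vertical-band -> region mapping (forehead / eyes / nose-mouth)
    is an illustrative assumption; the patent only says that a label is
    injected into each minutiae image.
    """
    h, w = len(face_image), len(face_image[0])
    bands = ["forehead", "eyes", "nose_mouth"]
    minutiae = []
    for gy in range(grid):
        for gx in range(grid):
            y0, y1 = gy * h // grid, (gy + 1) * h // grid
            x0, x1 = gx * w // grid, (gx + 1) * w // grid
            tile = [row[x0:x1] for row in face_image[y0:y1]]
            # map the tile's vertical band to a coarse region label
            label = bands[min(gy * len(bands) // grid, len(bands) - 1)]
            minutiae.append({"label": label, "tile": tile})
    return minutiae
```

Each labelled tile would then be routed to the recognition channel matching its label, as the text above describes.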
Preferably, the step of executing the identification program according to the labels set by the face classification minutiae images includes:
classifying the face classification minutiae images by their labels, the minutiae images including two or more forehead minutiae images, two or more eye minutiae images, and two or more nose-and-mouth minutiae images;
the classified forehead minutiae images, eye minutiae images and nose minutiae images are imported into an identification channel matched with the tag for corresponding identification,
comparing a preset skin graph with a forehead detail graph under the same temperature environment, and identifying and judging the first skin sweat humidity in the forehead detail graph;
comparing a preset eye standard chart with an eye detail chart, identifying and judging a first eye ball fatigue degree and a first eyebrow sagging tension degree in the eye detail chart, and judging brain fatigue degree through the first eye ball fatigue degree and the first eyebrow sagging tension degree;
and recognizing a blood flow velocity spectrum in the nose-and-mouth minutiae image from the image, and importing the blood flow velocity spectrum into a non-contact heart rate spectrum table to confirm the first patient heart rate data.
The first skin sweat humidity, first eye fatigue and first eyebrow sagging tension are each obtained by comparing the corresponding minutiae image with its standard image for similarity. The blood flow velocity spectrum in the nose-and-mouth minutiae image is recognized from the image, and the first patient heart rate data is confirmed by importing that spectrum into the non-contact heart rate spectrum table; confirming the human heart rate from a blood flow velocity spectrum is prior art.
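The non-contact heart-rate lookup could be sketched as a table mapping a mean blood-flow velocity to a heart rate. All table values and names below are illustrative assumptions; the patent does not disclose the actual spectrum table.

```python
# Hypothetical non-contact heart rate spectrum table:
# (velocity lower bound, heart rate in bpm). Values are illustrative only.
HEART_RATE_SPECTRUM = [(0.0, 55), (0.4, 65), (0.8, 75), (1.2, 90), (1.6, 110)]

def heart_rate_from_spectrum(velocities):
    """Confirm a heart rate from a blood-flow velocity spectrum by
    averaging the velocities and selecting the last table row whose
    lower bound the mean velocity reaches."""
    mean_v = sum(velocities) / len(velocities)
    rate = HEART_RATE_SPECTRUM[0][1]
    for lower_bound, bpm in HEART_RATE_SPECTRUM:
        if mean_v >= lower_bound:
            rate = bpm
    return rate
```

A real implementation would derive the velocity spectrum from remote photoplethysmography rather than from a fixed table; the sketch only shows the lookup step named in the text.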
Further, the step of executing the identification program according to the labels set by the face classification minutiae images further includes:
after the first skin sweat humidity, first eye fatigue, first eyebrow sagging tension and first patient heart rate data have all been obtained by recognizing one face classification minutiae image, the classified forehead, eye and nose-and-mouth minutiae images are imported again into the recognition channel matched with the tag for corresponding recognition, and the face classification minutiae images of the remaining parts are recognized to generate a second skin sweat humidity, second eye fatigue, second eyebrow sagging tension and second patient heart rate data;
the averages of the first and second skin sweat humidity, the first and second eye fatigue, the first and second eyebrow sagging tension, and the first and second patient heart rate data are then calculated for data output:
the average eye fatigue and average eyebrow sagging tension are evaluated and output as the brain fatigue;
the average skin sweat humidity, the patient heart rate data and a pre-entered body temperature value are output as the physical characteristic parameters.
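The two-round averaging and the assembly of brain fatigue and physical characteristic parameters might look like the sketch below. The dictionary keys and the brain-fatigue formula (a simple mean of the two eye indicators) are assumptions for illustration; the patent only says the averages are "judged and then output".

```python
def average(a, b):
    return (a + b) / 2

def build_index_parameters(first, second, body_temp):
    """Average two recognition rounds (dicts holding the four
    measurements) and assemble the brain fatigue and physical
    characteristic parameters described in the text."""
    avg = {k: average(first[k], second[k]) for k in first}
    # brain fatigue judged from average eye fatigue and eyebrow tension
    brain_fatigue = average(avg["eye_fatigue"], avg["eyebrow_tension"])
    physical = {
        "skin_sweat_humidity": avg["sweat_humidity"],
        "heart_rate": avg["heart_rate"],
        "body_temperature": body_temp,  # pre-entered by the patient
    }
    return {"brain_fatigue": brain_fatigue, "physical": physical}
```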
In one embodiment, the step of comparing the index parameter data with standard parameter data to generate a medication difference value includes:
the brain fatigue degree and the physical characteristic parameters included in the index parameter data are imported into a medication progress bar, and the medication progress matched with the index parameter data is judged;
confirming medication data from standard parameter data according to the efficacy progress, wherein the medication data comprises medication type, medication amount and post-medication duration;
and comparing the medication data with standard constants in the standard parameter data to output the drug-effect difference between the patient and standard human data, and outputting the patient medication coefficient matched with the drug-effect difference together with the medication difference value, wherein the standard constants are the preset standard human medication type, medication amount and medication time.
The brain fatigue and physical characteristic parameters are imported into the medication progress bar to judge the drug-effect progress matched with the index parameter data. This progress reflects the patient's actual condition after medication; by comparing the patient's post-medication progress with the post-medication progress under standard conditions, the drug-effect difference between the patient and a standard patient under the same medication data can be determined, and the patient's medication difference value can then be deduced from that drug-effect difference.
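A minimal sketch of this comparison step, under the assumption that the standard constants and the drug-effect progress can each be reduced to a single numeric value (the patent gives no concrete formulas, so every value and formula here is illustrative):

```python
# Hypothetical standard constants for a standard human body.
STANDARD = {"dose_mg": 500.0, "hours_after_dose": 6.0, "expected_progress": 0.5}

def medication_difference(patient_dose_mg, patient_progress):
    """Compare the patient's medication data against the standard
    constants; output the medication difference value, the drug-effect
    difference, and the patient medication coefficient."""
    dose_diff = patient_dose_mg - STANDARD["dose_mg"]               # medication difference value
    effect_diff = patient_progress - STANDARD["expected_progress"]  # drug-effect difference
    # coefficient: how strongly this patient responds relative to standard
    coefficient = patient_progress / STANDARD["expected_progress"]
    return dose_diff, effect_diff, coefficient
```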
Further, the step of importing the index parameter data and the medication difference value into a preset period prediction model, constructing a curative effect progress bar through the period prediction model, generating curative effect prediction information according to the curative effect progress bar, and outputting the curative effect prediction information to the patient terminal includes:
constructing a first curative effect progress bar matched with index parameter data diseases through a preset period prediction model, wherein the first curative effect progress bar is obtained based on standard parameter data;
importing the index parameter data into a period prediction model to form a second curative effect progress bar matched with a patient in the first curative effect progress bar;
correcting the second curative effect progress bar a first time according to the medication difference value to obtain a third curative effect bar, wherein the third curative effect bar is corrected according to the difference between the patient's medication amount and the standard medication amount;
correcting the third curative effect bar a second time through the patient medication coefficient to obtain a fourth curative effect bar, wherein the fourth curative effect bar is corrected according to the amount of drug effect absorbed by the patient;
and finally, dividing the fourth curative effect bar by the first curative effect bar and comparing them to obtain the patient's curative effect progress.
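Treating each curative effect bar as a single progress value in [0, 1], the two corrections and the final division could be sketched as below; the multiplicative form of both corrections is an illustrative assumption, not the patent's actual model.

```python
def efficacy_progress(standard_bar, patient_bar, dose_ratio, coefficient):
    """Apply the two corrections described above, then divide the
    result by the standard (first) curative effect bar.

    dose_ratio  -- patient dose relative to the standard dose (first correction)
    coefficient -- patient medication coefficient for absorption (second correction)
    """
    third_bar = patient_bar * dose_ratio        # first correction: dose difference
    fourth_bar = third_bar * coefficient        # second correction: absorption
    return min(fourth_bar / standard_bar, 1.0)  # patient's curative effect progress
```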
In one embodiment, the step of importing the index parameter data and the medication difference value into a preset period prediction model, constructing a curative effect progress bar through the period prediction model, generating curative effect prediction information according to the curative effect progress bar, and outputting the curative effect prediction information to the patient terminal further includes:
deriving the corresponding time from the curative effect progress, thereby confirming the remaining duration for the drug treatment to complete;
and taking the remaining duration as the curative effect prediction information, and outputting the curative effect prediction information to the patient terminal.
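Assuming, purely for illustration, that curative effect progress accrues linearly over the treatment period (the patent does not state how time is derived from progress), the remaining duration could be computed as:

```python
def remaining_duration(progress, elapsed_hours):
    """Derive the remaining treatment time from the curative effect
    progress, under a linear-progress assumption: if `progress` was
    reached after `elapsed_hours`, full recovery takes
    elapsed_hours / progress in total."""
    if progress <= 0:
        raise ValueError("no measurable progress yet")
    total_hours = elapsed_hours / progress     # time to reach 100% progress
    return max(total_hours - elapsed_hours, 0.0)
```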
Referring to fig. 2, the invention further provides a medicine curative effect prediction device based on emotion recognition, which comprises:
an acquisition unit 1 for acquiring patient face image data uploaded by a patient terminal;
a recognition unit 2, configured to perform index parameter recognition on the facial image data by using a preset emotion recognition model, and generate index parameter data, where the index parameter data includes one or more of brain fatigue and physical characteristic parameters;
a comparison unit 3, configured to compare the index parameter data with standard parameter data to generate a medication difference value;
and the prediction unit 4 is used for importing the index parameter data and the medication difference value into a preset period prediction model, constructing a curative effect progress bar through the period prediction model, generating curative effect prediction information according to the curative effect progress bar, and outputting the curative effect prediction information to the patient terminal.
In this embodiment, for the specific implementation of each unit in the above apparatus embodiment, please refer to the description in the above method embodiment; no further description is given here.
Referring to fig. 3, in an embodiment of the present invention there is further provided a computer device, which may be a server and whose internal structure may be as shown in fig. 3. The computer device includes a processor, a memory, a display screen, an input device, a network interface, and a database connected by a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The database of the computer device is used to store the data involved in this embodiment. The network interface of the computer device is used to communicate with an external terminal through a network connection. The computer program, when executed by the processor, implements the above method.
It will be appreciated by those skilled in the art that the architecture shown in fig. 3 is merely a block diagram of the portion of the architecture relevant to the present invention and does not limit the computer devices to which the present invention may be applied.
An embodiment of the present invention also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the above method. It is understood that the computer readable storage medium in this embodiment may be a volatile readable storage medium or a nonvolatile readable storage medium.
In summary, the facial image data of the patient uploaded by the patient terminal is acquired; a preset emotion recognition model performs index parameter identification on the facial image data and generates index parameter data, where the index parameter data includes one or more of brain fatigue and physical characteristic parameters; the index parameter data is compared with standard parameter data to generate a medication difference value; and the index parameter data and the medication difference value are imported into a preset period prediction model, which constructs a curative effect progress bar, generates curative effect prediction information from the progress bar, and outputs the information to the patient terminal. This allows a patient's condition, such as a fever, to be assessed outside a hospital setting. The emotion recognition model performs correlated recognition on the patient's face to obtain index parameter data that an ordinary ward monitor or thermometer cannot collect, so the patient's drug response is tracked more promptly. By comparing the index parameter data with the standard parameter data, the medication difference value and patient medication coefficient of a patient whose physique differs from the standard human body can be confirmed, yielding a prediction that is more accurate for the individual patient.
Those skilled in the art will appreciate that implementing all or part of the above-described methods may be accomplished by a computer program stored on a non-transitory computer-readable storage medium, which, when executed, may comprise the steps of the method embodiments described above. Any reference to memory, storage, a database, or another medium provided by the present invention and used in the embodiments may include non-volatile and/or volatile memory. The non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), and direct Rambus dynamic RAM (DRDRAM), among others.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, apparatus, article, or method that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, apparatus, article, or method. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other like elements in the process, apparatus, article, or method that comprises the element.
The foregoing description covers only preferred embodiments of the present invention and is not intended to limit its scope; all equivalent structures or equivalent processes that use the description and drawings of the present invention, whether applied directly or indirectly in other related technical fields, fall within the scope of the present invention.

Claims (10)

1. The medicine curative effect prediction method based on emotion recognition is characterized by comprising the following steps of:
acquiring facial image data of a patient uploaded by a patient terminal;
carrying out index parameter identification on the facial image data by adopting a preset emotion identification model, and generating index parameter data, wherein the index parameter data comprises one or more of brain fatigue and physical characteristic parameters;
comparing the index parameter data with standard parameter data to generate a medication difference value;
and importing the index parameter data and the medication difference value into a preset period prediction model, constructing a curative effect progress bar through the period prediction model, generating curative effect prediction information according to the curative effect progress bar, and outputting the curative effect prediction information to the patient terminal.
2. The method for predicting therapeutic effects of drugs based on emotion recognition according to claim 1, wherein the performing index parameter recognition on the facial image data using a preset emotion recognition model and generating index parameter data comprises:
the preset emotion recognition model recognizes the facial image data by adopting a facial image recognition method, wherein the facial image recognition method comprises the following steps:
importing the face image data into a preset rectangular block virtual image, wherein the size of the rectangular block virtual image is adjusted according to the face size in the face image data during the importing process, and the rectangular block virtual image comprises a 2×2, 3×3, or 4×4 grid of blocks;
attaching a label to each detail map in the rectangular block virtual image bearing the face image data, to generate a plurality of face classification detail maps bearing the face image data;
and executing a recognition program according to the labels set on the face classification detail maps, and generating, after recognition, the index parameter data corresponding to the face classification detail maps.
3. The emotion recognition-based drug efficacy prediction method according to claim 2, wherein the step of executing a recognition program according to the labels set on the plurality of face classification detail maps includes:
classifying, by the labels, the plurality of face classification detail maps, the face classification detail maps including two or more forehead detail maps, two or more eye detail maps, and two or more nose-mouth detail maps;
importing the classified forehead detail maps, eye detail maps, and nose-mouth detail maps into the recognition channel matched with the label for corresponding recognition;
comparing a preset skin map with a forehead detail map under the same temperature environment, and identifying and judging a first skin sweat humidity in the forehead detail map;
comparing a preset eye standard map with an eye detail map, identifying and judging a first eyeball fatigue degree and a first eyebrow sagging tension in the eye detail map, and judging the brain fatigue degree from the first eyeball fatigue degree and the first eyebrow sagging tension;
and identifying a blood flow velocity spectrum in the nose-mouth detail map through image analysis, and importing the blood flow velocity spectrum into a non-contact heart rate spectrum table to confirm first patient heart rate data.
4. The emotion recognition-based drug efficacy prediction method according to claim 3, wherein the step of executing a recognition program according to the labels set on the plurality of face classification detail maps further comprises:
after the first skin sweat humidity, the first eyeball fatigue degree, the first eyebrow sagging tension, and the first patient heart rate data have each been obtained by recognizing one face classification detail map, importing the classified forehead detail maps, eye detail maps, and nose-mouth detail maps into the recognition channel matched with the label again for corresponding recognition, and recognizing the remaining face classification detail maps to generate a second skin sweat humidity, a second eyeball fatigue degree, a second eyebrow sagging tension, and second patient heart rate data;
calculating, for data output, the averages of the first and second skin sweat humidity, of the first and second eyeball fatigue degrees, of the first and second eyebrow sagging tensions, and of the first and second patient heart rate data;
judging the average eyeball fatigue degree and eyebrow sagging tension and outputting them as the brain fatigue degree;
and outputting the average skin sweat humidity, the patient heart rate data, and a pre-entered body temperature value as the physical characteristic parameters.
5. The method of claim 4, wherein the step of comparing the index parameter data with standard parameter data to generate a medication difference value comprises:
importing the brain fatigue degree and the physical characteristic parameters included in the index parameter data into a medication progress bar, and judging the medication progress matched with the index parameter data;
confirming medication data from the standard parameter data according to the medication progress, wherein the medication data comprises medication type, medication amount, and post-medication duration;
and comparing the medication data with standard constants in the standard parameter data to output the drug-effect difference between the patient and the standard human body data, and outputting the patient medication coefficient and the medication difference value matched with the drug-effect difference, wherein the standard constants are preset standard human body medication types, medication amounts, and medication durations.
6. The method for predicting therapeutic effects of drugs based on emotion recognition according to claim 5, wherein the steps of importing the index parameter data and the medication difference value into a preset period prediction model, constructing a curative effect progress bar by the period prediction model, generating curative effect prediction information according to the curative effect progress bar, and outputting the curative effect prediction information to the patient terminal comprise:
constructing, through a preset period prediction model, a first curative effect progress bar matched with the disease indicated by the index parameter data, wherein the first curative effect progress bar is obtained based on the standard parameter data;
importing the index parameter data into the period prediction model to form a second curative effect progress bar matched with the patient in the first curative effect progress bar;
performing a first correction on the second curative effect progress bar according to the medication difference value to obtain a third curative effect bar, wherein the third curative effect bar is obtained after correction according to the difference between the patient's medication amount and the standard medication amount;
carrying out a second correction on the third curative effect bar using the patient medication coefficient to obtain a fourth curative effect bar, wherein the fourth curative effect bar reflects a correction made according to the patient's drug-effect absorption;
finally, the fourth curative effect bar is divided by and compared with the first curative effect bar to obtain the patient's curative effect progress.
7. The method for predicting therapeutic effects of drugs based on emotion recognition according to claim 6, wherein the steps of importing the index parameter data and the medication difference value into a preset period prediction model, constructing a curative effect progress bar by the period prediction model, generating curative effect prediction information according to the curative effect progress bar, and outputting the curative effect prediction information to the patient terminal further comprise:
deducing the corresponding time from the curative effect progress, thereby confirming the remaining duration needed to complete the drug treatment;
and taking the remaining duration as the curative effect prediction information and outputting it to the patient terminal.
8. A drug efficacy prediction device based on emotion recognition, comprising:
the acquisition unit is used for acquiring the facial image data of the patient uploaded by the patient terminal;
the recognition unit is used for carrying out index parameter recognition on the facial image data by adopting a preset emotion recognition model and generating index parameter data, wherein the index parameter data comprises one or more of brain fatigue and physical characteristic parameters;
the comparison unit is used for comparing the index parameter data with the standard parameter data to generate a medication difference value;
the prediction unit is used for importing the index parameter data and the medication difference value into a preset period prediction model, constructing a curative effect progress bar through the period prediction model, generating curative effect prediction information according to the curative effect progress bar, and outputting the curative effect prediction information to the patient terminal.
9. A computer device comprising a memory and a processor, the memory having stored therein a computer program, characterized in that the processor, when executing the computer program, implements the steps of the emotion recognition based drug efficacy prediction method of any of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the emotion recognition-based drug efficacy prediction method of any one of claims 1 to 7.
CN202310732804.XA 2023-06-20 2023-06-20 Emotion recognition-based medicine curative effect prediction method, device and equipment Pending CN116682163A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310732804.XA CN116682163A (en) 2023-06-20 2023-06-20 Emotion recognition-based medicine curative effect prediction method, device and equipment


Publications (1)

Publication Number Publication Date
CN116682163A true CN116682163A (en) 2023-09-01

Family

ID=87790699

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310732804.XA Pending CN116682163A (en) 2023-06-20 2023-06-20 Emotion recognition-based medicine curative effect prediction method, device and equipment

Country Status (1)

Country Link
CN (1) CN116682163A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117912629A (en) * 2024-01-23 2024-04-19 南京引光医药科技有限公司 Clinical medication decision support system



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination