WO2020112147A1 - Method of an interactive health status assessment and system thereof - Google Patents


Info

Publication number
WO2020112147A1
Authority
WO
WIPO (PCT)
Prior art keywords
health status
status assessment
assessment
user
question
Prior art date
Application number
PCT/US2018/063478
Other languages
French (fr)
Inventor
Jung-Hsien Chiang
Pei-Ching Yang
Original Assignee
National Cheng Kung University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by National Cheng Kung University filed Critical National Cheng Kung University
Priority to CN201880098934.XA priority Critical patent/CN113287175B/en
Priority to PCT/US2018/063478 priority patent/WO2020112147A1/en
Priority to JP2021531393A priority patent/JP7285589B2/en
Publication of WO2020112147A1 publication Critical patent/WO2020112147A1/en

Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 10/00 ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H 10/20 ICT specially adapted for the handling or processing of patient-related medical or healthcare data for electronic clinical trials or questionnaires
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 15/00 ICT specially adapted for medical reports, e.g. generation or transmission thereof
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/30 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment

Definitions

  • the present disclosure relates to health status assessment, and more particularly to an interactive health status assessment system and method thereof.
  • robots with health care and companionship functions can help people care for elders at home or elders living alone.
  • Technologizing care work can ease the pressure caused by the shortage of long-term caregivers.
  • robots can assist healthcare professionals with repetitive health education, allowing medical staff to focus on the individual care patients need and thereby improve medical quality.
  • the present disclosure proposes an interactive health status assessment system and an interactive health status assessment method to solve the above problems.
  • An interactive health status assessment method includes: selecting a health status assessment question from a dialog design database and presenting it by using a data output interface, wherein the dialog design database includes a plurality of health status assessment questions, each health status assessment question has a follow-up item, and the follow-up item corresponds to zero or at least one other health status assessment question in the dialog design database; after the health status assessment question is presented on the data output interface, receiving reply information through a multimodal data input interface; determining the type of the reply information with an identification procedure; when the reply information belongs to the numerical type, generating, with a health status assessment program, a first evaluation result according to the reply information and a numerical rule, and storing the first evaluation result in a temporary storage space; when the reply information belongs to the text type, extracting a keyword from the reply information with a semantic understanding program, generating a second evaluation result according to the keyword with the health status evaluation program, and storing the second evaluation result in the temporary storage space; and after the first or second evaluation result is generated, judging whether an interactive end signal is received.
  • An interactive health status assessment system includes: a dialog design database including a plurality of health status assessment questions, wherein each health status assessment question has a follow-up item and the follow-up item corresponds to zero or at least one other health status assessment question in the dialog design database; a data output interface for presenting health status assessment questions, communicatively connected to the dialog design database; a multimodal data input interface for receiving response information; an identification module communicatively connected to the multimodal data input interface, wherein the identification module is used to judge whether the response information belongs to a text type or a numerical type; a semantic interpretation module communicatively connected to the identification module, for retrieving a keyword from text-type response information; and a health status evaluation module communicatively connected to the identification module and the semantic interpretation module, configured to generate a first evaluation result from numerical-type response information and a numerical rule, generate a second evaluation result according to the keyword, store the first and second evaluation results in a temporary storage space, and output an evaluation report according to the data in the temporary storage space.
  • the interactive health status assessment system and method thereof disclosed by the present disclosure can interpret the semantic meaning and better understand the user's intentions and needs through user dialogue.
  • FIG. 1A is a schematic of a block diagram of an interactive health status assessment system according to an embodiment of the present disclosure.
  • FIG. 1B is a schematic of a block diagram of an interactive health status assessment system according to another embodiment of the present disclosure.
  • FIG. 2 is a flow chart of an interactive health status assessment method according to an embodiment of the present disclosure.
  • In addition to paying particular attention to the interaction between the system and the user, the interactive health status assessment system and method thereof disclosed in the present disclosure further evaluate the user's health status through the interactive questions and answers designed by the present disclosure.
  • the following is a description of a hardware architecture embodiment of the interactive health status assessment system of the present disclosure as well as the functions of the components in the system.
  • the present disclosure then introduces another hardware architecture embodiment and its applications, and finally introduces an interactive implementation based on the health status assessment method.
  • FIG. 1A illustrates an interactive health status assessment system according to an embodiment of the present disclosure, including: a dialog design database 102, an identification module 104, a semantic interpretation module 106, a health status evaluation module 108, a data output interface 202, a multimodal data input interface 204 and a control module 206.
  • the dialog design database 102 may be implemented by a memory device; and the identification module 104, the semantic interpretation module 106, the health status evaluation module 108, and control module 206 may be implemented by individual calculating devices or be integrated into a single calculating device.
  • the dialog design database 102 stores a plurality of health status assessment questions in a data sheet format.
  • Each health status assessment question has a follow-up item.
  • the follow-up item corresponds to zero or at least one other health status assessment question in the dialog design database 102.
  • each health status assessment question has a response keyword set corresponding to the follow-up item.
  • each health status assessment question has a trigger keyword set.
  • Table 1 is an example of a data sheet in the dialog design database 102.
  • all health status assessment questions in the dialog design database 102 fall into a variety of content categories, such as sleep, blood glucose, blood pressure, end-of-answer, and the like.
  • each health status assessment question corresponds to a number and a question script. The use of the remaining fields in the data sheet will be described later in relation to the functions of other modules.
  • Table 1 An example of a data sheet for the dialog design database 102.
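The fields the disclosure names for the data sheet (question number, question script, intent category, response keyword set, trigger keyword set, and follow-up item) suggest a layout along the following lines. This is an illustrative sketch only: the Python field names and the Q7 and Q99 question wordings are assumptions, with Q3's script and keywords taken from the examples discussed later in the disclosure.

```python
# Illustrative sketch of a dialog design data sheet. The schema and the
# Q7/Q99 wordings are assumptions; Q3's script, response keywords, and
# "Yes" -> Q7 follow-up come from the examples discussed later.
DIALOG_DESIGN_SHEET = {
    "Q3": {
        "script": "Did you drink something containing caffeine "
                  "before you sleep last night?",
        "intent_category": "sleep",
        "response_keywords": {"Yes", "No"},
        "trigger_keywords": set(),
        # follow-up item: maps a matched response keyword to the
        # number(s) of the next question(s); zero entries end the branch.
        "follow_up": {"Yes": ["Q7"], "No": []},
    },
    "Q7": {
        "script": "Did you sweat or feel unwell during the night?",  # assumed wording
        "intent_category": "sleep",
        "response_keywords": {"Yes", "No"},
        "trigger_keywords": {"cold sweat"},
        "follow_up": {"Yes": [], "No": []},
    },
    "Q99": {
        "script": "Do you have any other comments to offer?",  # end-of-answer question
        "intent_category": "end-of-answer",
        "response_keywords": {"Yes", "No"},
        "trigger_keywords": set(),
        "follow_up": {"Yes": [], "No": []},  # "No" ends the interaction
    },
}
```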
  • the data output interface 202 electrically connects and communicates with the dialog design database 102, and is used to present selected health status assessment questions to the user visually or audibly.
  • the data output interface 202, such as a screen or a speaker, displays text, images, or animated characters, or plays a selected health status assessment question by voice.
  • although the present disclosure gives examples of hardware types for the data output interface 202 and media for the output questions, the disclosure is not thus limited.
  • the multimodal data input interface 204 of the interactive health status assessment system receives the response information from the user.
  • the multimodal data input interface 204 can employ hardware devices such as a microphone, a camera, a touch screen, a physiological information measuring instrument (such as a blood glucose meter), or a computer peripheral device;
  • the input modes to the multimodal data input interface 204 are, for example, voice input, image input, click screen input, Bluetooth transmission or wireless transmission; and the data format of the reply information may include sound, image, trigger signal, text and/or number.
  • the identification module 104 is communicatively coupled to the multimodal data input interface 204, and the identification module 104 is configured to execute an identification procedure, such as software or code running on the processor.
  • the identification procedure is used to determine whether the reply information belongs to a text type or a numerical type.
  • the voice signal received by the microphone is converted into text, or the electronic signal measured by the blood glucose meter is converted into a numerical value.
  • numerical-type data can represent, for example, vital signs such as body temperature, blood pressure, heart rate, and respiratory rate, or physiological data such as blood sugar, blood lipids, and the like.
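A minimal sketch of such an identification procedure: reply information that has already been converted to a string is classified as numerical if it parses as a number, and as text otherwise. The function name and the parsing rule are assumptions; the disclosure requires only that the procedure distinguish the two types.

```python
def identify_reply_type(reply: str) -> str:
    """Classify reply information as 'numerical' or 'text'.

    A simple stand-in for the identification procedure: a reply that
    parses as a number (e.g. a blood glucose reading already converted
    from the meter's electronic signal) is numerical; anything else
    (e.g. speech converted to text) is treated as text.
    """
    try:
        float(reply)
        return "numerical"
    except ValueError:
        return "text"
```

For example, `identify_reply_type("38.5")` yields `"numerical"`, while a transcribed sentence yields `"text"`.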
  • the semantic interpretation module 106 electrically connects and communicates with the identification module 104.
  • the semantic interpretation module 106 is used to execute a semantic interpretation program, such as software or program code running on a processor, or a database with a plurality of keywords.
  • the semantic interpretation program is used to retrieve one or several keywords from the response information belonging to the text type.
  • grammar and semantic analysis are achieved by neural networks or machine learning.
  • the semantic understanding program converts the text into a language framework of ideas, from which the user's intentions and the keywords answered by the user are extracted; for example, understanding that the current user is talking about sleep problems or blood sugar problems.
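The disclosure achieves this analysis with neural networks or machine learning; the substring-matching sketch below is a deliberately simple stand-in that illustrates only the interface of the keyword-retrieval step, not the actual semantic model.

```python
def extract_keywords(text: str, keyword_db: set) -> list:
    """Return the keywords from the keyword database that occur in the reply.

    A rule-based stand-in for the semantic interpretation program: the
    disclosure uses neural networks or machine learning for grammar and
    semantic analysis, whereas this sketch only matches substrings
    (case-insensitively) against a database of known keywords.
    """
    lowered = text.lower()
    return [kw for kw in sorted(keyword_db) if kw.lower() in lowered]
```

Given the reply "I didn't sleep well last night" and a keyword database containing "sleep", the function retrieves the keyword "sleep", so the system can recognize that the user is talking about a sleep problem.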
  • the health status evaluation module 108 electrically connects and communicates with the identification module 104 and the semantic interpretation module 106 for performing a health status evaluation program, such as software or program code running on the processor.
  • the health status assessment program is used to generate a first evaluation result from the response information and a numerical rule corresponding to the numerical type. For example, the value measured by the blood glucose meter is compared with a pre-stored normal-range blood glucose value (the numerical rule) to generate the first evaluation result.
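The numerical-rule comparison might be sketched as follows. The shape of the rule (a low/high pair) and the 70-140 mg/dL blood glucose range in the usage example are illustrative assumptions, not values taken from the disclosure.

```python
def evaluate_numeric(value: float, numeric_rule: tuple) -> dict:
    """Generate a first evaluation result by comparing a measured value
    against a pre-stored normal range (the numerical rule).

    The rule is assumed to be a (low, high) pair; the disclosure only
    states that the measured value is compared with a pre-stored
    normal-range value.
    """
    low, high = numeric_rule
    status = "normal" if low <= value <= high else "abnormal"
    return {"value": value, "range": numeric_rule, "status": status}
```

For instance, a glucose reading of 180 against an assumed normal range of (70, 140) would produce a first evaluation result with status "abnormal".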
  • when the semantic interpretation module 106 retrieves the keywords, the health status assessment module 108 is also operative to generate a second evaluation result based on the keywords and the health status assessment question selected from the dialog design database 102.
  • when the multimodal data input interface 204 obtains information input from the user, the health status evaluation module 108 generates at least one of the first evaluation result and the second evaluation result, and stores the generated evaluation result in a temporary storage space.
  • the user's reply may include only text data or only numeric data; however, it may also include both. For example, the user tells the system, "I didn't sleep well last night and should have a fever of more than 38 degrees." In this example, the health status assessment module 108 will generate a second evaluation result based on keywords such as "sleep" and "bad", and a first evaluation result based on the numerical data "38 degrees".
  • when the interactive health status assessment system according to an embodiment of the present disclosure detects that the user has completed the health status assessment process, or detects that the user has interrupted it, the health status assessment module 108 outputs the evaluation report based on the data stored in the temporary storage space.
  • the interactive health status assessment system may further include a user database for storing a historical assessment report for each user. Therefore, whenever the user generates a new evaluation report through the system, the health status evaluation module 108 can further incorporate the historical data into this new evaluation report, thereby more accurately reflecting the health status of the user.
  • the control module 206 communicatively connects the dialog design database 102, the multimodal data input interface 204, and the health status evaluation module 108, thereby cooperating with the above components to perform an interactive health status assessment process.
  • the control module 206 is, for example, a software or program code running on a processor.
  • upon receiving the interactive completion signal, the control module 206 notifies the health status assessment module 108 to output an evaluation report; when the interactive completion signal is not received, the control module 206 selects another health status assessment question from the follow-up item of the current health status assessment question.
  • the interactive completion signal is generated by the multimodal data input interface 204, such as by a press of the return button on the touch screen or by saying "end interaction" into the microphone, or is generated after the system detects a condition such as the user having responded to a certain number of health status assessment questions in the dialog design database 102, thus achieving a certain level of interaction.
  • control module 206 selects another health status assessment question.
  • the control module 206 further performs: determining whether the keyword conforms to the response keyword set, and selecting the follow-up item corresponding to the response keyword set when the keyword matches it. Please refer to Table 1. For example, when the question currently selected is Q3, "Did you drink something containing caffeine before you sleep last night?", the response keyword set of Q3 contains "Yes" and "No".
  • when the control module 206 confirms that the keyword extracted by the semantic interpretation module 106 contains "Yes", then according to the follow-up item field of Q3 in Table 1, Q7 is selected as the next health status assessment question for the user to answer.
  • when the follow-up item contains several health status assessment questions, one of them is randomly selected as the next question, or the questions are placed in a queue to query the user sequentially.
  • the control module 206 further performs: searching the trigger keyword sets of the other health status assessment questions. For example, when the current question is Q3, the user does not directly answer the question, but describes that he was "cold sweating" last night. When the control module 206 confirms that the keyword extracted by the semantic interpretation module 106 contains "cold sweat", it searches all the "trigger keyword sets" in Table 1, finds Q7, and therefore selects Q7 as the next health status assessment question to be asked. In practice, when the user's reply matches both a "trigger keyword set" and a "response keyword set", the next health status assessment question can be determined according to the priority of the two, or one of them can be randomly selected to be asked first. The present disclosure is not thus limited.
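The two selection paths just described (the current question's response keyword set, then a search of all trigger keyword sets) can be sketched as follows. The mini question table and the fixed priority rule (response keywords checked first) are assumptions for illustration; the disclosure notes the priority may also be configured or randomized.

```python
# Mini question table assumed for illustration, echoing the Q3/Q7
# example: "Yes" to Q3 leads to Q7, and "cold sweat" triggers Q7.
QUESTIONS = {
    "Q3": {"response_keywords": {"Yes", "No"},
           "trigger_keywords": set(),
           "follow_up": {"Yes": "Q7", "No": None}},
    "Q7": {"response_keywords": {"Yes", "No"},
           "trigger_keywords": {"cold sweat"},
           "follow_up": {"Yes": None, "No": None}},
}

def select_next_question(current: str, keywords: list):
    """Pick the next question number, or None when the branch ends."""
    # 1. A keyword matching the current question's response keyword set
    #    routes through that question's follow-up item.
    for kw in keywords:
        if kw in QUESTIONS[current]["response_keywords"]:
            return QUESTIONS[current]["follow_up"][kw]
    # 2. Otherwise, search every question's trigger keyword set and
    #    jump to the first question whose set matches.
    for number, question in QUESTIONS.items():
        if any(kw in question["trigger_keywords"] for kw in keywords):
            return number
    return None
```

With this table, answering "Yes" to Q3 selects Q7 through the follow-up item, while describing "cold sweat" selects Q7 through its trigger keyword set.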
  • the control module 206 further calculates a user completion for each intent category.
  • the "user completion" is, for example, a percentage obtained by dividing the number of questions answered by the user by the total number of questions in the category. Therefore, when the system selects the next health status assessment question, if the follow-up item includes several candidate question numbers, the system will first determine whether the user completion of the intent category to which the current question belongs has reached a preset threshold.
  • if it has not, the system will select from the follow-up item another question that belongs to the same intent category as the current question, until the user completion of that intent category reaches the preset threshold; the system then selects a question from another intent category for the user to continue answering. Even if the user first jumps to another intent category through a "trigger keyword set", the system will return to the previous intent category and continue asking questions after the completion of the jumped-to category is reached.
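The completion calculation above can be sketched as follows; the 80% threshold is an illustrative assumption, since the disclosure states only that a preset threshold is compared against.

```python
def user_completion(answered: int, total: int) -> float:
    """User completion for an intent category: the number of questions
    the user has answered divided by the total number of questions in
    the category, as a percentage."""
    return 100.0 * answered / total

def category_done(answered: int, total: int, threshold: float = 80.0) -> bool:
    """Whether the intent category's completion has reached the preset
    threshold. The default of 80% is an assumed example value."""
    return user_completion(answered, total) >= threshold
```

For example, a user who has answered 4 of the 5 sleep-category questions has an 80% completion, so with an 80% threshold the system would move on to another intent category.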
  • the system will end the interactive questions and answers process.
  • for example, when the system determines that sufficient information has been collected, it can generate an evaluation report for the user to review and leave the interactive questions-and-answers process. As another example, the interactive questions-and-answers process can stop when the user wants to terminate the interaction.
  • the control module 206 will provide Q99 in the dialog design database 102 to confirm whether the user has other comments to offer. If the user selects "No", the interactive questions and answers process is ended.
  • the interactive health status assessment system described in one embodiment of the present disclosure can more closely understand the user's semantics and know more about the user's intentions and needs. At the same time, according to the response information provided by the user, it provides a more accurate health status assessment.
  • FIG. 1B is a block diagram of an interactive health status assessment system according to another embodiment of the present disclosure, including an evaluation device 10 and an interaction device 20 that are communicatively coupled to each other, for example, by a wired or wireless network connection NC.
  • the present disclosure is not thus limited.
  • the evaluation device 10 of this embodiment is, for example, a cloud platform, and includes a dialog design database 102, an identification module 104, a semantic interpretation module 106, a health status assessment module 108, and a communication module 109, wherein the functions and configurations of the components 102-108 have been illustrated previously and are not repeated here.
  • the communication module 109 is electrically connected to the dialog design database 102 and the health status assessment module 108.
  • the communication module 109 converts the selected health status assessment question and the evaluation report into a data format (for example, a network packet) for the selected health status assessment question and the evaluation report to be sent to the interactive device 20 through the network connection NC.
  • the evaluation device 10 may further include databases such as a living resource, a learning resource, a content resource, a language database, a user database, and the like.
  • the user database is communicatively connected with the health status assessment module 108, and is used to identify the corresponding user according to the user's facial image or the user's voice, and the evaluation report would also be stored in this user database.
  • the interactive device 20 is, for example, a mobile vehicle (for example, a robot) or a mobile device (for example, a smart phone or a tablet) that can be equipped with an imaging device, and the interactive health status assessment system is in the form of an application (APP) installed on the interaction device 20.
  • the interaction device 20 includes a data output interface 202, a multimodal data input interface 204, a control module 206, a detection device 208, and a communication module 209.
  • the communication module 209 communicatively couples to the data output interface 202, the multimodal data input interface 204 and the control module 206, and the control module 206 is communicatively coupled to the detection device 208 in this embodiment.
  • the detection device 208 is, for example, an imaging device or a microphone for obtaining image data or audio data, and the obtained data is sent to the control module 206 to determine whether a user is detected. Specifically, when the image data contains a user's facial image or the audio data contains the user's voice, the control module 206 selects one health status assessment question from the dialog design database 102 via the network connection NC through the communication module 109.
  • Embodiment 1: The system is installed in a mobile vehicle with an imaging device. Through the imaging device, the system can detect whether the user is nearby, send a greeting to the user, and activate the system functions by a dialogue or command. After the user operates the system functions, the system uploads the operation history to the cloud database and updates the functions of the system-related components. Finally, the system exports charts and reports to the user, providing further learning and reference.
  • Embodiment 2: The system is installed in the mobile vehicle and can actively engage in dialogue and interaction with the user. Through the process of interacting and establishing the knowledge base, the system can understand the concepts the user lacks and provide the user with a corresponding learning list.
  • Embodiment 3: The system is installed in a mobile device and can actively engage in dialogue and interaction with the user. By establishing a database of related information such as the process, time, situation, and lifestyle of the dialogue, the system can recommend a personalized list to the user for greater convenience.
  • FIG. 2 is a flowchart of executing of an interactive health status assessment method according to an embodiment of the present disclosure.
  • step S10 determining whether the user is detected.
  • the detecting device 208 obtains image data or audio data, and the control module 206 determines whether the image data includes the user's facial image or the audio data includes the user's voice.
  • if the user is detected, the process goes to step S12; otherwise, the detection device 208 continues the detecting process.
  • the system can detect the user through the detecting device 208 and send a greeting.
  • the user can also directly send a command to the system through the multimodal data input interface 204 to start the system to have a dialogue.
  • step S12 selecting and presenting a health status assessment question.
  • the system selects a health status assessment question from the dialog database 102 and presents the health status assessment question by the data output interface 202.
  • step S14 receiving the response information.
  • the system receives the response information of the user's voice, image, or physiological measurement signal through the multimodal data input interface 204.
  • the voice data is received through the sensing module
  • the trigger signal is received through the touch screen
  • the Bluetooth input signal or the wireless input signal is received through the communication module.
  • step S20 determining the response information type.
  • the system determines whether the response information belongs to a numeric type or a text type by the identification module 104.
  • when the response information belongs to the numerical type, the health status evaluation module 108 generates a first evaluation result according to the response information and the preset numerical rule.
  • when the response information belongs to the text type, the semantic interpretation module 106 retrieves the keyword from the response information, and then the health status assessment module 108 generates a second evaluation result based on the keyword.
  • the health status assessment module 108 further stores at least one of the above two evaluation results in a temporary storage space.
  • step S30 determining whether an interactive terminating signal is received.
  • when the interactive terminating signal is not received, another health status assessment question is selected from the follow-up item of the current health status assessment question, and the process returns to step S12.
  • determining whether the interactive terminating signal is received includes the following methods: receiving a trigger signal representing termination of the interaction; or determining that the follow-up item of the current health status assessment question corresponds to zero other health status assessment questions.
  • when the interactive terminating signal is received, the health status assessment module 108 outputs an evaluation report based on the data in the temporary storage space, and presents the evaluation report through the data output interface 202, as shown in step S32.
  • the method further includes: searching for a storage space of the corresponding user according to the user's facial image or the user's voice; and storing the evaluation report in the storage space corresponding to the current user.
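The flow of steps S10 through S32 can be summarized as the following loop skeleton; every callable passed in is a placeholder standing in for one of the modules described above (detection device 208, control module 206, identification module 104, health status assessment module 108, and so on), not an API defined by the disclosure.

```python
def run_assessment(detect_user, select_question, present, receive,
                   identify, evaluate, end_signal_received, output_report):
    """Skeleton of the interactive assessment method of FIG. 2.

    Each argument is a placeholder callable for a module described in
    the disclosure; only the control flow (S10 -> S12 -> S14 -> S20 ->
    S30 -> S32) is sketched here.
    """
    while not detect_user():                 # step S10: wait for a user
        pass
    temp_storage = []                        # temporary storage space
    question = select_question(None, None)   # first question from database
    while question is not None:
        present(question)                    # step S12: present question
        reply = receive()                    # step S14: receive response
        kind = identify(reply)               # step S20: numeric or text
        temp_storage.append(evaluate(kind, reply))
        if end_signal_received():            # step S30: terminating signal?
            break
        question = select_question(question, reply)  # follow-up item
    return output_report(temp_storage)       # step S32: evaluation report
```

A caller supplies concrete implementations for each placeholder; the loop ends either when the terminating signal is received or when the follow-up item yields no further question.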
  • the interactive health status assessment system and method thereof can be introduced into a home robot or an APP and applied in a family setting to meet companionship needs, easing the pressure of an aging society and long-term care. It can also be applied in medical care fields to take over repetitive health education from medical staff, so that medical staff can focus on individual patient care to improve and optimize medical quality.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Biomedical Technology (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Pathology (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

A system of interactive health status assessment comprises a dialogue database including a plurality of health status assessment questions presented by a data output interface; a multimodal data input interface configured to receive response information; an identification module configured to determine the type of the response information; a semantic interpretation module configured to extract a keyword from text-type response information; a health status assessment module configured to generate a first assessment result according to numeric-type response information, generate a second assessment result according to the keyword, and output an assessment report according to the assessment results; and a control module configured to selectively notify the health status assessment module to output the assessment report or to select another health status assessment question from a follow-up item of the selected health status assessment question.

Description

METHOD OF AN INTERACTIVE HEALTH STATUS ASSESSMENT AND SYSTEM
THEREOF
FIELD OF THE DISCLOSURE
The present disclosure relates to health status assessment, and more particularly to an interactive health status assessment system and method thereof.
BACKGROUND
The arrival of an aging society has made the long-term care of elderly people one of the most pressing issues in modern society and families. On the other hand, with the rapid development of robot interaction and artificial intelligence, the homecare services that can be provided are more and more diversified. For example, robots with health care and companionship functions can help people care for elders at home or elders living alone. Technologizing care work can ease the pressure caused by the shortage of long-term caregivers. In medical care situations, robots can assist healthcare professionals with repetitive health education, allowing medical staff to focus on the individual care patients need and thereby improve medical quality.
However, existing systems that combine artificial intelligence technology and health care lack an interactive mechanism with the person being cared for. We communicate using words, a form of unstructured data. If the software of such a system lacks deep semantic understanding, it can only rely on experience with similar things or similar users to assist its judgment. Thus, such systems usually fail to provide smarter health status assessment results based on the current user's preferences, causality, and context, and often have difficulty accurately detecting the user's intention when interacting with the user.
SUMMARY
In view of this, the present disclosure proposes an interactive health status assessment system and an interactive health status assessment method to solve the above problems.
An interactive health status assessment method according to an embodiment of the present disclosure includes: selecting a health status assessment question from a dialog design database and presenting the health status assessment question by a data output interface, wherein the dialog design database includes a plurality of health status assessment questions, each health status assessment question has a follow-up item, and the follow-up item corresponds to zero or at least one other health status assessment question in the dialog design database; receiving reply information by a multimodal data input interface after the health status assessment question is presented by the data output interface; determining the type of the reply information by an identification procedure; when the reply information belongs to a numerical type, generating, by a health status assessment procedure, a first evaluation result according to the reply information and a numerical rule, and storing the first evaluation result in a temporary storage space; when the reply information belongs to a text type, extracting a keyword from the reply information by a semantic interpretation procedure, generating, by the health status assessment procedure, a second evaluation result according to the keyword, and storing the second evaluation result in the temporary storage space; after the first evaluation result or the second evaluation result is generated, determining whether an interactive end signal is received; when the interactive end signal has been received, outputting, by the health status assessment procedure, an evaluation report according to the data in the temporary storage space; and when the interactive end signal is not received, selecting another health status assessment question from the follow-up item.
An interactive health status assessment system according to an embodiment of the present disclosure includes: a dialog design database including a plurality of health status assessment questions, wherein each health status assessment question has a follow-up item, and the follow-up item corresponds to zero or at least one other health status assessment question in the dialog design database; a data output interface communicatively connected to the dialog design database and configured to present the health status assessment question; a multimodal data input interface for receiving response information; an identification module communicatively connected to the multimodal data input interface, wherein the identification module is used to determine whether the response information belongs to a text type or a numerical type; a semantic interpretation module communicatively connected to the identification module, for extracting a keyword from the response information belonging to the text type; a health status evaluation module communicatively connected to the identification module and the semantic interpretation module, and configured to generate a first evaluation result from the response information belonging to the numerical type and a numerical rule, to generate a second evaluation result according to the keyword, to store the first and second evaluation results in a temporary storage space, and to output an evaluation report according to the data in the temporary storage space; and a control module communicatively connected to the dialog design database, the multimodal data input interface and the health status evaluation module, wherein the control module is configured to notify the health status evaluation module to output the evaluation report when an interactive end signal is received, and to select another health status assessment question from the follow-up item of the health status assessment question when the interactive end signal is not received.
With the above technology, the interactive health status assessment system and method thereof disclosed by the present disclosure can interpret semantic meaning and better understand the user's intentions and needs through dialogue with the user.
The above description of the disclosure and the following embodiments are intended to illustrate and explain the spirit and principles of the invention, and to provide further explanation of the scope of the invention.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1A is a schematic of a block diagram of an interactive health status assessment system according to an embodiment of the present disclosure.
FIG. 1B is a schematic of a block diagram of an interactive health status assessment system according to another embodiment of the present disclosure.
FIG. 2 is a flow chart of an interactive health status assessment method according to an embodiment of the present disclosure.
DETAILED DESCRIPTION
The detailed features and advantages of the present disclosure are described in detail below in the embodiments of the present disclosure. The features and advantages associated with the present disclosure can be readily understood by those skilled in the art. The following examples are intended to describe the present disclosure in further detail, but do not limit the scope of the invention in any way.
The interactive health status assessment system and method thereof disclosed in the present disclosure, in addition to paying particular attention to the interaction between the system and the user, further evaluate the user's health status through the interactive questions and answers designed by the present disclosure. The following first describes a hardware architecture embodiment of the interactive health status assessment system and the functions of the components in the system, then introduces another hardware architecture embodiment and its applications, and finally introduces an interactive implementation of the health status assessment method.
Please refer to FIG. 1A, which illustrates an interactive health status assessment system according to an embodiment of the present disclosure, including: a dialog design database 102, an identification module 104, a semantic interpretation module 106, a health status evaluation module 108, a data output interface 202, a multimodal data input interface 204 and a control module 206. Specifically, the dialog design database 102 may be implemented by a memory device; and the identification module 104, the semantic interpretation module 106, the health status evaluation module 108, and the control module 206 may be implemented by separate computing devices or be integrated into a single computing device.
The dialog design database 102 stores a plurality of health status assessment questions in a data sheet format. Each health status assessment question has a follow-up item. The follow-up item corresponds to zero or at least one other health status assessment question in the dialog design database 102. In an embodiment, each health status assessment question has a response keyword set corresponding to the follow-up item. In another embodiment, each health status assessment question has a trigger keyword set.
Please refer to Table 1 below, which is an example of a data sheet in the dialog design database 102. In one embodiment, all health status assessment questions in the dialog design database 102 fall into a variety of intent categories, such as sleep, blood glucose, blood pressure, end-of-answer, and the like. In the data sheet, each health status assessment question corresponds to a number and a question script. The use of the remaining fields in the data sheet will be described later in relation to the functions of other modules.
Table 1 : An example of a data sheet for the dialog design database 102.
Number | Intent category | Question script | Response keyword set | Follow-up item | Trigger keyword set
... | ... | ... | ... | ... | ...
Q3 | sleep | "Did you drink something containing caffeine before you sleep last night?" | "Yes"; "No" | "Yes": Q7 | ...
Q7 | sleep | ... | ... | ... | "cold sweat"
... | ... | ... | ... | ... | ...
Q99 | end-of-answer | (confirms whether the user has other comments to offer) | "Yes"; "No" | (empty) | ...
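The data sheet structure described above can be sketched in code. The following is a minimal, hypothetical representation; the field names, and any question scripts or numbers beyond those quoted in this description, are illustrative assumptions rather than the actual schema of the dialog design database 102.

```python
# Hypothetical in-memory layout of the Table 1 data sheet.
# Each question carries an intent category, a question script,
# a response keyword set mapping replies to follow-up items,
# and a trigger keyword set.
questions = {
    "Q3": {
        "intent": "sleep",
        "script": ("Did you drink something containing caffeine "
                   "before you sleep last night?"),
        "response_keywords": {"Yes": ["Q7"], "No": []},
        "trigger_keywords": [],
    },
    "Q7": {
        "intent": "sleep",
        "script": "(further sleep-related question)",  # placeholder script
        "response_keywords": {},
        "trigger_keywords": ["cold sweat"],
    },
    "Q99": {
        "intent": "end-of-answer",
        "script": "(confirms whether the user has other comments)",
        "response_keywords": {"No": []},  # empty follow-up item ends the dialogue
        "trigger_keywords": [],
    },
}
```

With this layout, the follow-up item is looked up per response keyword, and an empty list marks the end of the interaction.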
Please continue to refer to FIG. 1A. The data output interface 202 electrically connects and communicates with the dialog design database 102, and the data output interface 202 is used to present selected health status assessment questions to the user visually or audibly. In practice, the data output interface 202, such as a screen or a speaker, displays text, images or animated characters, or plays a selected health status assessment question by voice. Although the present disclosure shows examples of the hardware types of the data output interface 202 and of the media of the output questions, it is not thus limited.
After the user obtains a health status assessment question from the data output interface 202, the multimodal data input interface 204 of the interactive health status assessment system according to this embodiment of the present disclosure receives the response information from the user. In practice, the multimodal data input interface 204 can employ hardware devices such as a microphone, a camera, a touch screen, a physiological information measuring instrument (such as a blood glucose meter), or a computer peripheral device; the input modes of the multimodal data input interface 204 are, for example, voice input, image input, touch screen input, Bluetooth transmission or wireless transmission; and the data format of the response information may include sound, image, trigger signal, text and/or number.
Please continue to refer to FIG. 1A. The identification module 104 is communicatively coupled to the multimodal data input interface 204, and the identification module 104 is configured to execute an identification procedure, such as software or program code running on a processor. The identification procedure is used to determine whether the response information belongs to a text type or a numerical type. For example, the voice signal received by the microphone (multimodal data input interface 204) is converted into text, or the electronic signal measured by the blood glucose meter is converted into a numerical value. Specifically, numerical data can represent, for example, vital signs such as body temperature, blood pressure, heart rate and respiratory rate, or physiological data such as blood sugar, blood lipids, and the like.
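As an illustrative sketch only (the disclosed identification module may first apply speech-to-text conversion or instrument-signal decoding), a minimal identification procedure classifying a reply string as numerical or text could look like this:

```python
def identify_reply_type(reply):
    """Classify reply information as 'numeric' or 'text'.

    Hypothetical sketch: anything parseable as a number is treated
    as numerical-type response information; everything else as text.
    """
    try:
        float(reply)
        return "numeric"
    except (TypeError, ValueError):
        return "text"
```
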
Please continue to refer to FIG. 1A. The semantic interpretation module 106 electrically connects and communicates with the identification module 104. The semantic interpretation module 106 is used to execute a semantic interpretation program, such as software or program code running on a processor, together with a database of keywords. The semantic interpretation program is used to retrieve one or several keywords from the response information belonging to the text type. Grammar and semantic analysis are achieved by a neural network or machine learning. In other words, the semantic interpretation program converts the text into a language framework of ideas, from which the user's intentions and the keywords answered by the user are extracted, for example, understanding that the current user is talking about sleep problems or blood sugar problems.
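For illustration, keyword retrieval can be approximated by simple vocabulary matching; this stands in for the neural-network or machine-learning analysis described above and is an assumption, not the disclosed implementation:

```python
def extract_keywords(text, vocabulary):
    """Naive keyword retrieval by vocabulary lookup.

    Hypothetical stand-in for the semantic interpretation program:
    returns every vocabulary keyword found in the text reply.
    """
    text_lower = text.lower()
    return [kw for kw in vocabulary if kw.lower() in text_lower]
```
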
Please continue to refer to FIG. 1A. The health status evaluation module 108 electrically connects and communicates with the identification module 104 and the semantic interpretation module 106 for performing a health status evaluation program, such as software or program code running on a processor. The health status assessment program is used to generate a first evaluation result from the response information belonging to the numerical type and a numerical rule. For example, the value measured by the blood glucose meter is compared with a pre-stored normal blood glucose range (the numerical rule) to generate the first evaluation result. In addition, when the semantic interpretation module 106 retrieves the keywords, the health status assessment module 108 also generates a second evaluation result based on the keywords and the health status assessment question selected from the dialog design database 102.
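A minimal sketch of the numerical-rule comparison follows; the range below is a placeholder assumed for demonstration, not a medical reference value:

```python
# Hypothetical numerical rule: a pre-stored "normal range" as (low, high).
EXAMPLE_GLUCOSE_RULE = (70, 100)  # illustrative bounds only

def first_evaluation(value, rule=EXAMPLE_GLUCOSE_RULE):
    """Compare a measured value against a numerical rule to produce
    the first evaluation result."""
    low, high = rule
    if value < low:
        return "below normal range"
    if value > high:
        return "above normal range"
    return "within normal range"
```
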
When the multimodal data input interface 204 obtains information input from the user, the health status evaluation module 108 generates at least one of the first evaluation result and the second evaluation result, and stores the generated evaluation result in a temporary storage space. The user's reply may include only text data or only numeric data; however, it may also include both. For example, the user tells the system, "I didn't sleep well last night and should have a fever of more than 38 degrees." In this example, the health status assessment module 108 will generate a second evaluation result based on keywords such as "sleep" and "bad", and generate a first evaluation result based on the numerical data "38 degrees". When the interactive health status assessment system according to an embodiment of the present disclosure detects that the user has completed the health status assessment process, or detects that the user has interrupted it, the health status assessment module 108 outputs the evaluation report based on the data stored in the temporary storage space. In another embodiment of the present disclosure, the interactive health status assessment system may further include a user database for storing a historical assessment report for each user. Therefore, whenever the user generates a new evaluation report through the system, the health status evaluation module 108 can further incorporate the historical data into the new evaluation report, thereby more accurately reflecting the user's health status.
Please continue to refer to FIG. 1A. The control module 206 communicatively connects to the dialog design database 102, the multimodal data input interface 204, and the health status evaluation module 108, thereby cooperating with the above components to perform an interactive health status assessment process. In practice, the control module 206 is, for example, software or program code running on a processor. Upon receiving the interactive end signal, the control module 206 notifies the health status assessment module 108 to output an evaluation report; when the interactive end signal is not received, the control module 206 selects another health status assessment question from the follow-up item of the current health status assessment question. The interactive end signal is generated by the multimodal data input interface 204, such as by a press of the return button on the touch screen or by saying "end interaction" into the microphone, or is generated after the system detects, for example, that the user has responded to a certain number of health status assessment questions in the dialog design database 102 so that a certain level of interaction is achieved.
The situation in which the control module 206 selects another health status assessment question is described below. In another embodiment of the present disclosure, the control module 206 further performs: determining whether the keyword conforms to the response keyword set, and selecting the follow-up item corresponding to the response keyword set when the keyword matches the response keyword set. Please refer to Table 1. For example, when the question currently selected is Q3, "Did you drink something containing caffeine before you sleep last night?", the response keyword set of Q3 contains "Yes" and "No". When the control module 206 confirms that the keyword extracted by the semantic interpretation module 106 contains "Yes", Q7 is selected as the next health status assessment question for the user to answer, according to the follow-up item field of Q3 in Table 1. When there are multiple question numbers in the follow-up item, one of them is randomly selected as the next health status assessment question, or the questions are placed in a queue to query the user sequentially.
In another embodiment of the present disclosure, the control module 206 further performs: searching the trigger keyword sets of the other health status assessment questions for one matching the extracted keyword. For example, when the current question is Q3, the user does not directly answer the question, but describes that he was "cold sweating" last night. When the control module 206 confirms that the keyword extracted by the semantic interpretation module 106 contains "cold sweat", it searches all the trigger keyword sets in Table 1, finds Q7, and therefore selects Q7 as the next health status assessment question to be asked. In practice, when the user's reply matches both a trigger keyword set and a response keyword set, the next health status assessment question can be determined according to the priority of the two, or one of them can be randomly selected as the first question. The present disclosure is not thus limited.
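The two selection mechanisms above (the response keyword set first, then the trigger keyword sets) can be sketched as follows; the data layout and the first-match priority are illustrative assumptions:

```python
def select_next_question(questions, current_id, keywords):
    """Sketch of the control module's next-question selection.

    1) Match extracted keywords against the current question's
       response keyword set and take its follow-up item.
    2) Otherwise, search every question's trigger keyword set.
    Returning None corresponds to an empty follow-up item,
    which ends the interaction.
    """
    current = questions[current_id]
    lowered = [k.lower() for k in keywords]
    # Mechanism 1: response keyword set -> follow-up item
    for response_kw, follow_ups in current.get("response_keywords", {}).items():
        if response_kw.lower() in lowered and follow_ups:
            return follow_ups[0]  # could also be random or queued
    # Mechanism 2: trigger keyword set of any question
    for qid, question in questions.items():
        if any(t.lower() in lowered for t in question.get("trigger_keywords", [])):
            return qid
    return None
```
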
In practice, in the dialog design database 102, the question numbers set in the follow-up item mostly belong to the same intent category. Therefore, the system can ask more comprehensive questions about the same intent category in order to gain a deeper understanding of the user's situation. In still another embodiment of the present disclosure, the control module 206 further calculates a user completion degree for each intent category. The user completion degree is, for example, a percentage obtained by dividing the number of questions answered by the user by the total number of questions in the category. Therefore, when the system selects the next health status assessment question, if the follow-up item includes several candidate question numbers, the system first determines whether the user completion degree of the intent category to which the current question belongs has reached a preset threshold. If not, the system selects another question from the follow-up item that belongs to the same intent category as the current question, until the completion degree of the intent category reaches the preset threshold; then the system selects a question of another intent category for the user to continue answering. Even if the user first jumps to another intent category through a trigger keyword set, the system returns to the previous intent category and continues asking questions after the completion degree of the intent category jumped to has been reached.
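The user completion degree and its threshold test follow directly from the definition above; the 80% threshold in this sketch is an arbitrary example, not a value specified by the disclosure:

```python
def user_completion(answered, total):
    """Percentage of questions answered within one intent category."""
    return 100.0 * answered / total if total else 100.0

def category_done(answered, total, threshold=80.0):
    """Stay within the current intent category until the preset
    threshold (hypothetical default: 80%) is reached."""
    return user_completion(answered, total) >= threshold
```
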
When the follow-up item of the current health status assessment question in the dialog design database 102 is empty (as for Q99), the system ends the interactive questions-and-answers process. In the foregoing case, for example, after the user completes a certain number of interactive questions and answers, the system determines that sufficient information has been collected, generates an evaluation report for the user to review, and thus leaves the interactive questions-and-answers process. Alternatively, the interactive questions-and-answers process can stop when the user wants to terminate the interaction. In the questions-and-answers process, the control module 206 provides Q99 in the dialog design database 102 to confirm whether the user has other comments to offer. If the user selects "No", the interactive questions-and-answers process ends. With the above mechanism for selecting the next health status assessment question, the interactive health status assessment system described in one embodiment of the present disclosure can follow the user's semantics more closely and learn more about the user's intention and needs. At the same time, according to the response information provided by the user, it provides a more accurate health status assessment.
Please refer to FIG. 1B, which is a block diagram of an interactive health status assessment system according to another embodiment of the present disclosure, including an evaluation device 10 and an interactive device 20 that are communicatively coupled to each other, for example, by a wired or wireless network connection NC. However, the present disclosure is not thus limited.
The evaluation device 10 of this embodiment is, for example, a cloud platform, and includes a dialog design database 102, an identification module 104, a semantic interpretation module 106, a health status assessment module 108, and a communication module 109, wherein the functions and configurations of the components 102-108 have been illustrated previously and thus are not repeated here.
Specifically, the communication module 109 is electrically connected to the dialog design database 102 and the health status assessment module 108. The communication module 109 converts the selected health status assessment question and the evaluation report into a data format (for example, a network packet) to be sent to the interactive device 20 through the network connection NC.
In addition, the evaluation device 10 may further include databases such as a living resource, a learning resource, a content resource, a language database, a user database, and the like. In the interactive health status assessment system according to an embodiment of the present disclosure, the user database is communicatively connected with the health status assessment module 108 and is used to identify the corresponding user according to the user's facial image or the user's voice; the evaluation report is also stored in this user database.
The interactive device 20 is, for example, a mobile vehicle (for example, a robot) or a mobile device (for example, a smart phone or a tablet) that can be equipped with an imaging device, and the interactive health status assessment system takes the form of an application (APP) installed on the interactive device 20. The interactive device 20 includes a data output interface 202, a multimodal data input interface 204, a control module 206, a detection device 208, and a communication module 209. For the functions and configurations of components 202-206, please refer to the previous section; the details are not repeated here. The communication module 209 communicatively couples to the data output interface 202, the multimodal data input interface 204 and the control module 206, and the control module 206 is communicatively coupled to the detection device 208 in this embodiment.
The detection device 208 is, for example, an imaging device or a microphone for obtaining image data or audio data, and the obtained data is sent to the control module 206 to determine whether a user is detected. Specifically, when the image data contains a user's facial image or the audio data contains the user's voice, the control module 206 selects a health status assessment question from the dialog design database 102 via the network connection NC through the communication module 109.
Based on the above two embodiments, it can be understood that, besides integrating all the components of the interactive health status assessment system into one hardware device, the three embodiments described below are also possible.
Embodiment 1: The system is installed in a mobile vehicle with an imaging device; the system can detect through the imaging device whether the user is around, send a greeting to the user, and activate the system functions by a dialogue or command. After the user operates the system functions, the system uploads the operation history to the cloud database and updates the functions of the system-related components. Finally, the system exports charts and reports to the user for further learning and reference.
Embodiment 2: The system is installed in the mobile vehicle, and the system can actively engage in dialogue and interaction with the user. Through the process of interacting and establishing the knowledge base, the system can understand the concepts that the user lacks and provide the user with a corresponding learning list.
Embodiment 3: The system is installed in the mobile device, and the system can actively engage in dialogue and interaction with the user. Through establishing a database of related information such as the process, time, situation and lifestyle of the dialogue, the system can recommend a personalized list to the user to improve convenience.
Please refer to FIG. 2, which is a flowchart of an interactive health status assessment method according to an embodiment of the present disclosure. Please refer to step S10: determining whether a user is detected. In detail, the detection device 208 obtains image data or audio data, and the control module 206 determines whether the image data includes the user's facial image or the audio data includes the user's voice. When it does, the process goes to step S12; otherwise the detection device 208 continues detecting. In addition, the system can detect the user through the detection device 208 and send a greeting, or the user can directly send a command to the system through the multimodal data input interface 204 to start a dialogue.
Please refer to step S12: selecting and presenting a health status assessment question. In particular, the system selects a health status assessment question from the dialog design database 102 and presents it through the data output interface 202.
Please refer to step S14: receiving the response information. In detail, the system receives the user's response information, such as voice, image, or a physiological measurement signal, through the multimodal data input interface 204. For example, voice data is received through a sensing module, a trigger signal is received through the touch screen, or a Bluetooth or wireless input signal is received through a communication module.
Please refer to step S20: determining the response information type. In detail, the system determines whether the response information belongs to a numeric type or a text type by the identification module 104. When the response information belongs to the numerical type, referring to step S22, the health status evaluation module 108 generates a first evaluation result according to the response information and the preset numerical rule. On the other hand, when the response information belongs to the text type, referring to steps S24 to S26, the semantic interpretation module 106 retrieves the keyword from the response information, and then the health status assessment module 108 generates a second evaluation result based on the keyword. The health status assessment module 108 further stores at least one of the above two evaluation results in a temporary storage space.
Please refer to step S30: determining whether an interactive terminating signal is received. In detail, when the interactive terminating signal is not received, another health status assessment question is selected from the follow-up item of the current health status assessment question, and the process returns to step S12. The interactive terminating signal is determined to be received in the following cases: a trigger signal representing termination of the interaction is received, or the follow-up item of the current health status assessment question corresponds to zero health status assessment questions.
Conversely, when the system has received the interactive terminating signal, the health status assessment module 108 outputs an evaluation report based on the data in the temporary storage space, and presents the evaluation report through the data output interface 202, as shown in step S32. In addition, before outputting the evaluation report, the method further includes: searching for a storage space of the corresponding user according to the user's facial image or the user's voice; and storing the evaluation report in the storage space corresponding to the current user.
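The overall flow of FIG. 2 (steps S12 through S32) can be sketched as a loop; every callable passed in stands in for one of the modules described above, and all names are hypothetical:

```python
def run_assessment(select_question, present, receive, identify,
                   evaluate_numeric, extract_keywords, evaluate_text,
                   end_signal):
    """Hypothetical sketch of the FIG. 2 flow.

    select_question(prev, reply) returns the next question id or None;
    identify(reply) returns 'numeric' or 'text'; end_signal() reports
    whether the interactive terminating signal has been received.
    """
    results = []                                  # temporary storage space
    question = select_question(None, None)
    while question is not None:
        present(question)                         # S12: present question
        reply = receive()                         # S14: receive response
        if identify(reply) == "numeric":          # S20: determine type
            results.append(evaluate_numeric(reply))          # S22
        else:
            kws = extract_keywords(reply)                    # S24
            results.append(evaluate_text(question, kws))     # S26
        if end_signal():                          # S30: terminating signal?
            break
        question = select_question(question, reply)
    return results  # the assessment report is built from these (S32)
```
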
In summary, the interactive health status assessment system and method thereof according to the present disclosure can be introduced into a home robot or an APP and applied in the family to meet companionship needs and relieve the pressure of an aging society and long-term care. It can also be applied in the medical care field to replace repetitive health education by medical staff, so that medical staff can focus on the individual care of patients to improve and optimize medical quality.
Although the present disclosure has been disclosed in the foregoing embodiments, they are not intended to limit the invention. Within the scope of the invention, modifications can be made without departing from its spirit. Please refer to the appended claims for the scope of protection defined by the present disclosure.

Claims

What is claimed is:
1. An interactive health status assessment method, comprising:
selecting a health status assessment question from a dialog design database, and presenting the health status assessment question by a data output interface, wherein the dialog design database comprises a plurality of health status assessment questions, and each of the health status assessment questions has a follow-up item corresponding to zero or at least one other health status assessment question in the dialog design database;
receiving a reply message by a multimodal data input interface after the health status assessment question is presented by the data output interface;
determining the type of the reply message by an identification procedure;
generating a first assessment result according to the reply message and a numerical rule by a health status assessment procedure and storing the first assessment result in a temporary storage space when the reply message is of a numerical type;
extracting a keyword from the reply message by a semantic interpretation procedure, generating a second assessment result according to the keyword by the health status assessment procedure, and storing the second assessment result in the temporary storage space when the reply message is of a text type;
determining whether an interaction end signal is received after generating the first assessment result or the second assessment result;
outputting an assessment report according to the data in the temporary storage space by the health status assessment procedure when the interaction end signal has been received; and selecting another health status assessment question wherein said another health status assessment question is selectively selected from the follow-up item when the interaction end signal has not been received.
2. The interactive health status assessment method according to claim 1, wherein each of the health status assessment questions has a response keyword set corresponding to the follow-up item, and selecting said another health status assessment question further comprises:
selecting the follow-up item corresponding to the response keyword set when the keyword meets the response keyword set.
3. The interactive health status assessment method according to claim 1, wherein each of the health status assessment questions has a trigger keyword set, and selecting said another health status assessment question further comprises:
finding a corresponding one from the trigger keyword sets of the health status assessment questions as said another health status assessment question.
4. The interactive health status assessment method according to claim 1, wherein before selecting the health status assessment question from the dialog design database, the method further comprises:
acquiring an image data or a sound data by a detecting device; and
determining whether the image data has a facial image of a user or whether the sound data has a voice signal of the user; and
starting to select the health status assessment question from the dialog design database when the image data has the facial image of the user or the sound data has the voice signal of the user.
5. The interactive health status assessment method according to claim 1, wherein determining whether the interaction end signal is received comprises:
receiving a trigger signal representing an end of the interaction, or determining that the follow-up item of the health status assessment question corresponds to zero health status assessment questions.
6. The interactive health status assessment method according to claim 4, wherein before outputting the assessment report, the method further comprises:
searching a user database for a storage space corresponding to the user according to the facial image of the user or the voice signal of the user; and
storing the assessment report in the storage space.
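The method of claims 1 through 6 can be read as a question-and-reply loop: present a question, classify the reply as numerical or text, generate an assessment result, and either end or select a follow-up question. The following is a minimal sketch under that reading; every identifier is hypothetical, and the "numerical rule" is reduced to a simple range check for illustration:

```python
# Illustrative sketch of the claimed assessment loop; all names are hypothetical.

def is_numerical(reply: str) -> bool:
    """Identify whether a reply message belongs to the numerical type."""
    try:
        float(reply)
        return True
    except ValueError:
        return False

def run_assessment(dialog_db, numerical_rules, ask, read_reply, end_requested):
    """dialog_db maps a question id to (text, follow_up_ids, response_keywords)."""
    results = []                      # temporary storage space
    qid = next(iter(dialog_db))       # initial health status assessment question
    while qid is not None:
        text, follow_ups, keywords = dialog_db[qid]
        ask(text)
        reply = read_reply()
        if is_numerical(reply):
            # first assessment result: compare the value against a numerical rule
            low, high = numerical_rules.get(qid, (float("-inf"), float("inf")))
            results.append((qid, "normal" if low <= float(reply) <= high else "abnormal"))
        else:
            # second assessment result: extract a keyword from the text-type reply
            matched = next((k for k in keywords if k in reply), None)
            results.append((qid, matched))
        # end of interaction: explicit signal, or zero follow-up questions remain
        if end_requested() or not follow_ups:
            break
        qid = follow_ups[0]           # select another question from the follow-up item
    return results                    # data on which the assessment report is based
```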
7. An interactive health status assessment system, including:
a dialog design database comprising a plurality of health status assessment questions, each of the health status assessment questions having a follow-up item corresponding to zero or at least one other health status assessment question in the dialog design database;
a data output interface communicatively connected to the dialog design database, wherein the data output interface is configured to present the health status assessment question;
a multimodal data input interface configured to receive a reply message;
an identification module connected to the multimodal data input interface, wherein the identification module is configured to determine whether the reply message belongs to a text type or a numerical type;
a semantic interpretation module communicatively connected to the identification module, wherein the semantic interpretation module is configured to extract a keyword from the reply message belonging to the text type;
a health status assessment module communicatively connected to the identification module and the semantic interpretation module, wherein the health status assessment module is configured to generate a first assessment result according to the reply message belonging to the numerical type and a numerical rule, to generate a second assessment result according to the keyword, to store the first and the second assessment results in a temporary storage space, and to output an assessment report based on the data stored in the temporary storage space; and
a control module communicatively connected to the dialog design database, the multimodal data input interface and the health status assessment module, wherein the control module is configured to notify the health status assessment module to output the assessment report when an interaction end signal is received, and to select another health status assessment question from the follow-up item when the interaction end signal is not received.
8. The interactive health status assessment system of claim 7, wherein each of the health status assessment questions has a response keyword set corresponding to the follow-up item, and the control module is further configured to determine whether the keyword matches the response keyword set, and to select the follow-up item corresponding to the response keyword set when the keyword matches the response keyword set.
9. The interactive health status assessment system of claim 7, wherein each of the health status assessment questions further has a trigger keyword set, and the control module is further configured to find, among the trigger keyword sets of the health status assessment questions, the trigger keyword set matching the keyword, and to use the corresponding health status assessment question as said another health status assessment question.
10. The interactive health status assessment system according to claim 7, further comprising:
a detecting device communicatively connected to the control module, wherein the detecting device is configured to obtain image data or sound data;
wherein the control module is further connected to the data output interface, and is further configured to determine whether the image data contains a facial image of a user or whether the sound data contains a voice signal of the user, and, when the image data contains the facial image of the user or the sound data contains the voice signal of the user, to select the health status assessment question from the dialog design database and to present the health status assessment question by using the data output interface.
11. The interactive health status assessment system according to claim 8, further comprising:
a user database connected to the health status assessment module, wherein the user database is configured to search for a storage space corresponding to the user according to the facial image of the user or the voice signal of the user, and to store the assessment report in the storage space.
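Claims 2/8 and 3/9 describe two complementary strategies for choosing the next question: first try the response keyword sets attached to the current question's follow-up items, then fall back to searching every question's trigger keyword set. A minimal sketch of that selection logic, with all names hypothetical:

```python
# Hypothetical sketch of the two question-selection strategies in the claims:
# a response keyword set attached to a follow-up item (claims 2 and 8), with
# a fallback search over per-question trigger keyword sets (claims 3 and 9).

def select_next_question(keyword, follow_ups, trigger_sets):
    """follow_ups: list of (response_keyword_set, question_id) pairs for the
    current question; trigger_sets: question_id -> trigger keyword set for
    every question in the dialog design database."""
    # Claims 2/8: select the follow-up item whose response keyword set
    # the extracted keyword matches.
    for response_keywords, qid in follow_ups:
        if keyword in response_keywords:
            return qid
    # Claims 3/9: otherwise search every question's trigger keyword set.
    for qid, triggers in trigger_sets.items():
        if keyword in triggers:
            return qid
    return None  # zero matching questions: treated as an end-of-interaction condition
```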
PCT/US2018/063478 2018-11-30 2018-11-30 Method of an interactive health status assessment and system thereof WO2020112147A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201880098934.XA CN113287175B (en) 2018-11-30 2018-11-30 Interactive health state assessment method and system thereof
PCT/US2018/063478 WO2020112147A1 (en) 2018-11-30 2018-11-30 Method of an interactive health status assessment and system thereof
JP2021531393A JP7285589B2 (en) 2018-11-30 2018-11-30 INTERACTIVE HEALTH CONDITION EVALUATION METHOD AND SYSTEM THEREOF

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2018/063478 WO2020112147A1 (en) 2018-11-30 2018-11-30 Method of an interactive health status assessment and system thereof

Publications (1)

Publication Number Publication Date
WO2020112147A1 true WO2020112147A1 (en) 2020-06-04

Family

ID=70854376

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2018/063478 WO2020112147A1 (en) 2018-11-30 2018-11-30 Method of an interactive health status assessment and system thereof

Country Status (3)

Country Link
JP (1) JP7285589B2 (en)
CN (1) CN113287175B (en)
WO (1) WO2020112147A1 (en)

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IL131873A0 (en) * 1997-03-13 2001-03-19 First Opinion Corp Disease management system
JP2003271741A (en) 2002-03-15 2003-09-26 Toshiba Corp Medical treatment system and medical treatment method used for the medical treatment system
CN1482537A (en) * 2002-09-10 2004-03-17 明日工作室股份有限公司 Testing system and method having question difficulty changing function
JP2008234443A (en) 2007-03-22 2008-10-02 Matsushita Electric Ind Co Ltd Information processor
CN101877082A (en) * 2009-04-30 2010-11-03 郝大忠 High-performance question response recording system and usage method thereof
US9553727B2 (en) * 2010-01-21 2017-01-24 Omid Ebrahimi Kia Secure and mobile biometric authentication for electronic health record management
CN102207863A (en) * 2011-04-01 2011-10-05 奇智软件(北京)有限公司 Method and device for controlling progress bar to advance
TW201308222A (en) * 2011-08-05 2013-02-16 Han Tao Co Ltd On-line food evaluation system
JP2016224784A (en) 2015-06-02 2016-12-28 住友電気工業株式会社 Health care system, server device, terminal device, health care method, and health care program
CN105447573A (en) * 2015-11-25 2016-03-30 杨会志 Method and system for interactively completing solving process of mathematic question
JP6715048B2 (en) * 2016-03-23 2020-07-01 株式会社野村総合研究所 Goal achievement portfolio generation device, program and method
CN205788185U (en) * 2016-06-03 2016-12-07 王莉芳 A kind of mathematics questionnaire survey statistics equipment
WO2018061170A1 (en) 2016-09-30 2018-04-05 株式会社オプティム Interactive history-taking system, interactive history-taking method, and program
JP6817593B2 (en) 2016-11-25 2021-01-20 パナソニックIpマネジメント株式会社 Information processing methods, information processing devices and programs
CN106649760A (en) * 2016-12-27 2017-05-10 北京百度网讯科技有限公司 Question type search work searching method and question type search work searching device based on deep questions and answers
CN106875769A (en) * 2017-03-10 2017-06-20 杭州博世数据网络有限公司 A kind of mathematics practice question-setting system
JP6712961B2 (en) 2017-03-15 2020-06-24 日立グローバルライフソリューションズ株式会社 Communication system and communication control device
CN108682211A (en) * 2018-05-29 2018-10-19 黑龙江省经济管理干部学院 A kind of efficient teaching system for facilitating student to learn
US10978209B2 (en) * 2018-11-30 2021-04-13 National Cheng Kung University Method of an interactive health status assessment and system thereof

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140114680A1 (en) * 2009-11-12 2014-04-24 RedBrick Health Corporation Interactive health assessment
US20120191466A1 (en) * 2011-01-25 2012-07-26 Quint Saul Systems and Methods of Pharmaceutical Information Management

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112388643A (en) * 2020-09-24 2021-02-23 中山大学 Multifunctional health science popularization robot
CN114859800A (en) * 2021-02-04 2022-08-05 中建三局绿色产业投资有限公司 Control method and control system for dredging safe operation of drainage box culvert
CN114859800B (en) * 2021-02-04 2023-09-22 中建三局绿色产业投资有限公司 Control method and control system for dredging safety operation of drainage box culvert
CN112908481A (en) * 2021-03-18 2021-06-04 马尚斌 Automatic personal health assessment and management method and system
CN112908481B (en) * 2021-03-18 2024-04-16 马尚斌 Automatic personal health assessment and management method and system
CN114550860A (en) * 2022-01-28 2022-05-27 中国人民解放军总医院第一医学中心 Hospitalizing satisfaction evaluation method based on process data and intelligent network model
CN115840133A (en) * 2023-02-24 2023-03-24 湖南遥光科技有限公司 Circuit health grading evaluation method and electronic product health grading evaluation method
CN117634826A (en) * 2023-12-06 2024-03-01 青岛诚晟达精密机械有限公司 Intelligent building state monitoring management system based on cloud computing and multidimensional data
CN117634826B (en) * 2023-12-06 2024-05-31 中建正大科技有限公司 Intelligent building state monitoring management system based on cloud computing and multidimensional data
CN117556429A (en) * 2024-01-11 2024-02-13 北京警察学院 Safety protection capability evaluation method and system for public safety video image system
CN117556429B (en) * 2024-01-11 2024-04-19 北京警察学院 Safety protection capability evaluation method and system for public safety video image system

Also Published As

Publication number Publication date
JP7285589B2 (en) 2023-06-02
JP2022510350A (en) 2022-01-26
CN113287175B (en) 2024-03-19
CN113287175A (en) 2021-08-20

Similar Documents

Publication Publication Date Title
JP7285589B2 (en) INTERACTIVE HEALTH CONDITION EVALUATION METHOD AND SYSTEM THEREOF
US10978209B2 (en) Method of an interactive health status assessment and system thereof
CN103561652B (en) Method and system for assisting patients
US10515631B2 (en) System and method for assessing the cognitive style of a person
US9092554B2 (en) Alzheimers support system
US20140122109A1 (en) Clinical diagnosis objects interaction
JP2019527864A (en) Virtual health assistant to promote a safe and independent life
US20170344713A1 (en) Device, system and method for assessing information needs of a person
CN107910073A (en) A kind of emergency treatment previewing triage method and device
JP2018027613A (en) Customer service device, customer service method and customer service system
KR102466438B1 (en) Cognitive function assessment system and method of assessing cognitive funtion
JP2008242534A (en) Healing system, server device, information processor and program
JP6030659B2 (en) Mental health care support device, system, method and program
US20230419030A1 (en) System and Method for Automated Patient Interaction
Bozan et al. Revisiting the technology challenges and proposing enhancements in ambient assisted living for the elderly
US20230018077A1 (en) Medical information processing system, medical information processing method, and storage medium
TWI659429B (en) System and method of interactive health assessment
WO2023015287A1 (en) Systems and methods for automated medical data capture and caregiver guidance
US20240108260A1 (en) System For Managing Treatment Of A Patient By Measuring Emotional Response
Manivannan et al. Health monitoring system for diabetic patients
Vinutha MAMA BOT: A system based on ML, NLP and IOT for supporting women and families during pregnancy
KR102441662B1 (en) Care platform service system using metaverse and method for controlling the same
EP4362033A1 (en) Patient consent
US20230316812A1 (en) Sign language sentiment analysis
JP2022127234A (en) Information processing method, information processing system, and program

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2021531393

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18941865

Country of ref document: EP

Kind code of ref document: A1