CN114189509A - Information processing method, information processing apparatus, and recording medium - Google Patents

Information processing method, information processing apparatus, and recording medium

Info

Publication number
CN114189509A
Authority
CN
China
Prior art keywords
care, movement, receiver, moving image, action
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110931288.4A
Other languages
Chinese (zh)
Inventor
江崎日淑
高柳智美
今泉光人
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Axa Weize Co ltd
Original Assignee
Axa Weize Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Axa Weize Co ltd filed Critical Axa Weize Co ltd
Publication of CN114189509A

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/06 Protocols specially adapted for file transfer, e.g. file transfer protocol [FTP]
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/30 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment

Landscapes

  • Engineering & Computer Science (AREA)
  • Public Health (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Signal Processing (AREA)
  • Biomedical Technology (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Pathology (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

The invention provides an information processing method, an information processing apparatus, and a recording medium that can be expected to facilitate determination of the implementation level of living actions of a care-receiver. In the information processing method, an information processing apparatus acquires a moving image obtained by imaging a movement motion of the care-receiver, evaluates the mobility of the care-receiver based on the acquired moving image, and determines the implementation level of a living action of the care-receiver based on the result of evaluating the mobility. The moving image includes a horizontal moving image and a vertical moving image obtained by imaging the care-receiver's horizontal and vertical movement motions; the information processing apparatus evaluates the care-receiver's horizontal mobility based on the horizontal moving image, evaluates the vertical mobility based on the vertical moving image, and determines the implementation level of the care-receiver's living action based on the combination of the two evaluation results.

Description

Information processing method, information processing apparatus, and recording medium
Technical Field
The present invention relates to an information processing method, an information processing apparatus, and a recording medium for determining a living action performance level of a care-receiver.
Background
In recent years, as the number of care-receivers such as elderly people has increased, so has the demand for care services. Care services are provided either in the care-receiver's home or in a care facility. In either case, accurately grasping the state of the care-receiver is important for providing a high-quality care service.
Patent document 1 proposes a care plan preparation system that assists the preparation of a care plan by simplifying the flow of evaluating the state of a user in a facility for caring for elderly people and the like, and an evaluation table used in the system. In this care plan preparation system, an evaluation report for care plan preparation is generated based on the results of evaluating the state of the user using an ADL (Activities of Daily Living) checklist and a mental checklist.
Patent document 1: japanese laid-open patent publication No. 11-161725
ADL is widely used as an index for evaluating a care-receiver's ability to perform activities in daily life. With ADL, whether a care-receiver can perform various types of daily living activities, how much assistance is required, and so on are evaluated on a plurality of levels, and depending on the difficulty of the activities there are two types: BADL (Basic Activities of Daily Living) and IADL (Instrumental Activities of Daily Living). BADL covers basic daily living activities such as getting up, transferring, moving, eating, changing clothes, excreting, bathing, and grooming. IADL covers more advanced, instrumental daily living activities such as cleaning, cooking, washing, shopping, use of transportation, handling telephone calls, schedule management, medication management, money management, and hobbies. However, accurately evaluating a care-receiver using these indexes is not easy, and an expert is required to perform the evaluation.
Disclosure of Invention
The present invention has been made in view of the above circumstances, and an object thereof is to provide an information processing method, an information processing apparatus, and a recording medium that can be expected to facilitate determination of the implementation level of living actions of a care-receiver.
In an information processing method according to an embodiment, an information processing device acquires a moving image obtained by imaging a movement of a care-receiver, evaluates the mobility of the care-receiver based on the acquired moving image, and determines an implementation level of a living action of the care-receiver based on a result of evaluation of the mobility.
According to one embodiment, determination of the implementation level of living actions of a care-receiver can be expected to become easier.
Drawings
Fig. 1 is a schematic diagram for explaining an overview of an information processing system according to the present embodiment.
Fig. 2 is a block diagram showing a configuration of a server device according to the present embodiment.
Fig. 3 is a schematic diagram showing an example of the determination table.
Fig. 4 is a schematic diagram showing an example of the determination table.
Fig. 5 is a block diagram showing the configuration of the terminal device according to the present embodiment.
Fig. 6 is a schematic diagram showing an example of display of a determination result display screen regarding the implementation level of a life action displayed by the terminal device.
Fig. 7 is a schematic diagram showing an example of display of a determination result display screen regarding the implementation level of a life action displayed by the terminal device.
Fig. 8 is a schematic diagram showing an example of display of a determination result display screen regarding the implementation level of a life action displayed by the terminal device.
Fig. 9 is a schematic diagram showing an example of display of a determination result display screen regarding the implementation level of a life action displayed by the terminal device.
Fig. 10 is a flowchart showing a procedure of processing performed by the terminal device according to the present embodiment.
Fig. 11 is a flowchart showing a procedure of processing performed by the server device according to the present embodiment.
Fig. 12 is a schematic diagram for explaining the extraction of a life motion by the server device according to embodiment 2.
Fig. 13 is a schematic diagram showing an example of an inquiry screen displayed by the terminal device according to embodiment 2.
Fig. 14 is a schematic diagram showing an example of a study screen displayed by the terminal device according to embodiment 2.
Fig. 15 is a schematic diagram showing an example of an analysis screen displayed by the terminal device according to embodiment 2.
Fig. 16 is a flowchart showing a procedure of processing performed by the server device according to embodiment 2.
Description of reference numerals: 1 … server device; 3 … terminal device; 11 … processing unit; 11a … information acquisition unit; 11b … operation evaluation unit; 11c … level determination unit; 11d … determination result transmitting unit; 11e … correction receiving unit; 12 … storage unit; 12a … server program; 12b … determination table; 13 … communication unit; 31 … processing unit; 31a … imaging processing unit; 31b … moving image transmitting unit; 31c … determination result receiving unit; 31d … display processing unit; 31e … correction receiving unit; 32 … storage unit; 32a … program; 33 … communication unit; 34 … display unit; 35 … operation unit; 36 … camera; 98, 99 … recording medium.
Detailed Description
A specific example of an information processing system according to an embodiment of the present invention will be described below with reference to the drawings. The present invention is not limited to these examples, and is defined by the claims, and includes all modifications within the meaning and range equivalent to the claims.
< overview of the System >
Fig. 1 is a schematic diagram for explaining an overview of the information processing system according to the present embodiment. The information processing system according to the present embodiment is a system in which the server device 1 provides a service for determining the implementation level of living actions of a care-receiver such as an elderly person. The information processing system according to the present embodiment is assumed to be used mainly by users such as care staff who care for a care-receiver, care managers, or persons caring for a care-receiver at home, but may also be used by the care-receiver himself or herself.
A user of this service photographs the care-receiver using a camera mounted on a terminal device 3 such as a smartphone or tablet terminal, and transmits the moving image data obtained by the photographing to the server device 1. In the present embodiment, the user captures both the horizontal movement motion and the vertical movement motion of the care-receiver with the terminal device 3 and transmits the moving image data to the server device 1, but the present invention is not limited thereto. The user may photograph only the care-receiver's horizontal movement motion, only the vertical movement motion, or movement motions other than these.
The horizontal movement motion captured by the user is, for example, the posture of the care-receiver walking (walking) or running (running). The user uses the terminal device 3 to photograph, for example, the care-receiver walking 5 meters, from the front or from the side (lateral direction) of the care-receiver. The vertical movement motion is a standing-up action or a getting-up action of the care-receiver. The user uses the terminal device 3 to photograph, for example, from the side (lateral direction) of the care-receiver, the care-receiver repeating a plurality of (for example, 5) times the action of standing up from a seated position on a chair. The user may instead photograph the care-receiver repeating the standing-up action for a predetermined time (for example, 30 seconds), or may photograph the standing-up action from the front of the care-receiver.
The user who has captured the horizontal and vertical movement motions of the care-receiver transmits the moving image data obtained by the capturing from the terminal device 3 to the server device 1. In the present embodiment, the terminal device 3 has a communication function, and the moving image data is transmitted from the terminal device 3 to the server device 1 by communication. Alternatively, the moving image data captured by the terminal device 3 may be recorded on a recording medium and the recording medium mailed to the company or the like that manages and operates the server device 1; in that case, the terminal device 3 may be a camera or the like without a communication function.
The server device 1, having acquired the moving image data of the care-receiver's horizontal and vertical movement motions captured by the user, evaluates the care-receiver's horizontal mobility and vertical mobility, each on a plurality of levels (for example, 5 levels), based on the moving image of each motion. The server device 1 determines the implementation level of the care-receiver's living actions based on the two evaluation results for the horizontal and vertical directions. At this time, the server device 1 determines the implementation level for each of a plurality of types of preset living actions. In the present embodiment, the living actions determined by the server device 1 may include living actions defined as activities of daily living (ADL) and living actions other than these. The activities of daily living include basic activities of daily living (BADL) and instrumental activities of daily living (IADL).
Basic activities of daily living, also referred to simply as activities of daily living (ADL), can include, for example, actions such as getting up, transferring, moving, eating, changing clothes, excreting, bathing, and grooming. Instrumental activities of daily living are more complex than the basic ones and can include, for example, actions such as cleaning, cooking, washing, shopping, use of transportation, communication such as handling telephone calls, schedule management, medication management, money management, and hobbies. For example, the washing action may include taking clothes out of the washing machine, carrying the laundry to a drying pole, drying the laundry, collecting the laundry, folding the laundry, and the like. The cleaning action may include, for example, cleaning the toilet, cleaning the bathtub, cleaning the floor, operating a vacuum cleaner, and disposing of garbage. The living actions described above are examples; the living actions whose implementation level the server device 1 determines are not limited to these, and various other actions may be used.
The implementation level of a living action determined by the server device 1 is a numerical expression of the degree to which the care-receiver can perform that living action. In the present embodiment, the implementation level is determined on 3 levels, for example levels 1 to 3, with a larger numerical value indicating that the care-receiver can better perform the living action. For example, implementation level 3 indicates that the care-receiver can easily perform the living action, implementation level 2 indicates that the care-receiver can perform the living action although not easily, and implementation level 1 indicates that the care-receiver cannot perform the living action. The server device 1 determines the implementation level for each of a plurality of preset living actions and transmits the determination results to the user's terminal device 3. The implementation level need not be a numerical value, and may be represented, for example, by letters such as A, B, C, or by symbols such as ∘, Δ, and ×.
The terminal device 3 that receives the determination results from the server device 1 displays the implementation level of the care-receiver's living actions and thereby notifies the user. At this time, the terminal device 3 may display, for example, a message corresponding to the determination result, such as a message urging caution for a living action determined to be implementation level 1. In the present embodiment, when the user judges that a determination result of the server device 1 is not appropriate for the care-receiver, the user can feed this back to the server device 1. The terminal device 3 accepts from the user a correction of the determination result of the care-receiver's living action implementation level; for example, the user inputs, to the terminal device 3, the implementation level deemed appropriate for the living action of the care-receiver. The terminal device 3 transmits information such as the implementation level corrected by the user to the server device 1. The server device 1 receives the correction information for the determination result from the terminal device 3, and stores and accumulates it. The accumulated information can be used, for example, to improve the criteria for determining the implementation level or the criteria for evaluating mobility.
< device Structure >
Fig. 2 is a block diagram showing the configuration of the server device 1 according to the present embodiment. The server device 1 of the present embodiment includes a processing unit 11, a storage unit (storage) 12, a communication unit (transceiver) 13, and the like. In the present embodiment, the case where processing is performed by one server apparatus 1 has been described, but processing may be performed by a plurality of server apparatuses 1 in a distributed manner.
The processing unit 11 is configured using an arithmetic processing device such as a CPU (Central Processing Unit), an MPU (Micro-Processing Unit), or a GPU (Graphics Processing Unit), together with a ROM (Read Only Memory), a RAM (Random Access Memory), and the like. The processing unit 11 reads and executes the server program 12a stored in the storage unit 12, thereby performing various processes such as evaluating mobility based on a moving image obtained by imaging a care-receiver and determining the implementation level of the care-receiver's living actions based on the evaluation result.
The storage unit 12 is configured by using a large-capacity storage device such as a hard disk. The storage unit 12 stores various programs executed by the processing unit 11 and various data necessary for processing by the processing unit 11. In the present embodiment, the storage unit 12 stores a server program 12a executed by the processing unit 11, and also stores a determination table 12b for determining the implementation level of the living action of the care-receiver.
In the present embodiment, the server program 12a is provided recorded on a recording medium 99 such as a memory card or an optical disc, and the server device 1 reads the server program 12a from the recording medium 99 and stores it in the storage unit 12. However, the server program 12a may instead be written into the storage unit 12 at the manufacturing stage of the server device 1, for example. Alternatively, the server program 12a may be one distributed by a remote server device or the like and acquired by the server device 1 through communication, or one read from the recording medium 99 by a writing device and written into the storage unit 12 of the server device 1. In this way, the server program 12a may be provided by distribution via a network, or provided recorded on the recording medium 99. Details of the determination table 12b will be described later.
The communication unit 13 communicates with various devices via a network N including an intra-company LAN, a wireless LAN, the internet, and the like. In the present embodiment, the communication unit 13 communicates with one or more terminal apparatuses 3 via the network N. The communication unit 13 transmits the data supplied from the processing unit 11 to another device, and supplies the data received from the other device to the processing unit 11.
The storage unit 12 may be an external storage device connected to the server device 1. The server apparatus 1 may be a multi-computer including a plurality of computers, or may be a virtual machine virtually constructed by software. The server device 1 is not limited to the above configuration, and may include, for example, a reading unit that reads information stored in a portable storage medium, an input unit that accepts operation input, a display unit that displays an image, and the like.
In the processing unit 11 of the server device 1 according to the present embodiment, the information acquisition unit 11a, the operation evaluation unit 11b, the level determination unit 11c, the determination result transmission unit 11d, the correction reception unit 11e, and the like are realized as software functional units by the processing unit 11 reading and executing the server program 12a stored in the storage unit 12. These functional units are related to processing for determining the level of execution of the living action of the care-receiver, and the other functional units are not shown or described.
The information acquiring unit 11a performs the following processing: data of moving images obtained by imaging the horizontal movement and the vertical movement of the care-receiver are acquired from the terminal device 3 used by the user. The information acquiring unit 11a stores the acquired moving image data in the storage unit 12.
The operation evaluation unit 11b evaluates the care-receiver's mobility for the horizontal movement motion and the vertical movement motion, respectively, based on the moving image data acquired from the terminal device 3 by the information acquisition unit 11a. In the present embodiment, a plurality of evaluation items are determined in advance for the horizontal movement motion, and the operation evaluation unit 11b evaluates each item and calculates a comprehensive evaluation value of horizontal mobility from the per-item evaluation values. For example, when 4 evaluation items are provided for the horizontal movement motion, the operation evaluation unit 11b evaluates each item on 5 levels and uses the average of the evaluation values (1 to 5) calculated for the items as the comprehensive evaluation value of horizontal mobility. Likewise, a plurality of evaluation items are determined for the vertical movement motion, and the operation evaluation unit 11b evaluates each item and calculates a comprehensive evaluation value of vertical mobility from the per-item evaluation values. In the present embodiment, the evaluation values calculated by the operation evaluation unit 11b are the 5 levels 1 to 5, with a larger numerical value indicating a higher evaluation for the item, that is, a higher mobility of the care-receiver.
The evaluation items for the horizontal movement motion may include, for example, "speed", "sway", "left-right difference", and "rhythm". For example, when the horizontal movement motion is a 5-meter walk, "speed" can be the time the care-receiver requires to walk 5 meters. The faster the speed, that is, the shorter the time required to walk 5 meters, the higher the care-receiver's horizontal mobility can be evaluated. The operation evaluation unit 11b can calculate the time required for the care-receiver to walk 5 meters by detecting, from the moving image, the points at which walking starts and ends and computing the time between them. The operation evaluation unit 11b compares the calculated time with predetermined threshold values, thereby calculating an evaluation value of 1 to 5 for the "speed" evaluation item.
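For illustration, the threshold comparison for the "speed" item can be sketched in Python as follows. The function name and the concrete cut-off values are hypothetical; the patent states only that the measured time is compared against predetermined thresholds to yield an evaluation value of 1 to 5.

```python
# Minimal sketch of the "speed" item, assuming hypothetical thresholds;
# the patent does not disclose concrete cut-off values.

def score_walk_time(seconds: float) -> int:
    """Map the time needed to walk 5 meters to an evaluation value of
    1 to 5 (a shorter time, i.e. a faster walk, scores higher)."""
    thresholds = [4.0, 6.0, 8.0, 10.0]  # hypothetical cut-offs in seconds
    for score, limit in zip((5, 4, 3, 2), thresholds):
        if seconds <= limit:
            return score
    return 1

print(score_walk_time(5.2))  # -> 4
```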
The "sway" is a lateral sway width associated with the walking of the care recipient, and the smaller the sway width is, the higher the mobility of the care recipient in the horizontal direction can be evaluated. The motion evaluation unit 11b extracts skeletal information of the care recipient from a moving image obtained by capturing a horizontal movement motion, and calculates a lateral swing width of a predetermined part (waist, spine, neck, head, or the like) of the skeleton. The process of extracting the bone information from the image captured by the camera can be performed by using artificial intelligence that has been previously subjected to deep learning or the like, for example. Since extraction of skeletal information using artificial intelligence is a conventional technique, detailed description of the process is omitted. The process of extracting the bone information may be performed by the server apparatus 1 or by another apparatus. The operation evaluation unit 11b calculates the right-left swing width of the care-receiver, compares the calculated swing width value with a threshold value, and calculates evaluation values of 1 to 5 for the evaluation items of "swing".
The "left-right difference" is a ratio of time supported by the left and right feet, and the movement ability can be evaluated to be higher for the horizontal movement of the care recipient as both the left and right are closer to 50%. The motion evaluation unit 11b extracts the skeletal information of the care recipient from a moving image obtained by capturing the horizontal movement motion, and calculates the difference in the horizontal direction of the motion of a predetermined part (shoulder, foot, or the like) of the skeleton. For example, the motion evaluation unit 11b calculates the contact time between the left foot and the right foot of the care recipient on the ground, calculates the ratio of the contact time between the left foot and the right foot of the person on the entire walking, and calculates the difference between the left ratio and the right ratio. The operation evaluation unit 11b compares the calculated left-right difference value with a threshold value, thereby calculating evaluation values of 1 to 5 for the evaluation items of the "left-right difference".
The "rhythm" is an item for evaluating the smoothness of walking of the care-receiver. The motion evaluation unit 11b extracts the bone information of the care recipient from a moving image obtained by capturing the horizontal movement motion, and calculates the timing of landing the left and right feet based on the motion of a predetermined part of the bone (such as the heel). The operation evaluation unit 11b calculates the time for each step of walking of the care recipient based on the right and left landing timings, and calculates the time variation (variance value) for each step of walking of multiple steps. The less the time variation per step in walking, the smoother the walking of the care recipient becomes, and the higher the mobility of the care recipient in the horizontal direction can be evaluated. The operation evaluation unit 11b compares the calculated deviation (variance value) with a threshold value, and calculates evaluation values of 1 to 5 for evaluation items of the "tempo".
The evaluation items for the vertical movement motion may include, for example, "required time", "posture", "center-of-gravity movement amount", and "muscle strength". For example, when the vertical movement motion is 5 repetitions of the standing-up action, the "required time" may be the time the care-receiver needs to complete the 5 standing-up actions. The shorter the required time, the higher the care-receiver's vertical mobility can be evaluated. The operation evaluation unit 11b can detect the start and end times of the care-receiver's standing-up actions from the moving image and calculate the time from the start of the first standing-up action to the end of the fifth. The operation evaluation unit 11b compares the calculated required time with a predetermined threshold value, thereby calculating an evaluation value of 1 to 5 for the "required time" evaluation item.
The "posture" is an item for evaluating the degree of well-being of the posture of the care-receiver during the standing up operation. The motion evaluation unit 11b extracts skeletal information of the person to be cared from a moving image obtained by horizontally capturing a vertical movement motion, for example, calculates an angle of the back with respect to the ground and an angle of the lower limbs with respect to the ground, and calculates a difference between the two angles. The smaller the difference between the angle of the back and the angle of the lower limbs, that is, the closer the back and the lower limbs are parallel to each other, the better the posture of the care-receiver during the standing up operation is, and the higher the mobility of the care-receiver in the vertical direction can be evaluated as the mobility. The operation evaluation unit 11b compares the calculated angle difference with a threshold value, thereby calculating evaluation values of 1 to 5 for the evaluation items of the "posture".
The "center of gravity shift amount" is an item for evaluating how much the center of gravity has shifted in the horizontal direction during the standing up operation of the care-receiver, and the shift amount is larger, the higher the shifting ability can be evaluated for the shifting operation of the care-receiver in the vertical direction. The motion evaluation unit 11b extracts skeletal information of the care recipient from a moving image obtained by horizontally capturing a vertical movement motion, and calculates a horizontal movement distance of the knee or a horizontal movement distance of the waist of the care recipient. The operation evaluation unit 11b compares the calculated movement distance with a threshold value, and calculates evaluation values of 1 to 5 for the evaluation items of the "amount of center of gravity movement".
The "muscle strength" is an item for evaluating the muscle strength related to the vertical movement of the care-receiver. The motion evaluation unit 11b extracts skeletal information of the care recipient from a moving image obtained by horizontally capturing a vertical movement motion, and calculates a movement speed of the waist coordinate when the care recipient sits on the seat. The smaller the movement speed of the waist when the care recipient sits on the seat, the higher the muscle strength of the care recipient, and the higher the movement ability of the care recipient in the vertical movement operation can be evaluated. The operation evaluation unit 11b compares the calculated moving speed with a threshold value, thereby calculating evaluation values of 1 to 5 for the evaluation items of the "muscle strength".
Further, when the vertical movement motion is captured as, for example, the care-receiver repeatedly standing up from a chair and sitting down within 30 seconds, the operation evaluation unit 11b may calculate, for example, the number of times the care-receiver stands up within those 30 seconds. The larger the number of stand-ups from the chair within 30 seconds, the higher the care-receiver's vertical mobility can be evaluated. The operation evaluation unit 11b compares the calculated number with a threshold value, thereby calculating an evaluation value of 1 to 5 for the stand-up count evaluation item.
In the present embodiment, the operation evaluation unit 11b calculates evaluation values of 1 to 5 for each of a plurality of evaluation items related to the horizontal movement operation, calculates an average value of the plurality of calculated evaluation values, and uses the average value as a comprehensive evaluation value of the horizontal movement capability. Similarly, the operation evaluation unit 11b calculates evaluation values of 1 to 5 for each of a plurality of evaluation items related to the movement operation in the vertical direction, calculates an average value of the plurality of calculated evaluation values, and uses the average value as a comprehensive evaluation value of the movement capability in the vertical direction.
In the present embodiment, the server device 1 calculates the average value of the plurality of evaluation values as the comprehensive evaluation value, but the present invention is not limited thereto, and a value other than the average value may be used as the comprehensive evaluation value. The server device 1 can use, for example, a statistical value such as a maximum value, a minimum value, or a mode of the plurality of evaluation values as a comprehensive evaluation value. In addition, when calculating the average value of a plurality of evaluation values, the server device 1 may calculate the average value by weighting each evaluation value (weighted average value). The server apparatus 1 may calculate a comprehensive evaluation value by substituting a plurality of values calculated for each evaluation item into a predetermined arithmetic expression, instead of calculating a separate evaluation value.
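The aggregation step described above can be sketched as follows; the plain average is the embodiment's method, while the weights shown are hypothetical values illustrating the weighted-average variant also mentioned.

```python
# Sketch of combining per-item evaluation values (1-5) into a
# comprehensive evaluation value. With no weights this is the plain
# average used in the embodiment; the weights are hypothetical.

def comprehensive_value(item_scores: dict[str, int],
                        weights: dict[str, float] | None = None) -> float:
    if weights is None:
        return sum(item_scores.values()) / len(item_scores)
    total_w = sum(weights[k] for k in item_scores)
    return sum(v * weights[k] for k, v in item_scores.items()) / total_w

scores = {"speed": 4, "sway": 3, "left_right": 5, "rhythm": 2}
print(comprehensive_value(scores))  # plain mean -> 3.5
print(comprehensive_value(scores, {"speed": 2.0, "sway": 1.0,
                                   "left_right": 1.0, "rhythm": 1.0}))  # -> 3.6
```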
The server device 1 may calculate the care-receiver's evaluation values (the individual evaluation value for each item or the comprehensive evaluation value) using a learning model trained by machine learning such as deep learning. For example, the learning model may be a model machine-learned in advance to accept the plurality of evaluation values for the plurality of evaluation items as input and output a comprehensive evaluation value corresponding to them. Alternatively, the learning model may be a model machine-learned in advance to accept the plurality of raw values calculated for the evaluation items as input and output a corresponding comprehensive evaluation value. As a further example, the learning model may be a model machine-learned in advance to accept, as input, moving image data of a horizontal or vertical movement motion and output, for the care-receiver captured in that data, the individual evaluation values for the plurality of evaluation items or the comprehensive evaluation value. These learning models are examples, and the model is not limited thereto.
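As a purely illustrative stand-in for the first of these variants (item scores in, comprehensive value out), a minimal regression model could look like the sketch below; the patent names no library, architecture, or training data, so everything here is assumed.

```python
# Hypothetical sketch of a learning model that maps the per-item
# evaluation values to a comprehensive evaluation value. The library
# choice (scikit-learn), model type, and training data are all
# assumptions; the patent specifies only the input/output relationship.
from sklearn.linear_model import LinearRegression

# Rows of per-item scores and the comprehensive value an expert
# assigned to each row (made-up numbers for illustration).
X = [[4, 3, 5, 2], [1, 2, 1, 2], [5, 5, 4, 5]]
y = [3.5, 1.5, 4.8]

model = LinearRegression().fit(X, y)
print(model.predict([[3, 3, 3, 3]]))
```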
In the present embodiment, the server device 1 calculates a comprehensive evaluation value from evaluation values for a plurality of evaluation items and performs subsequent processing, but the present invention is not limited to this. The server apparatus 1 may perform subsequent processing using a single evaluation value without calculating a comprehensive evaluation value.
The level determination unit 11c performs processing to determine the implementation level of the care-receiver's living actions based on the evaluation result of the operation evaluation unit 11b and the determination table 12b stored in the storage unit 12. Fig. 3 and 4 are schematic diagrams showing examples of the determination table 12b. In the present embodiment, one determination table 12b as illustrated is created per living action, and a plurality of determination tables 12b for a plurality of living actions are stored in the storage unit 12 in advance. In this example, fig. 3 shows the determination table 12b for the living action of folding the laundry, and fig. 4 shows the determination table 12b for the living action of drying the laundry.
The determination table 12b stores the 5 × 5 = 25 implementation levels corresponding to the combinations of the 5-level evaluation value of horizontal mobility and the 5-level evaluation value of vertical mobility. In the present embodiment, the implementation level of a living action is determined on the 3 levels 1 to 3 (written Lv1 to Lv3 in fig. 3 and 4). Implementation level 3 indicates that the care-receiver can easily perform the living action in question, implementation level 2 indicates that the care-receiver can perform it although not easily, and implementation level 1 indicates that the care-receiver cannot perform it.
The level determination unit 11c acquires, from the evaluation result of the operation evaluation unit 11b, the evaluation value of horizontal mobility and the evaluation value of vertical mobility. The level determination unit 11c refers to the determination table 12b stored in the storage unit 12 and acquires the implementation level corresponding to the combination of the two acquired evaluation values. For example, when the care-receiver's horizontal mobility evaluation value is "2" and the vertical mobility evaluation value is "3", the level determination unit 11c determines from the determination table 12b of fig. 3 that the care-receiver's implementation level for folding the laundry is level 3, and from the determination table 12b of fig. 4 that the implementation level for drying the laundry is level 2.
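The table lookup itself amounts to indexing a per-action mapping with the pair of evaluation values, as in the sketch below. All entries are hypothetical placeholders except the (2, 3) combination, set to mirror the folding-the-laundry example above; the actual tables are those of figs. 3 and 4.

```python
# Sketch of the determination-table lookup: one table per living
# action, keyed by the (horizontal, vertical) pair of 5-level
# evaluation values and yielding an implementation level of 1 to 3.

FOLD_LAUNDRY = {(h, v): 1 for h in range(1, 6) for v in range(1, 6)}
FOLD_LAUNDRY[(2, 3)] = 3  # per the example: horizontal 2, vertical 3 -> Lv3

def implementation_level(table: dict, horizontal: int, vertical: int) -> int:
    return table[(horizontal, vertical)]

print(implementation_level(FOLD_LAUNDRY, 2, 3))  # -> 3
```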
In the present embodiment, the server device 1 creates the determination table 12b for each of a plurality of types of living actions and stores the determination table in the storage unit 12. The determination table 12b is a table prepared in advance by an expert, a manager of the system, a provider of the service, or the like, based on, for example, actual results of past care, at least at the time of starting the provision of the service. Then, the determination table 12b can be updated in accordance with the operation of the present service. The determination table 12b shown in fig. 3 and 4 is an example, and the server apparatus 1 stores the determination table 12b in the storage unit 12 for various living actions, and determines the implementation level for various living actions.
In the present embodiment, the server device 1 stores determination tables 12b for basic activities of daily living (BADL) such as getting up, transferring, moving, eating, changing clothes, excreting, bathing, and grooming, and for instrumental activities of daily living (IADL) such as cleaning, cooking, washing, shopping, use of transportation, communication such as handling telephone calls, schedule management, medication management, money management, and hobbies. Further, the washing action includes, for example, taking clothes out of the washing machine, carrying the laundry to a drying pole, drying the laundry, collecting the laundry, and folding the laundry, and a determination table 12b is stored in the storage unit 12 for each of these actions. Likewise, the cleaning action includes, for example, cleaning the toilet, cleaning the bathtub, cleaning the floor, operating a vacuum cleaner, and disposing of garbage, and a determination table 12b is stored in the storage unit 12 for each of these actions. The living actions described above are examples, and are not limiting.
The determination result transmitting unit 11d performs the following processing: the determination result of the implementation level of each living action determined by the level determination unit 11c is transmitted to the terminal device 3 of the user. The determination result transmitting unit 11d transmits the determination result of the implementation level to the terminal apparatus 3 that is the transmission source of the moving image data used for the determination. The determination result transmitting unit 11d may transmit a plurality of determination results related to a plurality of life movements to the terminal device 3 in a lump, for example, or may transmit a determination result corresponding to a request every time a request from the terminal device 3 is received, for example. The determination result transmitting unit 11d may transmit image data of a screen on which the determination result is displayed, together with the determination result of the implementation level of the living action of the care-receiver, to the terminal device 3.
The correction receiving unit 11e performs the following processing: the feedback for correcting the determination result transmitted from the determination result transmitting unit 11d to the terminal device 3 is received from the terminal device 3. In the present embodiment, the terminal device 3 displays the execution level of the living action of the care recipient determined by the server device 1, accepts an operation for correcting the determination result from the user, and transmits the accepted correction content to the server device 1. The correction reception unit 11e receives the correction content transmitted from the terminal device 3, and stores and accumulates the received correction content in the storage unit 12.
Fig. 5 is a block diagram showing the configuration of the terminal device 3 according to the present embodiment. The terminal device 3 of the present embodiment includes a processing unit 31, a storage unit (storage) 32, a communication unit (transceiver) 33, a display unit (display) 34, an operation unit 35, a camera 36, and the like. The terminal device 3 is used by a nursing staff member who nurses a care recipient, a nursing manager, a nursing staff at home, or the like, and can be configured using an information processing device such as a smartphone, a tablet terminal device, or a personal computer, for example.
The processing unit 31 is configured using an arithmetic processing device such as a CPU or MPU, a ROM, or the like. The processing unit 31 performs various processes such as a process of capturing an image of the movement of the care-receiver, a process of transmitting the data of the moving image obtained by the capturing to the server device 1, and a process of receiving and displaying the result of the determination of the living action performance level of the care-receiver transmitted from the server device 1 by reading and executing the program 32a stored in the storage unit 32.
The storage unit 32 is configured using a nonvolatile memory element such as a flash memory. The storage unit 32 stores various programs executed by the processing unit 31 and various data necessary for the processing of the processing unit 31. In the present embodiment, the storage unit 32 stores the program 32a executed by the processing unit 31. In the present embodiment, the program 32a is distributed from a remote server device or the like, and the terminal device 3 acquires it by communication and stores it in the storage unit 32. However, the program 32a may instead be written into the storage unit 32 at the manufacturing stage of the terminal device 3, for example. Alternatively, the program 32a may be recorded on a recording medium 98 such as a memory card or an optical disc and read by the terminal device 3 into the storage unit 32, or read from the recording medium 98 by a writing device and written into the storage unit 32 of the terminal device 3. In this way, the program 32a may be provided by distribution via a network, or provided recorded on the recording medium 98.
The communication unit 33 communicates with various devices via a network N including a mobile phone communication network, the internet, and the like. In the present embodiment, the communication unit 33 communicates with the server apparatus 1 via the network N. The communication unit 33 transmits the data supplied from the processing unit 31 to another device, and supplies the data received from another device to the processing unit 31.
The display unit 34 is configured using a liquid crystal display or the like, and displays various images, characters, and the like based on the processing of the processing unit 31.
The operation unit 35 receives an operation by a user and notifies the received operation to the processing unit 31. For example, the operation unit 35 receives a user operation through an input device such as a mechanical button or a touch panel provided on the surface of the display unit 34. The operation unit 35 may be an input device such as a mouse or a keyboard, for example, or may be configured to be detachable from the terminal apparatus 3.
The camera 36 is configured using an imaging element such as a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor. The camera 36 is disposed on the back of the housing of the terminal device 3. The camera 36 supplies the image captured by the imaging element to the processing unit 31.
In the terminal device 3 of the present embodiment, the processing unit 31 reads out and executes the program 32a stored in the storage unit 32, thereby realizing the image pickup processing unit 31a, the moving image transmitting unit 31b, the determination result receiving unit 31c, the display processing unit 31d, the correction reception unit 31e, and the like as software functional units. The program 32a may be a program dedicated to the information processing system of the present embodiment, or a general-purpose program such as a web browser.
The image pickup processing unit 31a performs processing related to the capture of moving images with the camera 36. The image pickup processing unit 31a controls, for example, the shutter speed, frame rate, ISO sensitivity, exposure, and aperture value of the camera 36. The image pickup processing unit 31a acquires the moving image data captured by the camera 36 and stores it in the storage unit 32 in an appropriate file format, for example MP4 or AVI (Audio Video Interleave). In the present embodiment, both moving image data of the care-receiver's horizontal movement motion and moving image data of the vertical movement motion need to be acquired. The image pickup processing unit 31a therefore displays a message such as "capture the horizontal movement motion" or "capture the vertical movement motion", and attaches to the captured moving image data information distinguishing whether it relates to the horizontal or the vertical direction before storing it in the storage unit 32. Besides displaying a message, the image pickup processing unit 31a may inform the user of the direction to be captured by various methods, such as superimposing on the capture screen a "walking" figure or shape corresponding to the horizontal movement motion, or a "standing" figure or shape corresponding to the vertical movement motion.
The moving image transmitting unit 31b performs processing to transmit to the server device 1 the moving images of the care-receiver captured by the user with the camera 36. The moving image transmitting unit 31b reads from the storage unit 32 the moving image data captured by the camera 36 and stored by the image pickup processing unit 31a, and transmits it to the server device 1 using the communication unit 33. In the present embodiment, the moving image transmitting unit 31b attaches, to the moving image data of the care-receiver's horizontal movement motion and the moving image data of the vertical movement motion, information distinguishing which direction each relates to, and transmits them to the server device 1. The moving image transmitting unit 31b also attaches identification information such as the name or ID of the care-receiver and identification information such as the name or ID of the user who performed the capture, and transmits these to the server device 1 together with the moving image data.
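As a rough illustration of this upload, the sketch below posts the video file together with the direction label and the two IDs. The endpoint URL and field names are inventions for the example; the patent specifies what information accompanies the data, not any wire format.

```python
# Hypothetical sketch of the transmission performed by the moving image
# transmitting unit 31b. The endpoint and field names are made up; the
# third-party requests library is used for brevity.
import requests

def send_moving_image(path: str, direction: str,
                      care_receiver_id: str, user_id: str) -> None:
    with open(path, "rb") as f:
        requests.post(
            "https://example.com/api/moving-images",  # hypothetical endpoint
            files={"video": f},
            data={
                "direction": direction,  # "horizontal" or "vertical"
                "care_receiver_id": care_receiver_id,
                "user_id": user_id,
            },
        )

send_moving_image("walk_5m.mp4", "horizontal", "CR-001", "U-042")
```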
The determination result receiving unit 31c performs the following processing: the result of the determination of the implementation level of the living action transmitted from the server device 1 is received for the moving image of the care recipient transmitted from the terminal device 3 to the server device 1. The determination result receiving unit 31c receives the determination result from the server apparatus 1 by the communication unit 33, and stores the data of the received determination result in the storage unit 32. The server device 1 determines the implementation level of a plurality of living actions, and transmits a plurality of determination results to the terminal device 3. When a plurality of determination results are collectively transmitted, the determination result receiving unit 31c collectively receives the plurality of determination results, and when the determination results are individually transmitted in response to a request or the like from the terminal device 3, the determination result receiving unit 31c receives the determination results every time. The determination result receiving unit 31c associates the received determination result with identification information such as the name or ID of the care recipient to be determined, and stores the result in the storage unit 32.
The display processing unit 31d performs the following processing: various characters, images, and the like related to the determination of the level of performance of the living action of the care-receiver are displayed on the display unit 34. The display processing unit 31d displays various screens such as a menu screen and a setting screen related to the information processing system. For example, when the user takes an image of the subject using the camera 36, the display processing unit 31d may display a message or a guide image for assisting the image taking. The display processing unit 31d reads the determination result received from the server device 1 by the determination result receiving unit 31c and stored in the storage unit 32, and displays the determination result of the implementation level of the living action of the care-receiver on the display unit 34.
The correction reception unit 31e performs processing to accept an operation for correcting the implementation level of the care-receiver's living action determined by the server device 1. For example, there are cases where the server device 1 determines from the moving images that the care-receiver cannot perform the action of drying the laundry, that is, implementation level 1, while the care-receiver can actually perform it. Conversely, there are cases where the server device 1 determines that the care-receiver can perform the action of drying the laundry, that is, implementation level 2, while the care-receiver actually cannot. In such cases, the user can perform an operation to correct the determination result of the implementation level. The correction reception unit 31e accepts, based on the user's operation of the operation unit 35, an operation correcting the determination result of the implementation level displayed on the display unit 34, and transmits the content of the accepted correction to the server device 1. The content of the correction may be, for example, simply a notification that the determination result of the implementation level is wrong, or may be the numerical implementation level of the living action that the user judges to be appropriate for the care-receiver.
< implementation level determination processing for living actions >
In the information processing system of the present embodiment, the server device 1 provides a service for determining the implementation level of living actions of a care-receiver. The user of this service is, for example, a caregiver who cares for the care-receiver, and uses a terminal device 3 such as his or her own smartphone to capture the movement motions of the care-receiver. In the present embodiment, the menu screen displayed on the display unit 34 of the terminal device 3 executing the program 32a includes items such as "capture the horizontal movement motion" and "capture the vertical movement motion"; when the user selects such an item, a screen for capturing the movement motion of the care-receiver is displayed on the display unit 34.
On the screen for capturing the horizontal movement motion, the terminal device 3 may display the image captured by the camera 36 on the display unit 34 in real time, and may display a message such as "please capture the care-receiver's 5-meter walk from the front", for example. In this case, the terminal device 3 may also display on the display unit 34 a sample moving image of a horizontal movement motion.
Similarly, on the screen for capturing the vertical movement motion, the terminal device 3 may display the image captured by the camera 36 on the display unit 34 in real time, and may display a message such as "please capture, from the side, the care-receiver standing up from a chair 5 times", for example. In this case, the terminal device 3 may also display on the display unit 34 a sample moving image of a vertical movement motion.
After the moving image is captured, the user operates the terminal device 3 to transmit moving image data to the server device 1, and requests the judgment of the living action implementation level of the care-receiver. In this case, the terminal device 3 may receive a selection of one or more living actions that the user desires to determine from among a plurality of living actions to be subjected to the implementation level determination, and may transmit information relating to the selected living action to the server device 1 together with the request for determination. In such a case, the server apparatus 1 may determine only one or more life actions requested from the user. Alternatively, the terminal device 3 may not accept the selection of the life action, and in such a case, the server device 1 may determine all the life actions that can be determined.
The server device 1 that receives the care-receiver's moving image data from the terminal device 3 first evaluates the care-receiver's mobility based on the received data. In the present embodiment, the server device 1 evaluates the predetermined evaluation items for the horizontal movement motion, such as "speed", "sway", "left-right difference", and "rhythm", based on the moving image of the care-receiver's horizontal movement motion. The server device 1 evaluates each evaluation item on the 5 levels 1 to 5, calculates the average of the evaluation values over all the evaluation items, and uses the calculated average as the comprehensive evaluation value of the care-receiver's horizontal mobility.
Similarly, in the present embodiment, the server device 1 evaluates, based on the moving image of the care-receiver's vertical movement, evaluation items predetermined for vertical movement such as "required time", "posture", "center of gravity movement amount", and "muscle strength". The server device 1 scores each evaluation item on 5 levels from 1 to 5, calculates the average of the scores over all evaluation items, and uses the calculated average as the comprehensive evaluation value of the care-receiver's vertical movement ability.
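As an illustration (not part of the embodiment itself), this scoring can be sketched in Python as follows; the item names follow the description above, while the concrete scores are invented examples.

```python
# Each predetermined item is scored on 5 levels (1-5); the comprehensive
# evaluation value is the plain average over all items of one movement.
HORIZONTAL_ITEMS = ("speed", "sway", "lateral difference", "rhythm")
VERTICAL_ITEMS = ("required time", "posture",
                  "center of gravity movement amount", "muscle strength")

def comprehensive_value(scores: dict) -> float:
    """Average of the per-item evaluation values (each 1 to 5)."""
    return sum(scores.values()) / len(scores)

horizontal = comprehensive_value(
    {"speed": 2, "sway": 3, "lateral difference": 2, "rhythm": 3})
vertical = comprehensive_value(
    {"required time": 4, "posture": 4,
     "center of gravity movement amount": 5, "muscle strength": 3})
print(horizontal, vertical)  # 2.5 4.0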
After evaluating the horizontal and vertical movement abilities, the server device 1 determines the implementation level of the care-receiver's living actions by referring to the determination table 12b stored in the storage unit 12 based on the two evaluation results. In the present embodiment, since the server device 1 determines a plurality of living actions, a plurality of determination tables 12b are stored in advance. The living actions determined by the server device 1 include, for example, basic activities of daily living (BADL) such as transfer, movement, eating, changing clothes, excretion, bathing, and grooming, and instrumental activities of daily living (IADL) such as cleaning, cooking, washing, shopping, use of transportation, communication such as handling telephone calls, schedule management, medication management, money management, and hobbies. In the present embodiment, a determination table 12b is prepared in advance for each of these living actions, and the plurality of determination tables 12b prepared for the plurality of living actions are stored in the server device 1.
The server device 1 acquires, from each determination table 12b, the implementation level corresponding to the combination of the comprehensive evaluation value of the horizontal movement ability and that of the vertical movement ability. The server device 1 acquires implementation levels from the plurality of determination tables 12b in the same way and uses them as the determination results of the implementation levels of the care-receiver's living actions. The server device 1 transmits the determination results of the implementation levels of the living actions to the terminal device 3 that transmitted the moving image data.
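The lookup can be pictured as a small two-dimensional table per living action. The minimal sketch below assumes the table is keyed by the two comprehensive values rounded to the nearest level; the cell contents are illustrative placeholders, not values from the embodiment.

```python
# Hypothetical determination table 12b for one living action ("drying laundry").
# Keys: (horizontal value, vertical value); values: implementation level
# (here 1: cannot perform, 2: can perform, 3: can perform easily).
drying_laundry_table = {
    (h, v): 1 if h + v <= 4 else (2 if h + v <= 7 else 3)
    for h in range(1, 6)
    for v in range(1, 6)
}

def implementation_level(table: dict, h_value: float, v_value: float) -> int:
    """Look up the implementation level for a pair of comprehensive values."""
    key = (min(5, max(1, round(h_value))), min(5, max(1, round(v_value))))
    return table[key]

print(implementation_level(drying_laundry_table, 2.5, 4.0))  # -> 2
```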
The terminal device 3, having received the determination results of the implementation levels of the care-receiver's living actions from the server device 1, displays them on the display unit 34. Figs. 6 to 9 are schematic diagrams showing display examples of the determination result display screens for the implementation levels of living actions displayed by the terminal device 3. The example of Fig. 6 is a comprehensive evaluation report based on the implementation levels determined for a plurality of living actions. Such a comprehensive evaluation report is created by the server device 1 or the terminal device 3 based on the server device 1's determination results for the plurality of living actions. In the present embodiment, the server device 1 creates the evaluation report and transmits it to the terminal device 3 together with the determination results.
For example, the server device 1 extracts the living action determined to have the lowest implementation level, or a living action that has deteriorated since the previous determination. When a plurality of living actions are extracted under these conditions, the server device 1 may appropriately select from among them the living action to be covered by the evaluation report. The server device 1 stores in the storage unit 12 messages and the like predetermined for each living action that may be covered by an evaluation report, and creates the evaluation report by retrieving the message corresponding to the extracted living action.
In the example shown in Fig. 6, the character string "care-receiver A" and a photograph of the care-receiver's face are displayed at the top of the display unit 34 of the terminal device 3 as identification information such as the name or ID of the care-receiver being assessed. Below the face photograph, the terminal device 3 displays the evaluation report created by the server device 1 based on the determination results of the implementation levels of the living actions. In the example shown in Fig. 6, below the title string "Living action evaluation report", the terminal device 3 displays a message such as "The standing-up function is slightly reduced.". The terminal device 3 also displays a message such as "Are there problems with life functions like these?", with examples of such life functions listed as "the action of drying laundry", "standing in the kitchen for a long time is difficult", and "can clean with a mop, but a vacuum cleaner is heavy and tiring".
In addition to such a comprehensive evaluation report, the information processing system of the present embodiment allows the user to individually check the implementation level determined for each of the care-receiver's living actions. In this case, the user operates the terminal device 3 to select the living action whose implementation-level determination result is to be checked. The terminal device 3 accepts the user's selection of the living action, retrieves the determination result for the selected living action from the determination results received from the server device 1, and displays the retrieved result on the display unit 34.
The example of Fig. 7 is a display example of a selection screen for accepting the user's selection of a living action. On the illustrated selection screen, the message "Please select a living action" is displayed at the top, and the selectable living actions are displayed below it as icons or illustrations. In this example, the living actions the user can select comprise 10 actions belonging to the basic activities of daily living (BADL) and 8 actions belonging to the instrumental activities of daily living (IADL). The BADL can include actions such as "transfer", "walking", "climbing stairs", "toilet actions", "bathing", "grooming", "changing clothes", "defecation", "urination", and "eating". The IADL can include actions such as "handling telephone calls", "shopping", "cooking", "money management", "medication management", "use of transportation", "cleaning", and "washing". The terminal device 3 displays icons, illustrations, and the like prepared in advance for these actions in rows, and accepts the user's selection of a living action via a touch operation on one of them. In the illustrated example, "cooking" under the IADL is selected.
The example of Fig. 8 is a display example of a detailed selection screen for accepting a selection of a more detailed action for "cooking", the living action selected on the selection screen of Fig. 7. On the illustrated detailed selection screen, the message "Please select a detailed action related to cooking" is displayed at the top, and a list of selectable detailed living actions is displayed as character strings below it. The terminal device 3 stores a plurality of detailed actions predetermined for each living action, and reads them out and displays them in a list on the detailed selection screen. In this example, the detailed living actions related to "cooking" that the user can select include, for example, "take items out of the kitchen", "store purchased items in the refrigerator", "move a chair in the dining room", "put away dishes", "carry and put away dishes", "wipe and clean the kitchen", "wipe the table", "prepare a meal", and "the cooking action itself". The terminal device 3 displays the detailed living actions in a list and accepts a selection via a touch operation on one of the listed character strings. In the illustrated example, "the cooking action itself" is selected.
Fig. 9 shows a display example of the determination result display screen notifying the user of the determination result of the implementation level for "the cooking action itself", the living action selected on the detailed selection screen of Fig. 8. On the illustrated determination result display screen, a character string indicating "the cooking action itself" as the living action being determined is displayed at the top, and below it the determination result, implementation level 2, is displayed.
Below the determination result of the implementation level, the terminal device 3 displays the evaluation result of the horizontal movement ability and the evaluation result of the vertical movement ability that contributed to the determination. In the illustrated example, an evaluation value of 2 for the horizontal movement ability and an evaluation value of 4 for the vertical movement ability are displayed as "physical ability". For these two evaluation values, the terminal device 3 also displays the reference values required for this living action to reach implementation level 3. In the illustrated example, the "required reference values" are an evaluation value of 3 or more for the horizontal movement ability and 3 or more for the vertical movement ability. Information such as the evaluation results of the two movement abilities and the required reference values is transmitted from the server device 1 to the terminal device 3 together with the determination result of the living action.
On the determination result display screen, the terminal device 3 displays information such as advice for the care-receiver based on the determined implementation level of the living action and the evaluation of the movement abilities. In the illustrated example, the terminal device 3 displays such information under 4 items: "state description", "exercise", "maintenance actions", and "challenge actions". The "state description" item explains the current state of the care-receiver with respect to the living action, displaying, for example, a text such as "Can keep standing during the cooking action itself. When moving around the kitchen, there may be dangerous swaying to the left and right.". The "exercise" item proposes exercises the care-receiver should perform, displaying, for example, "Please do leg-raising exercises to enable smooth movement.". The "maintenance actions" item lists actions the care-receiver is encouraged to keep performing so that they can be maintained in the future, for example "store purchased items in the refrigerator" and "wipe and clean the kitchen". The "challenge actions" item lists actions the care-receiver is encouraged to take on as future challenges, for example "carry and put away dishes".
The texts such as the advice displayed on the terminal device 3 are, for example, stored in advance in a database or the like by the server device 1, retrieved from the database according to the determination result of the implementation level of the living action, and transmitted to the terminal device 3 together with the determination result. Alternatively, the terminal device 3 may store such texts in a database in advance and retrieve and display them based on the determination result received from the server device 1. These texts may be stored in the database in association with the implementation level of the living action, or in association with the implementation level together with the evaluation values of the horizontal and vertical movement abilities.
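One way to organize such stored texts is a table keyed by the living action and the implementation level (optionally widened with the two evaluation values, per the second storage variant above). A minimal sketch with placeholder rows that paraphrase the display example; the keying scheme is an assumption, not the embodiment's concrete schema:

```python
# Placeholder advice database keyed by (living action, implementation level).
ADVICE_DB = {
    ("the cooking action itself", 2): {
        "state description": "Can keep standing during cooking; lateral sway "
                             "while moving around the kitchen may be dangerous.",
        "exercise": "Leg-raising exercise for smoother movement.",
        "maintenance actions": ["store purchased items in the refrigerator",
                                "wipe and clean the kitchen"],
        "challenge actions": ["carry and put away dishes"],
    },
}

def advice_for(action: str, level: int) -> dict:
    """Texts sent to the terminal device 3 together with the determination."""
    return ADVICE_DB.get((action, level), {})
```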
The user may select the living action at any timing. The information processing system may be configured so that the user selects the living action after the moving image data of the care-receiver is transmitted from the terminal device 3 to the server device 1, or, for example, before or at the same time as the transmission of the moving image data. The server device 1 may determine the implementation level only for the selected living action, or for all living actions. When the implementation level is determined for all living actions, the server device 1 may transmit the determination results for all of them to the terminal device 3, or only the determination result for the selected living action.
In the information processing system of the present embodiment, the server device 1 determines the implementation level of detailed actions (for example, the cooking action itself) further specified within the plurality of actions (for example, cooking) included in the basic and instrumental activities of daily living, but the present invention is not limited to this. The server device 1 may determine the implementation levels of the plurality of actions included in the basic activities of daily living (e.g., transfer, walking, climbing stairs, toilet actions, bathing, grooming, changing clothes, defecation, urination, eating, etc.), or of the plurality of actions included in the instrumental activities of daily living (e.g., handling telephone calls, shopping, cooking, money management, medication management, use of transportation, cleaning, washing, etc.). The server device 1 may also determine a comprehensive implementation level of the basic activities of daily living as a whole, i.e., of the plurality of actions they comprise, or likewise a comprehensive implementation level of the instrumental activities of daily living as a whole. The server device 1 may further determine the implementation level of living actions not included in the activities of daily living.
In the information processing system of the present embodiment, the user can perform a correction operation on the implementation level of the living action determined by the server device 1. For example, the server device 1 may determine that the care-receiver can perform "the cooking action itself" (implementation level 2), while the user judges that the care-receiver cannot perform it (implementation level 1) or can perform it easily (implementation level 3). In such cases, the user can use the terminal device 3 to correct the determination result of the implementation level for "the cooking action itself".
A button labeled "Correct" is provided at the bottom of the determination result display screen shown in Fig. 9. When a touch operation on this button is accepted, the terminal device 3 displays a screen (not shown) for accepting a correction of the implementation level of the living action and accepts the user's correction operation. On this screen, the user can, for example, correct the determination result from implementation level 2 to implementation level 1 or 3. The terminal device 3 transmits the implementation level corrected by the user to the server device 1.
When the server device 1 receives information on a correction of the determination result from the terminal device 3, it stores the information in a database. The server device 1 stores, in association with one another, the moving image data of the horizontal and vertical movements from which the original determination was derived, the evaluation values of the horizontal and vertical movement abilities evaluated from that data, the implementation level of the living action determined from those evaluation values (the original determination result), and the implementation level corrected by the user. This information is used, for example, to update the determination table 12b or to update the evaluation criteria for the horizontal and vertical movement abilities. When the server device 1 uses a machine-learned learning model for the determination of the implementation level, the correction information stored in the database can be used for the relearning of that model.
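The stored association can be pictured as one record per correction, appended to a log that later feeds table updates or relearning; the field names below are illustrative assumptions, not the embodiment's schema.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class CorrectionRecord:
    # All field names are assumptions for illustration only.
    horizontal_video_id: str   # source moving image, horizontal movement
    vertical_video_id: str     # source moving image, vertical movement
    horizontal_value: float    # evaluated horizontal movement ability
    vertical_value: float      # evaluated vertical movement ability
    determined_level: int      # original determination by server device 1
    corrected_level: int       # implementation level entered by the user

def store_correction(record: CorrectionRecord,
                     path: str = "corrections.jsonl") -> None:
    """Append one correction; the log can drive updates of the determination
    table 12b or the relearning of a learning model, as described above."""
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(record)) + "\n")
```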
The server device 1 may transmit to the terminal device 3 the advice texts and the like corresponding to the implementation level corrected by the user. The terminal device 3 may receive these texts from the server device 1 and update the advice and other texts displayed on the determination result display screen based on the received information.
< flow chart >
Fig. 10 is a flowchart showing the procedure of the processing performed by the terminal device 3 of the present embodiment. The imaging processing unit 31a of the processing unit 31 of the terminal device 3 performs, in response to the user's operations, the imaging processing of the care-receiver's horizontal movement (step S1) and the imaging processing of the vertical movement (step S2). Either the imaging processing of the horizontal movement in step S1 or that of the vertical movement in step S2 may be performed first. The moving image transmitting unit 31b of the processing unit 31 transmits the moving image data of the horizontal movement captured in step S1 and the moving image data of the vertical movement captured in step S2 to the server device 1 (step S3).
The determination result receiving unit 31c of the processing unit 31 determines whether the determination result of the implementation level of the care-receiver's living actions, transmitted by the server device 1 in response to the transmission of the moving image data, has been received (step S4). When the determination result has not been received (S4: NO), the determination result receiving unit 31c waits until it is received from the server device 1. When the determination result has been received (S4: YES), the display processing unit 31d of the processing unit 31 displays the received determination result on the display unit 34 (step S5). At this time, the display processing unit 31d may display, for example, either the comprehensive evaluation report shown in Fig. 6 or the detailed living-action screens shown in Figs. 7 to 9.
The correction reception unit 31e of the processing unit 31 determines whether the user has corrected the determination result of the implementation level (step S6). If no correction is made (S6: NO), the processing unit 31 ends the processing. When a correction is made (S6: YES), the correction reception unit 31e accepts the user's correction operation, transmits correction information including the corrected implementation level and the like to the server device 1 (step S7), and ends the processing.
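Read as straight-line code, the terminal-side procedure of Fig. 10 looks as follows; the camera, server, and ui objects are assumed interfaces standing in for units 31a to 31e, not an API defined by the embodiment.

```python
def terminal_flow(camera, server, ui) -> None:
    """Steps S1-S7 of Fig. 10 as a sketch over assumed interfaces."""
    horizontal = camera.record("horizontal movement")            # S1
    vertical = camera.record("vertical movement")                # S2 (order may be swapped)
    result = server.request_determination(horizontal, vertical)  # S3, waits for S4
    ui.show(result)                                              # S5
    corrected = ui.ask_for_correction(result)                    # S6, None if no correction
    if corrected is not None:
        server.send_correction(corrected)                        # S7
```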
Fig. 11 is a flowchart showing a procedure of processing performed by the server device 1 according to the present embodiment. The information acquisition unit 11a of the processing unit 11 of the server device 1 of the present embodiment determines whether or not the moving image data transmitted from the terminal device 3 has been received (step S21). When the moving image data is not received (no in S21), the information acquiring unit 11a waits until the moving image data is received.
When the moving image data is received (S21: YES), the operation evaluation unit 11b of the processing unit 11 evaluates the horizontal movement ability of the care-receiver based on the moving image data obtained by capturing the horizontal movement operation of the care-receiver (step S22). The motion evaluation unit 11b evaluates the vertical movement ability of the care recipient based on the moving image data obtained by capturing the vertical movement motion of the care recipient (step S23). Next, the level determination unit 11c of the processing unit 11 reads the determination table 12b stored in the storage unit 12 (step S24). The level determination unit 11c refers to the determination table 12b based on the combination of the evaluation value obtained by the evaluation at step S22 and the evaluation value obtained by the evaluation at step S23, and acquires the implementation level from the determination table 12b, thereby determining the implementation level of the life movement of the care-receiver (step S25).
The determination result transmitting unit 11d of the processing unit 11 transmits the determination result of the implementation level of the care-receiver's living actions determined in step S25 to the terminal device 3 that transmitted the moving image data (step S26). At this time, the determination result transmitting unit 11d may transmit to the terminal device 3, together with the determination result, various associated information, for example the evaluation values of the movement abilities and advice texts corresponding to the determination result.
The correction reception unit 11e of the processing unit 11 determines whether or not information on the correction of the determination result is received from the terminal device 3 that transmitted the determination result (step S27). If the corrected information is not received (no in S27), the processing unit 11 ends the process. When the information on the correction is received (yes in S27), the correction reception unit 11e stores the received information on the correction in the storage unit 12 (step S28), and the process ends.
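The server-side counterpart, steps S21 to S26 of Fig. 11, can be sketched the same way; the two evaluate_* functions are placeholders for the motion evaluation unit 11b, and the table lookup reuses the rounding convention assumed earlier.

```python
def evaluate_horizontal(video) -> float:
    return 2.5  # placeholder for the horizontal-movement evaluation (S22)

def evaluate_vertical(video) -> float:
    return 4.0  # placeholder for the vertical-movement evaluation (S23)

def server_flow(moving_images: dict, tables_12b: dict) -> dict:
    """Steps S21-S26 of Fig. 11 as a sketch (not the patent's actual code)."""
    h = evaluate_horizontal(moving_images["horizontal"])           # S22
    v = evaluate_vertical(moving_images["vertical"])               # S23
    key = (round(h), round(v))                                     # S24-S25
    levels = {action: table[key] for action, table in tables_12b.items()}
    return {"levels": levels, "horizontal": h, "vertical": v}      # S26
```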
< summary >
In the information processing system according to the present embodiment having the above configuration, the server device 1 acquires a moving image obtained by imaging a movement of the care-receiver, the server device 1 evaluates the mobility of the care-receiver based on the acquired moving image, and determines the execution level of the living action of the care-receiver based on the evaluation result. This can be expected to facilitate determination of the level of performance of the life movement of the care-receiver.
In the information processing system according to the present embodiment, the moving image acquired by the server device 1 includes at least one of a moving image obtained by imaging the movement of the care-receiver in the horizontal direction and a moving image obtained by imaging the movement of the care-receiver in the vertical direction. This can be expected to allow the server device 1 to accurately evaluate the mobility of the care-receiver.
In the information processing system according to the present embodiment, the moving image acquired by the server device 1 includes a moving image obtained by imaging the movement of the care recipient in the horizontal direction and a moving image obtained by imaging the movement of the care recipient in the vertical direction. The server device 1 evaluates the mobility for the movement of the care-receiver in the horizontal direction and the mobility for the movement in the vertical direction based on the acquired moving images. The server device 1 refers to the determination table 12b based on the combination of the evaluation results in the horizontal direction and the vertical direction, and determines the execution level of the living action of the care-receiver. This makes it possible to expect that the judgment of the execution level of the living action of the care-receiver can be performed easily and accurately.
In the information processing system according to the present embodiment, a moving image obtained by imaging the walking state of the care-receiver is used as the moving image obtained by imaging the horizontal movement. This makes it possible to easily photograph the horizontal movement of the care-receiver and to determine the level of performance of the living action. The horizontal movement operation is not limited to walking, and may be, for example, running or stepping, as long as the care recipient moves horizontally with respect to the floor.
In the information processing system of the present embodiment, a moving image of the care-receiver standing up is used as the moving image of the vertical movement. This makes it possible to easily capture the care-receiver's vertical movement and determine the implementation level of the living actions. The vertical movement is not limited to standing up and may be, for example, getting up, jumping, or sitting down, as long as the care-receiver moves at least a part of the body in the direction perpendicular to the floor surface (the vertical direction).
In the information processing system of the present embodiment, the living actions determined by the server device 1 include basic activities of daily living (BADL), instrumental activities of daily living (IADL), and other living actions. These activities of daily living are indices widely used in the care field, and by determining their implementation levels the server device 1 can be expected to contribute to the support of care.
In the present embodiment, the imaging of the care-receiver is performed for both the horizontal movement and the vertical movement, but the present invention is not limited to this, and the imaging may be performed only for either the horizontal movement or the vertical movement. For example, when only a horizontal movement is imaged, the determination table 12b may be a table in which the implementation level of the living action is stored in association with the evaluation value of the horizontal movement ability. Alternatively, the server device 1 may calculate a plurality of evaluation values from moving image data obtained by capturing a horizontal movement, and the determination table 12b may store an implementation level corresponding to a combination of the plurality of evaluation values.
Further, the determination table 12b may be prepared according to attributes such as sex and age of the care-receiver, for example. The server apparatus 1 may store a determination table 12b for male and a determination table 12b for female, which are used for determining the implementation level of the "cooking operation itself", independently. In this case, the server device 1 acquires attribute information such as sex or age of the care-receiver from the terminal device 3, and reads out and uses the determination table 12b corresponding to the attribute of the care-receiver from the storage unit 12.
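Keeping a table per attribute amounts to widening the table key. The sketch below assumes sex plus a 10-year age band as the key, purely for illustration; the embodiment does not prescribe a concrete keying scheme.

```python
def table_key(action: str, sex: str, age: int) -> tuple:
    """Key for selecting the determination table 12b matching the
    care-receiver's attributes (the 10-year age banding is an assumption)."""
    return (action, sex, age // 10 * 10)

# e.g. tables_12b[table_key("the cooking action itself", "female", 78)]
# would be consulted after the attribute information such as sex or age
# arrives from the terminal device 3.
```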
Movements of the care-receiver other than horizontal and vertical movement may also be captured to determine the implementation level of living actions. For example, the server device 1 may determine the implementation level based on moving image data of the care-receiver raising and lowering the arms, stepping in place, standing on one foot, or the like. From the arm-raising movement, for example, the implementation level of the care-receiver's dressing can be expected to be determined. From the stepping movement, the care-receiver's balance can be evaluated, and from this the implementation level of, for example, transfer from a wheelchair can be expected to be determined. From standing on one foot, balance and endurance can be evaluated, and the implementation level of, for example, the care-receiver's shopping can be expected to be determined.
In the present embodiment, the server device 1 determines the implementation level of the care-receiver's living actions using the determination table 12b, but the present invention is not limited to this. The server device 1 may, for example, determine the implementation level of living actions from moving image data of the care-receiver's movements using a learning model trained by machine learning such as deep learning. The learning model undergoes a learning process in advance so that, for example, it receives as input moving image data of the care-receiver's movements and outputs the implementation level of the care-receiver's living actions. For example, teacher data is prepared in advance in which moving image data of the care-receiver's movements is associated with the implementation level of the living actions as the correct value, and the learning process is performed using many such teacher data. The learning process may be performed by the server device 1 or by a device other than the server device 1.
In addition, the learning model may have the following structure: for example, moving image data obtained by capturing the horizontal movement and the vertical movement of the care-receiver is received as input, and evaluation values of the horizontal movement and the vertical movement of the care-receiver are output. In this case, the server apparatus 1 may input the moving image data acquired from the terminal apparatus 3 to the learning model, acquire the evaluation value relating to the movement ability in the horizontal direction and the evaluation value relating to the movement ability in the vertical direction output from the learning model, and determine the implementation level of the living action of the care-receiver with reference to the determination table 12 b. In this case, the server device 1 may include a plurality of learning models, such as a learning model that outputs an evaluation value of a mobility in the horizontal direction and a learning model that outputs an evaluation value of a mobility in the vertical direction.
Alternatively, the learning model may be configured to receive as input the evaluation value of the care-receiver's horizontal movement ability and that of the vertical movement ability, and to output the implementation level of the care-receiver's living actions. In this case, the server device 1 may calculate the evaluation values of the care-receiver's movement abilities based on the moving image data acquired from the terminal device 3, input the one or more calculated evaluation values to the learning model, and acquire the implementation level of the living actions output by the learning model.
Further, the server device 1 may include: a first learning model that outputs an evaluation value of a mobility with respect to input of moving image data; and a second learning model that outputs the implementation level of the living action with respect to the input of the evaluation value of the mobility. In this case, the server apparatus 1 inputs the moving image data acquired from the terminal apparatus 3 to the first learning model, and acquires the evaluation value relating to the moving ability in the horizontal direction and the evaluation value relating to the moving ability in the vertical direction output by the first learning model. The server device 1 inputs the evaluation value obtained from the first learning model to the second learning model, and obtains the implementation level of the living action output by the second learning model.
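The two-stage variant can be sketched as below. The first learning model is stubbed, since a real video model is outside the scope of this example, and the second is an off-the-shelf classifier trained on synthetic teacher data; neither the classifier choice nor the data reflect the embodiment's actual models.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def first_model(moving_image_data: bytes) -> tuple:
    # Stub for the first learning model: a real one would infer the two
    # evaluation values from the moving image data of the care-receiver.
    return 4.0, 5.0

# Synthetic teacher data for the second learning model:
# (horizontal value, vertical value) -> implementation level 1-3.
rng = np.random.default_rng(0)
X = rng.integers(1, 6, size=(200, 2)).astype(float)
y = np.where(X.sum(axis=1) <= 4, 1, np.where(X.sum(axis=1) <= 7, 2, 3))

second_model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

h, v = first_model(b"...")                      # evaluation values
level = int(second_model.predict([[h, v]])[0])  # implementation level
print(level)
```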
In the present embodiment, the server device 1 is configured to perform processing for determining the level of execution of the living action of the care-receiver based on the moving image data, but the present invention is not limited thereto, and the determination may be performed by the terminal device 3. For example, when the terminal device 3 performs all of calculation of the evaluation value based on the moving image data and determination of the implementation level based on the calculated evaluation value, the information processing system may not include the server device 1. For example, the server device 1 may calculate an evaluation value based on moving image data, the terminal device 3 may transmit the moving image data to the server device 1 to acquire the evaluation value, and the terminal device 3 may determine the implementation level based on the evaluation value acquired from the server device 1.
The processing of calculating the evaluation values of the care-receiver's movement abilities from the moving image data may also be performed not by the server device 1 or the terminal device 3 but by a specialist in care. In this case, for example, the server device 1 may transmit the moving image data acquired from the terminal device 3 to a device used by the specialist, the specialist's evaluation operation may be accepted on that device and the resulting evaluation values transmitted to the server device 1, and the server device 1 may determine the implementation level of the living actions based on the specialist's evaluation values. The server device 1 may store and accumulate the moving image data and the specialist's evaluation values in a database and use them as teacher data for the machine learning of a learning model that outputs evaluation values in response to moving image data input. Some of the plurality of evaluation items relating to the movement abilities may be evaluated by specialists and the rest by the server device 1.
The screen displays shown in Figs. 6 to 9 are examples, and the present embodiment is not limited to them. Likewise, the various actions listed as living actions in the present embodiment are examples and are not limiting. The server device 1 may, for example, determine the implementation level of various actions included in neither the basic activities of daily living (BADL) nor the instrumental activities of daily living (IADL).
< embodiment 2 >
The information processing system of embodiment 2 calculates evaluation values of the horizontal and vertical movement abilities based on moving image data of the care-receiver's horizontal and vertical movements, and presents targets, problems, challenge actions, and the like for the care-receiver's living actions based on the calculated evaluation values. The processing by which the information processing system of embodiment 2 calculates the evaluation values of the care-receiver's horizontal and vertical movement abilities is the same as that described in embodiment 1, and its description is therefore omitted.
From the evaluation values of the horizontal and vertical movement abilities calculated based on the moving image data, the server device 1 of embodiment 2 extracts living actions such as those the care-receiver can perform easily (e.g., living actions corresponding to implementation level 3) and those the care-receiver can perform, though not easily (e.g., living actions corresponding to implementation level 2).
Fig. 12 is a schematic diagram for explaining the extraction of living actions by the server device 1 of embodiment 2. In the information processing system of embodiment 2, the living actions the care-receiver can perform are classified into 7 zones, the illustrated A zone to G zone, based on the combination of the 5-level evaluation value of the horizontal movement ability and the 5-level evaluation value of the vertical movement ability. Fig. 12 shows a graph whose horizontal axis is the evaluation value of the horizontal movement ability and whose vertical axis is the evaluation value of the vertical movement ability, with the 7 zones, A to G, drawn as rectangular regions in the graph.
For example, the A zone contains living actions that can be performed by a care-receiver whose evaluation value of the horizontal movement ability is 5 and whose evaluation value of the vertical movement ability is 5, and is the zone of the most difficult living actions. The B zone contains living actions that can be performed by a care-receiver whose horizontal evaluation value is 3 or 4 and whose vertical evaluation value is 5. The C zone corresponds to a horizontal evaluation value of 5 and a vertical evaluation value of 2 to 4. The D zone corresponds to a horizontal evaluation value of 3 or 4 and a vertical evaluation value of 3 or 4. The E zone corresponds to a horizontal evaluation value of 1 or 2 and a vertical evaluation value of 3 or 4. The F zone corresponds to a horizontal evaluation value of 3 or 4 and a vertical evaluation value of 1 or 2. The G zone corresponds to a horizontal evaluation value of 1 or 2 and a vertical evaluation value of 1 or 2, and is the zone of the least difficult living actions. In this example, no zone is defined for the case where the horizontal evaluation value is 5 and the vertical evaluation value is 1, or where the horizontal evaluation value is 1 or 2 and the vertical evaluation value is 5, but zones may also be defined for these.
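Since Fig. 12 is simply a partition of the 5x5 grid of evaluation-value pairs, the zone assignment can be written down directly; in the sketch below the undefined corners return None, matching this example.

```python
def zone(h: int, v: int):
    """Map the two 5-level evaluation values to zones A-G of Fig. 12;
    returns None for the combinations left undefined in this example."""
    if h == 5 and v == 5:
        return "A"
    if h in (3, 4) and v == 5:
        return "B"
    if h == 5 and v in (2, 3, 4):
        return "C"
    if h in (3, 4) and v in (3, 4):
        return "D"
    if h in (1, 2) and v in (3, 4):
        return "E"
    if h in (3, 4) and v in (1, 2):
        return "F"
    if h in (1, 2) and v in (1, 2):
        return "G"
    return None  # (5, 1) and (1 or 2, 5) have no zone defined

print(zone(4, 5))  # -> "B"
```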
The server device 1 according to embodiment 2 stores one or more living actions that can be performed by a care-giver for each of the sections a to G described above. The association of each section with the life action may be determined by, for example, an expert or the like, or may be based on, for example, information obtained by the operation of the present system.
For example, when the server device 1 calculates, from the moving image data acquired from the terminal device 3, an evaluation value of 4 for the care-receiver's horizontal movement ability and 5 for the vertical movement ability, the care-receiver is estimated to be able to perform the living actions of the B zone (see the bold solid frame in the figure). Further, the living actions of the A zone adjacent to the B zone are estimated to be of higher difficulty for this care-receiver, and those of the D zone adjacent to the B zone of lower difficulty.
From among the zone corresponding to the care-receiver (the B zone in this example) and the zones adjacent to it (the A zone and the D zone in this example), the server device 1 acquires the living actions of the care-receiver's own zone and of the adjacent zone of higher difficulty (the A zone in this example). Although not illustrated, when the zone corresponding to the care-receiver is, for example, the D zone, whose 4 adjacent zones are B, C, E, and F, the server device 1 acquires the living actions of the D zone and of the B and C zones, which are more difficult than the care-receiver's D zone. That is, the server device 1 acquires the living actions of the zone corresponding to the care-receiver and of the zones adjacent to it on the right side or the upper side in the graph of Fig. 12.
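This acquisition rule, the care-receiver's own zone plus the adjacent zones of higher difficulty (to the right or above in Fig. 12), can be held as a small adjacency map. Only the B and D cases are spelled out in the text; the remaining entries below are read off the figure and should be treated as assumptions.

```python
HARDER_NEIGHBORS = {
    "A": [],          # most difficult zone, nothing above or to its right
    "B": ["A"],
    "C": ["A"],
    "D": ["B", "C"],  # the example spelled out in the text
    "E": ["D"],
    "F": ["D", "C"],
    "G": ["F", "E"],
}

def inquiry_zones(current: str) -> list:
    """Zones whose living actions are listed on the inquiry screen."""
    return [current] + HARDER_NEIGHBORS[current]

print(inquiry_zones("B"))  # -> ['B', 'A']
```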
The server device 1 transmits information on the acquired one or more living actions to the terminal device 3 and inquires whether the care-receiver can perform those living actions. The terminal device 3 lists the living actions included in the information received from the server device 1 and displays on the display unit 34 an inquiry screen asking the user whether the care-receiver can perform them.
Fig. 13 is a schematic diagram showing an example of the inquiry screen displayed by the terminal device 3 of embodiment 2. The illustrated inquiry screen displays the title string "Living action items of the B zone" at the top, and below it the message "Please select the items that are currently possible.". The title string changes according to the zone corresponding to the care-receiver.
On the illustrated inquiry screen, a plurality of living actions are listed below the above message, each associated with a check box. In this example, living actions such as "can change clothes while standing", "can wipe own body while standing", "can step over the bathtub rim while standing", "can sit on the floor and do wiping cleaning", "can clean the toilet", "can use a vacuum cleaner", and "can clean the bath" are listed. A check box is provided to the left of each listed living action, and the user selects the living actions the care-receiver can perform by checking the corresponding check boxes.
On the illustrated inquiry screen, the character string "next target zone" and an image schematically showing the next target zone are displayed below the listed living actions. In this example, the care-receiver's next target zone is the A zone, which is highlighted in the graph by being enclosed in a rectangular frame. The information such as the character strings and images displayed on the inquiry screen may be stored in the server device 1 and transmitted to the terminal device 3, or may be stored in the terminal device 3.
A button labeled "OK" is provided at the bottom of the illustrated inquiry screen. When the user performs a click or touch operation on the OK button after checking the check boxes, the living actions selected by the check boxes are confirmed as living actions the care-receiver can perform. The terminal device 3, having accepted the operation on the OK button, accepts the selection of the living actions the care-receiver can perform according to whether each check box is checked. The terminal device 3 transmits to the server device 1 information on the living actions selected as performable by the care-receiver, or on those not selected.
In the present embodiment, the selection of the life action that can be performed by the care-receiver is received on the inquiry screen, but the present invention is not limited to this, and the selection of the life action that cannot be performed by the care-receiver may be received.
The server device 1, having received from the terminal device 3 the information on the selection of the living actions the care-receiver can perform in response to the inquiry, causes the terminal device 3 to display a study screen for studying future targets for the care-receiver. At this time, the server device 1 transmits the information necessary for displaying the study screen to the terminal device 3, and the terminal device 3 displays the study screen on the display unit 34 based on the information received from the server device 1.
Fig. 14 is a schematic diagram showing an example of a study screen displayed by the terminal device 3 according to embodiment 2. The illustrated study screen displays a title character string of "living action target search table" at the uppermost part, and information of "care recipient a" as the "name" of the care recipient and information of "2020/6/24" as the "measurement day" are displayed in a left-right array below the title character string. The "measurement date" may be a date when moving image data obtained by capturing the movement of the care-receiver is transmitted to the server device 1. The "measurement date" may be a date on which the movement of the care-receiver, such as walking or standing, is imaged. When the date of the movement operation in the horizontal direction and the date of the movement operation in the vertical direction are different from each other, for example, a newer date may be used, an older date may be used, or both dates may be displayed separately.
In the illustrated study screen, the image of the care recipient and the information of the zone corresponding to the care recipient are displayed in a left-right arrangement below the above-described "name" and "measurement date". The image of the care recipient can be, for example, a still image or a moving image extracted by the server device 1 from moving image data captured by the terminal device 3 and used for calculation of an evaluation value of mobility. In addition, in the image, the skeletal information of the care recipient extracted when the server device 1 calculates the evaluation value is displayed so as to overlap the care recipient. The information on the section corresponding to the care recipient is shown by adding a star mark to the section corresponding to the care recipient in the image in which the graph shown in fig. 12 is reduced, and in this example, the section corresponding to the care recipient is the B section.
On the illustrated study screen, information on the "current action level" and on "items requiring assessment" is displayed, one above the other, below the image of the care-receiver and the zone information. Under "current action level", a text such as the following is displayed: "Current state: the physical function level allows stepping over the bathtub rim while standing. However, attention must be paid to swaying to the left and right. Beware of falls when lifting one foot to keep balance.". The text displayed here may, for example, be one that objectively describes the evaluation results of the measurements of the horizontal and vertical movements.
In the "project requiring evaluation", for example, a "confirmation of how to cross the bath at present is displayed in a itemized form", and if there is a fear that the user lifts his or her foot, the user is concerned about the installation of the armrest. "," can you squat in the bath, stand up from the bath? "and" are there troubles when opening and closing the door of the bathroom? "and so on. The displayed article cannot be estimated from, for example, the evaluation result of the movement operation and the selection result on the inquiry screen shown in fig. 13, but it can be an article in which contents such as a question that can expect information from the care-receiver are described by a nurse, a specialist, or the like performing an interview with the care-receiver.
The server device 1 stores a plurality of texts to be displayed under "current action level" and "items requiring assessment" in association with, for example, the combination of the evaluation results of the horizontal and vertical movements, or the zone corresponding to the care-receiver. The server device 1 selects appropriate texts from among those stored and transmits them to the terminal device 3, thereby displaying them on the study screen. The texts stored in the server device 1 can be prepared in advance by, for example, the developer of the present system.
In the illustrated study screen, "display below" an item that needs to be evaluated "is a future target? "question texts are displayed as candidates for questions of a plurality of life actions as options for questions. In this example, candidates for the problems such as "stability of the bath spanning operation is improved", "body is wiped in a standing state", "clothes are put on and taken off in a standing state", and "laundry can be dried without falling over" are displayed. The server apparatus 1 determines a candidate of a problem of a living action to be displayed as a selection item of a future target on the study screen based on a living action that is not selected as a living action that can be performed by the care recipient in the inquiry screen shown in fig. 13, that is, a living action that cannot be performed by the care recipient. The server apparatus 1 may store the candidates of the problem of the living action in association with the living action that cannot be performed by the care-receiver, for example, or may directly use the living action that cannot be performed by the care-receiver as the candidate of the problem of the living action, for example.
The terminal device 3 accepts the selection of a problem by accepting a click or touch operation on one of the candidate problems of living actions displayed as options under "What is the future target?". In this example, "be able to dry laundry without falling" is selected. The terminal device 3 transmits information on the selected problem of the living action to the server device 1. The server device 1, having received this information, sets the selected problem as the care-receiver's problem, stores it in association with the care-receiver, and causes the terminal device 3 to display an analysis screen relating to the problem. At this time, the server device 1 transmits the information necessary for displaying the analysis screen to the terminal device 3, and the terminal device 3 displays the analysis screen on the display unit 34 based on the information received from the server device 1.
Fig. 15 is a schematic diagram showing an example of an analysis screen displayed by the terminal device 3 according to embodiment 2. The illustrated analysis screen displays a title character string of "life action evaluation table" at the uppermost part, and information of "care recipient a" as the "name" of the care recipient and information of "2020/6/24" as the "measurement day" are displayed in a left-right array below the title character string.
On the illustrated analysis screen, the information "be able to dry laundry without falling" as the "short-term target" and the information "partially achieved" as the "short-term target achievement level" are displayed, one above the other, below the "name" and "measurement date". The "short-term target" is the problem of the living action selected on the study screen of Fig. 14. The "short-term target achievement level" indicates how far the short-term target has been achieved, for example by levels such as "achieved", "partially achieved", and "not achieved". The server device 1 can determine the achievement level of the problem of the living action based on the moving image data of the care-receiver, the evaluation values of the movement abilities calculated from that data, and the like.
In the illustrated analysis screen, the image of the care recipient and the information of the section corresponding to the care recipient are displayed in a left-right arrangement below the "short-term goal achievement level". The image of the care recipient and the information of the section are the same as those displayed on the study screen, and therefore, the description thereof is omitted.
On the illustrated analysis screen, information on the "current action level", "points for achieving the target", and "content of function training" is displayed, one above the other, below the image of the care-receiver and the zone information. Under "current action level", a text such as the following is displayed: "Drying laundry involves working at a height above the head. When moving while carrying the basket, there may be swaying to the left and right.". Under "points for achieving the target", items are displayed in bullet form, for example "Can walk indoors and over short distances even without a cane.", "The ability to walk while carrying the basket or the like needs to be improved.", and "Improve standing balance, including shifting the center of gravity up, down, left, and right, without support.". Under "content of function training", information relating to training, such as "trunk muscle training" and "walking practice (direction-change practice)", is displayed in bullet form. These pieces of information can be displayed by the server device 1 transmitting information stored in a database to the terminal device 3, for example. Information such as the texts to be displayed under "current action level", "points for achieving the target", and "content of function training" is stored in the database in association with the care-receiver's movement abilities or zone.
Further, the user can display the analysis screen on the terminal device 3 at any time. The server device 1 stores evaluation values regarding the mobility of the care-receiver, corresponding zones, and problems of life actions and life actions that can be performed, and causes the terminal device 3 to display an analysis screen based on the stored information in response to a request from the terminal device 3. When moving image data obtained by imaging a movement operation of the care-receiver is newly acquired from the terminal device 3, the server device 1 performs calculation of an evaluation value of the mobility based on the moving image data, and updates information displayed on the analysis screen.
Fig. 16 is a flowchart showing a procedure of processing performed by the server device 1 according to embodiment 2. The information acquisition unit 11a of the processing unit 11 of the server device 1 according to embodiment 2 determines whether or not the moving image data transmitted from the terminal device 3 has been received (step S41). When the moving image data is not received (no in S41), the information acquiring unit 11a waits until the moving image data is received.
When the moving image data is received (S41: YES), the operation evaluation unit 11b of the processing unit 11 evaluates the horizontal movement ability of the care-receiver based on the moving image data obtained by capturing the horizontal movement operation of the care-receiver (step S42). The motion evaluation unit 11b evaluates the vertical movement ability of the care recipient based on the moving image data obtained by capturing the vertical movement motion of the care recipient (step S43).
In this case, the motion evaluation unit 11b may acquire the imaging time information added to the received moving image data and end the process without evaluating the mobility when the difference between the imaging time of the horizontal movement action and the imaging time of the vertical movement action is larger than a threshold (for example, one to three months). When the evaluation is not performed, the server device 1 may notify the terminal device 3 of this fact and cause the terminal device 3 to display a message or the like requesting the user to re-capture the movement of the care-receiver.
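This imaging-time check can be sketched as follows; the 90-day threshold is merely one value inside the one-to-three-month range mentioned above, and the function names are illustrative.

```python
# Sketch of the imaging-time consistency check: if the two videos were
# captured too far apart, evaluation is skipped and a re-capture message
# can be shown. The 90-day threshold is an assumed value within the
# one-to-three-month range stated above.

from datetime import datetime, timedelta

MAX_IMAGING_GAP = timedelta(days=90)

def should_evaluate(horizontal_shot_at: datetime,
                    vertical_shot_at: datetime) -> bool:
    return abs(horizontal_shot_at - vertical_shot_at) <= MAX_IMAGING_GAP

if not should_evaluate(datetime(2021, 1, 10), datetime(2021, 6, 1)):
    print("Please re-capture the movement of the care-receiver.")
```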
Next, the processing unit 11 determines the section corresponding to the care-receiver based on the evaluation value of the horizontal movement ability calculated in step S42 and the evaluation value of the vertical movement ability calculated in step S43 (step S44). The server device 1 stores the correspondence between the evaluation values and the sections shown in fig. 12, and the processing unit 11 can determine the section based on this stored correspondence. The processing unit 11 then acquires the living actions associated with the section corresponding to the care-receiver determined in step S44 and with the sections adjacent to it (step S45).
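Since fig. 12 itself is not reproduced in this excerpt, the correspondence table below is invented purely for illustration; only the mechanism follows the description, namely a pair of evaluation values looked up in a stored table, followed by retrieval of the living actions for the resulting section and its neighbours.

```python
# Sketch of steps S44-S45: map the two evaluation values to a section via
# a stored correspondence table, then gather the living actions associated
# with that section and its adjacent sections. All concrete thresholds,
# section labels, and living actions are illustrative assumptions.

def to_grade(score: float) -> str:
    if score < 2.0:
        return "low"
    if score < 4.0:
        return "mid"
    return "high"

SECTION_TABLE = {
    ("low", "low"): "A", ("low", "mid"): "B", ("low", "high"): "C",
    ("mid", "low"): "D", ("mid", "mid"): "E", ("mid", "high"): "F",
    ("high", "low"): "G", ("high", "mid"): "H", ("high", "high"): "I",
}

ADJACENT = {"E": ["B", "D", "F", "H"]}  # neighbourhood, illustrative only

ACTIONS_BY_SECTION = {
    "E": ["walking indoors"],
    "F": ["drying laundry"],
    # ... one entry per section in the real database
}

def determine_section(h_score: float, v_score: float) -> str:
    return SECTION_TABLE[(to_grade(h_score), to_grade(v_score))]

def candidate_living_actions(section: str) -> list:
    actions = list(ACTIONS_BY_SECTION.get(section, []))
    for neighbour in ADJACENT.get(section, []):
        actions += ACTIONS_BY_SECTION.get(neighbour, [])
    return actions

section = determine_section(3.1, 3.5)     # -> "E"
print(candidate_living_actions(section))  # actions for E and its neighbours
```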
Next, the processing unit 11 causes the terminal device 3 to display an inquiry screen for inquiring which of the living actions acquired in step S45 the care-receiver can (or cannot) perform (step S46). At this time, the processing unit 11 provides the terminal device 3 that transmitted the moving image data with an instruction to display the inquiry screen together with the information on the living actions acquired in step S45. The terminal device 3 that has received the display instruction displays, on the display unit 34, an inquiry screen listing the living actions as options based on the information received together with the display instruction. The terminal device 3 accepts a selection of the living actions that the care-receiver can perform on the inquiry screen and transmits information on the accepted selection result to the server device 1. The processing unit 11 of the server device 1 accepts the selection of the living actions that the care-receiver can perform based on the information received from the terminal device 3 (step S47).
The processing unit 11 then causes the terminal device 3 to display, based on the living actions selected in step S47, a study screen for studying a problem related to the living action of the care-receiver (step S48). At this time, the processing unit 11 determines one or more candidates of the problem of the living action to be displayed as options based on the living actions that the care-receiver cannot perform. The processing unit 11 provides the terminal device 3 with an instruction to display the study screen together with information on the determined candidates of the problem and the various information to be displayed on the study screen. The terminal device 3 that has received the display instruction displays, on the display unit 34, a study screen presenting the candidates of the problem as options together with the various information, based on the information received with the display instruction. The terminal device 3 accepts a selection of a problem from among the candidates presented on the study screen and transmits information on the accepted selection result to the server device 1. The processing unit 11 of the server device 1 accepts the selection of the problem based on the information received from the terminal device 3 (step S49) and determines the problem of the living action of the care-receiver.
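A minimal sketch of the candidate determination in step S48, assuming a hypothetical mapping from unperformable living actions to problem candidates; the actual database contents are not reproduced in this excerpt.

```python
# Sketch of deriving problem candidates from the living actions the
# care-receiver cannot perform. The mapping is a hypothetical stand-in
# for the database described above.

PROBLEM_CANDIDATES = {
    "drying laundry": [
        "Dry the laundry without falling over",
        "Carry the laundry basket over a short distance",
    ],
    "walking outdoors": ["Walk a short distance without a crutch"],
}

def problem_candidates(unperformable_actions: list) -> list:
    candidates = []
    for action in unperformable_actions:
        candidates += PROBLEM_CANDIDATES.get(action, [])
    return candidates

# The terminal device 3 would present these as options on the study screen.
print(problem_candidates(["drying laundry"]))
```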
The processing unit 11 then causes the terminal device 3 to display, based on the problem selected in step S49, an analysis screen related to the living action and the problem of the care-receiver (step S50). At this time, the processing unit 11 provides the terminal device 3 with an instruction to display the analysis screen together with various information corresponding to the selected problem. The terminal device 3 that has received the display instruction displays, on the display unit 34, an analysis screen presenting various information related to the problem of the care-receiver based on the information received together with the display instruction.
The server device 1 according to embodiment 2 configured as described above calculates the evaluation values of the movement abilities in the horizontal direction and the vertical direction based on moving image data obtained by imaging the movement actions of the care-receiver in the horizontal direction and the vertical direction, and classifies the care-receiver into one of a plurality of groups (sections) corresponding to the movement abilities based on the calculated evaluation values. The server device 1 accepts a selection of the living actions that the care-receiver can or cannot perform from among a plurality of living actions corresponding to the group into which the care-receiver was classified and to other groups of similar mobility (the classified section and its adjacent sections), and determines candidates of the problem related to the living action of the care-receiver. The server device 1 then accepts a selection of a problem of the living action of the care-receiver from among the plurality of candidates and determines the selected candidate as the problem of the living action of the care-receiver.
Thus, the information processing system according to embodiment 2 can be expected to make it easier to set the problems, goals, and the like of the care-receiver. By managing information such as the problem of the care-receiver and its achievement degree, the server device 1 can be expected to help a nurse or the like set the goals of a care plan, for example.
In embodiment 2, the evaluation values and the classification into the sections shown in fig. 12 are examples, and the present invention is not limited to these. Likewise, the configurations of the screens shown in fig. 13 to 15, the texts displayed on the screens, and the like are examples and are not limiting. The screens shown in fig. 14 and 15 need not be displayed on the display unit 34 of the terminal device 3; they may instead be generated by the server device 1 and transmitted to the terminal device 3 as a file such as an image file, a document file, or a PDF (Portable Document Format) file, or printed and output from a printer, for example. The server device 1 according to embodiment 2 is configured to present the candidates of the problem and accept the selection of the problem, but the present invention is not limited to this, and the server device 1 may determine the problem without presenting the candidates.
Since other configurations of the information processing system according to embodiment 2 are the same as those of the information processing system according to embodiment 1, the same reference numerals are given to the same parts, and detailed description thereof is omitted.
The embodiments disclosed herein are illustrative in all respects and should not be considered as limiting. The scope of the present invention is indicated by the claims rather than by the above description, and is intended to include all modifications within the meaning and scope equivalent to the claims.

Claims (13)

1. An information processing method, wherein an information processing apparatus:
acquiring a moving image obtained by imaging a movement action of a care-receiver,
evaluating the mobility of the care-receiver based on the acquired moving image,
determining an implementation level of a living action of the care-receiver based on an evaluation result of the mobility.
2. The information processing method according to claim 1,
the moving image includes at least one of a horizontal movement moving image obtained by imaging a horizontal movement action of the care-receiver and a vertical movement moving image obtained by imaging a vertical movement action of the care-receiver.
3. The information processing method according to claim 2,
the moving image includes a horizontal movement moving image obtained by imaging a horizontal movement action of the care-receiver and a vertical movement moving image obtained by imaging a vertical movement action of the care-receiver,
the information processing apparatus:
evaluating a horizontal movement ability of the care-receiver based on the horizontal movement moving image and a vertical movement ability of the care-receiver based on the vertical movement moving image,
and determining the implementation level of the living action of the care-receiver based on a combination of an evaluation result of the horizontal movement ability and an evaluation result of the vertical movement ability.
4. The information processing method according to claim 2 or 3,
the horizontal movement moving image is a moving image obtained by imaging a state in which the care-receiver is walking.
5. The information processing method according to any one of claims 2 to 4,
the vertical movement moving image is a moving image obtained by imaging a state in which the care-receiver stands up.
6. The information processing method according to any one of claims 1 to 5,
the living action includes a daily living action.
7. The information processing method according to claim 6,
the daily living action includes a BADL (Basic Activity of Daily Living) and an IADL (Instrumental Activity of Daily Living).
8. The information processing method according to any one of claims 1 to 7,
the information processing apparatus:
accepting a selection of a living action to be output from among a plurality of living actions,
acquiring a determination result of the implementation level of the accepted living action,
and outputting the acquired determination result of the implementation level.
9. The information processing method according to any one of claims 1 to 8,
the information processing apparatus:
acquiring a plurality of living actions based on the evaluated mobility of the care-receiver,
accepting a selection of one or more living actions that the care-receiver can or cannot perform from among the acquired plurality of living actions,
determining a task related to the living action of the care-receiver based on the accepted selection of the living actions.
10. The information processing method according to claim 9,
the information processing apparatus:
determining a plurality of candidates of the task related to the living action of the care-receiver based on the accepted selection of the living actions,
accepting a selection of the task related to the living action of the care-receiver from among the determined plurality of candidates,
determining the task related to the living action of the care-receiver based on the accepted selection of the task.
11. An information processing apparatus includes:
an acquisition unit that acquires a moving image obtained by imaging a movement action of a care-receiver;
an evaluation unit that evaluates the mobility of the care-receiver based on the moving image acquired by the acquisition unit; and
a determination unit that determines an implementation level of a living action of the care-receiver based on a result of the evaluation of the mobility by the evaluation unit.
12. A non-volatile recording medium having a computer program recorded thereon, the computer program causing a computer to execute:
acquiring a moving image obtained by imaging a movement action of a care-receiver,
evaluating the mobility of the care-receiver based on the acquired moving image,
determining an implementation level of a living action of the care-receiver based on an evaluation result of the mobility.
13. A non-volatile recording medium having a computer program recorded thereon, the computer program causing a computer to execute:
accepting a selection of a living action to be output from among a plurality of living actions,
acquiring a determination result of the implementation level of the accepted living action,
and outputting the acquired determination result of the implementation level.
CN202110931288.4A 2020-08-24 2021-08-13 Information processing method, information processing apparatus, and recording medium Pending CN114189509A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-140853 2020-08-24
JP2020140853A JP6971448B1 (en) 2020-08-24 2020-08-24 Information processing method, information processing device and computer program

Publications (1)

Publication Number Publication Date
CN114189509A true CN114189509A (en) 2022-03-15

Family

ID=78605775

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110931288.4A Pending CN114189509A (en) 2020-08-24 2021-08-13 Information processing method, information processing apparatus, and recording medium

Country Status (2)

Country Link
JP (1) JP6971448B1 (en)
CN (1) CN114189509A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001188859A (en) * 2000-05-19 2001-07-10 Nissetsu Engineering Co Ltd Care necessity level authorization method, care necessity level authorization system, recording medium and portable terminal control equipment
JP2001250004A (en) * 1999-12-27 2001-09-14 Nissetsu Engineering Co Ltd Method and system for authorizing care requirement, recording medium, and portable terminal control equipment
JP2018106437A (en) * 2016-12-27 2018-07-05 積水化学工業株式会社 Behavioral assessment system and behavioral assessment method
CN110279991A (en) * 2019-06-17 2019-09-27 河南翔宇医疗设备股份有限公司 A kind of scene exercise rehabilitation training system
CN110738192A (en) * 2019-10-29 2020-01-31 腾讯科技(深圳)有限公司 Human motion function auxiliary evaluation method, device, equipment, system and medium
US20200075177A1 (en) * 2018-09-04 2020-03-05 International Business Machines Corporation Constructing thresholds based on temporal profiles to determine outliers in smart environments

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005122292A (en) * 2003-10-14 2005-05-12 Sekisui Chem Co Ltd Living situation display device
JP4200217B2 (en) * 2004-09-01 2008-12-24 国立大学法人弘前大学 Nursing care support system
JP2015049825A (en) * 2013-09-04 2015-03-16 日本電気株式会社 Information processor, life support method, and computer program
CN110215188A (en) * 2018-05-23 2019-09-10 加利福尼亚大学董事会 System and method for promoting rehabilitation
WO2020144835A1 (en) * 2019-01-11 2020-07-16 ソニー株式会社 Information processing device and information processing method

Also Published As

Publication number Publication date
JP6971448B1 (en) 2021-11-24
JP2022036573A (en) 2022-03-08

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination