WO2021085369A1 - Meal amount measuring device and method - Google Patents


Info

Publication number
WO2021085369A1
WO2021085369A1 (application PCT/JP2020/040067)
Authority
WO
WIPO (PCT)
Prior art keywords
meal
tableware
image
amount
unit
Application number
PCT/JP2020/040067
Other languages
French (fr)
Japanese (ja)
Inventor
Shunichiro Nonaka (野中 俊一郎)
Original Assignee
FUJIFILM Corporation (富士フイルム株式会社)
Application filed by FUJIFILM Corporation
Publication of WO2021085369A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 - General purpose image data processing
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance

Definitions

  • the present invention relates to a meal amount measuring device and method, and particularly to a technique for measuring the amount of food eaten by a care recipient in a long-term care facility or an inpatient in a hospital.
  • the method for measuring in-hospital meal intake described in Patent Document 1 can calculate the various nutrients, such as calories and vitamins, actually ingested by each patient from the meal provided on the meal tray. However, it cannot recognize the staple food and the side dishes in the meal, and therefore cannot measure how much of the staple food and the side dishes was eaten.
  • the present invention has been made in view of such circumstances, and its purpose is to provide a meal amount measuring device and method capable of accurately measuring, with a simple device, the amount eaten of at least one of a staple food or a side dish provided on a meal tray.
  • the meal amount measuring device includes an imaging unit that photographs a meal tray before and after a meal, and an image acquisition unit that acquires the first image taken before the meal and the second image taken after the meal from the imaging unit.
  • a tableware detection unit detects a plurality of pieces of tableware in the meal tray based on each of the first image and the second image.
  • a dish recognition unit recognizes, for each piece of tableware detected from the first image, the type of dish served on it, and a staple food/side dish determination unit determines, for each piece of tableware, whether its dish corresponds to a staple food or a side dish from the recognized type.
  • a tableware set determination unit determines the same tableware sets from the plurality of tableware detected from the first image and the plurality of tableware detected from the second image.
  • a meal amount recognition unit recognizes the amount of the dish on each piece of tableware based on the images of each same tableware set, and a staple food/side dish meal amount calculation unit calculates the amount eaten of at least one of the staple food or the side dish based on the per-tableware determination results of the staple food/side dish determination unit and the per-tableware meal amounts recognized by the meal amount recognition unit.
  • a first image of the pre-meal meal tray and a second image of the post-meal meal tray are acquired, and a plurality of pieces of tableware in the meal tray are detected based on each of the acquired first and second images.
  • the type of dish served on each piece of tableware is recognized for each piece detected from the first image, and whether the dish in each piece corresponds to a staple food or a side dish is determined for each piece from the recognized type.
  • the same tableware sets are determined from the plurality of tableware detected from the first image and the plurality detected from the second image, and the amount of the dish on each piece of tableware is recognized based on the images of each determined set. The amount eaten of at least one of the staple food or the side dish is then calculated from the per-tableware staple food/side dish determination results and the per-tableware recognized meal amounts.
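The final calculation, combining per-tableware eaten amounts into staple-food and side-dish totals, can be sketched in Python. The data layout, the gram values, and all ratios except the 80/20 curry-rice example are illustrative assumptions, not values from the embodiment:

```python
def calc_staple_and_side_amounts(per_tableware):
    """Aggregate eaten amounts into staple-food and side-dish totals.

    per_tableware: list of dicts (hypothetical layout) with
      'eaten'  - amount eaten from this piece of tableware (e.g. grams)
      'staple' - staple-food ratio of the dish (0.0 to 1.0);
                 the side-dish ratio is taken as (1 - staple).
    """
    staple_total = sum(t["eaten"] * t["staple"] for t in per_tableware)
    side_total = sum(t["eaten"] * (1.0 - t["staple"]) for t in per_tableware)
    return staple_total, side_total

# Example tray: rice (pure staple), curry rice (80/20), one pure side dish.
tray = [
    {"eaten": 150.0, "staple": 1.0},   # rice
    {"eaten": 100.0, "staple": 0.8},   # curry rice, staple/side 80/20
    {"eaten": 50.0,  "staple": 0.0},   # side dish
]
staple, side = calc_staple_and_side_amounts(tray)
```

The split-then-sum design means a mixed dish such as curry rice contributes to both totals in proportion to its ratio.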
  • the staple food/side dish determination unit determines the ratio of staple food to side dish for the dish in each piece of tableware.
  • the staple food/side dish determination unit determines that a dish consisting only of staple food corresponds to a staple food, in which case the staple/side ratio is 100/0, and that a dish consisting only of side dish corresponds to a side dish, in which case the staple/side ratio is 0/100.
  • a dish containing both staple food and side dish is given an intermediate ratio, for example a staple/side ratio of 80/20.
  • the meal amount recognition unit is a learner that has machine-learned pairs of an image of a dish served on tableware and the amount of that dish as teacher data.
  • the imaging unit includes a plurality of cameras, and the image acquisition unit preferably acquires a plurality of first images and a plurality of second images taken of the same meal tray by the plurality of cameras.
  • since a plurality of first and second images taken by the cameras from different shooting directions are used for the detection and determination by the tableware detection unit, the dish recognition unit, the staple food/side dish determination unit, and the meal amount recognition unit, the detection and determination accuracy can be improved.
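One simple way to exploit the multiple shooting directions, sketched here as an assumption rather than the embodiment's actual fusion method, is to average the per-camera amount estimates for the same piece of tableware:

```python
def fuse_estimates(estimates):
    """Fuse per-camera meal-amount estimates for one piece of tableware.

    estimates: amounts recognized from images taken from different
    shooting directions. A plain average suppresses errors caused by
    occlusion or glare in any single view.
    """
    if not estimates:
        raise ValueError("no estimates to fuse")
    return sum(estimates) / len(estimates)

fused = fuse_estimates([180.0, 170.0])  # e.g. cameras 10A and 10B
```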
  • the meal amount measuring device includes a meal tray installation unit on which the meal tray is placed, and the imaging unit photographs the meal tray placed on the meal tray installation unit.
  • a meal tray detection unit is provided for detecting that the meal tray is placed on the meal tray installation unit.
  • the imaging unit shoots a moving image during periods other than the shooting of the first and second images, and the meal tray detection unit preferably detects, based on the moving image, that a meal tray has been placed on the meal tray installation unit. As a result, the placement of the meal tray on the meal tray installation unit can be detected reliably.
  • the meal tray installation unit is provided with a guide unit that guides the meal tray to the imaging position of the imaging unit.
  • as a result, the imaging unit can properly photograph the meal tray.
  • the meal tray installation unit is preferably formed with a notch that allows the meal tray to be placed on it while being held with one hand.
  • the notch formed in the meal tray installation unit allows the meal tray to be placed on the meal tray installation unit while held with one hand, without the hand interfering with the installation unit.
  • the meal amount measuring device preferably includes a notification unit that notifies the success or failure of imaging by the imaging unit.
  • the notification unit can report the success or failure of shooting by voice or by on-screen text.
  • a user ID detection unit detects the user ID of the user of the meal tray when the meal tray is photographed by the imaging unit, and a storage unit stores the first image and the second image taken when the user ID is detected, each in association with the detected user ID. The staple food/side dish meal amount calculation unit preferably calculates the amount eaten of at least one of the staple food or the side dish for each user ID.
  • when only the first image associated with a user ID exists among the first and second images stored in the storage unit, a complete meal determination unit determines that both the staple food and the side dishes served on the plurality of tableware detected from that first image have been completely eaten. When the complete meal determination unit makes this determination, the staple food/side dish meal amount calculation unit preferably uses that result when calculating the amount eaten of at least one of the staple food or the side dish. As a result, a completely finished meal tray does not need to be photographed, saving labor and avoiding unnecessary meal amount recognition processing.
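The complete-meal shortcut can be sketched as follows; the record format and function name are hypothetical:

```python
def meal_result(user_images, pre_amounts):
    """Decide the eaten amounts for one user ID.

    user_images: dict that may contain 'first' (pre-meal) and
                 'second' (post-meal) image records (hypothetical keys).
    pre_amounts: served amounts recognized from the first image.

    If only the first image exists, the complete-meal rule applies:
    everything served is treated as eaten, and no post-meal
    recognition needs to run.
    """
    if "first" in user_images and "second" not in user_images:
        return list(pre_amounts)  # complete meal: all served food eaten
    # Otherwise the normal pre/post difference path would run here.
    raise NotImplementedError("pre/post difference path not sketched here")

eaten = meal_result({"first": "img_pre.png"}, [150.0, 80.0, 50.0])
```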
  • a recording unit is preferably provided that records the amount eaten of at least one of the staple food or the side dish, calculated for each user ID by the staple food/side dish meal amount calculation unit, in association with that user ID. As a result, the amount of staple food or side dish eaten can be recorded and managed for each user of a meal tray.
  • the user ID detection unit preferably detects the user ID either by reading a barcode indicating the user ID attached to the meal tray or to the user's name plate placed on the meal tray, or by communicating with a wireless tag storing the user's ID embedded in the meal tray or the name plate.
  • the user ID detection unit that reads the barcode can detect the user ID indicated by the barcode by extracting the image area of the barcode included in the first image and the second image.
  • the barcode includes a one-dimensional barcode or a two-dimensional barcode (QR code (registered trademark)).
  • the user ID detection unit that communicates with the wireless tag can be embedded in, for example, the meal tray installation unit on which the meal tray is placed.
  • the meal amount measuring method includes a step of photographing the pre-meal meal tray with the imaging unit, a step of photographing the post-meal meal tray with the imaging unit, a step of acquiring the first image taken before the meal and the second image taken after the meal, and a step of detecting a plurality of pieces of tableware in the meal tray based on the first image and the second image.
  • it further includes a step in which the dish recognition unit recognizes, for each piece of tableware, the type of dish served on it, and a step in which the staple food/side dish determination unit determines, for each piece of tableware, whether the dish corresponds to a staple food or a side dish from the recognized type.
  • it further includes a step in which the tableware set determination unit determines the same tableware sets from the plurality of tableware detected from the first image and the plurality of tableware detected from the second image.
  • according to the present invention, it is possible to accurately measure, with a simple device, the amount eaten of at least one of a staple food or a side dish provided on a meal tray.
  • FIG. 1 is an external view showing an example of a main hardware configuration of the food amount measuring device according to the present invention.
  • FIG. 2 is a block diagram showing an embodiment of the food amount measuring device shown in FIG.
  • FIG. 3 is a functional block diagram showing various functions possessed by the CPU of the notebook PC shown in FIG. 1 or 2.
  • FIG. 4 is a diagram showing a meal tray, a plurality of tableware on the meal tray, a name plate, and the like.
  • FIG. 5 is a diagram showing an example of a name plate.
  • FIG. 6 is a functional block diagram showing the functions of the image recognition device shown in FIG.
  • FIG. 7 is a chart showing an example of the type of food and the ratio of staple food / side dish.
  • FIG. 8 is a chart showing a determination result of a staple food and a side dish for each of a plurality of tableware (a plurality of dishes A to D) in the meal tray shown in FIG. 4 and an example of the amount of meal.
  • FIG. 9 is a chart showing six-stage evaluation results for the meal amounts of the staple food and the side dish shown in FIG. 8.
  • FIG. 10 is a chart showing a determination result of a staple food and a side dish for each of a plurality of tableware (a plurality of dishes A to C) in a meal tray and another example of the amount of meal.
  • FIG. 11 is a chart showing six-stage evaluation results for the meal amounts of the staple food and the side dish shown in FIG. 10.
  • FIG. 12 is a chart showing an example of a list of personal information of each user managed by the recording device.
  • FIG. 13 is a chart showing an example of a meal amount list showing the meal amount for each user recorded and managed by the recording device.
  • FIG. 14 is a flowchart showing an embodiment of the meal amount measuring method according to the present invention, and in particular, is a flowchart showing a work procedure of a staff member at the time of serving.
  • FIG. 15 is a flowchart showing an embodiment of the meal amount measuring method according to the present invention, and in particular is a flowchart showing the photographing process of the meal tray in step S20 shown in FIG. 14.
  • FIG. 16 is a flowchart showing an embodiment of the meal amount measuring method according to the present invention, and in particular, is a flowchart showing a work procedure of a staff member at the time of serving.
  • FIG. 17 is a flowchart showing an embodiment of the meal amount measuring method according to the present invention, and in particular, is a flowchart showing the processing of the notebook PC and the image recognition device after the photography of the meal tray before and after the meal is completed.
  • the meal amount measuring device 1 shown in FIG. 1 is, for example, installed in a ward of a long-term care facility or a hospital and measures the amount of food eaten by a care recipient of the facility or an inpatient (user) of the hospital. It is composed of two cameras 10A and 10B that function as imaging units, a meal tray installation unit 30, a notebook computer (notebook PC: Personal Computer) 100, and the image recognition device 200 and recording device 300 shown in FIG. 2.
  • reference numeral 32 denotes a leg that supports the meal tray installation unit 30, and 34 denotes a support column to which the cameras 10A and 10B are fixed.
  • the notebook PC 100 is placed on the left side of the meal tray installation unit 30, and the meal tray 20 is placed on the right side of the meal tray installation unit 30.
  • the meal tray installation portion 30 is formed with a notch portion 30A that allows the meal tray 20 to be placed on the meal tray installation portion 30 while being held by one hand.
  • staff often carry a meal tray 20 in each hand when serving or clearing meals.
  • even in this case, the notch 30A prevents the hand from interfering with the installation unit, so the meal tray 20 can be placed on the meal tray installation unit 30 while still held with one hand.
  • the meal tray installation unit 30 is provided with a guide unit 30B that guides the meal tray 20 to the shooting position by the cameras 10A and 10B.
  • the guide portion 30B of this example is an L-shaped stopper that positions the meal tray 20 when the tray comes into contact with it, but the guide is not limited to this; for example, the placement position of the meal tray 20 may be guided by laying a mat at the position where the tray is to be placed or by drawing a positioning frame on the surface of the meal tray installation unit 30.
  • FIG. 2 is a block diagram showing an embodiment of the meal amount measuring device 1 shown in FIG. 1, mainly showing the notebook PC 100.
  • the notebook PC 100 shown in FIG. 2 is composed of an input/output interface 110, a CPU (Central Processing Unit) 120, an operation unit 130, a communication unit 140, a RAM (Random Access Memory) 150, a storage unit 160, a display control unit 170, a display unit 172, a driver 180, and a speaker 182.
  • the two cameras 10A and 10B are connected to the input / output interface 110 that functions as an image acquisition unit.
  • the cameras 10A and 10B capture a still image or a moving image in response to a shooting instruction input from the CPU 120 via the input/output interface 110.
  • the cameras 10A and 10B photograph the meal trays 20 before and after the meal placed on the meal tray installation unit 30 from different directions.
  • the two pre-meal images (first image) and the two post-meal images (second image) taken by the cameras 10A and 10B are temporarily stored in the RAM 150 or the storage unit 160, respectively.
  • the CPU 120 controls each unit in an integrated manner by reading various programs, including the meal amount measurement program, stored in the storage unit 160, and implements various functions by executing the meal amount measurement program.
  • FIG. 3 is a functional block diagram showing various functions of the CPU 120.
  • the CPU 120 functions as a staple food side dish meal amount calculation unit 120A, a meal tray detection unit 120B, a user ID (Identification) detection unit 120C, and a complete meal determination unit 120D.
  • the details of the staple food side dish amount calculation unit 120A, the meal tray detection unit 120B, the user ID detection unit 120C, and the complete meal determination unit 120D will be described later.
  • the operation unit 130 includes a power switch, a keyboard, a mouse, and the like.
  • the operation unit 130 functions as an operation unit that receives ordinary operation input for the notebook PC 100, and also as an operation unit for operating the various icons for meal amount measurement displayed on the screen of the display unit 172.
  • the operation unit 130 includes the touch panel in the case of the display unit 172 provided with the touch panel.
  • the communication unit 140 connects directly or indirectly to external devices by wireless LAN (Local Area Network), Bluetooth (registered trademark), or the like; in this example, it can be wirelessly connected to the image recognition device 200 and the recording device 300, and can also connect to the Internet.
  • the RAM 150 is used as a work area of the CPU 120 and as memory for temporarily storing various data, including programs read from the storage unit 160, the first image (the image of the meal tray 20 before the meal), and the second image (the image of the meal tray 20 after the meal).
  • the storage unit 160 is composed of a hard disk device, flash memory, or the like, and is a non-volatile memory that stores the operating system, the meal amount measurement program and other programs, and the above-mentioned first image, second image, and various data.
  • the display control unit 170 is a part that creates display data to be displayed on the display unit 172 according to an instruction from the CPU 120 and outputs the display data to the display unit 172.
  • an operation screen with icons (for example, a "meal measurement" icon, "breakfast", "lunch", and "dinner" icons, "before meal" and "after meal" icons, and "start" and "end" buttons) is displayed on the display unit 172, and the display unit 172 also notifies the success or failure of shooting by the cameras 10A and 10B.
  • the display unit 172 is a color liquid crystal display provided in the notebook PC 100; when the meal amount measurement program is started, it displays an operation screen provided with various icons for measuring the meal amount.
  • the driver 180 is a sound driver that creates sound information generated from the speaker 182 according to an instruction from the CPU 120 and outputs the sound information to the speaker 182.
  • the driver 180 causes the speaker 182 to generate a sound indicating the success or failure of shooting by the cameras 10A and 10B. For example, when the cameras 10A and 10B successfully photograph the meal tray 20, a shutter sound or a voice announcement of "shooting is completed" can be generated.
  • the display unit 172 and the speaker 182 function as a notification unit for notifying the success or failure of shooting by the cameras 10A and 10B.
  • FIG. 4 is a diagram showing a meal tray 20, tableware 20A to 20D on the meal tray 20, a name plate 22, and the like.
  • the term "meal tray 20" covers both the state in which tableware and the like are placed on the tray and a meal is served, and the case where only the tray itself is meant.
  • the meal tray 20 shown in FIG. 4 is in the pre-meal state, placed on the meal tray installation unit 30 for photography; the pieces of tableware 20A to 20D contain dishes A to D, and the name plate 22 has been laid flat so that its back side can be photographed.
  • the dish A shown in FIG. 4 is, for example, a staple food (rice).
  • the dishes B to D are side dishes; dishes B and C are the main and secondary dishes among the side dishes, and dish D is a soup.
  • FIG. 5 is a diagram showing an example of the name plate 22.
  • the name of the user, that is, the care recipient of the long-term care facility or the inpatient of the hospital, is shown on the front side of the name plate 22, identifying the user of the meal tray 20.
  • a barcode (a QR code in this example) 22A indicating the user ID is attached to the name plate 22; when the name plate 22 is laid flat, the QR code 22A can be photographed.
  • the shutter sound means that the meal tray 20 has been successfully photographed by the cameras 10A and 10B.
  • the cameras 10A and 10B usually shoot a moving image when the measurement of the amount of food is started, and the moving image taken by the cameras 10A and 10B is temporarily stored in the RAM 150.
  • the CPU 120, functioning as the meal tray detection unit 120B, determines the presence or absence of the meal tray 20 in each frame of the moving image held in the RAM 150 and, when the meal tray 20 is detected, judges whether it is placed at an appropriate position on the meal tray installation unit 30. Further, the CPU 120, functioning as the user ID detection unit 120C, reads the QR code 22A attached to the name plate 22 from each frame of the moving image.
  • when it is detected that the meal tray 20 is arranged at an appropriate position on the meal tray installation unit 30 and the QR code 22A attached to the name plate 22 is successfully read, the CPU 120 instructs the cameras 10A and 10B to take still images of the meal tray 20. When the still image shooting is completed, the CPU 120 generates a shutter sound from the speaker 182 through the driver 180 to signal that shooting succeeded.
  • when the meal tray 20 is not arranged at an appropriate position on the meal tray installation unit 30 within a certain period after being detected by the meal tray detection unit 120B, or when the QR code 22A attached to the name plate 22 cannot be read, the CPU 120 can use the speaker 182 to announce by voice that correct shooting is not possible (shooting failed).
  • the CPU 120 may notify the success or failure of shooting by a display on the display unit 172.
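The per-frame control flow described above (detect the tray, check its position, read the QR code, then trigger still capture or warn) can be sketched as a small decision function. The return values are hypothetical, and the timeout before warning is omitted for brevity:

```python
def should_capture(tray_detected, tray_in_position, qr_read):
    """Return the action to take for the current video frame."""
    if tray_detected and tray_in_position and qr_read:
        return "capture_still"   # take still images, play shutter sound
    if tray_detected and not tray_in_position:
        return "warn_position"   # tray present but misplaced
    if tray_detected and not qr_read:
        return "warn_qr"         # name plate / QR code not readable
    return "keep_watching"       # no tray on the installation unit yet

action = should_capture(True, True, True)
```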
  • in this example, the meal tray detection unit 120B detects the meal tray 20 by image processing of the moving images taken by the cameras 10A and 10B, but the present invention is not limited to this; a detector for detecting the meal tray 20 may be provided, and the CPU 120 may acquire from that detector information on whether the meal tray 20 is placed on the meal tray installation unit 30.
  • in this example, the user ID detection unit 120C acquires the user ID by reading the QR code 22A attached to the name plate 22, but the invention is not limited to this; for example, a wireless tag storing the user ID may be embedded in the name plate or in a user-dedicated meal tray, and the user ID may be read from the wireless tag by a tag reader installed in the meal tray installation unit 30.
  • the still images (first images) of the meal tray 20 taken by the cameras 10A and 10B at serving time (before the meal), when the user ID is detected by the user ID detection unit 120C, are preferably stored in the non-volatile storage unit 160 (or RAM 150) in association with the detected user ID. Further, when the still image shooting of one meal tray 20 by the cameras 10A and 10B is completed, the moving image temporarily held in the RAM 150 is deleted, and moving image shooting by the cameras 10A and 10B is started again.
  • similarly, the still images (second images) of the meal tray 20 taken by the cameras 10A and 10B at clearing time (after the meal) are preferably stored in the non-volatile storage unit 160 (or RAM 150) in association with the user ID, as at serving time (before the meal).
  • the first image of the meal tray 20 before the meal and the second image of the meal tray 20 after the meal as described above are temporarily stored in the storage unit 160 (or RAM 150) in association with the user ID. Will be done.
  • the CPU 120 transmits the first image and the second image associated with the same user ID and stored to the image recognition device 200 via the communication unit 140.
  • FIG. 6 is a functional block diagram showing the functions of the image recognition device 200.
  • the image recognition device 200 includes a communication unit 210, a tableware detection unit 220, a food recognition unit 230, a staple food side dish determination unit 240, a tableware set determination unit 250, and a meal amount recognition unit 260.
  • the image recognition device 200 is realized by a personal computer and software.
  • the communication unit 210 of the image recognition device 200 receives the first image (showing the pre-meal meal tray 20) and the second image (showing the post-meal meal tray 20) transmitted from the notebook PC 100 in association with the same user ID.
  • the tableware detection unit 220 detects a plurality of pieces of tableware in the meal tray 20 based on the first image and the second image. For example, for the first image obtained by photographing the meal tray 20 shown in FIG. 4, the tableware detection unit 220 detects the four pieces of tableware 20A to 20D.
  • the tableware detection unit 220 may detect the positions (areas) of the plurality of tableware and the type of each piece by image processing the first and second images based on the shape, color, size, and so on of each piece of tableware in the image, or it may be configured as a machine-learned device that extracts (detects) each piece of tableware in the meal tray. It is preferable that the plurality of tableware in the meal tray 20 does not include two identical pieces.
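A minimal sketch of the image-processing route: candidate regions from any segmentation step are filtered by area and aspect ratio so that only plausible tableware remains. The bounding-box representation and the thresholds are illustrative assumptions that would depend on camera distance and resolution:

```python
def filter_tableware_regions(regions, min_area=2000, max_area=200000,
                             max_aspect=3.0):
    """Keep candidate (x, y, w, h) boxes that plausibly bound tableware."""
    kept = []
    for (x, y, w, h) in regions:
        area = w * h
        aspect = max(w, h) / max(min(w, h), 1)  # elongation of the box
        if min_area <= area <= max_area and aspect <= max_aspect:
            kept.append((x, y, w, h))
    return kept

boxes = filter_tableware_regions([
    (10, 10, 100, 100),   # plausible bowl
    (0, 0, 5, 5),         # noise: far too small
    (0, 0, 400, 20),      # glare streak: too elongated
])
```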
  • the dish recognition unit 230 recognizes, for each piece of tableware detected from the first image, the type of dish served on it, and can be configured, for example, by a learner such as a convolutional neural network (CNN) machine-trained to recognize (classify) the types of dishes.
  • the dish recognition unit 230 cuts out the image of each piece of tableware detected by the tableware detection unit 220 from the first image and, using the cut-out tableware image (or an image of only the dish further cut out from it) as the input image of the learner, outputs a recognition result indicating the type of dish served on that tableware.
  • by referring to the "breakfast", "lunch", or "dinner" menu data for the measurement day when recognizing the type of dish served on each piece of tableware, the dish recognition unit 230 can recognize the dishes with higher accuracy.
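Menu-constrained recognition can be sketched as restricting the classifier's best guess to today's menu, which removes visually similar but impossible dishes from consideration. The dish names and confidence scores below are invented for illustration:

```python
def recognize_with_menu(scores, todays_menu):
    """Pick the most likely dish, restricted to today's menu.

    scores: dict mapping dish name -> classifier confidence.
    todays_menu: set of dish names served at this meal (from menu data).
    """
    candidates = {d: s for d, s in scores.items() if d in todays_menu}
    if not candidates:            # fall back to the unrestricted result
        candidates = scores
    return max(candidates, key=candidates.get)

dish = recognize_with_menu(
    {"fried rice": 0.46, "curry rice": 0.41, "rice": 0.13},
    {"curry rice", "miso soup", "salad"},
)
```

Without the menu constraint the classifier would answer "fried rice"; with it, the impossible dish is excluded.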
  • the staple food side dish determination unit 240 determines for each tableware whether the food in each tableware corresponds to the staple food or the side dish from the type of the dish for each tableware recognized by the dish recognition unit 230.
  • FIG. 7 is a chart showing an example of the type of food and the ratio of staple food / side dish.
  • the staple food/side dish determination unit 240 holds, for example, the table shown in FIG. 7, and can read the staple food ratio and the side dish ratio from the table based on the type of dish in the tableware.
  • if the dish is a pure staple food such as rice, the staple food ratio is read as 100%.
  • if the dish is "curry rice", the staple food ratio is read as 80% and the side dish ratio as 20%.
  • for dishes that are pure side dishes, the side dish ratio is read as 100%.
  • the "amount” in the rightmost column of FIG. 7 indicates the amount of a standard dish (for example, weight (g)).
  • the "amount" of a side dish may be weighted according to whether it is a main or a secondary side dish, and in the case of "miso soup", since its water content is large, the "amount" is preferably set to a small value.
  • these "amounts" may be determined empirically by a staff member or the like who has visually observed the meal amounts of the staple food and the side dishes.
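The lookup can be sketched as a small table in the spirit of FIG. 7. The dish names, the gram amounts, and all ratios other than the 80/20 curry-rice example are illustrative assumptions, not the actual contents of the chart:

```python
# Hypothetical staple/side ratio table: dish -> (staple %, side %, amount in g)
RATIO_TABLE = {
    "rice":         (100, 0, 200),
    "curry rice":   (80, 20, 350),
    "grilled fish": (0, 100, 80),
    "miso soup":    (0, 100, 30),   # mostly water, so a small "amount"
}

def lookup_ratio(dish):
    """Read the staple ratio, side ratio, and standard amount for a dish."""
    staple_pct, side_pct, amount = RATIO_TABLE[dish]
    return staple_pct, side_pct, amount

staple_pct, side_pct, amount = lookup_ratio("curry rice")
```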
  • the tableware set determination unit 250 determines the same tableware sets from the plurality of tableware detected from the first image and the plurality of tableware detected from the second image by the tableware detection unit 220. It suffices to determine which pieces form the same set; it is not necessary to determine what kind of tableware each piece is.
  • For example, pieces of tableware of the same size can be paired as the same tableware set.
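One possible sketch of the tableware set determination is to pair detections between the pre-meal and post-meal images by position and size. The bounding-box format and the center-distance/size-difference thresholds below are assumptions; the description only requires that identical tableware be matched:

```python
# Match tableware detected in the first (pre-meal) and second (post-meal) images.
# Each detection is a bounding box (x, y, w, h); boxes are greedily paired when
# their centers are close and their sizes are similar.
def match_tableware(boxes_pre, boxes_post, max_dist=50.0):
    def center(box):
        x, y, w, h = box
        return (x + w / 2.0, y + h / 2.0)

    pairs = []
    used = set()
    for i, bp in enumerate(boxes_pre):
        cx, cy = center(bp)
        best, best_d = None, max_dist
        for j, bq in enumerate(boxes_post):
            if j in used:
                continue
            qx, qy = center(bq)
            d = ((cx - qx) ** 2 + (cy - qy) ** 2) ** 0.5
            # require similar size (width/height) as well as a nearby position
            if d < best_d and abs(bp[2] - bq[2]) < 20 and abs(bp[3] - bq[3]) < 20:
                best, best_d = j, d
        if best is not None:
            used.add(best)
            pairs.append((i, best))
    return pairs
```

Matching by geometry alone keeps the determination independent of tableware type, as the description requires.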
  • The meal amount recognition unit 260 recognizes, for each piece of tableware, the amount of the dish served on it and the amount of the dish remaining, based on the images of the same tableware set determined by the tableware set determination unit 250.
  • Specifically, the meal amount recognition unit 260 recognizes the amount of food eaten by the user for each piece of tableware by calculating the difference between the amount of food served on each piece of tableware (meal amount) in the first image and the amount of food remaining on the same tableware (meal amount) in the second image.
  • The meal amount recognition unit 260 includes a learner, such as a CNN (convolutional neural network), machine-trained to recognize the amount of a dish from its image, using pairs of an image of a dish served on tableware and the actual amount of that dish as teacher data.
  • The teacher data consists of pairs of images of various dishes served on tableware (including images of varying amounts of each dish) and the corresponding correct amounts (the actual amount of the dish in each image).
  • The image of each piece of tableware detected by the tableware detection unit 220 is cut out from the first image, and the cut-out tableware image (or an image in which only the dish is further cut out from the tableware image) is used as the input image of the learner. The learner then outputs a recognition result indicating the amount of food served on the tableware before the meal.
  • Similarly, the image of each piece of tableware detected by the tableware detection unit 220 is cut out from the second image, and the cut-out tableware image (or an image obtained by further cutting out only the food from the tableware image) is used as the input image of the learner, which outputs a recognition result indicating the amount of food remaining on the same tableware after the meal.
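Once the learner has produced per-tableware amounts for the matched pre-meal and post-meal images, the eaten amount is their difference. A minimal sketch (the learner itself is elided; amounts are represented as plain numbers):

```python
def eaten_amounts(pre_amounts, post_amounts):
    """For each matched piece of tableware, the amount eaten is the pre-meal
    amount recognized from the first image minus the amount remaining
    recognized from the second image (clamped at zero, since recognition
    noise could otherwise yield a negative difference)."""
    return [max(pre - post, 0) for pre, post in zip(pre_amounts, post_amounts)]
```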
  • Since two cameras are used, tableware detection, dish recognition, tableware set determination, and meal amount recognition can each be performed using two first images and two second images, allowing more accurate determination and recognition. Further, when the dish recognition unit 230 and the meal amount recognition unit 260 are configured as learners, the two images of the same tableware may be input simultaneously to recognize the dish and the meal amount.
  • The image recognition device 200 transmits, for each user ID, the staple/side determination result for each piece of tableware and the meal amount for each piece of tableware, recognized per tableware (dish) based on the first and second images as described above, to the notebook PC 100 via the communication unit 210.
  • The staple food side dish meal amount calculation unit 120A of the notebook PC 100 calculates the meal amount of the staple food and the meal amount of the side dish based on the per-tableware staple/side determination results and per-tableware meal amounts received from the image recognition device 200.
  • FIG. 8 is a chart showing the staple/side determination result and an example of the meal amount for each of the tableware 20A to 20D (dishes A to D) in the meal tray 20 shown in FIG. 4.
  • In FIG. 8, "amount" indicates the amount of each dish before the meal, and "decrease amount" indicates the ratio (%) by which the amount of each dish decreased after the meal. This information is shown for reference only; the staple food side dish meal amount calculation unit 120A does not use it.
  • In the example of FIG. 8, the staple food side dish meal amount calculation unit 120A takes the meal amount (90) of dish A, determined to be the staple food, as the meal amount of the staple food, and calculates the total (150) of the meal amounts (80, 40, 30) of dishes B to D, determined to be side dishes, as the meal amount of the side dish.
  • the CPU 120 can transmit the meal amount of the staple food and the meal amount of the side dish calculated for each user ID to the recording device (recording unit) 300 via the communication unit 140 together with the user ID.
  • The staple food side dish meal amount calculation unit 120A may evaluate the staple food meal amount and the side dish meal amount calculated for each user ID on a scale of 0 to 5 (the larger the value, the larger the amount eaten), and output each evaluation value as the meal amount.
  • FIG. 9 is a chart showing the six-level evaluation results for the staple food and side dish meal amounts shown in FIG. 8.
  • In the example of FIG. 9, the meal amount of the staple food is 3 on the six-level scale, and the meal amount of the side dish is 5 on the six-level scale.
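The description does not specify how meal amounts map to the 0-5 scale; a plausible sketch is a linear mapping of the eaten amount relative to the served amount, shown here purely as an assumption:

```python
def six_level(eaten, served):
    """Map an eaten amount to the 0-5 scale: 0 = nothing eaten, 5 = everything.
    The exact mapping is not given in the description; linear rounding against
    the pre-meal served amount is an assumption for illustration."""
    if served <= 0:
        return 0
    return min(5, round(5 * eaten / served))
```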
  • FIG. 10 is a chart showing another example of staple/side determination results and meal amounts for each piece of tableware (dishes A to C) in a meal tray.
  • Dish A shown in FIG. 10 is, for example, curry rice, and its staple/side determination result is 80% staple food and 20% side dish.
  • In this case, the staple food side dish meal amount calculation unit 120A takes the staple-food portion of the meal amount (115) contained in dish A directly as the meal amount of the staple food, and calculates the total (58) of the side-dish portion of the meal amount (29) contained in dish A and the meal amounts (20, 9) of dishes B and C, determined to be side dishes, as the meal amount of the side dish.
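The split of a mixed dish such as curry rice can be sketched as follows. The eaten amount of 144 for dish A is inferred from the recorded staple (115) and side (29) portions and is used here only for illustration:

```python
def staple_side_totals(dishes):
    """dishes: list of (eaten_amount, staple_ratio_percent, side_ratio_percent).
    Returns (staple_total, side_total), splitting mixed dishes by their ratios."""
    staple = sum(amount * s_ratio / 100.0 for amount, s_ratio, _ in dishes)
    side = sum(amount * d_ratio / 100.0 for amount, _, d_ratio in dishes)
    return staple, side

# FIG. 10 example: dish A (curry rice) split 80/20, dishes B and C pure side dishes.
```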
  • FIG. 11 is a chart showing the six-level evaluation results for the staple food and side dish meal amounts shown in FIG. 10.
  • In the example of FIG. 11, the meal amount of the staple food is 4 on the six-level scale, and the meal amount of the side dish is 3 on the six-level scale.
  • In this embodiment, the meal amounts of the staple food and the side dish are evaluated on a six-level scale, but the present invention is not limited to this.
  • When, of the first and second images stored in the storage unit 160, only the first image associated with a user ID exists, the complete meal determination unit 120D shown in FIG. 3 determines that the dishes served on the plurality of tableware detected from the first image have been completely eaten, both staple food and side dish.
  • When the complete meal determination unit 120D determines a complete meal, the staple food side dish meal amount calculation unit 120A uses the determination result of the complete meal determination unit 120D when calculating the staple food meal amount and the side dish meal amount.
  • In this case, the image recognition device 200 recognizes the meal amount for each piece of tableware before the meal based on the first image, and transmits the recognition result to the notebook PC 100.
  • This is because the amount of food served on each piece of tableware before the meal may have been reduced (for example, when the rice is halved), and in that case the meal amount differs even when the meal is completely eaten.
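The complete-meal shortcut can be sketched as below. Storage is modeled as a plain dict keyed by user ID, and `recognize_amounts` stands in for the learner of the image recognition device 200; both are assumptions for illustration:

```python
def meal_amounts_for_user(storage, user_id, recognize_amounts):
    """storage maps user_id -> {"first": pre-meal image, "second": post-meal image}.
    If only the first image exists, the meal is treated as completely eaten,
    so the full pre-meal amounts recognized from the first image are recorded
    without any post-meal photographing or recognition."""
    entry = storage[user_id]
    pre = recognize_amounts(entry["first"])      # per-tableware pre-meal amounts
    if "second" not in entry:                    # complete meal: nothing left over
        return pre
    post = recognize_amounts(entry["second"])    # per-tableware leftovers
    return [max(a - b, 0) for a, b in zip(pre, post)]
```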
  • The CPU 120 transmits, together with the user ID, the six-level evaluation value indicating the staple food meal amount and the six-level evaluation value indicating the side dish meal amount calculated for each user ID by the staple food side dish meal amount calculation unit 120A to the recording device 300 via the communication unit 140. In this case, information indicating which of "breakfast", "lunch", or "dinner" was selected on the operation screen displayed on the display unit 172 of the notebook PC 100 is also transmitted to the recording device 300.
  • The recording device 300 is realized by a computer, and is a part that creates and manages a personal information list showing the personal information of each user and a meal amount list showing the meal amount for each user.
  • FIG. 12 is a chart showing an example of a list of personal information of each user managed by the recording device 300.
  • FIG. 13 is a chart showing an example of a meal amount list showing the meal amount for each user recorded and managed by the recording device 300.
  • In the meal amount entry field for each user specified by the user ID, the evaluation value indicating the staple food meal amount and the evaluation value indicating the side dish meal amount are recorded for each daily "breakfast", "lunch", and "dinner".
  • When the recording device 300 receives, together with a user ID, the evaluation values indicating the respective meal amounts of the staple food and the side dish from the notebook PC 100, it records those evaluation values in the meal amount list shown in FIG. 13.
  • In this way, an evaluation value indicating the amount of the staple food and the amount of the side dish eaten by each user is automatically recorded for each user specified by the user ID.
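The meal amount list of FIG. 13 can be sketched as a nested mapping keyed by user ID, date, and meal; the field names are assumptions:

```python
from collections import defaultdict

def make_meal_log():
    # user_id -> date -> meal ("breakfast" / "lunch" / "dinner") -> evaluation values
    return defaultdict(dict)

def record_meal(log, user_id, date, meal, staple_eval, side_eval):
    """Record the six-level evaluation values received with a user ID."""
    log[user_id].setdefault(date, {})[meal] = {
        "staple": staple_eval,  # 0-5 evaluation of staple food eaten
        "side": side_eval,      # 0-5 evaluation of side dish eaten
    }
```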
  • [Meal amount measuring method] FIGS. 14 to 17 are flowcharts showing an embodiment of the meal amount measuring method according to the present invention. Hereinafter, a case where the meal amount is measured using the meal amount measuring device 1 shown in FIGS. 1 and 2 will be described.
  • FIG. 14 is a flowchart showing the work procedure of the staff at the time of serving.
  • the staff of the nursing care facility or the like turns on the power of the notebook PC 100 before serving the meal tray (step S10).
  • the staff double-clicks the "meal amount determination" icon on the desktop of the notebook PC 100 (step S12).
  • the meal amount measurement program is started, and the operation screen for meal amount measurement is displayed on the display unit 172 of the notebook PC 100.
  • the staff selects one of the "breakfast” icon, the “lunch” icon, and the “dinner” icon on this operation screen (step S14), and then presses the "start” button (step S16).
  • The staff holds the meal tray 20 and places it on the mat-laid portion of the meal tray installation section 30 (step S18).
  • When the meal tray 20 is placed on the meal tray installation section 30, the meal tray 20 is photographed by the cameras 10A and 10B (step S20).
  • FIG. 15 is a flowchart showing the photographing process of the meal tray in step S20 shown in FIG. 14.
  • The cameras 10A and 10B start shooting a moving image, and the meal tray detection unit 120B detects from each frame of the moving image whether the meal tray 20 is placed at an appropriate position on the meal tray installation section 30 (step S21). Further, the user ID detection unit 120C detects the user ID by reading the QR code 22A attached to the name plate 22 placed on the meal tray 20 (step S22).
  • When the meal tray 20 is detected at the appropriate position and the user ID is detected, the CPU 120 instructs the cameras 10A and 10B to switch from moving image shooting to still image shooting, and a still image of the meal tray 20 is taken (step S23).
  • When the shooting of the still image by the cameras 10A and 10B is completed, the CPU 120 generates a "click" sound from the speaker 182 to notify the success of the shooting (step S24).
  • The process then returns to the flowchart shown in FIG. 14.
  • The captured still image (first image) is temporarily stored in the storage unit 160 in association with the user ID. Further, when the still image shooting is completed, the cameras 10A and 10B resume moving image shooting; that is, the cameras 10A and 10B shoot a moving image during periods other than still image (first image, second image) shooting.
  • Subsequently, the staff determines whether or not all the serving has been completed (step S22); if not ("No"), the process returns to step S18, and steps S18 to S22 are repeated.
  • After all the serving is completed, the notebook PC 100 is left in the activated state.
  • FIG. 16 is a flowchart showing the work procedure of the staff at the time of clearing the trays after meals.
  • the staff selects the "after meal” icon on the operation screen when the meal tray is set down (step S30), and then presses the "start” button (step S32).
  • The staff places the collected meal tray 20 on the mat-laid portion of the meal tray installation section 30 (step S34).
  • When the collected meal tray 20 is placed on the meal tray installation section 30, the meal tray 20 is photographed by the cameras 10A and 10B (step S20). Since this process is the same as step S20 shown in FIG. 14, detailed description thereof is omitted.
  • Subsequently, the staff determines whether or not all the trays have been cleared (step S36); if not ("No"), the process returns to step S34, and steps S34 to S36 are repeated.
  • FIG. 17 is a flowchart showing the processing of the notebook PC 100 and the image recognition device 200 after the photographing of the meal trays before and after meals is completed.
  • When the photographing of the pre-meal and post-meal meal trays is completed, the CPU 120 of the notebook PC 100 transmits the first image of the pre-meal meal tray 20 and the second image of the post-meal meal tray 20, which are associated with the same user ID and stored in the storage unit 160, to the image recognition device 200 via the communication unit 140, and the image recognition device 200 receives (acquires) the first image and the second image associated with the same user ID transmitted from the notebook PC 100 (step S50).
  • the tableware detection unit 220 of the image recognition device 200 detects a plurality of tableware in the meal tray 20 based on the acquired first image and the second image (step S52).
  • the food recognition unit 230 recognizes the type of food served on each tableware for each tableware based on the acquired first image (step S54).
  • the staple food side dish determination unit 240 determines for each tableware (cooking) whether the food in each tableware corresponds to the staple food or the side dish from the recognized types of dishes for each tableware (step S56).
  • the tableware set determination unit 250 determines the same tableware set from the plurality of tableware detected from the first image and the plurality of tableware detected from the second image in step S52 (step S58).
  • The meal amount recognition unit 260 recognizes the meal amount on each piece of tableware based on the images of the same tableware set, and recognizes the amount of food eaten by the user for each piece of tableware by subtracting the amount of food remaining on each piece of tableware in the second image from the amount of food on the same tableware in the first image (step S60).
  • The image recognition device 200 transmits, for each user ID, the staple/side determination result for each piece of tableware and the meal amount for each piece of tableware, recognized per tableware (dish) based on the first and second images as described above, to the notebook PC 100 via the communication unit 210.
  • The staple food side dish meal amount calculation unit 120A of the notebook PC 100 calculates the staple food meal amount and the side dish meal amount based on the per-tableware staple/side determination results and per-tableware meal amounts received from the image recognition device 200 (step S62).
  • The CPU 120 transmits the six-level evaluation values indicating the staple food and side dish meal amounts calculated for each user ID to the recording device 300 together with the user ID, and the recording device 300 records the evaluation values indicating the respective meal amounts of the staple food and the side dish in the meal amount list as shown in FIG. 13 (step S64).
  • The meal amount measuring device of the present embodiment calculates the meal amounts of both the staple food and the side dish, but the invention is not limited to this; the meal amount of at least one of the staple food or the side dish may be calculated.
  • In the present embodiment, the notebook PC 100, the image recognition device 200, and the recording device 300 are directly or indirectly connected to each other by wireless LAN, Bluetooth, or the like, but the image recognition device 200 and the like may instead communicate with the notebook PC 100 via the Internet.
  • the meal amount measuring device may be a stand-alone device consisting only of a personal computer including a notebook PC.
  • The imaging unit is not limited to two cameras; it may include three or more cameras, or may be composed of a single camera.
  • the hardware that realizes the meal amount measuring device can be configured by various processors.
  • The various processors include a CPU, which is a general-purpose processor that executes programs and functions as various processing units; a programmable logic device (PLD) such as an FPGA (Field Programmable Gate Array), which is a processor whose circuit configuration can be changed after manufacture; and a dedicated electric circuit such as an ASIC (Application Specific Integrated Circuit), which is a processor having a circuit configuration specially designed for executing a specific process.
  • One processing unit constituting the meal amount measuring device may be composed of one of the above-mentioned various processors, or may be composed of two or more processors of the same type or different types.
  • one processing unit may be composed of a plurality of FPGAs or a combination of a CPU and an FPGA.
  • a plurality of processing units may be configured by one processor.
  • As a first example, as represented by a computer such as a client or a server, one processor may be configured by a combination of one or more CPUs and software, and this processor may function as a plurality of processing units. As a second example, as represented by a System on Chip (SoC), a processor that realizes the functions of an entire system including a plurality of processing units on a single chip may be used.
  • In this way, the various processing units are configured using one or more of the above-mentioned various processors as a hardware structure.
  • the hardware structure of these various processors is, more specifically, an electric circuit (circuitry) in which circuit elements such as semiconductor elements are combined.


Abstract

Provided are a meal amount measuring device and method with which the meal amount of at least one of a staple food or a side dish provided on a meal tray can be accurately measured with a simple device. A meal tray (20) is photographed before and after a meal by cameras (10A, 10B), and a pre-meal first image and a post-meal second image are acquired. On the basis of the acquired first and second images, multiple pieces of tableware are detected; for each piece of tableware, the type of dish served on it is recognized, and from the recognized type a determination is made as to whether the food in that piece of tableware corresponds to the staple food or the side dish. Additionally, sets of the same tableware are determined from the multiple pieces of tableware detected from the first and second images, and the meal amount of the dish served on each piece of tableware is recognized on the basis of the images of the determined same-tableware sets. The meal amount of at least one of the staple food and the side dish is calculated on the basis of the per-tableware staple/side determination results and the per-tableware recognized meal amounts.

Description

Meal amount measuring device and method
 The present invention relates to a meal amount measuring device and method, and particularly relates to a technique for measuring the amount of food eaten by a care receiver in a nursing care facility or an inpatient in a hospital.
 Recently, in providing meals to care receivers and inpatients at nursing care facilities, hospitals, and the like, it has become important to record the respective meal amounts of the staple food and the side dish.
 In the method for measuring in-hospital meal intake described in Patent Document 1, a meal tray for one person on which food is placed is photographed in its pre-meal state and post-meal state, each piece of tableware on the meal tray is extracted from the pre-meal image, and the type of each piece of tableware and its position on the tray are determined by inputting them to a neural network with reference to a tableware database entered in advance.
 Next, only the food portion of each tableware image is extracted, the change in the food portion before and after the meal is obtained as an area, and then the calorie intake and various nutrients such as vitamins are calculated with reference to a food database created in advance.
Japanese Unexamined Patent Publication No. 2005-70908
 The method for measuring in-hospital meal intake described in Patent Document 1 can calculate the calorie intake and various nutrients such as vitamins actually ingested by each patient from the meal provided on the meal tray, but it cannot recognize the staple food and the side dish within the provided meal, and therefore cannot measure how much of the staple food or the side dish was eaten.
 Therefore, in most nursing care facilities and the like, the actual practice has been for staff to visually assess the amounts of staple food and side dishes eaten and to record those amounts.
 The present invention has been made in view of such circumstances, and an object thereof is to provide a meal amount measuring device and method capable of accurately measuring, with a simple device, the meal amount of at least one of the staple food and the side dish provided on a meal tray.
 In order to achieve the above object, a meal amount measuring device according to one aspect of the present invention comprises: an imaging unit that photographs a meal tray before and after a meal; an image acquisition unit that acquires a pre-meal first image and a post-meal second image photographed by the imaging unit; a tableware detection unit that detects a plurality of pieces of tableware in the meal tray based on the first image and the second image; a dish recognition unit that recognizes, for each piece of tableware detected based on the first image, the type of dish served on it; a staple food side dish determination unit that determines, for each piece of tableware, whether the dish in the tableware corresponds to a staple food or a side dish from the recognized type of dish; a tableware set determination unit that determines the same tableware set from the plurality of tableware detected from the first image and the plurality of tableware detected from the second image by the tableware detection unit; a meal amount recognition unit that recognizes, for each piece of tableware, the meal amount of the dish served on it based on the images of the same tableware set determined by the tableware set determination unit; and a staple food side dish meal amount calculation unit that calculates the meal amount of at least one of the staple food and the side dish based on the per-tableware determination results of the staple food side dish determination unit and the per-tableware meal amounts recognized by the meal amount recognition unit.
 According to one aspect of the present invention, a first image of the pre-meal meal tray and a second image of the post-meal meal tray are acquired, and a plurality of pieces of tableware in the meal tray are detected based on the acquired first and second images. In addition, the type of dish served on each piece of tableware is recognized for each piece of tableware detected based on the first image, and whether the dish in the tableware corresponds to a staple food or a side dish is determined for each piece of tableware from the recognized type. Then, the same tableware set is determined from the plurality of tableware detected from the first image and the plurality detected from the second image, and the meal amount of the dish served on each piece of tableware is recognized based on the images of the determined same tableware set. The meal amount of at least one of the staple food and the side dish is calculated based on the per-tableware staple/side determination results and the per-tableware recognized meal amounts.
 In the meal amount measuring device according to another aspect of the present invention, it is preferable that the staple food side dish determination unit determines the ratio of the staple food and the ratio of the side dish of a dish based on the dish in the tableware. If the dish in the tableware is only a staple food, the unit determines that the dish corresponds to a staple food; in this case, the staple/side ratio is 100/0. If the dish in the tableware is only a side dish, the unit determines that the dish corresponds to a side dish; in this case, the ratio is 0/100. On the other hand, for a dish in which a staple food and a side dish are mixed, the unit determines the ratio of the staple food to the side dish in the dish (for example, 80/20).
 In the meal amount measuring device according to still another aspect of the present invention, it is preferable that the meal amount recognition unit includes a learner machine-trained with pairs of an image of a dish served on tableware and the amount of that dish as teacher data, and recognizes the meal amount of the dish on each piece of tableware based on the difference, for the same tableware, between the amount of the dish recognized when the per-tableware image from the first image is input to the learner and the amount recognized when the per-tableware image from the second image is input to the learner.
 In the meal amount measuring device according to still another aspect of the present invention, it is preferable that the imaging unit includes a plurality of cameras, and the image acquisition unit acquires a plurality of first images and a plurality of second images photographed by the plurality of cameras for the same meal tray. Using the plurality of first and second images photographed from different directions for the detection and determination performed by the tableware detection unit, dish recognition unit, staple food side dish determination unit, and meal amount recognition unit can improve detection and determination accuracy.
 It is preferable that the meal amount measuring device according to still another aspect of the present invention includes a meal tray installation section on which the meal tray is placed, and that the imaging unit photographs the meal tray placed on the meal tray installation section.
 In the meal amount measuring device according to still another aspect of the present invention, it is preferable to include a meal tray detection unit that detects that a meal tray has been placed on the meal tray installation section, and that the imaging unit photographs the first image and the second image when the meal tray detection unit detects the meal tray. As a result, by placing the meal tray on the meal tray installation section, the first image and the second image can be automatically captured by the imaging unit.
 In the meal amount measuring device according to still another aspect of the present invention, it is preferable that the imaging unit shoots a moving image during periods other than the shooting of the first image and the second image, and the meal tray detection unit detects, based on the moving image, that a meal tray has been placed on the meal tray installation section. As a result, it is possible to reliably detect that the meal tray has been placed on the meal tray installation section.
 In the meal amount measuring device according to still another aspect of the present invention, it is preferable that the meal tray installation section includes a guide section that guides the meal tray to the photographing position of the imaging unit. As a result, the imaging unit can photograph the meal tray well.
 In the meal amount measuring device according to still another aspect of the present invention, it is preferable that a notch is formed in the meal tray installation section so that the meal tray can be placed on the installation section while being held with one hand. The notch formed in the meal tray installation section allows the meal tray to be placed on the installation section while held with one hand, without the hand interfering.
 In the meal amount measuring device according to still another aspect of the present invention, it is preferable to include a notification unit that reports the success or failure of photography by the imaging unit. The notification unit can report the success or failure of photography by sound or by displayed text.
 In the meal amount measuring device according to still another aspect of the present invention, it is preferable to include a user ID detection unit that detects the user ID of the user of the meal tray when the imaging unit photographs the meal tray, and a storage unit that stores the first image and the second image captured by the imaging unit when the user ID detection unit detected a user ID, each in association with the detected user ID, and that the staple food/side dish meal amount calculation unit calculates the amount eaten of at least one of the staple food and the side dishes for each user ID.
 This makes it possible to measure, for each user of a meal tray, the amount of at least one of the staple food and the side dishes that the user has eaten.
 In the meal amount measuring device according to still another aspect of the present invention, it is preferable to include a complete-meal determination unit that, when only the first image associated with a user ID exists among the first and second images stored in the storage unit, determines that the dishes served in the plurality of tableware items detected from the first image have been completely eaten, staple food and side dishes alike, and that, when the complete-meal determination unit determines a complete meal, the staple food/side dish meal amount calculation unit uses that determination result when calculating the amount eaten of at least one of the staple food and the side dishes. This eliminates the need to photograph completely eaten meal trays, saving labor and unnecessary meal-amount recognition processing.
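The complete-meal rule above can be sketched as a small helper. This is a minimal illustration, not the specification's implementation: the per-user image dictionary, its "pre"/"post" keys, and the 0-10 intake scale are assumptions introduced here.

```python
# Hedged sketch of the complete-meal determination unit (120D).
# Assumption: images are stored per user ID as a dict with optional
# "pre" (first image) and "post" (second image) entries; a full serving
# is expressed as intake 10 on a 0-10 scale. These names are illustrative.

def judge_complete_meal(stored_images: dict) -> bool:
    """Return True when only the pre-meal (first) image exists, i.e.
    the tray was cleared without a post-meal photo because both the
    staple food and the side dishes were completely eaten."""
    return "pre" in stored_images and "post" not in stored_images

def intake_per_dish(stored_images: dict, dishes_in_pre_image: list) -> dict:
    """If the meal is judged complete, every dish detected in the first
    image is assigned full intake without running image recognition on
    a second image."""
    if judge_complete_meal(stored_images):
        return {dish: 10 for dish in dishes_in_pre_image}
    raise NotImplementedError("post-meal image present: run recognition")
```

The point of the early return is exactly the labor saving the text describes: no second photograph and no recognition pass are needed for completely eaten trays.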
 In the meal amount measuring device according to still another aspect of the present invention, it is preferable to include a recording unit that records the amount of at least one of the staple food and the side dishes, calculated for each user ID by the staple food/side dish meal amount calculation unit, in association with that user ID. This makes it possible to record and manage, for each user of a meal tray, the amount of at least one of the staple food and the side dishes that the user has eaten.
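A minimal sketch of the per-user recording described above. The record layout (meal name plus staple/side amounts on a 0-10 scale) is an assumption for illustration, not the recording device's actual schema.

```python
# Hedged sketch of the recording unit: meal amounts keyed by user ID.
# The record fields (meal, staple, side) are illustrative assumptions.
from collections import defaultdict

class MealAmountRecorder:
    def __init__(self):
        self._records = defaultdict(list)  # user_id -> list of records

    def record(self, user_id: str, meal: str, staple: int, side: int):
        """Store one measurement (staple/side intake, 0-10 scale assumed)."""
        self._records[user_id].append(
            {"meal": meal, "staple": staple, "side": side})

    def history(self, user_id: str) -> list:
        """All recorded amounts for one user, for care management."""
        return list(self._records[user_id])

recorder = MealAmountRecorder()
recorder.record("user001", "breakfast", staple=10, side=8)
recorder.record("user001", "lunch", staple=6, side=10)
```

Keying every record by user ID is what lets staff review one resident's intake history across meals, as the text describes.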
 In the meal amount measuring device according to still another aspect of the present invention, the user ID detection unit preferably detects the user ID either by reading a barcode indicating the user ID attached to the meal tray or to the user's name plate placed on the meal tray, or by communicating with a wireless tag, embedded in the meal tray or the name plate, that stores the user's ID.
 A user ID detection unit that reads barcodes can detect the user ID indicated by a barcode by extracting the image region of the barcode contained in the first image and the second image. Here, "barcode" includes one-dimensional barcodes and two-dimensional barcodes (QR codes (registered trademark)). A user ID detection unit that communicates with wireless tags can be embedded in, for example, the meal tray installation unit on which the meal tray is placed.
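The barcode-reading variant can be sketched as scanning the captured images for a decodable code. The decoder below is a hypothetical stub (a real system might use a library such as OpenCV's QR code detector); only the surrounding scan-and-return logic is shown, and it is an assumption of this sketch.

```python
# Hedged sketch: detect a user ID by scanning captured images for a
# decodable QR code. `decode_qr` is a hypothetical stand-in for a real
# decoder; here it treats any string beginning with "QR:" as a
# successfully decoded code carrying the user ID.

def decode_qr(image):
    """Hypothetical decoder: return the encoded user ID, or None."""
    if isinstance(image, str) and image.startswith("QR:"):
        return image[3:]
    return None

def detect_user_id(images):
    """Return the first user ID decoded from any image, mirroring the
    user ID detection unit (120C); None means detection failed (e.g.
    the name plate was not tipped over to expose the code)."""
    for img in images:
        user_id = decode_qr(img)
        if user_id is not None:
            return user_id
    return None
```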
 A meal amount measuring method according to still another aspect includes: a step of photographing a pre-meal meal tray with an imaging unit; a step of photographing the post-meal meal tray with the imaging unit; a step of acquiring the pre-meal first image and the post-meal second image captured by the imaging unit; a step in which a tableware detection unit detects a plurality of tableware items in the meal tray based on each of the first image and the second image; a step in which a dish recognition unit recognizes, based on the first image, the type of dish served in each detected tableware item; a step in which a staple food/side dish determination unit determines, for each tableware item, whether the dish in that tableware corresponds to a staple food or to a side dish based on the recognized dish type; a step in which a tableware set determination unit determines sets of identical tableware between the plurality of tableware items detected from the first image and the plurality of tableware items detected from the second image; a step in which a meal amount recognition unit recognizes, for each tableware item, the amount eaten of the dish served in it, based on the images of each determined set of identical tableware; and a step in which a staple food/side dish meal amount calculation unit calculates the amount eaten of at least one of the staple food and the side dishes, based on the staple food/side dish determination result for each tableware item and the recognized amount eaten for each tableware item.
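The sequence of steps above can be sketched end to end with the recognizers stubbed out as precomputed data. The dish dictionaries, the leftover fractions, and the 0-10 intake scale are illustrative assumptions; only the pairing-and-aggregation structure mirrors the claimed method.

```python
# Hedged sketch of the measurement method: pair tableware between the
# pre-meal and post-meal images, then aggregate intake into staple and
# side-dish amounts. Detection/recognition results are supplied as
# plain data stubs; the 0-10 intake scale is an assumption.

def measure_meal_amount(pre_dishes, post_dishes):
    """pre_dishes:  {tableware_id: {"kind": "staple"|"side", "amount": float}}
    post_dishes: {tableware_id: {"amount": float}}  (leftover fraction)
    Returns per-class intake on a 0-10 scale, averaged over dishes."""
    totals = {"staple": [], "side": []}
    for ware, pre in pre_dishes.items():
        post = post_dishes.get(ware)            # same-tableware pairing
        if post is None:
            continue                            # no pair found: skip
        eaten = pre["amount"] - post["amount"]  # fraction actually eaten
        totals[pre["kind"]].append(round(10 * eaten / pre["amount"]))
    return {k: (sum(v) // len(v) if v else None) for k, v in totals.items()}

result = measure_meal_amount(
    {"20A": {"kind": "staple", "amount": 1.0},
     "20B": {"kind": "side", "amount": 1.0},
     "20C": {"kind": "side", "amount": 1.0}},
    {"20A": {"amount": 0.0},    # rice fully eaten
     "20B": {"amount": 0.5},    # half the main side dish left
     "20C": {"amount": 0.2}})   # a little of the other side dish left
```

With these stub values the staple score is 10 and the side-dish score averages to 6, the kind of per-class summary the claimed calculation step produces.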
 According to the present invention, the amount of at least one of the staple food and the side dishes provided on a meal tray can be measured accurately with a simple device.
FIG. 1 is an external view showing an example of the main hardware configuration of a meal amount measuring device according to the present invention.
FIG. 2 is a block diagram showing an embodiment of the meal amount measuring device shown in FIG. 1.
FIG. 3 is a functional block diagram showing the various functions of the CPU of the notebook PC shown in FIGS. 1 and 2.
FIG. 4 is a diagram showing a meal tray, a plurality of tableware items on the meal tray, a name plate, and the like.
FIG. 5 is a diagram showing an example of the name plate.
FIG. 6 is a functional block diagram showing the functions of the image recognition device shown in FIG. 2.
FIG. 7 is a chart showing an example of dish types and their staple food/side dish proportions.
FIG. 8 is a chart showing an example of the staple food/side dish determination results and meal amounts for the plurality of tableware items (dishes A to D) in the meal tray shown in FIG. 4.
FIG. 9 is a chart showing six-level evaluation results for the staple food and side dish amounts shown in FIG. 8.
FIG. 10 is a chart showing another example of staple food/side dish determination results and meal amounts for a plurality of tableware items (dishes A to C) in a meal tray.
FIG. 11 is a chart showing six-level evaluation results for the staple food and side dish amounts shown in FIG. 10.
FIG. 12 is a chart showing an example of a list of the personal information of each user managed by the recording device.
FIG. 13 is a chart showing an example of a meal amount list, recorded and managed by the recording device, showing the meal amount for each user.
FIG. 14 is a flowchart showing an embodiment of the meal amount measuring method according to the present invention, in particular the staff work procedure at serving time.
FIG. 15 is a flowchart showing an embodiment of the meal amount measuring method according to the present invention, in particular the meal tray photographing process of step S20 shown in FIG. 14.
FIG. 16 is a flowchart showing an embodiment of the meal amount measuring method according to the present invention, in particular the staff work procedure at tray clearing time.
FIG. 17 is a flowchart showing an embodiment of the meal amount measuring method according to the present invention, in particular the processing of the notebook PC and the image recognition device after the pre-meal and post-meal photography of the meal trays is completed.
 Hereinafter, preferred embodiments of the meal amount measuring device and method according to the present invention will be described with reference to the accompanying drawings.
 [Main hardware configuration of the meal amount measuring device]
 FIG. 1 is an external view showing an example of the main hardware configuration of the meal amount measuring device according to the present invention.
 The meal amount measuring device 1 shown in FIG. 1 is installed, for example, in a nursing care facility or a hospital ward and measures the amount of food eaten by a care recipient of the facility or an inpatient of the hospital (a user). It is composed of two cameras 10A and 10B functioning as an imaging unit, a meal tray installation unit 30, a notebook personal computer (notebook PC) 100, an image recognition device 200 shown in FIG. 2, and a recording device 300.
 In FIG. 1, reference numeral 32 denotes legs that support the meal tray installation unit 30, and 34 denotes a support column to which the cameras 10A and 10B are fixed.
 In FIG. 1, the notebook PC 100 is placed on the left side of the meal tray installation unit 30, and the meal tray 20 is placed on its right side.
 The meal tray installation unit 30 is formed with a notch 30A that allows a meal tray 20 to be placed on the unit while being held with one hand. In nursing care facilities and the like, staff often carry two meal trays 20, one in each hand, when serving and clearing meals; in such cases the notch 30A keeps the hand from interfering with the unit, so a meal tray 20 can be placed on the meal tray installation unit 30 while held with one hand.
 The meal tray installation unit 30 is also provided with a guide unit 30B that guides the meal tray 20 to the position photographed by the cameras 10A and 10B. The guide unit 30B of this example is an L-shaped stopper that positions the meal tray 20 when the tray abuts against it, but the guide is not limited to this; for example, the placement position of the meal tray 20 may instead be indicated by laying a mat where the tray is to be placed or by drawing a positioning frame on the surface of the meal tray installation unit 30.
 FIG. 2 is a block diagram showing an embodiment of the meal amount measuring device 1 shown in FIG. 1, mainly with respect to the notebook PC 100.
 The notebook PC 100 shown in FIG. 2 is composed of an input/output interface 110, a CPU (Central Processing Unit) 120, an operation unit 130, a communication unit 140, a RAM (Random Access Memory) 150, a storage unit 160, a display control unit 170, a display unit 172, a driver 180, and a speaker 182.
 The two cameras 10A and 10B are connected to the input/output interface 110, which functions as an image acquisition unit. The cameras 10A and 10B capture still images and moving images in response to shooting instructions input from the CPU 120 via the input/output interface 110.
 In this example, the cameras 10A and 10B photograph the meal tray 20 placed on the meal tray installation unit 30 from different directions, both before and after the meal. The two pre-meal images (first images) and the two post-meal images (second images) captured by the cameras 10A and 10B are each temporarily stored in the RAM 150 or the storage unit 160.
 The CPU 120 reads out the meal amount measurement program and various other programs stored in the storage unit 160, performs overall control of each unit, and provides various functions by executing the meal amount measurement program.
 FIG. 3 is a functional block diagram showing the various functions of the CPU 120. As shown in the figure, the CPU 120 functions as a staple food/side dish meal amount calculation unit 120A, a meal tray detection unit 120B, a user ID (identification) detection unit 120C, and a complete-meal determination unit 120D. Details of these four units are described later.
 The operation unit 130 includes a power switch, a keyboard, a mouse, and the like. In this example, in addition to accepting normal operation input for the notebook PC 100, it serves as the operation unit for the various meal-measurement icons displayed on the screen of the display unit 172. When the display unit 172 is equipped with a touch panel, the operation unit 130 includes that touch panel.
 The communication unit 140 connects directly or indirectly to external devices via a wireless LAN (Local Area Network), Bluetooth (registered trademark), or the like; in this example it can connect wirelessly to the image recognition device 200 and the recording device 300. Connection to the Internet is also possible.
 The RAM 150 is used as the work area of the CPU 120 and as a memory that temporarily stores various data, such as programs read from the storage unit 160, the first image (the pre-meal image of the meal tray 20), and the second image (the post-meal image of the meal tray 20).
 The storage unit 160 is composed of a hard disk device, flash memory, or the like, and is a non-volatile storage unit that stores the operating system, the meal amount measurement program, various other programs, the first and second images described above, and various data.
 The display control unit 170 creates display data to be shown on the display unit 172 in accordance with instructions from the CPU 120 and outputs it to the display unit 172. In this example, it causes the display unit 172 to display an operation screen for meal amount measurement having various icons (for example, a "meal amount measurement" icon, "breakfast" icon, "lunch" icon, "dinner" icon, "before meal" icon, "after meal" icon, "start" button, and "end" button), or to report the success or failure of photography by the cameras 10A and 10B.
 In this example, the display unit 172 is the color liquid crystal display of the notebook PC 100; while the meal amount measurement program is running, it displays the operation screen with the various icons for meal amount measurement.
 The driver 180 is a sound driver that creates audio information to be emitted from the speaker 182 in accordance with instructions from the CPU 120 and outputs it to the speaker 182; in this example, it causes the speaker 182 to emit sounds indicating the success or failure of photography by the cameras 10A and 10B. For example, when photography of the meal tray 20 by the cameras 10A and 10B succeeds, a "click" shutter sound or a voice announcement such as "photography completed" can be emitted.
 The display unit 172 and the speaker 182 function as a notification unit that reports the success or failure of photography by the cameras 10A and 10B.
 FIG. 4 is a diagram showing the meal tray 20, tableware items 20A to 20D on the meal tray 20, a name plate 22, and the like. The term "meal tray 20" is used both for the tray in its served or cleared state with tableware on it, and for the tray alone.
 The meal tray 20 shown in FIG. 4 is in its pre-meal state, placed on the meal tray installation unit 30 for photography; the tableware items 20A to 20D contain dishes A to D, respectively, and the name plate 22 has been tipped over so that its back side can be photographed. Dish A shown in FIG. 4 is, for example, the staple food (rice). Dishes B to D are side dishes: dishes B and C are the main and secondary side dishes, and dish D is a soup.
 FIG. 5 is a diagram showing an example of the name plate 22.
 The front side of the name plate 22 bears the name of the user, a care recipient of the nursing care facility or an inpatient of the hospital, and the back side bears a barcode (in this example, a QR code) 22A indicating the user ID that identifies the user of the meal tray 20. When the name plate 22 is tipped over, the QR code 22A can be photographed.
 [Meal amount measurement procedure using the meal amount measuring device]
 The procedure for measuring meal amounts using the meal amount measuring device 1 configured as above is as follows.
 &lt;At serving time&gt;
 (1) A staff member who serves meals to the users turns on the power switch of the notebook PC 100 and double-clicks the "meal amount measurement" icon on the desktop of the notebook PC 100 with the mouse. This launches the meal amount measurement program, and the operation screen for meal amount measurement appears on the display unit 172.
 (2) Select one of the "breakfast", "lunch", and "dinner" icons on the operation screen for meal amount measurement. When serving breakfast, click the "breakfast" icon.
 (3) Click the "before meal" icon for the serving-time operation.
 (4) Press the "start" button.
 [Photographing each user's tray (when carrying two meal trays 20 at once)]
 (1) Slide a meal tray 20 slightly out of the wagon on which the meal trays 20 are loaded.
 (2) Remove any lids or plastic wrap covering the tableware.
 (3) Tip the name plate 22 over so that the QR code 22A is visible (see FIG. 5).
 (4) Perform operations (1) to (3) above for both meal trays 20.
 (5) Hold one meal tray 20 in each hand and place the first meal tray 20 on the mat-covered portion of the meal tray installation unit 30. By using the notch 30A of the meal tray installation unit 30 (FIG. 1), the staff member can place a meal tray 20 on the unit while still holding it with one hand.
 (6) Wait until the speaker 182 emits a "click" sound. The "click" sound indicates that the cameras 10A and 10B have successfully photographed the meal tray 20.
 In this example, once meal amount measurement has started, the cameras 10A and 10B normally capture moving images, which are temporarily stored in the RAM 150.
 The CPU 120, functioning as the meal tray detection unit 120B, determines the presence or absence of a meal tray 20 in each frame of the moving images held in the RAM 150 and, when a meal tray 20 is detected, determines whether it is placed in the proper position on the meal tray installation unit 30. The CPU 120, functioning as the user ID detection unit 120C, also reads the QR code 22A on the name plate 22 from each frame of the moving images.
 When it is detected that the meal tray 20 is in the proper position on the meal tray installation unit 30 and the QR code 22A on the name plate 22 has been read successfully, the CPU 120 instructs the cameras 10A and 10B to capture still images of the meal tray 20. When the still-image capture is complete, the CPU 120 emits a "click" sound from the speaker 182 via the driver 180 to report that photography succeeded.
 If the meal tray 20 has not been placed in the proper position on the meal tray installation unit 30 within a fixed time after the meal tray detection unit 120B detected it, or if the QR code 22A on the name plate 22 cannot be read, the CPU 120 can report by voice through the speaker 182 that correct photography is not possible (photography failed). The CPU 120 may also report the success or failure of photography on the display unit 172.
 In this example, the meal tray detection unit 120B detects the meal tray 20 by image-processing the moving images captured by the cameras 10A and 10B, but the detection is not limited to this; a detector that detects the meal tray 20 may be arranged on the meal tray installation unit 30, and the CPU 120 may obtain from that detector information on whether a meal tray 20 has been placed on the meal tray installation unit 30.
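The capture trigger described above (tray detected in the video, positioned correctly, QR code read successfully, then still images and the shutter sound) can be sketched as a scan over frames. Both predicates are stubs operating on dictionary "frames"; a real device would run image processing on camera frames, so everything beyond the two-condition trigger logic is an assumption.

```python
# Hedged sketch of the capture trigger: a still image is requested only
# when a frame shows a tray in the correct position AND a readable QR
# code. The predicates below are stubs for the image processing the
# device actually performs.

def tray_in_position(frame):  # stub for the meal tray detection unit (120B)
    return frame.get("tray_ok", False)

def read_user_id(frame):      # stub for the user ID detection unit (120C)
    return frame.get("user_id")

def process_video(frames):
    """Scan successive video frames; return ("capture", user_id) when
    both conditions hold, or ("error", None) if the stream ends first
    (the device would then report the failure by voice or display)."""
    for frame in frames:
        user_id = read_user_id(frame)
        if tray_in_position(frame) and user_id is not None:
            return ("capture", user_id)   # trigger still photography
    return ("error", None)                # e.g. QR code never readable
```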
 The user ID detection unit 120C obtains the user ID by reading the QR code 22A on the name plate 22, but the detection is not limited to this; for example, a wireless tag storing the user ID may be embedded in the name plate or in a meal tray dedicated to the user, and the user ID may be read from the wireless tag by a tag reader installed in the meal tray installation unit 30.
 (7) When the "click" sound is heard, remove the meal tray 20 from the meal tray installation unit 30 and place the next meal tray 20 on it.
 The still images (first images) of the meal tray 20 captured at serving time (before the meal) by the cameras 10A and 10B when the user ID detection unit 120C detected a user ID are preferably stored in the non-volatile storage unit 160 (or the RAM 150) in association with the detected user ID. When the still-image photography of one meal tray 20 by the cameras 10A and 10B is complete, the moving images temporarily held in the RAM 150 are erased and the cameras 10A and 10B resume capturing moving images.
 (8) Wait until the speaker 182 emits a "click" sound.
 (9) Serve the meal.
 [When all users have been served]
 (1) Press the "end" button on the operation screen of the notebook PC 100. The notebook PC 100 is left running.
 &lt;At tray clearing time&gt;
 (1) The staff member selects the "after meal" icon on the operation screen for meal amount measurement.
 (2) Press the "start" button.
 [Photographing each user's tray (when the meal has been completely eaten)]
 (1) If both the staple food and the side dishes have been completely eaten, clear the meal tray 20 away without photographing it. At that time, collect the name plate 22.
 [Photographing each user's tray (when the meal has not been completely eaten)]
 (1) When clearing the tray, tip the name plate 22 over so that the QR code 22A is visible.
 (2) If chopsticks or spoons are propped up inside tableware (cups or dishes), lay them on the meal tray 20 so that they do not interfere with the meal amount measurement.
 (3) Place the meal tray 20 on the mat-covered portion of the meal tray installation unit 30. The staff member places the tray on the meal tray installation unit 30 while holding it with one hand.
 (4) Wait until the speaker 182 emits a "click" sound.
 (5) When the "click" sound is heard, remove the meal tray 20 from the meal tray installation unit 30 and collect the name plate 22.
 The still images (second images) of the meal tray 20 captured at clearing time (after the meal) by the cameras 10A and 10B are preferably stored in the non-volatile storage unit 160 (or the RAM 150) in association with the user ID, in the same manner as at serving time (before the meal).
 (6) Perform the normal tray clearing work.
 [When all meal trays 20 have been cleared]
 (1) Press the "end" button on the operation screen of the notebook PC 100.
 The first images of the pre-meal meal trays 20 and the second images of the post-meal meal trays 20 photographed as described above are temporarily stored in the storage unit 160 (or the RAM 150) in association with the user IDs.
 Next, the CPU 120 transmits each first image and second image stored in association with the same user ID to the image recognition device 200 via the communication unit 140.
 FIG. 6 is a functional block diagram showing the functions of the image recognition device 200.
 As shown in FIG. 6, the image recognition device 200 includes a communication unit 210, a tableware detection unit 220, a dish recognition unit 230, a staple food/side dish determination unit 240, a tableware set determination unit 250, and a meal amount recognition unit 260. The image recognition device 200 is realized by a personal computer and software.
 The communication unit 210 of the image recognition device 200 receives the first image (showing the meal tray 20 before the meal) and the second image (showing the meal tray 20 after the meal) transmitted from the notebook PC 100 and associated with the same user ID.
 The tableware detection unit 220 detects the plurality of pieces of tableware on the meal tray 20 based on the first and second images. For example, in the case of the first image of the meal tray 20 shown in FIG. 4, the tableware detection unit 220 detects the four pieces of tableware 20A to 20D based on the first image.
 The tableware detection unit 220 may detect the positions (regions) and types of the pieces of tableware by processing the first and second images based on the shape, color, size, and the like of each piece of tableware in the image, or it may be configured as a machine learning device that extracts (detects) each piece of tableware on the meal tray. It is preferable that the plurality of pieces of tableware on the meal tray 20 include no two identical pieces.
 The dish recognition unit 230 recognizes, for each piece of tableware detected from the first image, the type of dish served on it, and can be configured, for example, by a learner such as a convolutional neural network (CNN) that has been machine-trained to recognize (classify) dish types.
 In this case, the dish recognition unit 230 crops the image of each piece of tableware detected by the tableware detection unit 220 from the first image and uses the cropped tableware image (or an image in which only the dish is further cropped from the tableware image) as the input image of the learner, which outputs a recognition result indicating the type of dish served on that tableware.
 When menu data exist, the dish recognition unit 230 can also refer to the "breakfast", "lunch", or "dinner" menu data for the measurement day when recognizing the type of dish served on each piece of tableware, enabling more accurate dish recognition.
 The staple food/side dish determination unit 240 determines, for each piece of tableware, whether the dish on it corresponds to a staple food or to a side dish, based on the dish type recognized by the dish recognition unit 230.
 FIG. 7 is a chart showing an example of dish types and staple food/side dish ratios.
 The staple food/side dish determination unit 240 has, for example, the table shown in FIG. 7 and can read the staple food ratio and the side dish ratio from the table based on the type of dish on the tableware. In the example shown in FIG. 7, when the dish is "rice", a staple food ratio of 100% is read out, and when the dish is "curry rice", a staple food ratio of 80% and a side dish ratio of 20% are read out.
 When the dish is "vegetable salad", "fish", or "miso soup", a side dish ratio of 100% is read out.
 The "amount" in the rightmost column of FIG. 7 indicates the standard amount of the dish (for example, its weight in grams). The amount of a side dish may be weighted according to whether it is a main dish or a sub dish, and in the case of "miso soup", which is mostly water, the "amount" is preferably set to a small value. These "amounts" may also be determined empirically by staff who have visually assessed staple food and side dish meal amounts.
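 The lookup described above can be sketched as a small table keyed by dish type. The ratios below follow the examples given in the text; the standard amounts are illustrative assumptions, except for curry rice, whose standard amount of 180 is implied by the dish A calculation described later.

```python
# A minimal sketch of the FIG. 7 lookup table. Ratios follow the examples in
# the text; the standard "amount" values are assumptions, except curry rice
# (180), which is implied by the later calculation 144 = 180 x 0.8.
DISH_TABLE = {
    # dish:            (staple %, side %, standard amount)
    "rice":            (100, 0, 150),
    "curry rice":      (80, 20, 180),
    "vegetable salad": (0, 100, 60),
    "fish":            (0, 100, 80),
    "miso soup":       (0, 100, 30),  # kept small because it is mostly water
}

def staple_side_ratio(dish: str) -> tuple:
    """Return (staple fraction, side fraction) for a recognized dish type."""
    staple_pct, side_pct, _amount = DISH_TABLE[dish]
    return staple_pct / 100.0, side_pct / 100.0
```

 For example, `staple_side_ratio("curry rice")` yields `(0.8, 0.2)`, the split used when a single dish contributes to both categories.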
 The tableware set determination unit 250 determines, from the plurality of pieces of tableware detected from the first image and the plurality detected from the second image by the tableware detection unit 220, which pieces form the same tableware pair. It suffices to identify the matching pairs; it is not necessary to determine what kind of tableware each piece is.
 For example, when the pieces of tableware differ in size, pieces of the same size can be paired as the same tableware.
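 As a minimal sketch, this size-based pairing can be done greedily on the detected tableware areas, assuming (as the text suggests) that no two pieces on one tray have the same size. The IDs and area values here are hypothetical.

```python
# Sketch of pairing the same piece of tableware between the pre-meal and
# post-meal images by size alone. Each detection is (detection_id, area).
def match_tableware(before, after):
    """Greedily pair each pre-meal detection with the post-meal detection
    whose area is closest, consuming each post-meal detection once."""
    remaining = list(after)
    pairs = []
    for b_id, b_area in before:
        best = min(remaining, key=lambda a: abs(a[1] - b_area))
        remaining.remove(best)
        pairs.append((b_id, best[0]))
    return pairs
```

 With distinct sizes, e.g. pre-meal areas 100/60/30 against post-meal areas 99/61/29, each piece is paired with its near-identical counterpart.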
 The meal amount recognition unit 260 recognizes, for each piece of tableware, the amount of the dish served on it and the amount of the dish remaining, based on the images of the matching tableware pairs determined by the tableware set determination unit 250. The meal amount recognition unit 260 calculates the difference between the amount of the dish served on each piece of tableware in the first image and the amount remaining on it in the second image, thereby recognizing the amount of each dish eaten by the user.
 The meal amount recognition unit 260 includes, for example, a learner such as a CNN that has been machine-trained to recognize the amount of a dish from its image, using pairs consisting of an image of a dish served on tableware and the actual amount of that dish as teacher data. The teacher data consist of images of various dishes served on tableware (including images of various amounts) paired with the correct data, namely the actual amounts corresponding to those images.
 In the case of a meal amount recognition unit 260 configured as a learner, the image of each piece of tableware detected by the tableware detection unit 220 is cropped from the first image, and the cropped tableware image (or an image in which only the dish is further cropped out) is used as the input image of the learner, which outputs a recognition result indicating the amount of the dish served before the meal. Similarly, the image of each piece of tableware detected by the tableware detection unit 220 is cropped from the second image and input to the learner, which outputs a recognition result indicating the amount remaining on the same piece of tableware after the meal. By subtracting the amount recognized after the meal from the amount recognized before the meal for the same piece of tableware, the amount eaten by the user can be recognized. The amount of a dish here corresponds to the per-dish-type amount shown in the right column of FIG. 7.
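 The subtraction performed by the meal amount recognition unit 260 reduces to a per-tableware difference between the two learner estimates; a sketch with illustrative values:

```python
def eaten_amounts(before, after):
    """Amount eaten per piece of tableware: the pre-meal amount estimate
    minus the post-meal estimate for the same piece, clamped at zero so a
    slightly noisy post-meal estimate cannot produce a negative amount."""
    return {ware: max(0.0, before[ware] - after.get(ware, 0.0))
            for ware in before}
```

 For example, `eaten_amounts({"20A": 180.0, "20B": 100.0}, {"20A": 36.0, "20B": 100.0})` returns `{"20A": 144.0, "20B": 0.0}`: the dish on tableware 20A is 80% eaten, while the untouched dish on 20B yields zero.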
 Since two first images and two second images are captured by the two cameras 10A and 10B, the tableware detection unit 220, the dish recognition unit 230, the tableware set determination unit 250, and the meal amount recognition unit 260 can each use both first images and both second images for tableware detection, dish recognition, tableware pairing, and meal amount recognition, allowing more accurate determination and recognition. When the dish recognition unit 230 and the meal amount recognition unit 260 are configured as learners, the two images of the same piece of tableware may be input simultaneously for dish recognition and meal amount recognition.
 The image recognition device 200 transmits the staple food/side dish determination result and the meal amount recognized for each piece of tableware (each dish) based on the first and second images as described above, for each user ID, to the notebook PC 100 via the communication unit 210.
 Returning to FIG. 3, the staple food/side dish meal amount calculation unit 120A of the notebook PC 100 calculates the staple food meal amount and the side dish meal amount based on the per-tableware staple food/side dish determination results and meal amounts received from the image recognition device 200.
 FIG. 8 is a chart showing an example of the staple food/side dish determination results and meal amounts for the pieces of tableware 20A to 20D (dishes A to D) on the meal tray 20 shown in FIG. 4.
 In FIG. 8, "amount" indicates the amount of each dish before the meal, and "decrease" indicates the percentage (%) by which the meal amount decreased after the meal. The "amount" and "decrease" information is shown for reference; the staple food/side dish meal amount calculation unit 120A does not use it.
 Since each of the dishes A to D shown in FIG. 8 is determined to be either a staple food or a side dish, the staple food/side dish meal amount calculation unit 120A takes the meal amount (90) of dish A, which is determined to be a staple food, as the staple food meal amount, and calculates the total (150) of the meal amounts (80, 40, 30) of dishes B to D, which are determined to be side dishes, as the side dish meal amount.
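 Because every dish in the FIG. 8 example is wholly staple food or wholly side dish, the aggregation is a straight per-category sum; the category labels below are assumed strings.

```python
# Sketch of the FIG. 8 aggregation: each dish is wholly staple or wholly
# side, so eaten amounts are simply summed per category.
def sum_by_category(dishes):
    """dishes: list of (category, eaten amount); returns (staple, side)."""
    staple = sum(amt for cat, amt in dishes if cat == "staple")
    side = sum(amt for cat, amt in dishes if cat == "side")
    return staple, side

# FIG. 8 example: dish A is the staple food, dishes B-D are side dishes.
fig8 = [("staple", 90), ("side", 80), ("side", 40), ("side", 30)]
# sum_by_category(fig8) -> (90, 150)
```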
 The CPU 120 can transmit the staple food meal amount and the side dish meal amount calculated for each user ID, together with the user ID, to the recording device (recording unit) 300 via the communication unit 140.
 The staple food/side dish meal amount calculation unit 120A may also evaluate the staple food and side dish meal amounts calculated for each user ID on a six-level scale of 0 to 5 (the larger the value, the larger the meal amount) and output the evaluation values as the meal amounts.
 FIG. 9 is a chart showing the six-level evaluation results for the staple food and side dish meal amounts shown in FIG. 8; in the example shown in FIG. 9, the staple food meal amount is rated 3 and the side dish meal amount is rated 5 on the six-level scale.
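 The text does not state how amounts map onto the 0-to-5 scale, so the sketch below assumes the score is the eaten fraction of a served amount mapped evenly onto six levels; the served amounts (180 for both the staple food and the side dish total) are assumptions chosen so that the FIG. 9 ratings are reproduced.

```python
def six_level_score(eaten: int, served: int) -> int:
    """Map eaten/served onto the 0-5 scale. Integer arithmetic keeps the
    banding deterministic; a fully eaten meal caps at 5."""
    if served <= 0:
        return 0
    return min(5, (eaten * 6) // served)

# FIG. 9 under the assumed served amounts:
# staple 90 of 180 -> 3, side 150 of 180 -> 5
```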
 FIG. 10 is a chart showing another example of staple food/side dish determination results and meal amounts for the pieces of tableware (dishes A to C) on a meal tray.
 Dish A shown in FIG. 10 is, for example, curry rice, and its staple food/side dish determination result is 80% staple food and 20% side dish.
 When the decrease for dish A is 80%, the staple food/side dish meal amount calculation unit 120A receives 144 (= 180 × 0.8) from the image recognition device 200 as the meal amount for dish A. Based on the staple food/side dish determination result (staple food/side dish = 80/20), it calculates the staple food portion of the 144 as 115 (≈ 144 × 0.8) and the side dish portion as 29 (≈ 144 × 0.2).
 The staple food/side dish meal amount calculation unit 120A then takes the staple food portion (115) of dish A as the staple food meal amount, and calculates the total (58) of the side dish portion (29) of dish A and the meal amounts (20, 9) of dishes B and C, which are determined to be side dishes, as the side dish meal amount.
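 The mixed-ratio calculation above can be sketched as splitting each dish's eaten amount by its staple/side fractions before forming the per-category totals; the values below reproduce the FIG. 10 example.

```python
# Sketch of the FIG. 10 calculation: a dish with a mixed determination result
# (curry rice, 80% staple / 20% side) has its eaten amount split by that
# ratio before the totals are formed.
def split_and_total(dishes):
    """dishes: list of (eaten amount, staple fraction, side fraction);
    returns the rounded (staple total, side total)."""
    staple = round(sum(amt * fs for amt, fs, _ in dishes))
    side = round(sum(amt * fv for amt, _, fv in dishes))
    return staple, side

# Dish A: 144 eaten at 80/20; dishes B and C are pure side dishes (20 and 9).
fig10 = [(144, 0.8, 0.2), (20, 0.0, 1.0), (9, 0.0, 1.0)]
# split_and_total(fig10) -> (115, 58): staple 144*0.8 ~ 115; side 29 + 20 + 9
```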
 FIG. 11 is a chart showing the six-level evaluation results for the staple food and side dish meal amounts shown in FIG. 10; in the example shown in FIG. 11, the staple food meal amount is rated 4 and the side dish meal amount is rated 3 on the six-level scale.
 In this example, the staple food and side dish meal amounts are each evaluated on a six-level scale, but the present invention is not limited to this.
 Further, when only the first image associated with a user ID exists among the first and second images stored in the storage unit 160, the complete-meal determination unit 120D shown in FIG. 3 determines that the dishes served on the plurality of pieces of tableware detected from the first image were completely eaten, both staple food and side dishes.
 When the complete-meal determination unit 120D determines that a meal was completely eaten, the staple food/side dish meal amount calculation unit 120A uses that determination result in calculating the staple food meal amount and the side dish meal amount.
 In general, about 70% of users completely eat the provided meal, and providing the complete-meal determination unit 120D makes it unnecessary to photograph completely eaten meal trays, eliminating that effort and the otherwise unnecessary meal amount recognition processing.
 Even when no second image with the same user ID as a first image exists, the image recognition device 200 recognizes the pre-meal meal amount for each piece of tableware based on the first image and transmits the recognition result to the notebook PC 100. This is because, for some users, the amount of food served on each piece of tableware is reduced to begin with (for example, a half serving of rice), in which case the meal amount differs even when the meal is completely eaten.
 The CPU 120 transmits, together with the user ID, the six-level evaluation value of the staple food meal amount and the six-level evaluation value of the side dish meal amount calculated for each user ID by the staple food/side dish meal amount calculation unit 120A to the recording device 300 via the communication unit 140. In this case, information indicating which of "breakfast", "lunch", and "dinner" was selected on the operation screen displayed on the display unit 172 of the notebook PC 100 is also transmitted to the recording device 300.
 The recording device 300 is realized by a computer and creates and manages a personal information list showing the users' personal information and a meal amount list showing the meal amount for each user.
 FIG. 12 is a chart showing an example of the list of each user's personal information managed by the recording device 300.
 In the user list shown in FIG. 12, personal information such as name, age, and gender is recorded in the entry field for each user identified by a user ID. Because the user list records users' personal information, it can be viewed only by persons with special authorization.
 FIG. 13 is a chart showing an example of the meal amount list, recorded and managed by the recording device 300, showing the meal amount for each user.
 In the meal amount list shown in FIG. 13, the evaluation value of the staple food meal amount and the evaluation value of the side dish meal amount for each daily "breakfast", "lunch", and "dinner" are recorded in the meal amount entry field for each user identified by a user ID.
 When the recording device 300 receives the evaluation values of the staple food and side dish meal amounts together with a user ID from the notebook PC 100, it records those evaluation values in the meal amount list shown in FIG. 13.
 In this case, the evaluation values indicating the meal amounts for "staple food" and "side dish" at the "breakfast", "lunch", or "dinner" on the day the user ate are recorded in that user's entry field of the meal amount list, identified by the user ID.
 As a result, evaluation values indicating the staple food and side dish meal amounts eaten by each user are automatically recorded for each user identified by a user ID.
 [Meal Amount Measurement Method]
 FIGS. 14 to 17 are flowcharts showing an embodiment of the meal amount measurement method according to the present invention. A case where the meal amount is measured using the meal amount measuring device 1 shown in FIGS. 1 and 2 is described below.
 FIG. 14 is a flowchart showing the staff's work procedure and related steps at serving time.
 In FIG. 14, a staff member of the care facility or the like turns on the notebook PC 100 before serving the meal trays (step S10). When the notebook PC 100 has started up, the staff member double-clicks the "meal amount determination" icon on the desktop of the notebook PC 100 (step S12).
 When the "meal amount determination" icon is double-clicked ("Yes"), the meal amount measurement program starts, and the operation screen for meal amount measurement is displayed on the display unit 172 of the notebook PC 100. The staff member selects one of the "breakfast", "lunch", and "dinner" icons on this operation screen (step S14) and then presses the "start" button (step S16).
 Next, the staff member picks up a meal tray 20 and places it on the mat-covered portion of the meal tray installation section 30 (step S18).
 When the meal tray 20 is placed on the meal tray installation section 30, the meal tray 20 is photographed by the cameras 10A and 10B (step S20).
 FIG. 15 is a flowchart showing the photographing process for the meal tray in step S20 shown in FIG. 14.
 In FIG. 15, when the "start" button is pressed, the cameras 10A and 10B start capturing video, and the meal tray detection unit 120B detects from each video frame whether the meal tray 20 is placed at the appropriate position on the meal tray installation section 30 (step S21). The user ID detection unit 120C detects the user ID by reading the QR code 22A attached to the name plate 22 placed on the meal tray 20 (step S22).
 When the meal tray 20 and the user ID are appropriately detected in steps S21 and S22 ("Yes"), the CPU 120 instructs the cameras 10A and 10B to switch from video capture to still image capture and to photograph still images of the meal tray 20 (step S23).
 When the cameras 10A and 10B have finished capturing the still images, the CPU 120 emits a shutter ("click") sound from the speaker 182 to signal that the photographing succeeded (step S24).
 When the photographing of one person's meal tray 20 is completed as described above, the process returns to step S20 shown in FIG. 14. The captured still images (first images) are temporarily stored in the storage unit 160 in association with the user ID. When the still image capture is finished, the cameras 10A and 10B resume capturing video. That is, the cameras 10A and 10B capture video during periods other than the capture of still images (first and second images).
 Subsequently, the staff member determines whether all trays have been served (step S22); if not ("No"), the process returns to step S18 and steps S18 to S22 are repeated.
 On the other hand, when the staff member determines that all trays have been served ("Yes"), the staff member presses the "End" button on the operation screen (step S24).
 This completes the processing at serving time. The notebook PC 100 is left running.
 FIG. 16 is a flowchart showing the staff's work procedure and related steps at tray-clearing time.
 In FIG. 16, at tray-clearing time the staff member selects the "after meal" icon on the operation screen (step S30) and then presses the "start" button (step S32).
 If a cleared meal tray 20 has not been completely eaten, the staff member places that meal tray 20 on the mat-covered portion of the meal tray installation section 30 (step S34).
 When the cleared meal tray 20 is placed on the meal tray installation section 30, the meal tray 20 is photographed by the cameras 10A and 10B (step S20). Since the processing of step S20 is the same as step S20 shown in FIG. 14, its detailed description is omitted.
 Subsequently, the staff member determines whether all trays have been cleared (step S36); if not ("No"), the process returns to step S34 and steps S34 to S36 are repeated.
 On the other hand, when the staff member determines that all trays have been cleared ("Yes"), the staff member presses the "End" button on the operation screen (step S24).
 This completes the processing at tray-clearing time. Among the cleared meal trays 20, those whose photographing has finished and those that were completely eaten undergo the normal tray-clearing procedure.
 FIG. 17 is a flowchart showing the processing of the notebook PC 100 and the image recognition device 200 after the photographing of the pre-meal and post-meal meal trays is completed.
 In FIG. 17, when the photographing of the pre-meal and post-meal meal trays is completed, the CPU 120 of the notebook PC 100 transmits the first image of the pre-meal meal tray 20 and the second image of the post-meal meal tray 20, stored in the storage unit 160 in association with the same user ID, to the image recognition device 200 via the communication unit 140, and the image recognition device 200 receives (acquires) the first and second images associated with the same user ID (step S50).
 The tableware detection unit 220 of the image recognition device 200 detects the plurality of pieces of tableware on the meal tray 20 based on the acquired first and second images (step S52). The dish recognition unit 230 recognizes the type of dish served on each piece of tableware based on the acquired first image (step S54).
 The staple food/side dish determination unit 240 determines, for each piece of tableware (each dish), whether the dish on it corresponds to a staple food or to a side dish, from the recognized dish type (step S56).
 The tableware set determination unit 250 determines the matching tableware pairs from the plurality of pieces of tableware detected from the first image and the plurality detected from the second image in step S52 (step S58).
 When the matching tableware pairs have been determined, the meal amount recognition unit 260 recognizes the meal amount of the dish on each piece of tableware based on the images of each pair, and recognizes the amount of each dish eaten by the user by subtracting the amount remaining on each piece of tableware in the second image from the amount served on the same piece of tableware in the first image (step S60).
 The image recognition device 200 transmits the staple food/side dish determination results and the meal amounts recognized for each piece of tableware (each dish) based on the first and second images as described above, for each user ID, to the notebook PC 100 via the communication unit 210.
 The staple food/side dish meal amount calculation unit 120A of the notebook PC 100 calculates the staple food meal amount and the side dish meal amount based on the per-tableware staple food/side dish determination results and meal amounts received from the image recognition device 200 (step S62).
 The CPU 120 transmits the six-level evaluation values of the staple food and side dish meal amounts calculated for each user ID, together with the user ID, to the recording device 300, and the recording device 300 records the evaluation values indicating the staple food and side dish meal amounts in the meal amount list as shown in FIG. 13 (step S64).
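 The flow of steps S50 to S64 can be summarized in a stub-level sketch. The helper function below stands in for the learners of the image recognition device 200, and the ratio table and amounts are illustrative assumptions, not values taken from the patent.

```python
# Stub-level sketch of the FIG. 17 flow. detect_and_estimate() stands in for
# tableware detection (S52), dish recognition (S54), and per-dish amount
# estimation; here the "image" is already an annotated list of (dish, amount).
RATIOS = {"curry rice": (0.8, 0.2), "salad": (0.0, 1.0), "miso soup": (0.0, 1.0)}

def detect_and_estimate(image):
    """Stub recognizer: returns {dish type: estimated amount} for one tray."""
    return dict(image)

def measure_meal(first_image, second_image):
    before = detect_and_estimate(first_image)            # pre-meal tray (S50-S54)
    after = detect_and_estimate(second_image)            # post-meal tray
    staple = side = 0.0
    for dish, served in before.items():                  # pairing, here by dish (S58)
        eaten = max(0.0, served - after.get(dish, 0.0))  # difference (S60)
        fs, fv = RATIOS[dish]                            # staple/side split (S56)
        staple += eaten * fs                             # category totals (S62)
        side += eaten * fv
    return round(staple), round(side)
```

 Fed the earlier curry rice example (180 served, 36 remaining, plus fully eaten salad and miso soup), `measure_meal` returns the same (115, 58) totals described for FIG. 10.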
 [その他]
 本実施形態の食事量測定装置は、主食及び副食の食事量をそれぞれ算出するが、これに限らず、主食もしくは副食の少なくとも一つの食事量を算出するものでもよい。
[Other]
The meal amount measuring device of the present embodiment calculates the meal amount of the staple food and the side dish, respectively, but is not limited to this, and may calculate the meal amount of at least one of the staple food or the side dish.
 In the meal amount measuring device 1 of this example, the notebook PC 100, the image recognition device 200, and the recording device 300 are connected so as to communicate directly or indirectly via wireless LAN, Bluetooth, or the like; however, the image recognition device 200 and the other devices may instead communicate with the notebook PC 100 over the Internet.
 Further, the meal amount measuring device according to the present invention may be a stand-alone device consisting only of a personal computer, such as a notebook PC.
 The imaging unit is not limited to two cameras; it may include three or more cameras, or may be composed of a single camera.
 The hardware that realizes the meal amount measuring device according to the present invention can be configured with various types of processors. These include a CPU, which is a general-purpose processor that executes programs to function as various processing units; a programmable logic device (PLD) such as an FPGA (Field Programmable Gate Array), which is a processor whose circuit configuration can be changed after manufacture; and a dedicated electric circuit such as an ASIC (Application Specific Integrated Circuit), which is a processor having a circuit configuration designed specifically for executing particular processing. One processing unit of the meal amount measuring device may be composed of one of these various processors, or of two or more processors of the same or different types; for example, one processing unit may be composed of a plurality of FPGAs, or of a combination of a CPU and an FPGA. A plurality of processing units may also be configured by a single processor. As a first example of this, one or more CPUs combined with software constitute a single processor that functions as a plurality of processing units, as typified by computers such as clients and servers. As a second example, a processor that realizes the functions of the entire system, including the plurality of processing units, on a single IC (Integrated Circuit) chip is used, as typified by a system on chip (SoC). Thus, the various processing units are configured, as a hardware structure, using one or more of the various processors described above. More specifically, the hardware structure of these processors is an electric circuit (circuitry) in which circuit elements such as semiconductor elements are combined.
 The present invention is not limited to the embodiment described above, and it goes without saying that various modifications are possible without departing from the spirit of the present invention.
1 Meal amount measuring device
10A, 10B Camera
20 Meal tray
20A, 20B, 20C, 20D Tableware
22 Name plate
22A QR code
30 Meal tray installation part
30A Notch part
30B Guide part
100 Notebook PC
110 Input/output interface
120 CPU
120A Staple/side-dish meal amount calculation unit
120B Meal tray detection unit
120C User ID detection unit
120D Complete-meal determination unit
130 Operation unit
140 Communication unit
150 RAM
160 Storage unit
170 Display control unit
172 Display unit
180 Driver
182 Speaker
200 Image recognition device
210 Communication unit
220 Tableware detection unit
230 Dish recognition unit
240 Staple/side-dish determination unit
250 Tableware set determination unit
260 Meal amount recognition unit
300 Recording device
A, B, C, D Dishes
S10 to S64 Steps

Claims (15)

  1.  A meal amount measuring device comprising:
     an imaging unit that photographs a meal tray before and after a meal;
     an image acquisition unit that acquires a first image taken before the meal and a second image taken after the meal by the imaging unit;
     a tableware detection unit that detects a plurality of pieces of tableware in the meal tray based on the first image and the second image;
     a dish recognition unit that recognizes, for each piece of detected tableware, the type of dish served on it, based on the first image;
     a staple/side-dish determination unit that determines, for each piece of tableware, whether the dish in the tableware corresponds to a staple food or a side dish, from the recognized type of dish;
     a tableware set determination unit that determines matching sets of tableware from the plurality of pieces of tableware detected from the first image and the plurality of pieces of tableware detected from the second image by the tableware detection unit;
     a meal amount recognition unit that recognizes, for each piece of tableware, the amount eaten of the dish served on it, based on the images of the matching tableware sets determined by the tableware set determination unit; and
     a staple/side-dish meal amount calculation unit that calculates the meal amount of at least one of the staple food or the side dish, based on the per-tableware determination results of the staple/side-dish determination unit and the per-tableware meal amounts recognized by the meal amount recognition unit.
  2.  The meal amount measuring device according to claim 1, wherein the staple/side-dish determination unit determines the proportion of staple food and the proportion of side dish in the dish, based on the dish in the tableware.
  3.  The meal amount measuring device according to claim 1 or 2, wherein the meal amount recognition unit includes a learner trained by machine learning using, as teacher data, pairs of an image of a dish served on tableware and the amount of the dish served on that tableware, and
     recognizes, for each piece of tableware, the amount eaten of the dish served on it, based on the difference, between matching pieces of tableware, of the amount of the dish recognized when the per-tableware image from the first image is input to the learner and the amount of the dish recognized when the per-tableware image from the second image is input to the learner.
  4.  The meal amount measuring device according to any one of claims 1 to 3, wherein the imaging unit comprises a plurality of cameras, and
     the image acquisition unit acquires, for the same meal tray, a plurality of the first images and a plurality of the second images taken by the plurality of cameras.
  5.  The meal amount measuring device according to any one of claims 1 to 4, further comprising a meal tray installation part on which the meal tray is placed,
     wherein the imaging unit photographs the meal tray placed on the meal tray installation part.
  6.  The meal amount measuring device according to claim 5, further comprising a meal tray detection unit that detects that the meal tray has been placed on the meal tray installation part,
     wherein the imaging unit takes the first image and the second image when the meal tray detection unit detects the meal tray.
  7.  The meal amount measuring device according to claim 6, wherein the imaging unit captures a moving image during periods other than the taking of the first image and the second image, and
     the meal tray detection unit detects, based on the moving image, that the meal tray has been placed on the meal tray installation part.
  8.  The meal amount measuring device according to any one of claims 5 to 7, wherein the meal tray installation part comprises a guide part that guides the meal tray to the photographing position of the imaging unit.
  9.  The meal amount measuring device according to any one of claims 5 to 8, wherein the meal tray installation part is formed with a notch part that allows the meal tray to be placed on the meal tray installation part while being held with one hand.
  10.  The meal amount measuring device according to any one of claims 1 to 9, further comprising a notification unit that reports the success or failure of photography by the imaging unit.
  11.  The meal amount measuring device according to any one of claims 1 to 10, further comprising:
     a user ID detection unit that detects the user ID of the user of the meal tray when the meal tray is photographed by the imaging unit; and
     a storage unit that stores the first image and the second image taken by the imaging unit when the user ID is detected by the user ID detection unit, each in association with the detected user ID,
     wherein the staple/side-dish meal amount calculation unit calculates the meal amount of at least one of the staple food or the side dish for each user ID.
  12.  The meal amount measuring device according to claim 11, further comprising a complete-meal determination unit that determines, when only the first image associated with the user ID exists among the first images and the second images stored in the storage unit, that the dishes served on the plurality of pieces of tableware detected from the first image have been completely eaten, both staple food and side dish,
     wherein the staple/side-dish meal amount calculation unit uses the determination result of the complete-meal determination unit when calculating the meal amount of at least one of the staple food or the side dish in a case where the complete-meal determination unit has determined a complete meal.
  13.  The meal amount measuring device according to claim 11 or 12, further comprising a recording unit that records the meal amount of at least one of the staple food or the side dish calculated for each user ID by the staple/side-dish meal amount calculation unit, in association with the user ID.
  14.  The meal amount measuring device according to any one of claims 11 to 13, wherein the user ID detection unit detects the user ID by reading a barcode indicating the user ID attached to the meal tray or to the user's name plate placed on the meal tray, or detects the user ID by communicating with a wireless tag that stores the user ID of the user and is embedded in the meal tray or the name plate.
  15.  A meal amount measuring method comprising:
     a step of photographing a meal tray before a meal with an imaging unit;
     a step of photographing the meal tray after the meal with the imaging unit;
     a step of acquiring a first image taken before the meal and a second image taken after the meal by the imaging unit;
     a step of detecting, by a tableware detection unit, a plurality of pieces of tableware in the meal tray based on each of the first image and the second image;
     a step of recognizing, by a dish recognition unit, the type of dish served on each piece of detected tableware, based on the first image;
     a step of determining, by a staple/side-dish determination unit, for each piece of tableware, whether the dish in the tableware corresponds to a staple food or a side dish, from the recognized type of dish;
     a step of determining, by a tableware set determination unit, matching sets of tableware from the plurality of pieces of tableware detected from the first image and the plurality of pieces of tableware detected from the second image;
     a step of recognizing, by a meal amount recognition unit, for each piece of tableware, the amount eaten of the dish served on it, based on the images of the matching tableware sets determined; and
     a step of calculating, by a staple/side-dish meal amount calculation unit, the meal amount of at least one of the staple food or the side dish, based on the determined per-tableware staple/side-dish determination results and the recognized per-tableware meal amounts.
PCT/JP2020/040067 2019-10-31 2020-10-26 Meal amount measuring device and method WO2021085369A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-198565 2019-10-31
JP2019198565 2019-10-31

Publications (1)

Publication Number Publication Date
WO2021085369A1 true WO2021085369A1 (en) 2021-05-06

Family

ID=75715158

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/040067 WO2021085369A1 (en) 2019-10-31 2020-10-26 Meal amount measuring device and method

Country Status (1)

Country Link
WO (1) WO2021085369A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114743153A (en) * 2022-06-10 2022-07-12 北京航空航天大学杭州创新研究院 Non-sensory dish-taking model establishing and dish-taking method and device based on video understanding

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008204105A (en) * 2007-02-19 2008-09-04 Shikoku Chuboki Seizo Kk Automatic food intake measuring system and automatic dietary intake measuring method
JP2012212249A (en) * 2011-03-30 2012-11-01 Fujitsu Ltd Meal image analysis method, meal image analysis program and meal image analysis device



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20883641; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20883641; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)