WO2023105033A1 - Display apparatus of cooking appliance and cooking appliance - Google Patents


Info

Publication number
WO2023105033A1
Authority
WO
WIPO (PCT)
Prior art keywords
cooking
display
node
display apparatus
progress
Prior art date
Application number
PCT/EP2022/085130
Other languages
French (fr)
Inventor
Hui YANG (TEI)
Chao Li
Original Assignee
BSH Hausgeräte GmbH
Priority date
Filing date
Publication date
Priority claimed from CN202111505007.5A external-priority patent/CN115191838A/en
Application filed by BSH Hausgeräte GmbH filed Critical BSH Hausgeräte GmbH
Publication of WO2023105033A1 publication Critical patent/WO2023105033A1/en


Classifications

    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24 HEATING; RANGES; VENTILATING
    • F24C DOMESTIC STOVES OR RANGES; DETAILS OF DOMESTIC STOVES OR RANGES, OF GENERAL APPLICATION
    • F24C7/00Stoves or ranges heated by electric energy
    • F24C7/08Arrangement or mounting of control or safety devices
    • F24C7/082Arrangement or mounting of control or safety devices on ranges, e.g. control panels, illumination
    • F24C7/085Arrangement or mounting of control or safety devices on ranges, e.g. control panels, illumination on baking ovens

Definitions

  • Embodiments of the present invention relate to the technical field of household appliances, and in particular, to a display apparatus of a cooking appliance and a cooking appliance.
  • Cooking devices such as microwave ovens and ovens have become increasingly widely used and are now among the indispensable household appliances in every household.
  • An existing cooking device can only mechanically heat food according to parameters such as time and temperature set by a user.
  • As a result, the food is easily overcooked or even burnt, or remains insufficiently cooked under the set parameters, requiring the user to increase the cooking time or temperature. The cooking device is thus not intelligent enough, the cooking effect is poor, and the user experience is unsatisfactory.
  • An oven in the prior art acquires an internal image through a camera and presents it to the user, so that the user can observe the cooking state of the food.
  • However, this scheme of showing the cooking state of the food inside still requires the user to adjust the cooking parameters according to the observed image. If the user does not intervene in time, the cooking result is affected, and the working process is not intelligently controlled.
  • The user can usually only see the oven's food identification result and the parameter settings the oven determines at the beginning.
  • The data that the oven feeds back to the user includes only a real-time camera picture, the temperature change in the oven, and the remaining time.
  • The user generally cannot judge the cooking degree of the food or what the oven's intelligent control process is doing. The intelligent control is therefore a black box for the user, which leaves the user doubtful about the authenticity of the intelligence and results in poor experience.
  • An embodiment of the present invention provides a display apparatus of a cooking appliance.
  • The display apparatus includes a display unit and a control unit.
  • The control unit is configured to display, through the display unit, a progress pattern representing a current cooking progress. The progress pattern includes a plurality of cooking nodes temporally spaced apart from each other; the cooking nodes represent cooking state information of the cooked food in the cooking process and are visually distinguished from each other.
  • Important nodes at which the cooking appliance intelligently intervenes in the cooking process are displayed to the user by adding the cooking state to the time-progress pattern at the preset cooking nodes, so that the user can see how the artificial intelligence assists cooking and can visually perceive the intelligent control process.
  • The progress pattern includes a bar pattern, and the current cooking progress extends over time along the bar pattern as a change of light brightness or color, so that the cooking progress of the cooking appliance can be read visually from the change of light or color.
  • The progress pattern includes a background pattern representing the complete cooking progress, and the current cooking progress gradually covers or replaces the background pattern over time with a light brightness or color different from that of the background pattern. The user can thus visually follow the cooking progress over the whole cooking process.
  • The cooking nodes are located on the background pattern to present the key nodes of the cooking process visually, which facilitates interactive operation of the cooking appliance or the display apparatus.
  • The cooking node includes time information corresponding to the cooking progress.
  • The control unit is further configured to receive a user operation on a cooking node through the display unit and, in response, display the cooking state information of the corresponding cooking node on the display unit. In this way, interaction with the user is realized through the display unit, and the cooking state at the corresponding cooking node can be checked.
  • The control unit is further configured to respond to the user operation only after the current cooking progress reaches or exceeds the cooking node, and to display the cooking state information of the corresponding cooking node on the display unit based on the operation. The user can therefore operate a cooking node and obtain a response only after the current cooking progress reaches or exceeds that node, and then learn the cooking state information through the display unit.
  • The cooking state information includes preset standard cooking state information and actual cooking state information.
  • The cooking state information includes information about an adjustment made to the cooking control according to a comparison of the actual cooking state information with the standard cooking state information.
  • The information about the adjustment includes at least one of a cooking time, a cooking temperature, and a cooking fire intensity.
  • Monitoring the cooking state of the cooked food at the preset cooking nodes ensures timely intervention in the cooking process at key cooking nodes, timely adjustment of the cooking parameters, and timely correction when the cooking state deviates from the expectation, so that the cooking degree of the food meets the expectation.
  • The intelligent intervention process may be displayed to the user through the display unit to improve the user's experience of the intelligent control process.
  • The standard cooking state information includes a standard cooking state image of the to-be-cooked food.
  • The actual cooking state information includes an actual cooking state image of the to-be-cooked food, so as to present the cooking state of the food at the current cooking node and to show visually whether the actual cooking progress at the node meets the expectation.
  • The control unit is further configured to display, through the display unit, a first area including a real-time image of the cooked food and a second area including the progress pattern.
  • A cooking appliance includes the above display apparatus.
  • The cooking appliance further includes an information acquisition unit connected to the display apparatus and configured to acquire actual cooking state information of the to-be-cooked food in a cooking process.
  • The information acquisition unit includes a photographing apparatus.
  • FIG. 1 is a schematic diagram of a cooking appliance according to an embodiment of the present invention.
  • FIG. 2 is a schematic frame diagram of a cooking appliance according to an embodiment of the present invention.
  • FIG. 3 is a flowchart of a method for displaying a cooking state of a cooking appliance according to an embodiment of the present invention.
  • FIG. 4 is a schematic diagram of a display mode of the cooking state of FIG. 2 according to an embodiment of the present invention.
  • FIG. 5 is a schematic diagram of a display mode of FIG. 4.
  • FIG. 6 is a schematic diagram of still another display mode of FIG. 4.
  • FIG. 7 is a schematic diagram of a display interface of a display unit in FIG. 2.
  • FIG. 8 is a schematic structural diagram of a control system for a cooking appliance according to an embodiment of the present invention.
  • FIG. 9 is a work flow diagram of a control system for a cooking appliance of FIG. 8.
  • 1-Cooking appliance; 10-Body; 101-Input/output module; 1011-Display unit; 20-Chamber; 11-Control unit; 12-Heating module; 13-Information acquisition unit; 14-Tray; 2-Server.
  • During use of an intelligent cooking device, a user hopes that the cooking device, such as an oven, can run completely automatically to reduce the user's intervention, but the user is also concerned that the cooking result produced by intelligent control may not be what the user expected, and hopes to obtain appropriate information feedback. In this way, the user can monitor the automatic cooking of the oven and obtain a satisfactory cooking result.
  • As noted, the data that the oven feeds back to the user may include only a real-time camera picture, the temperature change in the oven, and the remaining time.
  • The implementations of the present invention provide a method for displaying a cooking state of a cooking appliance, a display apparatus of the cooking appliance, and a cooking appliance and a control system therefor.
  • A visual presentation of the dynamic identification of the cooking state and of the interventional measures is displayed to the user throughout the entire cooking process, so that the user can understand how the intelligent control of the cooking appliance works and helps accurately control the cooking process, and can visually perceive the degree of intelligence and the commercial value of the cooking device, which improves user experience.
  • FIG. 1 and FIG. 2 are schematic diagrams of a cooking appliance according to an embodiment of the present invention.
  • FIG. 3 is a flowchart of a method for displaying a cooking state of a cooking appliance according to an embodiment of the present invention.
  • a cooking appliance 1 shown in FIG. 1 and FIG. 2 may perform the method shown in FIG. 3 to monitor cooking states of food at different cooking nodes during cooking of the food, and dynamically and visually display a working process of the cooking appliance to a user according to an identification result.
  • The cooking appliance 1 of this embodiment may be an oven, which may include a body 10 and a chamber 20 arranged in the body 10.
  • The chamber 20 is configured to hold the cooked food.
  • The cooking appliance 1 may have a door that opens or closes the chamber 20. When the door is opened, the chamber 20 is exposed so that the user can place or remove the cooked food.
  • When the door is closed, the chamber 20 is closed.
  • Cooking operations such as heating and baking may then be performed on the cooked food placed in the chamber.
  • the cooking appliance 1 may further include an input/output module 101. Operating states of the cooking appliance 1 may be adjusted by operating the input/output module 101. Cooking state information may also be displayed through the input/output module 101.
  • the operating states of the cooking appliance 1 may include a heating power, a heating direction, a heating duration, an amount of conveyed steam, a steam conveying direction, and the like.
  • the adjustment to the operating states may be realized by adjusting operating states of specific functional modules in the cooking appliance 1.
  • The cooking appliance 1 may further include a control unit 11, which is configured to adjust the operating state of the corresponding functional module according to an instruction fed back by the input/output module 101, so that the operating state of the cooking appliance 1 conforms to the user's instruction.
  • The control unit 11 may also control an output mode of the input/output module 101 through an input signal or the instruction, for example, a screen content display mode or a display mode.
  • the cooking appliance 1 may include a heating module 12 configured to heat cooked food 2.
  • the cooking appliance 1 may include a plurality of heating modules 12, and the heating modules 12 are dispersed in different areas of the cooking appliance 1 to heat the chamber 20 from different angles so that the cooked food 2 is heated as evenly as possible.
  • The control unit 11 may independently adjust the heating power of each heating module 12 to adjust the amount of heat radiated by that heating module 12 into the chamber 20.
  • The control unit 11 may further independently adjust the heating direction of each heating module 12 to adjust the angle at which that heating module 12 radiates heat into the chamber 20.
  • The control unit 11 may further adjust the heating duration of a specific heating module 12 operating at a specific heating power to heat the food to a desired effect.
  • the cooking appliance 1 may include a spray module (not shown) configured to deliver water vapor into the chamber 20 to adjust humidity in the chamber 20.
  • the cooking appliance 1 may include a plurality of spray modules, and the plurality of spray modules are dispersed in different areas of the cooking appliance 1 to deliver water vapor into the chamber 20 from different angles, so that the humidity distribution on the surface of the cooked food is balanced.
  • The control unit 11 may independently adjust the amount of steam conveyed by each spray module.
  • The control unit 11 may also independently adjust the steam conveying direction of each spray module.
  • The control unit 11 may further adjust the spraying duration of a specific spray module operating at a specific steam conveying rate.
  • the input/output module 101 in FIG. 1 includes an input apparatus configured to realize an input function and a display apparatus configured to realize an output function.
  • The input apparatus and the display apparatus may be integrated, for example, as a touch display panel with a touch function configured to realize both data input and image information display.
  • the input apparatus and the display apparatus may also be separated and each realize their own functions.
  • the input apparatus may be an adjustment button similar to a knob, and may further be in other forms, such as a touch screen, a voice control module, and the like, and the display apparatus may be a display screen, or the like.
  • The display apparatus may be arranged on the cooking appliance 1, or may be separated from or detachably connected to the cooking appliance 1.
  • the display apparatus may include a display unit 1011 and a control unit 11 connected to the display unit.
  • the control unit 11 may be configured to display a progress pattern representing a current cooking progress of the cooking appliance 1 through the display unit 1011. In this way, the cooking states of the cooking appliance 1 at different cooking nodes are displayed to the user in a visual manner, including real-time cooking state information and interventional measure information of the cooking appliance 1.
  • The control unit 11 connected to the display unit 1011 may be a control module of the cooking appliance itself, through which both the control of the cooking appliance 1 and the control of the display unit 1011 are realized.
  • The control unit 11 may perform the method shown in FIG. 3 to monitor the cooking state of the cooked food, and display a control parameter or an intervention behavior of the cooking appliance 1 in the cooking process through the display unit 1011 according to an identification result.
  • The control unit 11 may include or be externally connected to a memory (not shown).
  • The memory stores a computer program. When the computer program is executed by a processor, the steps of the method shown in FIG. 3 are performed.
  • The control unit 11 inside the cooking appliance 1 is shown in FIG. 1 by way of example. In practical application, the specific arrangement position of the control unit 11 may be adjusted as required. In a variation, the control unit 11 may be arranged outside the cooking appliance 1 (for example, on a server 2).
  • the cooking appliance 1 is provided with a communication module (not shown), and the control unit 11 communicates with the communication module to transmit a control instruction to the cooking appliance 1.
  • the control instruction is used for controlling an operating state of the cooking appliance 1.
  • the cooking appliance 1 may include an information acquisition unit 13 arranged in the chamber 20 or in the door to acquire actual cooking state information of food at least at one angle.
  • the control unit 11 communicates with the information acquisition unit 13 to acquire the data acquired by the information acquisition unit 13.
  • The information acquisition unit 13 may be a photographing apparatus arranged at the top of the chamber 20, whose overhead field of view covers at least the tray 14.
  • a plurality of photographing apparatuses may be arranged to shoot the cooked food 2 from different angles.
  • the photographing apparatus may further be arranged inside the door, and a shooting range includes at least an area where the cooked food 2 is located.
  • the cooking appliance 1 may include a temperature detection apparatus (not shown) configured to detect temperatures of a food surface, the inside of the food, the inside of the body, and the like.
  • A temperature sensor (not shown) may be arranged in the chamber 20 to collect the temperature in the chamber 20.
  • the tray 14 may be provided with a temperature sensor or probe (not shown) to collect the surface temperature or the internal temperature of the cooked food 2.
  • the method for displaying a cooking state of a cooking appliance of the cooking appliance 1 in this embodiment may include the following steps.
  • Step S100: Preset N cooking nodes and N cooking node identification models at different time points for the to-be-cooked food in a cooking process, where each cooking node identification model includes the standard cooking state information of the corresponding cooking node, and N is a positive integer greater than or equal to 1.
  • Step S102: Acquire a cooking instruction for the food, and cook the food according to the received instruction.
  • Step S104: Acquire the actual cooking state information at a cooking node.
  • Step S106: Compare the acquired actual cooking state information with the standard cooking state information of the corresponding cooking node through the cooking node identification model for identification.
  • Step S108: Transmit the identification result corresponding to the at least one cooking node to a user terminal interface or a cooking appliance interface for display to the user, where the identification result includes at least normal cooking state information or abnormal cooking state information.
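The steps S100 to S108 can be sketched as a simple loop. The following Python sketch is purely illustrative: the patent does not specify an implementation, so all names (CookingNode, identify, run_cooking) and the use of a scalar stand-in for the standard state images are assumptions made for clarity.

```python
from dataclasses import dataclass

@dataclass
class CookingNode:
    time_s: int              # time point of the node in the cooking process
    standard_state: float    # simplified scalar stand-in for a standard state image

def identify(actual_state: float, node: CookingNode, tol: float = 0.05) -> str:
    """Compare the actual state with the node's standard state (S106)."""
    if abs(actual_state - node.standard_state) <= tol:
        return "normal"
    return "abnormal"

def run_cooking(nodes, actual_states):
    """Emit one identification result per cooking node (S104 to S108)."""
    results = []
    for node, actual in zip(nodes, actual_states):
        results.append((node.time_s, identify(actual, node)))
    return results

# Three preset nodes (S100), e.g. at 2, 4 and 6 minutes into the program.
nodes = [CookingNode(120, 0.3), CookingNode(240, 0.6), CookingNode(360, 0.9)]
# Measured states: the second node deviates from its standard value.
results = run_cooking(nodes, [0.31, 0.75, 0.9])
```

In a real appliance the comparison would run on images rather than scalars, but the per-node normal/abnormal result transmitted in S108 has the same shape.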
  • The cooking appliance interface herein may be understood as the display apparatus or the display unit of the cooking appliance.
  • The user terminal interface may be a mobile terminal wirelessly connected to the cooking appliance, or the like.
  • A cooking manner of food generally corresponds to at least one cooking program.
  • The cooking program may include different cooking nodes in a cooking process.
  • The cooking nodes are used for monitoring the cooking degrees or cooking states of the food in different time periods of the cooking process.
  • Cooking node identification models at the different cooking nodes of the cooking process are adopted to monitor the cooking state of the cooked food and to carry out intelligent intervention and control; the intelligent control process is then displayed to the user according to the identification result, so that the user can visually perceive the intelligent control process of the cooking appliance, which improves user experience.
  • The prepared food is put into the cooking appliance 1.
  • After the user sets cooking parameters such as the heating time and the heating temperature and starts cooking, the information acquisition unit 13 starts to work and acquires captured images in real time.
  • The information acquisition unit 13 may store the acquired images in a memory of the body or transmit them to a memory of a remote server.
  • The control unit 11 monitors the food cooking state according to the acquired images, identifies the actual cooking state information at the cooking nodes by using the preset cooking node identification models, and transmits the identification results corresponding to the cooking nodes to the user terminal interface or to the display unit of the cooking appliance for display to the user.
  • The cooking appliance 1 may adjust the cooking parameters according to the identification result and display the adjusted parameters to the user through the display unit 1011.
  • The display may occur at the end of the cooking process or during it, so that the cooking process is intelligently monitored, the cooking parameters are adjusted in time when abnormal cooking is found, and the intelligent cooking control process is visually displayed to the user.
  • The cooking node identification models may be stored in the memory in advance.
  • Each model includes the standard cooking state information of the corresponding cooking node.
  • The standard cooking state information may be standard food cooking state images corresponding to the plurality of cooking nodes.
  • A standard image may be understood as the state that the food should reach at the cooking node, which may include a color parameter, a shape parameter, and the like representing the state of the food.
  • The standard images are obtained by training on a large amount of experimental cooking data.
  • Different cooking nodes correspond to different standard images.
  • The cooking node identification models are in a one-to-one correspondence with the types of food.
  • For example, a cooking node identification model A corresponding to cooking steak, a cooking node identification model B corresponding to cooking a sweet potato, and the like are stored in the memory.
  • A steak cooking program includes three cooking nodes, and the corresponding model A includes three cooking node identification sub-models A1, A2, and A3. Each sub-model includes a standard steak image at the corresponding cooking node.
  • The identification result herein may include information that the steak is in a normal cooking state at one of the cooking nodes, or information that the steak is in an abnormal cooking state.
  • Cooking node identification models corresponding to various food types may be arranged in the memory, so as to expand the range of menus that the cooking appliance 1 can cook intelligently.
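The per-food model layout described above (model A with sub-models A1, A2, A3 for steak, a model B for sweet potato) can be pictured as a simple lookup structure. The dict layout, food names, and sub-model labels below are illustrative assumptions, not a structure the patent prescribes.

```python
# One identification model per food type; each model holds one sub-model
# (standard image) per cooking node, as in model A = {A1, A2, A3}.
models = {
    "steak":        ["A1", "A2", "A3"],   # three cooking nodes
    "sweet_potato": ["B1", "B2"],         # two cooking nodes (illustrative)
}

def submodel_for(food: str, node_index: int) -> str:
    """Look up the sub-model used to identify a given node of a given food."""
    return models[food][node_index]
```

Adding a new entry to such a registry is what "expanding the menu range" amounts to in this sketch.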
  • Under the control of the control unit 11, the display apparatus of the cooking appliance 1 may display the identification result corresponding to each cooking node to the user with a graphic mark placed according to the cooking progress.
  • The control unit 11 may control the display unit 1011 to display a progress pattern representing a complete cooking process.
  • The progress pattern includes a plurality of cooking nodes temporally spaced apart from each other. Each cooking node represents different cooking state information of the cooked food in the cooking process, and the cooking nodes are visually distinguished from each other.
  • This scene may be the display mode of the display unit upon completion of cooking or during cooking.
  • The control unit 11 may also control the display unit 1011 to display a progress pattern representing the current cooking progress.
  • The progress pattern includes a plurality of cooking nodes temporally spaced apart from each other.
  • Each cooking node represents different cooking state information of the cooked food in the cooking process, and the cooking nodes are visually distinguished from each other.
  • The progress pattern may include a bar pattern, and the current cooking progress extends over time along the bar pattern as a change of light brightness or color, so that the cooking progress of the cooking appliance can be read visually from the change of light or color.
  • The progress pattern may include a background pattern that represents the complete cooking progress, and the current cooking progress gradually covers or replaces the background pattern over time with a light brightness or color different from that of the background pattern.
  • The cooking nodes may be located on the background pattern to present the key nodes of the cooking process visually, which facilitates interactive operation of the cooking appliance or the display apparatus.
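The bar pattern with node markers can be sketched in text form. The rendering characters, the 20-cell width, and the function name below are assumptions for illustration; a real appliance would render light or color on a screen rather than ASCII.

```python
def render_progress(total_s, elapsed_s, node_times, width=20):
    """Text sketch: background bar '-', covered progress '=', nodes 'o'/'O'."""
    cells = ["-"] * width                      # background pattern
    filled = min(width, elapsed_s * width // total_s)
    for i in range(filled):                    # current progress covers it
        cells[i] = "="
    for t in node_times:                       # cooking nodes placed on the bar
        pos = min(width - 1, t * width // total_s)
        # a node already reached ('O') looks different from one not yet reached ('o')
        cells[pos] = "O" if elapsed_s >= t else "o"
    return "".join(cells)
```

Halfway through an 8-minute process with nodes at 2 and 6 minutes, `render_progress(480, 240, [120, 360])` yields a bar whose first node is already covered and whose second still lies on the background.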
  • The user may visually perceive the extent to which the cooked food has been cooked, as well as what role the artificial intelligence based on image identification plays, what control it performs, and how it works throughout the whole cooking process. Displaying the dynamic identification process of the cooking appliance 1 to the user in this manner improves user experience and enhances trust in the technology and the satisfaction of success.
  • A complete cooking process is presented as a time axis through the display unit, and key state points, that is, cooking nodes, may be set on the time axis.
  • Three cooking nodes for cooking steak are presented through bubble graphics in the figure.
  • Time information of the corresponding node may also be displayed at the node.
  • For example, the first node in FIG. 4 indicates a cooking state identification result at 10 minutes and 30 seconds.
  • Bubble graphics of different colors may be used to indicate whether the identification result of the current cooking node is a normal cooking state or an abnormal cooking state.
  • For example, a black filled bubble indicates one identification result, and a blank bubble indicates another. This is only an example, and different identification results may be distinguished in other ways in practical applications.
  • The user may view the identification result of the corresponding node by operating the graphic mark, for example by clicking the bubble graphic in the display unit 1011.
  • The cooking appliance 1 may acquire, through the control unit 11, a control signal generated by the user operating the graphic mark, and display the cooking state information of the cooking node corresponding to the graphic mark through the user terminal interface or the cooking appliance interface according to the control signal, where the cooking state information indicates whether the current cooking state at the cooking node is a normal cooking state or an abnormal cooking state.
  • Interaction with the user can thus be realized through the display unit, and the cooking state at the corresponding cooking node can also be checked.
  • the control unit 11 is further configured to respond to the user operation only after the current cooking progress reaches or exceeds the cooking node, and display the cooking state information of the corresponding cooking node on the display unit 1011 based on the operation.
  • The cooking nodes represent different cooking state information of the cooked food in the cooking process.
  • Before the current cooking progress reaches a cooking node, the cooking appliance 1 or the control unit 11 has not yet identified, intervened in, or adjusted the cooking process at that node, so the corresponding cooking node has no cooking state information available for the user to view. Therefore, the user may operate a cooking node and obtain a response only after the current cooking progress reaches or exceeds that node, so as to learn the cooking state information through the display unit.
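The gating rule described above is a one-line comparison. The function name and arguments below are illustrative assumptions; the point is only that a tap on a node is answered solely once the current progress has reached or passed that node.

```python
def on_node_tapped(node_time_s, progress_s, state_info):
    """Return the node's state info, or None if the node is not yet reached."""
    if progress_s >= node_time_s:
        return state_info
    return None  # no identification has happened yet, nothing to show
```

A node at 6 minutes tapped at 4 minutes of progress would therefore produce no response, while a node at 2 minutes would show its stored result.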
  • The cooking state information of the cooking node corresponding to the graphic mark may include the actual cooking state image and the standard cooking state image of the to-be-cooked food at that cooking node, so that the user can visually observe whether the actual cooking progress at the node meets the expectation.
  • Adjusted control parameter information in the cooking process is displayed through the user terminal interface or the cooking appliance interface according to the control signal, so that the user can visually observe the measures by which the cooking appliance 1 intelligently intervenes at the node, and the cooking result meets the expectation as far as possible.
  • When the user clicks on a bubble graphic, the display unit presents the picture shown on the right side of the figure to the user.
  • The picture includes the standard cooking state image and the actual cooking state image, and indicates that the cooking time is increased by 10 seconds.
  • The presented information indicates that, at the current cooking node, the actual cooking state of the steak deviates from the standard cooking state, that is, the cooking state is abnormal.
  • The cooking appliance adjusts the cooking time according to this comparison and displays the adjustment to the user through the display unit.
  • When the user clicks on the third bubble graphic in the figure, that is, the third cooking node, the display unit presents the picture shown on the right side of the figure to the user.
  • The picture includes the standard cooking state image and the actual cooking state image, and the current cooking state is the standard state.
  • The presented information indicates that, at the current cooking node, the actual cooking state of the steak matches the standard cooking state, which conforms to the cooking state that the current cooking node should reach, that is, the normal cooking state. Therefore, "the current cooking state is the standard state" is displayed directly through the display unit.
  • The abnormal cooking state information includes at least one of the following: the actual cooking state has not reached the standard cooking state corresponding to the cooking node, or the actual cooking state is beyond the standard cooking state corresponding to the cooking node.
  • In the former case, the cooking appliance increases the control parameter, for example, at least one of the cooking time, the cooking temperature, and the cooking fire intensity.
  • In the latter case, the cooking appliance decreases the control parameter, for example, at least one of the cooking time, the cooking temperature, and the cooking fire intensity.
  • the adjusted control parameter information includes the control parameter information automatically adjusted by the cooking appliance, and a correspondence between the intelligent identification result of the cooking appliance 1 and the control parameter information may be preset, so that the cooking appliance 1 can automatically acquire the corresponding control parameter information according to the identification result.
  • the cooking appliance 1 is used for cooking steak, a cooking degree selected by the user is medium well, and the expected cooking duration is 8 minutes.
  • the control unit 11 monitors that, at the third cooking node, for example at 6 minutes of the cooking time, the comparison result between the real-time image acquired by the information acquisition unit 13 and the standard image indicates that the current cooking state of the steak has gone beyond the cooking state of the steak in the corresponding standard image. If heating continues for the preset 8 minutes, the final steak will probably be overcooked and fail to meet the expectation.
  • the control unit shortens the total heating time, for example, by 20 seconds, that is, adjusts the originally planned heating time of 8 minutes to 7 minutes and 40 seconds. Such fine-tuning can well avoid over-cooking, so that the cooking result meets the expectation.
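The fine-tuning in this example can be sketched as follows. The function name and the "beyond"/"behind" labels are assumptions made for illustration; only the 20-second step and the 8-minute plan come from the example above:

```python
def adjust_total_time(planned_s: int, state: str, step_s: int = 20) -> int:
    """Shorten or extend the planned heating time after a node comparison.

    state: "beyond" if the actual cooking state is ahead of the standard
    image, "behind" if it has not reached it, anything else for normal.
    """
    if state == "beyond":
        return planned_s - step_s
    if state == "behind":
        return planned_s + step_s
    return planned_s

# Steak example: 8 minutes planned, state "beyond" at the third node
# gives 7 minutes and 40 seconds (460 seconds).
adjusted = adjust_total_time(8 * 60, "beyond")
```

A real control unit would of course derive the adjustment step from the identification result rather than use a fixed constant.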
  • N cooking nodes at different time points for to-be-cooked food in a cooking process may be preset.
  • the cooking nodes are obtained from experimental cooking data of different foods. Experiments show that most ingredients undergo color and shape changes in the cooking process, which can be used for comparison to determine the cooking state of food. Therefore, the whole cooking process may be divided into different state points, that is, cooking nodes, according to the characteristics of the food. The state points vary among different foods. For example, three state points for steak and two state points for chicken wings are found through experiments. The state points become important reference points for the cooking appliance to determine the cooking state of food.
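As an illustrative sketch only, the experimentally determined state points could be held in a per-food lookup table. The concrete time values below are hypothetical; only the node counts (three for steak, two for chicken wings) follow the text above:

```python
# Hypothetical per-food table of cooking nodes (time points, in seconds).
# The times are invented for illustration; only the node counts follow
# the experimental findings described in the text.
COOKING_NODES = {
    "steak": [120, 240, 360],
    "chicken_wings": [300, 600],
}

def nodes_for(food: str) -> list:
    """Return the preset cooking nodes for a food, or an empty list."""
    return COOKING_NODES.get(food, [])
```

A real implementation would derive these points from experimental cooking data rather than hard-code them.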
  • identifying the cooking state of cooked food at a preset cooking node can ensure timely intervention in the cooking process at a key cooking node, timely adjustment of the cooking parameter, and timely correction when the cooking state deviates from the expectation.
  • on the one hand, the cooking result of the food meets the expectation; on the other hand, an important node intervened by artificial intelligence is displayed to the user by adding the cooking state to the time schedule axis, so that the user can understand what the artificial intelligence has done for cooking and can visually perceive the intelligent control process.
  • the real-time image and the corresponding standard image of the food at the current cooking node may be displayed through the display unit 1011, to present the cooking state of the food at the current cooking node.
  • the control unit 11 may invoke the actual cooking state image and the standard cooking state image of the node stored in the control unit and present the actual cooking state image and the standard cooking state image to the user through the display unit 1011.
  • the control unit 11 may further control the display unit 1011 to display a first area including an image of the cooked food displayed in real time and a second area including the progress pattern.
  • a display interface of the display unit 1011 is shown.
  • the interface includes an image of a real-time cooking state of the cooked food, that is, steak, and an intelligent identification pattern of the cooking process presented by a progress bar.
  • the display interface may further include cooking control parameter information of the cooking appliance, and the like.
  • the cooking parameter may further be adjusted by using the method in combination with the real-time internal temperature of food.
  • the real-time internal temperature of food may be acquired by using a probe or other sensors, and the cooking appliance 1 is controlled according to the real-time internal temperature and the current cooking state of the food.
  • the cooking appliance 1 is controlled according to the internal temperature of the food, so that the cooking parameter can be better controlled to obtain the expected food cooking effect.
  • the internal temperature of the food does not reach the expected temperature, which may lead to a problem that the food is not sufficiently cooked.
  • the cooking time and the cooking temperature are adjusted by monitoring the internal temperature in real time.
  • over-cooking of the food may be caused, which can be avoided by monitoring the internal temperature in real time and intervening in advance.
  • food identification may be performed before the method is carried out, and the cooking node identification model is automatically matched based on the food identification result.
  • a food identification model for food identification is preset, an image of cooked food is acquired, the image is identified through the food identification model, and a corresponding cooking instruction is acquired based on the identified food.
  • N cooking node identification models in the cooking process corresponding to the food are determined according to the identification result of the food.
  • the cooking appliance 1 may be provided with a plurality of information acquisition units 13 configured to acquire images and identification results of cooked food from different angles, and determine one identification result based on a priority to improve the accuracy of food identification.
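The priority-based selection among several information acquisition units could be sketched as below; the (priority, result) pair format and the rule that a lower number wins are assumptions, not details from the disclosure:

```python
from typing import List, Optional, Tuple

def select_result(results: List[Tuple[int, str]]) -> Optional[str]:
    """Pick the identification result reported by the highest-priority
    information acquisition unit (lower number = higher priority).
    Returns None if no unit produced a result."""
    if not results:
        return None
    return min(results, key=lambda pair: pair[0])[1]
```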
  • the specific embodiment of the present invention further discloses a control system for a cooking appliance 1, including a cooking appliance 1 and a server 2 that can interact with each other.
  • the control unit 11 may also be arranged on the server 2, and the control unit 11 is configured to perform the method described in FIG. 3.
  • the control unit 11 is communicatively connected to the cooking appliance 1.
  • the cooking appliance 1 is configured to receive an identification result transmitted by the server 2, and display the identification result to a user through the display unit 1011.
  • the control unit 11 is arranged in the server 2 and performs data processing such as identification in the server 2, which not only facilitates training, learning, and updating of data such as models and improves the accuracy of the processed data, but also reduces the complexity of the cooking appliance itself. Therefore, the degree of intelligence in controlling the cooking appliance is improved.
  • the cooking appliance 1 may work according to the cooking control parameter transmitted by the control unit 11. Specifically, the cooking appliance 1 may adjust the cooking control parameter for food cooking by the cooking appliance according to the identification result.
  • the cooking appliance 1 is further configured to display the adjusted cooking control parameter. Specifically, the cooking appliance controls the display unit 1011 through the control unit 11 to display the adjusted control parameter, so as to visually display the intelligent control process of the cooking appliance to the user, and improve user experience.
  • With reference to FIG. 9, the specific implementation of the control system is described in detail by using the intelligent oven as an example.
  • the oven is started to acquire video information of cooked food in real time, and the video information is uploaded to a server in the form of video stream.
  • the server identifies the cooked food through the food identification model, and invokes the corresponding cooking parameter according to the cooked food.
  • the server transmits the cooking parameter to the oven in the form of cooking instruction, and the oven displays the instruction information through the display unit.
  • the user confirms the displayed instruction and feeds back user response information to the oven to instruct the oven to start cooking.
  • the oven starts cooking, and in the cooking process, the oven uploads a video stream of to-be-cooked food to the server in real time through a photographing apparatus.
  • the server receives the video stream, and identifies a cooking state of a corresponding cooking node through a preset cooking node identification model.
  • if the cooking node 1 identification model identifies the node as an abnormal node, an interventional measure for the abnormal node, that is, increasing or decreasing the cooking time, and a real-time cooking state image of the cooking node are transmitted to the oven.
  • the oven receives the identification result, and controls the display unit to display the identification result.
  • the display mode of the identification result may use a graphic mark, including the actual cooking state image and the standard image, and the interventional measure of the oven for the node, for example, increasing or decreasing parameter information such as the cooking time, as shown in FIG. 4 and FIG. 5.
  • if the cooking node 1 identification model identifies the node as a normal node, the identification result is transmitted to the oven, and the oven controls the display unit to display the identification result; the displayed contents include the actual cooking state image and the standard image, as shown in FIG. 4 and FIG. 6.
  • a plurality of cooking nodes of a kind of food in a cooking process are provided, and each of the nodes is correspondingly preset with a cooking node identification model.
  • the identification process and the display mode of other cooking nodes such as cooking nodes 2 to N are similar to the process of the cooking node 1, and the details are not described herein again.
  • a cooking ending instruction is transmitted to the oven at the end of cooking, and a cooking ending state is displayed through the display unit.
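The oven side of the exchange above can be summarized in a small sketch; the message structure and all field names are assumptions made for illustration, not part of the disclosure:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class NodeResult:
    """A hypothetical identification result sent from the server to the oven."""
    node: int
    normal: bool
    actual_image: str       # placeholder for the real-time cooking state image
    standard_image: str     # placeholder for the standard cooking state image
    measure: Optional[str]  # interventional measure, only for abnormal nodes

def contents_to_display(result: NodeResult) -> List[str]:
    """Collect what the display unit should show for a cooking node:
    both images, plus the interventional measure for an abnormal node."""
    shown = [result.actual_image, result.standard_image]
    if not result.normal and result.measure:
        shown.append(result.measure)
    return shown
```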
  • An embodiment of the present invention further provides a computer-readable storage medium, which is a non-volatile storage medium or a non-transient storage medium, having a computer program stored thereon, the computer program, when running on a processor, performing the steps of the control method according to any one of the foregoing embodiments.


Abstract

Disclosed are a display apparatus of a cooking appliance and a cooking appliance. The display apparatus includes a display unit and a control unit. The control unit is configured to display a progress pattern representing a current cooking progress through the display unit. The progress pattern includes a plurality of cooking nodes temporally spaced apart from each other. The cooking nodes represent different cooking state information of cooked food in a cooking process and are visually distinguished from each other. According to the display apparatus and the cooking appliance, an important node intervened through artificial intelligence is displayed to a user by adding the cooking state to the time progress pattern, so that the user can learn how the artificial intelligence helps cooking and can visually perceive the intelligent control process.

Description

DISPLAY APPARATUS OF COOKING APPLIANCE AND COOKING APPLIANCE
TECHNICAL FIELD
Embodiments of the present invention relate to the technical field of household appliances, and in particular, to a display apparatus of a cooking appliance and a cooking appliance.
BACKGROUND
In recent years, cooking devices including microwave ovens and ovens have been increasingly widely used, and have become one of the indispensable household appliances for every household. Generally, an existing cooking device can only mechanically heat food according to parameters such as time and temperature set by a user. In such circumstances, the food is easily overcooked or even burnt, or is insufficiently cooked under the set parameters, which requires the user to increase the cooking time or temperature. It can be seen that such a cooking device is not intelligent enough, the cooking effect is not good, and user experience is poor.
For the above situation, an oven in the prior art acquires an internal image thereof through a camera and presents the image to the user, so that the user can learn a cooking state of food. However, the scheme of showing the cooking state of internal food still requires the user to adjust the cooking parameter according to the observed image. If the user does not intervene in time, the cooking effect is also affected, and the working process cannot be intelligently controlled.
Even in the process of controlling the oven in an intelligent manner, the user can usually only see the food identification result of the oven and the parameter setting provided by the oven at the beginning. Once cooking begins, the data that the oven feeds back to the user includes only a real-time picture from a camera, the temperature change in the oven, and the remaining time. However, from these parameters, the user generally cannot determine a cooking degree of the food or what the intelligent control process of the oven is doing. Therefore, the intelligent control is a black box for the user, and it is precisely for this reason that the user is confused about the authenticity of the intelligence and has a poor experience.
SUMMARY
In order to solve the above problem, an embodiment of the present invention provides a display apparatus of a cooking appliance. The display apparatus includes a display unit and a control unit. The control unit is configured to display a progress pattern representing a current cooking progress through the display unit, the progress pattern includes a plurality of cooking nodes temporally spaced apart from each other, and the cooking nodes represent cooking state information of cooked food in a cooking process and are visually distinguished from each other.
Through the above display apparatus, an important node intelligently intervened by the cooking appliance in the cooking process is displayed to a user by adding the cooking state to the time progress pattern according to the preset cooking node, so that the user can learn how the artificial intelligence helps cooking and can visually perceive the intelligent control process.
Optionally, the progress pattern includes a bar pattern, and the current cooking progress extends over time along the bar pattern in the form of a brightness or color of light, so that the cooking progress of the cooking appliance can be visually learned through the change of the light or color.
Optionally, the progress pattern includes a background pattern representing a complete cooking progress, where the current cooking progress gradually covers or replaces the background pattern over time in the form of a brightness or color of light different from that of the background pattern. Therefore, the user can visually feel the cooking progress in the whole cooking process.
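As a rough text-mode analogy of this rendering (the characters and the bar width are arbitrary assumptions), the current progress gradually covering the background pattern could look like:

```python
def render_progress(total_s: int, elapsed_s: int, width: int = 20) -> str:
    """Render the current progress ('#') gradually covering the
    background pattern ('-') as the cooking time elapses."""
    filled = int(width * min(elapsed_s, total_s) / total_s)
    return "#" * filled + "-" * (width - filled)
```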
Optionally, the cooking node is located on the background pattern to present a key node of the cooking process in a visual way, which facilitates the interactive operation of the cooking device or the display apparatus.
Optionally, the cooking node includes time information corresponding to the cooking progress.
Optionally, the control unit is further configured to receive an operation of a user on the cooking node through the display unit, and respond to the operation and display the cooking state information of the corresponding cooking node on the display unit based on the operation. In this way, the interaction with the user can be realized through the display unit, and the cooking state at the corresponding cooking node may also be checked.
Optionally, the control unit is further configured to respond to the user operation only after the current cooking progress reaches or exceeds the cooking node, and display the cooking state information of the corresponding cooking node on the display unit based on the operation. Therefore, the user may operate the corresponding cooking node and obtain a response only after the current cooking progress reaches or exceeds the cooking node, so as to learn the cooking state information through the display unit.
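The gating rule above admits a one-line sketch (expressing node positions as elapsed seconds is an assumption):

```python
def node_is_active(progress_s: int, node_time_s: int) -> bool:
    """A cooking node responds to user operations only once the current
    cooking progress reaches or exceeds the node."""
    return progress_s >= node_time_s
```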
Optionally, the cooking state information includes preset standard cooking state information and actual cooking state information.
Optionally, the cooking state information includes information about an adjustment for cooking control according to a comparison of the actual cooking state information with the standard cooking state information.
Optionally, the information about the adjustment for cooking control includes at least one of a cooking time, a cooking temperature, and a cooking fire intensity.
On the one hand, during cooking, monitoring the cooking state of cooked food at a preset cooking node can ensure timely intervention in the cooking process at a key cooking node, timely adjustment of the cooking parameter, and timely correction when the cooking state deviates from the expectation, so that a cooking degree of food meets the expectation. On the other hand, the intelligent intervention process may be displayed to the user through the display unit to improve the experience of the user for the intelligent control process.
Optionally, the standard cooking state information includes a standard cooking state image of to-be-cooked food, and the actual cooking state information includes an actual cooking state image of the to-be-cooked food, so as to present the cooking state of the food at the current cooking node, and visually observe whether the actual cooking progress of the node meets the expectation.
Optionally, the control unit is further configured to display, through the display unit, a first area including an image of cooked food displayed in real time and a second area including the progress pattern.
A cooking appliance includes the above display apparatus.
Optionally, the cooking appliance further includes an information acquisition unit connected to the display apparatus and configured to acquire actual cooking state information of to-be-cooked food in a cooking process. Optionally, the information acquisition unit includes a photographing apparatus.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic diagram of a cooking appliance according to an embodiment of the present invention.
FIG. 2 is a schematic frame diagram of a cooking appliance according to an embodiment of the present invention.
FIG. 3 is a flowchart of a method for displaying a cooking state of a cooking appliance according to an embodiment of the present invention.
FIG. 4 is a schematic diagram of a display mode of the cooking state of FIG. 2 according to an embodiment of the present invention.
FIG. 5 is a schematic diagram of a display mode of FIG. 4.
FIG. 6 is a schematic diagram of still another display mode of FIG. 4.
FIG. 7 is a schematic diagram of a display interface of a display unit in FIG. 2.
FIG. 8 is a schematic structural diagram of a control system for a cooking appliance according to an embodiment of the present invention.
FIG. 9 is a workflow diagram of the control system for a cooking appliance of FIG. 8.

In the accompanying drawings:
1-Cooking appliance; 10-Body; 101-Input/output module; 1011-Display unit; 20-Chamber; 11-Control unit; 12-Heating module; 13-Information acquisition unit; 14-Tray; 2-Server.
DETAILED DESCRIPTION
During use of an intelligent cooking device, a user hopes that the cooking device such as an oven can run completely automatically, to reduce intervention of the user, but the user is also very worried that the cooking result produced by intelligent control is not what the user expected, and the user hopes to obtain appropriate information feedback. In this way, the user can monitor the automatic cooking of the oven and obtain a satisfactory cooking result.
In the existing manner, the user can only see a result of identifying food by the oven at the beginning and the parameter setting provided by the oven according to the identification result. Once cooking begins, data that the oven feeds back to the user may include only a real-time picture of a camera, a temperature change in the oven, and the remaining time.
However, from these parameters, the user generally cannot determine a cooking degree of the food in the cooking process or what the intelligent control of the oven is doing. Therefore, the intelligent control process is a black box for the user, and it is precisely for this reason that the user is confused about the authenticity of the intelligence and has a poor experience.

The implementations of the present invention provide a method for displaying a cooking state of a cooking appliance, a display apparatus of the cooking appliance, and a cooking appliance and a control system therefor. A visual presentation of the dynamic identification of the cooking state of the cooking appliance and of an interventional measure is displayed to the user throughout the entire cooking process, so that the user can understand how the intelligent control of the cooking appliance works and helps accurately control the cooking process, and can visually feel the degree of intelligence and the commercial value of the cooking device, which improves user experience.
To make the foregoing objectives, features, and advantages of the present invention clearer and easier to understand, specific embodiments of the present invention are described below in detail with reference to the accompanying drawings.
FIG. 1 and FIG. 2 are schematic diagrams of a cooking appliance according to an embodiment of the present invention. FIG. 3 is a flowchart of a method for displaying a cooking state of a cooking appliance according to an embodiment of the present invention. A cooking appliance 1 shown in FIG. 1 and FIG. 2 may perform the method shown in FIG. 3 to monitor cooking states of food at different cooking nodes during cooking of the food, and dynamically and visually display a working process of the cooking appliance to a user according to an identification result.
Specifically, referring to FIG. 1, the cooking appliance 1 of this embodiment may be an oven, which may include a body 10 and a chamber 20 arranged in the body 10. The chamber 20 is configured to store cooked food. For example, the cooking appliance 1 may have a door that can open or close the chamber 20, and when the door is opened, the chamber 20 is exposed for the user to take or place the cooked food. When the door is closed, the chamber 20 is closed. In this case, cooking operations such as heating, baking, and the like may be performed on the cooked food placed in the chamber.
The cooking appliance 1 may further include an input/output module 101. Operating states of the cooking appliance 1 may be adjusted by operating the input/output module 101. Cooking state information may also be displayed through the input/output module 101. The operating states of the cooking appliance 1 may include a heating power, a heating direction, a heating duration, an amount of conveyed steam, a steam conveying direction, and the like. The adjustment to the operating states may be realized by adjusting operating states of specific functional modules in the cooking appliance 1. Further, the cooking appliance 1 may further include a control unit 11, which is configured to adjust the operating state of the corresponding functional module according to an instruction fed back by the input/output module 101, so that the operating state of the cooking appliance 1 conforms to the instruction of the user. The control unit 11 may also control an output mode of the input/output module 101 through an input signal or the instruction, for example, a screen content display mode or a display mode.
For example, the cooking appliance 1 may include a heating module 12 configured to heat the cooked food 2. The cooking appliance 1 may include a plurality of heating modules 12, and the heating modules 12 are dispersed in different areas of the cooking appliance 1 to heat the chamber 20 from different angles so that the cooked food 2 is heated as evenly as possible. The control unit 11 may independently adjust the heating power of each of the heating modules 12 to adjust the amount of heat radiated by each heating module 12 into the chamber 20. The control unit 11 may further independently adjust the heating direction of each heating module 12 to adjust the angle at which each heating module 12 radiates heat into the chamber 20. The control unit 11 may further adjust the heating duration of a specific heating module 12 operating at a specific heating power to heat the food to a desired effect.
In another example, the cooking appliance 1 may include a spray module (not shown) configured to deliver water vapor into the chamber 20 to adjust humidity in the chamber 20. By adjusting the humidity in the chamber 20, the surface humidity of the cooked food can be adjusted, so that the cooked food has a moderate water content, to prevent the surface of the cooked food from becoming excessively dry or excessively wet during heating. The cooking appliance 1 may include a plurality of spray modules, and the plurality of spray modules are dispersed in different areas of the cooking appliance 1 to deliver water vapor into the chamber 20 from different angles, so that the humidity distribution on the surface of the cooked food is balanced. The control unit 11 may independently adjust the amount of conveyed steam of each of the spray modules. The control unit 11 may also independently adjust the steam conveying direction of each spray module. The control unit 11 may further adjust the spraying duration of a specific spray module operating at a specific amount of conveyed steam.
It should be noted that the input/output module 101 in FIG. 1 includes an input apparatus configured to realize an input function and a display apparatus configured to realize an output function. The input apparatus and the display apparatus may be integrated, for example, as a touch display panel with a touch function, configured to realize both data input and image information display. The input apparatus and the display apparatus may also be separate and each realize their own functions. For example, the input apparatus may be an adjustment button similar to a knob, or may take other forms, such as a touch screen, a voice control module, and the like, and the display apparatus may be a display screen, or the like.
The display apparatus may be arranged on the cooking appliance 1, or may be separated from or detachably connected to the cooking appliance 1.
In the embodiment, the display apparatus may include a display unit 1011 and a control unit 11 connected to the display unit. The control unit 11 may be configured to display a progress pattern representing a current cooking progress of the cooking appliance 1 through the display unit 1011. In this way, the cooking states of the cooking appliance 1 at different cooking nodes are displayed to the user in a visual manner, including real-time cooking state information and interventional measure information of the cooking appliance 1.
In a specific embodiment, the control unit 11 connected to the display unit 1011 may be a control module of the cooking appliance itself, and the control of the cooking appliance 1 and the control of the display unit 1011 are realized through the control module.
Further, the control unit 11 may perform the method shown in FIG. 3 to monitor the cooking state of the cooked food, and display a control parameter or intervention behavior of the cooking appliance 1 in the cooking process through the display unit 1011 according to an identification result. For example, the control unit 11 may include or be externally connected to a memory (not shown). The memory stores a computer program. When the computer program is executed by a processor, the steps of the method shown in FIG. 3 are performed.
It should be noted that only a possible arrangement position of the control unit 11 in the cooking appliance 1 is shown in FIG. 1 by way of example. In practical application, the specific arrangement position of the control unit 11 may be adjusted as required. In a variation, the control unit 11 may be arranged outside the cooking appliance 1 (for example, on a server 2). The cooking appliance 1 is provided with a communication module (not shown), and the control unit 11 communicates with the communication module to transmit a control instruction to the cooking appliance 1. The control instruction is used for controlling an operating state of the cooking appliance 1.
Further, the cooking appliance 1 may include an information acquisition unit 13 arranged in the chamber 20 or in the door to acquire actual cooking state information of food at least at one angle. The control unit 11 communicates with the information acquisition unit 13 to acquire the data acquired by the information acquisition unit 13.
For example, the information acquisition unit 13 may be a photographing apparatus arranged at the top of the chamber 20, covering at least the tray 14 within an overlooking range. A plurality of photographing apparatuses may be arranged to shoot the cooked food 2 from different angles.
For example, the photographing apparatus may further be arranged inside the door, and a shooting range includes at least an area where the cooked food 2 is located.
Further, the cooking appliance 1 may include a temperature detection apparatus (not shown) configured to detect temperatures of a food surface, the inside of the food, the inside of the body, and the like.
For example, a temperature sensor (not shown) may be arranged in the chamber 20 to collect the temperature (not shown) in the chamber 20. In another example, the tray 14 may be provided with a temperature sensor or probe (not shown) to collect the surface temperature or the internal temperature of the cooked food 2.
In a specific embodiment, referring to FIG. 3, the method for displaying a cooking state of a cooking appliance of the cooking appliance 1 in this embodiment may include the following steps.
Step S100: Preset N cooking nodes and N cooking node identification models at different time points for to-be-cooked food in a cooking process, where each of the cooking node identification models includes standard cooking state information of a corresponding cooking node, and N is a positive integer greater than or equal to 1.
Step S102: Acquire a cooking instruction for the food, and cook the food according to the received cooking instruction.
Step S104: Acquire actual cooking state information of the cooking node.
Step S106: Compare the acquired actual cooking state information with the standard cooking state information of the corresponding cooking node through the cooking node identification models for identification.

Step S108: Transmit an identification result corresponding to the at least one cooking node to a user terminal interface or a cooking appliance interface for display to a user, where the identification result includes at least normal cooking state information or abnormal cooking state information. The cooking appliance interface herein may be understood as the display apparatus or the display unit of the cooking appliance. The user terminal interface may be a mobile terminal wirelessly connected to the cooking appliance, or the like.
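Steps S106 and S108 can be sketched with the identification model reduced to a simple comparator; the single "doneness" value, the tolerance, and all names below are assumptions made purely for illustration:

```python
def identify_node(actual: dict, standard: dict, tol: float = 0.1) -> str:
    """Step S106 (sketch): compare actual vs. standard cooking state,
    each reduced here to one hypothetical 'doneness' value in [0, 1]."""
    diff = actual["doneness"] - standard["doneness"]
    if diff > tol:
        return "abnormal: beyond standard state"
    if diff < -tol:
        return "abnormal: below standard state"
    return "normal"

def format_for_display(node: int, result: str) -> str:
    """Step S108 (sketch): format the identification result for the
    user terminal interface or the cooking appliance interface."""
    return "node {}: {}".format(node, result)
```

An actual identification model would compare image features such as color and shape parameters rather than a single scalar.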
In this embodiment, a cooking manner of food generally corresponds to at least one cooking program. The cooking program may include different cooking nodes in the cooking process, which are used for monitoring the cooking degree or cooking state of the food in different time periods. The cooking node identification models at the different cooking nodes are used to monitor the cooking state of the cooked food and to carry out intelligent intervention and control; the intelligent control process is then displayed to the user according to the identification result, so that the user can visually perceive the intelligent control process of the cooking appliance, improving user experience.
Specifically, when cooking is about to start, the prepared food is put into the cooking appliance 1. When the door of the cooking appliance 1 is closed, the information acquisition unit 13 starts to work and acquires captured images in real time. Alternatively, the information acquisition unit 13 starts to work when the user sets cooking parameters such as a heating time and a heating temperature and starts cooking.
The information acquisition unit 13 may store the acquired image in a body memory or transmit the acquired image to a memory of a remote server. The control unit 11 monitors the food cooking state according to the acquired image, identifies the actual cooking state information of the cooking nodes by using the preset cooking node identification model, and transmits identification results corresponding to the cooking nodes to the user terminal interface or the display unit of the cooking appliance for display to the user.
In an embodiment, the cooking appliance 1 may adjust the cooking parameter according to the identification result and display the adjusted parameter to the user through the display unit 1011, either during the cooking process or at its end. In this way, the cooking process is intelligently monitored, the cooking parameter is adjusted in time when abnormal cooking is found, and the intelligent cooking control process is visually displayed to the user. The cooking node identification model may be stored in the memory in advance and includes the standard cooking state information of the corresponding cooking node. For example, the standard cooking state information may be standard food cooking state images corresponding to a plurality of cooking nodes. A standard image may be understood as the state that the food should reach at the cooking node, and may include a color parameter, a shape parameter, and the like used for representing the state of the food. The standard images are obtained by training on a large amount of experimental cooking data.
During specific implementation, different cooking nodes correspond to different standard images.
During specific implementation, the cooking node identification models are in a one-to-one correspondence with the types of food.
For example, a cooking node identification model A corresponding to cooking steak, a cooking node identification model B corresponding to cooking a sweet potato, and the like are stored in the memory.
In another example, a steak cooking program includes three cooking nodes, and the corresponding model A includes three cooking node identification sub-models A1, A2, and A3. Each sub-model includes a standard steak image at the corresponding cooking node. In the process of cooking steak, by monitoring and comparing the real-time cooking state images at the three cooking nodes against the corresponding standard images, the cooking parameter may be adjusted according to the identification result, and the result is displayed to the user. The identification result herein may include information that the steak is in a normal cooking state at one of the cooking nodes, or information that the steak is in an abnormal cooking state.
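The one-to-one correspondence between foods and models, and between cooking nodes and sub-models, might be organized as a simple registry; the foods, node counts, and feature values below are illustrative assumptions:

```python
# Hypothetical registry: one model per food type, one sub-model per cooking node
MODEL_REGISTRY = {
    "steak": {          # model A with sub-models A1, A2, A3
        1: {"brownness": 0.3},
        2: {"brownness": 0.5},
        3: {"brownness": 0.7},
    },
    "sweet_potato": {   # model B with two sub-models
        1: {"softness": 0.4},
        2: {"softness": 0.8},
    },
}

def standard_state_for(food: str, node_index: int) -> dict:
    """Look up the standard state for a food's cooking node, if a model exists."""
    model = MODEL_REGISTRY.get(food)
    if model is None:
        raise KeyError(f"no cooking node identification model for {food!r}")
    return model[node_index]

steak_node2 = standard_state_for("steak", 2)
```

Extending the menu range described in the text then amounts to adding further entries to the registry.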
It should be noted that the above is only an example. In practical application, the cooking node identification models corresponding to various food types may be arranged in the memory, so as to expand a menu range within which the cooking appliance 1 can intelligently cook.
In a specific embodiment, the display mode of the display apparatus of the cooking appliance 1 may be to display, under the control of the control unit 11, the identification result corresponding to each cooking node to the user by using a graphic mark according to the cooking progress. In an implementation, the control unit 11 may control the display unit 1011 to display a progress pattern representing a complete cooking process. The progress pattern includes a plurality of cooking nodes temporally spaced apart from each other. Each of the cooking nodes represents different cooking state information of the cooked food in the cooking process, and the cooking nodes are visually distinguished from each other. This may be the display mode of the display unit upon completion of cooking or during cooking.
Preferably, the control unit 11 may control the display unit 1011 to display a progress pattern representing the current cooking progress. The progress pattern includes a plurality of cooking nodes temporally spaced apart from each other. Each of the cooking nodes represents different cooking state information of the cooked food in the cooking process, and the cooking nodes are visually distinguished from each other.
For example, the progress pattern may include a bar pattern, and the current cooking progress, in the form of a brightness or color of light, extends over time along the bar pattern, so that the cooking progress of the cooking appliance can be visually learned through the change of the light or color.
In another example, the progress pattern may include a background pattern that represents the complete cooking progress in a cooking process, and the current cooking progress gradually covers or replaces the background pattern over time with a light brightness or color different from that of the background pattern.
Further, the cooking node may be located on the background pattern to present a key node of the cooking process in a visual way, which facilitates the interactive operation of the cooking device or the display apparatus.
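A text-mode sketch of such a progress pattern, with the current progress covering the background pattern and node marks placed on it; the rendering characters are arbitrary stand-ins for light brightness or color:

```python
def render_progress(total_s: int, elapsed_s: int, node_times: list, width: int = 20) -> str:
    """Render a bar where elapsed progress covers the background and nodes sit on it."""
    filled = min(width, elapsed_s * width // total_s)
    cells = ["#"] * filled + ["-"] * (width - filled)   # progress over background
    for t in node_times:
        pos = min(width - 1, t * width // total_s)
        cells[pos] = "O" if t <= elapsed_s else "o"      # reached vs. upcoming node
    return "".join(cells)

# Halfway through an 8-minute cook with nodes at 2, 4, and 7 minutes
bar = render_progress(total_s=480, elapsed_s=240, node_times=[120, 240, 420])
```

Reached nodes ("O") lie on the filled portion and could carry identification results; the upcoming node ("o") still sits on the background pattern.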
Through this arrangement of the display apparatus, during the intelligent cooking process the user can visually perceive how far the food has been cooked, as well as what role the artificial intelligence brought by the image identification technology plays, what control it performs, and how it works throughout the cooking process. Displaying the dynamic identification process of the cooking appliance 1 to the user in this manner improves user experience and enhances trust in the technology and the satisfaction of a successful result.
For example, in FIG. 4, a complete cooking process is presented in the form of time axis through the display unit, and key state points, that is, cooking nodes, may be set on the time axis. For example, three cooking nodes for cooking steak are presented through bubble graphics in the figure.
In addition, time information of the corresponding node may also be displayed at the node. For example, a first node in FIG. 4 indicates a cooking state identification result at 10 minutes and 30 seconds.
Further, bubble graphics of different colors may be used to indicate whether the identification result of the current cooking node is a normal cooking state or an abnormal cooking state. In FIG. 4, a black filled bubble indicates one identification result, and a blank bubble indicates another. This is only an example, and different identification results may be distinguished from each other in different ways in actual application.
Further, the user may view the identification result of the corresponding node by operating the graphic mark such as clicking the bubble graphic in the display unit.
In a specific embodiment, with reference to FIG. 5 and FIG. 6, the user can click the graphic mark to be viewed on the display unit 1011, such as a bubble mark. The cooking appliance 1 may acquire, through the control unit 11, a control signal generated when the user operates the graphic mark, and display, according to the control signal, the cooking state information of the cooking node corresponding to the graphic mark through the user terminal interface or the cooking appliance interface, where the cooking state information indicates whether the current cooking state of the cooking node is a normal cooking state or an abnormal cooking state. In this way, interaction with the user is realized through the display unit, and the cooking state at the corresponding cooking node can be checked.
Preferably, the control unit 11 is further configured to respond to the user operation only after the current cooking progress reaches or exceeds the cooking node, and to display the cooking state information of the corresponding cooking node on the display unit 1011 based on the operation. Since the cooking nodes represent different cooking state information of the cooked food in the cooking process, when the current cooking progress has not yet reached a certain cooking node, the cooking appliance 1 or the control unit 11 has not yet identified, intervened in, or adjusted the cooking process at that node, and the corresponding cooking node has no cooking state information available for the user to view. Therefore, the user can operate the corresponding cooking node and obtain a response only after the current cooking progress reaches or exceeds the cooking node, so as to learn the cooking state information through the display unit. Further, the cooking state information of the cooking node corresponding to the graphic mark may include the actual cooking state image and the standard cooking state image corresponding to the cooking node of the to-be-cooked food, so that the user can visually observe whether the actual cooking progress at the node meets the expectation.
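The gating described above — responding to a user operation only once the progress has reached the node — reduces to a simple check; the function and field names are assumptions:

```python
def node_tap_response(node_time_s, elapsed_s, state_info):
    """Respond to a tap on a node's graphic mark only once progress has reached it."""
    if elapsed_s < node_time_s:
        return None          # node not reached yet: nothing to display
    return state_info        # e.g. actual vs. standard cooking state images

# Tap before the node is reached vs. after it has been identified
early = node_tap_response(node_time_s=630, elapsed_s=300, state_info=None)
shown = node_tap_response(node_time_s=630, elapsed_s=650,
                          state_info={"result": "abnormal", "adjustment": "+10 s"})
```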
Further, when the current cooking state of the cooking node is the abnormal cooking state, the control parameter information adjusted in the cooking process is displayed through the user terminal interface or the cooking appliance interface according to the control signal, so that the user can visually observe the measures by which the cooking appliance 1 intelligently intervenes at the node, and the cooking result meets the expectation as much as possible.
For example, as shown in FIG. 5, when the user clicks on a first bubble graphic in the figure, that is, a first cooking node, the display unit presents a picture shown on the right side of the figure to the user. The picture includes the standard cooking state image and the actual cooking state image, and the cooking time is increased by 10 seconds. The presented information indicates the comparison between the actual cooking state of steak and the standard cooking state at the current cooking node, that is, the cooking state is an abnormal state. In addition, the cooking appliance adjusts the cooking time through the comparison, and displays the adjusted situation to the user through the display unit.
In another example, as shown in FIG. 6, when the user clicks on a third bubble graphic in the figure, that is, a third cooking node, the display unit presents a picture shown on the right side of the figure to the user. The picture includes the standard cooking state image and the actual cooking state image, and the current cooking state is a standard state. The presented information indicates that at the current cooking node, the actual cooking state of steak is equivalent to the standard cooking state, which conforms to the cooking state that the current cooking node should reach, that is, the normal cooking state. Therefore, "the current cooking state is the standard state" is directly displayed through the display unit.
In a typical embodiment, the abnormal cooking state information includes at least one of the following: the actual cooking state does not reach the standard cooking state corresponding to the cooking node, or the actual cooking state is beyond the standard cooking state corresponding to the cooking node.
For example, when the abnormal cooking state is that the actual cooking state does not reach the standard cooking state corresponding to the cooking node, the cooking appliance increases the control parameter, for example, at least one of the cooking time, the cooking temperature, the cooking fire intensity, and the like.

When the abnormal cooking state is that the actual cooking state is beyond the standard cooking state corresponding to the cooking node, the cooking appliance decreases the control parameter, for example, at least one of the cooking time, the cooking temperature, the cooking fire intensity, and the like.
In an implementation, the adjusted control parameter information includes the control parameter information automatically adjusted by the cooking appliance, and a correspondence between the intelligent identification result of the cooking appliance 1 and the control parameter information may be preset, so that the cooking appliance 1 can automatically acquire the corresponding control parameter information according to the identification result.
For example, the cooking appliance 1 is used for cooking steak, the cooking degree selected by the user is medium well, and the expected cooking duration is 8 minutes. In the cooking process, the control unit 11 monitors that at a third cooking node, for example at 6 minutes of cooking time, the comparison between the real-time image acquired by the information acquisition unit 13 and the standard image indicates that the current cooking state of the steak has already exceeded the cooking state in the corresponding standard image. If heating continued for the preset 8 minutes, the final steak would probably be overcooked or fall short of the expectation.

Therefore, the control unit shortens the total heating time, for example by 20 seconds, that is, adjusts the originally planned heating time of 8 minutes to 7 minutes and 40 seconds. Such fine-tuning can well avoid over-cooking, so that the cooking result meets the expectation.
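The intervention rule of the preceding examples (lengthen when under-cooked, shorten when over-cooked) can be sketched as follows, using the 20-second step from the steak example; the function name and state labels are assumptions:

```python
def adjust_heating_time(planned_s: int, state: str, step_s: int = 20) -> int:
    """Increase the time when under-cooked, decrease it when over-cooked."""
    if state == "under":        # actual state has not reached the standard state
        return planned_s + step_s
    if state == "over":         # actual state is beyond the standard state
        return planned_s - step_s
    return planned_s            # normal state: keep the plan

# The example from the text: 8 minutes planned, over-cooked at a node, shortened by 20 s
adjusted = adjust_heating_time(planned_s=8 * 60, state="over", step_s=20)  # 460 s = 7 min 40 s
```

The same rule could equally be applied to the cooking temperature or fire intensity.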
During specific implementation, N cooking nodes at different time points in the cooking process may be preset for the to-be-cooked food. The cooking nodes are obtained from experimental cooking data for different foods. Experiments show that most ingredients undergo color and shape changes during cooking, which can be used for comparison to determine the cooking state of the food. Therefore, the whole cooking process may be divided into different state points, that is, cooking nodes, according to the characteristics of the food. The state points vary between foods: for example, three state points for steak and two for chicken wings were found through experiments. These state points become important reference points for the cooking appliance to determine the cooking state of the food. During cooking, identifying the cooking state of the cooked food at a preset cooking node ensures timely intervention at a key cooking node, timely adjustment of the cooking parameter, and timely correction when the cooking state deviates from the expectation. On the one hand, the cooking result then meets the expectation; on the other hand, the important nodes at which the artificial intelligence intervened are displayed to the user by adding the cooking states to the time schedule axis, so that the user can understand what the artificial intelligence has done for the cooking and can visually perceive the intelligent control process.
In an embodiment, the real-time image and the corresponding standard image of the food at the current cooking node may be displayed through the display unit 1011 , to present the cooking state of the food at the current cooking node.
For example, in the cooking process, when a certain cooking node is reached, the control unit 11 may invoke the stored actual cooking state image and standard cooking state image of the node and present them to the user through the display unit 1011. Alternatively, as described above, when the user clicks the graphic mark corresponding to the cooking node, the control unit 11 may invoke the stored actual cooking state image and standard cooking state image of the node and present them to the user through the display unit. In this way, not only can the cooking appliance intelligently adjust the cooking process, but the user can also visually see the cooking state and the cooking progress of the food, as well as the intervention of the cooking appliance on the cooking parameter, so that user experience is better.
In an embodiment, the control unit 11 may further control the display unit 1011 to display a first area including a real-time image of the cooked food and a second area including the progress pattern. For example, FIG. 7 shows a display interface of the display unit 1011. The interface includes an image of the real-time cooking state of the cooked food, that is, steak, and an intelligent identification pattern of the cooking process presented as a progress bar. The display interface may further include cooking control parameter information of the cooking appliance, and the like.
In a variation, the cooking parameter may further be adjusted by using the method in combination with the real-time internal temperature of food. The real-time internal temperature of food may be acquired by using a probe or other sensors, and the cooking appliance 1 is controlled according to the real-time internal temperature and the current cooking state of the food.
Specifically, when the real-time internal temperature is lower than the standard temperature, the cooking time or the cooking temperature is increased; otherwise, the cooking time is reduced or the cooking temperature is lowered. On the basis of monitoring the cooking state by image identification, the cooking appliance 1 is additionally controlled according to the internal temperature of the food, so that the cooking parameter can be better controlled to obtain the expected cooking effect.
For example, during cooking, when the internal temperature of the food at a certain cooking node does not reach the expected temperature, the food may end up insufficiently cooked; the cooking time and the cooking temperature are then adjusted by monitoring the internal temperature in real time. Conversely, when the internal temperature of the food at the cooking node is excessively high, the food may be overcooked, which can be avoided by monitoring the internal temperature in real time and intervening in advance.
In another example, when cooking some foods (such as steak), the images of medium-well and fully-cooked steak differ only slightly, or the food shows little change in appearance and color during a certain stage of cooking, so the state cannot be effectively identified from images. In such cases, the cooking effect can be better controlled according to the internal temperature.
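A comparable sketch for the temperature-based variation, raising or lowering the planned cooking temperature against a standard internal temperature; the 5 °C step and the names are assumptions:

```python
def adjust_by_internal_temp(planned_temp_c: float, internal_c: float,
                            standard_c: float, step_c: float = 5.0) -> float:
    """Raise the cooking temperature when the probe reads low, lower it when high."""
    if internal_c < standard_c:
        return planned_temp_c + step_c
    if internal_c > standard_c:
        return planned_temp_c - step_c
    return planned_temp_c

# Probe reads 48 °C against a standard of 55 °C at this node: raise the oven setting
raised = adjust_by_internal_temp(180.0, internal_c=48.0, standard_c=55.0)  # 185.0
```

In practice this check would supplement, not replace, the image-based identification at each node.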
During specific implementation, food identification may be performed before the method is carried out, and the cooking node identification model is automatically matched based on the food identification result. Specifically, a food identification model is preset, an image of the cooked food is acquired and identified through the food identification model, and a corresponding cooking instruction is acquired based on the identified food. Further, the N cooking node identification models for the cooking process of that food are determined according to the identification result.
In a variation, the cooking appliance 1 may be provided with a plurality of information acquisition units 13 configured to acquire images and identification results of cooked food from different angles, and determine one identification result based on a priority to improve the accuracy of food identification.
During specific implementation, the method may further be used to determine, at the end of cooking, whether the food is sufficiently cooked or has reached the expectation. Specifically, when the cooking process is coming to an end, the food cooking state image at the last cooking node is acquired and it is determined whether the food is sufficiently cooked; if so, the cooking appliance is controlled to stop working; if not, the cooking time or the cooking temperature is increased. This avoids the situation in which the food is insufficiently cooked when cooking finishes, or keeps being cooked after it is already done, which wastes resources and damages the taste of the food.
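The end-of-cooking check can be sketched as a final decision step; the 60-second extension and the names are assumptions:

```python
def finish_or_extend(is_done: bool, remaining_s: int, extension_s: int = 60):
    """At the last cooking node: stop when done, otherwise extend the cooking time."""
    if is_done:
        return ("stop", 0)
    return ("continue", remaining_s + extension_s)

# Last-node image says the food is not yet sufficiently cooked, 30 s remain
decision = finish_or_extend(is_done=False, remaining_s=30)  # ("continue", 90)
```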
The specific embodiment of the present invention further discloses a control system for a cooking appliance 1, including a cooking appliance 1 and a server 2 that can interact with each other. The control unit 11 may also be arranged on the server 2 and is configured to perform the method described in FIG. 3. Specifically, the control unit 11 is communicatively connected to the cooking appliance 1. The cooking appliance 1 is configured to receive an identification result transmitted by the server 2 and display it to a user through the display unit 1011. Arranging the control unit 11 in the server 2 and performing data processing such as identification there not only facilitates training, learning, and updating of data such as the models and improves the accuracy of the processed data, but also reduces the complexity of the cooking appliance itself, thereby improving the degree of intelligence with which the cooking appliance is controlled.
The cooking appliance 1 may work according to the cooking control parameter transmitted by the control unit 11. Specifically, the cooking appliance 1 may adjust the cooking control parameter for food cooking by the cooking appliance according to the identification result.
Further, the cooking appliance 1 is further configured to display the adjusted cooking control parameter. Specifically, the cooking appliance controls the display unit 1011 through the control unit 11 to display the adjusted control parameter, so as to visually display the intelligent control process of the cooking appliance to the user, and improve user experience.
With reference to FIG. 9, the specific implementation of the control system is described in detail by using the intelligent oven as an example.
The oven is started and acquires video information of the cooked food in real time, and the video information is uploaded to a server in the form of a video stream. The server identifies the cooked food through the food identification model and invokes the corresponding cooking parameter for that food. The server transmits the cooking parameter to the oven in the form of a cooking instruction, and the oven displays the instruction information through the display unit. The user confirms the displayed instruction and feeds back user response information to the oven to instruct it to start cooking.
The oven starts cooking, and in the cooking process, the oven uploads a video stream of the to-be-cooked food to the server in real time through a photographing apparatus.
The server receives the video stream, and identifies a cooking state of a corresponding cooking node through a preset cooking node identification model.
In FIG. 9, if the cooking node 1 identification model identifies the node as an abnormal node, an interventional measure for the abnormal node, that is, increasing or decreasing the cooking time, and a real-time cooking state image of the cooking node are transmitted to the oven.
The oven receives the identification result and controls the display unit to display it. The identification result may be displayed using a graphic mark and may include the actual cooking state image, the standard image, and the oven's interventional measure for the node, for example, increased or decreased parameter information such as the cooking time, as shown in FIG. 4 and FIG. 5.
If the cooking node 1 identification model identifies the node as a normal node, the server transmits the identification result to the oven, and the oven controls the display unit to display the identification result; the displayed contents include the actual cooking state image and the standard image, as shown in FIG. 4 and FIG. 6.
A kind of food has a plurality of cooking nodes in its cooking process, and each node is correspondingly preset with a cooking node identification model. The identification process and the display mode of the other cooking nodes, such as cooking nodes 2 to N, are similar to those of cooking node 1, and the details are not described herein again.
A cooking ending instruction is transmitted to the oven at the end of cooking, and a cooking ending state is displayed through the display unit.
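The oven–server exchange of FIG. 9 can be sketched as an ordered message log; the message names and the per-node loop are simplifying assumptions:

```python
def run_session(node_results):
    """Simulate the FIG. 9 sequence: setup, per-node identification, cooking end."""
    log = [("oven", "video_stream"), ("server", "food_identified"),
           ("server", "cooking_instruction"), ("oven", "display_instruction"),
           ("user", "confirm"), ("oven", "start_cooking")]
    for i, result in enumerate(node_results, start=1):
        log.append(("server", f"node_{i}:{result}"))   # identification result
        log.append(("oven", f"display_node_{i}"))      # shown via display unit
    log.append(("server", "cooking_end"))
    log.append(("oven", "display_end_state"))
    return log

# Three steak nodes, the first identified as abnormal
session = run_session(["abnormal", "normal", "normal"])
```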
An embodiment of the present invention further provides a computer-readable storage medium, which is a non-volatile storage medium or a non-transitory storage medium, having a computer program stored thereon, the computer program, when running on a processor, performing the steps of the control method according to any one of the foregoing embodiments.

Although specific implementations have been described above, these implementations are not intended to limit the scope of the present disclosure, even if only one implementation is described with respect to a specific feature. The feature examples provided in the present disclosure are intended to be illustrative rather than limiting, unless otherwise stated. In specific implementations, the technical features of one or more dependent claims may be combined with the technical features of the independent claims, and the technical features from the corresponding independent claims may be combined in any appropriate manner, rather than only in the specific combinations listed in the claims.
Although the present invention is disclosed above, the present invention is not limited thereto. Any person skilled in the art can make various changes and modifications without departing from the spirit and the scope of the present invention. Therefore, the protection scope of the present invention should be subject to the scope defined by the claims.

Claims

1. A display apparatus of a cooking appliance, characterized by comprising a display unit and a control unit, wherein the control unit is configured to display a progress pattern representing a current cooking progress through the display unit, the progress pattern comprises a plurality of cooking nodes temporally spaced apart from each other, and the cooking nodes represent cooking state information of cooked food in a cooking process and are visually distinguished between each other.

2. The display apparatus according to claim 1, characterized in that the progress pattern comprises a bar pattern, and the current cooking progress extends over time along the bar pattern in the form of a brightness or color of light.

3. The display apparatus according to claim 1, characterized in that the progress pattern comprises a background pattern representing a complete cooking progress, wherein the current cooking progress gradually covers or replaces the background pattern over time in the form of a brightness or color of light different from that of the background pattern.

4. The display apparatus according to claim 3, characterized in that the cooking node is located on the background pattern.

5. The display apparatus according to claim 1, characterized in that the cooking node comprises time information corresponding to the cooking progress.

6. The display apparatus according to claim 1, characterized in that the control unit is further configured to receive an operation of a user on the cooking node through the display unit, and respond to the operation and display the cooking state information of the corresponding cooking node on the display unit based on the operation.

7. The display apparatus according to claim 6, characterized in that the control unit is further configured to respond to the user operation only after the current cooking progress reaches or exceeds the cooking node, and display the cooking state information of the corresponding cooking node on the display unit based on the operation.

8. The display apparatus according to claim 6 or 7, characterized in that the cooking state information comprises preset standard cooking state information and actual cooking state information.

9. The display apparatus according to claim 8, characterized in that the cooking state information comprises information about an adjustment for cooking control according to a comparison of the actual cooking state information with the standard cooking state information.

10. The display apparatus according to claim 9, characterized in that the information about the adjustment for cooking control comprises at least one of a cooking time, a cooking temperature, and a cooking fire intensity.

11. The display apparatus according to claim 8, characterized in that the standard cooking state information comprises a standard cooking state image of to-be-cooked food, and the actual cooking state information comprises an actual cooking state image of the to-be-cooked food.

12. The display apparatus according to claim 1, characterized in that the control unit is further configured to display, through the display unit, a first area comprising an image of cooked food displayed in real time and a second area comprising the progress pattern.

13. A cooking appliance, characterized by comprising the display apparatus according to any of claims 1 to 12.

14. The cooking appliance according to claim 13, characterized by further comprising an information acquisition unit connected to the display apparatus and configured to acquire actual cooking state information of to-be-cooked food in a cooking process.

15. The cooking appliance according to claim 14, characterized in that the information acquisition unit comprises a photographing apparatus.
PCT/EP2022/085130 2021-12-10 2022-12-09 Display apparatus of cooking appliance and cooking appliance WO2023105033A1 (en)



Similar Documents

Publication Publication Date Title
CN110780628B (en) Control method and device of cooking equipment, cooking equipment and storage medium
US20230269832A1 (en) Configurable cooking systems and methods
CN110664259A (en) Cooking apparatus, control method thereof, control system thereof, and computer-readable storage medium
CN110824942B (en) Cooking apparatus, control method thereof, control system thereof, and computer-readable storage medium
US20220412568A1 (en) Cooking appliance with a user interface
JP6661778B2 (en) Microwave sound control method and microwave
US11722330B2 (en) Method for data communication with a domestic appliance by a mobile computer device, mobile computer device and domestic appliance
CN203914599U (en) A kind of intelligent baking box
CN111596563B (en) Intelligent smoke kitchen system and cooking guiding method thereof
WO2019119473A1 (en) Air conditioner control method and apparatus
CN107468048A (en) Cooking apparatus and its control method
CN110123149B (en) Cooking control method of cooking equipment and cooking equipment
CN110806699A (en) Control method and device of cooking equipment, cooking equipment and storage medium
KR20080066171A (en) Cooking appliance, control information calibration system for the cooking appliance, and control information calibration method for the cooking appliance
US20210401223A1 (en) Cooking device having camera
CN110234040B (en) Food material image acquisition method of cooking equipment and cooking equipment
CN111131855A (en) Cooking process sharing method and device
CN115981141A (en) Control method, device, equipment and medium based on adaptive matching
US20210207811A1 (en) Method for preparing a cooking product, cooking device, and cooking device system
WO2023105033A1 (en) Display apparatus of cooking appliance and cooking appliance
WO2023105023A1 (en) Method for displaying cooking state of cooking appliance and cooking appliance and control system therefor
CN114494996A (en) Food baking method and device, electronic equipment and storage medium
CN209733642U (en) Intelligent cooking equipment
CN108433516B (en) Cooking learning method
CN115191839A (en) Method for displaying cooking state of cooking appliance, cooking appliance and control system thereof

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 22835293

Country of ref document: EP

Kind code of ref document: A1