CN116687220A - Cooking method, cooking device, electronic equipment, cooking equipment and storage medium - Google Patents


Info

Publication number
CN116687220A
CN116687220A (application CN202310677507.XA)
Authority
CN
China
Prior art keywords
cooking
state
food material
user
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310677507.XA
Other languages
Chinese (zh)
Inventor
王聪 (Wang Cong)
唐杰 (Tang Jie)
林进华 (Lin Jinhua)
Current Assignee
Gree Electric Appliances Inc of Zhuhai
Zhuhai Lianyun Technology Co Ltd
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Zhuhai Lianyun Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai, Zhuhai Lianyun Technology Co Ltd filed Critical Gree Electric Appliances Inc of Zhuhai
Priority to CN202310677507.XA
Publication of CN116687220A
Legal status: Pending

Classifications

    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47J: KITCHEN EQUIPMENT; COFFEE MILLS; SPICE MILLS; APPARATUS FOR MAKING BEVERAGES
    • A47J36/00: Parts, details or accessories of cooking-vessels
    • A47J36/32: Time-controlled igniting mechanisms or alarm devices
    • A47J27/00: Cooking-vessels


Abstract

According to the cooking method, apparatus, electronic device, cooking device, and storage medium provided herein, when the cooking device finishes cooking in a cooking mode set by the user, a first image of the food material in the cooking device is acquired; the state of the food material is determined based on the first image; whether the food material has reached a target state is determined based on that state; if the target state has not been reached, a cooking mode is determined based on the state; and the cooking device is controlled to continue cooking the food material in that mode until the food material reaches the target state. In this way, food material that has not reached the target state is automatically cooked further until it does.

Description

Cooking method, cooking device, electronic equipment, cooking equipment and storage medium
Technical Field
The present application relates to the field of cooking technologies, and in particular to a cooking method, a cooking apparatus, an electronic device, a cooking device, and a storage medium.
Background
In daily life, a cooking device is an indispensable kitchen product. Typically, the cooking device cooks in a mode configured by the user, but the user-set mode may not suit the food material, so the food material does not reach its optimal state when cooking in that mode completes; for example, the food may not be fully steamed.
Disclosure of Invention
In view of the above, the present application provides a cooking method, apparatus, electronic device, cooking device, and storage medium, capable of bringing food materials to a target state.
The application provides a cooking method, comprising the following steps:
under the condition that cooking equipment finishes cooking by using a cooking mode set by a user, acquiring a first image of food in the cooking equipment;
determining a state of the food material based on the first image;
determining whether the food material reaches a target state based on the state;
determining a cooking mode based on the state if the state does not reach the target state;
and controlling the cooking equipment to continue cooking the food materials based on the cooking mode so as to enable the food materials to reach the target state.
In some embodiments, the method further comprises:
collecting a second image of the food material;
determining an attribute of the food material based on the second image;
determining a target cooking pattern based on the properties of the food material;
under the condition that the cooking mode set by the user is acquired, determining whether the cooking mode set by the user is matched with the target cooking mode;
and outputting prompt information under the condition that the target cooking mode is not matched with the cooking mode set by the user.
In some embodiments, the method further comprises:
under the condition that a target cooking mode selected by a user is acquired, cooking the food based on the target cooking mode;
when the cooking mode set by the user is acquired, cooking is performed based on the cooking mode set by the user.
In some embodiments, the determining the state of the food item based on the first image comprises:
the first image is input into a first neural network model to determine the state of the food material, where the first neural network model is obtained by training on first training data, and the first training data includes images of food materials and the state corresponding to each image.
In some embodiments, the method further comprises:
acquiring the checking times of the food materials in the cooking process, which are set by a user;
inputting the checking times and the state of the food material during cooking into a second neural network model to determine the shooting times during cooking, where the second neural network model is obtained by training on second training data, and the second training data includes: the checking times, the states of the food materials, and the shooting times;
and shooting the food material based on the shooting time, and sending the shot image to the user.
In some embodiments, the method further comprises:
determining a health index of the food material based on the attribute of the food material;
generating a health report based on health indexes of each food material within a preset time period;
and recommending a menu for the user based on the health report.
An embodiment of the present application provides a cooking apparatus including:
the acquisition module is used for acquiring a first image of food materials in the cooking equipment under the condition that the cooking equipment finishes cooking by using a cooking mode set by a user;
a first determining module for determining a state of the food material based on the first image;
a second determining module for determining whether the food material reaches a target state based on the state;
a third determining module for determining a cooking mode based on the state if the state does not reach the target state;
and the cooking module is used for controlling the cooking equipment to continue cooking the food materials based on the cooking mode so as to enable the food materials to reach the target state.
An embodiment of the present application provides an electronic device, including a memory and a processor, where the memory stores a computer program, and when the computer program is executed by the processor, the cooking method according to any one of the above is executed.
An embodiment of the present application provides a cooking device including the electronic device described above.
Embodiments of the present application provide a storage medium storing a computer program executable by one or more processors for implementing any one of the above cooking methods.
According to the cooking method, apparatus, electronic device, cooking device, and storage medium provided herein, when the cooking device finishes cooking in a cooking mode set by the user, a first image of the food material in the cooking device is acquired; the state of the food material is determined based on the first image; whether the food material has reached a target state is determined based on that state; if not, a cooking mode is determined based on the state; and the cooking device is controlled to continue cooking the food material in that mode until the target state is reached. Thus, food material that has not reached the target state is automatically cooked until it does.
Drawings
The application will be described in more detail hereinafter on the basis of embodiments and with reference to the accompanying drawings.
Fig. 1 is a schematic flow chart of an implementation of a cooking method according to an embodiment of the present application;
fig. 2 is a schematic flow chart of a cooking method according to an embodiment of the present application;
fig. 3 is a schematic diagram of a composition structure of an electronic device according to an embodiment of the present application.
In the drawings, like parts are given like reference numerals, and the drawings are not drawn to scale.
Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, the application is described in further detail below with reference to the accompanying drawings. The described embodiments should not be construed as limiting the application, and all other embodiments obtained by those skilled in the art without inventive effort fall within the scope of the application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is to be understood that "some embodiments" can be the same subset or different subsets of all possible embodiments and can be combined with one another without conflict.
Where terms such as "first\second\third" appear in this document, they merely distinguish similar objects and do not imply a particular ordering. Where permitted, "first\second\third" may be interchanged in a specific order or sequence, so that the embodiments described herein can be practiced in an order other than that illustrated or described.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the application only and is not intended to be limiting of the application.
Based on the problems existing in the related art, an embodiment of the present application provides a cooking method applied to an electronic device. The electronic device may be a computer, a mobile terminal, a cooking device, or the like, or a controller of any of these; the cooking device may include an electric rice cooker, a steam oven, and so on. The functions of the cooking method provided by the embodiment can be achieved by the processor of the electronic device calling program code, where the program code can be stored in a computer storage medium.
An embodiment of the present application provides a cooking method, and fig. 1 is a schematic implementation flow chart of the cooking method provided by the embodiment of the present application, as shown in fig. 1, including:
step S101, when cooking is completed by the cooking device using the cooking mode set by the user, acquiring a first image of food in the cooking device.
In the embodiment of the application, the cooking equipment can be an electric cooker, a steaming oven and the like, and can cook in two cooking modes, namely a cooking mode set by a user and a cooking mode determined by the electronic equipment according to an image of food materials.
The cooking process corresponding to the cooking mode may be a steaming and baking process, a rice cooking process, etc. For a steam oven, the cooking modes may include: a roast mode, a steam roast mode, a quick steam roast mode, and the like. For an electric rice cooker, the cooking modes may include: quick cooking mode, porridge mode, soup stewing mode, etc. Each cooking mode corresponds to a different cooking temperature, time, etc.
In the embodiment of the application, a user can set a cooking mode through a key on the cooking equipment, for example, a temperature setting key and a time setting key are arranged on the cooking equipment, the user can set the temperature through the temperature setting key, and the time of cooking is set through the time setting key.
In some embodiments, the user can set the cooking mode through a control APP, in which the cooking temperature, cooking time, and so on can be configured.
In the embodiment of the application, when the user selects the cooking mode set by the user and starts to cook, the cooking device uses the cooking mode set by the user to cook, and the cooking device can control the heating device to heat according to the cooking temperature and cook according to the set cooking time. In case the cooking time set by the user is reached, the cooking is completed.
In the embodiment of the application, since the user uses the self-set cooking mode to cook, there may be unreasonable cooking parameter settings, for example, for larger food materials, the food materials may not be cooked due to shorter time settings and lower temperature settings. Therefore, in the embodiment of the present application, it is also necessary to acquire the first image of the food material in the cooking apparatus to determine whether the target food material state is satisfied.
In the embodiment of the application, the acquisition device can be arranged in the cooking equipment, the acquisition device can comprise a camera, the electronic equipment can be in communication connection with the acquisition device of the cooking equipment, and the acquisition device is used for acquiring the first image of the food in the cooking equipment.
In the embodiment of the application, the food material can be bread, rice, dishes and the like.
In some embodiments, the electronic device may be communicatively coupled to an input device through which a first image of the food material is acquired. The input device may be a user terminal, and the user may input the first image of the food material through an APP on the user terminal. In an embodiment of the present application, the first image may include: pictures, videos, etc.
Step S102, determining a state of the food material based on the first image.
In the embodiment of the present application, the first image may be input into a first neural network model to determine the state of the food material, where the first neural network model is obtained by training on first training data, and the first training data includes images of food materials and the state corresponding to each image.
In an embodiment of the present application, the first neural network model may be a convolutional neural network model. The state of the food material may include: color of food material, whether it is ripe, etc.
In the embodiment of the application, the image data of each food material can be obtained in advance, and then the label of the state is added to the image of the food material, so that the first training data is obtained, and the neural network can be trained by taking the image of the food material as input and the state corresponding to the image of the food material as output during training. When training is performed, training may be stopped after a preset number of training times is reached, or training may be stopped when the loss function of the neural network model converges.
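The labeling-and-training procedure above can be sketched as follows. As a minimal stand-in for the convolutional first neural network model, this uses a nearest-neighbor classifier over crude color histograms; the `FoodStateClassifier` name, feature extraction, and state labels are all assumptions for illustration.

```python
from collections import Counter

class FoodStateClassifier:
    """Toy stand-in for the first neural network model: maps an image
    (here, a flat list of pixel intensities 0-255) to a state label via
    nearest-neighbor search over labeled training images."""

    def __init__(self):
        self.samples = []  # list of (feature_vector, state_label)

    def _features(self, image):
        # Crude color histogram over 4 intensity buckets.
        hist = Counter(min(p // 64, 3) for p in image)
        total = len(image)
        return [hist[b] / total for b in range(4)]

    def train(self, images, states):
        # "Training" here just stores labeled feature vectors,
        # mirroring (image, state-label) pairs in the first training data.
        for img, state in zip(images, states):
            self.samples.append((self._features(img), state))

    def predict(self, image):
        feat = self._features(image)
        dist = lambda s: sum((x - y) ** 2 for x, y in zip(s[0], feat))
        return min(self.samples, key=dist)[1]
```

A real implementation would use a convolutional network trained until a preset iteration count or loss convergence, as the text describes; the structure of the data flow (labeled images in, state label out) is the same.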
In the embodiment of the application, the first neural network model can be obtained by training the electronic equipment, or can be obtained by connecting the electronic equipment with the internet in a communication way and obtaining the first neural network model through the internet.
Step S103, determining whether the food material reaches a target state based on the state.
In the embodiment of the application, the target state may be the state of the food material when it is fully cooked, i.e., when the taste, ripeness, surface color, and so on are all optimal. For example, for steam-roasted chicken wings, the optimal state after steam-roasting is that the wings are slightly charred in color.
In the embodiment of the application, the similarity between the current state of the food material and the target state can be calculated, and whether the food material has reached the target state can be determined from this similarity. If the similarity is smaller than a similarity threshold, it is determined that the food material has not reached the target state; if the similarity is larger than the threshold, the food material has reached the target state.
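The similarity check described above can be sketched as follows. This is a minimal illustration, assuming cosine similarity over state feature vectors; the function name, feature representation, and default threshold are assumptions for illustration, not details from the patent.

```python
import math

def reached_target(current_feat, target_feat, threshold=0.9):
    """Cosine similarity between the current-state and target-state
    feature vectors; the food material is considered to have reached
    the target state when similarity exceeds the threshold."""
    dot = sum(a * b for a, b in zip(current_feat, target_feat))
    norm = (math.sqrt(sum(a * a for a in current_feat))
            * math.sqrt(sum(b * b for b in target_feat)))
    return (dot / norm) > threshold if norm else False
```

For identical feature vectors the similarity is 1.0 and the target state is reported as reached; for orthogonal vectors it is 0.0 and cooking continues.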
In the embodiment of the present application, if the target state is reached, cooking is ended, and if the target state is not reached, step S104 is performed.
Step S104, determining a cooking mode based on the state in the case that the state does not reach the target state.
In the embodiment of the application, the electronic device can input the state into a third neural network model to determine the cooking mode.
In an embodiment of the present application, the third neural network model may be a convolutional neural network model. The state of the food material may include: color of food material, whether it is ripe, etc.
In the embodiment of the present application, status data of each food material (for example, the color of the food material and whether it is ripe) may be obtained in advance, and a cooking-mode label may be added to the state data to obtain the third training data. During training, the state of the food material serves as the input and the cooking mode as the output of the neural network. Training may be stopped after a preset number of iterations, or when the loss function of the neural network model converges.
In an embodiment of the present application, the cooking modes may include: a roast mode, a steam roast mode, a quick steam roast mode, and the like. For an electric rice cooker, the cooking modes may include: quick cooking mode, porridge mode, soup stewing mode, etc.
Step S105, controlling the cooking device to continue cooking the food material based on the cooking mode, so as to enable the food material to reach the target state.
In the embodiment of the application, after the cooking mode is determined, the cooking mode can be sent to the cooking equipment, so that the cooking equipment can continue to cook the food.
In the embodiment of the application, the cooking mode and the cooking time are determined at least based on the target state, so that the food material can reach the target state after automatic cooking.
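Steps S101 through S105 can be sketched as a control loop. This is an illustrative skeleton with the capture, classification, target-check, and mode-selection stages injected as callables; the function names and the round limit are assumptions, not part of the patent.

```python
def continue_until_target(capture, classify, reached_target,
                          choose_mode, cook, max_rounds=3):
    """Steps S101-S105 as a loop: after the user-set mode finishes,
    keep checking the food state and re-cooking in a recommended
    mode until the target state is reached (or a round limit hits)."""
    state = None
    for _ in range(max_rounds):
        image = capture()             # S101: acquire first image
        state = classify(image)       # S102: state from image
        if reached_target(state):     # S103: compare with target state
            break
        mode = choose_mode(state)     # S104: pick a cooking mode
        cook(mode)                    # S105: continue cooking
    return state
```

In a simulation where each extra cooking round advances the state by one and the target is state 2, the loop runs two extra cooking rounds and then stops.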
According to the cooking method provided by the application, under the condition that cooking equipment finishes cooking by using a cooking mode set by a user, a first image of food in the cooking equipment is acquired; determining a state of the food material based on the first image; determining whether the food material reaches a target state based on the state; determining a cooking mode based on the state if the state does not reach the target state; and controlling the cooking equipment to continue cooking the food materials based on the cooking mode so as to enable the food materials to reach the target state, and automatically cooking the food materials to reach the target state under the condition that the food materials do not reach the target state.
In some embodiments, prior to step S101, the method further comprises:
step S1, acquiring a second image of the food material.
In the embodiment of the application, the cooking equipment can be provided with the acquisition device, the acquisition device can comprise a camera, the electronic equipment can be in communication connection with the acquisition device of the cooking equipment, and the acquisition device is used for acquiring the second image of the food in the cooking equipment.
In the embodiment of the present application, the second image of the food material may be regarded as an image captured before cooking begins. The second image may be a picture, a video, or the like.
And step S2, determining the attribute of the food material based on the second image.
In the embodiment of the application, the second image of the food material can be input into a fourth neural network model, and the attribute of the food material is determined, wherein the fourth neural network model is trained in advance.
In an embodiment of the present application, the properties of the food material may include: the type, size, etc. of the food material.
And step S3, determining a target cooking mode based on the attribute of the food material.
In the embodiment of the application, the corresponding relation between different food material attributes and the target cooking mode is prestored in the electronic equipment. When the electronic device obtains the attribute of the food material, the target cooking mode can be determined based on the corresponding relation between different pre-stored food material attributes and the target cooking mode.
In the embodiment of the application, the target cooking mode can be regarded as the optimal cooking mode of the food material, and the food material can reach the target state after being cooked by the target cooking mode.
And step S4, determining whether the cooking mode set by the user is matched with the target cooking mode or not under the condition that the cooking mode set by the user is acquired.
In the embodiment of the application, the parameters of the cooking mode set by the user and the parameters of the target cooking mode can be compared to determine whether the set cooking mode is matched with the target cooking mode.
In an embodiment of the present application, parameters of the cooking mode may include: cooking time, cooking temperature, etc.
In the embodiment of the application, the cooking time and the cooking temperature in the cooking mode set by the user can be compared with the cooking time and the cooking temperature in the target cooking mode to determine whether the cooking time and the cooking temperature are matched.
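The parameter-by-parameter comparison above can be sketched as follows. The tolerance values are illustrative assumptions; the patent does not specify how close the user-set parameters must be to count as a match.

```python
def modes_match(user_mode, target_mode, time_tol_s=60, temp_tol_c=5.0):
    """Compare the user-set mode against the recommended target mode:
    cooking time and cooking temperature must each fall within a
    tolerance (tolerances are illustrative assumptions)."""
    return (abs(user_mode["time_s"] - target_mode["time_s"]) <= time_tol_s
            and abs(user_mode["temp_c"] - target_mode["temp_c"]) <= temp_tol_c)
```

When the modes do not match, the prompt information of step S5 would be emitted.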
In the embodiment of the application, if the matching is performed, the step S6 is performed, and if the matching is not performed, the step S5 is performed.
And S5, outputting prompt information when the target cooking mode is not matched with the cooking mode set by the user.
In the embodiment of the present application, the prompt information may include text information, which can be displayed by a display module, and sound information, which can be emitted by a sound module.
According to the cooking method provided by the embodiment of the application, the second image of the food material can be collected; determining an attribute of the food material based on the second image; determining a target cooking pattern based on the properties of the food material; under the condition that the cooking mode set by the user is acquired, determining whether the cooking mode set by the user is matched with the target cooking mode; and outputting prompt information to enable the user to select the optimal cooking mode for cooking under the condition that the target cooking mode is not matched with the cooking mode set by the user.
Step S6, when the target cooking mode selected by the user is obtained, the food is cooked based on the target cooking mode.
Step S7, when the cooking mode set by the user is obtained, cooking is performed based on the cooking mode set by the user.
After step S7, step S101 is performed.
In some embodiments, prior to step S101, the method further comprises:
step S8, obtaining the checking times of the food materials in the cooking process, which are set by a user.
In the embodiment of the application, the user can set the checking times for the cooking process via the APP; the checking times represent how many times the acquisition module captures pictures during cooking.
Step S9, inputting the checking times and the state of the food materials in the cooking process into a second neural network model to determine the shooting time in the cooking process, wherein the second neural network model is obtained by training second training data, and the second training data comprises: the number of checks, the state of the food material, and the shooting time.
In the embodiment of the application, the input of the second neural network model is the checking times and the state of food materials, and the output of the second neural network model is the shooting time.
And step S10, shooting the food material based on the shooting time, and sending the shot image to the user.
In the embodiment of the application, the electronic equipment can start the camera to shoot based on shooting time. After shooting, the camera sends the picture to the electronic device, and the electronic device can send the image to the user through the APP.
For example, when a user cooks egg tarts with the checking times set to 4, the neural network may calculate the optimal shooting times: for instance, one photo at the beginning of cooking, two toward the end of cooking, and one after cooking ends.
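A simple heuristic stand-in for the second neural network's output is to spread the photos over the cooking run. The spacing rule below (one photo at the start, one after the end, the rest evenly spaced) is an assumption for illustration only.

```python
def shooting_schedule(check_count, cook_seconds):
    """Spread `check_count` photos over a cooking run: one at the
    start, one after the end, and the rest evenly spaced in between.
    A heuristic stand-in for the second neural network model."""
    if check_count < 2:
        return [0]
    mid = check_count - 2
    times = [0]
    times += [round(cook_seconds * (i + 1) / (mid + 1)) for i in range(mid)]
    times.append(cook_seconds)  # photo after cooking ends
    return times
```

For the egg-tart example with 4 checks over a 20-minute (1200 s) run, this yields shots at 0 s, 400 s, 800 s, and 1200 s.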
According to the method provided by the embodiment of the application, the user can know the cooking process by acquiring the checking times of the user, calculating the shooting time, shooting the food based on the shooting time and sending the shot image to the user.
In some embodiments, after step S105, the method further comprises:
step S106, determining a health index of the food material based on the attribute of the food material.
In an embodiment of the present application, the properties of the food material may include: type of food material.
In the embodiment of the present application, correspondence between different food material types and different health indexes may be stored in advance in the electronic device, and after determining the attribute of the food material, the health index of the food material may be determined, where the health index may include: heat, fat mass, etc.
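The pre-stored correspondence described above can be sketched as a lookup table. The food types and per-serving values below are illustrative assumptions, not nutritional data from the patent.

```python
# Hypothetical health indexes keyed by food material type; the
# entries and values are illustrative assumptions only.
HEALTH_INDEX = {
    "chicken_wing": {"calories_kcal": 203.0, "fat_g": 13.6},
    "rice":         {"calories_kcal": 130.0, "fat_g": 0.3},
    "egg_tart":     {"calories_kcal": 255.0, "fat_g": 15.0},
}

def health_index(food_type):
    """Look up the health index for a food material type, mirroring
    the pre-stored correspondence; returns None if unknown."""
    return HEALTH_INDEX.get(food_type)
```

An unknown food type returns None, in which case the server lookup mentioned below could serve as a fallback.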
In some embodiments, the electronic device may be communicatively coupled to a server, send the attribute of the food material to the server, and receive the health index of the food material in return, so that the electronic device determines the health index of the food material.
Step S107, a health report is generated based on the health index of each food material in the preset time period.
In the embodiment of the application, the preset time period is configurable, for example one week, one month, or one year of cooking.
In the embodiment of the application, the health report may include: fat mass, food calories, and the like.
Step S108, menu recommendation is conducted for the user based on the health report.
For example, if the fat content in the health report is high, a low fat dish may be recommended to the user.
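Steps S107 and S108 can be sketched together: aggregate the per-meal health indexes into a report, then branch on the fat total. The threshold and menu names are illustrative assumptions; the patent does not specify them.

```python
def build_report(entries):
    """S107: aggregate per-meal health indexes over the preset
    period into a simple health report."""
    report = {"calories_kcal": 0.0, "fat_g": 0.0}
    for e in entries:
        report["calories_kcal"] += e["calories_kcal"]
        report["fat_g"] += e["fat_g"]
    return report

def recommend_menu(report, fat_limit_g=70.0):
    """S108: recommend low-fat dishes when the period's fat total is
    high (threshold and dish names are illustrative assumptions)."""
    if report["fat_g"] > fat_limit_g:
        return ["steamed fish", "vegetable congee"]
    return ["standard menu"]
```

With two meals totaling 90 g of fat against a 70 g limit, the low-fat menu is recommended.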
According to the method provided by the embodiment of the application, the health index of the food material is determined based on the attribute of the food material; generating a health report based on health indexes of each food material within a preset time period; and recommending the menu for the user based on the health report, so that the user can be effectively assisted in cooking, and the user experience is improved.
Based on the foregoing embodiments, an embodiment of the present application provides a cooking method applied to a cooking system. Fig. 2 is a schematic flow chart of this cooking method. As shown in Fig. 2, a camera is arranged in the steam-oven device end; it collects the state of the food material and also automatically photographs it according to the recipe and the checking times preset by the user, uploading the pictures to the cloud. The algorithm end analyzes the food material in real time, generates food-material-state and cooking-state results, stores the results in the cloud, and synchronously uploads them to the intelligent terminal so that the user knows the state of the food material in real time. The algorithm also calculates a health index from the recipe and the food material and presents it on the APP together with the cooking report. When the user cooks in a self-set mode and the expected effect (the target state of the above embodiments) is not achieved after completion, the APP prompts the user to start the recommended mode to continue cooking, and the recommended cooking starts automatically. The cooking system provided by this embodiment can detect the state of food materials, recommend a cooking mode according to the recipe, and generate a health index and a cooking report for the recipe. It can effectively assist the user in cooking and improves the user experience.
In the embodiment of the application, a camera for detecting the state of food materials is arranged in the steam oven. The camera uploads video or pictures for each food material, and the algorithm trains on the collected picture or video data using deep learning. After training, the picture data are analyzed to obtain the state of the food material (each recipe contains different states); each interval state is manually labeled at the initial stage of training. A cooking curve is generated from the different states, and health indexes are analyzed and displayed synchronously according to the user's recipes. Here, the input is the state data of the food material, and the output is the curve formed from those data.
The camera shoots automatically according to the food-material mode and the user's preset checking times, pushes the pictures for display through the APP, and uploads them to the cloud. When the mode set by the user does not match the recipe, the user is automatically reminded of the optimal mode. After cooking is completed, a cooking report including the cooking curve and the health index is generated. When the user cooks in a self-set mode and the ideal state is not reached after completion, the user is reminded to start the recommended mode to continue cooking. The cooking report is stored in the cloud, and the algorithm combines recent cooking records to output a monthly health report. The intelligent terminal also provides a mode/time button for adjusting the steam oven: when the user wants to terminate cooking midway, the steam oven can be turned off, and the camera is turned off at the same time to avoid unnecessary resource consumption. The cooking state is updated when the user cooks again.
Based on the foregoing embodiments, an embodiment of the present application provides a cooking apparatus. Each module included in the cooking apparatus, and each unit included in each module, may be implemented by a processor in a computer device; of course, they may also be implemented by specific logic circuits. In practice, the processor may be a central processing unit (CPU, Central Processing Unit), a microprocessor (MPU, Microprocessor Unit), a digital signal processor (DSP, Digital Signal Processor), a field-programmable gate array (FPGA, Field-Programmable Gate Array), or the like.
An embodiment of the present application provides a cooking apparatus, including:
the acquisition module is used for acquiring a first image of food materials in the cooking equipment under the condition that the cooking equipment finishes cooking by using a cooking mode set by a user;
a first determining module for determining a state of the food material based on the first image;
a second determining module for determining whether the food material reaches a target state based on the state;
a third determining module for determining a cooking mode based on the state if the state does not reach the target state;
and the cooking module is used for controlling the cooking equipment to continue cooking the food materials based on the cooking mode so as to enable the food materials to reach the target state.
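The module pipeline above (acquisition, state determination, target check, mode determination, continued cooking) can be sketched as a simple control loop. This is a minimal illustration under assumptions: the state-transition table, mode names, and round limit are not specified by the patent.

```python
# Illustrative state transitions per cooking round (an assumption).
NEXT_STATE = {"raw": "half_cooked", "half_cooked": "cooked"}

def continue_until_target(current_state, target_state="cooked", max_rounds=5):
    """Re-cook with a recommended mode until the target state is reached,
    mirroring the second/third determining modules and the cooking module."""
    rounds = 0
    while current_state != target_state and rounds < max_rounds:
        # Third determining module: pick a cooking mode from the current state
        # (hypothetical mode names).
        mode = "high_heat" if current_state == "raw" else "low_heat"
        # Cooking module: one more cooking round advances the state.
        current_state = NEXT_STATE.get(current_state, current_state)
        rounds += 1
    return current_state, rounds

state, rounds = continue_until_target("raw")
```

In a real device the transition would come from re-running the image-based state determination rather than a lookup table.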
In some embodiments, the cooking device is further configured to:
collecting a second image of the food material;
determining an attribute of the food material based on the second image;
determining a target cooking pattern based on the properties of the food material;
under the condition that the cooking mode set by the user is acquired, determining whether the cooking mode set by the user is matched with the target cooking mode;
and outputting prompt information under the condition that the target cooking mode is not matched with the cooking mode set by the user.
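The mode-match check described in this embodiment can be sketched as follows: compare the user-set mode with the target mode derived from the food material's attribute and output prompt information only when they differ. The attribute-to-mode table and message wording are illustrative assumptions.

```python
# Hypothetical mapping from food-material attribute to target cooking mode.
TARGET_MODE = {"fish": "steam", "bread": "bake", "chicken_wing": "roast"}

def mode_prompt(food_attribute, user_mode):
    """Return prompt text when the user-set mode does not match the target
    cooking mode for this food material; return None when they match."""
    target = TARGET_MODE.get(food_attribute)
    if target is not None and user_mode != target:
        return f"Recommended mode for {food_attribute} is '{target}'."
    return None  # modes match, no prompt needed

msg = mode_prompt("fish", "bake")
```

The prompt would be pushed to the user's intelligent terminal, where the user may either adopt the target mode or keep the self-set mode, as the next embodiment describes.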
In some embodiments, the cooking device is further configured to:
under the condition that a target cooking mode selected by a user is acquired, cooking the food based on the target cooking mode;
when the cooking mode set by the user is acquired, cooking is performed based on the cooking mode set by the user.
In some embodiments, the determining the state of the food material based on the first image comprises:
the first image is input into a first neural network model to determine the state of the food material, wherein the first neural network model is obtained by training on first training data, and the first training data comprises: an image of the food material and a state corresponding to the image of the food material.
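The patent describes the first model only as a network trained on (image, state) pairs. As a stand-in for intuition (not the patent's actual network), the sketch below uses a nearest-centroid classifier over mean pixel brightness: the labeled training images yield one centroid per state, and a new image is assigned the state of the nearest centroid.

```python
def train_centroids(training_data):
    """training_data: list of (image, state) pairs, where image is a list of
    pixel brightness values in [0, 255]. Returns {state: mean brightness},
    a toy stand-in for the trained first neural network model."""
    sums, counts = {}, {}
    for image, state in training_data:
        sums[state] = sums.get(state, 0.0) + sum(image) / len(image)
        counts[state] = counts.get(state, 0) + 1
    return {s: sums[s] / counts[s] for s in sums}

def predict_state(centroids, image):
    """Assign the state whose centroid brightness is closest to the image."""
    brightness = sum(image) / len(image)
    return min(centroids, key=lambda s: abs(centroids[s] - brightness))

model = train_centroids([([20, 30], "raw"), ([200, 210], "cooked")])
state = predict_state(model, [190, 205])
```

A production system would instead use a deep convolutional network, as the embodiment's deep-learning description implies, but the train/predict interface is the same shape.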
In some embodiments, the cooking device is further configured to:
acquiring the checking times of the food materials in the cooking process, which are set by a user;
inputting the checking times and the state of the food material in the cooking process into a second neural network model to determine the shooting time in the cooking process, wherein the second neural network model is obtained by training second training data, and the second training data comprises: checking the times, the states of food materials and shooting time;
and shooting the food material based on the shooting time, and sending the shot image to the user.
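The second model maps the user-set viewing count and the cooking states to shooting times. A simple rule-based stand-in (an assumption, not the trained second neural network model) spaces the requested number of check photographs evenly across the cook duration:

```python
def shooting_times(check_count, cook_minutes):
    """Return `check_count` photograph times in minutes, evenly spaced over
    the cook duration and ending when cooking ends."""
    if check_count <= 0:
        return []
    step = cook_minutes / check_count
    return [round(step * (i + 1), 2) for i in range(check_count)]

times = shooting_times(3, 30)
```

The trained model could additionally shift these times toward moments where the food-material state is changing, which a fixed schedule cannot do.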
In some embodiments, the cooking device is further configured to:
determining a health index of the food material based on the attribute of the food material;
generating a health report based on health indexes of each food material within a preset time period;
and recommending a menu for the user based on the health report.
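The health-report flow in this embodiment can be sketched as: assign each cooked food material a health index from its attribute, average the indexes over the period, and recommend a menu from the result. The attribute-to-index table, threshold, and recommendation are illustrative assumptions.

```python
# Hypothetical health indexes per food-material attribute (0-100).
HEALTH_INDEX = {"fried_chicken": 40, "steamed_fish": 90, "baked_bread": 70}

def health_report(cooked_foods):
    """cooked_foods: attribute names of the food materials cooked within the
    preset time period. Returns the averaged index and a menu suggestion."""
    indexes = [HEALTH_INDEX[f] for f in cooked_foods]
    average = sum(indexes) / len(indexes)
    # Assumed rule: recommend a healthier dish when the average is low.
    recommendation = "steamed_fish" if average < 75 else "keep it up"
    return {"average_index": average, "recommendation": recommendation}

report = health_report(["fried_chicken", "baked_bread"])
```

The description elsewhere suggests the period is roughly the most recent month of cooking records stored in the cloud.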
According to the cooking apparatus provided by the embodiment of the application, when the cooking equipment finishes cooking in a cooking mode set by the user, a first image of the food material in the cooking equipment is acquired; the state of the food material is determined based on the first image; whether the food material reaches a target state is determined based on the state; if the state does not reach the target state, a cooking mode is determined based on the state; and the cooking equipment is controlled to continue cooking the food material based on that cooking mode, so that the food material reaches the target state. In this way, food material that has not reached the target state is automatically cooked further until it does.
The embodiment of the application provides an electronic device. Fig. 3 is a schematic diagram of the composition structure of an electronic device according to an embodiment of the present application. As shown in Fig. 3, the electronic device 700 includes: a processor 701, at least one communication bus 702, a user interface 703, at least one external communication interface 704, and a memory 705. The communication bus 702 is configured to enable connection and communication between these components. The user interface 703 may include a display screen, and the external communication interface 704 may include a standard wired interface and a wireless interface. The processor 701 is configured to execute a program of a cooking method stored in the memory, to implement the steps of the cooking method provided in the above embodiments, where the cooking method includes:
under the condition that cooking equipment finishes cooking by using a cooking mode set by a user, acquiring a first image of food in the cooking equipment;
determining a state of the food material based on the first image;
determining whether the food material reaches a target state based on the state;
determining a cooking mode based on the state if the state does not reach the target state;
and controlling the cooking equipment to continue cooking the food materials based on the cooking mode so as to enable the food materials to reach the target state.
In some embodiments, the method further comprises:
collecting a second image of the food material;
determining an attribute of the food material based on the second image;
determining a target cooking pattern based on the properties of the food material;
under the condition that the cooking mode set by the user is acquired, determining whether the cooking mode set by the user is matched with the target cooking mode;
and outputting prompt information under the condition that the target cooking mode is not matched with the cooking mode set by the user.
In some embodiments, the method further comprises:
under the condition that a target cooking mode selected by a user is acquired, cooking the food based on the target cooking mode;
when the cooking mode set by the user is acquired, cooking is performed based on the cooking mode set by the user.
In some embodiments, the determining the state of the food material based on the first image comprises:
the first image is input into a first neural network model to determine the state of the food material, wherein the first neural network model is obtained by training on first training data, and the first training data comprises: an image of the food material and a state corresponding to the image of the food material.
In some embodiments, the method further comprises:
acquiring the checking times of the food materials in the cooking process, which are set by a user;
inputting the checking times and the state of the food material in the cooking process into a second neural network model to determine the shooting time in the cooking process, wherein the second neural network model is obtained by training second training data, and the second training data comprises: checking the times, the states of food materials and shooting time;
and shooting the food material based on the shooting time, and sending the shot image to the user.
In some embodiments, the method further comprises:
determining a health index of the food material based on the attribute of the food material;
generating a health report based on health indexes of each food material within a preset time period;
and recommending a menu for the user based on the health report.
In the embodiment of the present application, if the cooking method is implemented in the form of a software functional module and sold or used as a separate product, it may also be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM, Read-Only Memory), a magnetic disk, an optical disk, or other media capable of storing program code. Thus, embodiments of the application are not limited to any specific combination of hardware and software.
Accordingly, an embodiment of the present application provides a storage medium having a computer program stored thereon, characterized in that the computer program, when executed by a processor, implements the steps of the cooking method provided in the above embodiment.
The description of the electronic device and storage medium embodiments above is similar to that of the method embodiments, with similar advantageous effects. For technical details not disclosed in the embodiments of the electronic device and the storage medium of the present application, please refer to the description of the method embodiments of the present application.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. It should be understood that, in various embodiments of the present application, the sequence numbers of the foregoing processes do not mean the order of execution, and the order of execution of the processes should be determined by the functions and internal logic thereof, and should not constitute any limitation on the implementation process of the embodiments of the present application. The foregoing embodiment numbers of the present application are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The device embodiments described above are only illustrative; for example, the division of the units is only a logical functional division, and there may be other divisions in practice, such as: multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the components shown or discussed may be coupled, directly coupled, or communicatively connected to each other via some interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical, or in other forms.
The units described above as separate components may or may not be physically separate, and components shown as units may or may not be physical units; can be located in one place or distributed to a plurality of network units; some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may be separately used as one unit, or two or more units may be integrated in one unit; the integrated units may be implemented in hardware or in hardware plus software functional units.
Those of ordinary skill in the art will appreciate that all or part of the steps for implementing the above method embodiments may be completed by program instructions and related hardware. The foregoing program may be stored in a computer-readable storage medium, and when executed, the program performs the steps of the above method embodiments. The aforementioned storage medium includes: a removable storage device, a read-only memory (ROM, Read-Only Memory), a magnetic disk, an optical disk, or other media capable of storing program code.
Alternatively, the above-described integrated units of the present application may be stored in a computer-readable storage medium if implemented in the form of software functional modules and sold or used as separate products. Based on such understanding, the technical solutions of the embodiments of the present application may be embodied essentially or in part in the form of a software product stored in a storage medium, including instructions for causing a controller to perform all or part of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a removable storage device, a ROM, a magnetic disk, or an optical disk.
The foregoing is merely an embodiment of the present application, but the scope of the present application is not limited thereto. Any person skilled in the art can readily conceive of changes or substitutions within the technical scope disclosed by the present application, and such changes and substitutions are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A cooking method, comprising:
under the condition that cooking equipment finishes cooking by using a cooking mode set by a user, acquiring a first image of food in the cooking equipment;
determining a state of the food material based on the first image;
determining whether the food material reaches a target state based on the state;
determining a cooking mode based on the state if the state does not reach the target state;
and controlling the cooking equipment to continue cooking the food materials based on the cooking mode so as to enable the food materials to reach the target state.
2. The method according to claim 1, wherein the method further comprises:
collecting a second image of the food material;
determining an attribute of the food material based on the second image;
determining a target cooking pattern based on the properties of the food material;
under the condition that the cooking mode set by the user is acquired, determining whether the cooking mode set by the user is matched with the target cooking mode;
and outputting prompt information under the condition that the target cooking mode is not matched with the cooking mode set by the user.
3. The method according to claim 2, wherein the method further comprises:
under the condition that a target cooking mode selected by a user is acquired, cooking the food based on the target cooking mode;
when the cooking mode set by the user is acquired, cooking is performed based on the cooking mode set by the user.
4. The method of claim 1, wherein the determining the state of the food material based on the first image comprises:
the first image is input into a first neural network model to determine the state of the food material, wherein the first neural network model is obtained by training on first training data, and the first training data comprises: an image of the food material and a state corresponding to the image of the food material.
5. The method according to claim 1, wherein the method further comprises:
acquiring the checking times of the food materials in the cooking process, which are set by a user;
inputting the checking times and the state of the food material in the cooking process into a second neural network model to determine the shooting time in the cooking process, wherein the second neural network model is obtained by training second training data, and the second training data comprises: checking the times, the states of food materials and shooting time;
and shooting the food material based on the shooting time, and sending the shot image to the user.
6. The method according to claim 2, wherein the method further comprises:
determining a health index of the food material based on the attribute of the food material;
generating a health report based on health indexes of each food material within a preset time period;
and recommending a menu for the user based on the health report.
7. A cooking device, comprising:
the acquisition module is used for acquiring a first image of food materials in the cooking equipment under the condition that the cooking equipment finishes cooking by using a cooking mode set by a user;
a first determining module for determining a state of the food material based on the first image;
a second determining module for determining whether the food material reaches a target state based on the state;
a third determining module for determining a cooking mode based on the state if the state does not reach the target state;
and the cooking module is used for controlling the cooking equipment to continue cooking the food materials based on the cooking mode so as to enable the food materials to reach the target state.
8. An electronic device comprising a memory and a processor, the memory having stored thereon a computer program which, when executed by the processor, performs the cooking method according to any one of claims 1 to 6.
9. A cooking apparatus, comprising: the electronic device of claim 8.
10. A storage medium storing a computer program executable by one or more processors for implementing the cooking method of any one of claims 1 to 6.
CN202310677507.XA 2023-06-08 2023-06-08 Cooking method, cooking device, electronic equipment, cooking equipment and storage medium Pending CN116687220A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310677507.XA CN116687220A (en) 2023-06-08 2023-06-08 Cooking method, cooking device, electronic equipment, cooking equipment and storage medium


Publications (1)

Publication Number Publication Date
CN116687220A true CN116687220A (en) 2023-09-05

Family

ID=87830693

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310677507.XA Pending CN116687220A (en) 2023-06-08 2023-06-08 Cooking method, cooking device, electronic equipment, cooking equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116687220A (en)

Similar Documents

Publication Publication Date Title
CN111527348B (en) Configurable cooking system and method
CN113133682B (en) Cooking equipment, cooking curve adjusting method and adjusting device
JP5657066B1 (en) Cooker
CN111596563B (en) Intelligent smoke kitchen system and cooking guiding method thereof
WO2019033843A1 (en) Method, apparatus and system for controlling cooking utensil
CN107095120A (en) Cooking methods and device
CN108447543A (en) Menu method for pushing based on cooking equipment and device
CN108133743A (en) A kind of methods, devices and systems of information push
CN107665198A (en) Image-recognizing method, server, terminal and refrigerating equipment
CN108937554B (en) Steaming and baking equipment and method for reminding diet by using terminal
JP5897088B2 (en) Cooker
CN107763694A (en) Cook linked system, method and cigarette machine
CN109062277B (en) Food heating method and device, storage medium and processor
CN112394149A (en) Food material maturity detection prompting method and device and kitchen electrical equipment
JP2016136085A (en) system
CN111419096B (en) Food processing method, controller and food processing equipment
CN112006520B (en) Cooking processing method, cooking equipment, user terminal and storage device
CN116687220A (en) Cooking method, cooking device, electronic equipment, cooking equipment and storage medium
CN116548836A (en) Electric oven control method, system, terminal equipment and storage medium
CN108134809A (en) A kind of methods, devices and systems of information push
WO2019037750A1 (en) Electronic apparatus and system thereof
CN112099372A (en) Menu generation method and device, cooking equipment, mobile terminal and storage medium
CN111090812A (en) Menu recommendation method and device, electronic equipment and storage medium
CN111603050A (en) Method and device for controlling cooker, storage medium and cooker
CN112965542B (en) Intelligent cooking equipment control method, control equipment and intelligent terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination