CN113673401A - Cooking processing method and device, storage medium and intelligent equipment - Google Patents

Cooking processing method and device, storage medium and intelligent equipment

Info

Publication number
CN113673401A
Authority
CN
China
Prior art keywords
image
target food
cooking
target
information
Prior art date
Legal status
Pending
Application number
CN202110925340.5A
Other languages
Chinese (zh)
Inventor
林鸿飞
乔国坤
Current Assignee
Shenzhen Aishen Yingtong Information Technology Co Ltd
Original Assignee
Shenzhen Aishen Yingtong Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Aishen Yingtong Information Technology Co Ltd
Priority to CN202110925340.5A
Publication of CN113673401A
Legal status: Pending

Classifications

    • G06F18/241: Pattern recognition; classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06N3/04: Neural networks; architecture, e.g. interconnection topology
    • G06N3/08: Neural networks; learning methods
    • G06Q50/12: ICT specially adapted for services of hotels or restaurants
    • G08B21/24: Status alarms; reminder alarms, e.g. anti-loss alarms


Abstract

The application discloses a cooking processing method and device, a storage medium, and an intelligent device. In the cooking processing method, the state information of a target food is determined from a target image captured during cooking; cooking operation information corresponding to that state information is then acquired, and prompt information corresponding to the cooking operation information is generated to remind the user to perform the corresponding cooking operation. This helps reduce erroneous operations during cooking.

Description

Cooking processing method and device, storage medium and intelligent equipment
Technical Field
The application relates to the technical field of image processing, in particular to a cooking processing method and device, a storage medium and intelligent equipment.
Background
Learning to cook by watching videos is currently popular: learners follow the operations demonstrated in the video and can thereby reproduce the dishes shown.
However, beginners often fail to reproduce the dish because of erroneous operations, such as poor control of heat level and duration, or adding ingredients at the wrong time.
Disclosure of Invention
Based on this, in order to solve or mitigate the above problems in the prior art, the present application provides a cooking processing method and apparatus, a storage medium, and an intelligent device that can issue prompt information according to cooking nodes, helping to reduce erroneous operations by the user.
In a first aspect, the present application provides a cooking processing method, comprising:
acquiring a target image shot in a cooking process;
determining state information of the target food in the target image;
acquiring cooking operation information corresponding to the state information;
and generating prompt information corresponding to the cooking operation information.
In one embodiment, the target image includes a first image and a second image, and determining the state information of the target food in the target image includes:
determining position information of the target food in the first image;
determining the position information of the target food in the second image according to the position corresponding relation of the first image and the second image;
and acquiring the state information of the target food according to the position information of the target food in the second image.
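As a minimal sketch of this embodiment, the position correspondence between the two images can be modeled as a per-axis scale and offset; the function name and the correspondence format below are illustrative and not from the application:

```python
def map_position(pos, correspondence):
    """Map a pixel position in the first image to the corresponding
    position in the second image, given a pre-established per-axis
    (scale, offset) correspondence."""
    (sx, sy), (ox, oy) = correspondence
    x, y = pos
    return (round(x * sx + ox), round(y * sy + oy))

# Example: the thermal image has half the resolution, shifted by (10, 5)
corr = ((0.5, 0.5), (10, 5))
print(map_position((200, 100), corr))  # → (110, 55)
```

A real system might instead estimate a full homography between the two cameras, but the scale-and-offset form suffices when the two sensors are rigidly aligned.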
In one embodiment, the acquiring the state information of the target food according to the position information of the target food in the second image includes:
and acquiring the temperature of the target food according to the position information of the target food in the second image.
In one embodiment, the first image is a black and white image or a color image and the second image is a thermographic image.
In one embodiment, the state information includes maturity, and determining the state information of the target food in the target image includes:
extracting an image of the target food from the target image;
and acquiring the maturity of the target food according to the image of the target food.
In one embodiment, the ripeness includes color and form, and the obtaining of the ripeness of the target food from the image of the target food includes:
and acquiring the color and the shape of the target food according to the image of the target food.
In one embodiment, the cooking operation information includes at least one cooking operation node, and the acquiring of the cooking operation information corresponding to the state information includes:
determining a cooking operation node in which the cooking process is located according to the state information;
generating prompt information corresponding to the cooking operation information, including:
and generating prompt information corresponding to the cooking operation node.
In a second aspect, there is provided a cooking processing apparatus comprising:
the first acquisition module is used for acquiring a target image shot in the cooking process;
the determining module is used for determining the state information of the target food in the target image;
the second acquisition module is used for acquiring cooking operation information corresponding to the state information;
and the generating module is used for generating prompt information corresponding to the cooking operation information.
In a third aspect, a computer-readable storage medium is provided, storing a computer program which, when executed on a computer, causes the computer to perform the above method.
In a fourth aspect, an intelligent device is provided, comprising a memory, a processor, and a camera module. The camera module captures images of the cooking process and sends them to the processor, and the processor performs the above method by invoking a computer program stored in the memory.
According to the cooking processing method, the state information of the target food can be determined from the target image captured during cooking; the cooking operation information corresponding to the state information is then acquired, and prompt information corresponding to the cooking operation information is generated to remind the user to perform the corresponding cooking operation, which helps reduce erroneous operations during cooking.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings used in the description of the embodiments will be briefly introduced below. It should be understood that the drawings in the following description are for purposes of illustrating the present application only and are not intended to limit the present application.
Fig. 1 is a schematic diagram of an internal structure of an intelligent device in an embodiment of the present application;
FIG. 2 is a schematic flow chart of a cooking processing method according to an embodiment of the present application;
FIG. 3 is a schematic flow chart illustrating a process of determining status information of a target food in a target image according to an embodiment of the present application;
fig. 4 is a block diagram of a cooking processing device according to an embodiment of the present application;
fig. 5 is a schematic view of an application scenario of a cooking processing device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are clearly and completely described below with reference to the accompanying drawings, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
Fig. 1 is a schematic diagram of an internal structure of an intelligent device in one embodiment. As shown in fig. 1, the terminal includes a processor, a memory, and a network interface connected by a system bus. The processor provides computation and control capability and supports the operation of the whole intelligent device. The memory is used for storing data, programs, and the like, and stores at least one computer program which can be executed by the processor to implement the cooking processing method provided in the embodiments of the present application. The memory may include a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program; the computer program can be executed by the processor to implement the cooking processing method provided in the following embodiments. The internal memory provides a cached execution environment for the operating system and the computer programs in the non-volatile storage medium. The network interface may be an Ethernet card or a wireless network card, and is used for communicating with external intelligent devices.
The smart device described in the present application may include mobile terminals such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), a navigation device, a wearable device, a smart band, a pedometer, and fixed terminals such as a Digital TV, a desktop computer, and the like.
The following description will be given taking a mobile terminal as an example, and those skilled in the art will understand that the configuration according to the embodiment of the present application can be applied to a fixed type terminal, in addition to elements particularly used for mobile purposes.
As shown in fig. 2, the cooking processing method includes steps 10 to 40.
Step 10, acquiring a target image shot in the cooking process;
step 20, determining the state information of the target food in the target image;
step 30, obtaining cooking operation information corresponding to the state information;
and step 40, generating prompt information corresponding to the cooking operation information.
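Steps 10 to 40 can be sketched as a small pipeline. The lookup table, the placeholder state analysis, and all names below are illustrative assumptions, not the application's actual implementation:

```python
# Hypothetical mapping from food state to the cooking operation due at
# that state (step 30 consults such a pre-built table).
STATE_TO_OPERATION = {
    "oil_at_temp": "add the beef",
    "beef_at_temp": "add the onion",
}

def determine_state(target_image):
    # Placeholder for step 20: a real system would analyse the image
    # (e.g. temperature or maturity of the target food).
    return target_image.get("state")

def cooking_pipeline(target_image):
    state = determine_state(target_image)       # step 20
    operation = STATE_TO_OPERATION.get(state)   # step 30
    if operation is None:
        return None
    return f"Please {operation}"                # step 40: prompt info

print(cooking_pipeline({"state": "oil_at_temp"}))  # → Please add the beef
```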
According to the cooking processing method, the state information of the target food can be determined from the target image captured during cooking; the cooking operation information corresponding to the state information is then acquired, and prompt information corresponding to the cooking operation information is generated to remind the user to perform the corresponding cooking operation, which helps reduce erroneous operations during cooking.
In step 10, the target image is an image of the target food photographed during the cooking process. The target image is used to extract state information of the target food.
Optionally, the captured image may be obtained from a recorded color video and a recorded thermal-imaging video of the cooking process. For example, one frame is extracted from each of the color video and the thermal-imaging video as a target image.
In step 20, the state information is information reflecting the state of the target food. Generally, the food is in a different state when each cooking operation is due; therefore, whether the current food meets the condition for performing the corresponding cooking operation is judged from the state of the food.
In one embodiment, the target image includes a first image and a second image. Referring to fig. 3, determining the status information of the target food in the target image includes:
step 210, determining position information of the target food in the first image;
step 220, determining the position information of the target food in the second image according to the position corresponding relation between the first image and the second image;
and step 230, acquiring the state information of the target food according to the position information of the target food in the second image.
In this embodiment, the first image and the second image are combined to acquire the state information of the target food, making that state information more accurate. It can be understood that the first image and the second image are target images captured at the same moment of the same cooking process, so the relative positions of the food and other objects are essentially fixed across the two images; on this basis, the position information of the target food in the second image is determined from the position correspondence between the first image and the second image. Optionally, the coordinate correspondence between the pixels of the two images can be established by locating pixels at preset reference positions (e.g., the pan edge or the pan handle) in both images.
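As a hedged illustration of establishing that correspondence from preset reference positions: given two reference points (say, the pan edge and the pan handle) located in both images, a per-axis scale and offset can be solved directly. The function and data below are assumptions for illustration:

```python
def fit_correspondence(ref_first, ref_second):
    """Derive a per-axis (scale, offset) coordinate correspondence from
    two reference points (e.g. pan edge, pan handle) located in both
    the first image and the second image."""
    (x1, y1), (x2, y2) = ref_first
    (u1, v1), (u2, v2) = ref_second
    sx = (u2 - u1) / (x2 - x1)
    sy = (v2 - v1) / (y2 - y1)
    return (sx, sy), (u1 - sx * x1, v1 - sy * y1)

# Reference points seen at (0,0) and (640,480) in the color image,
# and at (16,12) and (336,252) in the thermal image:
scale, offset = fit_correspondence([(0, 0), (640, 480)],
                                   [(16, 12), (336, 252)])
print(scale, offset)  # → (0.5, 0.5) (16.0, 12.0)
```

Two points suffice only for axis-aligned scale and translation; a rotated or tilted camera pair would need more reference points and a full homography.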
In one embodiment, the acquiring the state information of the target food according to the position information of the target food in the second image includes:
and acquiring the temperature of the target food according to the position information of the target food in the second image.
The state information in this embodiment includes temperature, i.e., the state of the target food is characterized by at least temperature. In some cooking scenarios, the temperature of the target food is continuously increased as the cooking progresses, and thus the state of the target food can be determined by acquiring the temperature of the target food.
In one embodiment, the first image is a black and white image or a color image and the second image is a thermographic image. Specifically, state information regarding the state of the food is extracted from the first image and the second image, and it can be determined whether the food meets the condition for performing the corresponding cooking operation during the cooking process according to the state information.
In one embodiment, the higher resolution of the first image is exploited to obtain the distribution area of the target food in the first image; the distribution area of the target food in the second image is then determined from it, and the corresponding thermal imaging information in that second distribution area is analyzed to obtain the temperature information of the target food. Specifically, determining the temperature of the target food from the first image and the second image comprises:
step 211, obtaining a distribution area of the target food in the first image to obtain a first distribution area;
step 221, acquiring a coordinate corresponding relation between pixels of a first image and pixels of a second image which are established in advance;
step 222, determining a distribution area of the target food in the second image according to the first distribution area and the coordinate corresponding relation to obtain a second distribution area;
step 231, analyzing the corresponding thermal imaging information in the second distribution area to obtain the temperature information of the target food;
step 232, constructing first status information including the temperature of the target food.
In this embodiment, the first distribution area is obtained by exploiting the color contrast available in a color or black-and-white image for image recognition, and is then converted into the second distribution area, which helps compute the second distribution area accurately.
Step 211 is to analyze the first image to obtain a distribution area of the target food in the first image. In one embodiment, the obtaining of the distribution area of the target food in the first image is specifically to input the first image into a cooking image segmentation model for image segmentation, and determine the distribution area of the target food in the first image.
Specifically, the cooking image segmentation model divides the first image into a plurality of food regions according to food type. The model is a semantic segmentation model based on the RefineNet neural network, trained so that it can segment the first image accurately at the pixel level. The target food region is the region of the first image where the target food is distributed. It is understood that, when multiple kinds of food are present, the first image contains multiple first distribution regions, one for each kind of food.
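A trained RefineNet model is beyond a short sketch, but the role step 211 plays can be illustrated with a toy color-threshold stand-in; everything here (the image representation, the threshold, the function name) is an assumption for illustration only:

```python
def segment_by_color(image, target_color, tol=30):
    """Toy stand-in for the segmentation model: mark pixels whose
    colour lies within Manhattan distance `tol` of the target food's
    reference colour.  `image` maps (x, y) -> (r, g, b); the return
    value is the set of pixel positions forming the distribution area."""
    tr, tg, tb = target_color
    region = set()
    for pos, (r, g, b) in image.items():
        if abs(r - tr) + abs(g - tg) + abs(b - tb) <= tol:
            region.add(pos)
    return region

# Two reddish "beef" pixels and one dark "pan" pixel:
img = {(0, 0): (200, 60, 50), (1, 0): (20, 20, 20), (1, 1): (210, 55, 45)}
area = segment_by_color(img, (205, 58, 48))
```

A semantic segmentation network replaces the fixed color threshold with learned per-pixel classification, but the output contract (a per-food pixel region) is the same.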
Step 221 is to obtain a coordinate corresponding relationship between the pixels of the first image and the pixels of the second image, so as to find the pixels of the target food in the second image according to the pixels of the target food in the first image.
Step 222 is to search a region corresponding to the first distribution region, i.e. a second distribution region, in the second image according to the coordinate correspondence and the first distribution region. The second distribution area is a distribution area of the target food in the second image.
Step 231 is to obtain temperature information of the target food according to the thermal imaging image. Each pixel in the thermographic image reflects a temperature value, so that by analyzing the pixels of the second distribution area, a plurality of temperature values of the second distribution area may be obtained.
By analyzing the plurality of temperature values of the second distribution area, the temperature capable of reflecting the state of the target food can be obtained. In one embodiment, analyzing the thermal imaging information corresponding to the second distribution area to obtain the temperature information of the target food includes:
step 2311, obtaining a plurality of temperature values of the second distribution area corresponding to each pixel point according to the thermal imaging information in the second distribution area;
step 2312, calculating the mean square error of the plurality of temperature values;
step 2313, when the mean square deviation is greater than a preset mean square deviation, taking temperature values whose difference from adjacent positions exceeds a preset temperature difference as boundary temperatures, and dividing the second distribution area into a plurality of temperature areas according to the boundary temperatures;
step 2314, calculating an average temperature in each temperature zone to obtain a plurality of average temperatures;
step 2315, the plurality of average temperatures are taken as the temperature of the target food.
In this embodiment, when the mean square deviation is greater than the preset mean square deviation, the temperature values at different positions within the second distribution area differ greatly, which may indicate that the target food is heated unevenly. The second distribution area is therefore divided into a plurality of temperature areas, with positions of similar temperature (difference smaller than the preset temperature difference) grouped into the same area; the average temperature of each area is then calculated, which helps reflect the state of the target food accurately.
It will be appreciated that each temperature value corresponds to a pixel in the second distribution area, so the position of a temperature value is the position of its pixel. Since similar temperature values tend to be distributed in contiguous patches, the temperature values on the boundaries between such patches (boundary temperatures) are selected, and the second distribution area is divided into a plurality of temperature areas along these boundary temperatures. Specifically, when the distance between the positions of two boundary temperature values is smaller than a preset distance, the two are connected by a line, thereby forming a plurality of closed temperature regions. In step 2315, the temperature of the target food may be a temperature set consisting of the plurality of average temperatures.
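Steps 2311 to 2315 can be sketched in one dimension (a single scan line of the region's pixel temperatures) to keep the boundary logic visible; the thresholds and function name are illustrative assumptions:

```python
import statistics

def food_temperatures(temps, var_threshold=25.0, diff_threshold=15.0):
    """Steps 2311-2315, simplified to a 1-D scan of pixel temperatures.
    If the values vary little (population variance <= var_threshold),
    report one average; otherwise split at boundary temperatures
    (adjacent difference > diff_threshold) and report one average per
    resulting temperature area."""
    if statistics.pvariance(temps) <= var_threshold:
        return [sum(temps) / len(temps)]       # evenly heated
    zones, current = [], [temps[0]]
    for prev, cur in zip(temps, temps[1:]):
        if abs(cur - prev) > diff_threshold:   # boundary temperature
            zones.append(current)
            current = []
        current.append(cur)
    zones.append(current)
    return [sum(z) / len(z) for z in zones]

# Uneven heating: a cool patch next to a hot patch
print(food_temperatures([80, 82, 81, 120, 122, 121]))  # → [81.0, 121.0]
```

On a real 2-D region, the split would follow connected boundary pixels rather than a linear scan, but the variance test and per-area averaging are the same.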
Step 232 is to establish first status information based at least on the temperature of the target food. That is, the first state information includes the temperature of the target food.
In addition, the first state information can also comprise information such as the time length of the target food at the temperature and the time length between the first cooking operation node and the previous cooking operation node, which is beneficial to more comprehensively reflecting the state of the target food. In one embodiment, constructing the first state information including the temperature of the target food comprises:
step 240, acquiring a time length from the time when the target food starts to reach the temperature to the time when the first cooking operation node is performed, and acquiring a first time length;
step 250, acquiring the time interval between the first cooking operation node and the previous cooking operation node to obtain a second time interval;
step 260, constructing first status information comprising the first duration, the second duration, and the temperature of the target food.
Step 240 may specifically be to continuously monitor the temperature of the target food during the process of recording the cooking video, and establish a temperature change table for recording the temperature of the target food. According to the relation table of the time and the target food temperature in the temperature change table, the time length from the time when the target food starts to reach the temperature to the time when the first cooking operation node is performed, namely the first time length, can be determined according to the temperature of the target food.
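The temperature-change-table lookup of step 240 can be sketched as follows; the table format (a list of timestamped samples) and the function name are assumptions for illustration:

```python
def first_duration(temp_log, target_temp, node_time):
    """From a temperature-change table (list of (time_s, temp) samples
    in chronological order), find when the food first reached
    `target_temp` and return the duration until the cooking operation
    node at `node_time` (the first duration of step 240)."""
    for t, temp in temp_log:
        if temp >= target_temp:
            return node_time - t
    return None  # the food never reached that temperature

log = [(0, 25), (30, 60), (60, 100), (90, 102)]
print(first_duration(log, 100, 90))  # → 30
```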
The previous cooking operation node in step 250 is a cooking operation node chronologically before and adjacent to the first cooking operation node.
Step 260 is to establish first status information based on the first time period, the second time period and the temperature of the target food. It can be seen that the first status information is specifically a set of the first time period, the second time period and the temperature of the target food.
In one embodiment, the state information includes maturity, and determining the state information of the target food in the target image includes:
step 271, extracting an image of the target food from the target image;
and step 272, acquiring the maturity of the target food according to the image of the target food.
In this embodiment, the ripeness level indicates how thoroughly the food is cooked and reflects its cooked state, for example medium or well done. In general, maturity levels may be predefined by professional chefs and gourmets according to the state of the food. The image of the target food contains only the target food itself.
In the present embodiment, the ripeness of the target food is generally expressed by the appearance of the target food, and based on this, the ripeness of the target food can be obtained by analyzing and extracting the target food image. Wherein the target image may be a color image or a black and white image photographed during cooking.
In one embodiment, the image of the target food may be extracted from the target image by a target food segmentation model.
In one embodiment, obtaining the maturity of the target food is realized through feature comparison.
Specifically, acquiring the maturity of the target food comprises the following steps:
step 2721, acquiring characteristic information of the image of the target food;
step 2722, comparing the acquired characteristic information of the image of the target food with characteristic information of a preset maturity;
step 2723, when the similarity between the feature information of the image of the target food and the feature information of the preset maturity is within a preset range, determining that the maturity of the target food is the preset maturity.
In the embodiment, the maturity of the target food is determined by performing feature comparison between the feature information of the image of the target food and the preset maturity feature information. The preset maturity characteristic information is information extracted from a target image corresponding to the preset maturity in advance.
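Steps 2721 to 2723 can be sketched with cosine similarity as an assumed similarity measure (the application does not specify one); the feature vectors and preset names below are illustrative:

```python
def match_maturity(features, presets, threshold=0.9):
    """Steps 2721-2723: compare the food's feature vector against each
    preset maturity's feature vector; return the first preset whose
    cosine similarity falls within the accepted range (>= threshold)."""
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = sum(x * x for x in a) ** 0.5
        nb = sum(y * y for y in b) ** 0.5
        return dot / (na * nb)
    for name, preset in presets.items():
        if cosine(features, preset) >= threshold:
            return name
    return None  # no preset maturity matched

presets = {"medium": [0.8, 0.3, 0.1], "well_done": [0.2, 0.4, 0.9]}
print(match_maturity([0.79, 0.31, 0.12], presets))  # → medium
```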
In some cooking processes, the shape and/or color of the target food may be changed at different stages, and thus, it may be determined whether the target food satisfies the condition for performing the first cooking operation by determining the shape and/or color of the target food. In one embodiment, the ripeness includes color and form, and the obtaining of the ripeness of the target food from the image of the target food includes: and acquiring the color and the shape of the target food according to the image of the target food. That is, the maturity of the target food is determined by the color and morphological characteristics of the target food.
In one embodiment, the maturity of the target food is identified through a food form identification model. Specifically, acquiring the maturity of the target food according to the image of the target food comprises the following steps:
step 2724, inputting the image of the target food into a food maturity recognition model, and recognizing the maturity of the target food;
the food maturity recognition model is obtained by training images of target food corresponding to various maturity as training pictures.
In step 2724, the food maturity recognition model recognizes the maturity of the target food. The model may be a classification model based on a VGG, GoogLeNet, or ResNet neural network, trained with images of the target food at the corresponding maturity levels in different cooking stages as training pictures. It should be noted that maturity can represent the state of the target food at different cooking stages; specifically, the form of the target food is reflected by extracting its morphological features and color features.
In step 30, the cooking operation information corresponding to the state information is acquired. Specifically, a mapping table between state information and cooking operation information may be established in advance, so that the cooking operation information corresponding to given state information can be looked up. The state information in the mapping table may be extracted from the target food in images corresponding to cooking operation nodes (e.g., images of a user, such as a food blogger, performing those nodes), and may be extracted in the same manner as the state information in step 20.
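Building that mapping table from a reference (blogger) recording can be sketched as follows; the node record fields and example states are illustrative assumptions:

```python
def build_mapping_table(nodes):
    """Pre-establish the state -> cooking-operation mapping table from
    annotated cooking operation nodes of a reference video: each node
    pairs the food's state at that moment with the operation performed
    there.  Field names are hypothetical."""
    return {node["state"]: node["operation"] for node in nodes}

nodes = [
    {"state": ("oil", 100), "operation": "please add the beef"},
    {"state": ("beef", 80), "operation": "please add the onion"},
]
table = build_mapping_table(nodes)
print(table[("oil", 100)])  # → please add the beef
```

At runtime, step 30 reduces to looking the current state up in this table.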
In one embodiment, the cooking operation information includes at least one cooking operation node, and the acquiring of the cooking operation information corresponding to the state information includes: and determining a cooking operation node where the cooking process is located according to the state information. One cooking process comprises one or more cooking operation nodes, and corresponding cooking operation is needed to be carried out on the cooking operation nodes, so that food delicacies can be obtained through cooking. For example, the first cooking operation node is one of a plurality of cooking operation nodes in a cooking process.
Generating the prompt information corresponding to the cooking operation information in step 40 includes: generating prompt information corresponding to the cooking operation node. The user can then perform the corresponding operation according to the prompt information.
In a specific application scenario of this embodiment, when cooking onion beef, the normal sequence is: heat oil in the pan, stir-fry the beef, then add the onion and stir-fry together. User A (who may be a food blogger) records videos of the onion beef dish, including a color video and a thermal imaging video. After recording, the two videos are processed at the terminal as follows. For the point at which the beef should be added, the oil temperature information (e.g., 100 °C) is obtained by analyzing the corresponding frame images in the color and thermal imaging videos; a voice or text prompt "please add the beef" is edited and inserted, and a mapping between the add-beef cooking operation and the oil temperature information is established. For the point at which the onion should be added, the beef temperature information (e.g., 80 °C) is obtained by analyzing the corresponding frame images; a voice or text prompt "please add the onion" is edited and inserted, and a mapping between the add-onion cooking operation and the beef temperature information is established. When cooking is finished, the state information of the target food (the onion or the beef) at completion is obtained by analyzing the corresponding frame images; a voice or text prompt "cooking finished" is edited and inserted, and a mapping between the finish-cooking operation and that state information is added to the mapping table, yielding a guide video.
User B (who may be a learner of the onion beef dish) can import the guide video, and user B's cooking process is monitored in real time through newly shot color and thermal imaging videos. When the oil temperature measured in these videos matches the oil temperature recorded for adding the beef, the guide video automatically prompts "please add the beef"; when the measured beef temperature matches the beef temperature recorded for adding the onion, it automatically prompts "please add the onion"; and when the measured state information of the target food matches the state information recorded for the completion of cooking, it prompts "cooking finished".
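The user-B scenario above can be sketched as a simple monitoring loop. Each guide-video node stores the state that user A's recording exhibited, and live measurements trigger the next prompt once they match in order; all node values, frame measurements, and the tolerance are hypothetical:

```python
# Hypothetical sketch of monitoring user B's cooking against the guide
# video: nodes are matched in recording order, and each match emits the
# prompt user A edited in.

GUIDE_NODES = [
    {"state": ("oil_temp", 100), "prompt": "please add the beef"},
    {"state": ("beef_temp", 80), "prompt": "please add the onion"},
    {"state": ("maturity", "cooked"), "prompt": "cooking finished"},
]

def matches(recorded, measured, tol=5):
    """Numeric states match within a tolerance; others must be equal."""
    key, value = recorded
    if key not in measured:
        return False
    if isinstance(value, (int, float)):
        return abs(measured[key] - value) <= tol
    return measured[key] == value

def monitor(frames):
    """frames: iterable of per-frame measurement dicts. Yields each node's
    prompt as its recorded state is matched, strictly in order."""
    idx = 0
    for measured in frames:
        if idx < len(GUIDE_NODES) and matches(GUIDE_NODES[idx]["state"], measured):
            yield GUIDE_NODES[idx]["prompt"]
            idx += 1

frames = [
    {"oil_temp": 60},                          # heating up, no match yet
    {"oil_temp": 98},                          # matches node 1
    {"beef_temp": 82},                         # matches node 2
    {"beef_temp": 90, "maturity": "cooked"},   # matches node 3
]
for prompt in monitor(frames):
    print(prompt)
```

Matching nodes strictly in order mirrors the fixed cooking sequence (oil, beef, onion) in the scenario above.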
According to the cooking processing method, the state information of the target food in the target image can be determined from the target image shot during cooking; by then acquiring the cooking operation information corresponding to that state information and generating the corresponding prompt information, the user is reminded to perform the corresponding cooking operation, which helps reduce erroneous operations during cooking.
It should be understood that, although the steps in the flowchart of fig. 2 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise, there is no strict ordering constraint on these steps, and they may be performed in other orders. Moreover, at least some of the steps in fig. 2 may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times, and whose order of performance is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
Based on the same inventive concept as the cooking processing method, this embodiment provides a cooking processing apparatus, comprising:
the first acquisition module is used for acquiring a target image shot in the cooking process;
the determining module is used for determining the state information of the target food in the target image;
the second acquisition module is used for acquiring cooking operation information corresponding to the state information;
and the generating module is used for generating prompt information corresponding to the cooking operation information.
The cooking processing device can determine the state information of the target food in the target image according to the target image shot during cooking, and then remind the user to perform the corresponding cooking operation by acquiring the cooking operation information corresponding to the state information and generating the corresponding prompt information, which helps reduce erroneous operations during cooking.
Optionally, the first acquisition module is composed of a thermal imaging camera 72 and a color camera 71; the determining module and the second acquisition module are disposed in a terminal management module 74; and the generating module is specifically the prompt module 75 and is disposed in the terminal management module 74. Specifically, fig. 4 is a block diagram of a cooking processing device according to an embodiment. As shown in fig. 4, the cooking processing device includes:
a thermal imaging camera 72, a color camera 71, a communication module 73, a terminal management module 74 and a prompt module 75;
the color camera 71 is connected with the communication module 73, the thermal imaging camera 72 is connected with the communication module 73, the communication module 73 is connected with the terminal management module 74, and the terminal management module 74 is connected with the prompt module 75;
the color camera 71 is used for acquiring a color image shot at the current moment in the cooking process;
the thermal imaging camera 72 is used for acquiring a thermal imaging image shot at the current moment in the cooking process;
the communication module 73 is used for transmitting the color image and the thermal imaging image to the terminal management module 74;
the terminal management module 74 is configured to obtain the color image and the thermal imaging image, determine status information of the target food according to the color image and the thermal imaging image, and obtain corresponding cooking operation information;
the prompt module 75 is used for generating prompt information corresponding to the cooking operation information.
Fig. 5 is a schematic view of an application scenario of the cooking processing device. As shown, the cooking processing device can be divided into a fixed end 81 and a mobile terminal 82. The fixed end 81 can be adsorbed onto the range hood 83; the thermal imaging camera 72, the color camera 71, and the communication module 73 are arranged on the fixed end 81, and the lenses of the thermal imaging camera 72 and the color camera 71 on the fixed end 81 are aimed at the position of the cooker 84, so that the food preparation can be filmed. It should be noted that the fixed end 81 and the range hood 83 are detachably connected (by adsorption), so that when no video needs to be recorded, the fixed end 81 can be conveniently removed from the range hood 83.
The thermal imaging camera 72 and the color camera 71 (an AI camera) are used to shoot the user's cooking process, obtain video or images of the target food, and transmit them to the mobile terminal 82 through the communication module 73.
The terminal management module 74 and the prompt module 75 may be disposed on the mobile terminal 82. The terminal management module 74 includes a processor, and can perform food identification on the color image to identify the target food being cooked, further calculate the temperature of the target food from the thermal imaging image, and analyze the current cooking step. Specifically, the terminal management module 74 identifies the food type and its coordinate position from the color image and combines them with the thermal imaging data to finally obtain the temperature of the target food; it then compares the temperature of the target food with the temperature corresponding to each cooking operation, judges which operation step is currently being performed, and issues a reminder through a voice module. The voice module is used for receiving a control signal from the terminal management module 74 and responding to it with a voice prompt.
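The combination of the color-image position with the thermal imaging data can be sketched as follows. The affine calibration between the two cameras and all numeric values are assumptions, since the patent does not specify them:

```python
# Illustrative sketch: map the food's bounding box from color-image
# coordinates into thermal-image coordinates, then average the thermal
# readings inside it to estimate the food temperature. The scale/offset
# calibration between the two cameras is assumed known; all numbers are
# made up.

def map_box(box, scale, offset):
    """Map a bounding box (x0, y0, x1, y1) from color-image coordinates
    into thermal-image coordinates with a simple affine model."""
    x0, y0, x1, y1 = box
    sx, sy = scale
    ox, oy = offset
    return (int(x0 * sx + ox), int(y0 * sy + oy),
            int(x1 * sx + ox), int(y1 * sy + oy))

def region_temperature(thermal, box):
    """Average the thermal readings (degrees C) inside the mapped box.
    thermal: 2D list indexed as thermal[row][col]."""
    x0, y0, x1, y1 = box
    values = [thermal[y][x] for y in range(y0, y1) for x in range(x0, x1)]
    return sum(values) / len(values)

# 4x4 thermal image: the top-left 2x2 region (the beef) is hot.
thermal = [
    [80, 82, 30, 30],
    [79, 81, 30, 30],
    [30, 30, 30, 30],
    [30, 30, 30, 30],
]
# Beef detected at (0, 0)-(4, 4) in a color image of twice the resolution.
tbox = map_box((0, 0, 4, 4), scale=(0.5, 0.5), offset=(0, 0))
print(region_temperature(thermal, tbox))
```

A real deployment would obtain the scale and offset from a one-time calibration of the two cameras rather than hard-coding them.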
In this embodiment, the communication module 73 may be a Wi-Fi communication module that transmits video or image data to the terminal management module 74; the mobile terminal 82 may be a mobile phone, a tablet, or the like, which acquires the video or image data over the local-area Wi-Fi network, displays it, and issues prompts through the prompt module 75. The prompt module 75 may specifically be a display screen (e.g., a human-computer interaction touch screen) or a voice module.
The mobile terminal 82 is also provided with a touch screen and has an editing function, so that a recorded cooking video can be decomposed: the state information of the target food is identified, and corresponding text or voice reminder information is edited in for each cooking operation in combination with the state information of the target food at that operation, yielding an edited cooking video. The edited cooking video can be shared like an ordinary video via WeChat, QQ, and the like. When cooking starts, the terminal management module 74 parses the edited cooking video to obtain the state information of the target food corresponding to each cooking operation; it then analyzes the color images and thermal imaging images shot by the color camera 71 and the thermal imaging camera 72 to identify the current state information of the target food, and when the current state information is consistent with the state information of the target food corresponding to a cooking operation node, it retrieves the corresponding text or voice reminder information and issues the reminder.
In one embodiment, a control module is further disposed in the fixed end 81, the control module is connected to the communication module 73, and the control module is configured to receive a signal from the terminal management module 74 and control the thermal imaging camera 72 and the color camera 71 to be turned on and off by outputting a control signal. When the prompt module 75 is disposed at the mobile terminal 82, the control module is configured to control the prompt module 75 to turn on or off.
In this embodiment, the fixed end 81 can be arranged on the range hood, which facilitates filming the food being cooked. The mobile terminal 82 can identify the state information of the target food and issue reminder information according to the state of the target food.
The division of the various modules in the cooking processing device is merely for illustration, and in other embodiments, the cooking processing device may be divided into different modules as needed to complete all or part of the functions of the cooking processing device.
In one embodiment, the cooking processing apparatus further includes a cooker 84 module; the cooker 84 module is connected to the terminal management module 74 and is configured to receive a fire power adjustment signal from the terminal management module 74, so as to achieve automatic adjustment of the fire power.
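The patent does not describe how the fire power adjustment signal is computed; as a hypothetical sketch, the terminal management module could compare the measured food temperature with a target for the current cooking node and emit a simple adjustment command (a bang-bang controller with a deadband; a real cooker might use PID control instead):

```python
# Hypothetical sketch of the automatic fire-power adjustment signal.
# The deadband value is illustrative.

def firepower_signal(measured_temp, target_temp, deadband=5):
    """Return 'increase', 'decrease', or 'hold' depending on how the
    measured temperature compares with the target, within a deadband."""
    if measured_temp < target_temp - deadband:
        return "increase"
    if measured_temp > target_temp + deadband:
        return "decrease"
    return "hold"

print(firepower_signal(70, 100))   # far below target
print(firepower_signal(103, 100))  # within the deadband
print(firepower_signal(120, 100))  # above the deadband
```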
For specific limitations of the cooking processing device, reference may be made to the above limitations of the cooking processing method, which are not repeated here. The modules in the cooking processing device described above may be implemented in whole or in part by software, hardware, or a combination thereof. Each module may be embedded in hardware in, or independent of, a processor in a computer device, or may be stored in software form in a memory of the computer device, so that the processor can call and execute the operations corresponding to each module.
Each module in the cooking processing device provided in the embodiments of the present application may be implemented in the form of a computer program. The computer program may run on a terminal or a server, and the program modules constituted by it may be stored on the memory of the terminal or the server. When the computer program is executed by a processor, the steps of the methods described in the embodiments of the present application are performed.
In one embodiment, the determining module is further configured to determine location information of the target food in the first image; determining the position information of the target food in the second image according to the position corresponding relation of the first image and the second image; and acquiring the state information of the target food according to the position information of the target food in the second image.
In one embodiment, the determining module is further configured to obtain the temperature of the target food according to the position information of the target food in the second image.
In one embodiment, the determining module is further configured to extract an image of the target food from the target image; and acquiring the maturity of the target food according to the image of the target food.
In one embodiment, the determining module is further configured to obtain the color and shape of the target food according to the image of the target food.
In one embodiment, the second obtaining module is further configured to determine a cooking operation node where the cooking process is located according to the state information;
in one embodiment, the prompt module is further configured to generate a prompt message corresponding to the cooking operation node.
The present embodiment provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed on a computer, the computer is caused to perform the above method.
Referring to fig. 1, an embodiment of the present application further provides an intelligent device (electronic device), which includes a memory, a processor, and a camera module, where the camera module takes an image of a cooking process and sends the taken image to the processor, and the processor is configured to execute the method by calling a computer program stored in the memory.
Embodiments of the present application also provide a computer program product containing instructions that, when run on a computer, cause the computer to perform a cooking process method.
Any reference to memory, storage, a database, or another medium used herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), and direct Rambus dynamic RAM (DRDRAM).
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the application. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A cooking processing method, comprising:
acquiring a target image shot in a cooking process;
determining state information of target food in the target image;
acquiring cooking operation information corresponding to the state information;
and generating prompt information corresponding to the cooking operation information.
2. The cooking processing method of claim 1, wherein the target image comprises a first image and a second image, and the determining the state information of the target food in the target image comprises:
determining location information of a target food in the first image;
determining the position information of the target food in the second image according to the position corresponding relation of the first image and the second image;
and acquiring the state information of the target food according to the position information of the target food in the second image.
3. The cooking processing method according to claim 2, wherein the state information includes a temperature, and the obtaining the state information of the target food according to the position information of the target food in the second image includes:
and acquiring the temperature of the target food according to the position information of the target food in the second image.
4. The cooking processing method of claim 2, wherein the first image is a black-and-white image or a color image and the second image is a thermal imaging image.
5. The cooking processing method of claim 1, wherein the state information includes a maturity level, and the determining the state information of the target food in the target image includes:
extracting an image of a target food from the target image;
and acquiring the maturity of the target food according to the image of the target food.
6. The cooking processing method of claim 5, wherein the maturity level includes color and morphology, and the obtaining the maturity level of the target food from the image of the target food comprises:
and acquiring the color and the shape of the target food according to the image of the target food.
7. The cooking processing method according to any one of claims 1 to 6, wherein the cooking operation information includes at least one cooking operation node, and the acquiring the cooking operation information corresponding to the state information includes:
determining a cooking operation node where the cooking process is located according to the state information;
the generating of the prompt information corresponding to the cooking operation information includes:
and generating prompt information corresponding to the cooking operation node.
8. A cooking processing apparatus, comprising:
the first acquisition module is used for acquiring a target image shot in the cooking process;
the determining module is used for determining the state information of the target food in the target image;
the second acquisition module is used for acquiring cooking operation information corresponding to the state information;
and the generating module is used for generating prompt information corresponding to the cooking operation information.
9. A computer-readable storage medium, on which a computer program is stored, which, when executed on a computer, causes the computer to carry out the method according to any one of claims 1 to 7.
10. An intelligent device comprising a memory, a processor and a camera module, wherein the camera module takes an image of a cooking process and sends the taken image to the processor, and the processor is configured to execute the method according to any one of claims 1 to 7 by calling a computer program stored in the memory.
CN202110925340.5A 2021-08-12 2021-08-12 Cooking processing method and device, storage medium and intelligent equipment Pending CN113673401A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110925340.5A CN113673401A (en) 2021-08-12 2021-08-12 Cooking processing method and device, storage medium and intelligent equipment


Publications (1)

Publication Number Publication Date
CN113673401A true CN113673401A (en) 2021-11-19

Family

ID=78542509

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110925340.5A Pending CN113673401A (en) 2021-08-12 2021-08-12 Cooking processing method and device, storage medium and intelligent equipment

Country Status (1)

Country Link
CN (1) CN113673401A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115657896A (en) * 2022-12-26 2023-01-31 中科航迈数控软件(深圳)有限公司 Information prompting method based on MR (magnetic resonance) equipment image acquisition and related equipment

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190057282A1 (en) * 2017-08-16 2019-02-21 Zaigle Co.,Ltd Carbonization time point management service providing system for cooking fish-meat stuff
CN109446228A (en) * 2018-11-12 2019-03-08 天津市协力自动化工程有限公司 Information cuing method, device, terminal and computer storage medium
CN109709856A (en) * 2018-12-29 2019-05-03 珠海优特智厨科技有限公司 A kind of step switching method, device and the intelligent cooking equipment of intelligence menu
CN110441485A (en) * 2019-08-22 2019-11-12 虫洞(北京)卫生科技有限公司 Multithreading detects the sensor of food materials maturity in food cooking or process
CN111652314A (en) * 2020-06-04 2020-09-11 上海眼控科技股份有限公司 Temperature detection method and device, computer equipment and storage medium
CN112287825A (en) * 2020-10-28 2021-01-29 维沃移动通信有限公司 Cooking assistance method and electronic device
CN112579211A (en) * 2019-09-27 2021-03-30 北京京东尚科信息技术有限公司 Intelligent cooking prompting method, device and system
CN112704380A (en) * 2020-12-30 2021-04-27 珠海格力电器股份有限公司 Cooking monitoring method and device, electronic equipment and storage medium


Similar Documents

Publication Publication Date Title
JP7181437B2 (en) A technique for identifying skin tones in images under uncontrolled lighting conditions
CN107886032B (en) Terminal device, smart phone, authentication method and system based on face recognition
CN111481049B (en) Cooking equipment control method and device, cooking equipment and storage medium
CN107862018B (en) Recommendation method and device for food cooking method
CN108416902B (en) Real-time object identification method and device based on difference identification
US10607372B2 (en) Cosmetic information providing system, cosmetic information providing apparatus, cosmetic information providing method, and program
US10318797B2 (en) Image processing apparatus and image processing method
CN109767261A (en) Products Show method, apparatus, computer equipment and storage medium
CN106713811B (en) Video call method and device
CN105117399B (en) Image searching method and device
CN111083537B (en) Cooking video generation method and device
CN110956217A (en) Food maturity recognition method and device and computer storage medium
CN109787977B (en) Product information processing method, device and equipment based on short video and storage medium
CN111739155A (en) Virtual character face pinching method and device and terminal equipment
CN110147854A (en) Clothes recognition methods, computer equipment and storage medium
CN113673401A (en) Cooking processing method and device, storage medium and intelligent equipment
WO2020011124A1 (en) Portrait image evaluation based on aesthetics
CN108764248B (en) Image feature point extraction method and device
CN112887615B (en) Shooting method and device
CN113297499A (en) Information recommendation system, method, computer equipment and storage medium
CN111248716B (en) Food cooking control method, image processing method and device and cooking equipment
CN108205664B (en) Food identification method and device, storage medium and computer equipment
CN109446915B (en) Dish information generation method and device and electronic equipment
CN111144141A (en) Translation method based on photographing function
CN116703503A (en) Intelligent recommendation method and system for campus canteen dishes

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination