CN112256230B - Menu interaction method and system and storage medium - Google Patents

Menu interaction method and system and storage medium

Info

Publication number
CN112256230B
Authority
CN
China
Prior art keywords
menu
voice
display
instruction
switching instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011109051.XA
Other languages
Chinese (zh)
Other versions
CN112256230A (en)
Inventor
胡子坚
孙峰
孙涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Midea Group Co Ltd
Guangdong Midea Kitchen Appliances Manufacturing Co Ltd
Original Assignee
Midea Group Co Ltd
Guangdong Midea Kitchen Appliances Manufacturing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Midea Group Co Ltd, Guangdong Midea Kitchen Appliances Manufacturing Co Ltd filed Critical Midea Group Co Ltd
Priority to CN202011109051.XA priority Critical patent/CN112256230B/en
Publication of CN112256230A publication Critical patent/CN112256230A/en
Application granted granted Critical
Publication of CN112256230B publication Critical patent/CN112256230B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/16 - Sound input; Sound output
    • G06F 3/167 - Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454 - Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/16 - Sound input; Sound output
    • G06F 3/162 - Interface to dedicated audio devices, e.g. audio drivers, interface to CODECs
    • G - PHYSICS
    • G10 - MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L - SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L 15/00 - Speech recognition
    • G10L 15/22 - Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G - PHYSICS
    • G10 - MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L - SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L 15/00 - Speech recognition
    • G10L 15/22 - Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G10L 2015/223 - Execution procedure of a spoken command

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Acoustics & Sound (AREA)
  • Computational Linguistics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a menu interaction method, a menu interaction system, and a storage medium. The menu interaction method is used in a menu interaction system that comprises a voice device and a display device. The menu interaction method comprises the following steps: obtaining a menu step of a menu according to a menu start instruction, the menu step comprising voice broadcast information and display information; sending the voice broadcast information to the voice device and the display information to the display device; and having the voice device broadcast the menu step according to the voice broadcast information and the display device display the menu step according to the display information. In this menu interaction method, the menu step is broadcast by the voice device and displayed by the display device, so that the audio and the visual content are presented on different devices, which helps the user understand the menu.

Description

Menu interaction method and system and storage medium
Technical Field
The invention relates to the field of intelligent cooking, in particular to a menu interaction method, a menu interaction system and a storage medium.
Background
In the related art, with the popularization of smart speakers, more and more households own smart speakers or other devices capable of voice interaction. Because of price and cost constraints, a typical smart speaker provides a voice interaction entry but has no screen, so in many scenarios, such as guiding a user through a menu, the user cannot fully understand the content from voice alone.
Disclosure of Invention
The embodiment of the invention provides a menu interaction method, a menu interaction system and a storage medium.
The menu interaction method of the embodiment of the invention is used in a menu interaction system comprising a voice device and a display device. The menu interaction method comprises the following steps: obtaining a menu step of a menu according to a menu start instruction, the menu step comprising voice broadcast information and display information; sending the voice broadcast information to the voice device and the display information to the display device; and having the voice device broadcast the menu step according to the voice broadcast information and the display device display the menu step according to the display information.
In this menu interaction method, the menu step is broadcast by the voice device and displayed by the display device, so that the audio and the visual content are presented on different devices, which helps the user understand the menu.
In some embodiments, obtaining the menu step according to the menu start instruction includes: the voice device generating the menu start instruction according to an acquired voice instruction; and/or the display device generating the menu start instruction according to an acquired input instruction.
In some embodiments, obtaining the menu step according to the menu start instruction includes: sending the menu start instruction to a cloud; and receiving the menu step synchronously sent by the cloud to the voice device and the display device.
In some embodiments, the menu interaction system comprises a cooking device. Obtaining the menu step according to the menu start instruction includes: sending the menu start instruction to the cooking device; and receiving the menu step synchronously sent by the cooking device to the voice device and the display device.
In some embodiments, the menu interaction method includes: acquiring a menu step switching instruction, and, according to the menu step switching instruction, synchronously sending the voice broadcast information of the switched menu step to the voice device and the display information to the display device.
In some embodiments, acquiring the menu step switching instruction includes: the voice device acquiring a first switching instruction and the display device acquiring a second switching instruction; and, when the interval between the first switching instruction and the second switching instruction is shorter than a preset duration and the first switching instruction is consistent with the second switching instruction, having the first switching instruction or the second switching instruction generate the menu step switching instruction.
In some embodiments, the menu interaction method includes: when the interval between the first switching instruction and the second switching instruction is shorter than the preset duration and the first switching instruction is inconsistent with the second switching instruction, ignoring the first switching instruction and the second switching instruction or issuing an instruction conflict prompt.
In some embodiments, the voice device includes a display unit and the display device includes a voice unit. The menu interaction method includes: sending the voice broadcast information to the display device and the display information to the voice device; the voice unit broadcasts the menu step according to the voice broadcast information, and the display unit displays the menu step according to the display information.
In some embodiments, the total number of voice devices and display devices is at least three, and sending the voice broadcast information to the voice device and the display information to the display device includes: determining, according to a selection instruction, the voice device that receives the voice broadcast information and the display device that receives the display information; and sending the voice broadcast information to the determined voice device and the display information to the determined display device.
The menu interaction system of the embodiment of the invention comprises a controller configured to implement the steps of the menu interaction method of any of the above embodiments.
In this menu interaction system, the menu step is broadcast by the voice device and displayed by the display device, so that the audio and the visual content are presented on different devices, which further helps the user understand the menu.
The computer-readable storage medium of the embodiment of the invention stores a computer program that, when executed by a processor, implements the steps of the menu interaction method of any of the above embodiments.
Additional aspects and advantages of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a schematic flow chart of a menu interaction method according to an embodiment of the present invention;
FIG. 2 is a block diagram of a menu interaction system according to an embodiment of the present invention;
FIG. 3 is an interaction schematic of a menu interaction system according to an embodiment of the present invention;
FIG. 4 is a schematic flow chart of a menu interaction method according to an embodiment of the present invention;
FIG. 5 is a block diagram of a menu interaction system according to an embodiment of the present invention;
FIGS. 6 to 11 are schematic flow charts of a menu interaction method according to embodiments of the present invention.
Detailed Description
Embodiments of the present invention are described in detail below and illustrated in the accompanying drawings, in which the same or similar reference numerals denote the same or similar elements or elements having the same or similar functions throughout. The embodiments described below with reference to the drawings are exemplary, are intended only to explain the present invention, and are not to be construed as limiting the present invention.
In the description of embodiments of the present invention, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more of the described features. In the description of the embodiments of the present invention, the meaning of "plurality" is two or more, unless explicitly defined otherwise.
In describing embodiments of the present invention, it should be noted that, unless otherwise expressly specified and limited, the terms "mounted," "connected," and "coupled" are to be construed broadly: a connection may be fixed, detachable, or integral; it may be mechanical or electrical, or the elements may communicate with each other; it may be direct or indirect through an intermediate medium, and it may be an internal communication between two elements or an interaction between two elements. The specific meanings of the above terms in the embodiments of the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
Referring to fig. 1 and 2, a menu interaction method provided in an embodiment of the present invention is used in a menu interaction system 100, where the menu interaction system 100 includes a voice device 10 and a display device 20, and the menu interaction method includes:
Step 01: obtaining a menu step of a menu according to a menu start instruction, the menu step comprising voice broadcast information and display information;
Step 02: causing the voice broadcast information to be transmitted to the voice device 10 and the display information to be transmitted to the display device 20;
Step 03: the voice device 10 broadcasts the menu step according to the voice broadcast information, and the display device 20 displays the menu step according to the display information.
The menu interaction method of the embodiment of the present invention may be implemented by the menu interaction system 100 of the embodiment of the present invention. Specifically, referring to FIG. 2, the menu interaction system 100 includes a controller 30. Steps 01, 02, and 03 may all be implemented by the controller 30, that is, the controller 30 is configured to: obtain a menu step of a menu according to a menu start instruction, the menu step comprising voice broadcast information and display information; cause the voice broadcast information to be transmitted to the voice device 10 and the display information to be transmitted to the display device 20; and have the voice device 10 broadcast the menu step according to the voice broadcast information and the display device 20 display the menu step according to the display information.
In the menu interaction method and system 100 of the embodiments of the present invention, the menu step is broadcast by the voice device 10 and displayed by the display device 20, so that the audio and the visual content are presented on different devices, which helps the user understand the menu.
It will be appreciated that when a user wants to use voice broadcast information and display information simultaneously while cooking, the user can obtain both through the menu interaction system 100. Specifically, the user inputs a menu start instruction, and the controller 30 obtains a menu step according to the menu start instruction, the menu step including voice broadcast information and display information. In some embodiments, the voice broadcast information and the display information may be associated with each other by menu step; a menu step may include the cooking ingredients and their weights, the cooking operations, the estimated cooking duration, cooking knowledge, cooking prompts, and the like. The voice broadcast information presents the menu step as played speech, while the display information presents the menu step as pictures, video, and text. The controller 30 causes the voice broadcast information to be sent to the voice device 10 and the display information to be sent to the display device 20. The voice device 10 broadcasts the menu step according to the voice broadcast information, and the display device 20 displays the menu step according to the display information. The voice device 10 may be any device that can convert an electrical signal into an acoustic signal, such as a Bluetooth speaker, a home speaker, or a loudspeaker. The display device 20 may be an electronic device with a display screen capable of presenting information, such as a smartphone, a tablet computer, a television, or a notebook computer. In the menu interaction system 100 of the embodiment of the invention, the voice device 10 and the display device 20 are separate devices, so the user can obtain the voice broadcast information and the display information on different electronic devices, which makes it easier to follow the menu steps in a cooking scene. Specifically, existing voice devices and display-capable electronic devices that are expected to join the menu interaction method and system can be bound in advance, and the voice device 10 and the display device 20 then carry out menu interaction in a linked, synchronized manner. In some embodiments, the voice device 10 and the display device 20 may each be portable electronic devices, making it easy for the user to place them wherever needed.
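The core flow of steps 01 to 03 can be pictured with a short Python sketch. This is only an illustration of the described behaviour, not the patented implementation; the MenuStep, VoiceDevice, DisplayDevice and Controller names, and the in-memory recipe dictionary, are assumptions introduced here for the example.

    from dataclasses import dataclass

    @dataclass
    class MenuStep:
        voice_info: str    # content to be broadcast as speech
        display_info: str  # content to be shown on a screen

    class VoiceDevice:            # stand-in for voice device 10
        def broadcast(self, voice_info: str) -> None:
            print(f"[voice device] broadcasting: {voice_info}")

    class DisplayDevice:          # stand-in for display device 20
        def show(self, display_info: str) -> None:
            print(f"[display device] showing: {display_info}")

    class Controller:             # stand-in for controller 30
        def __init__(self, recipes, voice_dev, display_dev):
            self.recipes = recipes            # hypothetical local menu source
            self.voice_dev = voice_dev
            self.display_dev = display_dev

        def start(self, menu_name: str, step_index: int = 0) -> None:
            step = self.recipes[menu_name][step_index]   # step 01: obtain the menu step
            self.voice_dev.broadcast(step.voice_info)    # steps 02/03: voice part
            self.display_dev.show(step.display_info)     # steps 02/03: display part

    if __name__ == "__main__":
        recipes = {"chiffon cake": [MenuStep("Separate four eggs.",
                                             "Picture: separated yolks and whites")]}
        Controller(recipes, VoiceDevice(), DisplayDevice()).start("chiffon cake")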
It will be appreciated that, in some embodiments, the overall functionality of the controller 30 may be implemented by the controller or processor of the voice device 10 itself, or a control panel, or a computer board.
In some embodiments, the overall functionality of the controller 30 may be implemented by the controller or processor of the display device 20 itself, or a control panel, or a computer board.
In some embodiments, some of the functions of the controller 30 may be implemented by the controller or processor, control panel, or computer board of the voice device 10 itself, and the remaining functions of the controller 30 by the controller or processor, control panel, or computer board of the display device 20 itself.
In some embodiments, all of the functions of the controller 30 may be implemented by a separately manufactured control box or control terminal that includes a controller, a processor, a control board, or a computer board. The embodiments of the present invention are not particularly limited in this respect.
In some embodiments, obtaining the menu step according to the menu start instruction further comprises:
the voice device 10 generating the menu start instruction according to the acquired voice instruction; and/or
the display device 20 generating the menu start instruction according to the acquired input instruction.
In some embodiments, the steps described above may be implemented by the controller 30, that is, the controller 30 is configured to: have the voice device 10 generate the menu start instruction according to the acquired voice instruction; and/or have the display device 20 generate the menu start instruction according to the acquired input instruction.
In this way, the menu start instruction can be issued in two different ways to obtain the menu steps, which gives the user a better experience.
Specifically, in one embodiment, the menu start instruction may be input through the voice device 10: the voice device 10 generates the menu start instruction according to the acquired voice instruction, and may include an acoustic-electric element that converts an acoustic signal into an electrical signal to produce the voice instruction. In another embodiment, the menu start instruction may be input through the display device 20: the display device 20 generates the menu start instruction according to the acquired input instruction, and may include a touch display screen and/or keys, with which the user can enter information such as a menu name to produce the input instruction. In yet another embodiment, the menu start instruction may be generated using both the voice device 10 and the display device 20. The present invention is not particularly limited in this respect.
In one example, the recipe the user wants to cook is a chiffon cake. When the user wants to generate the menu start instruction through the voice device 10, the user can say: "please help me search for a recipe for chiffon cake", and the voice device 10 generates the menu start instruction according to the acquired voice instruction. When the user wants to generate the menu start instruction through the display device 20, the input instruction may be, for example, "how to make a chiffon cake", and the display device 20 generates the menu start instruction according to the acquired input instruction.
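As a rough illustration of these two entry points, the sketch below derives a start instruction either from a spoken sentence or from typed input; the regular expression and the dictionary shape of the instruction are assumptions made for the example, not part of the disclosure.

    import re

    def start_instruction_from_voice(utterance: str):
        """Voice device 10: turn a spoken request into a menu start instruction (illustrative)."""
        match = re.search(r"recipe for (.+)$", utterance.strip().rstrip("."))
        return {"type": "start", "menu": match.group(1)} if match else None

    def start_instruction_from_input(typed_text: str):
        """Display device 20: turn touch/keyboard input into a menu start instruction (illustrative)."""
        return {"type": "start", "menu": typed_text.strip()}

    print(start_instruction_from_voice("please help me search for a recipe for chiffon cake"))
    print(start_instruction_from_input("chiffon cake"))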
Referring to fig. 3 and fig. 4 together, in some embodiments, obtaining the menu step according to the menu start instruction includes:
Step 013: causing the menu start instruction to be sent to the cloud 40;
Step 014: receiving the menu step synchronously sent by the cloud 40 to the voice device 10 and the display device 20.
In some embodiments, both step 013 and step 014 may be implemented by the controller 30, that is, the controller 30 is configured to: cause the menu start instruction to be sent to the cloud 40; and receive the menu step synchronously sent by the cloud 40 to the voice device 10 and the display device 20.
In this way, the voice device 10 and the display device 20 can obtain the information of the menu step through the cloud 40 and carry out menu interaction in a linked, synchronized manner.
Specifically, the controller 30 may send the menu start instruction to the cloud 40. The cloud 40 may include a cloud server, which is highly distributed and highly virtualized, can be configured on demand, and can be adjusted flexibly. The cloud 40 can host smart menu content, including voice broadcast information and display information such as cooking steps, cooking pictures, a cooking knowledge base, and prompts. In some embodiments, the voice device 10 and the display device 20 may be bound in advance, and the cloud 40 sends the menu steps synchronously to the pre-bound voice device 10 and display device 20.
In the embodiment shown in fig. 4, the voice device 10 generates the menu start instruction according to the acquired voice instruction and sends it to the cloud 40. In other embodiments, the display device 20 generates the menu start instruction according to the acquired input instruction and sends it to the cloud 40.
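A minimal sketch of this cloud flow follows, under the assumption that pre-bound devices are represented by simple callbacks and that the cloud keeps menus in a dictionary; both are illustrative choices, not part of the disclosure.

    from dataclasses import dataclass

    @dataclass
    class MenuStep:
        voice_info: str
        display_info: str

    class Cloud:  # stand-in for cloud 40
        def __init__(self, recipes):
            self.recipes = recipes   # deployed menu content
            self.bound = []          # pre-bound voice device 10 and display device 20

        def bind(self, deliver):
            self.bound.append(deliver)

        def handle_start(self, menu_name: str):
            step = self.recipes[menu_name][0]
            for deliver in self.bound:   # synchronous delivery to every bound device
                deliver(step)

    cloud = Cloud({"chiffon cake": [MenuStep("Preheat the oven to 150 degrees.",
                                             "Picture: oven dial set to 150 degrees")]})
    cloud.bind(lambda s: print("[voice device]  ", s.voice_info))
    cloud.bind(lambda s: print("[display device]", s.display_info))
    cloud.handle_start("chiffon cake")   # the controller 30 forwards the start instruction here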
Referring to fig. 5 and fig. 6 together, in some embodiments, the menu interaction system 100 includes a cooking device 50, and obtaining the menu step according to the menu start instruction includes:
Step 015: causing the menu start instruction to be sent to the cooking device 50;
Step 016: receiving the menu step synchronously sent by the cooking device 50 to the voice device 10 and the display device 20.
In some embodiments, both step 015 and step 016 may be implemented by the controller 30, that is, the controller 30 is configured to: cause the menu start instruction to be sent to the cooking device 50; and receive the menu step synchronously sent by the cooking device 50 to the voice device 10 and the display device 20.
In this way, menu steps synchronously sent by the cooking device 50 to the voice device 10 and the display device 20 can be received, so that the menu steps are adapted to the cooking device 50.
Specifically, the menu interaction system 100 includes a cooking device 50, which may be an appliance with a cooking function such as an induction cooker, a gas range, an electric rice cooker, a pressure cooker, an air fryer, an oven, or a microwave oven. The controller 30 may send the menu start instruction to the cooking device 50. The cooking device 50 may search its own menu database according to the menu name contained in the menu start instruction, or it may send a menu request to the cloud 40 so that the cloud 40 returns the matching menu step; the cooking device 50 then sends the obtained menu step synchronously to the voice device 10 and the display device 20.
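The cooking-device flow described above can be sketched in the same spirit; the local dictionary and the cloud_lookup callable are assumptions standing in for the cooking device's own menu database and a request to the cloud 40.

    class CookingDevice:  # stand-in for cooking device 50
        def __init__(self, local_db, cloud_lookup):
            self.local_db = local_db          # the device's own menu database
            self.cloud_lookup = cloud_lookup  # fallback request to the cloud 40

        def handle_start(self, menu_name: str):
            steps = self.local_db.get(menu_name)
            if steps is None:
                steps = self.cloud_lookup(menu_name)
            return steps   # then sent synchronously to the voice device 10 and display device 20

    device = CookingDevice({"steamed rice": ["Rinse the rice", "Add water", "Start the cooker"]},
                           cloud_lookup=lambda name: [f"Cloud-provided steps for {name}"])
    print(device.handle_start("steamed rice"))    # found locally
    print(device.handle_start("chiffon cake"))    # falls back to the cloud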
In some embodiments, some or all of the functions of the controller 30 may be implemented by the controller or processor of the cooking device 50 itself, or a control panel, or a computer board.
Referring to fig. 3 and fig. 7 together, in some embodiments, the menu interaction method includes:
Step 04: acquiring a menu step switching instruction;
Step 05: the voice broadcast information of the switched menu step is synchronously transmitted to the voice device 10 and the display information is synchronously transmitted to the display device 20 according to the menu step switching instruction.
In some embodiments, both step 04 and step 05 may be implemented by the controller 30, that is, the controller 30 is configured to: acquire a menu step switching instruction, and, according to the menu step switching instruction, synchronously send the voice broadcast information of the switched menu step to the voice device 10 and the display information to the display device 20.
In this way, the user can switch menu steps conveniently and quickly according to cooking needs and obtain the corresponding voice broadcast information and display information.
Specifically, the menu step switching instruction may be an instruction to switch to the next step, to switch to the previous step, and so on. The menu step switching instruction may be a voice instruction generated by the user through the voice device 10; for example, the user can say "please help me switch to the next step". The menu step switching instruction may also be an input instruction generated by the user through the display device 20; for example, the display device 20 includes an operation interface, and the user can tap or swipe a "next step" button on the operation interface to switch menu steps. The controller 30 acquires the menu step switching instruction and, according to it, synchronously sends the voice broadcast information of the switched menu step to the voice device 10 and the display information to the display device 20, thereby realizing menu interaction.
Referring to fig. 3 and fig. 8 together, in some embodiments, obtaining a menu step switching instruction includes:
Step 042: the voice device 10 acquires a first switching instruction, and the display device 20 acquires a second switching instruction;
Step 044: when the interval between the first switching instruction and the second switching instruction is shorter than a preset duration and the first switching instruction is consistent with the second switching instruction, causing the first switching instruction or the second switching instruction to generate the menu step switching instruction.
In some embodiments, both step 042 and step 044 may be implemented by the controller 30, that is, the controller 30 is configured to: have the voice device 10 acquire a first switching instruction and the display device 20 acquire a second switching instruction; and, when the interval between the first switching instruction and the second switching instruction is shorter than the preset duration and the two instructions are consistent, cause either of them to generate the menu step switching instruction.
In this way, inconvenience caused by conflicting switching instructions can be avoided.
Specifically, since instructions can be entered through either the voice device 10 or the display device 20, the user may give a switching instruction through the voice device 10 at one moment and through the display device 20 at another. To avoid conflicts, when the interval between the first switching instruction acquired by the voice device 10 and the second switching instruction acquired by the display device 20 is shorter than the preset duration and the two instructions are consistent, either of them is used to generate the menu step switching instruction. For example, the preset duration may be 2 seconds: if, within 2 seconds, the first switching instruction acquired by the voice device 10 is "next step" and the second switching instruction acquired by the display device 20 is also "next step", the menu step switching instruction can be generated from either instruction, avoiding the inconvenience of handling them as two separate commands. After the menu step switching instruction is confirmed, the voice broadcast information of the corresponding step is sent to the voice device 10 and the corresponding display information is synchronously sent to the display device 20.
Referring to fig. 9, in some embodiments, obtaining a menu step switching instruction includes:
Step 046: when the interval between the first switching instruction and the second switching instruction is shorter than the preset duration and the first switching instruction is inconsistent with the second switching instruction, ignoring the first switching instruction and the second switching instruction or issuing an instruction conflict prompt.
In some embodiments, step 046 may be implemented by the controller 30, that is, the controller 30 is configured to: when the interval between the first switching instruction and the second switching instruction is shorter than the preset duration and the first switching instruction is inconsistent with the second switching instruction, ignore the first switching instruction and the second switching instruction or issue an instruction conflict prompt.
In this way, conflicts between the first switching instruction and the second switching instruction can be resolved.
Specifically, when the interval between the first and second switching instructions is shorter than the preset duration and the two instructions are inconsistent, both can be ignored and no step switching is performed. Alternatively, an instruction conflict prompt can be issued to remind the user to re-confirm the first and second switching instructions, ensuring that the menu step switching instruction is acquired accurately. The prompt may be given by the voice device 10 as speech, or by the display device 20 as text or graphics.
In one example, the preset duration may be 2 seconds. If, within 2 seconds, the first switching instruction acquired by the voice device 10 is "next step" while the second switching instruction acquired by the display device 20 is "previous step", the two instructions are inconsistent, so both may be ignored: the voice device 10 continues to broadcast the voice broadcast information of the current menu step and the display device 20 continues to display the display information of the current menu step. Alternatively, an instruction conflict prompt may be issued by the voice device 10 and/or the display device 20.
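The two switching cases (consistent within the preset duration, inconsistent within the preset duration) can be captured in a small function. The 2-second preset, the timestamps, and the behaviour when the two instructions are far apart in time are assumptions introduced for the sake of the example, not a definitive implementation.

    PRESET_SECONDS = 2.0   # example value from the description above

    def resolve_switch(first_cmd: str, first_time: float,
                       second_cmd: str, second_time: float):
        """Combine the switching instruction from the voice device 10 with the one
        from the display device 20 into a single menu step switching instruction.
        Returns the instruction to apply, or None when the pair is ignored."""
        if abs(first_time - second_time) >= PRESET_SECONDS:
            # Far apart in time: treat the later instruction as an ordinary, separate command
            # (an assumption; the patent only specifies the within-preset-duration cases).
            return second_cmd if second_time >= first_time else first_cmd
        if first_cmd == second_cmd:
            return first_cmd   # consistent: either instruction generates the switch
        print("instruction conflict: please confirm the intended step")  # or silently ignore both
        return None

    print(resolve_switch("next step", 10.0, "next step", 11.2))       # -> next step
    print(resolve_switch("next step", 10.0, "previous step", 11.2))   # conflict -> None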
Referring to fig. 10, in some embodiments, the voice device 10 includes a display unit and the display device 20 includes a voice unit. The menu interaction method comprises the following steps:
Step 06: causing the voice broadcast information to be transmitted to the display device 20 and the display information to be transmitted to the voice device 10;
Step 07: the voice unit broadcasts the menu step according to the voice broadcast information, and the display unit displays the menu step according to the display information.
In certain embodiments, both steps 06 and 07 may be implemented by the controller 30, that is, the controller 30 is configured to: causing the voice broadcast information to be transmitted to the display device 20 and the display information to be transmitted to the voice device 10; the voice unit broadcasts the menu step according to the voice broadcast information, and the display unit displays the menu step according to the display information.
In one example, the voice device 10 includes a display unit, which may include a display screen; for instance, the voice device 10 may be a Bluetooth speaker with a display screen. The display device 20 includes a voice unit, which may include a speaker; for instance, the display device 20 may be a smartphone with a speaker and a display screen. The controller 30 can send the voice broadcast information to the display device 20 and the display information to the voice device 10; the voice unit then broadcasts the menu step according to the voice broadcast information and the display unit displays the menu step according to the display information, so the user can obtain the menu step on different devices, which is more convenient in a cooking scene.
In this embodiment, the user may freely choose which device plays the voice broadcast information and which shows the display information. For example, since the voice device 10 includes a display unit and the display device 20 includes a voice unit, the user can select the voice device 10 to play the voice broadcast information while the display unit of the voice device 10 and the display device 20 both show the display information. In this way, the user can obtain the menu step more conveniently in a cooking scene, and different preferences about how the menu step is obtained can be satisfied.
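A sketch of this routing follows, with device and unit names chosen only for illustration; the tuple format is an assumption, not part of the disclosure.

    def route_step(voice_info: str, display_info: str,
                   voice_target: str, display_target: str):
        """Route the two halves of a menu step to user-chosen outputs (illustrative)."""
        return [(voice_target, "speak", voice_info),
                (display_target, "show", display_info)]

    # Example: the display device's voice unit plays the audio while the
    # voice device's display unit shows the picture, as in this embodiment.
    for target, action, payload in route_step(
            "Fold in the flour gently.", "Picture: folding motion",
            voice_target="display_device.voice_unit",
            display_target="voice_device.display_unit"):
        print(f"{target} -> {action}: {payload}")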
Referring to fig. 11, in some embodiments, the total number of voice devices 10 and display devices 20 is at least three, and causing the voice broadcast information to be transmitted to the voice device 10 and the display information to be transmitted to the display device 20 includes:
Step 022: determining, according to a selection instruction, the voice device 10 for receiving the voice broadcast information and the display device 20 for receiving the display information;
Step 024: causing the voice broadcast information to be transmitted to the determined voice device 10 and the display information to be transmitted to the determined display device 20.
In certain embodiments, both step 022 and step 024 may be implemented by the controller 30, that is, the controller 30 is configured to: according to the selection instruction, determining a voice device 10 for receiving voice broadcasting information and a display device 20 for receiving display information; the voice broadcast information is caused to be transmitted to the determined voice device 10 and the display information is caused to be transmitted to the determined display device 20.
Thus, the related information of the menu steps can be flexibly acquired on various devices.
In some embodiments, the total number of voice devices 10 and display devices 20 is at least three. For example, the voice device 10 may be a Bluetooth speaker, and the display devices 20 may be a smartphone and a tablet computer. According to the selection instruction, the Bluetooth speaker is determined as the voice device 10 that receives the voice broadcast information and the smartphone as the display device 20 that receives the display information; the cloud 40 or the cooking device 50 then sends the voice broadcast information to the Bluetooth speaker and, synchronously, the display information to the smartphone.
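A minimal sketch of dispatching to the selected devices when at least three are bound; the device names, the selection dictionary, and the callback dictionary are assumptions made for the example.

    def dispatch(selection, voice_info: str, display_info: str, devices):
        """Send the voice part and the display part to the devices named in the selection."""
        devices[selection["voice"]]("speak", voice_info)
        devices[selection["display"]]("show", display_info)

    devices = {
        "bluetooth_speaker": lambda action, payload: print(f"[bluetooth speaker] {action}: {payload}"),
        "smartphone":        lambda action, payload: print(f"[smartphone] {action}: {payload}"),
        "tablet":            lambda action, payload: print(f"[tablet] {action}: {payload}"),
    }
    dispatch({"voice": "bluetooth_speaker", "display": "smartphone"},
             "Bake for sixty minutes.", "Timer graphic: 60:00", devices)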
It should be noted that the voice device 10 and the display device 20 may be bound in advance or bound dynamically, which is not limited here. Dynamic binding means that other voice devices 10 or display devices 20 can join the menu interaction method and system at any point during the menu steps and participate in the menu interaction.
It is noted that the specific values mentioned above are only intended to illustrate the implementation of the present invention in detail and are not to be construed as limiting the present invention. In other examples or embodiments, other values may be selected according to the present invention, without specific limitation.
The embodiment of the invention also provides a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, the steps of any of the menu interaction methods described above are implemented.
For example, when the program is executed by a processor, the steps of the following menu interaction method are implemented:
Step 01: obtaining a menu step of a menu according to a menu start instruction, the menu step comprising voice broadcast information and display information;
Step 02: causing the voice broadcast information to be transmitted to the voice device 10 and the display information to be transmitted to the display device 20;
Step 03: the voice device 10 broadcasts the menu step according to the voice broadcast information, and the display device 20 displays the menu step according to the display information.
The computer readable storage medium may be disposed in the menu interaction system 100, or may be disposed in a cloud server, where the menu interaction system 100 may communicate with the cloud server to obtain a corresponding program.
It is understood that the computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form, among others. The computer-readable storage medium may include: any entity or device capable of carrying computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a computer memory, a read-only memory (ROM), a random access memory (RAM), a software distribution medium, and so on.
The controller of the menu interaction system 100 is a single-chip microcontroller that integrates a processor, a memory, a communication module, and the like. The processor may refer to the processor included in the controller. The processor may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
In the description of the present specification, reference to the terms "one embodiment," "some embodiments," "illustrative embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Any process or method description in a flow chart, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present invention includes further implementations in which functions may be executed out of the order shown or discussed, including substantially concurrently or in the reverse order, depending on the functionality involved, as would be understood by those skilled in the art to which the present invention pertains.
The logic and/or steps represented in the flowcharts or otherwise described herein, for example an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, a system including a processing module, or another system that can fetch the instructions from the instruction execution system, apparatus, or device and execute them. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). In addition, the computer-readable medium may even be paper or another suitable medium on which the program is printed, as the program can be captured electronically, for instance by optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It is to be understood that portions of embodiments of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the various steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, they may be implemented using any one of, or a combination of, the following techniques well known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits having suitable combinational logic gates, programmable gate arrays (PGA), field-programmable gate arrays (FPGA), and the like.
Those of ordinary skill in the art will appreciate that all or a portion of the steps carried out in the method of the above-described embodiments may be implemented by a program to instruct related hardware, where the program may be stored in a computer readable storage medium, and the program, when executed, includes one or a combination of the steps of the method embodiments.
Furthermore, functional units in various embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module. The integrated modules may be implemented in hardware or in software functional modules. The integrated modules may also be stored in a computer readable storage medium if implemented in the form of software functional modules and sold or used as a stand-alone product.
The above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, or the like.
While embodiments of the present invention have been shown and described above, it will be understood that the above embodiments are illustrative and are not to be construed as limiting the invention; variations, modifications, substitutions, and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the invention.

Claims (9)

1. A menu interaction method for a menu interaction system, the menu interaction system comprising:
a voice device, and
a display device provided with a display screen,
wherein the menu interaction method comprises:
obtaining a menu step of a menu according to a menu start instruction, wherein the menu step comprises voice broadcast information and display information;
causing the voice broadcast information to be sent to the voice device and the display information to be sent to the display device; and
the voice device broadcasting the menu step according to the voice broadcast information, and the display device displaying the menu step according to the display information;
wherein the menu interaction method further comprises:
acquiring a menu step switching instruction; and
according to the menu step switching instruction, synchronously sending the voice broadcast information of the switched menu step to the voice device and the display information to the display device;
wherein acquiring the menu step switching instruction comprises:
the voice device acquiring a first switching instruction, and the display device acquiring a second switching instruction; and
when an interval between the first switching instruction and the second switching instruction is shorter than a preset duration and the first switching instruction is consistent with the second switching instruction, causing the first switching instruction or the second switching instruction to generate the menu step switching instruction.
2. The menu interaction method according to claim 1, wherein obtaining the menu step of the menu according to the menu start instruction comprises:
the voice device generating the menu start instruction according to an acquired voice instruction; and/or
the display device generating the menu start instruction according to an acquired input instruction.
3. The menu interaction method according to claim 1, wherein obtaining the menu step of the menu according to the menu start instruction comprises:
causing the menu start instruction to be sent to a cloud; and
receiving the menu step synchronously sent by the cloud to the voice device and the display device.
4. The menu interaction method according to claim 1, wherein the menu interaction system comprises:
a cooking device,
and obtaining the menu step of the menu according to the menu start instruction comprises:
causing the menu start instruction to be sent to the cooking device; and
receiving the menu step synchronously sent by the cooking device to the voice device and the display device.
5. The menu interaction method according to claim 1, wherein the menu interaction method comprises:
when an interval between the first switching instruction and the second switching instruction is shorter than the preset duration and the first switching instruction is inconsistent with the second switching instruction, ignoring the first switching instruction and the second switching instruction or issuing an instruction conflict prompt.
6. The menu interaction method according to claim 1, wherein the voice device comprises a display unit and the display device comprises a voice unit;
and the menu interaction method comprises:
causing the voice broadcast information to be sent to the display device and the display information to be sent to the voice device; and
the voice unit broadcasting the menu step according to the voice broadcast information, and the display unit displaying the menu step according to the display information.
7. The menu interaction method according to claim 1, wherein the total number of the voice device and the display device is at least three,
and causing the voice broadcast information to be sent to the voice device and the display information to be sent to the display device comprises:
determining, according to a selection instruction, the voice device that receives the voice broadcast information and the display device that receives the display information; and
causing the voice broadcast information to be sent to the determined voice device and the display information to be sent to the determined display device.
8. A menu interaction system comprising a controller for implementing the steps of the menu interaction method of any of claims 1-7.
9. A computer-readable storage medium having stored thereon a computer program, characterized in that the program, when executed by a processor, implements the steps of the menu interaction method of any of claims 1-7.
CN202011109051.XA 2020-10-16 2020-10-16 Menu interaction method and system and storage medium Active CN112256230B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011109051.XA CN112256230B (en) 2020-10-16 2020-10-16 Menu interaction method and system and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011109051.XA CN112256230B (en) 2020-10-16 2020-10-16 Menu interaction method and system and storage medium

Publications (2)

Publication Number Publication Date
CN112256230A CN112256230A (en) 2021-01-22
CN112256230B true CN112256230B (en) 2024-07-05

Family

ID=74244013

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011109051.XA Active CN112256230B (en) 2020-10-16 2020-10-16 Menu interaction method and system and storage medium

Country Status (1)

Country Link
CN (1) CN112256230B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114343437B (en) * 2022-01-14 2023-08-04 深圳市伊欧乐科技有限公司 Auxiliary cooking system and method based on voice recognition

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN207778540U (en) * 2018-01-12 2018-08-28 广东万家乐厨房科技有限公司 Intelligent range hood and smart kitchen systems
CN109637531A (en) * 2018-12-06 2019-04-16 珠海格力电器股份有限公司 Voice control method and device, storage medium and air conditioner

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107204185B (en) * 2017-05-03 2021-05-25 深圳车盒子科技有限公司 Vehicle-mounted voice interaction method and system and computer readable storage medium
CN109308897B (en) * 2018-08-27 2022-04-26 广东美的制冷设备有限公司 Voice control method, module, household appliance, system and computer storage medium
CN109067997A (en) * 2018-08-31 2018-12-21 上海与德通讯技术有限公司 The method and mobile terminal of voice guidance culinary art
CN110338681A (en) * 2019-08-15 2019-10-18 广东工业大学 A kind of intelligence chopping block
CN111524516A (en) * 2020-04-30 2020-08-11 青岛海信网络科技股份有限公司 Control method based on voice interaction, server and display device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN207778540U (en) * 2018-01-12 2018-08-28 广东万家乐厨房科技有限公司 Intelligent range hood and smart kitchen systems
CN109637531A (en) * 2018-12-06 2019-04-16 珠海格力电器股份有限公司 Voice control method and device, storage medium and air conditioner

Also Published As

Publication number Publication date
CN112256230A (en) 2021-01-22

Similar Documents

Publication Publication Date Title
KR102674808B1 (en) Audio apparatus and control method thereof
KR101829036B1 (en) Method and apparatus for controlling smart household appliance and terminal
JP2021073567A (en) Voice control method, terminal device, cloud server, and system
US10453331B2 (en) Device control method and apparatus
WO2017193540A1 (en) Method, device and system for playing overlay comment
US8063884B2 (en) Information processing apparatus, display control method, and program for controlling a display of the information processing apparatus based on an input received from a remote controller
RU2672173C2 (en) Video processing method and device
JP2016532986A (en) Upgrade method, apparatus, device, program, and recording medium
JP2017531891A (en) Remote assistance method, client, program, and recording medium
US11671556B2 (en) Method of performing video call and display device
CN113590067A (en) Screen projection control method, system, device and computer readable storage medium
CN112256230B (en) Menu interaction method and system and storage medium
CN106453032B (en) Information-pushing method and device, system
JP2011166691A (en) Electronic device
CN110719530A (en) Video playing method and device, electronic equipment and storage medium
CN104881304A (en) Resource downloading method and device
CN111343495A (en) Display device and method for playing music in terminal
WO2018010338A1 (en) Display method and device
CN104243607A (en) Method and device for acquiring equipment information
CN112055234A (en) Television equipment screen projection processing method, equipment and storage medium
KR101638957B1 (en) Display device and method for program reservating
US20110285862A1 (en) Method and apparatus for providing web camera service in a portable terminal
EP4380163A1 (en) Livestreaming method and apparatus, storage medium, and electronic device
CN101448110A (en) Television with built-in specification
CN112449235A (en) Pairing method, pairing device and television terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant