CN106648522A - Mobile terminal human-computer interaction method and human-computer interaction module - Google Patents

Mobile terminal human-computer interaction method and human-computer interaction module Download PDF

Info

Publication number
CN106648522A
CN106648522A (application CN201610864721.6A)
Authority
CN
China
Prior art keywords
food
making step
food making
user
mobile terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610864721.6A
Other languages
Chinese (zh)
Inventor
江丹 (Jiang Dan)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
LeTV Holding Beijing Co Ltd
LeTV Mobile Intelligent Information Technology Beijing Co Ltd
Original Assignee
LeTV Holding Beijing Co Ltd
LeTV Mobile Intelligent Information Technology Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LeTV Holding Beijing Co Ltd, LeTV Mobile Intelligent Information Technology Beijing Co Ltd filed Critical LeTV Holding Beijing Co Ltd
Priority to CN201610864721.6A priority Critical patent/CN106648522A/en
Publication of CN106648522A publication Critical patent/CN106648522A/en
Pending legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/20Natural language analysis
    • G06F40/205Parsing
    • G06F40/211Syntactic parsing, e.g. based on context-free grammar [CFG] or unification grammars

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Artificial Intelligence (AREA)
  • Computational Linguistics (AREA)
  • Telephone Function (AREA)

Abstract

The invention relates to the field of electronic equipment and discloses a mobile terminal human-computer interaction method and a human-computer interaction module. The method comprises the following steps: performing segmentation recognition on an imported food making step description, and dividing the food making steps according to the segmentation recognition result; broadcasting the food making steps, one food making step at a time; and when a trigger instruction in a preset manner is received from the user, or when a preset time length is reached, broadcasting the next food making step. During food making, the user does not need to browse or check the mobile terminal: the recipe is broadcast according to the user's food making progress, so the next food making step can be obtained conveniently. The method prevents the user from touching the mobile terminal while making food and soiling it with the water, oil, or seasonings on his or her hands, and also spares the user from repeatedly washing both hands to keep the terminal free of grease. The user's convenience of operation is thereby greatly improved, and the food making process becomes more hygienic and convenient.

Description

Mobile terminal human-computer interaction method and human-computer interaction module
Technical field
The present invention relates to the field of electronic equipment, and more particularly to a human-computer interaction technique for broadcasting food making steps.
Background technology
With the improvement of people's quality of life, more and more white-collar workers have begun to pay attention to the quality of life while pursuing their careers. People are no longer satisfied with ready-made restaurant food and prefer to make fresh, delicious food themselves, which adds interest to life while maintaining a healthy diet.
In recent years, many recipe-sharing applications (APPs) have been released for mobile phones, such as "Go to the Kitchen" and "Menu Net". Users share their own food making steps, pictures of the making process, and successful experience on these platforms. Other users can refer to these food making step descriptions (recipes) and, following the steps, make the corresponding food step by step, which greatly improves beginners' success rate in food making.
However, in the course of making the present invention, the inventor found that because food making processes are mostly rather complicated, a beginner cannot quickly memorize all the making steps. During food making, after completing one step the user usually has to browse the mobile phone to check the next step. The phone is thus easily soiled by the grease, batter, water, and so on involved in food making; and to avoid dirtying the phone, the user has to wash both hands repeatedly before operating it, which is very inconvenient.
Summary of the invention
The purpose of the embodiments of the present invention is to provide a mobile terminal human-computer interaction method and a human-computer interaction module, so that during food making the user can conveniently obtain the next food making step without browsing the mobile terminal.
To solve the above technical problem, an embodiment of the present invention provides a mobile terminal human-computer interaction method, comprising:
importing a food making step description, performing segmentation recognition on the imported food making step description, and dividing the food making steps according to the segmentation recognition result;
broadcasting the food making steps, one food making step at a time; and
when a trigger instruction in a preset manner is received from the user, or when a preset time length is reached, broadcasting the next food making step.
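The segmentation step above can be sketched as follows. This is an illustrative sketch only: the patent leaves the text recognition technique to the prior art, and the "Step N" pattern and punctuation fallback below are assumptions.

```python
import re

def split_steps(recipe_text):
    """Divide an imported recipe into individual food making steps.

    Illustrative sketch: first try to split on explicit step numbers
    ("Step 1:", "Step 2:", ...); if none are found, fall back to
    splitting on sentence-ending punctuation marks.
    """
    if re.search(r'Step\s*\d+[.:]', recipe_text):
        parts = re.split(r'Step\s*\d+[.:]\s*', recipe_text)
    else:  # no step numbers: segment by punctuation instead
        parts = re.split(r'[.!?;]', recipe_text)
    return [p.strip() for p in parts if p.strip()]
```

Each element of the returned list is then broadcast one at a time.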
An embodiment of the present invention further provides a mobile terminal human-computer interaction module, comprising:
a segmentation recognition module, configured to perform segmentation recognition on the imported food making step description and divide the food making steps according to the segmentation recognition result;
a text broadcasting module, configured to broadcast the food making steps divided by the segmentation recognition module, one food making step at a time; and
a control module, configured to instruct the text broadcasting module to broadcast the next food making step when a trigger instruction in a preset manner is received from the user or when a preset time length is reached.
Compared with the prior art, the embodiments of the present invention perform segmentation recognition on the imported food making step description and divide the food making steps according to the segmentation recognition result; when broadcasting, one food making step is broadcast at a time; and when a trigger instruction in a preset manner is received from the user, or when a preset time length is reached, the next food making step is broadcast. Thus, during food making, the user can conveniently obtain the next food making step without browsing or checking the mobile terminal, since the recipe is broadcast according to the user's actual making progress. The user is prevented from touching the mobile terminal during food making and soiling it with the water, oil, or seasonings on his or her hands, or from repeatedly washing both hands to keep the terminal free of grease. The user's convenience of operation is greatly improved, and the food making process becomes more hygienic and convenient.
In addition, the trigger instruction in a preset manner includes at least one of the following: tapping the screen, or a preset voice instruction. Both modes are very easy to operate and can be carried out even when the user's hands are stained with food.
Furthermore, the correspondence between the name of each kind of operation in the food making process and the time length that kind of operation requires may be saved in advance.
After the food making steps are divided, each operation name contained in each food making step is recognized, and the total time length corresponding to each food making step is calculated according to the saved correspondence, the total time length of a food making step being the sum of the time lengths corresponding to all operation names contained in that food making step.
After a food making step is broadcast, the next food making step is broadcast when a trigger instruction in a preset manner is received from the user or when the total time length corresponding to that food making step is reached.
In this way, the pause time of each step can be controlled more reasonably and matched to the user's actual operating time, giving the user a better operating experience.
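A minimal sketch of this duration calculation, assuming a hand-written lookup table; the operation names and times below are invented for illustration:

```python
# Assumed saved correspondence: operation name -> required minutes.
OPERATION_MINUTES = {
    "knead the dough": 10,
    "roll the dough": 5,
    "degas the dough": 5,
}

def step_total_minutes(step_text, table=OPERATION_MINUTES):
    """Total time length of one food making step: the sum of the saved
    times of every known operation name that appears in the step's text."""
    return sum(minutes for name, minutes in table.items() if name in step_text)
```

The result would be used as that step's pause length in place of the fixed preset duration.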
Furthermore, after the food making steps are divided, whether each food making step contains an operation that needs timing may be recognized; if so, the timing length in that food making step is recorded. After that food making step is broadcast, timing is started according to the user's instruction, and the user is reminded when the timing length is reached. The user can thus carry out subsequent operations with confidence, without worrying about losing track of time and without having to set a separate alarm, which further facilitates use.
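Recognizing an operation that needs timing could look like the following sketch; the phrase pattern is an assumption, not the patent's actual recognition method:

```python
import re

def timing_length_minutes(step_text):
    """Return the timing length in minutes of a timed operation such as
    "ferment for 1 hour" or "rest for 20 minutes", or None if the step
    contains no operation that needs timing."""
    match = re.search(r'(\d+)\s*(hour|minute)s?', step_text)
    if match is None:
        return None
    value, unit = int(match.group(1)), match.group(2)
    return value * 60 if unit == "hour" else value
```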
Description of the drawings
Fig. 1 is a flow chart of the mobile terminal human-computer interaction method according to the first embodiment of the present invention;
Fig. 2 is a flow chart of the mobile terminal human-computer interaction method according to the second embodiment of the present invention;
Fig. 3 is a flow chart of the mobile terminal human-computer interaction method according to the third embodiment of the present invention;
Fig. 4 is a structural diagram of the mobile terminal human-computer interaction module according to the fourth embodiment of the present invention;
Fig. 5 is a structural diagram of the mobile terminal human-computer interaction module according to the fifth embodiment of the present invention;
Fig. 6 is a structural schematic diagram of the mobile terminal according to the sixth embodiment of the present invention.
Specific embodiments
To make the objectives, technical solutions, and advantages of the present invention clearer, the embodiments of the present invention are explained in detail below with reference to the accompanying drawings. However, those skilled in the art will understand that many technical details are set forth in each embodiment so that the reader may better understand the present application. Even without these technical details, and with various changes and modifications based on the following embodiments, the technical solutions claimed in the claims of the present application can still be realized.
The first embodiment of the present invention relates to a mobile terminal human-computer interaction method. The specific flow is shown in Fig. 1.
In step 101, the food making step description (recipe) chosen by the user is imported, segmentation recognition is performed, and the food making steps are divided according to the segmentation recognition result. Specifically, text recognition can be performed using the prior art, and segmentation recognition can be carried out according to the punctuation marks in the recognized recipe; alternatively, the steps can be recognized according to step numbers (step 1, step 2, and so on).
In step 102, the food making steps are broadcast according to the user's instruction, one food making step at a time.
In step 103, it is judged whether a trigger instruction in a preset manner is received from the user. If so, the flow proceeds to step 104, in which the next food making step is broadcast; the trigger instruction in a preset manner can be tapping the screen, or a preset voice instruction such as "continue". The flow then proceeds to step 105. If no instruction is received, the flow returns to step 103 and continues waiting.
In step 105, it is judged whether the broadcasting of the food making step description has finished. If so, the flow ends; if not, the flow returns to step 103.
Through this embodiment, during food making the user can conveniently obtain the next food making step without browsing or checking the mobile terminal, since the recipe is broadcast according to the user's actual making progress. The user is prevented from touching the mobile terminal during food making and soiling it with the water, oil, or seasonings on his or her hands, or from repeatedly washing both hands to keep the terminal free of grease. The user's convenience of operation is greatly improved, and the food making process becomes more hygienic and convenient.
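The flow of steps 101–105 can be sketched as a simple loop; `speak` and `wait_for_trigger` stand in for the terminal's text-to-speech output and trigger detection, which the patent does not specify:

```python
def broadcast_recipe(steps, speak, wait_for_trigger):
    """Broadcast one food making step at a time, waiting for the user's
    trigger instruction (screen tap or voice) before the next step."""
    for i, step in enumerate(steps):
        speak(step)
        if i < len(steps) - 1:  # no wait after the final step
            wait_for_trigger()
```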
The second embodiment of the present invention relates to a mobile terminal human-computer interaction method. The second embodiment is roughly the same as the first; the main difference is as follows. In the first embodiment, the next food making step is broadcast only when a trigger instruction in a preset manner is received from the user. In the second embodiment of the present invention, a time (a preset duration, e.g. 5 minutes or 10 minutes) is additionally set in advance: when the preset duration is reached, the next food making step is broadcast even if no trigger instruction in a preset manner has been received from the user. Here the preset duration is a fixed length of time, which can be set by default on the mobile terminal or set by the user as needed.
In step 201, the chosen food making step description (recipe) is imported, segmentation recognition is performed, and the food making steps are divided according to the segmentation recognition result. Specifically, text recognition can be performed using the prior art, and segmentation recognition can be carried out according to the punctuation marks in the recognized recipe, or according to step numbers (step 1, step 2, and so on).
In step 202, the food making steps are broadcast according to the user's instruction, one food making step at a time, and timing is started after the broadcast of each food making step ends.
In step 203, it is judged whether a trigger instruction in a preset manner is received from the user. If so, the flow proceeds to step 205: the next food making step is broadcast, timing is restarted after the broadcast of this food making step ends, and the flow proceeds to step 206. The trigger instruction in a preset manner can be tapping the screen, or a preset voice instruction such as "continue". If no trigger instruction in a preset manner is received, the flow proceeds to step 204.
In step 204, it is judged whether the timing result has reached the preset duration. If so, the flow proceeds to step 205: the next food making step is broadcast, timing is restarted after the broadcast of this food making step ends, and the flow proceeds to step 206. If the preset duration has not been reached, the flow returns to step 203 and continues waiting for a trigger instruction.
In step 206, it is judged whether the broadcasting of the food making step description has finished. If so, the flow ends; if not, the flow returns to step 203.
Through this embodiment, the user can still obtain the content of the next step even when it is completely inconvenient to operate the mobile terminal, or when the user is too busy for other reasons to operate it.
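The "trigger instruction or preset duration, whichever comes first" logic of steps 203–204 can be sketched with a blocking queue; the queue-based delivery of trigger instructions is an assumption:

```python
import queue

def wait_for_advance(triggers, preset_seconds):
    """Wait until either a trigger instruction (e.g. "tap" or "continue")
    arrives, or the preset duration elapses; both outcomes advance the
    broadcast to the next food making step."""
    try:
        return triggers.get(timeout=preset_seconds)
    except queue.Empty:
        return "timeout"
```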
The division of the steps of the various methods above is merely for clarity of description. In implementation, several steps may be merged into one, or one step may be decomposed into multiple steps; as long as the same logical relationship is included, they are all within the protection scope of this patent. Adding insignificant modifications to a flow or algorithm, or introducing insignificant designs, without changing the core design of the algorithm and flow, is also within the protection scope of this patent.
The third embodiment of the present invention relates to a mobile terminal human-computer interaction method. The third embodiment is roughly the same as the second; the main difference is that in the second embodiment the preset duration is a fixed length of time, which can be set by default on the mobile terminal or set by the user as needed, whereas in this embodiment the preset duration is a variable.
Specifically, the correspondence between each kind of operation name in the food making process and the time length the operation requires is saved in advance. Each operation name contained in each food making step of the recipe is recognized, and the total time length corresponding to each food making step is calculated according to the above correspondence, i.e. the sum of the time lengths corresponding to the operation names contained in that food making step. After each food making step is broadcast, the next food making step is broadcast when a trigger instruction in a preset manner is received from the user or when the total time length corresponding to this food making step is reached.
The specific flow is shown in Fig. 3.
In step 301, the chosen food making step description (recipe) is imported, segmentation recognition is performed, the food making steps are divided according to the segmentation recognition result, each operation name contained in each food making step is recognized, and the total time length corresponding to each food making step is calculated according to the correspondence between operation names and required time lengths, i.e. the sum of the time lengths corresponding to the operation names contained in that food making step.
In step 302, the food making steps are broadcast according to the user's instruction, one food making step at a time, and timing is started after the broadcast of each food making step ends. This step can begin after step 301 has been fully executed, or after step 301 has been partly executed; that is, broadcasting need not wait until the total time lengths of all food making steps have been calculated. It is also possible to enter step 302 and start broadcasting as soon as the food making step currently to be broadcast has been divided and its corresponding total time length has been calculated.
In step 303, it is judged whether a trigger instruction in a preset manner is received from the user. If so, the flow proceeds to step 305: the next food making step is broadcast, timing is restarted after the broadcast of this food making step ends, and the flow proceeds to step 306. The trigger instruction in a preset manner can be tapping the screen, or a preset voice instruction such as "continue". If no trigger instruction in a preset manner is received, the flow proceeds to step 304.
In step 304, it is judged whether the timing result has reached the total time length corresponding to this food making step. If so, the flow proceeds to step 305: the next food making step is broadcast, timing is restarted after the broadcast of the food making step ends, and the flow proceeds to step 306. If the total time length corresponding to this food making step has not been reached, the flow returns to step 303 and continues waiting for a trigger instruction.
In step 306, it is judged whether the broadcasting of the food making step description (recipe) has finished. If so, the flow ends; if not, the flow returns to step 303.
For example, the times required for some common operation names can be saved in advance: kneading the dough takes 10 minutes, rolling the dough takes 5 minutes, degassing the dough takes 5 minutes, and so on. After the chosen recipe is imported, segmentation recognition is performed and the food making steps are divided; which operation names each step contains is then recognized. For example, making step 1 contains the rolling and degassing operations; the time lengths corresponding to these operation names are determined — rolling (5 minutes) and degassing (5 minutes) — and a total time length of 10 minutes is calculated, which is the total time length corresponding to making step 1. The total time length corresponding to each step is calculated in turn. After making step 1 is broadcast, the next step is broadcast when a trigger instruction in a preset manner is received from the user or when 10 minutes have elapsed.
In this way, the pause time of each step can be controlled more reasonably and matched to the user's actual operating time, giving the user a better operating experience.
As a further improvement, after the food making steps are divided, whether each food making step contains an operation that needs timing can be recognized; if so, the timing length in that food making step is recorded. After each food making step is broadcast, whether the food making step contains a timing length is determined; if so, the user is prompted, timing is started according to the user's instruction, and the user is reminded when the recorded timing length is reached.
For example, after a bread making step is broadcast, it is recognized that the currently broadcast step contains the operation "let the flour ferment for 1 hour". The user is then prompted whether to start timing; after the user sends a start instruction, timing begins, and when 1 hour is reached the user is reminded that the prescribed fermentation time is up. The user can thus carry out subsequent operations with confidence, without worrying about losing track of time and without having to set a separate alarm, which further facilitates use.
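The fermentation reminder can be sketched with a one-shot timer; `threading.Timer` here stands in for whatever alarm facility the terminal actually provides:

```python
import threading

def start_timing(minutes, remind):
    """After the user's start instruction, run a one-shot timer and call
    `remind` when the recorded timing length is reached."""
    timer = threading.Timer(minutes * 60, remind)
    timer.start()
    return timer
```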
In actual operation, the user's start instruction can likewise be tapping the screen, or a preset voice instruction such as "start".
Through this embodiment, during food making the user can conveniently obtain the next food making step without browsing or checking the mobile terminal, since the recipe is broadcast according to the user's actual making progress. The user's convenience of operation is greatly improved, and the food making process becomes more hygienic and convenient.
The fourth embodiment of the present invention relates to a mobile terminal human-computer interaction module, as shown in Fig. 4, comprising:
a segmentation recognition module, configured to perform segmentation recognition on the imported food making step description (e.g. a recipe) and divide the food making steps according to the segmentation recognition result;
a text broadcasting module, configured to broadcast the food making steps divided by the segmentation recognition module, one food making step at a time; and
a control module, configured to instruct the text broadcasting module to broadcast the next food making step when a trigger instruction in a preset manner is received from the user or when a preset duration is reached.
The segmentation recognition module can perform segmentation recognition according to the punctuation marks in the food making step description, or according to the step numbers in the food making step description.
The trigger instruction in a preset manner received by the control module includes at least one of the following: tapping the screen, or a preset voice instruction.
Through this embodiment, segmentation recognition can be performed on the imported food making step description; when broadcasting, one food making step is broadcast at a time according to the segmentation recognition result; and when a trigger instruction in a preset manner is received from the user, or when the preset duration is reached, the next food making step is broadcast. Thus, during food making, the user can conveniently obtain the next food making step without browsing or checking the mobile terminal, since the recipe is broadcast according to the user's actual making progress. The user is prevented from touching the mobile terminal during food making and soiling it with the water, oil, or seasonings on his or her hands, or from repeatedly washing both hands to keep the terminal free of grease. The user's convenience of operation is greatly improved, and the food making process becomes more hygienic and convenient.
It can be seen that this embodiment is the system embodiment corresponding to the second embodiment, and this embodiment can be implemented in cooperation with the second embodiment. The relevant technical details mentioned in the second embodiment are still valid in this embodiment and, to reduce repetition, are not repeated here. Correspondingly, the relevant technical details mentioned in this embodiment are also applicable to the second embodiment.
It is worth noting that each module involved in this embodiment is a logical module. In practical applications, a logical module can be one physical module, a part of a physical module, or a combination of multiple physical modules. In addition, in order to highlight the innovative part of the present invention, modules less closely related to solving the technical problem proposed by the present invention are not introduced in this embodiment, but this does not mean that no other modules exist in this embodiment.
The fifth embodiment of the present invention relates to a mobile terminal human-computer interaction module. The fifth embodiment is roughly the same as the fourth; the main difference is that the fifth embodiment of the present invention adds the following modules on the basis of the fourth embodiment, as shown in Fig. 5:
a storage module, configured to save the correspondence between the name of each kind of operation in the food making process and the time length that kind of operation requires; and
a duration calculation module, configured to recognize the operation names contained in each food making step of the food making step description and calculate the total time length corresponding to each food making step according to the correspondence saved in the storage module, the total time length corresponding to a food making step being the sum of the time lengths corresponding to all operation names contained in that food making step.
The control module is further configured to broadcast the next food making step after a food making step is broadcast, when a trigger instruction in a preset manner is received from the user or when the total time length corresponding to the currently broadcast food making step is reached.
In this way, the pause time of each step can be controlled more reasonably and matched to the user's actual operating time, giving the user a better operating experience.
As a further improvement, this embodiment can also include the following module:
a timing recognition module, configured to recognize whether each food making step of the food making step description contains an operation that needs timing and, if so, record the timing length in that food making step.
The control module is further configured to determine, after each food making step is broadcast, whether the food making step contains a timing length; if so, the user is prompted, timing is started according to the user's instruction, and the user is reminded when the timing length recorded by the timing recognition module is reached.
In this embodiment, not only can the next food making step be obtained conveniently, greatly improving the user's convenience of operation and making the food making process more hygienic and convenient; the user can also carry out subsequent operations with confidence, without worrying about losing track of time and without having to set a separate alarm, which further facilitates use.
Since the third embodiment corresponds to this embodiment, this embodiment can be implemented in cooperation with the third embodiment. The relevant technical details mentioned in the third embodiment are still valid in this embodiment, and the technical effects achievable in the third embodiment can likewise be achieved in this embodiment; to reduce repetition, they are not repeated here. Correspondingly, the relevant technical details mentioned in this embodiment are also applicable to the third embodiment.
The sixth embodiment of the present invention relates to a mobile terminal, as shown in Fig. 6, comprising:
a controller, configured to perform segmentation recognition on the imported food making step description (e.g. a recipe), divide the food making steps according to the segmentation recognition result, control the loudspeaker to broadcast the divided food making steps, one food making step at a time, and control the loudspeaker to broadcast the next food making step when a trigger instruction in a preset manner (e.g. tapping the screen or a preset voice instruction) is received from the user or when a preset duration is reached;
a loudspeaker, configured to broadcast the food making steps according to the controller's instructions;
a vibration sensor, configured to send a signal to the controller when a screen tap is detected; after receiving the signal from the vibration sensor, the controller controls the loudspeaker to broadcast the next food making step; and
a microphone, configured to record the user's voice and send the voice information to the controller; the controller recognizes the voice information sent by the microphone and, if it matches a preset voice instruction, controls the loudspeaker to broadcast the next food making step.
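Matching the recognized voice information against the preset voice instructions might look like the following sketch; the command vocabulary is an assumption:

```python
def match_voice_instruction(transcript, presets=("continue", "start")):
    """Return the preset voice instruction contained in the recognized
    speech, or None if nothing matches (the controller then ignores it)."""
    text = transcript.strip().lower()
    for preset in presets:
        if preset in text:
            return preset
    return None
```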
The controller may perform sentence-break recognition according to the punctuation marks in the food making description, or according to the step numbers in the food making description.
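A minimal sketch of this sentence-break recognition in Python (the function names and regular expressions are illustrative assumptions, not part of the patent):

```python
import re

def split_by_punctuation(description: str) -> list[str]:
    # Divide on sentence-ending punctuation (Chinese or Western full stops
    # and semicolons), discarding empty fragments.
    parts = re.split(r"[。；.;]+", description)
    return [p.strip() for p in parts if p.strip()]

def split_by_step_numbers(description: str) -> list[str]:
    # Divide on leading step numbers such as "1." / "2)" instead.
    parts = re.split(r"\s*\d+\s*[.、)]\s*", description)
    return [p.strip() for p in parts if p.strip()]

steps = split_by_punctuation("Heat the oil. Add the garlic; stir for one minute.")
# steps == ["Heat the oil", "Add the garlic", "stir for one minute"]
```

Either splitter yields the list of food making steps that the speaker then broadcasts one at a time.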
Through the present embodiment, sentence-break recognition can be performed on the imported food making step description; when broadcasting the food making steps, one food making step is broadcast each time according to the sentence-break recognition result; and upon receiving a triggering instruction in a predetermined manner from the user, or when a preset duration is reached, the next food making step is broadcast. Thus, during the process of making food, the user can conveniently obtain the next food making step in the recipe according to his or her actual progress, without browsing or checking the mobile terminal. This prevents the user from touching the mobile terminal while making food and contaminating it with water, oil, or seasonings on the hands, and spares the user from repeatedly washing both hands to avoid staining the mobile terminal with grease; it greatly improves the convenience of operation for the user and makes the food making process more sanitary and convenient.
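The broadcast-and-advance cycle described above can be sketched as a small driver loop; `broadcast` and `wait_for_trigger` are injected stand-ins for the speaker and the tap/voice detectors (hypothetical names, for illustration only):

```python
from typing import Callable

PRESET_DURATION = 60.0  # seconds; a terminal default, or set by the user

def run_recipe(steps: list[str],
               broadcast: Callable[[str], None],
               wait_for_trigger: Callable[[float], str]) -> list[str]:
    # Broadcast one food making step at a time; advance when the user taps
    # the screen, gives a preset voice instruction, or the preset duration
    # elapses. Returns the cause of each advance for inspection.
    causes = []
    for step in steps:
        broadcast(step)                                   # speaker reads the step
        causes.append(wait_for_trigger(PRESET_DURATION))  # "tap"/"voice"/"timeout"
    return causes
```

With real hardware, `wait_for_trigger` would block on the vibration sensor and the voice recognizer; here it is injected so the control flow can be exercised without either.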
In order to highlight the innovative part of the present invention, modules that are less closely related to solving the technical problem addressed by the present invention are not introduced in the present embodiment, but this does not mean that no other modules exist in the present embodiment.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments can be completed by a program instructing the relevant hardware. The program is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash disk, a portable hard drive, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Those skilled in the art will understand that the above embodiments are specific embodiments for realizing the present invention, and that in practical applications various changes may be made to them in form and detail without departing from the spirit and scope of the present invention.
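The per-step total-duration calculation described in claim 5 (summing the pre-saved time lengths of all operation names recognised in a step) can be sketched as follows; the operation names and time lengths in the table are illustrative assumptions only:

```python
# Correspondence between operation names and required time lengths (seconds).
# These entries are illustrative; a real terminal would pre-save its own table.
OPERATION_DURATIONS = {
    "chop": 120,
    "stir-fry": 180,
    "boil": 600,
}

def step_total_duration(step_text: str) -> int:
    # The total time length for a step is the sum of the time lengths of
    # all operation names recognised in that step.
    return sum(seconds for name, seconds in OPERATION_DURATIONS.items()
               if name in step_text)

duration = step_total_duration("chop the garlic, then stir-fry on high heat")
# duration == 300
```

The control module would then use this per-step total as the step's timeout, broadcasting the next food making step once it elapses (or earlier, on a user trigger).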

Claims (10)

1. A mobile terminal human-computer interaction method, characterized by comprising:
importing a food making step description, and performing sentence-break recognition on the description;
dividing the description into food making steps according to the sentence-break recognition result;
broadcasting the food making steps, one food making step at a time;
upon receiving a triggering instruction in a predetermined manner from the user, or when a preset duration is reached, broadcasting the next food making step.
2. The mobile terminal human-computer interaction method according to claim 1, characterized in that the performing sentence-break recognition on the food making description comprises:
performing sentence-break recognition according to punctuation marks in the food making description, or performing sentence-break recognition according to step numbers in the food making description.
3. The mobile terminal human-computer interaction method according to claim 1, characterized in that the triggering instruction in the predetermined manner comprises at least one of the following:
a screen tap or a preset voice instruction.
4. The mobile terminal human-computer interaction method according to claim 1, characterized in that the preset duration is a fixed time length, set by default on the mobile terminal or set by the user.
5. The mobile terminal human-computer interaction method according to claim 1, characterized by further comprising:
pre-saving the correspondence between the name of each type of operation in the food making process and the time length required by that type of operation;
after the dividing into food making steps, further comprising:
recognizing each operation name included in each food making step, and calculating the total time length corresponding to each food making step according to the correspondence, the total time length corresponding to each food making step being the sum of the time lengths corresponding to all operation names included in that food making step;
after a food making step is broadcast, upon receiving a triggering instruction in a predetermined manner from the user, or when the total time length corresponding to that food making step is reached, broadcasting the next food making step.
6. The mobile terminal human-computer interaction method according to claim 1, characterized in that, after the dividing into food making steps, the method further comprises:
recognizing whether each food making step includes an operation requiring timing, and if so, recording the timing length in that food making step; after the food making step is broadcast, starting a timer according to the user's instruction, and reminding the user when the timing length is reached.
7. A mobile terminal human-computer interaction module, characterized by comprising:
a sentence-break recognition module, configured to perform sentence-break recognition on an imported food making step description and to divide it into food making steps according to the sentence-break recognition result;
a text broadcasting module, configured to broadcast the food making steps divided by the sentence-break recognition module, one food making step at a time;
a control module, configured to, upon receiving a triggering instruction in a predetermined manner from the user or when a preset duration is reached, instruct the text broadcasting module to broadcast the next food making step.
8. The mobile terminal human-computer interaction module according to claim 7, characterized in that the sentence-break recognition module performs sentence-break recognition according to punctuation marks in the food making description, or according to step numbers in the food making description;
the triggering instruction in the predetermined manner received by the control module comprises at least one of the following: a screen tap or a preset voice instruction.
9. The mobile terminal human-computer interaction module according to claim 7, characterized by further comprising:
a storage module, configured to save the correspondence between the name of each type of operation in the food making process and the time length required by that type of operation;
a duration calculation module, configured to recognize the operation names included in each food making step of the food making step description, and to calculate the total time length corresponding to each food making step according to the correspondence saved in the storage module, the total time length corresponding to each food making step being the sum of the time lengths corresponding to all operation names included in that food making step;
the control module being further configured to, after a food making step is broadcast, upon receiving a triggering instruction in a predetermined manner from the user or when the total time length corresponding to the currently broadcast food making step is reached, broadcast the next food making step.
10. The mobile terminal human-computer interaction module according to claim 7, characterized by further comprising:
a timing recognition module, configured to recognize whether a food making step includes an operation requiring timing, and if so, to record the timing length in that food making step;
the control module being further configured to, after the food making step is broadcast, start a timer according to the user's instruction and remind the user when the timing length recorded by the timing recognition module is reached.
CN201610864721.6A 2016-09-29 2016-09-29 Mobile terminal human-computer interaction method and human-computer interaction module Pending CN106648522A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610864721.6A CN106648522A (en) 2016-09-29 2016-09-29 Mobile terminal human-computer interaction method and human-computer interaction module


Publications (1)

Publication Number Publication Date
CN106648522A true CN106648522A (en) 2017-05-10

Family

ID=58854154

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610864721.6A Pending CN106648522A (en) 2016-09-29 2016-09-29 Mobile terminal human-computer interaction method and human-computer interaction module

Country Status (1)

Country Link
CN (1) CN106648522A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112034721A (en) * 2020-08-07 2020-12-04 苏宁智能终端有限公司 Intelligent cooking guidance control method, device, equipment, terminal and system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080068356A1 (en) * 2006-09-20 2008-03-20 Aleksandrs Zavoronkovs Method and apparatus for delivering and displaying multimedia content to food service customers
CN101694772A (en) * 2009-10-21 2010-04-14 北京中星微电子有限公司 Method for converting text into rap music and device thereof
CN104751379A (en) * 2015-02-15 2015-07-01 张梅云 Manufacture method of digital menu
CN105223856A (en) * 2015-10-30 2016-01-06 上海斐讯数据通信技术有限公司 A kind of culinary art backup system based on speech recognition and method
CN105808660A (en) * 2016-03-01 2016-07-27 深圳前海勇艺达机器人有限公司 Robot menu system based on speech recognition




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20170510
