CN109343897A - Awakening method and device for robot - Google Patents

Awakening method and device for robot

Info

Publication number
CN109343897A
CN109343897A (application CN201810904151.8A)
Authority
CN
China
Prior art keywords
robot
user
executes
event
operation case
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810904151.8A
Other languages
Chinese (zh)
Inventor
李全印
支涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Yunji Technology Co Ltd
Original Assignee
Beijing Yunji Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Yunji Technology Co Ltd filed Critical Beijing Yunji Technology Co Ltd
Priority to CN201810904151.8A
Publication of CN109343897A
Legal status: Pending (current)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00: Arrangements for program control, e.g. control units
    • G06F9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44: Arrangements for executing specific programs
    • G06F9/4401: Bootstrapping
    • G06F9/4418: Suspend and resume; Hibernate and awake
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00: Controls for manipulators
    • B25J13/003: Controls for manipulators by means of an audio-responsive input
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00: Controls for manipulators
    • B25J13/08: Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J13/081: Touching devices, e.g. pressure-sensitive
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Computer Security & Cryptography (AREA)
  • Electric Clocks (AREA)

Abstract

This application discloses a wake-up method and device for a robot. The wake-up method includes: after the robot enters dormancy, monitoring a user operation event on the robot; judging whether the user operation event is performed on a display module of the robot; and, if it is determined that the user operation event is performed on the display module of the robot, waking the robot according to the user operation event, wherein the display module is at least configured to show a preset standby interface of the robot. The present application solves the technical problem that existing robot wake-up technology cannot achieve wake-up by touching the screen.

Description

Awakening method and device for robot
Technical field
The present application relates to the field of robot wake-up technology, and in particular to a wake-up method and device for a robot.
Background
With the development of artificial intelligence technology, robots have brought many conveniences to human production and daily life. When controlling a robot, a user can preset a wake word for it; when the robot hears that specific wake word (for example, the robot's name), it knows that the user is calling it. For instance, a user may set the wake word Alexa or Mike for a robot, and when the user calls "Alexa" or "Mike", the robot knows that the user is addressing it.
The inventors found that the related art has at least the following problems: at a given moment the user may not remember the robot's name; the user may own several robots and be unable to remember the name of each one; or the robots may look so alike that the user cannot tell them apart. In these situations the user cannot wake the robot, or wakes the wrong robot, and the user's actual need cannot be fulfilled.
In the related art, robot wake-up technology cannot achieve wake-up by touching the screen, and no effective solution to this problem has yet been proposed.
Summary of the invention
The main purpose of the present application is to provide a wake-up method and device for a robot, so as to solve the problem that robot wake-up technology cannot achieve wake-up by touching the screen.
To achieve the above object, according to one aspect of the present application, a wake-up method for a robot is provided.
The wake-up method for a robot according to the present application includes:
after the robot enters dormancy, monitoring a user operation event on the robot;
judging whether the user operation event is performed on a display module of the robot; and
if it is determined that the user operation event is performed on the display module of the robot, waking the robot according to the user operation event,
wherein the display module is at least configured to show a preset standby interface of the robot.
Further, monitoring the user operation event on the robot after the robot enters dormancy includes:
the robot entering a first dormant state;
judging, according to the holding time of the first dormant state, whether the robot enters a second dormant state; and
if it is judged from the holding time of the first dormant state that the robot enters the second dormant state, monitoring a click operation event of the user in the second dormant state.
Further, judging whether the user operation event is performed on the display module of the robot includes:
judging whether a user click operation event is performed on the screen of the robot;
and waking the robot according to the user operation event, if it is determined that the user operation event is performed on the display module of the robot, includes:
if it is determined that the user click operation event is performed on the screen of the robot, exiting the sleep state and waking the robot according to the user click operation event.
Further, judging whether the user operation event is performed on the display module of the robot includes:
judging whether a user slide operation event is performed on the screen of the robot;
and waking the robot according to the user operation event, if it is determined that the user operation event is performed on the display module of the robot, includes:
if it is determined that the user slide operation event is performed on the screen of the robot, exiting the sleep state and waking the robot according to the user slide operation event.
Further, judging whether the user operation event is performed on the display module of the robot includes:
judging whether a user touch event is performed on the display screen of the robot;
and waking the robot according to the user operation event, if it is determined that the user operation event is performed on the display module of the robot, includes:
if it is determined that the user touch event is performed on the display screen of the robot, exiting the sleep state and waking the sound acquisition module of the robot according to the user touch operation event.
To achieve the above object, according to another aspect of the present application, a wake-up device for a robot is provided.
The wake-up device for a robot according to the present application includes:
a monitoring module, configured to monitor a user operation event on the robot after the robot enters dormancy;
a judgment module, configured to judge whether the user operation event is performed on a display module of the robot; and
a wake-up module, configured to wake the robot according to the user operation event if it is determined that the user operation event is performed on the display module of the robot,
wherein the display module is at least configured to show a preset standby interface of the robot.
Further, the monitoring module includes:
a first dormant state unit, configured for the robot to enter a first dormant state;
a second dormant state unit, configured to judge, according to the holding time of the first dormant state, whether the robot enters a second dormant state; and
a monitoring unit, configured to monitor a click operation event of the user in the second dormant state if it is judged from the holding time of the first dormant state that the robot enters the second dormant state.
Further, the judgment module includes:
a first judging unit, configured to judge whether a user click operation event is performed on the screen of the robot;
and the wake-up module includes:
a first wakeup unit, configured to exit the sleep state and wake the robot according to the user click operation event if it is determined that the user click operation event is performed on the screen of the robot.
Further, the judgment module includes:
a second judging unit, configured to judge whether a user slide operation event is performed on the screen of the robot;
and the wake-up module includes:
a second wakeup unit, configured to exit the sleep state and wake the robot according to the user slide operation event if it is determined that the user slide operation event is performed on the screen of the robot.
Further, the judgment module includes:
a third judging unit, configured to judge whether a user touch event is performed on the display screen of the robot;
and the wake-up module includes:
a third wakeup unit, configured to exit the sleep state and wake the sound acquisition module of the robot according to the user touch operation event if it is determined that the user touch event is performed on the display screen of the robot.
In the embodiments of the present application, a user operation event on the robot is monitored, and by judging whether the user operation event is performed on the display module of the robot, the purpose of waking the robot according to that event is achieved. This realizes the technical effect of waking the robot and thus solves the technical problem that robot wake-up technology cannot achieve wake-up by touching the screen.
Brief description of the drawings
The accompanying drawings, which form a part of this application, are provided for a further understanding of the application, so that its other features, objects, and advantages become more apparent. The drawings of the illustrative embodiments of the application and their descriptions are used to explain the application and do not constitute an improper limitation of it. In the drawings:
Fig. 1 is a schematic diagram of a wake-up method for a robot according to a first embodiment of the present application;
Fig. 2 is a schematic diagram of a wake-up method for a robot according to a second embodiment of the present application;
Fig. 3 is a schematic diagram of a wake-up method for a robot according to a third embodiment of the present application;
Fig. 4 is a schematic diagram of a wake-up method for a robot according to a fourth embodiment of the present application;
Fig. 5 is a schematic diagram of a wake-up method for a robot according to a fifth embodiment of the present application;
Fig. 6 is a schematic diagram of a wake-up device for a robot according to a first embodiment of the present application;
Fig. 7 is a schematic diagram of a wake-up device for a robot according to a second embodiment of the present application;
Fig. 8 is a schematic diagram of a wake-up device for a robot according to a third embodiment of the present application;
Fig. 9 is a schematic diagram of a wake-up device for a robot according to a fourth embodiment of the present application; and
Fig. 10 is a schematic diagram of a wake-up device for a robot according to a fifth embodiment of the present application.
Specific embodiments
In order to enable those skilled in the art to better understand the solution of the present application, the technical solutions in the embodiments of the application will be described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the application, not all of them. Based on the embodiments in the present application, all other embodiments obtained by a person of ordinary skill in the art without creative effort shall fall within the scope of protection of the application.
It should be noted that the terms "first", "second", and the like in the description, claims, and drawings of the present application are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that data so used are interchangeable where appropriate, so that the embodiments of the application described herein can be implemented. In addition, the terms "include" and "have" and any variations thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device comprising a series of steps or units is not necessarily limited to the steps or units explicitly listed, but may include other steps or units that are not explicitly listed or that are inherent to that process, method, product, or device.
In the present application, terms such as "upper", "lower", "left", "right", "front", "rear", "top", "bottom", "inside", "outside", "middle", "vertical", "horizontal", "transverse", and "longitudinal" indicate orientations or positional relationships based on the drawings. These terms are used mainly to better describe the application and its embodiments and are not intended to require that the indicated device, element, or component have a particular orientation or be constructed and operated in a particular orientation.
Moreover, some of the above terms may be used to express meanings other than orientation or positional relationship; for example, the term "upper" may in some cases also indicate a certain attachment or connection relationship. For a person of ordinary skill in the art, the specific meaning of these terms in this application can be understood according to the specific situation.
In addition, the terms "installed", "provided", "equipped with", "connected", "coupled", and "socketed" should be understood in a broad sense. For example, a connection may be a fixed connection, a detachable connection, or a monolithic construction; it may be a mechanical connection or an electrical connection; it may be a direct connection, an indirect connection through an intermediary, or an internal connection between two devices, elements, or components. For a person of ordinary skill in the art, the specific meaning of the above terms in this application can be understood according to the specific situation.
It should be noted that, as long as there is no conflict, the embodiments of the present application and the features in the embodiments may be combined with each other. The application is described in detail below with reference to the accompanying drawings and in conjunction with the embodiments.
As shown in Fig. 1, the wake-up method for a robot includes steps S102 to S106. Step S102: after the robot enters dormancy, a user operation event on the robot is monitored. Step S104: it is judged whether the user operation event is performed on a display module of the robot. Step S106: if it is determined that the user operation event is performed on the display module of the robot, the robot is woken according to the user operation event, wherein the display module is at least configured to show a preset standby interface of the robot. As can be seen from the above description, the application achieves the following technical effect: by monitoring a user operation event on the robot and judging whether that event is performed on the display module of the robot, the purpose of waking the robot according to the user operation event is achieved. This realizes the technical effect of waking the robot and thus solves the technical problem that robot wake-up technology cannot achieve wake-up by touching the screen.
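For illustration only, a minimal Python sketch of this three-step flow is given below; the class and attribute names (`UserOperationEvent`, `Robot.handle_event`, the `"display_module"` tag) are assumptions made for the sketch and do not come from the patent:

```python
from dataclasses import dataclass


@dataclass
class UserOperationEvent:
    kind: str      # "click", "slide", or "touch"
    target: str    # which part of the robot received the event


class Robot:
    def __init__(self):
        self.asleep = False

    def enter_dormancy(self):
        # Step S102: the robot sleeps and starts monitoring user operation events.
        self.asleep = True

    def handle_event(self, event: UserOperationEvent):
        # Step S104: only events performed on the display module qualify.
        if self.asleep and event.target == "display_module":
            self.wake(event)

    def wake(self, event: UserOperationEvent):
        # Step S106: exit the sleep state according to the user operation event.
        self.asleep = False
        print(f"woken by a {event.kind} on the standby interface")


robot = Robot()
robot.enter_dormancy()
robot.handle_event(UserOperationEvent(kind="touch", target="display_module"))
```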
According to an embodiment of the present application, a wake-up method for a robot is provided. As shown in Fig. 2, monitoring the user operation event on the robot after the robot enters dormancy includes steps S202 to S206:
Step S202: the robot enters a first dormant state.
The robot entering the first dormant state may mean that the light sleep state the robot is in when it has just entered dormancy is taken as the first dormant state.
Step S204: it is judged, according to the holding time of the first dormant state, whether the robot enters a second dormant state.
A typical robot, like a computer or a smartphone, usually has a fixed time set in its program, for example a first dormancy time of 3 minutes; when the first dormant state has lasted 3 minutes, the robot enters the second dormant state.
The second dormant state may be a deep sleep state, in which the robot shows the standby interface.
Step S206: if it is judged from the holding time of the first dormant state that the robot enters the second dormant state, a click operation event of the user in the second dormant state is monitored.
If the holding time of the robot's first dormant state reaches the dormancy time set by the system, the robot enters the deep sleep state, and at that point begins listening for the user's click operation events.
A click operation event may be the user clicking on the robot's screen, the user sliding on the robot's screen, or the user touching the robot's screen.
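As a rough illustration of this two-stage dormancy, the sketch below uses the 3-minute threshold mentioned above only as an example value; the state names, the `tick()` polling interface, and the helper methods are assumptions made for the sketch, not part of the patent:

```python
import time
from enum import Enum, auto


class SleepState(Enum):
    AWAKE = auto()
    LIGHT_SLEEP = auto()   # first dormant state
    DEEP_SLEEP = auto()    # second dormant state: the standby interface is shown


FIRST_DORMANCY_SECONDS = 3 * 60  # example threshold; 3 minutes is the value used in the description


class DormancyController:
    def __init__(self):
        self.state = SleepState.AWAKE
        self.light_sleep_since = None

    def enter_light_sleep(self):
        # Step S202: the robot enters the first dormant state.
        self.state = SleepState.LIGHT_SLEEP
        self.light_sleep_since = time.monotonic()

    def tick(self):
        # Step S204: check how long the first dormant state has been held.
        if (self.state is SleepState.LIGHT_SLEEP
                and time.monotonic() - self.light_sleep_since >= FIRST_DORMANCY_SECONDS):
            # Step S206: enter the second dormant state and listen for click events.
            self.state = SleepState.DEEP_SLEEP
            self.show_standby_interface()
            self.listen_for_click_events()

    def show_standby_interface(self):
        print("standby interface shown")

    def listen_for_click_events(self):
        print("listening for click, slide, and touch events on the screen")
```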
According to an embodiment of the present application, a wake-up method for a robot is provided. As shown in Fig. 3, judging whether the user operation event is performed on the display module of the robot includes step 302:
Step 302: it is judged whether a user click operation event is performed on the screen of the robot.
Judging whether the user click operation event is performed on the screen of the robot may be judging whether a user click event occurs on the screen of the robot.
Waking the robot according to the user operation event, if it is determined that the user operation event is performed on the display module of the robot, includes step 304:
Step 304: if it is determined that the user click operation event is performed on the screen of the robot, the sleep state is exited and the robot is woken according to the user click operation event.
If it is determined that a user click event, a user slide operation, or a user touch operation occurs on the screen of the robot, the robot exits the sleep state and wakes its voice interaction module.
According to an embodiment of the present application, a wake-up method for a robot is provided. As shown in Fig. 4, judging whether the user operation event is performed on the display module of the robot includes step 402:
Step 402: it is judged whether a user slide operation event is performed on the screen of the robot.
Judging whether the user slide operation event is performed on the screen of the robot may be judging whether the user's sliding action acts on the screen of the robot.
Waking the robot according to the user operation event, if it is determined that the user operation event is performed on the display module of the robot, includes step 404:
Step 404: if it is determined that the user slide operation event is performed on the screen of the robot, the sleep state is exited and the robot is woken according to the user slide operation event.
If it is determined that the user's sliding action acts on the screen of the robot, the robot exits the sleep state and wakes its voice interaction module.
According to an embodiment of the present application, a wake-up method for a robot is provided. As shown in Fig. 5, judging whether the user operation event is performed on the display module of the robot includes step 502:
Step 502: it is judged whether a user touch event is performed on the display screen of the robot.
Judging whether the user touch event is performed on the display screen of the robot may be judging whether the touch of the user's hand acts on the display screen of the robot.
Waking the robot according to the user operation event, if it is determined that the user operation event is performed on the display module of the robot, includes step 504:
Step 504: if it is determined that the user touch event is performed on the display screen of the robot, the sleep state is exited and the sound acquisition module of the robot is woken according to the user touch operation event.
If it is determined that the touch of the user's hand acts on the display screen of the robot, the robot exits the sleep state and wakes its voice interaction module.
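The three embodiments above differ only in the kind of screen event that triggers the wake-up (click, slide, or touch) and in which module is woken. One possible way to dispatch them in a single handler is sketched below; the class names (`VoiceInteractionModule`, `SoundAcquisitionModule`, `SleepingRobot`) are stand-ins invented for this sketch and are not defined by the patent:

```python
from dataclasses import dataclass, field


class VoiceInteractionModule:
    def activate(self):
        print("voice interaction module active")


class SoundAcquisitionModule:
    def activate(self):
        print("sound acquisition module active")


@dataclass
class ScreenEvent:
    kind: str        # "click", "slide", or "touch"
    on_screen: bool  # whether the event landed on the robot's display screen


@dataclass
class SleepingRobot:
    asleep: bool = True
    voice_interaction: VoiceInteractionModule = field(default_factory=VoiceInteractionModule)
    sound_acquisition: SoundAcquisitionModule = field(default_factory=SoundAcquisitionModule)


def on_screen_event(robot: SleepingRobot, event: ScreenEvent) -> None:
    """Wake the robot when a click, slide, or touch lands on its screen."""
    if not robot.asleep or not event.on_screen:
        return
    if event.kind in {"click", "slide", "touch"}:
        robot.asleep = False                    # exit the sleep state
        robot.voice_interaction.activate()      # Figs. 3 and 4: wake the voice interaction module
        if event.kind == "touch":
            robot.sound_acquisition.activate()  # Fig. 5 / step 504: also wake the sound acquisition module


on_screen_event(SleepingRobot(), ScreenEvent(kind="touch", on_screen=True))
```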
It should be noted that the steps shown in the flowcharts of the drawings can be executed in a computer system, for example as a set of computer-executable instructions, and that, although a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed in an order different from the one given here.
According to an embodiment of the present application, a wake-up device for implementing the above wake-up method for a robot is also provided. As shown in Fig. 6, the device includes:
a monitoring module 10, configured to monitor a user operation event on the robot after the robot enters dormancy;
In this embodiment, the monitoring module 10 monitoring the user operation event on the robot after the robot enters dormancy may mean that the robot monitors user click operation events, user slide operations, user touch operations, and the like.
a judgment module 20, configured to judge whether the user operation event is performed on the display module of the robot; and
In this embodiment, the judgment module 20 judging whether the user operation event is performed on the display module of the robot may mean judging whether actions such as a user click operation event, a user slide operation, or a user touch operation act on the display screen of the robot.
a wake-up module 30, configured to wake the robot according to the user operation event if it is determined that the user operation event is performed on the display module of the robot,
wherein the display module is at least configured to show a preset standby interface of the robot.
In this embodiment, the wake-up module 30 waking the robot according to the user operation event may mean that, if actions such as a user click operation event, a user slide operation, or a user touch operation act on the display screen of the robot, the robot exits the sleep state and wakes its voice interaction module.
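The division into a monitoring module, a judgment module, and a wake-up module could be mirrored in code as three small collaborating classes. The sketch below only illustrates that structure; every class and method name is invented, and the event and robot objects are assumed to carry `on_screen`, `kind`, and `asleep` attributes as in the earlier sketches:

```python
class MonitoringModule:
    """Module 10: collects user operation events while the robot is dormant."""

    def __init__(self):
        self.pending = []

    def report(self, event):
        self.pending.append(event)


class JudgmentModule:
    """Module 20: decides whether an event was performed on the display module."""

    @staticmethod
    def is_on_display(event) -> bool:
        return getattr(event, "on_screen", False)


class WakeupModule:
    """Module 30: exits the sleep state and wakes the robot for qualifying events."""

    def wake(self, robot, event):
        robot.asleep = False
        print(f"robot woken by a {event.kind} on its standby interface")


class WakeupDevice:
    """Wires the three modules together, mirroring the structure of Fig. 6."""

    def __init__(self):
        self.monitor = MonitoringModule()
        self.judge = JudgmentModule()
        self.waker = WakeupModule()

    def process(self, robot):
        for event in self.monitor.pending:
            if self.judge.is_on_display(event):
                self.waker.wake(robot, event)
        self.monitor.pending.clear()
```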
According to an embodiment of the present application, as shown in Fig. 7, the monitoring module 10 includes:
a first dormant state unit 101, configured for the robot to enter the first dormant state;
In the first dormant state unit 101 of this embodiment, the robot entering the first dormant state may mean that the light sleep state the robot is in when it has just entered dormancy is the first dormant state.
a second dormant state unit 102, configured to judge, according to the holding time of the first dormant state, whether the robot enters the second dormant state; and
In the second dormant state unit 102 of this embodiment, a typical robot, like a computer or a smartphone, usually has a fixed time set in its program, for example a first dormancy time of 3 minutes; when the first dormant state has lasted 3 minutes, the robot enters the second dormant state.
The second dormant state may be a deep sleep state, in which the robot shows the standby interface.
a monitoring unit 103, configured to monitor a click operation event of the user in the second dormant state if it is judged from the holding time of the first dormant state that the robot enters the second dormant state.
In the monitoring unit 103 of this embodiment, if the holding time of the robot's first dormant state reaches the dormancy time set by the system, the robot enters the deep sleep state and at that point begins listening for the user's click operation events.
A click operation event may be the user clicking on the robot's screen, the user sliding on the robot's screen, or the user touching the robot's screen.
According to an embodiment of the present application, as shown in Fig. 8, the judgment module 20 includes:
a first judging unit 201, configured to judge whether a user click operation event is performed on the screen of the robot;
In the first judging unit 201 of this embodiment, judging whether the user click operation event is performed on the screen of the robot may be judging whether a user click event occurs on the screen of the robot.
The wake-up module 30 includes:
a first wakeup unit 301, configured to exit the sleep state and wake the robot according to the user click operation event if it is determined that the user click operation event is performed on the screen of the robot.
In the first wakeup unit 301 of this embodiment, if it is determined that a user click event, a user slide operation, or a user touch operation occurs on the screen of the robot, the robot exits the sleep state and wakes its voice interaction module.
According to an embodiment of the present application, as shown in Fig. 9, the judgment module 20 includes:
a second judging unit 202, configured to judge whether a user slide operation event is performed on the screen of the robot;
In the second judging unit 202 of this embodiment, judging whether the user slide operation event is performed on the screen of the robot may be judging whether the user's sliding action acts on the screen of the robot.
The wake-up module 30 includes:
a second wakeup unit 302, configured to exit the sleep state and wake the robot according to the user slide operation event if it is determined that the user slide operation event is performed on the screen of the robot.
In the second wakeup unit 302 of this embodiment, if it is determined that the user's sliding action acts on the screen of the robot, the robot exits the sleep state and wakes its voice interaction module.
According to an embodiment of the present application, as shown in Fig. 10, the judgment module 20 includes:
a third judging unit 203, configured to judge whether a user touch event is performed on the display screen of the robot;
In the third judging unit 203 of this embodiment, judging whether the user touch event is performed on the display screen of the robot may be judging whether the touch of the user's hand acts on the display screen of the robot.
The wake-up module 30 includes:
a third wakeup unit 303, configured to exit the sleep state and wake the sound acquisition module of the robot according to the user touch operation event if it is determined that the user touch event is performed on the display screen of the robot.
In the third wakeup unit 303 of this embodiment, if it is determined that the touch of the user's hand acts on the display screen of the robot, the robot exits the sleep state and wakes its voice interaction module.
Obviously, those skilled in the art should understand that the modules or steps of the present application described above can be implemented with a general-purpose computing device. They can be concentrated on a single computing device or distributed over a network formed by multiple computing devices; optionally, they can be implemented with program code executable by a computing device, so that they can be stored in a storage device and executed by the computing device, or they can be made into individual integrated circuit modules, or multiple modules or steps among them can be made into a single integrated circuit module. In this way, the application is not limited to any specific combination of hardware and software.
The above are only preferred embodiments of the present application and are not intended to limit it; various modifications and changes may be made to the application by those skilled in the art. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of this application shall be included within the scope of protection of this application.

Claims (10)

1. A wake-up method for a robot, characterized by comprising:
after the robot enters dormancy, monitoring a user operation event on the robot;
judging whether the user operation event is performed on a display module of the robot; and
if it is determined that the user operation event is performed on the display module of the robot, waking the robot according to the user operation event,
wherein the display module is at least configured to show a preset standby interface of the robot.
2. The wake-up method according to claim 1, characterized in that monitoring the user operation event on the robot after the robot enters dormancy comprises:
the robot entering a first dormant state;
judging, according to the holding time of the first dormant state, whether the robot enters a second dormant state; and
if it is judged from the holding time of the first dormant state that the robot enters the second dormant state, monitoring a click operation event of the user in the second dormant state.
3. The wake-up method according to claim 1, characterized in that
judging whether the user operation event is performed on the display module of the robot comprises:
judging whether a user click operation event is performed on the screen of the robot; and
waking the robot according to the user operation event, if it is determined that the user operation event is performed on the display module of the robot, comprises:
if it is determined that the user click operation event is performed on the screen of the robot, exiting the sleep state and waking the robot according to the user click operation event.
4. The wake-up method according to claim 1, characterized in that
judging whether the user operation event is performed on the display module of the robot comprises:
judging whether a user slide operation event is performed on the screen of the robot; and
waking the robot according to the user operation event, if it is determined that the user operation event is performed on the display module of the robot, comprises:
if it is determined that the user slide operation event is performed on the screen of the robot, exiting the sleep state and waking the robot according to the user slide operation event.
5. The wake-up method according to claim 1, characterized in that
judging whether the user operation event is performed on the display module of the robot comprises:
judging whether a user touch event is performed on the display screen of the robot; and
waking the robot according to the user operation event, if it is determined that the user operation event is performed on the display module of the robot, comprises:
if it is determined that the user touch event is performed on the display screen of the robot, exiting the sleep state and waking the sound acquisition module of the robot according to the user touch operation event.
6. A wake-up device for a robot, characterized by comprising:
a monitoring module, configured to monitor a user operation event on the robot after the robot enters dormancy;
a judgment module, configured to judge whether the user operation event is performed on a display module of the robot; and
a wake-up module, configured to wake the robot according to the user operation event if it is determined that the user operation event is performed on the display module of the robot,
wherein the display module is at least configured to show a preset standby interface of the robot.
7. The wake-up device according to claim 6, characterized in that the monitoring module comprises:
a first dormant state unit, configured for the robot to enter a first dormant state;
a second dormant state unit, configured to judge, according to the holding time of the first dormant state, whether the robot enters a second dormant state; and
a monitoring unit, configured to monitor a click operation event of the user in the second dormant state if it is judged from the holding time of the first dormant state that the robot enters the second dormant state.
8. The wake-up device according to claim 6, characterized in that
the judgment module comprises:
a first judging unit, configured to judge whether a user click operation event is performed on the screen of the robot; and
the wake-up module comprises:
a first wakeup unit, configured to exit the sleep state and wake the robot according to the user click operation event if it is determined that the user click operation event is performed on the screen of the robot.
9. The wake-up device according to claim 6, characterized in that
the judgment module comprises:
a second judging unit, configured to judge whether a user slide operation event is performed on the screen of the robot; and
the wake-up module comprises:
a second wakeup unit, configured to exit the sleep state and wake the robot according to the user slide operation event if it is determined that the user slide operation event is performed on the screen of the robot.
10. The wake-up device according to claim 6, characterized in that
the judgment module comprises:
a third judging unit, configured to judge whether a user touch event is performed on the display screen of the robot; and
the wake-up module comprises:
a third wakeup unit, configured to exit the sleep state and wake the sound acquisition module of the robot according to the user touch operation event if it is determined that the user touch event is performed on the display screen of the robot.
CN201810904151.8A 2018-08-09 2018-08-09 Awakening method and device for robot Pending CN109343897A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810904151.8A CN109343897A (en) 2018-08-09 2018-08-09 Awakening method and device for robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810904151.8A CN109343897A (en) 2018-08-09 2018-08-09 Awakening method and device for robot

Publications (1)

Publication Number Publication Date
CN109343897A true CN109343897A (en) 2019-02-15

Family

ID=65296940

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810904151.8A Pending CN109343897A (en) 2018-08-09 2018-08-09 Awakening method and device for robot

Country Status (1)

Country Link
CN (1) CN109343897A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103677604A (en) * 2012-09-17 2014-03-26 联想(北京)有限公司 Information processing method and device
CN105511608A (en) * 2015-11-30 2016-04-20 北京光年无限科技有限公司 Intelligent robot based interaction method and device, and intelligent robot
CN105881548A (en) * 2016-04-29 2016-08-24 北京快乐智慧科技有限责任公司 Method for waking up intelligent interactive robot and intelligent interactive robot
CN108098767A (en) * 2016-11-25 2018-06-01 北京智能管家科技有限公司 A kind of robot awakening method and device
CN106584458A (en) * 2016-11-28 2017-04-26 南京熊猫电子股份有限公司 Energy conservation method for robot and energy conservation robot
CN106737750A (en) * 2017-01-13 2017-05-31 合肥优智领英智能科技有限公司 A kind of man-machine interactive intelligent robot
CN106863319A (en) * 2017-01-17 2017-06-20 北京光年无限科技有限公司 A kind of robot awakening method and device
CN107422960A (en) * 2017-07-19 2017-12-01 上海青橙实业有限公司 Operating method and mobile terminal

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112306358A (en) * 2019-08-23 2021-02-02 北京字节跳动网络技术有限公司 Voice interaction device, interactive voice device control method, device and storage medium
CN114167823A (en) * 2021-11-11 2022-03-11 合肥欣奕华智能机器有限公司 Robot control method, device, electronic equipment and storage medium
CN114167823B (en) * 2021-11-11 2023-11-14 合肥欣奕华智能机器股份有限公司 Robot control method, apparatus, electronic device, and storage medium
CN114265542A (en) * 2021-12-14 2022-04-01 美的集团(上海)有限公司 Robot voice interaction method and device and electronic equipment

Legal Events

Code: Description
PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication (application publication date: 20190215)