Detailed Description of Embodiments
To make the objectives, technical solutions, and advantages of the present application clearer, embodiments of the present application are further described below with reference to the accompanying drawings. It should be understood that the specific embodiments described herein are intended only to explain the present application, not to limit it.
A first embodiment of the present application relates to a human-computer interaction method applied to a robot. The specific flow is shown in Figure 1.
It should be noted that the robot described in this embodiment is an automatically controlled machine in the common sense of the term, including all machinery that simulates human behavior or thought, or that simulates other living beings (such as a robot dog, Doraemon, etc.).
In step 101, biometric feature information of at least one recognized object is extracted.
Specifically, in this embodiment, the extraction of the biometric feature information of the at least one recognized object may be triggered when at least one object is detected approaching within a preset range (for example, 5 meters) centered on the robot's current position. This detection mode allows the robot to perceive objects within a 360-degree range around its current position.
It should also be noted that, in this embodiment, the robot may determine that an object has been recognized by means of a proximity sensor installed on the robot. For example, when the robot is placed in a public place and started, the proximity sensor perceives whether any object approaches within a 5-meter radius centered on the robot. Upon perceiving movement information or presence information of an object, the sensor converts the perceived information into an electrical signal, and the robot's processor then controls the robot's biometric acquisition device to extract the biometric feature information of the at least one recognized object.
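The proximity-triggered flow above can be sketched as follows. This is a minimal illustration only, not the actual implementation; the sensor is simulated by a list of detected positions, and the 5-meter radius follows the example in the text:

```python
import math

PRESET_RANGE_M = 5.0  # preset detection radius centered on the robot

def object_in_range(robot_pos, object_pos, radius=PRESET_RANGE_M):
    """Return True when an object lies within the preset circular range."""
    dx = object_pos[0] - robot_pos[0]
    dy = object_pos[1] - robot_pos[1]
    return math.hypot(dx, dy) <= radius

def on_sensor_reading(robot_pos, detected_positions):
    """Keep only the detections that should trigger biometric extraction."""
    return [pos for pos in detected_positions if object_in_range(robot_pos, pos)]

# A reading with one object inside the 5 m circle and one outside it.
triggered = on_sensor_reading((0.0, 0.0), [(3.0, 4.0), (6.0, 1.0)])
```

A real proximity sensor would of course deliver events rather than position lists; the point here is only the range check that gates the extraction step.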
To facilitate understanding of how the biometric feature information is extracted, several specific extraction modes are enumerated below:
Mode 1: the robot is controlled to perform image acquisition, and the biometric features of the at least one object are extracted from the acquired image to obtain the biometric feature information of the at least one object.
Mode 2: the robot is controlled to perform voice acquisition, and the biometric features of the at least one object are extracted from the acquired voice to obtain the biometric feature information of the at least one object.
Mode 3: the robot is controlled to perform both image acquisition and voice acquisition; the biometric features of the at least one object are extracted from the acquired image, and the biometric features of the at least one object are simultaneously extracted from the acquired voice, together yielding the biometric feature information of the at least one object.
In addition, when Mode 3 is used to extract biometric feature information, the object biometric feature information obtained from the image and the object biometric feature information obtained from the voice may be further analyzed to determine which items belong to the same object. In the subsequent operation of determining the target interactive object, the image-derived and voice-derived biometric feature information of the same object can then be analyzed jointly, improving the accuracy with which the target interactive object is determined.
It should be noted that, in this embodiment, the extracted biometric feature information specifically includes physiological feature information and/or behavioral feature information.
The physiological feature information may specifically be any one or any combination of the facial information, eye information, voiceprint information (information used to analyze whose voice a sound comes from), and similar information of the recognized object. The behavioral feature information may specifically be any one or any combination of the displacement information of the recognized object, the voice content information of its speech (information used to identify what is being said), and similar information.
For example, when the biometric features of the at least one object are extracted from the acquired image, physiological feature information such as the object's facial information and/or eye information, and behavioral feature information such as displacement information, can typically be extracted.
Likewise, when the biometric features of the at least one object are extracted from the acquired voice, physiological feature information such as the object's voiceprint information, and behavioral feature information such as voice content information, can typically be extracted.
In addition, the above-mentioned image acquisition by the robot may specifically use the robot's own image acquisition device, such as a camera; it may instead obtain images from an external image acquisition device communicatively connected to the robot, such as a monitoring device installed in a shopping mall; or the two may cooperate in the acquisition.
Similarly, the above-mentioned voice acquisition by the robot may use the robot's own voice acquisition device and/or an external voice acquisition device communicatively connected to the robot.
It should also be noted that, after it is determined that an object has been recognized and before the robot is controlled to perform image and/or voice acquisition, the robot may be controlled, according to the perceived direction information of the object, to turn toward the direction in which the recognized object is located, and then be controlled to perform the image and/or voice acquisition. This ensures that the recognized object appears in the acquired image and voice, so that the biometric feature information subsequently extracted is more complete, which in turn makes the finally determined target interactive object more accurate.
In addition, the acquired image described in this embodiment is not limited to image information such as photos; it may also be image information in a video, which is not limited here.
It should be noted that the above are merely examples. In practical applications, those skilled in the art may make reasonable arrangements according to the technical means they master, as long as the target interactive object can be determined from the at least one recognized object according to the extracted biometric feature information.
In step 102, the target interactive object that needs to be interacted with is determined from the at least one object according to the biometric feature information.
In this embodiment, the operation of determining, according to the biometric feature information, the target interactive object that needs to be interacted with from the at least one object may specifically be accomplished in the following manner:
First, it is determined, according to the biometric feature information, which of the at least one object are candidate interactive objects. For convenience of description, this embodiment is described with the candidate interactive objects being people.
Specifically, in practical applications, an object approaching the robot is not necessarily an object that needs to interact; the approaching object may be, for example, a toy or another terminal device rather than a person. Therefore, non-human objects can be excluded by comparing the extracted biometric feature information with pre-stored sample information of humans, ensuring the accuracy of subsequent operations.
In addition, when it is determined that the recognized objects include multiple people, each person's biometric features, such as displacement direction (whether the person is moving toward the robot) and eye information (whether the person is looking at the robot), can be further analyzed to determine whether each person is seeking help, and the people who are genuinely seeking help are determined to be candidate interactive objects.
Then, from the determined candidate interactive objects, one candidate that satisfies the requirements is selected as the target interactive object, i.e., the object with which the robot finally chooses to perform human-computer interaction.
Specifically, if the number of candidate interactive objects is equal to 1, that candidate is directly determined to be the target interactive object. If the number of candidate interactive objects is greater than 1, a priority is set for each candidate according to a preset priority setting condition, and the candidate with the highest priority is determined to be the target interactive object.
For ease of understanding, a specific description is given below with reference to Figure 2.
As shown in Figure 2, three objects, A, B, and C, appear within the range that the robot can recognize, and after evaluation against the biometric feature information, all three objects satisfy the interaction condition, i.e., all are candidate interactive objects. In this case, the target interactive object can be determined by priority, for example by setting a priority for each candidate according to its location information.
Specifically, as shown in Figure 2, the obtained location information of candidate A is (x0, y0), that of candidate B is (x1, y1), and that of candidate C is (x2, y2). Using the distance formula d = √((x − xr)² + (y − yr)²), where (xr, yr) is the robot's position, the distances of candidates A, B, and C from the robot can be calculated as d0, d1, and d2, respectively. If d2 < d0 < d1, then according to the preset priority setting condition (the closer to the robot, the higher the priority; the farther from the robot, the lower the priority), priorities are set for candidates A, B, and C as follows: candidate C (highest priority), candidate B (lowest priority), and candidate A (priority between those of C and B). It can thus be determined that candidate C is the target interactive object.
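The selection rule just described can be sketched as follows. The candidate coordinates are hypothetical values standing in for (x0, y0), (x1, y1), (x2, y2), and the robot is placed at the origin for the illustration:

```python
import math

def select_target(candidates, robot_pos=(0.0, 0.0)):
    """Pick the target interactive object: a single candidate is chosen
    directly; otherwise the nearest candidate has the highest priority."""
    if len(candidates) == 1:
        return next(iter(candidates))
    return min(
        candidates,
        key=lambda name: math.dist(robot_pos, candidates[name]),
    )

# Hypothetical positions giving d2 < d0 < d1, as in the Figure 2 example.
candidates = {"A": (3.0, 0.0), "B": (0.0, 4.0), "C": (1.0, 1.0)}
target = select_target(candidates)
```

With these positions, d0 = 3, d1 = 4, and d2 = √2, so candidate C is selected, matching the example in the text.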
It should also be noted that, in practical applications, multiple candidate interactive objects may be at the same distance from the robot. In such a case, the priority decision can be made according to the principle of the minimum angle the robot needs to turn through in order to move toward each candidate.
It should be noted that the above are merely examples and do not limit the technical solution or the protection scope of the present application. In practical applications, those skilled in the art may make reasonable arrangements according to actual needs, which are not limited here.
In step 103, the location information of the target interactive object is obtained.
In step 104, the robot is controlled to move toward the target interactive object according to the location information.
Specifically, after the target interactive object is determined, the robot can be controlled, according to the obtained location information of the target interactive object, to move toward the target interactive object, so that the robot can proactively perform interaction, improving the user experience.
Compared with the prior art, the human-computer interaction method provided in this embodiment enables the robot to respond only to objects that need to interact, thereby effectively avoiding erroneous responses and greatly improving the user experience.
A second embodiment of the present application relates to a human-computer interaction method. This embodiment is a further improvement on the basis of the first embodiment, the specific improvement being that, in the process of controlling the robot to make a response matching the target interactive object, the identity information of the target interactive object is also obtained, and after the robot moves to the region where the target interactive object is located, a response matching the target interactive object is made according to the identity information. For convenience of description, a specific description is given below with reference to Figures 3 and 4.
Specifically, this embodiment includes steps 301 to 305, where steps 301, 302, and 304 are roughly the same as steps 101, 102, and 104 in the first embodiment, respectively, and are not repeated here. The differences are mainly introduced below. For technical details not described in detail in this embodiment, reference may be made to the human-computer interaction method provided in the first embodiment, which is not repeated here.
In step 303, the location information and identity information of the target interactive object are obtained.
Taking a human target interactive object as an example, the identity information obtained in this embodiment may include any one or any combination of name, gender, age, whether the person is a VIP client, and similar information.
It should be noted that the above items of identity information can specifically be obtained by matching, via face recognition technology, the information of the target interactive object against the face data stored in a face database of recorded business-handling users for the venue where the robot is located (such as a bank business hall). Upon a successful match, the associated identity information of the recorded business-handling user can be obtained directly. If the match is unsuccessful, the gender and approximate age range are first determined by face recognition technology, and the identity information of the target interactive object is then further refined through an Internet search.
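The match-then-fallback flow above can be sketched as follows. This is a hedged toy illustration: the similarity function is an exact-match placeholder, and the gender/age estimators are stubs standing in for real face-analysis models, which the text does not specify:

```python
def similarity(a, b):
    """Toy similarity: exact match only; a real system would compare
    face embeddings and return a score."""
    return 1.0 if a == b else 0.0

def estimate_gender(face_vector):
    return "unknown"   # placeholder for a face-analysis model

def estimate_age_range(face_vector):
    return (20, 40)    # placeholder for a face-analysis model

def identify(face_vector, face_db, threshold=0.9):
    """Match against the venue's face database; return the stored identity
    on success, or coarse attributes (gender, age range) on failure."""
    for record in face_db.values():
        if similarity(face_vector, record["face"]) >= threshold:
            return record["identity"]
    return {"gender": estimate_gender(face_vector),
            "age_range": estimate_age_range(face_vector)}

# Hypothetical database with one recorded business-handling user.
face_db = {"u1": {"face": "vec_zhang",
                  "identity": {"name": "Zhang Yi", "vip": True}}}
matched = identify("vec_zhang", face_db)
unmatched = identify("vec_new", face_db)
```

The Internet-search refinement step mentioned in the text is omitted here, since its mechanics are not described.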
It should also be noted that, in practical applications, when determining the target interactive object, the identity information of the candidate interactive objects may also be taken into account, for example by setting the priorities of the candidates according to a VIP parameter carried in the identity information while also considering factors such as distance. For ease of understanding, a specific description is given below with reference to Figure 4.
Specifically, three candidate interactive objects, A, B, and C, are within the range that the robot can recognize, and the location information and identity information of each candidate are as marked in Figure 4, where the distances of candidates A, B, and C from the robot are d0, d1, and d2, respectively, with d2 < d0 < d1.
In this case, the target interactive object may be determined by prioritizing the distance factor, selecting candidate C as the target interactive object; by prioritizing the VIP factor, selecting candidate A as the target interactive object; or by prioritizing the age factor, preferentially determining the oldest candidate to be the target interactive object.
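The three prioritization options can be sketched as follows. The candidate data are hypothetical values mirroring the Figure 4 example (d2 < d0 < d1, with A as the VIP); the field names are assumptions for the illustration:

```python
def select_target(candidates, factor="distance"):
    """Choose the target according to one prioritized factor.
    `candidates` maps name -> {"distance": ..., "vip": ..., "age": ...}."""
    if factor == "distance":
        return min(candidates, key=lambda n: candidates[n]["distance"])
    if factor == "vip":
        # VIPs first; among equals, nearest first.
        return min(candidates,
                   key=lambda n: (not candidates[n]["vip"],
                                  candidates[n]["distance"]))
    if factor == "age":
        return max(candidates, key=lambda n: candidates[n]["age"])
    raise ValueError("unknown factor: " + factor)

candidates = {
    "A": {"distance": 2.0, "vip": True,  "age": 35},
    "B": {"distance": 3.0, "vip": False, "age": 62},
    "C": {"distance": 1.0, "vip": False, "age": 28},
}
by_distance = select_target(candidates, "distance")
by_vip = select_target(candidates, "vip")
by_age = select_target(candidates, "age")
```

With this data the three factors pick C, A, and B respectively, matching the three outcomes described in the text.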
It should be noted that the above are merely illustrative examples and do not limit the technical solution or the protection scope of the present application. In practical applications, those skilled in the art may make reasonable arrangements according to actual needs, which are not limited here.
In step 305, after the robot moves to the region where the target interactive object is located, a response matching the target interactive object is made according to the identity information.
For example, if the target interactive object is C in Figure 4, then after moving to the region where target interactive object C is located (for example, a position one meter away from the target interactive object), the robot can proactively perform a service inquiry or business guidance, such as: "Hello, Mr. Zhang Yi, may I ask what business you need to handle?"
Further, to improve the user experience, after the inquiry to target interactive object C is made and while waiting for C's answer, a voice prompt such as "There are many guests at the moment; please wait patiently!" may also be given to candidates A and B.
It should be noted that the above are merely examples and do not limit the technical solution or the protection scope of the present application. In practical applications, those skilled in the art may make reasonable arrangements according to actual needs, which are not limited here.
Compared with the prior art, in the human-computer interaction method provided in this embodiment, when the location information of the target interactive object is obtained, the identity information of the target interactive object is further obtained, so that after the robot moves, according to the location information, to the region where the target interactive object is located, it can make a response matching the target interactive object according to the identity information, further improving the user experience.
A third embodiment of the present application relates to a human-computer interaction method. This embodiment is a further improvement on the basis of the first or second embodiment, the specific improvement being that, after the robot is controlled to make a response matching the target interactive object, when the target interactive object that needs to be interacted with is re-determined, it is first necessary to judge whether any new object has approached the robot. The detailed flow is shown in Figure 5.
Specifically, this embodiment includes steps 501 to 508, where steps 501 to 504 are roughly the same as steps 101 to 104 in the first embodiment, respectively, and are not repeated here. The differences are mainly introduced below. For technical details not described in detail in this embodiment, reference may be made to the human-computer interaction method provided in the first or second embodiment, which is not repeated here.
In step 505, it is judged whether any new object has approached the robot. If it is determined that a new object has approached the robot, step 506 is entered; otherwise, step 507 is entered directly, and one candidate interactive object is chosen again, from the candidates remaining from the previous interaction, to be the target interactive object.
Specifically, in this embodiment, whether any new object has approached the robot may be judged in the manner described in the first embodiment: if a new object is detected approaching within the preset range (for example, 5 meters) centered on the robot's current position, it is determined that a new object has approached the robot. The specific judgment operation is not repeated here.
In addition, it should be explained that, in this embodiment, there may be one new object approaching the robot, or more than one, which is not limited here.
In step 506, the biometric feature information of the new object is extracted.
In step 507, the target interactive object that needs to be interacted with is re-determined.
Specifically, the re-determined target interactive object in this embodiment is chosen from the new objects together with the objects other than the target interactive object of the previous interaction.
For ease of understanding, a specific description is given below:
In practical applications, especially in public places with heavy foot traffic, there may at the same time be multiple objects that need to interact with the robot (i.e., according to the biometric feature information of the recognized objects, more than one object is determined to be a candidate interactive object that needs to interact). However, during human-computer interaction, the robot can at any one moment respond to only one candidate interactive object (the selected target interactive object); only after completing one interaction can it interact with the other candidates. Moreover, after one interaction is completed, besides the previously determined candidate interactive objects still waiting around the robot for a response, new objects needing interaction may also have appeared. Therefore, in this case, the operation of re-determining the target interactive object must choose one candidate, from the newly confirmed candidate interactive objects together with the candidates remaining from the previous human-computer interaction, to be the target interactive object.
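The candidate-pool update just described can be sketched as follows; the object names are hypothetical and stand in for whatever identifiers the robot tracks:

```python
def rebuild_candidate_pool(previous_candidates, last_target, new_candidates):
    """After one interaction completes, the next target is chosen from the
    candidates left over from the previous round (minus the just-served
    target) plus any newly confirmed candidates."""
    remaining = [c for c in previous_candidates if c != last_target]
    return remaining + [c for c in new_candidates if c not in remaining]

# Previous round had A, B, C; C was just served; D newly confirmed.
pool = rebuild_candidate_pool(
    previous_candidates=["A", "B", "C"],
    last_target="C",
    new_candidates=["D"],
)
```

The next target would then be selected from this pool by priority, exactly as in the first embodiment.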
In addition, it should be explained that the manner of re-determining the target interactive object in this embodiment is roughly the same as the determination manner in the first embodiment: it is first determined, according to the biometric feature information, which recognized objects are candidate interactive objects, and the target interactive object that finally needs to be interacted with is then chosen from the candidates. The specific implementation details are not repeated here.
In addition, regarding the selection of the target interactive object, in this embodiment the selection can still be made according to the priority of each candidate interactive object; naturally, the new target interactive object may also be determined according to other selection modes, which are not limited here.
In step 508, the robot is controlled to make a response matching the re-determined target interactive object.
Specifically, the response process of controlling the robot to make a response matching the re-determined target interactive object may be: moving toward the target interactive object and, after moving to the region where the target interactive object is located, proactively performing service consultation or business guidance. The specific response mode can be configured according to the relevant information of the re-determined target interactive object, which is not limited here.
It should be noted that the above are merely examples and do not limit the technical solution or the protection scope of the present application. In practical applications, those skilled in the art may make reasonable arrangements according to actual needs, which are not limited here.
Compared with the prior art, in the human-computer interaction method provided in this embodiment, after one human-computer interaction operation is completed, the method monitors whether any new object approaches the robot. When it is determined that a new object has approached, the biometric feature information of the newly appearing object is extracted and it is determined whether the newly appearing object is a candidate interactive object. If it is, one candidate is chosen again, from the newly confirmed candidate interactive objects and the candidates remaining from the previous interaction, to be the target interactive object, and human-computer interaction is then performed. If the newly appearing object is not a candidate interactive object, one candidate is chosen again, directly from the candidates remaining from the previous interaction, to be the target interactive object, and human-computer interaction is then performed.
From the foregoing description, it is readily apparent that the human-computer interaction method provided in this embodiment enables the robot to dynamically update its perception of the states of objects during operation, so that it can accurately make responses suited to the current scene, reducing erroneous operations and further improving the user experience.
A fourth embodiment of the present application relates to a human-computer interaction apparatus applied to a robot. The specific structure is shown in Figure 6.
As shown in Figure 6, the human-computer interaction apparatus includes an extraction module 601, a determination module 602, and a control module 603.
The extraction module 601 is configured to extract the biometric feature information of at least one recognized object.
The determination module 602 is configured to determine, according to the biometric feature information, the target interactive object that needs to be interacted with from the at least one object.
The control module 603 is configured to control the robot to make a response matching the target interactive object.
Specifically, in this embodiment, the biometric feature information of the at least one recognized object extracted by the extraction module 601 may specifically be either or both of physiological feature information and behavioral feature information.
It should also be noted that the physiological feature information extracted by the extraction module 601 in this embodiment may specifically be any one or any combination of the object's facial information, eye information, voiceprint information, and the like, and the behavioral feature information extracted by the extraction module 601 may specifically be any one or any combination of the object's displacement information, voice content information, and the like.
When the determination module 602 determines the target interactive object from the at least one object according to the above biometric feature information, it may specifically operate as follows: first, determine from the biometric feature information which recognized objects are candidate interactive objects (objects that need to interact), for example by analyzing the object's gaze from its eye information together with its displacement information to determine whether the object is currently seeking help and is therefore a candidate interactive object; then, after the candidate interactive objects are determined, choose from among them one object that satisfies the requirements as the target interactive object (the object that finally needs to be interacted with).
In addition, in this embodiment, the control module 603 controlling the robot to make a response matching the target interactive object may specifically be controlling the robot to move toward the target interactive object.
Further, after the robot moves to the region where the target interactive object is located, the robot can be controlled to make a matching response according to the identity information of that object, such as a proactive service inquiry or business guidance, which may specifically be: "Hello, may I ask what business you would like to handle?"
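The three-module structure of Figure 6 can be sketched as a single class whose methods stand for the modules. This is a hedged skeleton only: the feature names, the candidate test, and the response text are placeholders, not the apparatus's actual internals:

```python
class HumanComputerInteractionApparatus:
    """Minimal sketch of the extraction / determination / control modules."""

    def extraction_module(self, recognized_objects):
        # Module 601: extract biometric feature information per object
        # (placeholder features for the illustration).
        return {obj: {"moving_toward_robot": True} for obj in recognized_objects}

    def determination_module(self, features):
        # Module 602: keep candidates that need interaction, pick one target.
        candidates = [o for o, f in features.items() if f["moving_toward_robot"]]
        return candidates[0] if candidates else None

    def control_module(self, target):
        # Module 603: make a response matching the target interactive object.
        if target is None:
            return None
        return "Hello, may I ask what business you would like to handle?"

    def run(self, recognized_objects):
        features = self.extraction_module(recognized_objects)
        target = self.determination_module(features)
        return target, self.control_module(target)

apparatus = HumanComputerInteractionApparatus()
target, response = apparatus.run(["obj1"])
```

The point of the sketch is the data flow — extraction feeds determination, which feeds control — rather than any particular module implementation.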
It should be noted that the above are merely examples and do not limit the technical solution or the protection scope of the present application. In practical applications, those skilled in the art may make reasonable arrangements according to actual needs, which are not limited here.
In addition, for technical details not described in detail in this embodiment, reference may be made to the human-computer interaction method provided in any embodiment of the present application, which is not repeated here.
From the foregoing description, it is readily apparent that, in the human-computer interaction apparatus provided in this embodiment, the extraction module extracts the biometric feature information of at least one recognized object, the determination module determines from the at least one object, according to the biometric feature information, the target interactive object that needs to be interacted with, and the control module then controls the robot to make a response matching the target interactive object. Through the direct cooperation of these modules, a robot equipped with the human-computer interaction apparatus can respond only to objects that need to interact, thereby effectively avoiding erroneous responses and greatly improving the user experience.
The apparatus embodiment described above is merely exemplary and does not limit the protection scope of the present application. In practical applications, those skilled in the art may select some or all of the modules therein according to actual needs to achieve the purpose of this embodiment's solution, which is not limited here.
A fifth embodiment of the present application relates to a robot, the specific structure of which is shown in Figure 7.
The robot may be an intelligent machine device located in a public place such as a bank business hall, a large shopping mall, or an airport. It specifically includes one or more processors 701 and a memory 702; one processor 701 is taken as an example in Figure 7.
In this embodiment, each functional module of the human-computer interaction apparatus involved in the above embodiments is deployed on the processor 701. The processor 701 may be connected to the memory 702 by a bus or in other ways; connection by a bus is taken as an example in Figure 7.
As a computer-readable storage medium, the memory 702 can be used to store software programs, computer-executable programs, and modules, such as the program instructions/modules corresponding to the human-computer interaction method involved in any method embodiment of the present application. By running the software programs, instructions, and modules stored in the memory 702, the processor 701 executes the various functional applications and data processing of the server, i.e., implements the human-computer interaction method involved in any method embodiment of the present application.
The memory 702 may include a program storage area and a data storage area, where the program storage area can store an operating system and the application program required by at least one function, and the data storage area can maintain a historical database for storing the priority setting conditions and the like. In addition, the memory 702 may include high-speed random access memory (Random Access Memory, RAM) and may also include other readable and writable memory. In some embodiments, the memory 702 optionally includes memory remotely located relative to the processor 701, and such remote memory can be connected to the terminal device through a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
In practical applications, the memory 702 can store instructions executable by the at least one processor 701; the instructions are executed by the at least one processor 701, so that the at least one processor 701 can execute the human-computer interaction method involved in any method embodiment of the present application and control each functional module in the human-computer interaction apparatus to complete the operations in the human-computer interaction method. For technical details not described in detail in this embodiment, reference may be made to the human-computer interaction method provided in any embodiment of the present application.
A sixth embodiment of the present application relates to a computer-readable storage medium storing computer instructions that enable a computer to execute the human-computer interaction method involved in any method embodiment of the present application.
Those skilled in the art will understand that the above embodiments are specific embodiments for implementing the present application, and that in practical applications various changes in form and detail may be made to them without departing from the spirit and scope of the present application.