CN107020637A - Emotion expression method for a pet robot, and pet robot - Google Patents
- Publication number
- CN107020637A (application CN201610067839.6A)
- Authority
- CN
- China
- Prior art keywords
- emotion
- type
- pet robot
- expression service
- emotion expression
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/0005—Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
Landscapes
- Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Mechanical Engineering (AREA)
- Robotics (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Manipulator (AREA)
Abstract
The present invention provides an emotion expression method for a pet robot, and a pet robot, applicable to the field of robot technology. The method includes: acquiring emotional information of a user, and determining an emotion type of the pet robot according to the emotional information of the user; determining, according to a pre-stored mapping list, an emotion expression part of the pet robot and an emotion expression instruction corresponding to the emotion type; and controlling the emotion expression part to execute the emotion expression instruction. The present invention determines the emotion type of the pet robot from the user's emotional information and expresses the pet robot's emotion through its limb movements, thereby improving the degree of interaction between the pet robot and the user, improving the emotion expression effect of the pet robot, enhancing the fun of use, and improving the user experience.
Description
Technical field
The present invention belongs to the field of robot technology, and more particularly relates to an emotion expression method for a pet robot, and a pet robot.
Background technology
A robot is an automated device that performs work. It can accept human commands, run pre-programmed routines, and act according to principles formulated with artificial intelligence technology. The most important entertainment function of a pet robot is its ability to interact with humans. However, existing pet robots can generally only express emotions through a display screen, which is of limited interest and gives a poor emotion expression effect, so they cannot provide the user with a good experience. Meanwhile, existing pet robots can only passively receive external emotional input and cannot actively perceive changes in an external user's emotions so as to interact with the user accordingly.
The content of the invention
In view of this, embodiments of the present invention provide an emotion expression method for a pet robot, and a pet robot, so as to solve the problems that the emotion expression mode of existing pet robots is limited to a single channel with a poor emotion expression effect, and that existing pet robots can only passively receive external emotional input and cannot actively perceive changes in an external user's emotions so as to interact with the user accordingly.
In a first aspect, an embodiment of the present invention provides an emotion expression method for a pet robot, including:
acquiring emotional information of a user, and determining an emotion type of the pet robot according to the emotional information of the user;
determining, according to a pre-stored mapping list, an emotion expression part of the pet robot and an emotion expression instruction corresponding to the emotion type; and
controlling the emotion expression part to execute the emotion expression instruction.
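As a sketch only (not part of the patent), the three steps of the first aspect could be modeled as follows. The contents of `EMOTION_MAP`, the function names, and the actuator interface are all illustrative assumptions:

```python
# Illustrative sketch of the three-step method; the mapping contents,
# function names, and actuator interface are assumptions, not from the patent.

# Pre-stored mapping list: emotion type -> (expression parts, instructions)
EMOTION_MAP = {
    "happy": (["wings"], ["flap_up_down"]),
    "angry": (["wings", "torso", "right_leg", "right_foot"],
              ["spread_still", "lean_left", "swing", "stamp"]),
}

def determine_emotion_type(user_emotion_info):
    # Step 1: derive the robot's emotion type from the user's emotional
    # information (simplest possible policy: mirror the user's emotion).
    return user_emotion_info

def express_emotion(user_emotion_info, actuators):
    emotion_type = determine_emotion_type(user_emotion_info)
    # Step 2: look up the expression parts and instructions in the mapping list.
    parts, instructions = EMOTION_MAP[emotion_type]
    # Step 3: control each expression part to execute its instruction.
    for part, instruction in zip(parts, instructions):
        actuators[part](instruction)
```

Here each actuator is any callable that drives one expression part; a real robot would dispatch to motor controllers instead.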
In a second aspect, an embodiment of the present invention provides a pet robot with an emotion expression function, including a pet robot body provided with:
an emotion type determining unit, configured to acquire emotional information of a user and determine an emotion type of the pet robot according to the emotional information of the user;
an emotion expression part and emotion expression instruction determining unit, configured to determine, according to a pre-stored mapping list, an emotion expression part of the pet robot and an emotion expression instruction corresponding to the emotion type; and
a control unit, configured to control the emotion expression part to execute the emotion expression instruction.
Compared with the prior art, the embodiments of the present invention have the following beneficial effects: the pet robot actively acquires the user's emotional information, determines the emotion type of the pet robot according to that information, determines the corresponding emotion expression part and emotion expression instruction of the pet robot from a pre-stored mapping list, and then controls the determined emotion expression part to execute the emotion expression instruction. The pet robot thus actively perceives changes in the external user's emotions, determines its own emotion type from the user's emotional information, and expresses its emotion through limb movements, thereby improving the degree of interaction between the pet robot and the user, improving the emotion expression effect of the pet robot, enhancing the fun of use, and improving the user experience.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Apparently, the accompanying drawings in the following description show only some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from these accompanying drawings without creative effort.
Fig. 1 is a flowchart of the emotion expression method for a pet robot according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of the pet robot's action when its emotion type is happiness, in the emotion expression method according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of the pet robot's action when its emotion type is anger, in the emotion expression method according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of the pet robot's action when its emotion type is sorrow, in the emotion expression method according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of the pet robot's action when its emotion type is joy, in the emotion expression method according to an embodiment of the present invention;
Fig. 6 is a flowchart of the emotion expression method for a pet robot according to another embodiment of the present invention;
Fig. 7 is a structural block diagram of the pet robot according to an embodiment of the present invention.
Embodiment
To make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely intended to explain the present invention, not to limit it.
Fig. 1 shows the implementation flow of the emotion expression method for a pet robot according to an embodiment of the present invention, detailed as follows:
In step S101, emotional information of a user is acquired, and an emotion type of the pet robot is determined according to the emotional information of the user.
It should be noted that the pet robot in the embodiments of the present invention may be shaped according to the user's preference, for example designed as a penguin robot, a dog-shaped robot, a cat-shaped robot, or a squirrel robot.
Preferably, acquiring the emotional information of the user includes:
acquiring the emotional information of the user through a camera, a microphone, and/or a pressure sensor.
In one embodiment of the present invention, a facial image of the user is acquired through a camera, the user's facial expression is determined from the facial image, and the user's emotional information is determined from the facial expression. For example, when the user's facial expression is a smile, the user's emotional information is "happy", and the emotion type of the pet robot determined from this emotional information is happiness.
In another embodiment of the present invention, the user's volume and voice frequency are acquired through a microphone, and the user's emotional information is determined from the volume and voice frequency. For example, when the user's volume is lower than a first preset value and the user's voice frequency is lower than a second preset value, the user's emotional information is determined to be "sad", and the emotion type of the pet robot determined from this emotional information is sorrow.
In yet another embodiment of the present invention, the user's emotional information is acquired through a pressure sensor and/or an acceleration sensor, and the emotion type of the pet robot is determined from it. For example, when the pressure sensor detects that the user is hugging the pet robot, the emotion type of the pet robot is determined to be happiness; when the pressure sensor and the acceleration sensor detect that the user is shaking the pet robot forcefully, the emotion type of the pet robot is determined to be anger.
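The microphone embodiment above can be sketched minimally as a threshold rule. The numeric preset values below are placeholders, since the patent only speaks of a "first preset value" and a "second preset value" without giving numbers:

```python
# Hypothetical preset values; the patent does not specify them.
FIRST_PRESET_VOLUME = 40.0       # e.g. relative volume units
SECOND_PRESET_FREQUENCY = 150.0  # e.g. Hz

def classify_from_voice(volume, frequency):
    """Low volume AND low voice frequency -> the user sounds sad, so the
    robot's emotion type is 'sorrow'; other cue combinations would be
    handled by the camera or pressure-sensor embodiments."""
    if volume < FIRST_PRESET_VOLUME and frequency < SECOND_PRESET_FREQUENCY:
        return "sorrow"
    return None
```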
Preferably, the emotion type includes happiness, anger, sorrow, and/or joy.
In step S102, the emotion expression part of the pet robot and the emotion expression instruction corresponding to the emotion type are determined according to the pre-stored mapping list.
Preferably, each emotion type corresponds to at least one emotion expression part. In the embodiments of the present invention, each emotion expression instruction corresponds to an emotion expression part; the emotion expression instruction is an action instruction and/or a facial expression instruction.
In step S103, the emotion expression part is controlled to execute the emotion expression instruction.
Preferably, the emotion expression part includes a forelimb, a hind limb, a torso, a head, and/or a face; the hind limb includes a leg and a foot.
In one embodiment of the present invention, when the pet robot is a penguin robot, the forelimbs may be a pair of wings.
Preferably, when the emotion type is happiness, the emotion expression part corresponding to the emotion type is the two wings, and the corresponding emotion expression instruction is to flap the two wings up and down. Fig. 2 shows a schematic diagram of the pet robot's action in the emotion expression method according to an embodiment of the present invention when its emotion type is happiness. Referring to Fig. 2, when the emotion type of the pet robot is happiness, the corresponding emotion expression part is the two wings, and controlling the emotion expression part to execute the emotion expression instruction specifically means controlling the two wings to flap up and down; a facial expression may also be shown at the same time, for example a happy expression presented on the face.
Preferably, when the emotion type is anger, the emotion expression parts corresponding to the emotion type are the two wings, the torso, the right leg, and the right foot, and the corresponding emotion expression instruction is: the two wings spread out and hold still, the torso leans slightly to the left, the right leg swings, and the right foot stamps. Fig. 3 shows a schematic diagram of the pet robot's action in the emotion expression method according to an embodiment of the present invention when its emotion type is anger. Referring to Fig. 3, when the emotion type of the pet robot is anger, the corresponding emotion expression parts include the two wings, the torso, the right leg, and the right foot, and controlling the emotion expression parts to execute the emotion expression instruction specifically means controlling the two wings to spread out and hold still, controlling the torso to lean slightly to the left, controlling the right leg to swing, and controlling the right foot to stamp; a facial expression may also be shown at the same time, for example an angry expression presented on the face.
Preferably, when the emotion type is sorrow, the emotion expression part corresponding to the emotion type is the head, and the corresponding emotion expression instruction is: the head turns toward the shoulder and is lowered. Fig. 4 shows a schematic diagram of the pet robot's action in the emotion expression method according to an embodiment of the present invention when its emotion type is sorrow. Referring to Fig. 4, when the emotion type of the pet robot is sorrow, the corresponding emotion expression part includes the head, and controlling the emotion expression part to execute the emotion expression instruction specifically means controlling the head to turn toward the shoulder and lower slightly; a facial expression may also be shown at the same time, for example a sad expression presented on the face.
Preferably, when the emotion type is joy, the emotion expression parts corresponding to the emotion type are the two wings and the torso, and the corresponding emotion expression instruction is: the two wings flap up and down and the torso swings from side to side. Fig. 5 shows a schematic diagram of the pet robot's action in the emotion expression method according to an embodiment of the present invention when its emotion type is joy. Referring to Fig. 5, when the emotion type of the pet robot is joy, the corresponding emotion expression parts include the two wings and the torso, and controlling the emotion expression parts to execute the emotion expression instruction specifically means controlling the two wings to flap up and down and controlling the torso to swing from side to side; a facial expression may also be shown at the same time, for example a happy expression presented on the face.
Preferably, the action instruction includes action type information, action amplitude information, action frequency information, and/or action duration information corresponding to the emotion expression part.
For example, when the emotion type of the pet robot is happiness, the corresponding emotion expression part is the two wings; the corresponding action type information is flapping up and down; the action amplitude information is the amplitude of the up-and-down flapping; the action frequency information is the frequency of the flapping, for example once per second; and the action duration information is the total duration for which the wings are controlled to flap.
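The four action parameters above could be grouped into a record such as the following. This is a sketch; the field names and numeric values are illustrative, not from the patent:

```python
from dataclasses import dataclass

@dataclass
class ActionCommand:
    part: str             # emotion expression part, e.g. "wings"
    action_type: str      # action type information, e.g. "flap_up_down"
    amplitude_deg: float  # action amplitude information
    frequency_hz: float   # action frequency information; 1.0 = once per second
    duration_s: float     # action duration information (total time)

# The "happiness" example above: wings flapping once per second for 5 seconds.
happy_flap = ActionCommand("wings", "flap_up_down",
                           amplitude_deg=30.0, frequency_hz=1.0, duration_s=5.0)
# Number of flaps implied by frequency and duration:
total_flaps = happy_flap.frequency_hz * happy_flap.duration_s
```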
Fig. 6 shows the implementation flow of the emotion expression method for a pet robot according to another embodiment of the present invention. Referring to Fig. 6:
In step S601, emotional information of a user is acquired, and an emotion type of the pet robot is determined according to the emotional information of the user;
In step S602, the emotion expression part of the pet robot and the emotion expression instruction corresponding to the emotion type are determined according to the pre-stored mapping list, and audio information corresponding to the emotion type is acquired;
In step S603, the emotion expression part is controlled to execute the emotion expression instruction, and the audio information is played.
For example, the audio information corresponding to the emotion type "happiness" is a happy cry; the audio information corresponding to "anger" is an angry cry; the audio information corresponding to "sorrow" is a sad cry; and the audio information corresponding to "joy" is a joyful cry.
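The emotion-to-audio correspondence in this example might be stored as a simple lookup; the file names below are hypothetical:

```python
# Hypothetical audio lookup for the extended flow of Fig. 6;
# file names are illustrative placeholders.
AUDIO_MAP = {
    "happiness": "cry_happy.wav",
    "anger": "cry_angry.wav",
    "sorrow": "cry_sad.wav",
    "joy": "cry_joyful.wav",
}

def get_audio_for_emotion(emotion_type):
    # Returns None when no audio is stored for the emotion type.
    return AUDIO_MAP.get(emotion_type)
```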
It should be understood that, in the embodiments of the present invention, the sequence numbers of the above processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present invention.
In the embodiments of the present invention, the pet robot actively acquires the user's emotional information, determines the emotion type of the pet robot according to that information, determines the corresponding emotion expression part and emotion expression instruction of the pet robot from a pre-stored mapping list, and then controls the determined emotion expression part to execute the emotion expression instruction. The pet robot thus actively perceives changes in the external user's emotions, determines its own emotion type from the user's emotional information, and expresses its emotion through limb movements, thereby improving the degree of interaction between the pet robot and the user, improving the emotion expression effect of the pet robot, enhancing the fun of use, and improving the user experience.
Fig. 7 shows a structural block diagram of the pet robot with an emotion expression function according to an embodiment of the present invention. The pet robot may be used to implement the emotion expression methods for a pet robot shown in Fig. 1 to Fig. 6. The pet robot includes a pet robot body. Referring to Fig. 7, the pet robot body is provided with:
an emotion type determining unit 71, configured to acquire emotional information of a user and determine an emotion type of the pet robot according to the emotional information of the user;
an emotion expression part and emotion expression instruction determining unit 72, configured to determine, according to a pre-stored mapping list, an emotion expression part of the pet robot and an emotion expression instruction corresponding to the emotion type; and
a control unit 73, configured to control the emotion expression part to execute the emotion expression instruction.
The pet robot body further includes an emotion expression part 74 connected to the control unit 73.
Preferably, the emotion type determining unit 71 is specifically configured to:
acquire the emotional information of the user through a camera, a microphone, and/or a pressure sensor.
Preferably, the pet robot body is further provided with:
an audio information acquiring unit 75 connected to the emotion type determining unit 71, configured to acquire audio information corresponding to the emotion type; and
a playback unit 76, configured to play the audio information.
Preferably, the pet robot body is further provided with:
an audio data storage unit 77 connected to the audio information acquiring unit 75, wherein the audio information acquiring unit 75 acquires the audio information corresponding to the emotion type from the audio data storage unit 77.
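The unit decomposition of Fig. 7 (units 71 to 77) might be organized as in the following sketch; the class, method, and file names are illustrative assumptions, not taken from the patent:

```python
class PetRobotBody:
    """Hypothetical composition of the units 71-77 described above."""

    def __init__(self, sensors, actuators, speaker):
        self.sensors = sensors       # camera / microphone / pressure sensor
        self.actuators = actuators   # emotion expression parts (74)
        self.speaker = speaker       # playback unit (76)
        self.audio_store = {         # audio data storage unit (77)
            "happy": "cry_happy.wav",
            "sorrow": "cry_sad.wav",
        }
        self.mapping_list = {        # pre-stored mapping list
            "happy": ("wings", "flap_up_down"),
            "sorrow": ("head", "turn_and_lower"),
        }

    def determine_emotion_type(self):        # emotion type determining unit (71)
        return self.sensors.read_user_emotion()

    def express(self):                       # units 72, 73, 75, 76 in sequence
        emotion = self.determine_emotion_type()
        part, instruction = self.mapping_list[emotion]   # determining unit (72)
        self.actuators[part](instruction)                # control unit (73)
        audio = self.audio_store.get(emotion)            # acquiring unit (75)
        if audio is not None:
            self.speaker(audio)                          # playback unit (76)
```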
Preferably, the action instruction includes action type information, action amplitude information, action frequency information, and/or action duration information corresponding to the emotion expression part.
Preferably, the emotion type includes happiness, anger, sorrow, and/or joy.
Preferably, the emotion expression part 74 includes a forelimb, a leg, a foot, a torso, a head, and/or a face.
In the embodiments of the present invention, the pet robot actively acquires the user's emotional information, determines the emotion type of the pet robot according to that information, determines the corresponding emotion expression part and emotion expression instruction of the pet robot from a pre-stored mapping list, and then controls the determined emotion expression part to execute the emotion expression instruction. The pet robot thus actively perceives changes in the external user's emotions, determines its own emotion type from the user's emotional information, and expresses its emotion through limb movements, thereby improving the degree of interaction between the pet robot and the user, improving the emotion expression effect of the pet robot, enhancing the fun of use, and improving the user experience.
A person of ordinary skill in the art may appreciate that the units and algorithm steps of the examples described with reference to the embodiments disclosed herein can be implemented by electronic hardware or by a combination of computer software and electronic hardware. Whether these functions are performed by hardware or software depends on the particular application and design constraints of the technical solution. A skilled person may use different methods to implement the described functions for each particular application, but such implementation should not be considered beyond the scope of the present invention.
It may be clearly understood by a person skilled in the art that, for convenience and brevity of description, for the specific working processes of the pet robot and the units described above, reference may be made to the corresponding processes in the foregoing method embodiments; details are not repeated here.
In the several embodiments provided in this application, it should be understood that the disclosed pet robot and method may be implemented in other ways. For example, the pet robot embodiment described above is merely illustrative; the division into units is merely a logical function division, and in actual implementation there may be other division manners: multiple units may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be implemented through some interfaces, and the indirect couplings or communication connections between units may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network elements. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
When the functions are implemented in the form of a software functional unit and sold or used as an independent product, they may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present invention essentially, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of the present invention. The foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The foregoing descriptions are merely specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any variation or replacement readily conceivable by a person skilled in the art within the technical scope disclosed in the present invention shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (12)
1. An emotion expression method for a pet robot, characterized by comprising:
acquiring emotional information of a user, and determining an emotion type of the pet robot according to the emotional information of the user;
determining, according to a pre-stored mapping list, an emotion expression part of the pet robot and an emotion expression instruction corresponding to the emotion type; and
controlling the emotion expression part to execute the emotion expression instruction.
2. The method according to claim 1, characterized in that acquiring the emotional information of the user comprises:
acquiring the emotional information of the user through a camera, a microphone, and/or a pressure sensor.
3. The method according to claim 1, characterized in that, after the emotion type of the pet robot is determined according to the emotional information of the user, the method further comprises:
acquiring audio information corresponding to the emotion type; and
playing the audio information.
4. The method according to any one of claims 1 to 3, characterized in that the emotion expression instruction is an action instruction and/or a facial expression instruction.
5. The method according to claim 4, characterized in that the action instruction comprises action type information, action amplitude information, action frequency information, and/or action duration information corresponding to the emotion expression part.
6. The method according to claim 5, characterized in that the emotion expression part comprises a forelimb, a hind limb, a torso, a head, and/or a face.
7. The method according to claim 6, characterized in that the pet robot is a penguin robot, a dog-shaped robot, a cat-shaped robot, or a squirrel robot; and the forelimbs of the penguin robot are a pair of wings.
8. The method according to claim 7, characterized in that the emotion type comprises happiness, anger, sorrow, and/or joy;
when the emotion type is happiness, the emotion expression part corresponding to the emotion type is the two wings, and the corresponding emotion expression instruction is that the two wings flap up and down and the face presents a happy expression;
when the emotion type is anger, the emotion expression parts corresponding to the emotion type are the two wings, the torso, the right leg, and the right foot, and the corresponding emotion expression instruction is that the two wings spread out and hold still, the torso leans slightly to the left, the right leg swings, the right foot stamps, and the face presents an angry expression;
when the emotion type is sorrow, the emotion expression part corresponding to the emotion type is the head, and the corresponding emotion expression instruction is that the head turns toward the shoulder and is lowered, and the face presents a sad expression; and
when the emotion type is joy, the emotion expression parts corresponding to the emotion type are the two wings and the torso, and the corresponding emotion expression instruction is that the two wings flap up and down, the torso swings from side to side, and the face presents a happy expression.
9. A pet robot with an emotion expression function, comprising a pet robot body, characterized in that the pet robot body is provided with:
an emotion type determining unit, configured to acquire emotional information of a user and determine an emotion type of the pet robot according to the emotional information of the user;
an emotion expression part and emotion expression instruction determining unit, configured to determine, according to a pre-stored mapping list, an emotion expression part of the pet robot and an emotion expression instruction corresponding to the emotion type; and
a control unit, configured to control the emotion expression part to execute the emotion expression instruction.
10. The pet robot according to claim 9, characterized in that the emotion type determining unit acquires the emotional information of the user through a camera, a microphone, and/or a pressure sensor.
11. The pet robot according to claim 9 or 10, characterized in that the pet robot body is further provided with:
an audio information acquiring unit connected to the emotion type determining unit, configured to acquire audio information corresponding to the emotion type; and
a playback unit, configured to play the audio information.
12. The pet robot according to claim 11, characterized in that the pet robot body is further provided with:
an audio data storage unit connected to the audio information acquiring unit, wherein the audio information acquiring unit acquires the audio information corresponding to the emotion type from the audio data storage unit.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610067839.6A CN107020637A (en) | 2016-01-29 | 2016-01-29 | Emotion expression method for a pet robot, and pet robot |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107020637A (en) | 2017-08-08 |
Family
ID=59525186
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610067839.6A Pending CN107020637A (en) | 2016-01-29 | 2016-01-29 | The emotion expression method and pet robot of pet robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107020637A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1942289A (en) * | 2004-04-16 | 2007-04-04 | 松下电器产业株式会社 | Robot, hint output device, robot control system, robot control method, robot control program, and integrated circuit |
CN101567058A (en) * | 2009-05-31 | 2009-10-28 | 塔米智能科技(北京)有限公司 | Method and system for managing emotion generation and representation of robot |
JP2012000694A (en) * | 2010-06-14 | 2012-01-05 | Fujitsu Ltd | Method and program for controlling robot, and robot |
CN102955565A (en) * | 2011-08-31 | 2013-03-06 | 德信互动科技(北京)有限公司 | Man-machine interaction system and method |
CN103218654A (en) * | 2012-01-20 | 2013-07-24 | 沈阳新松机器人自动化股份有限公司 | Robot emotion generating and expressing system |
CN104158964A (en) * | 2014-08-05 | 2014-11-19 | 广东欧珀移动通信有限公司 | Intelligent emotion expression method of intelligent mobile phone |
- 2016-01-29: Application CN201610067839.6A filed (published as CN107020637A, status pending)
Non-Patent Citations (1)
Title |
---|
ASMEW Institute of Biomedical Engineering, Waseda University, Japan: "Whole body Emotion Expressions for KOBIAN Humanoid Robot – preliminary experiments with different emotional patterns", 18th IEEE International Symposium on Robot and Human Interactive Communication * |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108741822A (en) * | 2018-05-22 | 2018-11-06 | 国家电网公司客户服务中心南方分中心 | A kind of call center's customer service seat work chair |
CN108741822B (en) * | 2018-05-22 | 2021-06-08 | 国家电网公司客户服务中心南方分中心 | Calling center customer service seat worker's chair |
CN110625608A (en) * | 2018-06-21 | 2019-12-31 | 卡西欧计算机株式会社 | Robot, robot control method, and storage medium |
CN111191765A (en) * | 2019-12-31 | 2020-05-22 | 华为技术有限公司 | Emotional information processing method and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10922866B2 (en) | Multi-dimensional puppet with photorealistic movement | |
US11452941B2 (en) | Emoji-based communications derived from facial features during game play | |
CN109789550B (en) | Control of social robots based on previous character depictions in novels or shows | |
US10105608B1 (en) | Applying participant metrics in game environments | |
KR101907136B1 (en) | System and method for avatar service through cable and wireless web | |
CN107340865A (en) | Multi-modal virtual robot exchange method and system | |
CN109086860B (en) | Interaction method and system based on virtual human | |
CN109064387A (en) | Image special effect generation method, device and electronic equipment | |
CN112379780B (en) | Multi-mode emotion interaction method, intelligent device, system, electronic device and medium | |
WO2023015921A1 (en) | Animation data processing method, non-volatile storage medium and electronic device | |
CN109324688A (en) | Exchange method and system based on visual human's behavioral standard | |
US11978145B2 (en) | Expression generation for animation object | |
CN107020637A (en) | The emotion expression method and pet robot of pet robot | |
JPH11508491A (en) | Installations and methods for controlling movable equipment (apparatus) | |
KR102222911B1 (en) | System for Providing User-Robot Interaction and Computer Program Therefore | |
CN108052250A (en) | Virtual idol deductive data processing method and system based on multi-modal interaction | |
KR20220094008A (en) | Apparatus and method for creating content based on digital human using artificial intelligence | |
CN108416420A (en) | Limbs exchange method based on visual human and system | |
WO2021174144A1 (en) | Systems and methods for interactive, multimodal book reading | |
EP4315261A1 (en) | Artificial intelligence for capturing facial expressions and generating mesh data | |
CN108681398A (en) | Visual interactive method and system based on visual human | |
CN108415561A (en) | Gesture interaction method based on visual human and system | |
EP2447908A2 (en) | Imaging device and computer reading and recording medium | |
CN103207745A (en) | Virtual avatar interacting system and method | |
Huang | Embodied conversational agents |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 2017-08-08