CN103927091A - Man-machine interaction method, device and system - Google Patents


Info

Publication number
CN103927091A
Authority
CN
China
Prior art keywords
interactive device
virtual image
instruction
interactive
user
Prior art date
Legal status
Granted
Application number
CN201310014368.9A
Other languages
Chinese (zh)
Other versions
CN103927091B (en)
Inventor
姚洪杰
Current Assignee
Huawei Device Co Ltd
Huawei Device Shenzhen Co Ltd
Original Assignee
Huawei Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Device Co Ltd
Priority to CN201310014368.9A
Priority to PCT/CN2014/070596
Publication of CN103927091A
Application granted
Publication of CN103927091B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 3/1423: Digital output to display device; cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present invention disclose a man-machine interaction method, apparatus, and system, relating to the field of communication technologies. The method, apparatus, and system can enrich the modes of man-machine interaction and thereby effectively improve the user experience. The method includes: sending a first-interactive-device activation instruction for a virtual image to a first interactive device, so that the first interactive device activates the virtual image and displays it on the first interactive device; receiving a second-interactive-device calling instruction from a user, where the calling instruction instructs that the virtual image be activated on a second interactive device; sending a first-interactive-device hiding instruction for the virtual image to the first interactive device, so that the first interactive device hides the virtual image displayed on it; and sending a second-interactive-device activation instruction for the virtual image to the second interactive device, so that the second interactive device activates the virtual image and displays it on the second interactive device.

Description

Man-machine interaction method, apparatus, and system
Technical field
The present invention relates to the field of communication technologies, and in particular to a man-machine interaction method, apparatus, and system.
Background
Human-computer interaction (HCI) technology refers to the technologies that allow a person and a computer to hold a dialog in an effective manner through the computer's input and output devices. It covers both the machine presenting relevant information and prompts to the person through output or display devices, and the person supplying relevant information and responses to the machine through input devices. Human-computer interaction technology is now widely applied in fields such as mobile phones, tablet computers, and televisions.
At present, the main mode of man-machine interaction is that the machine provides a human-computer interaction interface for the user, and the user and the machine exchange information through that interface, for example by voice interaction. The interface may include a dialog box displayed on the machine, and may also include a virtual image displayed on the machine, for example a cartoon character.
However, this existing mode of man-machine interaction is rather monotonous. From the user's point of view, the interaction is with a lifeless machine, so it lacks the sense of presence of a face-to-face exchange between people; the interaction feels cold and the user experience is poor.
Summary of the invention
The present invention provides a man-machine interaction method, apparatus, and system that can enrich the modes of man-machine interaction and effectively improve the user experience.
According to a first aspect of the present invention, a man-machine interaction method is provided, the method including:
sending a first-interactive-device activation instruction for a virtual image to a first interactive device, so that the first interactive device activates the virtual image and displays the virtual image on the first interactive device;
receiving a second-interactive-device calling instruction from a user, where the calling instruction instructs that the virtual image be activated on a second interactive device;
sending a first-interactive-device hiding instruction for the virtual image to the first interactive device, so that the first interactive device hides the virtual image displayed on the first interactive device; and
sending a second-interactive-device activation instruction for the virtual image to the second interactive device, so that the second interactive device activates the virtual image and displays the virtual image on the second interactive device.
In a first possible implementation of the first aspect, the user's second-interactive-device calling instruction includes a voice instruction, a text instruction, or a motion-sensing instruction.
With reference to the first aspect or the first possible implementation of the first aspect, in a second possible implementation of the first aspect, after the receiving of the user's second-interactive-device calling instruction and before the sending of the first-interactive-device hiding instruction for the virtual image to the first interactive device, the method further includes:
querying the current state of the virtual image to determine that the virtual image is in the activated-and-displayed state on the first interactive device.
With reference to the first aspect, or the first or second possible implementation of the first aspect, in a third possible implementation of the first aspect, the sending of the first-interactive-device hiding instruction for the virtual image to the first interactive device, so that the first interactive device hides the virtual image displayed on the first interactive device, includes:
sending the first-interactive-device hiding instruction for the virtual image to the first interactive device, so that the first interactive device hides the virtual image, where the virtual image either fades away gradually or disappears completely at once on the first interactive device.
With reference to the first aspect, or any one of the first to third possible implementations of the first aspect, in a fourth possible implementation of the first aspect, the sending of the second-interactive-device activation instruction for the virtual image to the second interactive device, so that the second interactive device activates the virtual image and displays it on the second interactive device, includes:
sending the second-interactive-device activation instruction for the virtual image to the second interactive device, so that the second interactive device activates the virtual image, where the virtual image is displayed either gradually or completely at once on the second interactive device.
According to a second aspect of the present invention, a man-machine interaction apparatus is provided, the apparatus including:
a control module, configured to send a first-interactive-device activation instruction for a virtual image to the first interactive device, so that the first interactive device activates the virtual image and displays the virtual image on the first interactive device; and
a receiving unit, configured to receive a second-interactive-device calling instruction from a user, where the calling instruction instructs that the virtual image be activated on the second interactive device;
where the control module is further configured to send a first-interactive-device hiding instruction for the virtual image to the first interactive device, so as to hide the virtual image displayed on the first interactive device; and
the control module is further configured to send a second-interactive-device activation instruction for the virtual image to the second interactive device, so that the second interactive device activates the virtual image and displays the virtual image on the second interactive device.
In a first possible implementation of the second aspect, the user's second-interactive-device calling instruction received by the receiving unit includes a voice instruction, a text instruction, or a motion-sensing instruction.
With reference to the second aspect or the first possible implementation of the second aspect, in a second possible implementation of the second aspect, the apparatus further includes:
a query unit, configured to query the current state of the virtual image after the receiving unit receives the user's second-interactive-device calling instruction and before the control module sends the first-interactive-device hiding instruction for the virtual image to the first interactive device, so as to determine that the virtual image is in the activated-and-displayed state on the first interactive device.
With reference to the second aspect, or the first or second possible implementation of the second aspect, in a third possible implementation of the second aspect, the control module is configured to:
send the first-interactive-device hiding instruction for the virtual image to the first interactive device, so that the first interactive device hides the virtual image, where the virtual image either fades away gradually or disappears completely at once on the first interactive device.
With reference to the second aspect, or any one of the first to third possible implementations of the second aspect, in a fourth possible implementation of the second aspect, the control module is configured to:
send the second-interactive-device activation instruction for the virtual image to the second interactive device, so that the second interactive device activates the virtual image, where the virtual image is displayed either gradually or completely at once on the second interactive device.
According to a third aspect of the present invention, a man-machine interaction system is provided, the system including:
at least two interactive devices, the at least two interactive devices including a first interactive device and a second interactive device; and
a human-computer interaction apparatus, connected to each of the at least two interactive devices and configured to:
send a first-interactive-device activation instruction for a virtual image to the first interactive device, so that the first interactive device activates the virtual image and displays the virtual image on the first interactive device;
receive a second-interactive-device calling instruction from a user, where the calling instruction instructs that the virtual image be activated on the second interactive device;
send a first-interactive-device hiding instruction for the virtual image to the first interactive device, so that the first interactive device hides the virtual image displayed on the first interactive device; and
send a second-interactive-device activation instruction for the virtual image to the second interactive device, so that the second interactive device activates the virtual image and displays the virtual image on the second interactive device.
In an implementation of the third aspect, an interactive device in the man-machine interaction system includes a mobile phone, a tablet computer, a personal computer, a television, a game console, or a network media player.
With the man-machine interaction method, apparatus, and system provided by the present invention, after the user's second-interactive-device calling instruction is received, an instruction is sent to the first interactive device, which then hides the virtual image on the first interactive device, and an instruction is sent to the second interactive device, which then activates and displays the virtual image on the second interactive device. From the user's point of view, when the user calls the virtual image, the virtual image appears to leave the first interactive device and arrive on the second interactive device. In other words, the virtual image behaves like a real living being with a mind of its own: it understands the user's instruction and moves from one interactive device to another according to that instruction. On the one hand, this enriches the modes of man-machine interaction; on the other hand, because the virtual image resembles a genuinely existing living being, the user perceives the interaction as being with something alive, which increases the user's sense of presence and effectively improves the user experience.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings required for describing the embodiments are briefly introduced below. Apparently, the accompanying drawings described below show only some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of a man-machine interaction method according to an embodiment of the present invention;
Fig. 2 is a schematic structural diagram of a human-computer interaction apparatus according to an embodiment of the present invention;
Fig. 3 is another schematic structural diagram of a human-computer interaction apparatus according to an embodiment of the present invention;
Fig. 4 is a schematic structural diagram of a human-computer interaction apparatus according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of a system architecture of a man-machine interaction system according to an embodiment of the present invention;
Fig. 6 is another schematic diagram of a system architecture of a man-machine interaction system according to an embodiment of the present invention;
Fig. 7 is an illustrative diagram of a man-machine interaction method according to an embodiment of the present invention;
Fig. 8 is another illustrative diagram of a man-machine interaction method according to an embodiment of the present invention;
Fig. 9 is another schematic flowchart of a man-machine interaction method according to an embodiment of the present invention;
Fig. 10 is another schematic flowchart of a man-machine interaction method according to an embodiment of the present invention.
Embodiment
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Apparently, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
An embodiment of the present invention provides a man-machine interaction method, performed by a human-computer interaction apparatus. As shown in Fig. 1, the method includes the following steps:
S11. Send a first-interactive-device activation instruction for a virtual image to the first interactive device, so that the first interactive device activates the virtual image and displays the virtual image on the first interactive device.
It should first be noted that, in the embodiments of the present invention, an interactive device may be a mobile phone, a tablet computer, a personal computer, a smart television, or the like.
In the embodiments of the present invention, the "virtual image" is a personified or anthropomorphic figure that is displayed on an interactive device while the user interacts with that device, so that, in form, the interaction between the user and the interactive device appears to be an interaction between the user and the virtual image: the user issues an instruction and the virtual image responds to it. The instruction issued by the user may be a voice instruction, a text instruction, or an action instruction such as a gesture, and the virtual image's response may be speech uttered by the virtual image, a body movement of the virtual image, or speech accompanied by a body movement. The present invention does not limit the form of the user's instructions or of the virtual image's responses; a person skilled in the art may configure them freely.
In the embodiments of the present invention, the virtual image may be rendered as a two-dimensional, flat image or as a three-dimensional, stereoscopic image; the present invention does not limit this. Moreover, the appearance of the virtual image can be chosen from a rich variety of forms. For example, the virtual image may be designed as a human figure, such as a beautiful woman or a cute little girl; it may be designed as an animal figure, such as a lively little dog; or it may even be designed as a personified figure that does not exist in reality, such as a fairy or a robot. In particular, commercial elements may be added to the design of the virtual image; for example, the virtual image may be designed in the likeness of a celebrity brand ambassador, so that advertising can be carried out through the virtual image's dress, speech, and/or actions. Clearly, such a form of publicity is people-friendly and easier for users to accept. It can be understood that the present invention does not limit the appearance of the virtual image.
It should be noted that the virtual image displayed on an interactive device may be a static image. Of course, to increase the user's sense that the virtual image is real, the virtual image displayed on the interactive device may also be designed as a dynamic image that, like a person, can move while speaking, or remain silent while still making movements.
Specifically, in this step, after the human-computer interaction apparatus sends the first-interactive-device activation instruction for the virtual image, the virtual image appears on the first interactive device (that is, on the display screen of the first interactive device). The virtual image may gradually emerge on the first interactive device, for example in a manner resembling a person walking in, or it may appear on the first interactive device completely at once. Optionally, after the virtual image appears on the first interactive device, it may speak, for example saying "I have come, what can I do for you?" or "Hello", and it may also make an action such as waving. It can be understood that the above manners of activating and displaying the virtual image are only examples and do not limit the present invention.
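By way of illustration only, the following minimal sketch shows how such a first-interactive-device activation instruction might be represented and serialized; the message format, field names, identifiers, and greeting text are assumptions made for this sketch and are not specified by the embodiment.

```python
# Minimal sketch (assumption): a first-interactive-device activation instruction
# represented as a small JSON message. Field names, identifiers, and the
# greeting text are illustrative only.
from dataclasses import dataclass, asdict
from enum import Enum
import json


class Appearance(Enum):
    GRADUAL = "gradual"      # the image emerges gradually, e.g. as if walking in
    IMMEDIATE = "immediate"  # the image is displayed completely at once


@dataclass
class ActivationInstruction:
    device_id: str                      # target interactive device
    avatar_id: str                      # which virtual image to activate
    appearance: Appearance = Appearance.GRADUAL
    greeting: str | None = "I have come, what can I do for you?"

    def to_message(self) -> str:
        payload = asdict(self)
        payload["appearance"] = self.appearance.value
        return json.dumps(payload)


# The human-computer interaction apparatus would send this message to the
# first interactive device over whatever connection links them.
print(ActivationInstruction(device_id="first_interactive_device",
                            avatar_id="sprite").to_message())
```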
As can be seen from the above, in the embodiments of the present invention, the user is no longer interacting with a cold piece of equipment but exchanging information with a lively virtual image, and this mode of interaction can markedly improve the user experience.
S12. Receive a second-interactive-device calling instruction from the user, where the calling instruction instructs that the virtual image be activated on the second interactive device.
In this step, the user wants the virtual image to appear on the second interactive device and therefore sends a second-interactive-device calling instruction to the human-computer interaction apparatus.
Specifically, the user's calling instruction may include a voice instruction, a text instruction, or a motion-sensing instruction. A voice instruction may be a spoken phrase such as "come over", "little one, come here", "come here to me", or "sprite, come over", used to call the virtual image over to the second interactive device. A text instruction is an instruction that the user types through an input device such as a keyboard, for example typing "come over" or "come over quickly", likewise asking the virtual image to appear on the second interactive device. A motion-sensing instruction is a calling instruction issued through a gesture or another body movement; for example, the user makes a beckoning motion toward the virtual image currently shown on some interactive device, intending to call the virtual image over to the interactive device at the user's side. It can be understood that the embodiments of the present invention do not limit the type of calling instruction or its specific content; a person skilled in the art may choose them freely.
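The following minimal sketch illustrates how a received input might be normalized into one of the three calling-instruction types named above; the phrase list and gesture names are illustrative assumptions only.

```python
# Minimal sketch (assumption): normalizing the user's input into one of the
# three calling-instruction types. The phrase list and gesture names are
# illustrative only.
from enum import Enum, auto


class CallKind(Enum):
    VOICE = auto()    # spoken phrase
    TEXT = auto()     # typed text
    MOTION = auto()   # gesture / body movement


CALL_PHRASES = {"come over", "come here", "come here to me", "sprite, come over"}
CALL_GESTURES = {"beckon", "wave_toward_image"}


def is_calling_instruction(kind: CallKind, content: str) -> bool:
    """Return True if the input should be treated as a second-interactive-device
    calling instruction."""
    if kind in (CallKind.VOICE, CallKind.TEXT):
        return content.strip().lower() in CALL_PHRASES
    return content in CALL_GESTURES


print(is_calling_instruction(CallKind.VOICE, "come here"))    # True
print(is_calling_instruction(CallKind.MOTION, "beckon"))      # True
```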
S13. Send a first-interactive-device hiding instruction for the virtual image to the first interactive device, so that the first interactive device hides the virtual image displayed on the first interactive device.
Optionally, in the embodiments of the present invention, when the virtual image displayed on the first interactive device is hidden, the virtual image may fade away gradually on the first interactive device, as if a person were walking away, or it may simply disappear completely at once on the first interactive device.
S14. Send a second-interactive-device activation instruction for the virtual image to the second interactive device, so that the second interactive device activates the virtual image and displays the virtual image on the second interactive device.
Similarly and optionally, when the virtual image is made to appear on the second interactive device, it may gradually emerge on the second interactive device, as if a person were walking over, or it may simply be displayed completely at once on the second interactive device.
To make the virtual image appear lively, like a real living being, in the embodiments of the present invention, after the user's second-interactive-device calling instruction is received in step S12, an instruction is sent to the first interactive device in step S13, and on receiving it the first interactive device hides the virtual image; then an instruction is sent to the second interactive device in step S14, and on receiving it the second interactive device activates and displays the virtual image. In this way, after the user calls the virtual image, the user perceives the virtual image as having "traveled" from the first interactive device to the second interactive device, leaving the first device and arriving on the second. This clearly gives the impression that the virtual image is like a real living being with a mind of its own: it understands the user's instruction and moves from one interactive device to another according to that instruction.
It should be noted that step S13 and step S14 may be performed in sequence, or the timing of the two steps may be designed so that while the virtual image fades away on the first interactive device it gradually appears on the second interactive device, for example so that the portion of the virtual image that has disappeared on the first interactive device is shown on the second interactive device at the same time, which further strengthens the user's impression that the virtual image is a real living being.
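A minimal sketch of such coordinated timing is given below, assuming a simple per-device "visibility" value between 0 and 1 and a hypothetical send_visibility() call standing in for the instructions actually sent to the devices.

```python
# Minimal sketch (assumption): coordinating the hiding and activation
# instructions so that the fraction of the image hidden on the first device
# is shown on the second device at the same moment. send_visibility() stands
# in for the instructions actually sent to the interactive devices.
import time


def send_visibility(device_id: str, visibility: float) -> None:
    # Placeholder for the real instruction sent to the interactive device.
    print(f"{device_id}: visibility = {visibility:.2f}")


def cross_fade(source_id: str, target_id: str, steps: int = 5, step_s: float = 0.2) -> None:
    """Fade the virtual image out on the source device while the same fraction
    of the image fades in on the target device."""
    for i in range(1, steps + 1):
        shown_on_target = i / steps
        send_visibility(source_id, 1.0 - shown_on_target)   # fading away (step S13)
        send_visibility(target_id, shown_on_target)         # appearing gradually (step S14)
        time.sleep(step_s)


cross_fade("first_interactive_device", "second_interactive_device")
```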
In this way, on the one hand, the modes of man-machine interaction are enriched: man-machine interaction is no longer limited to interaction between the user and a single interactive device, because the "traveling" of the virtual image allows the user to interact with several interactive devices. On the other hand, the "traveling" of the virtual image makes it resemble a genuinely existing living being, so the user perceives the interaction as being with something alive, which increases the user's sense of presence and effectively improves the user experience.
Optionally, to further increase the user's sense of presence and improve the user experience, in an embodiment of the present invention, after step S14, that is, after the virtual image is displayed on the second interactive device, the man-machine interaction method provided in this embodiment of the present invention further includes:
sending a calling response to the user through the virtual image displayed on the second interactive device.
That is, after the virtual image is displayed on the second interactive device, the virtual image may respond to the user's call in a personified manner. The response may be at least one of outputting speech, making a body movement, and outputting text. For example, after the virtual image is displayed on the second interactive device, it may output speech and talk to the user, saying something like "I have come", "I have come, what is it?", "I have come, what can I do for you?", or "Happy to be at your service"; the present invention does not limit this. The virtual image may also make a body movement, for example waving to greet the user, or it may pop up a dialog box showing text similar to the speech above, for example a dialog box containing the words "I have come, what is it?". These three manners may also be used simultaneously or in any combination of two; for example, the virtual image may move while it speaks.
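The following minimal sketch shows one way a calling response combining these modes might be represented; the field names, phrases, and action names are illustrative assumptions only.

```python
# Minimal sketch (assumption): a calling response assembled from one or more
# of the three modes described above. Field names, phrases, and action names
# are illustrative only.
from dataclasses import dataclass, field


@dataclass
class CallingResponse:
    speech: str | None = None                           # output voice
    actions: list[str] = field(default_factory=list)    # body movements
    dialog_text: str | None = None                      # text in a pop-up dialog box

    def parts(self):
        if self.speech:
            yield ("voice", self.speech)
        for action in self.actions:
            yield ("action", action)
        if self.dialog_text:
            yield ("dialog", self.dialog_text)


# The modes may be combined, e.g. the image waves while it speaks.
response = CallingResponse(speech="I have come, what can I do for you?", actions=["wave"])
for mode, content in response.parts():
    print(mode, content)
```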
It should be noted that, the mode of above-mentioned response is only example, and the present invention is not done to any restriction, and those skilled in the art can select arbitrarily virtual image to call the mode of response.
Be understandable that, the method of the man-machine interaction that the embodiment of the present invention provides is applied in man-machine interactive system, this man-machine interactive system comprises the first interactive device, the second interactive device and human-computer interaction device, and human-computer interaction device is connected with the second interactive device with the first interactive device respectively.But the embodiment of the present invention is not limited to this, in fact, the applied man-machine interactive system of method of the man-machine interaction that the embodiment of the present invention provides can comprise at least two interactive devices, human-computer interaction device is connected with these at least two interactive devices respectively, and these at least two interactive devices comprise the first interactive device and the second interactive device.
In order effectively to ensure the true life sense of virtual image for user, preferably, in the man-machine interaction method providing at the present embodiment, no matter in man-machine interactive system, have how many interactive devices, same time virtual image can only activate and be presented on an interactive device.Like this, user can feel, virtual image similarly is a more real life, only has one in space, the virtual image that different time is presented on distinct interaction equipment is because the movement of virtual image occurs, thereby has effectively ensured the true life sense of virtual image for user.
Because man-machine interactive system comprises at least two interactive devices, therefore, concrete, after S12 step, before S13 step, receive after the second interactive device calling instruction of user, the man-machine interaction method providing in the embodiment of the present invention, also comprises:
The current state of inquiry virtual image, inquire about virtual image current is activation show state in which interactive device in man-machine interactive system.
In the present embodiment, the state of the virtual image of the multiple interactive devices of preservation that this human-computer interaction device can be real-time, also can be understood as, and the relation mapping table of the state of an interactive device and virtual image is set in this human-computer interaction device, and real-time update.In the time will inquiring about the current state of virtual image, directly query relation mapping table, just can determine interactive device corresponding to virtual image in activating show state.In the present embodiment, also can by send inquiry request to multiple interactive devices to obtain the current state of the virtual image in this interactive device.In the present embodiment, when the current state of the virtual image on the first interactive device is while activating show state, and then, in S13 step, by sending instruction, the virtual image on the first interactive device is hidden, in S14 step, on the second interactive device, show virtual image by sending instruction.
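A minimal sketch of such a relation mapping table is shown below, assuming in-memory storage and the state and device names used here for illustration; it also reflects the preference stated above that the virtual image be in the activated-and-displayed state on at most one interactive device at a time.

```python
# Minimal sketch (assumption): an in-memory relation mapping table between
# interactive devices and the state of the virtual image. State names and
# device identifiers are illustrative only.
from enum import Enum


class AvatarState(Enum):
    HIDDEN = "hidden"
    ACTIVE = "active"   # activated-and-displayed state


class RelationMappingTable:
    def __init__(self, device_ids):
        self._states = {device_id: AvatarState.HIDDEN for device_id in device_ids}

    def update(self, device_id: str, state: AvatarState) -> None:
        """Real-time update of the state reported for one interactive device."""
        self._states[device_id] = state

    def active_device(self) -> str | None:
        """Return the interactive device on which the virtual image is currently
        activated and displayed, or None. At any one time at most one device is
        expected to be in the ACTIVE state."""
        for device_id, state in self._states.items():
            if state is AvatarState.ACTIVE:
                return device_id
        return None


table = RelationMappingTable(["digital_tv_614", "pad_613", "phone_611", "smart_tv_612"])
table.update("digital_tv_614", AvatarState.ACTIVE)
print(table.active_device())   # digital_tv_614
```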
With the man-machine interaction method of this embodiment of the present invention, after the user's second-interactive-device calling instruction is received, an instruction is sent to the first interactive device, which then hides the virtual image on the first interactive device, and an instruction is sent to the second interactive device, which then activates and displays the virtual image on the second interactive device. From the user's point of view, when the user calls the virtual image, the virtual image appears to leave the first interactive device and arrive on the second interactive device; that is, the virtual image behaves like a real living being with a mind of its own, understanding the user's instruction and moving from one interactive device to another according to it. On the one hand, this enriches the modes of man-machine interaction; on the other hand, because the virtual image resembles a genuinely existing living being, the user perceives the interaction as being with something alive, which increases the user's sense of presence and effectively improves the user experience.
Correspondingly, an embodiment of the present invention further provides a human-computer interaction apparatus 20. As shown in Fig. 2, the apparatus includes:
a control module 21, configured to send a first-interactive-device activation instruction for a virtual image to the first interactive device, so that the virtual image is displayed on the first interactive device; and
a receiving unit 22, configured to receive a second-interactive-device calling instruction from a user, where the calling instruction instructs that the virtual image be activated on the second interactive device;
where the control module 21 is further configured to send a first-interactive-device hiding instruction for the virtual image to the first interactive device, so as to hide the virtual image displayed on the first interactive device; and
the control module 21 is further configured to send a second-interactive-device activation instruction for the virtual image to the second interactive device, so that the second interactive device activates the virtual image and displays the virtual image on the second interactive device.
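By way of illustration, the following minimal sketch models the receiving unit 22 and the control module 21 of Fig. 2 as two cooperating objects; the send() callback, method names, and device identifiers exist only for this sketch and are not part of the embodiment.

```python
# Minimal sketch (assumption): the receiving unit 22 and control module 21 of
# Fig. 2 modeled as two cooperating objects. The send() callback, method names,
# and device identifiers exist only for this sketch.
class ReceivingUnit:
    """Receiving unit 22: accepts the user's second-interactive-device calling
    instruction and hands it to the control module."""

    def __init__(self, control_module):
        self._control = control_module

    def on_calling_instruction(self, target_device: str) -> None:
        self._control.move_avatar_to(target_device)


class ControlModule:
    """Control module 21: sends activation and hiding instructions."""

    def __init__(self, send, first_device: str):
        self._send = send
        self._active_on = first_device
        self._send(first_device, "activate")       # first-interactive-device activation

    def move_avatar_to(self, target_device: str) -> None:
        if target_device == self._active_on:
            return                                  # already displayed there
        self._send(self._active_on, "hide")         # hide on the first interactive device
        self._send(target_device, "activate")       # activate on the second interactive device
        self._active_on = target_device


control = ControlModule(send=lambda device, op: print(device, op),
                        first_device="first_interactive_device")
receiving = ReceivingUnit(control)
receiving.on_calling_instruction("second_interactive_device")
```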
With the man-machine interaction apparatus provided in this embodiment of the present invention, after the receiving unit 22 receives the user's second-interactive-device calling instruction, the control module 21 sends a first-interactive-device hiding instruction for the virtual image to the first interactive device, so as to hide the virtual image that was activated and displayed on the first interactive device, and sends a second-interactive-device activation instruction for the virtual image to the second interactive device, so as to activate and display the virtual image on the second interactive device. From the user's point of view, when the user calls the virtual image, the virtual image appears to leave the first interactive device and arrive on the second interactive device; that is, it behaves like a real living being with a mind of its own, understanding the user's instruction and moving from one interactive device to another according to it. On the one hand, this enriches the modes of man-machine interaction; on the other hand, because the virtual image resembles a genuinely existing living being, the user perceives the interaction as being with something alive, which increases the user's sense of presence and effectively improves the user experience.
Specifically, the control module 21 sends the first-interactive-device activation instruction for the virtual image to the first interactive device; after receiving it, the first interactive device activates the virtual image and displays it on the first interactive device (that is, on its display screen). The virtual image may gradually emerge on the first interactive device, for example in a manner resembling a person walking in, or it may appear on the first interactive device completely at once.
Optionally, after the virtual image appears on the first interactive device, it may speak, for example saying "I have come, what can I do for you?" or "Hello", and it may also make an action such as waving. It can be understood that the above manners of activating and displaying the virtual image are only examples and do not limit the present invention.
Specifically, the user's calling instruction received by the receiving unit 22 may include a voice instruction, a text instruction, or a motion-sensing instruction. A voice instruction may be a spoken phrase such as "come over", "little one, come here", "come here to me", or "sprite, come over", used to call the virtual image over to the second interactive device. A text instruction is an instruction that the user types through an input device such as a keyboard, for example "come over" or "come over quickly", likewise asking the virtual image to appear on the second interactive device. A motion-sensing instruction is a calling instruction issued through a gesture or another body movement, for example a beckoning motion made toward the virtual image currently shown on some interactive device, intended to call the virtual image over to the interactive device at the user's side. It can be understood that the embodiments of the present invention do not limit the type of calling instruction or its specific content; a person skilled in the art may choose them freely.
Optionally, when the control module 21 sends the first-interactive-device hiding instruction for the virtual image to the first interactive device, so that the first interactive device hides the virtual image currently displayed on it, the virtual image may fade away gradually on the first interactive device, as if a person were walking away, or it may simply disappear completely at once on the first interactive device.
Optionally, when the control module 21 sends the second-interactive-device activation instruction for the virtual image to the second interactive device, so that the second interactive device activates the virtual image and displays it, the virtual image may gradually emerge on the second interactive device, as if a person were walking over, or it may simply be displayed completely at once on the second interactive device.
To make the virtual image appear lively, like a real living being, in this embodiment of the present invention, after the receiving unit 22 receives the user's second-interactive-device calling instruction, the control module 21 sends the first-interactive-device hiding instruction for the virtual image to the first interactive device, so that the first interactive device hides the virtual image displayed on it, and sends the second-interactive-device activation instruction for the virtual image to the second interactive device, so that the second interactive device activates and displays the virtual image. In this way, after the user calls the virtual image, the user perceives the virtual image as having "traveled" from the first interactive device to the second interactive device, leaving the first device and arriving on the second, which clearly gives the impression that the virtual image is like a real living being with a mind of its own, able to understand the user's instruction and to move from one interactive device to another according to it.
Optionally, to further increase the user's sense of presence and improve the user experience, after the control module 21 causes the virtual image to be displayed on the second interactive device by sending the instruction, the virtual image may send a calling response to the user, where the calling response includes outputting speech, making a body movement, and/or outputting text.
It should be noted here that the virtual image may respond to the user's call in a personified manner, and the response may be at least one of outputting speech, making a body movement, and outputting text. For example, after the virtual image is displayed on the second interactive device, it may output speech and talk to the user, saying something like "I have come", "I have come, what is it?", "I have come, what can I do for you?", or "Happy to be at your service"; the present invention does not limit this. The virtual image may also make a body movement, for example waving to greet the user, or it may pop up a dialog box showing text similar to the speech above, for example a dialog box containing the words "I have come, what is it?". These manners may also be used simultaneously or in any combination of two; for example, the virtual image may move while it speaks.
It should be noted that the above manners of responding are only examples and do not limit the present invention; a person skilled in the art may freely choose the manner in which the virtual image responds to a call.
Optionally, in an embodiment of the present invention, as shown in Fig. 3, the human-computer interaction apparatus provided in this embodiment of the present invention further includes:
a query unit 23, configured to query the current state of the virtual image after the receiving unit 22 receives the user's second-interactive-device calling instruction and before the control module 21 sends the first-interactive-device hiding instruction for the virtual image, that is, to query on which interactive device in the man-machine interaction system where the human-computer interaction apparatus is located the virtual image is currently in the activated-and-displayed state. In this embodiment, the human-computer interaction apparatus may keep the states of the virtual image on the multiple interactive devices up to date in real time; this can also be understood as maintaining, inside the human-computer interaction apparatus, a relation mapping table between each interactive device and the state of the virtual image and updating it in real time. When the current state of the virtual image needs to be queried, the relation mapping table is consulted directly to determine the interactive device on which the virtual image is in the activated-and-displayed state. In this embodiment, the current state of the virtual image on each interactive device may also be obtained by sending a query request to the multiple interactive devices.
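The following minimal sketch illustrates the second querying strategy, in which the query unit polls each interactive device for the current state of the virtual image; get_state() is a hypothetical per-device callback and the state names are assumptions for illustration.

```python
# Minimal sketch (assumption): the query unit polling each interactive device
# instead of reading the relation mapping table. get_state() is a hypothetical
# per-device callback; the state names are illustrative only.
def find_active_device(device_ids, get_state):
    """Ask each interactive device for its virtual-image state and return the
    first one reporting the activated-and-displayed state, or None."""
    for device_id in device_ids:
        try:
            if get_state(device_id) == "active":
                return device_id
        except TimeoutError:
            # A device that cannot be reached is skipped in this sketch; the
            # queried state may therefore be stale, which is why a later step
            # of the embodiment re-checks the state on the device itself.
            continue
    return None


states = {"digital_tv_614": "active", "pad_613": "hidden"}
print(find_active_device(states, lambda device_id: states[device_id]))  # digital_tv_614
```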
Fig. 4 shows another embodiment of the human-computer interaction apparatus provided by the present invention. As shown in Fig. 4, the human-computer interaction apparatus 400 provided in this embodiment includes a processor 401, a memory 402, a communication interface 403, and a bus 404. The processor 401, the memory 402, and the communication interface 403 are connected through the bus 404 and communicate with one another through it. The bus 404 may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus 404 may be divided into an address bus, a data bus, a control bus, and so on. For ease of representation, only one thick line is used in Fig. 4, but this does not mean that there is only one bus or only one type of bus. Specifically:
The memory 402 is configured to store executable program code, where the program code includes computer operation instructions. The memory 402 may include a high-speed RAM memory and may also include a non-volatile memory, for example at least one magnetic disk memory.
The processor 401 reads the executable program code stored in the memory 402 and runs the program corresponding to the executable program code, so as to:
generate a first-interactive-device activation instruction for a virtual image and send it to the first interactive device through the communication interface 403, so that the first interactive device activates the virtual image and displays it on the first interactive device;
when the communication interface 403 receives a second-interactive-device calling instruction from a user, generate a first-interactive-device hiding instruction for the virtual image and send it to the first interactive device through the communication interface 403, so that the first interactive device hides the virtual image displayed on the first interactive device; and
when the communication interface 403 receives the user's second-interactive-device calling instruction, generate a second-interactive-device activation instruction for the virtual image and send it to the second interactive device through the communication interface 403, so that the second interactive device activates the virtual image and displays it on the second interactive device; where the calling instruction instructs that the virtual image be activated on the second interactive device.
The processor 401 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present invention.
It should be noted that, in addition to the above functions, the processor 401 may also be configured to perform the other procedures in the foregoing method embodiments, which are not repeated here.
It should also be noted that, for the division of the functional units within the processor 401, reference may be made to the foregoing embodiments of the human-computer interaction apparatus, and details are not repeated here.
It can be understood that, optionally, the human-computer interaction apparatus provided in this embodiment of the present invention may be implemented in a network-side server and interact with the first interactive device and the second interactive device through wired or wireless connections.
Correspondingly, an embodiment of the present invention further provides a man-machine interaction system. As shown in Fig. 5, the system includes:
at least two interactive devices, the at least two interactive devices including a first interactive device 50 and a second interactive device 51; and
a human-computer interaction apparatus 60, connected to each of the at least two interactive devices and configured to:
send a first-interactive-device activation instruction for a virtual image to the first interactive device 50, so that the first interactive device 50 activates the virtual image and displays it on the first interactive device 50;
receive a second-interactive-device calling instruction from a user, where the calling instruction instructs that the virtual image be activated on the second interactive device 51;
send a first-interactive-device hiding instruction for the virtual image to the first interactive device 50, so that the first interactive device 50 hides the virtual image displayed on the first interactive device 50; and
send a second-interactive-device activation instruction for the virtual image to the second interactive device 51, so that the second interactive device 51 activates the virtual image and displays it on the second interactive device 51.
With the man-machine interaction system of this embodiment of the present invention, after the user's second-interactive-device calling instruction is received, the virtual image originally activated and displayed on the first interactive device 50 is hidden, and the virtual image is activated and displayed on the second interactive device 51. From the user's point of view, when the user calls the virtual image, the virtual image appears to leave the first interactive device 50 and arrive on the second interactive device 51; that is, it behaves like a real living being with a mind of its own, understanding the user's instruction and moving from one interactive device to another according to it. On the one hand, this enriches the modes of man-machine interaction; on the other hand, because the virtual image resembles a genuinely existing living being, the user perceives the interaction as being with something alive, which increases the user's sense of presence and effectively improves the user experience.
The human-computer interaction apparatus 60 in the man-machine interaction system provided in this embodiment of the present invention may be any of the human-computer interaction apparatuses provided in the foregoing embodiments; for details, reference may be made to the foregoing apparatus embodiments, which are not repeated here.
For example, an interactive device in the man-machine interaction system provided in this embodiment of the present invention may be any one of a personal computer (PC), a mobile phone, a tablet computer (for example, a PAD), a television (such as a smart television or an ordinary television), a game console, a network media player, and the like; the present invention does not limit this, and a person skilled in the art may choose freely.
To help a person skilled in the art better understand the technical solutions of the present invention, the man-machine interaction method, apparatus, and system provided by the present invention are described in further detail below through a specific embodiment. It can be understood that the following specific embodiment is intended only to illustrate the present invention and does not limit it.
With reference to Fig. 6 to Fig. 9, and in particular Fig. 6, in this embodiment the man-machine interaction system includes four interactive devices and a human-computer interaction apparatus 60.
The four interactive devices are a mobile phone 611, a smart TV 612, a PAD (tablet computer) 613, and a digital TV 614.
In this embodiment, the human-computer interaction apparatus 60 includes a control device 601 and intelligent apparatuses 602.
Specifically, the control device 601 includes a receiving unit 6011, a query unit 6012, and a deactivation/activation unit 6013 (corresponding to the control module in the foregoing human-computer interaction apparatus embodiments).
Specifically, each intelligent apparatus 602 includes an interaction processing unit 6021, a signal receiving unit 6022, an artificial intelligence unit 6023, and a display unit 6024.
In this embodiment, corresponding to the four interactive devices, the human-computer interaction apparatus 60 includes four intelligent apparatuses 602, each connected to one of the interactive devices. Each intelligent apparatus 602 may be installed directly in its corresponding interactive device or may exist independently of it. The human-computer interaction apparatus 60 includes only one control device 601, which is connected to each of the intelligent apparatuses 602. The control device 601 may be set up independently, existing outside the interactive devices, or it may be placed inside one of the interactive devices together with that device's intelligent apparatus 602. For example, when an interactive device is an ordinary television, a game console, a network media player, or similar equipment, the control device 601 may be arranged in a set-top box connected to the equipment, in a network-side server that provides network services for the equipment, or as a separate stand-alone device.
The interaction processing unit 6021 is configured to receive an interactive instruction issued by the user through the interactive device connected to its intelligent apparatus 602 and to send the interactive instruction to the receiving unit 6011 of the control device 601. After receiving the interactive instruction, the receiving unit 6011 passes it to the query unit 6012, which queries the current state of the virtual image on the interactive devices. Based on the query result of the query unit 6012, the deactivation/activation unit 6013 sends a control instruction to the signal receiving unit 6022 of the relevant intelligent apparatus 602, so that the signal receiving unit 6022, according to the received control instruction, notifies the artificial intelligence unit 6023 to send an activation or hiding instruction to the display unit 6024, and the display unit 6024 forwards the activation or hiding instruction to the interactive device, so that the interactive device activates or hides the virtual image.
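The following minimal sketch traces this signal path for the four-device system of Fig. 6; the unit names follow the figure, while the method names, the "call" instruction string, and the device identifiers are assumptions made only for illustration.

```python
# Minimal sketch (assumption): the signal path of Fig. 6. Unit names follow the
# figure; method names, the "call" string, and device identifiers exist only
# for this sketch.
class DisplayUnit:                                  # display unit 6024
    def __init__(self, device_name):
        self.device_name = device_name

    def apply(self, op):
        # Forwards the activation or hiding instruction to the interactive device.
        print(f"{self.device_name}: {op} virtual image")


class ArtificialIntelligenceUnit:                   # artificial intelligence unit 6023
    def __init__(self, display_unit):
        self.display_unit = display_unit

    def handle(self, op):
        self.display_unit.apply(op)


class SignalReceivingUnit:                          # signal receiving unit 6022
    def __init__(self, ai_unit):
        self.ai_unit = ai_unit

    def on_control_instruction(self, op):
        self.ai_unit.handle(op)


class IntelligentApparatus:                         # intelligent apparatus 602
    def __init__(self, device_name, control_device):
        self.display = DisplayUnit(device_name)
        self.signal = SignalReceivingUnit(ArtificialIntelligenceUnit(self.display))
        self.control_device = control_device

    def on_user_instruction(self, instruction):     # interaction processing unit 6021
        self.control_device.receive(instruction, source=self)


class ControlDevice:                                # control device 601
    def __init__(self):
        self.active_apparatus = None                # query unit 6012: who shows the image

    def receive(self, instruction, source):         # receiving unit 6011
        if instruction == "call":                   # deactivation/activation unit 6013
            if self.active_apparatus is not None and self.active_apparatus is not source:
                self.active_apparatus.signal.on_control_instruction("hide")
            source.signal.on_control_instruction("activate")
            self.active_apparatus = source


control_device = ControlDevice()
digital_tv = IntelligentApparatus("digital TV 614", control_device)
pad = IntelligentApparatus("PAD 613", control_device)
digital_tv.on_user_instruction("call")   # "Elfin, come out" on the digital TV
pad.on_user_instruction("call")          # "Elfin, come over here" on the PAD
```

Run as written, the sketch first activates the image on the digital TV 614 and then, on the call from the PAD 613, hides it on the digital TV and activates it on the PAD, mirroring steps 10 to 12 described below.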
It can be understood that the unit structures of the intelligent apparatus and the control device in the human-computer interaction apparatus described above are only examples; the human-computer interaction apparatus of the embodiments of the present invention is not limited to them, and a person skilled in the art may set different unit structures according to the functions performed by the intelligent apparatus and the control device in the human-computer interaction apparatus.
The man-machine interaction method used by the man-machine interaction system of this embodiment is described in detail below, taking as an example a user who first interacts with the digital TV 614 (the first interactive device) and then interacts with the PAD 613 (the second interactive device). For ease of description, in this example the intelligent apparatus 602 connected to the digital TV 614 is called intelligent apparatus A and the intelligent apparatus 602 connected to the PAD 613 is called intelligent apparatus B, and the method is described only from the perspective of the intelligent apparatuses 602 and the control device 601. As shown in Fig. 7 to Fig. 10, the man-machine interaction method of this embodiment includes the following steps:
10, user and digital TV614 carry out man-machine interaction, send the activation instruction of virtual image to digital TV614.
Concrete, " elfin, the language such as out " that the activation instruction that user sends can be said for user.
11, man-machine interactively device 60 receives the activation instruction that digital TV614 sends, and sends the activation instruction of virtual image to this numeral TV614, so that this numeral TV614 activates this virtual image, this virtual image is presented on digital TV614.
Concrete, receive activation instruction at digital TV614, for example, " elfin, out ", after, this instruction is transmitted to the intelligent apparatus A being connected with digital TV614, this instruction is sent to control device 601 by intelligent apparatus A, control device 601 carries out query processing after receiving instruction, determine that current virtual image un-activation is on any one interactive device, and according to definite result to intelligent apparatus A sending controling instruction, notice intelligent apparatus A activates and shows virtual image on digital TV614, concrete, intelligent apparatus A can control the mode that virtual image is walked to be similar to people, engender on digital TV614, also can be accompanied by the response of action and the language such as wave, for example, say to user that " I have come, what ".
Fig. 7 exemplarily shows the display contents of the digital TV 614 and the PAD 613 after this step, where the virtual image is illustrated as a puppy; the puppy is displayed on the digital TV 614, and no virtual image is displayed on the PAD 613.
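As an informal illustration of steps 10 and 11, the short Python sketch below shows how an activation utterance might be recognized and how the control device could decide what to do when the virtual image is not yet active on any interactive device. The wake phrases, function names and returned fields are assumptions made for this example and are not specified by the embodiment.

```python
# Illustrative sketch of steps 10-11 only; the wake phrases, function names and
# return values are assumptions for the purpose of the example.

ACTIVATION_PHRASES = ("elfin, come out", "elfin, out")

def is_activation_instruction(utterance: str) -> bool:
    """Step 10: decide whether the user's speech is an avatar activation instruction."""
    return utterance.strip().lower() in ACTIVATION_PHRASES

def handle_activation(utterance: str, avatar_location, target="digital TV 614"):
    """Step 11: instruction forwarded by intelligent apparatus A, decided by control device 601."""
    if not is_activation_instruction(utterance):
        return None
    if avatar_location is None:  # the avatar is not active on any interactive device
        return {"device": target, "action": "activate",
                "presentation": "walk in gradually",
                "greeting": "I have come, what is the matter?"}
    return {"device": avatar_location, "action": "already active"}

print(handle_activation("Elfin, come out", avatar_location=None))
```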
12. The user picks up the PAD 613 and sends a calling instruction for the virtual image, so that the virtual image is activated and displayed on the PAD 613.
Specifically, this calling instruction may be "Elfin, come over here", "Come here", or the like.
Specifically, as shown in Fig. 10, this step is carried out jointly by intelligent apparatus B, the control device 601 and intelligent apparatus A, and comprises the following sub-steps:
120. Intelligent apparatus B receives the calling instruction forwarded by the PAD 613.
121. Intelligent apparatus B queries the state of the virtual image on the PAD 613.
If the virtual image is not activated on the PAD 613, step 122 is performed; if it is activated, step 131 is performed.
122. Intelligent apparatus B sends the user's calling instruction, i.e. an activation request for the virtual image, to the control device 601.
123. The control device 601 receives the calling instruction.
124. The control device 601 queries the state of the virtual image on all interactive devices in the human-computer interaction system and determines that the virtual image is in the activated and displayed state on the digital TV 614.
In this step, the control device 601 finds, among the four interactive devices, that the virtual image is activated and displayed on the digital TV 614, so step 125 is performed.
It should be noted that if, in this step, the control device 601 finds that the virtual image is not activated on any of the interactive devices, step 129 is performed.
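The branching described in steps 123 to 125 and 129 can be illustrated with the following Python sketch of the control device's decision logic. The data structure and helper names are assumptions chosen for clarity; the embodiment itself does not prescribe any particular implementation.

```python
# Sketch of steps 123-125 and 129 as seen from control device 601; the data
# structures and helper names are assumptions, not the patented implementation.

def handle_calling_instruction(avatar_states: dict, target: str):
    """avatar_states maps each interactive device name to True if the avatar
    is currently activated and displayed on it (the step 124 query result)."""
    active = [device for device, shown in avatar_states.items() if shown]
    if active and active[0] != target:
        # step 125: tell the apparatus connected to the active device to hide the
        # avatar, then wait for its "hidden" notification before step 129
        return [("hide", active[0]), ("activate_after_hidden", target)]
    # no device shows the avatar (or it is already on the target): go straight to step 129
    return [("activate", target)]

# Example matching Fig. 7/8: the avatar is on the digital TV and the user calls it to the PAD
states = {"digital TV 614": True, "PAD 613": False, "mobile phone": False, "PC": False}
print(handle_calling_instruction(states, target="PAD 613"))
# -> [('hide', 'digital TV 614'), ('activate_after_hidden', 'PAD 613')]
```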
125. According to the query result, the control device 601 sends an instruction to the intelligent apparatus A connected to the digital TV 614, instructing intelligent apparatus A to hide the virtual image on the digital TV 614.
126. Intelligent apparatus A receives the instruction from the control device 601 and queries the current state of the virtual image on the digital TV 614.
If the virtual image is not activated on the digital TV 614, step 128 is performed; if it is activated, step 127 is performed.
It should be noted here that, considering situations such as network interruption or device anomaly, the current state of the virtual image queried by the control device 601 may not match the actual state; therefore, in this embodiment, after intelligent apparatus A receives the instruction from the control device 601, it further queries the state of the virtual image on the digital TV 614.
It can be understood that, considering that prolonged abnormal situations such as a network interruption or a device shutdown may occur, intelligent apparatus A may be unable to query the concrete state of the virtual image on the digital TV 614 in this step; in this case, intelligent apparatus A may regard the virtual image on the digital TV 614 as activated and proceed accordingly.
127. Intelligent apparatus A sends a hiding instruction to the digital TV 614, so that the digital TV 614 hides the virtual image displayed on it.
Specifically, intelligent apparatus A may hide the virtual image by making it fade away gradually on the digital TV 614, as if it were walking away, or by making it disappear completely at once; the hiding may also be accompanied by actions such as waving and a spoken response, for example saying to the user "Goodbye, I am leaving".
128. Intelligent apparatus A notifies the control device 601 that the virtual image has been hidden on the digital TV 614.
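Steps 126 to 128, including the state re-query and the fallback used when the state cannot be queried, might informally be sketched as follows in Python. The query callback and its failure behaviour are assumptions made for the example, not part of the claimed embodiment.

```python
# Sketch of steps 126-128 from intelligent apparatus A's point of view; the
# query function and its failure mode are assumptions used to illustrate the
# fallback described above.

def handle_hide_request(query_local_state):
    """query_local_state() should return True/False for the avatar on digital TV 614,
    or raise an exception when the state cannot be queried (network interruption etc.)."""
    try:
        active = query_local_state()   # step 126: re-query the real state
    except Exception:
        active = True                  # fallback: assume the avatar is active and proceed
    if active:
        hide_avatar()                  # step 127: fade out / walk away, say goodbye
    notify_control_device_hidden()     # step 128: report back to control device 601

def hide_avatar():
    print("digital TV 614: avatar walks away and fades out ('Goodbye, I am leaving')")

def notify_control_device_hidden():
    print("intelligent apparatus A -> control device 601: avatar hidden")

handle_hide_request(lambda: True)
```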
129. After receiving the notification from intelligent apparatus A, the control device 601 sends a virtual image activation instruction to intelligent apparatus B, instructing intelligent apparatus B to activate and display the virtual image on the PAD 613.
130. Intelligent apparatus B receives the instruction from the control device 601 and sends a virtual image activation instruction to the PAD 613, so that the PAD 613 activates and displays the virtual image.
Specifically, intelligent apparatus B may control the virtual image to appear gradually on the PAD 613 in a manner similar to a person walking in, possibly accompanied by actions such as waving and a spoken response, for example saying to the user "I have come, what is the matter?".
Fig. 8 exemplarily shows the display contents of the digital TV 614 and the PAD 613 after this step, where the virtual image is illustrated as a puppy; the puppy appears on the PAD 613, and the virtual image originally displayed on the digital TV 614 disappears.
It should be noted that, in a concrete implementation of this embodiment, the user can be given the sensory impression that the virtual image "passes through" from the digital TV 614 to the PAD 613, leaving the first interactive device and arriving on the second interactive device, which enhances the user's sense of presence.
131. Intelligent apparatus B controls the virtual image on the PAD 613 to respond to the user's call, for example with a spoken response, an action response, or the like.
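Finally, intelligent apparatus B's part of the flow (steps 120 to 122 and 130 to 131) could be sketched as below. Again, the phrase list, function names and response text are illustrative assumptions rather than the claimed implementation.

```python
# Sketch of intelligent apparatus B's part of the flow (steps 120-122, 130-131);
# the phrase list and response text are assumptions for illustration only.

CALLING_PHRASES = ("elfin, come over here", "come here")

def on_user_utterance(utterance: str, avatar_on_pad: bool, send_to_control):
    """Steps 120-122: forward the calling instruction only if the avatar is not
    already active on PAD 613; otherwise respond locally (step 131)."""
    if utterance.strip().lower() not in CALLING_PHRASES:
        return
    if avatar_on_pad:
        respond_to_user()                              # step 131
    else:
        send_to_control("activate avatar on PAD 613")  # step 122

def on_control_instruction():
    """Step 130: control device 601 ordered activation; display gradually, then respond."""
    print("PAD 613: avatar walks in gradually ('I have come, what is the matter?')")
    respond_to_user()

def respond_to_user():
    print("PAD 613: avatar waves and answers the user's call")

on_user_utterance("Elfin, come over here", avatar_on_pad=False,
                  send_to_control=lambda msg: print("to control device 601:", msg))
on_control_instruction()
```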
The embodiments in this specification are described in a progressive manner; for identical or similar parts the embodiments may be referred to one another, and each embodiment focuses on its differences from the other embodiments. In particular, the device embodiments are described relatively briefly because they are substantially similar to the method embodiments; for relevant details, refer to the description of the method embodiments.
It should be noted that the device embodiments described above are merely schematic. The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network elements. Some or all of the devices may be selected according to actual needs to achieve the objectives of the solutions of the embodiments. In addition, in the accompanying drawings of the device embodiments provided by the present invention, the connection relationships between devices indicate that they have communication connections, which may specifically be implemented as one or more communication buses or signal lines. Those of ordinary skill in the art can understand and implement the embodiments without creative effort.
Through the above description of the embodiments, those skilled in the art can clearly understand that the present invention may be implemented by software plus necessary general-purpose hardware, and certainly may also be implemented by dedicated hardware including application-specific integrated circuits, dedicated CPUs, dedicated memories, dedicated components and the like. In general, any function completed by a computer program can easily be realized with corresponding hardware, and the specific hardware structures used to realize the same function can be diverse, such as analog circuits, digital circuits or dedicated circuits. However, in most cases, a software program implementation is the preferred embodiment of the present invention. Based on such an understanding, the part of the technical solutions of the present invention that in essence contributes to the prior art can be embodied in the form of a software product. The computer software product is stored in a readable storage medium, such as a floppy disk, USB flash drive, portable hard disk, read-only memory (ROM, Read-Only Memory), random access memory (RAM, Random Access Memory), magnetic disk or optical disc of a computer, and comprises a number of instructions that enable a computer device (which may be a personal computer, a server, a network device, or the like) to perform the methods described in the embodiments of the present invention.
The above are only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any change or replacement that can readily be conceived by a person skilled in the art within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (12)

1. A human-computer interaction method, characterized by comprising:
sending a first interactive device activation instruction for a virtual image to a first interactive device, so that the first interactive device activates the virtual image and the virtual image is displayed on the first interactive device;
receiving a second interactive device calling instruction of a user, the calling instruction being used to instruct that the virtual image be activated on a second interactive device;
sending a first interactive device hiding instruction for the virtual image to the first interactive device, so that the first interactive device hides the virtual image displayed on the first interactive device; and
sending a second interactive device activation instruction for the virtual image to the second interactive device, so that the second interactive device activates the virtual image and the virtual image is displayed on the second interactive device.
2. The method according to claim 1, characterized in that:
the second interactive device calling instruction of the user comprises a voice instruction, a text instruction or a motion-sensing instruction.
3. The method according to claim 1 or 2, characterized in that,
after the receiving of the second interactive device calling instruction of the user and before the sending of the first interactive device hiding instruction for the virtual image to the first interactive device, the method further comprises:
querying the current state of the virtual image to determine that the virtual image is in the activated and displayed state on the first interactive device.
4. The method according to any one of claims 1 to 3, characterized in that the sending of the first interactive device hiding instruction for the virtual image to the first interactive device, so that the first interactive device hides the virtual image displayed on the first interactive device, comprises:
sending the first interactive device hiding instruction for the virtual image to the first interactive device, so that the first interactive device hides the virtual image by making it fade away gradually on the first interactive device or disappear completely at once.
5. The method according to any one of claims 1 to 4, characterized in that the sending of the second interactive device activation instruction for the virtual image to the second interactive device, so that the second interactive device activates the virtual image and the virtual image is displayed on the second interactive device, comprises:
sending the second interactive device activation instruction for the virtual image to the second interactive device, so that the second interactive device activates the virtual image and the virtual image is displayed gradually or directly in full on the second interactive device.
6. A human-computer interaction device, characterized by comprising:
a control module, configured to send a first interactive device activation instruction for a virtual image to a first interactive device, so that the first interactive device activates the virtual image and the virtual image is displayed on the first interactive device;
a receiving element, configured to receive a second interactive device calling instruction of a user, the calling instruction being used to instruct that the virtual image be activated on a second interactive device;
the control module being further configured to send a first interactive device hiding instruction for the virtual image to the first interactive device, so as to hide the virtual image displayed on the first interactive device; and
the control module being further configured to send a second interactive device activation instruction for the virtual image to the second interactive device, so that the second interactive device activates the virtual image and the virtual image is displayed on the second interactive device.
7. The device according to claim 6, characterized in that:
the second interactive device calling instruction of the user received by the receiving element comprises a voice instruction, a text instruction or a motion-sensing instruction.
8. The device according to claim 6 or 7, characterized in that the device further comprises:
a query unit, configured to, after the receiving element receives the second interactive device calling instruction of the user and before the control module sends the first interactive device hiding instruction for the virtual image to the first interactive device, query the current state of the virtual image to determine that the virtual image is in the activated and displayed state on the first interactive device.
9. The device according to any one of claims 6 to 8, characterized in that the control module is configured to:
send the first interactive device hiding instruction for the virtual image to the first interactive device, so that the first interactive device hides the virtual image by making it fade away gradually on the first interactive device or disappear completely at once.
10. The device according to any one of claims 6 to 9, characterized in that the control module is configured to:
send the second interactive device activation instruction for the virtual image to the second interactive device, so that the second interactive device activates the virtual image and the virtual image is displayed gradually or directly in full on the second interactive device.
11. A human-computer interaction system, characterized by comprising:
at least two interactive devices, the at least two interactive devices comprising a first interactive device and a second interactive device; and
a human-computer interaction device, connected to each of the at least two interactive devices and configured to:
send a first interactive device activation instruction for a virtual image to the first interactive device, so that the first interactive device activates the virtual image and the virtual image is displayed on the first interactive device;
receive a second interactive device calling instruction of a user, the calling instruction being used to instruct that the virtual image be activated on the second interactive device;
send a first interactive device hiding instruction for the virtual image to the first interactive device, so that the first interactive device hides the virtual image displayed on the first interactive device; and
send a second interactive device activation instruction for the virtual image to the second interactive device, so that the second interactive device activates the virtual image and the virtual image is displayed on the second interactive device.
12. The human-computer interaction system according to claim 11, characterized in that the interactive devices comprise a mobile phone, a tablet computer, a PC, a TV, a game console or a network player.
CN201310014368.9A 2013-01-15 2013-01-15 The method, apparatus and system of a kind of human-computer interaction Active CN103927091B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201310014368.9A CN103927091B (en) 2013-01-15 2013-01-15 The method, apparatus and system of a kind of human-computer interaction
PCT/CN2014/070596 WO2014111008A1 (en) 2013-01-15 2014-01-14 Human-computer interaction method, device and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310014368.9A CN103927091B (en) 2013-01-15 2013-01-15 The method, apparatus and system of a kind of human-computer interaction

Publications (2)

Publication Number Publication Date
CN103927091A true CN103927091A (en) 2014-07-16
CN103927091B CN103927091B (en) 2018-04-27

Family

ID=51145332

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310014368.9A Active CN103927091B (en) 2013-01-15 2013-01-15 The method, apparatus and system of a kind of human-computer interaction

Country Status (2)

Country Link
CN (1) CN103927091B (en)
WO (1) WO2014111008A1 (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105427865A (en) * 2015-11-04 2016-03-23 百度在线网络技术(北京)有限公司 Voice control system and method of intelligent robot based on artificial intelligence
CN105469787A (en) * 2015-12-02 2016-04-06 百度在线网络技术(北京)有限公司 Information displaying method and device
CN105898062A (en) * 2016-04-26 2016-08-24 乐视控股(北京)有限公司 Method and apparatus for incoming call management based on virtual equipment
CN108375958A (en) * 2018-01-15 2018-08-07 珠海格力电器股份有限公司 Electrical appliance system
CN108376067A (en) * 2018-03-08 2018-08-07 腾讯科技(深圳)有限公司 A kind of application operating method and its equipment, storage medium, terminal
CN108600071A (en) * 2018-04-16 2018-09-28 青岛海信移动通信技术股份有限公司 A kind of sharing method and mobile terminal of virtual portrait
CN108847239A (en) * 2018-08-31 2018-11-20 上海擎感智能科技有限公司 Interactive voice/processing method, system, storage medium, engine end and server-side
CN110430553A (en) * 2019-07-31 2019-11-08 广州小鹏汽车科技有限公司 Interactive approach, device, storage medium and controlling terminal between vehicle
CN110822647A (en) * 2019-11-25 2020-02-21 广东美的制冷设备有限公司 Control method of air conditioner, air conditioner and storage medium
WO2021189967A1 (en) * 2020-03-25 2021-09-30 北京百度网讯科技有限公司 Human-machine interaction control method, apparatus and system, and electronic device
CN114037467A (en) * 2015-10-20 2022-02-11 索尼公司 Information processing system, information processing method, and computer-readable storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101651747A (en) * 2009-09-17 2010-02-17 杭州聚贝软件科技有限公司 Method and system for carrying out anime interaction among phone terminals with virtual image interfaces
CN101931621A (en) * 2010-06-07 2010-12-29 上海那里网络科技有限公司 Device and method for carrying out emotional communication in virtue of fictional character
CN102176197A (en) * 2011-03-23 2011-09-07 上海那里网络科技有限公司 Method for performing real-time interaction by using virtual avatar and real-time image

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114037467A (en) * 2015-10-20 2022-02-11 索尼公司 Information processing system, information processing method, and computer-readable storage medium
CN105427865A (en) * 2015-11-04 2016-03-23 百度在线网络技术(北京)有限公司 Voice control system and method of intelligent robot based on artificial intelligence
CN105469787A (en) * 2015-12-02 2016-04-06 百度在线网络技术(北京)有限公司 Information displaying method and device
CN105898062A (en) * 2016-04-26 2016-08-24 乐视控股(北京)有限公司 Method and apparatus for incoming call management based on virtual equipment
CN108375958B (en) * 2018-01-15 2020-06-19 珠海格力电器股份有限公司 Electrical appliance system
CN108375958A (en) * 2018-01-15 2018-08-07 珠海格力电器股份有限公司 Electrical appliance system
CN108376067A (en) * 2018-03-08 2018-08-07 腾讯科技(深圳)有限公司 A kind of application operating method and its equipment, storage medium, terminal
CN108600071A (en) * 2018-04-16 2018-09-28 青岛海信移动通信技术股份有限公司 A kind of sharing method and mobile terminal of virtual portrait
CN108847239A (en) * 2018-08-31 2018-11-20 上海擎感智能科技有限公司 Interactive voice/processing method, system, storage medium, engine end and server-side
CN110430553A (en) * 2019-07-31 2019-11-08 广州小鹏汽车科技有限公司 Interactive approach, device, storage medium and controlling terminal between vehicle
CN110430553B (en) * 2019-07-31 2022-08-16 广州小鹏汽车科技有限公司 Interaction method and device between vehicles, storage medium and control terminal
CN110822647A (en) * 2019-11-25 2020-02-21 广东美的制冷设备有限公司 Control method of air conditioner, air conditioner and storage medium
WO2021189967A1 (en) * 2020-03-25 2021-09-30 北京百度网讯科技有限公司 Human-machine interaction control method, apparatus and system, and electronic device

Also Published As

Publication number Publication date
WO2014111008A1 (en) 2014-07-24
CN103927091B (en) 2018-04-27

Similar Documents

Publication Publication Date Title
CN103927091A (en) Man-machine interaction method, device and system
US20210099674A1 (en) Method and apparatus for generating video file, and storage medium
CN106254311A (en) Live broadcasting method and device, live data streams methods of exhibiting and device
CN111596985A (en) Interface display method, device, terminal and medium in multimedia conference scene
CN107277242A (en) A kind of processing method of displaying information on screen, display methods and display system
CN104980563A (en) Operation demonstration method and operation demonstration device
CN109947388B (en) Page playing and reading control method and device, electronic equipment and storage medium
CN107085495A (en) A kind of information displaying method, electronic equipment and storage medium
CN111464430B (en) Dynamic expression display method, dynamic expression creation method and device
TWM425348U (en) System providing interactive management service
CN106028172A (en) Audio/video processing method and device
CN108696489A (en) The playing method and device of media information
CN109391848A (en) A kind of interactive advertisement system
CN107179952A (en) Collaboration Input Method Editor (IME) activity between virtual application client and server
CN113824976A (en) Method and device for displaying approach show in live broadcast room and computer equipment
CN103596051A (en) A television apparatus and a virtual emcee display method thereof
CN114979682A (en) Multi-anchor virtual live broadcasting method and device
CN112911052A (en) Information sharing method and device
CN114743422A (en) Answering method and device and electronic equipment
CN106250007B (en) A kind of system and method realizing branching selection and playing
CN113791855A (en) Interactive information display method and device, electronic equipment and storage medium
CN107770253A (en) Long-range control method and system
CN107016281A (en) Permission setting method and device of application program and electronic equipment
JP2023099309A (en) Method, computer device, and computer program for interpreting voice of video into sign language through avatar
CN114827643A (en) Live broadcast room approach method and device based on cover wiping drawing and computer equipment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 518129 Building 2, B District, Bantian HUAWEI base, Longgang District, Shenzhen, Guangdong.

Patentee after: Huawei terminal (Shenzhen) Co.,Ltd.

Address before: 518129 Building 2, B District, Bantian HUAWEI base, Longgang District, Shenzhen, Guangdong.

Patentee before: HUAWEI DEVICE Co.,Ltd.

CP01 Change in the name or title of a patent holder
TR01 Transfer of patent right

Effective date of registration: 20181220

Address after: 523808 Southern Factory Building (Phase I) Project B2 Production Plant-5, New Town Avenue, Songshan Lake High-tech Industrial Development Zone, Dongguan City, Guangdong Province

Patentee after: HUAWEI DEVICE Co.,Ltd.

Address before: 518129 Building 2, B District, Bantian HUAWEI base, Longgang District, Shenzhen, Guangdong.

Patentee before: Huawei terminal (Shenzhen) Co.,Ltd.

TR01 Transfer of patent right