WO2016206643A1 - Control method and device for robot interaction behavior, and robot - Google Patents

Control method and device for robot interaction behavior, and robot (机器人交互行为的控制方法、装置及机器人)

Info

Publication number
WO2016206643A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
control
information
behavior
sensing
Prior art date
Application number
PCT/CN2016/087258
Other languages
English (en)
French (fr)
Inventor
聂华闻
Original Assignee
北京贝虎机器人技术有限公司
Priority date
Filing date
Publication date
Priority claimed from CN201510363346.2A (published as CN106325113B)
Priority claimed from CN201510363348.1A (published as CN106325065A)
Priority claimed from CN201510364661.7A (published as CN106325228B)
Application filed by 北京贝虎机器人技术有限公司
Publication of WO2016206643A1

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00 - Programme-control systems
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 13/00 - Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F 13/10 - Program control for peripheral devices

Definitions

  • The invention relates to the field of robot technology, and in particular to a method and a device for controlling the interaction behavior of a robot, and to a robot.
  • Most of today's robots are industrial robots, and most industrial robots have no perception capability.
  • The operating procedures of these robots are predefined, and fixed tasks are carried out strictly according to the predetermined procedures; such robots lack adaptability and produce consistent results only when the objects they operate on are the same.
  • Embodiments of the invention provide a method, a device and a robot for controlling the interaction behavior of a robot, so as to at least effectively improve the robot's adaptive interaction behavior and degree of intelligence.
  • A method for controlling the interaction behavior of a robot includes: acquiring information perceived by the robot; generating sensing data from the perceived information according to at least one predefined sensing unit, wherein the sensing data includes an identifier and a value of the sensing unit; searching for a control entry that matches the generated sensing data, wherein each control entry includes a trigger condition and a behavior triggered by the trigger condition, and each trigger condition is composed of at least one sensing unit; and, if a control entry matching the generated sensing data is found, causing the robot to perform the behavior in the found control entry.
  • A method for controlling the interaction behavior of a robot includes: sensing at least one piece of information; generating sensing data from the perceived information according to at least one predefined sensing unit, wherein the sensing data includes an identifier and a value of the sensing unit; sending the generated sensing data; receiving a control entry that matches the sensing data, wherein the control entry includes a trigger condition and a behavior triggered by the trigger condition, and the trigger condition is composed of at least one sensing unit; and performing the interaction behavior in the control entry.
  • A method of controlling robot interaction behavior includes: providing a control entry document including a plurality of control entries, wherein each control entry includes a trigger condition and a behavior triggered by the trigger condition, and each trigger condition is composed of at least one predefined sensing unit; and matching the sensing data of the robot against the control entries to determine whether there is a control entry matching the robot's sensing data, wherein the robot's sensing data is generated from the information perceived by the robot according to at least one predefined sensing unit.
  • A control device for robot interaction behavior includes: an acquisition module, configured to acquire information perceived by the robot; a generation module, configured to generate sensing data from the perceived information according to at least one predefined sensing unit, wherein the sensing data includes an identifier and a value of the sensing unit; a searching module, configured to search for a control entry that matches the generated sensing data, wherein each control entry includes a trigger condition and a behavior triggered by the trigger condition, and each trigger condition is composed of at least one sensing unit; and an execution module, configured to cause the robot to perform the behavior in the found control entry when a control entry matching the generated sensing data is found.
  • A control device for robot interaction behavior includes: a sensing module, configured to perceive at least one piece of information; a generation module, configured to generate sensing data from the perceived information according to at least one predefined sensing unit, wherein the sensing data includes an identifier and a value of the sensing unit; a sending module, configured to send the generated sensing data; a receiving module, configured to receive information of a control entry that matches the sensing data, wherein the control entry includes a trigger condition and a behavior triggered by the trigger condition, and the trigger condition is composed of at least one sensing unit; and an execution module, configured to perform the interaction behavior in the control entry according to the information of the control entry.
  • A control device for robot interaction behavior includes: a receiving module, configured to receive the sensing data of the robot, wherein the sensing data is generated from the information perceived by the robot according to at least one predefined sensing unit; a searching module, configured to search for a control entry that matches the robot's sensing data, wherein each control entry includes a trigger condition and a behavior triggered by the trigger condition, and each trigger condition is composed of at least one sensing unit; and an execution module, configured to cause the robot to perform the behavior triggered by the trigger condition in the found control entry when a control entry matching the robot's sensing data is found.
  • A control device for robot interaction behavior includes: a providing module, configured to provide a control entry document including a plurality of control entries, wherein each control entry includes a trigger condition and a behavior triggered by the trigger condition, and each trigger condition is composed of at least one predefined sensing unit; and a matching module, configured to match the sensing data of the robot against the control entries to determine whether there is a control entry matching the robot's sensing data, wherein the sensing data is generated from the information perceived by the robot according to at least one predefined sensing unit.
  • The above methods can be performed by a robot having one or more processing units, a memory, and one or more modules, programs, or sets of instructions stored in the memory to perform the methods.
  • Instructions for performing the above methods may be included in a computer program product configured to be executed by one or more processors.
  • In the embodiments of the invention, a control method, a control device and a robot are proposed. A sensing unit is defined in advance as the minimum unit for controlling the interaction behavior of the robot, and trigger conditions, together with the interaction behaviors triggered by those conditions, are set in terms of sensing units to obtain control entries for controlling the robot. This unifies the input and output standards of robot control, so that non-technical personnel can edit the behavior of the robot, which facilitates control of the robot's interaction behavior and effectively improves the robot's adaptive interaction behavior and intelligence.
  • FIG. 1 illustrates a schematic structural view of a robot in accordance with some embodiments of the present invention;
  • FIG. 2 illustrates a flow chart of a method of generating control data for a robot in accordance with some embodiments of the present invention;
  • FIG. 3 illustrates a first flow chart of a method of controlling robot interaction behavior in accordance with some embodiments of the present invention;
  • FIG. 4 illustrates a second flow chart of a method of controlling robot interaction behavior in accordance with some embodiments of the present invention;
  • FIG. 5 illustrates a third flow chart of a method of controlling robot interaction behavior in accordance with some embodiments of the present invention;
  • FIG. 6 illustrates a block diagram of a device for generating control data for a robot in accordance with some embodiments of the present invention;
  • FIG. 7 illustrates a first block diagram of a control device for robot interaction behavior in accordance with some embodiments of the present invention;
  • FIG. 8 illustrates a second block diagram of a control device for robot interaction behavior in accordance with some embodiments of the present invention;
  • FIG. 9 illustrates a third block diagram of a control device for robot interaction behavior in accordance with some embodiments of the present invention;
  • FIG. 10 illustrates a fourth block diagram of a control device for robot interaction behavior in accordance with some embodiments of the present invention.
  • The robot 100 includes a memory 102, a memory controller 104, one or more processing units (CPUs) 106, a peripheral interface 108, a radio frequency (RF) circuit 114, an audio circuit 116, a speaker 118, a microphone 120, a sensing subsystem 122, an attitude sensor 132, a camera 134, a tactile sensor 136, one or more other sensing devices 138, and an external interface 140. These components communicate over one or more communication buses or signal lines 110.
  • The robot 100 shown in FIG. 1 is just one example; the robot may have more or fewer components than illustrated, or a different configuration of components.
  • The robot 100 can include one or more CPUs 106, memory 102, one or more sensing devices (e.g., the sensing devices described above), and one or more modules, programs, or sets of instructions stored in memory 102 to perform a robot interaction behavior control method.
  • the various components shown in FIG. 1 can be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
  • The robot 100 may be an electromechanical device having a biological shape (e.g., a humanoid), or may be a smart device that does not have a biological appearance but has human characteristics (e.g., language communication); such a smart device may include mechanical devices as well as virtual devices implemented in software (e.g., virtual chat bots).
  • the virtual chat bot can perceive information through the device in which it is located, and the device in which it is located includes electronic devices such as handheld electronic devices, personal computers, and the like.
  • Memory 102 can include high speed random access memory and can also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid state storage devices.
  • Memory 102 may also include memory remote from the one or more CPUs 106, such as network attached storage accessed via the RF circuitry 114 or the external interface 140 and a communication network (not shown), where the communication network can be the Internet, one or more intranets, a local area network (LAN), a wide area network (WAN), a storage area network (SAN), etc., or a suitable combination thereof.
  • Memory controller 104 can control access to memory 102 by other components of robot 100, such as CPU 106 and peripheral interface 108.
  • Peripheral interface 108 couples the input and output peripherals of the device to CPU 106 and memory 102.
  • the one or more processors 106 described above execute various software programs and/or sets of instructions stored in the memory 102 to perform various functions of the robot 100 and process the data.
  • peripheral interface 108, CPU 106, and memory controller 104 can be implemented on a single chip, such as chip 112. In some other embodiments, they may be implemented on multiple discrete chips.
  • the RF circuit 114 receives and transmits electromagnetic waves.
  • the RF circuit 114 converts an electrical signal into an electromagnetic wave, or converts the electromagnetic wave into an electrical signal, and communicates with the communication network and other communication devices via the electromagnetic wave.
  • The RF circuit 114 may include well-known circuitry for performing these functions including, but not limited to, an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so on.
  • The RF circuit 114 can communicate with a network and other devices via wireless communication, such as the Internet (also referred to as the World Wide Web, WWW), an intranet, and/or a wireless network such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN).
  • The above wireless communication may use any of a variety of communication standards, protocols, and technologies including, but not limited to, Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (W-CDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Wi-MAX, protocols for email, instant messaging, and/or short message service (SMS), or any other suitable communication protocol, including communication protocols that had not been developed as of the filing date of this document.
  • Audio circuitry 116, speaker 118, and microphone 120 provide an audio interface between the user and the robot 100.
  • Audio circuitry 116 receives audio data from peripheral interface 108, converts the audio data into electrical signals, and transmits the electrical signals to speaker 118.
  • the speaker transforms the electrical signal into a human audible sound wave.
  • Audio circuit 116 also receives electrical signals that are converted from sound waves by microphone 120.
  • the audio circuit 116 converts the electrical signals into audio data and transmits the audio data to the peripheral interface 108 for processing. Audio data may be retrieved from memory 102 and/or RF circuitry 114 by peripheral interface 108 and/or transmitted to memory 102 and/or RF circuitry 114.
  • A plurality of microphones 120 can be included and distributed at different locations, and the direction from which a sound is emitted can be determined according to a predetermined strategy based on the microphones 120 at the different locations. It should be understood that the direction of the sound can also be identified by certain sensors.
  • audio circuit 116 also includes a headset jack (not shown).
  • The headset jack provides an interface between the audio circuit 116 and a removable audio input/output peripheral; for example, the peripheral can be an output-only headset (single-ear or binaural) or a headset with both output and input (a microphone).
  • a speech recognition device (not shown) is also included for implementing speech-to-text recognition and synthesizing speech based on text.
  • the speech recognition device can be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
  • the audio circuit 116 receives the audio data from the peripheral interface 108, converts the audio data into electrical signals, and the voice recognition device can identify the audio data and convert the audio data into text data.
  • the speech recognition apparatus can also synthesize the audio data based on the text data, convert the audio data into an electrical signal through the audio circuit 116, and transmit the electrical signal to the speaker 118.
  • Perception subsystem 122 provides an interface between the perceptual peripherals of robot 100 and peripheral interface 108, such as the attitude sensor 132, camera 134, tactile sensor 136, and other sensing devices 138.
  • Perception subsystem 122 includes an attitude controller 124, a visual controller 126, a haptic controller 128, and one or more other perceptual device controllers 130.
  • the one or more other sensing device controllers 130 receive/transmit electrical signals from/to other sensing devices 138.
  • the other sensing devices 138 may include temperature sensors, distance sensors, proximity sensors, air pressure sensors, air quality detecting devices, and the like.
  • the robot 100 can have a plurality of attitude controllers 124 to control different limbs of the robot 100, which can include, but are not limited to, arms, feet, and heads.
  • The robot 100 can include a plurality of attitude sensors 132.
  • the robot 100 may not have the attitude controller 124 and the attitude sensor 132.
  • the robot 100 may be in a fixed configuration and does not have mechanical moving parts such as an arm or a foot.
  • The posture components of the robot 100 need not be mechanical arms, feet, and a head; a deformable configuration may also be employed.
  • the robot 100 also includes a power system 142 for powering various components.
  • the power system 142 can include a power management system, one or more power sources (eg, batteries, alternating current (AC)), charging systems, power failure detection circuits, power converters or inverters, power status indicators (eg, light emitting diodes (eg LED)), as well as any other components associated with power generation, management, and distribution in portable devices.
  • the charging system can be a wired charging system or a wireless charging system.
  • the software components include an operating system 144, a communication module (or set of instructions) 146, an interactive behavior control device (or set of instructions) 148, and one or more other devices (or sets of instructions) 150.
  • Operating system 144 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and for facilitating communication between various hardware and software components.
  • Communication module 146 facilitates communication with other devices via one or more external interfaces 140 (e.g., Universal Serial Bus (USB), FireWire, etc.), and also includes various software components for processing data received by the RF circuitry 114 and/or the external interface 140.
  • The robot 100 may also include a display device (not shown), which may include, but is not limited to, a touch sensitive display, a touch pad, and the like.
  • One or more of the other devices 150 described above can include a graphics module (not shown) that includes various known software components for presenting and displaying graphics on the display device.
  • Graphics include any object that can be displayed to a user, including but not limited to text, web pages, icons (e.g., user interface objects including soft keys), digital images, video, animation, and the like. A touch sensitive display or touch pad can also be used for user input.
  • The robot 100 senses its external environment and its own condition through, for example, the attitude sensor 132, the camera 134, the tactile sensor 136, other sensing devices 138, the microphone 120, and so on; the information perceived by the robot 100 is processed by the sensing peripheral controllers and by one or more CPUs 106.
  • The robot 100's perception of the environment includes, but is not limited to, information detected by its own sensors (e.g., the attitude sensor 132, the camera 134, the tactile sensor 136, and other sensing devices 138), and may also include information detected by external devices (not shown) coupled to the robot 100.
  • A communication connection is established between the robot 100 and the external device, and the robot 100 and the external device transmit data through the communication connection.
  • External devices include various types of sensors, smart home devices, and the like.
  • the information perceived by the robot 100 includes, but is not limited to, sound, images, environmental parameters, haptic information, time, space, and the like.
  • Environmental parameters include, but are not limited to, temperature, humidity, gas concentration, etc.
  • Tactile information includes, but is not limited to, contact with the robot 100, such as contact with a touch sensitive display or contact with or proximity to a tactile sensor; tactile sensors can be placed at the head, arms, and other parts of the robot (not shown). It should be understood that other forms of information may also be included.
  • the sound may include voice and other sounds, the sound may be the sound collected by the microphone 120, or may be the sound stored in the memory 102; the voice may include, but is not limited to, human speaking or singing.
  • the image may be a single picture or video, including but not limited to captured by camera 134, or may be read from memory 102 or transmitted to the robot 100 over a network.
  • the information perceived by the robot 100 includes not only information external to the robot 100 but also information of the robot 100 itself, including but not limited to information such as the amount of power, temperature, and the like of the robot 100.
  • For example, the robot 100 can be moved to the charging position for automatic charging when it perceives that the power of the robot 100 is less than 20%.
  • the robot 100 is not limited to perceiving information in the manner described above, but may also perceive information in other forms, including perceptual techniques that have not been developed at the filing date of this document.
  • the sensing device of the robot 100 is not limited to the sensing device provided on the robot 100, and may also include a sensing device associated with the robot 100 and not provided on the robot 100, such as various sensors for sensing information.
  • the robot 100 may be associated with a temperature sensor, a humidity sensor (not shown), or the like disposed within a certain area through which the corresponding information is perceived.
  • the robot 100 can communicate with these sensors through various types of communication protocols to obtain information from these sensors.
  • The information perceived by the robot 100 may be set according to preset conditions, which may include, but are not limited to, which information the robot 100 perceives, when to perceive it, and the like. For example, during a voice conversation with the user, the robot may be set to sense the user's voice, track the user's face, and recognize the user's gestures without perceiving other information, or to reduce the weight of other information when generating sensing units, or to defer processing of the other perceived information; or, during a certain period of time (for example, when the user goes out and the robot 100 is indoors alone), to sense environmental parameters, images and video data, with the environmental parameters used to determine whether it is necessary to interact with an air conditioner or similar device.
  • The conditions for setting the perceived information are not limited thereto; the above conditions are merely examples, and the information that the robot 100 needs to perceive may be set depending on the situation.
  • At least one sensing unit is defined, which is the smallest unit (or referred to as a minimum input unit) that controls the robot 100, and the robot 100 makes interactive behavior based on at least the sensing unit.
  • The interaction behavior of the robot 100 may be controlled by one or more sensing units. For example, when the values of one or more sensing units change, the robot 100 may react in response to those changes; or, when the value of one or more sensing units falls within a certain range or equals a certain value, the robot 100 may perform an interaction behavior in response to the sensing units. It should be understood that control of the robot 100's interaction behavior by sensing units is not limited to the above cases, which are merely illustrative.
  • the sensing unit can include multiple levels, and the higher level sensing unit can include one or more sensing units of the lower level.
  • the higher level perceptual unit may include one or more perceptual units of the lower level adjacent thereto, and the sensing unit of the same higher level may include different lower level perceptual units.
  • the low-level sensing units that synthesize the high-level sensing units include, but are not limited to, low-level sensing units of the same time or time period, and historical low-level sensing units of the time or time period.
  • the higher level perceptual units are determined by lower level sensing units at different times.
  • the value of the sensing unit may be one or a set of values, or may be a range of one or more values.
  • the value of the sensing unit may be determined according to the information perceived by the robot 100.
  • One sensing unit may be determined by one or more pieces of information that is perceived, and the same sensing unit may be determined by different data that is perceived.
  • the perceived information may include real-time perceived information, or historically perceived information (such as information perceived at a certain time or some time in the past). In some cases, the value of the sensing unit is determined by the information perceived in real time and the information perceived by the history.
  • The hearing unit describes the speech that is heard: when the robot 100 receives a sound, it performs speech recognition on the received speech to identify the text of the speech, and the value of the hearing unit may be the text of the speech that is heard. In some implementations, the hearing unit may also include the direction of the sound, where the direction is referenced to the robot's face and includes left, right, front, back, and the like.
  • For the vision unit, the robot 100 can analyze images or video to determine whether someone is present or whether there is movement, and the value of the vision unit can include whether someone is present and whether there is movement. For example, "whether someone is at home" can take the value "0" or "1", where "0" means no one is at home and "1" means someone is at home.
  • The time unit describes time information; its value may be a time point or a time range, for example 14:00 on February 1 of each year.
  • The environment unit describes environmental conditions, including temperature, humidity, noise, PM2.5, the concentration (ppm) of gases in the air, the carbon monoxide content of the air, the oxygen content of the air, and so on; its value may be the value or range of each parameter.
  • the value of the sensing unit can be predefined.
  • the value of the predefined sensing unit may be one or more specific values, or one or more ranges of values.
  • The value of the sensing unit may be an explicit value, or may be formed by wildcards (or the like) together with explicit values, but is not limited thereto. For example, when the sensing unit is "speech", the value may be "*rain*", matching any speech that contains "rain"; or the value may be "*[下]雨*" (as in the original Chinese example), where the bracketed character is optional, matching any speech that contains either "下雨" ("raining") or "雨" ("rain").
  • The robot 100 may generate sensing data according to the sensing units and the perceived information; the sensing data may include one or more sensing units, and includes the identifier and value of each sensing unit.
  • When generating sensing data from the perceived information according to the sensing units, the robot 100 can obtain the value of a sensing unit from the perceived information using various analysis methods, for example obtaining the text of speech through speech recognition technology, analyzing whether there is a person in a perceived image through image recognition technology, determining attributes of a person through portrait (face) recognition technology, and the like. It should be understood that the robot 100 is not limited to obtaining the value of the sensing unit in the above manner; other means, including processing techniques that had not been developed as of the filing date of this document, may also be used.
  • A plurality of sensing units may be preset. It should be understood that the following exemplary sensing units do not limit how sensing units are divided, their number, or how they are expressed; in fact, any division of sensing units may be considered.
  • An example of the sensing unit is shown in Table 1.
  • Based on the sensing units in Table 1, exemplary sensing data is given below. It should be understood that the following sensing data does not limit the number of elements of the sensing data, its definition, its format, or its manner of expression.
  • The sensing data of one example case is expressed in JSON as follows, but is not limited thereto; other representations are also possible.
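  • The exact JSON of this example is not reproduced in the text; the following sketch reconstructs it from the field descriptions below (all field names and values are those stated in the text):

    {
      "vision_human_position": "back",
      "sensing_touch": "hand",
      "audio_speak_txt": "very happy to see you",
      "audio_speak_language": "chinese",
      "vision_human_posture": "posture1",
      "system_date": "2016/3/16",
      "system_time": "13-00-00",
      "system_power": "80%"
    }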
  • "vision_human_position" records that the human user is behind the robot ("back").
  • "back" can also be represented by other characters, as long as different positions can be distinguished; it should be understood that the position can also be expressed as an angle value, for example "vision_human_position": "45°" or the like.
  • "sensing_touch" records the human user's touch on the robot; the touched position is the hand ("hand"). "hand" can also be represented by other characters, as long as different positions can be distinguished. It should be understood that there can be multiple touch positions, in which case the value of "sensing_touch" can be an array that records multiple locations.
  • “audio_speak_txt” records what the human user said “very happy to see you”, and the content can also be audio data.
  • “audio_speak_language” records the language “chinese” spoken by human users.
  • “vision_human_posture” records the human user's gesture “posture1", and “posture1” can also be represented by other characters, which can distinguish different postures.
  • “system_date” records the date “2016/3/16" of the generation of the perceptual data
  • “system_time” records the time “13-00-00” of the perceptual data generation.
  • "system_power" records the robot's power level as "80%". It should be understood that the power level can also be expressed in other ways.
  • the trigger condition and the interaction behavior triggered by the trigger condition can be set.
  • a control item for controlling the interaction behavior of the robot in response to the information perceived by the robot is generated.
  • Control entries can have unique identifiers to distinguish control entries.
  • The trigger condition may be composed of one or more sensing units, and logical relationships may be configured between the sensing units; the logical relationships include, but are not limited to, "and", "or", "not", and the like.
  • the triggering condition may include an identification and a value of the sensing unit constituting the triggering condition, and the value of the sensing unit may be one or a set of values or one or a set of value ranges.
  • the value of the sensing unit may be an explicit value, or may be formed by a wildcard (or the like) together with an explicit value, but is not limited thereto.
  • For example, when the sensing unit in the trigger condition is "speech", the value may be "*rain*", matching any speech that contains "rain"; or the value may be "*[下]雨*", where the bracketed character is optional, matching any speech that contains either "下雨" ("raining") or "雨" ("rain").
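  • The patent does not show a standalone JSON example of a trigger condition at this point. Purely as an illustrative sketch, reusing the "ifs" key from the singing example later in the text and assuming that listing several sensing units implies an "and" combination, a condition requiring both a heard phrase and a detected position might look like:

    {
      "ifs": {
        "audio_speak_txt": "*rain*",
        "vision_human_position": "back"
      }
    }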
  • A trigger condition can trigger one or more interaction behaviors.
  • the order between interactions can be set to perform multiple interactions in a set order.
  • the interaction behavior can be configured as one or more action instructions that can be parsed by the robot for execution, and the action instructions can also include one or more parameters.
  • the order of execution of the one or more action instructions can also be configured.
  • the execution order may include, but is not limited to, randomly executing one or a set of action instructions to effect random execution of one or more actions; or executing a plurality of action instructions in a predetermined sequence of steps.
  • The operating system 144 of the robot 100 and other related devices can parse the action instructions of the interaction behavior, so that the robot performs the interaction behavior. For example, to move the robot forward by 5 meters, the action instruction can be move {"m": 5}.
  • The robot interaction behavior control device 148 parses the action instruction, obtains the task to be executed (move) and the task parameters (5 meters forward), and passes the task and parameters to the operating system 144; the operating system 144 then drives the mobile device (not shown) to perform the movement, where the mobile device may include feet, wheels, crawler tracks, and the like. It should be understood that more specific instructions, such as parameters for individual motors (or similar components) of the mobile device, may also be provided.
  • The action instructions of an interaction behavior include: links to other control entries, set so that those control entries can be executed; and/or links to multiple contents and/or multiple parameters, set so that content and/or parameters can be selected from the multiple contents and/or parameters.
  • Each control entry may have a unique identification to which an action instruction may refer to the control entry.
  • the content linked to the action instruction may be a set of actions, and the robot 100 may perform actions in a set of actions according to other factors.
  • Attributes such as the personality or gender of the robot 100 may be pre-configured and stored in the memory 102. Robots 100 with different genders or personalities may behave differently in the same situation (or scene), and the robot 100 may select the action to execute from a set of actions according to attributes such as the configured personality or gender; the actions may include, but are not limited to, limb movements of the robot 100, and so on.
  • The action instruction may also be linked to one or a group of contents, which may include, but are not limited to, voice chat content, various Internet information, and the like. For example, if the action the robot 100 performs according to the control entry is to query the weather in Beijing, the action instruction may carry an address for querying the weather, and the robot 100 obtains the weather in Beijing from that address; the address may be a uniform resource locator (URL), a memory address, a database field, and the like.
  • the interactive behavior of the robot 100 includes, but is not limited to, by outputting a voice, adjusting a gesture, outputting an image or video, interacting with other devices, and the like.
  • Outputting speech includes, but is not limited to, chatting with a user and playing music; adjusting posture includes, but is not limited to, moving (e.g., mimicking human walking), limb swings (e.g., arm or head swings), posture adjustment, and so on; outputting images or video includes, but is not limited to, displaying an image or video on a display device, where the image may be a dynamic electronic expression or the like, or an image obtained from a network; interacting with other devices includes, but is not limited to, controlling other devices (such as adjusting the operating parameters of an air conditioner), transmitting data to other devices, establishing connections with other devices, and the like. It should be understood that the interaction behavior is not limited to the above enumerated contents; any reaction of the robot 100 to the perceived information can be regarded as an interaction behavior of the robot 100.
  • Control entries can be configured in a data exchange format, although other formats can be used.
  • Data exchange formats include, but are not limited to, XML, JSON, or YAML.
  • Take JSON as an example, and suppose the following needs to be implemented: when the user says "Sing me a song", the robot first moves backward 10 cm at medium speed in the 0-degree direction and then starts singing a song; after the song finishes, it takes a photo, sends it to the user 10 seconds later, and then moves forward 5 cm in the 0-degree direction.
  • The control entry in JSON data format can be as follows:
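  • The JSON of the original figure is not reproduced in the text; the sketch below reconstructs a plausible control entry from the explanation that follows ("ifs", "ear", "singing", "trigger", "move", "song", "take_pic", "gr" and the mp3 URL are taken from the text, while the list of two movement steps and the parameter names "angle", "velocity", "m", "url", "delay" and "send_to" are assumptions made for illustration):

    {
      "ifs": { "ear": "singing" },
      "trigger": {
        "move": [
          { "gr": 0, "angle": 0, "velocity": "medium", "m": -0.1 },
          { "gr": 3, "angle": 0, "m": 0.05 }
        ],
        "song": { "gr": 1, "url": "http://bpeer.com/i.mp3" },
        "take_pic": { "gr": 2, "delay": 10, "send_to": "@master" }
      }
    }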
  • the "ifs” part is a trigger condition set according to the sensing unit
  • "ear” is the identification of the sensing unit
  • “singing” is the value of the sensing unit.
  • The "trigger" part is the interaction behavior triggered by the trigger condition, and includes three interaction behaviors, "move", "song" and "take_pic", each of which includes the corresponding action instructions. Among them, "song" is linked to "http://bpeer.com/i.mp3", so the content to sing is obtained from "http://bpeer.com/i.mp3", and "gr" indicates the execution order of the actions.
  • control entries may be stored as documents in a data exchange format, or may also be stored in a database.
  • The robot 100 can also include a database system for storing control entries.
  • the database system provides an interface for one or more CPUs 106 to read data from the database and to write data to the database system.
  • The control device 148 for robot interaction behavior may control the interaction behavior of the robot according to the control entries: the control device 148 acquires the information perceived by the robot and generates sensing data from the perceived information according to at least one predefined sensing unit, where the sensing data includes the identifier and value of the sensing unit; it then searches for a control entry matching the generated sensing data, and if a matching control entry is found, causes the robot to perform the interaction behavior in the found control entry.
  • The control device 148 can also transmit the information perceived by the robot 100 to a remote server (not shown); the remote server generates the sensing data from the perceived information and the sensing units, searches for a control entry matching the generated sensing data, and then sends it to the control device 148, which causes the robot to perform the interaction behavior in the control entry.
  • An identifier of the perceived information may be generated to determine whether a received control entry corresponds to the transmitted perceived information.
  • What is sent to the control device 148 may be the control entry itself, an identifier of the control entry, the interaction behavior data configured in the control entry, or other information that enables the control device 148 to determine the interaction behavior configured in the control entry.
  • The control device 148 can also generate the sensing data from the information perceived by the robot 100 and the sensing units, and send the generated sensing data to the remote server; the remote server receives the sensing data, searches for a control entry matching it, and sends the found control entry to the robot 100, causing the robot 100 to perform the interaction behavior in the control entry.
  • control device 148 is not limited to controlling the interactive behavior of the robot by the manner described above, but may be a combination of the above several manners or other means.
  • The behavior instructions and behavior control parameters can be written in JSON, but are not limited thereto; other representations are also possible.
  • Non-limiting examples include:
  • the behavior name is: audio_speak;
  • Behavior control parameters can include: text (content to say), volume (volume of speech), etc. (eg, vocal gender, or vocal age, etc.)
  • The JSON representation is as follows:
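  • A minimal sketch, assuming illustrative values (the "text" and "volume" parameters are named in the text; the exact JSON of the original is not reproduced):

    {
      "audio_speak": {
        "text": "Hello, nice to see you",
        "volume": "60%"
      }
    }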
  • The text may include conversion characters (placeholders), each of which corresponds to a parameter.
  • For example, the conversion character for "owner" can be defined as "@master".
  • The JSON representation containing the conversion character is as follows:
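  • A sketch along the same lines, with the "@master" conversion character embedded in the text (the sentence itself is an illustrative assumption):

    {
      "audio_speak": {
        "text": "Good morning, @master",
        "volume": "60%"
      }
    }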
  • "volume" is set as a percentage, and the robot can calculate its specific output parameter from the percentage value of "volume".
  • "volume" can also be expressed directly as a robot-specific parameter.
  • the behavior name is: audio_sound_music
  • Behavior control parameters may include: path (path to play music, or file name, etc.), volume (volume of playing music), etc.
  • The JSON representation is as follows:
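  • A minimal sketch ("path" and "volume" are the parameters named in the text; the mp3 URL is reused from the earlier example and the values are illustrative):

    {
      "audio_sound_music": {
        "path": "http://bpeer.com/i.mp3",
        "volume": "70%"
      }
    }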
  • the behavior name is: audio_sound_info
  • Behavior control parameters include: name (the name of the tone to be played), volume (play volume), etc.
  • The JSON representation is as follows:
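  • A minimal sketch ("name" and "volume" are the parameters named in the text; the tone name "notification" is a hypothetical example):

    {
      "audio_sound_info": {
        "name": "notification",
        "volume": "50%"
      }
    }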
  • the behavior name is: motion_head;
  • Behavior control parameters may include: motor (motor that performs motion), velocity (motor speed), angle (motor motion angle), etc.
  • The JSON representation is as follows:
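  • A minimal sketch ("motor", "velocity" and "angle" are the parameters named in the text; the motor identifier "head_pitch" is a hypothetical example, "velocity" is given as a gear position and "angle" as a percentage, as discussed below). The other motion behaviors (motion_neck, motion_shoulder, motion_elbow, motion_wrist, motion_waist, motion_eye) use the same parameters:

    {
      "motion_head": {
        "motor": "head_pitch",
        "velocity": "2",
        "angle": "50%"
      }
    }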
  • velocity is represented as a gear position, and the robot can calculate a specific "velocity” based on the gear position.
  • "velocity” can also be expressed as a specific parameter of the robot's head movement.
  • "angle" is expressed as the angle of the motor.
  • "angle" can also be expressed as relative data such as a percentage, for example "angle": "50%"; the robot can then determine the specific angle from the motor's angle range. For example, if the maximum angle is 180 degrees, the calculated specific angle is 90 degrees, but it is not limited thereto.
  • the behavior name is: motion_neck;
  • Behavior control parameters may include: motor (motor that performs motion), velocity (motor speed), angle (motor motion angle), etc.
  • The JSON representation follows the same pattern as motion_head above.
  • the behavior name is: motion_shoulder;
  • Behavior control parameters may include: motor (motor that performs motion), velocity (motor speed), angle (motor motion angle), etc.
  • The JSON representation follows the same pattern as motion_head above.
  • the behavior name is: motion_elbow;
  • Behavior control parameters may include: motor (motor that performs motion), velocity (motor speed), angle (motor motion angle), etc.
  • The JSON representation follows the same pattern as motion_head above.
  • the behavior name is: motion_wrist
  • Behavior control parameters may include: motor (motor that performs motion), velocity (motor speed), angle (motor motion angle), etc.
  • The JSON representation follows the same pattern as motion_head above.
  • the behavior name is: motion_waist
  • Behavior control parameters may include: motor (motor that performs motion), velocity (motor speed), angle (motor motion angle), etc.
  • The JSON representation follows the same pattern as motion_head above.
  • the behavior name is: motion_eye;
  • Behavior control parameters may include: motor (motor that performs motion), velocity (motor speed), angle (motor motion angle), etc.
  • The JSON representation follows the same pattern as motion_head above.
  • the behavior name is: display_emotion
  • Behavior control parameters can include: content (displayed emoticons), velocity (display speed), etc.
  • The JSON representation is as follows:
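  • A minimal sketch ("content" and "velocity" are the parameters named in the text; the expression name "smile" is a hypothetical example):

    {
      "display_emotion": {
        "content": "smile",
        "velocity": "1"
      }
    }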
  • the behavior name is: program_photo;
  • Behavior control parameters can include: flash (whether the flash is turned on), etc.
  • The JSON representation is as follows:
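  • A minimal sketch ("flash" is the parameter named in the text; the boolean value is an assumption):

    {
      "program_photo": {
        "flash": false
      }
    }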
  • The behavior name is: control_tv;
  • Behavior control parameters can include: state (eg open, close), etc.
  • The JSON representation is as follows:
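  • A minimal sketch (the "state" parameter and the "open"/"close" values are named in the text):

    {
      "control_tv": {
        "state": "open"
      }
    }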
  • The behavior name is: control_led;
  • Behavior control parameters can include: state (eg open, close), color, etc.
  • The JSON representation is as follows:
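  • A minimal sketch ("state" and "color" are the parameters named in the text; the color value "blue" is a hypothetical example):

    {
      "control_led": {
        "state": "open",
        "color": "blue"
      }
    }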
  • A trigger condition may be set in terms of the sensing units, together with the interaction behavior triggered by the trigger condition, to obtain a control entry; the control entry is then used as data for controlling the interaction behavior of the robot 100.
  • FIG. 2 illustrates a flow chart of a method of generating control data for a robot in accordance with some embodiments of the present invention. As shown in FIG. 2, the method includes:
  • Step S202 setting a trigger condition for controlling the interaction behavior of the robot according to one or more preset sensing units
  • Step S204 setting an interaction behavior triggered by the trigger condition according to one or more preset interaction behaviors set for the robot to perform;
  • Step S206 generating a control item for controlling the interaction behavior of the robot in response to the information perceived by the robot according to the set trigger condition and the interaction behavior.
  • At least one sensing unit may be selected from the preset sensing units; attributes of the selected sensing unit are set, where the attributes of the sensing unit include its value; and the trigger condition for controlling the robot's interaction behavior is set according to the selected sensing unit and its attributes.
  • The relationships among multiple sensing units are also set; these relationships include, but are not limited to, logical relationships such as "AND", "OR" and "NOT".
  • The trigger condition is then set according to the sensing units, their attributes, and the relationships among the sensing units.
  • At least one interaction behavior may be selected from the preset interaction behaviors set for the robot to perform; attributes of the selected interaction behavior are set, where the attributes include one or more action instructions that the robot can parse and the parameters of those action instructions; and the interaction behavior triggered by the trigger condition is set according to the selected interaction behavior and its attributes.
  • the execution order of the plurality of interaction behaviors may also be set, and the execution order of the interaction behaviors includes, but is not limited to, randomly performing one or more interaction behaviors, or performing a plurality of interaction behaviors according to predetermined steps.
  • the interaction behavior triggered by the trigger condition may be set according to the selected interaction behavior and the attribute of the interaction behavior and the execution order.
  • The trigger condition and the interaction behavior it triggers are described using a predetermined data representation.
  • The control entry may be generated in a data exchange format from the trigger condition and the interaction behavior it triggers.
  • Data exchange formats include, but are not limited to, one or any combination of the following: XML, JSON, or YAML. It should be understood that other formats may also be used to express the trigger conditions and the interaction behaviors they trigger, including data representations that had not been developed as of the filing date of this document.
  • multiple control entries can be set and multiple control entries stored as documents in a data exchange format.
  • multiple control entries can also be stored in the database.
  • adjacent control entries can be separated by a predetermined symbol to distinguish between different control entries.
  • the document storing the control entry may be stored in the memory 102 of the robot 100, and the document of the control entry may also be stored in the remote server.
  • the interaction behavior is configured as one or more action instructions.
  • The above action instructions include: links to other control entries, set so that those control entries can be executed; and/or links to multiple contents and/or multiple parameters, set so that content and/or parameters can be selected from the multiple contents and/or parameters.
  • For the action instruction "query weather", it is possible to link to a webpage providing weather information and obtain the weather information of the city to be queried from that webpage. After the weather information is queried, it may be displayed on the display device of the robot 100, or broadcast by voice.
  • When linking to multiple parameters, the parameters actually used may be selected according to other configuration; when linking to multiple contents (such as multiple chat corpora), the content to present can also be chosen based on other configuration.
  • the execution order of the action instructions may also be set, wherein the execution sequence includes: randomly executing one or more action instructions, or executing a plurality of action instructions in predetermined steps.
  • The order of execution can be marked with symbols; if there is no mark, the actions can be executed in the order in which they are described. Actions of the same type can be treated as a whole, and the order of actions can be marked. For example, for "move forward 5 meters, nod 5 times, then move back 10 meters", the action instruction can be expressed as [move:{gr:0, m:+5; gr:2, m:-10}; head:{gr:1, head:5}], where "gr" indicates the execution order of the actions and the action with the smaller value is executed first.
  • A graphical user interface can be provided for setting trigger conditions and interaction behaviors. The graphical user interface presents the preset sensing units (e.g., the sensing unit's name, identifier, etc.), the values that can be set for each sensing unit, and the relationships that can be set between sensing units.
  • The user setting a trigger condition can select the sensing units, the values of the sensing units, and the logical relationships between the sensing units; after the sensing units for the trigger condition are selected, the trigger condition is generated in the corresponding format.
  • The graphical user interface can also present the preset interaction behaviors, which can be predefined interaction behaviors; after an interaction behavior is selected, it is generated in the corresponding format.
  • The trigger condition and the interaction behavior may also be edited directly; for example, using the data exchange format described above, the predefined sensing units, and the action instruction specification of the interaction behaviors, the trigger condition and the interaction behavior it triggers are edited to obtain a control entry.
  • Content may be fetched from the Internet (e.g., a web page), the fetched content may be analyzed to obtain content for setting control entries, and the trigger conditions and the interaction behaviors they trigger may be set according to that content.
  • For example, a trigger condition for "ill" can be set in terms of the sensing units, and the interaction behavior triggered by the trigger condition can be set to "make an emergency call". If a "health status" sensing unit is predefined, its value can be set directly to "ill", and the trigger condition can be {if("health": "sick")}.
  • the robot 100 can determine the health status of the user based on the perceived data, determine whether the health condition is "ill", for example, perform a voice chat with the user to understand the state of the user, and detect the heart rate, body temperature, and the like of the user.
  • When the health condition is "ill", the robot 100 generates sensing data including {"health": "sick"}.
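  • Purely as an illustrative sketch of the resulting control entry (the trigger value comes from the text; the behavior name "program_call" and the number "120", China's medical emergency number, are assumptions, since the text does not define a calling behavior):

    {
      "ifs": { "health": "sick" },
      "trigger": {
        "program_call": { "gr": 0, "number": "120" }
      }
    }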
  • After the control entries are used as the data for controlling the robot's interaction behavior, the interaction behavior of the robot can be controlled according to the control entries.
  • FIG. 3 illustrates a first flow chart of a method of controlling robot interaction behavior in accordance with some embodiments of the present invention. As shown in FIG. 3, the method includes:
  • Step S302: acquiring information perceived by the robot;
  • Step S304 Generate sensing data according to the perceptual information, at least according to a predefined sensing unit, where the sensing data includes an identifier and a value of the sensing unit.
  • Step S306 searching for a control entry that matches the generated sensing data among the stored plurality of control entries;
  • Step S308 if the control item matching the generated sensing data is found, the robot is caused to perform the interactive behavior in the found control item.
  • The robot 100 communicates with a remote server (not shown) over a network. The robot 100 perceives at least one piece of information, and the remote server acquires the perceived information from the robot 100; the acquisition may involve the remote server requesting the robot 100 to transmit the information it perceives, or the robot 100 transmitting the perceived information to the remote server on its own after sensing it.
  • the robot 100 may periodically transmit the perceived information to the remote server or transmit the perceived information to the remote server when the perceived information changes to reduce the amount of data transmission between the remote server and the robot 100.
  • The control entry document can be stored in a remote server that includes one or more processors and one or more modules, programs, or sets of instructions stored in memory to perform the method illustrated in FIG. 3.
  • the remote server can be a single server or a server cluster consisting of multiple servers. It should be understood that the above described program or set of instructions is not limited to running on a single server, but can also be run on distributed computing resources.
  • the found control entry can be sent to the robot 100, which reads the interaction behavior from the control entry and performs it.
  • alternatively, the data of the interaction behavior in the found control entry may be transmitted to the robot 100.
  • alternatively, the data of the interaction behavior in the control entry may be parsed to obtain instructions that the robot 100 can execute, and the obtained instructions are transmitted to the robot 100, which executes them. It should be understood that the above manners are merely illustrative.
  • FIG. 4 illustrates a second flowchart of a method of controlling robot interaction behavior according to some embodiments of the present invention. As shown in FIG. 4, the method includes:
  • Step S402: receiving sensing data of the robot, wherein the sensing data is generated from the information perceived by the robot, at least according to predefined sensing units, and includes the identifier and value of each sensing unit;
  • Step S404: searching the stored plurality of control entries for a control entry that matches the robot's sensing data;
  • Step S406: if a control entry matching the robot's sensing data is found, causing the robot to perform the interaction behavior in the found control entry. A minimal server-side sketch of these steps is given below.
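  • A minimal server-side sketch of steps S402 to S406, reusing the find_matching_entry sketch above (the reply fields "matched" and "entry" are assumptions; the document leaves the reply format open, and the server may instead return only the entry identifier or its configured behaviors):

      def handle_sensing_data_request(sensing_data, stored_entries):
          # search the stored control entries for one matching the received sensing data
          entry = find_matching_entry(sensing_data, stored_entries)
          if entry is None:
              return {"matched": False}
          return {"matched": True, "entry": entry}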
  • as shown in FIG. 4, the robot 100 senses at least one piece of information, generates sensing data based on the sensed information and the sensing units, and transmits the sensing data.
  • the robot 100 sends the sensing data to a remote server (not shown).
  • the robot 100 may transmit the sensing data as soon as it is generated, or may transmit it after receiving a request from the remote server.
  • the remote server stores documents of control entries, such as documents in a data exchange format, a database, and the like.
  • control entry documents can be distributed across multiple storage spaces.
  • the remote server may include one or more processors and one or more modules, programs, or sets of instructions stored in memory to perform the method illustrated in FIG. 3.
  • FIG. 5 illustrates a third flowchart of a method for controlling the interaction behavior of a robot according to some embodiments of the present invention. As shown in FIG. 5, the method includes:
  • Step S502: sensing at least one piece of information;
  • Step S504: generating sensing data from the perceived information, at least according to the predefined sensing units, where the sensing data includes the identifier and value of each sensing unit;
  • Step S506: sending the generated sensing data out;
  • Step S508: receiving information of a control entry that matches the sensing data;
  • Step S510: performing the interaction behavior configured in that control entry according to the information of the control entry. A minimal robot-side sketch of these steps is given below.
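  • A minimal robot-side sketch of steps S502 to S510 (an illustration only: generate_sensing_data, send_to_server, receive_entry_info and execute_action are hypothetical helpers standing in for the robot's sensing, communication and actuation components, and the returned information is assumed to be the control entry itself with the "ifs"/"trigger" layout):

      def robot_interaction_cycle(perceived_information):
          sensing_data = generate_sensing_data(perceived_information)   # e.g. {"health": "sick"}
          send_to_server(sensing_data)                                   # step S506
          entry = receive_entry_info()                                   # step S508
          if entry:                                                      # a matching control entry was returned
              for action, params in entry.get("trigger", {}).items():
                  execute_action(action, params)                         # step S510: perform each configured behavior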
  • the interaction behavior control device 148 of the robot 100 performs the method shown in FIG. 5.
  • the robot 100 perceives at least one piece of information and, following the sensing-data generation policy, generates sensing data according to the sensing units. After generating the sensing data, the robot 100 transmits it to the remote server.
  • the remote server, which stores the control entry document, searches the stored plurality of control entries for a control entry matching the robot's sensing data and, if such an entry is found, sends it to the robot 100.
  • alternatively, the action instructions of the interaction behavior in the control entry can be sent to the robot 100.
  • the identifier of the generated sensing data may also be determined before the generated sensing data is transmitted. After the identifier is determined, the generated sensing data and its identifier are sent out together. After the remote server finds the control entry that matches the generated sensing data, it sends the information of the control entry together with the identifier of the corresponding sensing data to the control device 148; the information of the control entry may be, but is not limited to, the control entry itself, the identifier of the control entry, the behavior configured in the control entry, or any combination thereof. The control device 148 receives the information of the control entry and, according to the identifier of the sensing data carried in that information, determines whether the received information belongs to the control entry that matches the generated sensing data.
  • the control device 148 can determine the corresponding control entry based on the identifier of the control entry and perform the interaction behavior in that entry. Alternatively, the control device 148 can read the interaction behavior configured in the control entry directly from the control entry sent by the remote server and perform it. Moreover, if the remote server sends the interaction behavior configured in the control entry, the control device 148 can directly parse and execute that behavior. A small sketch of the identifier correlation is given below.
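  • A small sketch of the identifier correlation (the field names "id" and "sensing_id" are assumptions; the document only requires that an identifier of the sensing data accompany both the transmitted sensing data and the returned control entry information):

      import uuid

      sensing_data = {"health": "sick"}
      sensing_id = str(uuid.uuid4())                       # identifier of the generated sensing data
      request = {"id": sensing_id, "data": sensing_data}   # sent to the remote server together

      def is_reply_for_request(reply, expected_id):
          # check that the returned control entry information refers to the sensing data that was sent
          return reply.get("sensing_id") == expected_id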
  • the robot's sensing data may be matched against the trigger conditions in the control entries; the matching includes, but is not limited to, determining whether a certain sensing unit is present and comparing the values of the sensing units.
  • when multiple trigger conditions match the robot's sensing data, the degree of matching between the sensing data and each of the matched trigger conditions may be determined, and the control entry matching the generated sensing data is selected at least according to that degree of matching.
  • as an example, for the speech text in the sensing data, the degree of matching may be determined using the edit distance: the smaller the edit distance, the more similar the two texts. Speech text can also be matched using regular expressions, as sketched below.
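  • A sketch of both matching options (a standard Levenshtein edit distance, plus a regular expression corresponding to the example value "*[下有]雨*" used elsewhere in this document; neither implementation is prescribed by the document):

      import re

      def edit_distance(a, b):
          # Levenshtein distance; a smaller value means the two texts are more similar
          row = list(range(len(b) + 1))
          for i, ca in enumerate(a, 1):
              prev, row[0] = row[0], i
              for j, cb in enumerate(b, 1):
                  prev, row[j] = row[j], min(row[j] + 1, row[j - 1] + 1, prev + (ca != cb))
          return row[len(b)]

      matched = bool(re.search("下雨|有雨", "明天会下雨吗"))   # True: the utterance mentions rain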
  • a priority can also be set for each control entry, and this priority can be referenced when selecting among control entries.
  • control entries can be classified into core control entries, user control entries, and temporary control entries, with the core control entries having the highest priority, followed by the user control entries, and finally the temporary control entries. When searching for a control entry, the core control entries may be searched first for one matching the sensing data; if none is found there, the user control entries are searched, and if none is found there either, the temporary control entries (the temporary training library) are searched. A sketch of this tiered lookup is given below.
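  • A sketch of the tiered lookup, reusing the find_matching_entry sketch above (how the three classes of control entries are stored is not specified by the document, so three separate lists are assumed here):

      def find_entry_by_priority(sensing_data, core_entries, user_entries, temporary_entries):
          # search the tiers in priority order: core first, then user, then temporary
          for tier in (core_entries, user_entries, temporary_entries):
              entry = find_matching_entry(sensing_data, tier)
              if entry is not None:
                  return entry
          return None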
  • the robot 100 can perceive at least one piece of information, generate sensing data based on the perceived information and the sensing units, read control entries (including but not limited to reading them from the memory 102 of the robot 100), and search the read control entries for one matching the generated sensing data; if a control entry matching the generated sensing data is found, the robot 100 performs the interaction behavior in the found control entry.
  • the control entry document may also be stored both in the memory 102 of the robot 100 and in the remote server.
  • in that case the robot 100 perceives at least one piece of information, generates sensing data based on the perceived information and the sensing units, reads the control entries from the memory 102, and searches the read control entries for one matching the generated sensing data.
  • if a matching control entry is found, the robot 100 performs the interaction behavior in the found control entry; if no control entry matching the generated sensing data is found among the read control entries, the robot 100 may send the generated sensing data to the remote server, which searches its stored control entries for one matching the received sensing data and, if such an entry is found, causes the robot 100 to perform the interaction behavior in that entry. A sketch of this local-first lookup is given below.
  • the remote server can also send the found control entry to the robot 100, which can receive the control entry through an interface (not shown) and store it.
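  • A sketch of the local-first lookup with remote fallback, reusing find_matching_entry (remote_lookup is a hypothetical callable standing in for the request to the remote server, and storing the returned entry locally corresponds to the optional step described above):

      def handle_sensing_data(sensing_data, local_entries, remote_lookup):
          # try the control entries read from the robot's own memory first
          entry = find_matching_entry(sensing_data, local_entries)
          if entry is None:
              # fall back to the remote server, which searches its stored control entries
              entry = remote_lookup(sensing_data)
              if entry is not None:
                  local_entries.append(entry)   # optionally store the received control entry locally
          return entry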
  • as described above, when a control entry matching the sensing data is found, the robot 100 is caused to perform the interaction behavior in that control entry. When no control entry matching the sensing data is found, no interaction behavior need be performed, and the robot 100 may continue to perceive at least one piece of information; which information to perceive can be determined according to preset conditions. In some embodiments, when no control entry matching the sensing data is found, a voice reply can be made or content from the Internet can be introduced (e.g., displaying web page information).
  • when no control entry matching the sensing data is found, it may also be determined whether the sensing data is related to speech (for example, whether a voice instruction from the user has been received); if so, a voice reply may be made, or relevant content may be searched for on the Internet according to the speech content and presented to the user on the display device of the robot 100.
  • control entries can also be set based on the interaction between the robot and the user.
  • when no control entry matching the robot 100's sensing data is found, the robot 100 can chat with the user by voice.
  • during the chat, the robot 100 analyzes the user's needs and intentions and derives the scenario and the robot's interaction behavior in that scenario.
  • a control entry is then generated from the scenario and the robot's interaction behavior, in accordance with the sensing units. For example, when the user is ill and says to the robot "I am sick", the control entries of the robot 100 may contain no interaction behavior for the case where the user is ill.
  • in that case the robot 100 can interact with the user by voice, for example saying to the user "I don't know what needs to be done", and the user may reply "Please call my personal doctor for me, the number is ....".
  • the robot 100 can then make the call.
  • furthermore, the robot 100 concludes from its analysis that the doctor needs to be contacted when the user is "ill"; according to the result of this analysis, the robot 100 can generate a control entry in which, for example, the trigger condition is [if(health:sick)] and the interaction behavior triggered by that condition is [call{number:"//doctor_number.php"}].
  • the structure of an apparatus for generating control data of a robot according to some embodiments is described below. Since the principle by which the apparatus for generating the robot's control data solves the problem is similar to that of the control method for the robot's interaction behavior, the implementation of the apparatus can refer to the implementation of the method for generating the robot's control data, and repeated details are not described again.
  • as used below, the term "unit" or "module" may refer to software and/or hardware, or a combination of the two, that implements a predetermined function.
  • although the apparatus described in the following embodiments is preferably implemented in software, implementations in hardware, or in a combination of software and hardware, are also possible and contemplated.
  • FIG. 6 is a block diagram showing the structure of an apparatus for generating control data of a robot according to some embodiments of the present invention. As shown in FIG. 6, the apparatus includes:
  • the trigger condition setting module 602, configured to set a trigger condition for controlling the interaction behavior of the robot according to one or more preset sensing units, wherein a sensing unit is set as the minimum unit for controlling the interaction behavior of the robot;
  • the interaction behavior setting module 604, connected to the trigger condition setting module 602 and configured to set the interaction behavior triggered by the trigger condition according to one or more preset interaction behaviors provided for the robot to perform;
  • the generating module 606, connected to the interaction behavior setting module 604 and configured to generate, from the set trigger condition and interaction behavior, a control entry for controlling the interaction behavior of the robot in response to the information perceived by the robot.
  • FIG. 7 is a block diagram showing the structure of a control apparatus for robot interaction behavior according to some embodiments of the present invention.
  • as shown in FIG. 7, the apparatus includes: an acquisition module 702, configured to acquire information perceived by a robot;
  • a generation module 704, connected to the acquisition module 702 and configured to generate sensing data from the perceived information, at least according to predefined sensing units;
  • a searching module 706, connected to the generation module 704 and configured to search for a control entry that matches the generated sensing data; and
  • an executing module 708, connected to the searching module 706 and configured to cause the robot to perform the behavior in the found control entry when a control entry matching the generated sensing data is found.
  • as shown in FIG. 8, the apparatus includes: a sensing module 802, configured to sense at least one piece of information;
  • a generating module 804, connected to the sensing module 802 and configured to generate sensing data from the perceived information, at least according to predefined sensing units;
  • a sending module 806, connected to the generating module 804 and configured to send the generated sensing data out;
  • a receiving module 808, connected to the sending module 806 and configured to receive information of a control entry that matches the sensing data; and
  • an executing module 810, connected to the receiving module 808 and configured to perform the interaction behavior in the control entry according to the information of the control entry.
  • FIG. 9 is a block diagram showing the structure of a control apparatus for robot interaction behavior according to some embodiments of the present invention.
  • as shown in FIG. 9, the apparatus includes: a receiving module 902, configured to receive sensing data of a robot; a searching module 904, connected to the receiving module 902 and configured to search for a control entry that matches the robot's sensing data; and an executing module 906, connected to the searching module 904 and configured to cause the robot to perform the behavior triggered by the trigger condition in the found control entry when a control entry corresponding to the robot's sensing data is found.
  • as shown in FIG. 10, the apparatus includes: a control entry document 1002, configured to provide a control entry document containing a plurality of control entries,
  • wherein each control entry includes a trigger condition and the behavior triggered by that trigger condition, and each trigger condition is composed of at least one predefined sensing unit; and
  • a matching module 1004, connected to the control entry document 1002 and configured to match the robot's sensing data against the control entries to determine whether a control entry matching the robot's sensing data exists, wherein the sensing data is generated from the information perceived by the robot, at least according to predefined sensing units.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • General Engineering & Computer Science (AREA)
  • Manipulator (AREA)

Abstract

A control method and apparatus for robot interaction behavior, and a robot. The control method includes: acquiring information perceived by a robot (S202); generating, from the perceived information and at least according to predefined sensing units, sensing data containing the identifiers and values of the sensing units (S304); searching for a control entry that matches the generated sensing data and is used to control the robot's behavior in response to the information perceived by the robot (S306), where a control entry contains a trigger condition composed of at least one sensing unit and the behavior triggered by that condition; and, if a control entry matching the generated sensing data is found, causing the robot to perform the behavior in the found control entry (S308). The method defines the sensing unit as the minimum unit for controlling the robot's interaction behavior and sets control entries according to sensing units to control that behavior, effectively improving the robot's capacity for adaptive interaction and its degree of intelligence.

Description

机器人交互行为的控制方法、装置及机器人 技术领域
本发明涉及机器人技术领域,特别涉及一种机器人交互行为的控制方法、装置及机器人。
背景技术
当今的机器人多为工业机器人,而工业机器人以无感知能力的居多。这些机器人的操作程序都是预先制定的,并按照预定程序重复无误地完成确定的任务。它们缺乏适应性,只有当涉及的对象相同时,才能产生一致的结果。
发明内容
本发明实施例提供了一种机器人交互行为的控制方法、装置及机器人,以至少有效提高机器人自适应交互行为能力与智能化程度。
在某些实施例中,一种机器人交互行为的控制方法,包括:获取机器人感知到的信息;根据感知到的信息、至少按照预先定义的感知单元生成感知数据,其中,该感知数据包括感知单元的标识和取值;查找与生成的感知数据匹配的控制条目,其中,每个控制条目包含触发条件和该触发条件触发的行为,每个触发条件由至少一个感知单元构成;如果查找到与上述生成的感知数据匹配的控制条目,使机器人执行查找到的控制条目中的行为。
在某些实施例中,一种机器人交互行为的控制方法,包括:感知至少一项信息;根据感知到的信息、至少按照预先定义的感知单元生成感知数据,其中,感知数据包括感知单元的标识和取值;将生成的感知数据发送出去;接收与上述感知数据匹配的控制条目,其中,该控制条目包含触发条件和该触发条件触发的行为,触发条件由至少一个感知单元构成;执行接收到的控制条目中的交互行为。
在某些实施例中,一种机器人交互行为的控制方法,包括:提供包含多个控制条目的控制条目文档,其中,每个控制条目包含触发条件和该触发条件触发的行为,每个触发条件由至少一个预先定义的感知单元构成;将机器人的感知数据与控制条目进行匹配,以确定是否存在与所述机器人的感知数据匹配的控制条目,其中,机器人的感知数据根据机器人感知到的信息、至少按照预先定义的感知单元生成。
在某些实施例中,一种机器人交互行为的控制装置,包括:获取模块,用于获取机器人感知到的信息;生成模块,用于根据感知到的信息、至少按照预先定义的感知单元生成感知数据,其中,感知数据包括感知单元的标识和取值;查找模块,用于查找与生成的感知数据匹配的控制条目,其中,每个控制条目包含触发条件和该触发条件触发的行为,每个触发条件由至少一个感知单元构成;执行模块,用于当查找到与生成的感知数据匹配的控制条目,使机器人执行查找到的控制条目中的行为。
在某些实施例中,一种机器人交互行为的控制装置,包括:感知模块,用于感知至少一项信息;生成模块,用于根据感知到的信息、至少按照预先定义的感知单元生成感知数据,其中,感知数据包括感知单元的标识和取值;发送模块,用于将生成的感知数据发送出去;接收模块,用于接收与所述感知数据匹配的控制条目的信息,其中,该控制条目包含触发条件和该触发条件触发的行为,触发条件由至少一个感知单元构成;执行模块,用于根据控制条目的信息执行该控制条目中的交互行为。
在某些实施例中,一种机器人交互行为的控制装置,包括:接收模块,用于接收机器人的感知数据,其中,感知数据根据机器人感知到的信息、至少按照预先定义的感知单元生成;查找模块,用于查找与机器人的感知数据匹配的控制条目,其中,每个控制条目包含触发条件和该触发条件触发的行为,每个触发条件由至少一个感知单元构成;执行模块,用于当查找到与机器人的感知数据对应的控制条目,使机器人执行查找到的控制条目中触发条件触发的行为。
在某些实施例中,一种机器人交互行为的控制装置,包括:控制条目文档,用于提供包含多个控制条目的控制条目文档,其中,每个控制条目包含触发条件和该触发条件触发的行为,每个触发条件由至少一个预先定义的感知单元构成;匹配模块,用于将机器人的感知数据与控制条目进行匹配,以确定是否存在与机器人的感知数据匹配的控制条目,其中,感知数据根据机器人感知到的信息、至少按照预先定义的感知单元生成。
上述方法可以由机器人执行,其中所述机器人具有:一个或多个单元,存储器,以及一个或多个保存在存储器中以执行这些方法的模块、程序或指令集。
用于执行上述方法的指令可以包括在被配置为由一个或多个处理器执行的计算机程序产品中。
在本发明实施例中,提出了一种机器人的控制方法、装置及机器人,预先定义了控制机器人交互行为的感知单元,将其作为控制机器人交互行为的最小单元,根据感知单元设置触发条件和触发条件所触发的交互行为,得到控制机器人的控制条目,统一了机 器人控制的输入输出标准,使得非技术人员也可以编辑机器人的行为,从而便于控制机器人的交互行为,有效提高机器人自适应交互行为能力与智能化程度。
附图说明
此处所说明的附图用来提供对本发明的进一步理解,构成本申请的一部分,并不构成对本发明的限定。在附图中:
图1说明根据本发明某些实施例的机器人的结构示意图;
图2说明根据本发明某些实施例的机器人的控制数据的生成方法的流程图;
图3说明根据本发明某些实施例的机器人交互行为的控制方法的流程图一;
图4说明根据本发明某些实施例的机器人交互行为的控制方法的流程图二;
图5说明根据本发明某些实施例的机器人交互行为的控制方法的流程图三;
图6说明根据本发明某些实施例的机器人的控制数据的生成装置的结构框图;
图7说明根据本发明某些实施例的机器人交互行为的控制装置的结构框图一;
图8说明根据本发明某些实施例的机器人交互行为的控制装置的结构框图二;
图9说明根据本发明某些实施例的机器人交互行为的控制装置的结构框图三;以及
图10说明根据本发明某些实施例的机器人交互行为的控制装置的结构框图四。
具体实施方式
现在详细参考附图中描述的实施例。为了全面理解本发明,在以下详细描述中提到了众多具体细节。但是本领域技术人员应该理解,本发明可以无需这些具体细节而实现。在其他实例中,不详细描述公知的方法、过程、组件和电路,以免不必要地使实施例模糊。
图1说明根据本发明某些实施例的机器人的结构示意图。机器人100包括存储器102、存储器控制器104、一个或多个处理单元(CPU)106、外设接口108、射频(RF)电路114、音频电路116、扬声器118、麦克风120、感知子***122、姿态传感器132、摄像机134、触觉传感器136以及一个或多个其他感知装置138,以及外部接口140。这些组件通过一条或多条通信总线或信号线110进行通信。
应当理解,机器人100只是机器人100的一个实例,该机器人100的组件可以比图示具有更多或更少的组件,或具有不同的组件配置。例如,在某些实施例中,机器人100可以包括一个或多个CPU 106、存储器102、一个或多个感知装置(例如如上所述的 感知装置),以及一个或多个保存在存储器102中以执行机器人交互行为控制方法的模块、程序或指令集。图1所示的各种组件可以用硬件、软件或软硬件的组合来实现,包括一个或多个信号处理和/或专用集成电路。
在某些实施例中,机器人100可以是具有生物外形(例如,人形等)的机电设备,还可以是不具有生物外形但具有人类特征(例如,语言交流等)的智能装置,该智能装置可以包括机械装置,也可以包括由软件实现的虚拟装置(例如,虚拟聊天机器人等)。虚拟聊天机器人可以通过其所在的设备感知到信息,其所在的设备包括电子设备,例如手持电子设备、个人计算机等。
存储器102可包括高速随机存取存储器,并且还可包括非易失性存储器,例如一个或多个磁盘存储设备、闪存设备或其他非易失性固态存储设备。在某些实施例中,存储器102还可以包括远离一个或多个CPU 106的存储器,例如经由RF电路114或外部接口140以及通信网络(未示出)访问的网络附加存储器,其中所述通信网络可以是因特网、一个或多个内部网、局域网(LAN)、广域网(WLAN)、存储局域网(SAN)等,或其适当组合。存储器控制器104可控制机器人100的诸如CPU 106和外设接口108之类的其他组件对存储器102的访问。
外设接口108将设备的输入和输出外设耦接到CPU 106和存储器102。上述一个或多个处理器106运行各种存储在存储器102中的软件程序和/或指令集,以便执行机器人100的各种功能,并对数据进行处理。
在某些实施例中,外设接口108、CPU 106以及存储器控制器104可以在单个芯片,例如芯片112上实现。而在某些其他实施例中,它们可能在多个分立芯片上实现。
RF电路114接收并发送电磁波。该RF电路114将电信号变换成电磁波,或是将电磁波变换成电信号,并且经由电磁波来与通信网络以及其他通信设备进行通信。该RF电路114可以包括用于执行这些功能的公知电路,包括但不局限于天线***、RF收发机、一个或多个放大器、调谐器、一个或多个振荡器、数字信号处理器、CODEC芯片组、用户身份模块(SIM)卡、存储器等等。该RF电路112可以通过无线通信来与网络和其他设备进行通信,该网络例如又名万维网(WWW)的因特网、内部网和/或诸如蜂窝电话网络之类的无线网络、无线局域网(LAN)和/或城域网(MAN)。
上述无线通信可以使用多种通信标准、协议和技术中的任何一种,包括但不局限于全球移动通信***(GSM)、增强型数据GSM环境(EDGE)、宽带码分多址(W-CDMA)、码分多址(CDMA)、时分多址(TDMA)、蓝牙、无线保真(Wi-Fi)(例如IEEE 802.11a、 IEEE 802.11b、IEEE802.11g和/或IEEE 802.11n)、基于因特网协议的语音传输(VoIP)、Wi-MAX,用于电子邮件、即时消息传递和/或短消息服务(SMS)的协议,或任何其他合适的通信协议,包括在本文件提交日尚未开发出的通信协议。
音频电路116、扬声器118和麦克风120提供了用户与机器人100之间的音频接口。音频电路116接收来自外设接口108的音频数据,将音频数据变换成电信号,并且将电信号传送到扬声器118。扬声器将电信号变换成人类可听见的声波。音频电路116还接收由麦克风118从声波变换的电信号。该音频电路116将电信号变换成音频数据,并且将音频数据传送到外设接口108,以便进行处理。音频数据可以由外设接口108从存储器102和/或RF电路114中检索出,和/或传送到存储器102和/或RF电路114。
在某些实施例中,可以包括多个麦克风120,多个麦克风120分布可以在不同位置,根据不同位置的麦克风120、按照预定策略确定声音发出的方向。应当理解,也可以通过某些传感器来识别声音方向。
在某些实施例中,音频电路116还包括头戴送受话器插孔(未示出)。该头戴送受话器插孔提供音频电路114与可拆装的音频输入/输出外设之间的接口,举例来说,该音频输入/输出外设既可以是纯输出耳机,也可以是同时具有输出(用于单耳或双耳的耳机)和输入(麦克风)的头戴送受话器。
在某些实施例中,还包括语音识别装置(未示出),用于实现语音到文字的识别,以及根据文字合成语音。语音识别装置可以用硬件、软件或软硬件的组合来实现,包括一个或多个信号处理和/或专用集成电路。音频电路116接收来自外设接口108的音频数据,将音频数据变换成电信号,语音识别装置可以对音频数据进行识别,将音频数据转换为文本数据。语音识别装置还可以根据文字数据合成音频数据,通过音频电路116将音频数据变换成电信号,并且将电信号传送到扬声器118。
感知子***122提供机器人100的感知外设和外设接口108之间的接口,感知外设例如姿态传感器132、摄像机134、触觉传感器136和其他感知装置128。感知子***122包括姿态控制器124、视觉控制器126、触觉控制器128以及一个或多个其他感知装置控制器130。所述一个或多个其他感知装置控制器130接收/发送来自/去往其他感知装置138的电信号。所述其他感知装置138可包括温度传感器、距离传感器、接近觉传感器、气压传感器以及空气质量检测装置等等。
在某些实施例中,机器人100可以具有多个姿态控制器124,以控制机器人100的不同肢体,机器人的肢体可以包括但不限于手臂、足和头部。相应的,机器人100可以 包括多个姿态传感器132。在某些实施方式中,机器人100可以不具备姿态控制器124和姿态传感器132,机器人100可以是固定形态,不具备手臂、足等机械活动部件。在某些实施例中,机器人100的姿态可以不是机械的手臂、足和头部,也可以采用可变形的构造。
机器人100还包括用于为各种组件供电的电源***142。该电源***142可以包括电源管理***、一个或多个电源(例如电池、交流电(AC))、充电***、电源故障检测电路、电源转换器或逆变器、电源状态指示器(例如发光二极管(LED)),以及与便携式设备中的电能生成、管理和分布相关联的其他任何组件。充电***可以是有线充电***,或者也可以是无线充电***。
在某些实施例中,软件组件包括操作***144、通信模块(或指令集)146、交互行为控制装置(或指令集)148以及一个或多个其他装置(或指令集)150。
操作***144(例如Darwin、RTXC、LINUX、UNIX、OS X、WINDOWS或是诸如Vxworks之类的嵌入式操作***)包括用于控制和管理常规***任务(例如内存管理、存储设备控制、电源管理等等)以及有助于各种软硬件组件之间通信的各种软件组件和/或驱动器。
通信模块146有助于经一个或多个外部接口140而与其他设备进行通信,并且它还包括用于处理RF电路114和/或外部接口140接收的数据的各种软件组件。外部接口140(例如通用串行总线(USB)、FIREWIRE等等)适合于直接或者经网络(例如因特网,无线LAN等等)间接耦接到其他设备。
在某些实施例中,机器人100还可以包括显示装置(未示出),显示装置可以包括但不限于触敏显示器、触摸板等。上述一个或多个其他装置150可以包括图形模块(未示出),图形模块包括用于在显示装置上呈现和显示图形的各种已知软件组件。注意术语“图形”包括可以显示给用户的任何对象,包括但不局限于文本、网页、图标(例如包括软按键在内的用户界面对象)、数字图像、视频、动画等等。触敏显示器或触摸板还可以用于用户输入。
机器人100通过例如姿态传感器132、摄像机134、触觉传感器136和其他感知装置128、麦克风120等感知外设感知机器人10的外部环境和机器人本身的状况,机器人100感知到的信息经由感知外设对应控制装置处理,并交由一个或多个CPU 106处理。机器人100对环境的感知包括但不限于自身的传感器(例如姿态传感器132、摄像机134、触觉传感器136和其他感知装置128)检测到的信息,还可以是与机器人100相连 的外部装置(未示出)检测到的信息,机器人100与外部装置之间建立通信连接,机器人100和外部装置通过该通信连接传输数据。外部装置包括各种类型的传感器、智能家居设备等。
在某些实施例中,机器人100感知到的信息包括但不限于声音、图像、环境参数、触觉信息、时间、空间等。环境参数包括但不限于温度、湿度、气体浓度等;触觉信息包括但不限于与机器人100的接触,包括但不限于与触敏显示器的接触、与触觉传感器的接触或靠近,触觉传感器可以设置在机器人的头部、手臂等部位(未示出),应当说明的是还包括其他形式的信息。声音可以包括语音和其他声音,声音可以是麦克风120采集到的声音,也可以是存储器102中存储的声音;语音可以包括但不限于人类说话或唱歌等。图像可以是单张图片或视频,图片和视频包括但不限于由摄像机134拍摄得到,也可以从存储器102中读取或者通过网络传输到机器人100。
机器人100感知的信息不仅包括机器人100外部的信息,还可以包括机器人100自身的信息,包括但不限于机器人100的电量、温度等信息。例如,可以在感知到机器100的电量低于20%时,使机器人100移动到充电位置自动充电。
应当理解,机器人100不限于通过上述的方式感知到信息,还可以通过其他形式感知到信息,包括在本文件提交日尚未开发出的感知技术。此外,机器人100的感知装置也不限于设置在机器人100上的感知装置,还可以包括与机器人100关联而未设置在机器人100上的感知装置,例如各种用于感知信息的传感器。作为一个示例,机器人100可以与设置在一定区域内的温度传感器、湿度传感器(未示出)等关联,通过这些传感器感知到相应的信息。机器人100可以通过多种类型的通信协议与这些传感器通信,以从这些传感器获取信息。
在某些实施例中,可以根据预设的条件设定机器人100感知的信息,这些条件可以包括但不限于设定机器人100感知哪些信息、在什么时间感知信息等。例如,可以设定在于用户语音对话时,感知用户的声音、追踪用户的面部、识别用户的姿态等,而不感知其他信息、或者在生成感知单元时降低其他信息的作用、或者对感知到的其他信息进行处理等;或者,在某一时间段(例如,用户外出、机器人100单独在室内的时间内)感知环境参数、感知图像和视频数据,通过环境参数判断是否需要与空调等设备交互,通过图像和视频数据判断室内是否有陌生人进入等。应当理解,设定感知的信息的条件并不限于此,上述条件仅作为举例说明,可以根据情况设定机器人100需要感知的信息。
关于感知单元
定义至少一个感知单元,感知单元作为控制机器人100的最小单元(或者称为最小输入单元),机器人100至少根据感知单元做出交互行为。机器人100的交互行为可以受到一个或多个感知单元控制,例如,当一个或多个感知单元的取值发生变化时,机器人100可以响应这些变化做出交互行为;或者,当一个或多个感知单元的取值在某一取值范围内或等于某一值时,机器人100可以响应感知单元做出交互行为。应当理解,感知单元对机器人100交互行为的控制不限于上述情况,上述情况仅作为举例说明。
在某些实施例中,感知单元可以包括多个层级,高层级的感知单元可以包含低层级的一个或多个感知单元。在某些实施例中,高层级的感知单元可以包含与其相邻的低层级的一个或多个感知单元,同一高层级的感知单元可以包含不同的低层级的感知单元。在时间上,合成高层级的感知单元的低层级感知单元包括但不限于同一时间或时间段的低层级感知单元,以及该时间或时间段之前的历史的低层级的感知单元。在某些实施例中,高层级的感知单元由不同时间的低层级感知单元确定。
在某些实施例中,感知单元的取值可以是一个或一组值,也可以是一个或多个取值的范围。可以根据机器人100感知到的信息确定感知单元的取值,一个感知单元可以由感知到的一项或多项信息确定,同一感知单元可以由感知到的不同数据来确定。感知到的信息可以包括实时感知到的信息,或者历史感知到的信息(例如过去某一时刻或某段时间感知到的信息)。在某些情况下,感知单元的取值由实时感知到的信息和历史感知到的信息共同确定。
作为一个例子,可以设置听觉(ear)、视觉(eye)、时间(timer)、是否有人在家(so_at_home)以及环境(environment)几个感知单元。听觉描述听到的语音,在机器人100接收到声音时,对接收到的声音进行语音识别处理,识别得到声音中语音的文本,听觉的取值可以是听到的语音的文本;在某些实施例中,视觉还可以包括声音的方向,声音的方向以机器人的面部为参考,包括左、右、前、后等方向。视觉描述视频监控情况,机器人100可以对图像或视频进行分析,判断当前是否有人或者是否有移动,视觉的取值可以包括是否有人、是否有移动等等。是否有人在家的取值可以是“0”或“1”,“0”表示没有人在家,“1”表示有人在家。时间描述时间信息,其取值可以是一个时间点或者一个时间范围,例如每年2月1日14点整。环境描述环境情况,包括温度、湿度、噪音、PM2.5、空气中的燃气的ppm、空气中的一氧化碳含量、空气中的氧气含量等,其取值可以是每种参数的值或者范围。
在某些实施例中,可以预定义感知单元的取值。预定义的感知单元的取值可以是一个或多个具体值、或者一个或多个取值范围。感知单元的取值可以是明确的值,也可以由通配符(或其类似)与明确的值共同构成,但不限于此。例如,感知单元为“语音”时,其取值可以是“*下雨*”,表示任意包含“下雨”的语音信息;或者其取值可以是“*[下有]雨*”,表示任意包含“下雨”或“有雨”的语音信息。
机器人100可以根据感知单元和感知到的信息生成感知数据,感知数据可以包括一项或多项感知单元,感知数据中包括感知单元的标识和取值。感知数据中每个感知单元的取值参见对感知单元的描述。机器人100根据感知到的信息、按照感知单元生成感知数据,可以采用多种分析方法根据感知到的信息得到感知单元的取值,例如,通过语音识别技术得到语音的文本、通过图像识别技术分析感知到的图像中是否存在人像、通过人像(面部)识别技术确定人像的属性等。应当理解,机器人100不限于通过上述的方式得到感知单元的取值,还可以通过其他方式,包括在本文件提交日尚未开发出的处理技术。
作为一个非限制性示例,可预先设定多个感知单元,应当理解下述示例性感知单元的设置不是对感知单元的划分、或感知单元的数量、或感知单元的表达的限定,实际上任何感知单元的划分都是可以被考虑的。感知单元的示例如表1所示。
表1 感知单元示例表
Figure PCTCN2016087258-appb-000001
Figure PCTCN2016087258-appb-000002
以表1中感知单元的定义,以下给出了一个示例性感知数据,应当理解,以下感知数据并不是对感知数据的元素个数、或感知数据的定义、或感知数据的格式、或感知数据的表达方式的限定。一个示例情况的JSON感知数据表示如下,但不限于此,其他方式也是可行的。
Figure PCTCN2016087258-appb-000003
Figure PCTCN2016087258-appb-000004
在该示例在感知数据中,“vision_human_position”记录了人类用户在相对于机器装置的后面(“back”),“back”也可以用其他字符表示,能区分开不同的位置即可,应当理解位置也可以用“角度值”表示,例如“vision_human_position”:“45°”等。“sensing_touch”记录了人类用户在机器装置上的触摸,触摸的位置为手(“hand”),“hand”也可以用其他字符表示,能区分开不同的位置即可,应当理解触摸位置可以由多个,“sensing_touch”的值可以为数组,其记录多个位置。“audio_speak_txt”记录了人类用户所说的内容“很高兴见到你”,所说的内容也可以为音频数据。“audio_speak_language”记录了人类用户所说的语种“chinese”。“vision_human_posture”记录了人类用户的姿势“posture1”,“posture1”也可以用其他字符表示,能区分开不同的姿势即可。“system_date”记录了感知数据产生的日期“2016/3/16”,“system_time”记录了感知数据产生的时间“13-00-00”。“system_power”记录了机器装置的电量“80%”,应当理解,电量还可以按照其他方式标识。
关于控制条目
基于预先定义的感知单元和预设的供机器人执行的交互行为,可以设置触发条件以及触发条件触发的交互行为。根据触发条件和触发条件触发的交互行为,生成用于响应机器人感知到的信息控制机器人交互行为的控制条目。控制条目可以具有唯一标识,以区分控制条目。
触发条件可以由一个或多个感知单元构成,感知单元之间可以配置逻辑关系,逻辑关系包括但不限于“与”、“或”以及“非”等。在某些实施例中,触发条件可以包括构成触发条件的感知单元的标识和取值,感知单元的取值可以是一个或一组值或者一个或一组取值范围。感知单元的取值可以是明确的值,也可以由通配符(或其类似)与明确的值共同构成,但不限于此。例如,触发条件中感知单元为“语音”时,其取值可以是“*下雨*”,表示任意包含“下雨”的语音信息;或者其取值可以是“*[下有]雨*”,表示任意包含“下雨”或“有雨”的语音信息。
触发条件可以触发的一个或多个交互行为。在某些实施例中,可以设置交互行为之间的顺序,以按照设置的顺序执行多个交互行为。交互行为可以被配置为一个或多个可被机器人解析以执行的动作指令,动作指令还可以包括一个或多个参数。在某些实施例中,还可以配置所述一个或多个动作指令的执行顺序。该执行顺序可以包括但不限于随机执行一个或一组动作指令,以实现随机执行一个或多个动作;或者,按照预定步骤顺序执行多个动作指令。
机器人100的操作***144及其他相关装置,可以解析交互行为的动作指令,使得机器人执行交互行为。例如,为了使机器人向前移动5米,动作指令可以是“move{“m”:5}”。机器人交互行为的控制装置148解析该动作指令,得到要执行的任务(移动)和任务参数(向前5米),向操作***144传递任务和参数,操作***144进一步处理使得移动装置(未示出)执行移动,移动装置可以包括足式、轮式以及履带式等。应当理解,也可以设置具体指令,比如移动装置的各个电机(或者类似部件)的参数。
在某些实施例中,交互行为的动作指令包括:用于执行其他控制条目而设置的到其他控制条目的链接,和/或用于从多个内容和/或多个参数中选取内容和/或参数而设置的到多个参数和/或多个内容的链接。每个控制条目可以具有唯一标识,动作指令可以引用控制条目的标识连接到该控制条目。动作指令链接到的内容可以是一组动作,机器人100可以根据其他因素执行一组动作中的动作,例如,可以预先配置机器人100的性格或性别等属性,这些属性可以存储在存储器102中,不同性别或者性格的机器人100对同一情况(或称为场景)的交互行为可以不同,机器人100可以根据设置的性格或性别等属性从一组动作中选择执行的动作,这些动作可以包括但不限于机器人100的肢体动作等。动作指令还可以链接到一个或一组内容,该内容可以包括但不限于语音聊天的内容、各种互联网信息等,例如,机器人100根据控制条目执行的动作为查询北京的天气,动作指令可以是一个查询天气的地址,机器人100到这一地址获取北京的天气,这一地址可以包括统一资源定位符(URL)、内存地址、数据库字段等。
机器人100的交互行为包括但不限于通过输出语音、调整姿态、输出图像或视频、与其他设备进行交互等。输出语音包括但不限于与用户聊天、播放音乐;调整姿态包括但不限于移动(例如,模仿人类步行等)、肢体摆动(例如,手臂、头部的摆动)、神态调整等;输出图像或视频包括但不限于在显示装置上显示图像或视频,图像可以是动态电子表情等,也可以是拍摄得到的图像,或者从网络中获取到的图像;与其他设备交 互包括但不限于控制其他设备(例如调整空调设备的工作参数等)、向其他设备传输数据、与其他设备建立连接等。应当理解,交互行为并不限于上述列举的内容,机器人100对感知到的信息的反应均可被视为机器人100的交互行为。
控制条目可以采用数据交换格式配置,当然也可以采用其他格式配置。数据交换格式包括但不限于XML、JSON或者YAML等。以JSON为例,需要实现:当用户说:“给我唱一首歌”,先往以中等速度0角度后退10cm然后开始唱一首歌,唱完歌以后10秒拍个照片发送给用户,然后0角度前行5CM。JSON数据格式的控制条目可以是如下内容:
Figure PCTCN2016087258-appb-000005
在上述的控制条目中,“ifs”部分为根据感知单元设置的触发条件,“ear”为感知单元的标识,“唱歌”为感知单元的取值。"trigger"部分为触发条件触发的交互行为,包括“move(移动)”、“song(唱歌)”和“take_pic(拍照)”三个交互行为,每个交互行为包括相应的动作指令。其中,“song(唱歌)”链接到“http://bpeer.com/i.mp3”,唱歌的内容从“http://bpeer.com/i.mp3”中获取,“gr”为动作的执行顺序。
在某些实施例中,多个控制条目可以存储为数据交换格式的文档,或者也可以存储在数据库中。在某些实施例中,机器人100还可以包括数据库***,该数据库***用以存储控制条目。数据库***提供接口供一个或多个CPU 106从数据库中读取数据,以及向数据库***写入数据。
机器人交互行为的控制装置148可以根据控制条目控制机器人的交互行为,控制装置148获取机器人感知到的信息,根据感知到的信息、至少按照预先定义的感知单元生成感知数据,其中,感知数据包括感知单元的标识和取值;查找与生成的感知数据匹配的控制条目;如果查找到与生成的感知数据匹配的控制条目,使机器人执行查找到的控制条目中的交互行为。
在某些实施例中,控制装置148也可以将机器人100感知到的信息发送出去,由远端服务器(未示出)根据感知到的信息和感知单元生成感知数据,并查找与生成的感知单元匹配的控制条目,然后将查找到的控制条目发送给控制装置148,控制装置148使机器人执行控制条目中的交互行为。可选地,可以生成感知到的信息的标识,以确定接收到的控制条目是否为针对发送的感知到的信息的控制条目。可选地,发送给控制装置148的可以是控制条目本身,也可以是控制条目的标识,或者控制条目配置的交互行为数据,或者其他使控制装置148确定控制条目配置的交互行为的信息。
在某些实施例中,控制装置148可以根据机器人100感知到的信息和感知单元生成感知数据,将生成的感知数据发送至远端服务器,远端服务器接收感知数据,查找与感知数据匹配的控制条目,将查找到的控制条目发送至机器人100,控制装置148使机器人100执行控制条目中的交互行为。
应当理解,控制装置148并不限于通过如上所述的方式控制机器人的交互行为,还可以是以上几种方式的组合或者其他方式。
作为一个非限制性示例,行为的指令和行为控制参数可用JSON语言编写,但不限于此,其他方式也是可行的。非限制性的可包括:
1、让机器人说话
行为名称为:audio_speak;
行为控制参数可包括:text(要说的内容)、volume(说话的音量)等(例如,发声性别、或发声年龄等)
JSON表示如下:
“audio_speak”:{“text”:“你好吗”,“volume”:“50%”}
非限制性地,“text”可包括转换字符,转换字符与参数对应。例如,“主人”的转换字符可被定义为“@master”,作为一个例子,包含转换字符的JSON表示如下:
“audio_speak”:{“text”:“你好,@master”,“volume”:“50%”}
在执行“audio_speak”时,可将“@master”替换成“主人”的姓名。
另外,在上述示例中“volume”被设置为百分比,机器人可根据“volume”的百分比值计算得到机器人的具体参数。作为另一个示例,“volume”也可以被表示为机器人的具体参数。
2、让机器人播放音乐
行为名称为:audio_sound_music;
行为控制参数可包括:path(要播放音乐的路径、或文件名等)、volume(播放音乐的音量)等
JSON表示如下:
“audio_sound_music”:{“path”:“http//bpeer.com/happy.mp3”,“volume”:“50%”}
3、让机器人播放提示音
行为名称为:audio_sound_info;
行为控制参数包括:name(要播放的提示音的名称)、volume(播放音量)等
JSON表示如下:
“audio_sound_info”:{“name”:“warning”,“volume”:“normal”}
4、让机器人的头部运动
行为名称为:motion_head;
行为控制参数可包括:motor(执行运动的电机)、velocity(电机运动速度),angle(电机运动角度)等
JSON表示如下:
“motion_head”:{“motor”:“1”,“velocity”:“1”,“angle”:“45”}
在上述示例中,“velocity”被表示为档位,机器人可根据该档位计算得到具体的“velocity”。实际上,“velocity”也可表示为机器人头部运动的具体参数。
另外,上述示例中,“angle”被表示为电机的角度,实际上,“angle”可被表示为百分比等相对数据,例如,“angle”:“50%”,机器人可根据角度范围确定具体的参数,例如最大角度为180度,那么计算得到具体角度为90度,但不限于此。
5、让机器人的脖子运动
行为名称为:motion_neck;
行为控制参数可包括:motor(执行运动的电机)、velocity(电机运动速度),angle(电机运动角度)等
JSON表示如下:
“motion_neck”:{“motor”:“1”,“velocity”:“2”,“angle”:“60”}
6、让机器人的肩膀运动
行为名称为:motion_shoulder;
行为控制参数可包括:motor(执行运动的电机)、velocity(电机运动速度),angle(电机运动角度)等
JSON表示如下:
“motion_shoulder”:{“motor”:“1”,“velocity”:“3”,“angle”:“60”}
7、让机器人的肘部运动
行为名称为:motion_elbow;
行为控制参数可包括:motor(执行运动的电机)、velocity(电机运动速度),angle(电机运动角度)等
JSON表示如下:
“motion_elbow”:{“motor”:“1”,“velocity”:“2”,“angle”:“50”}
8、让机器人的腕部运动
行为名称为:motion_wrist;
行为控制参数可包括:motor(执行运动的电机)、velocity(电机运动速度),angle(电机运动角度)等
JSON表示如下:
“motion_wrist”:{“motor”:“1”,“velocity”:“2”,“angle”:“50”}
9、让机器人的腰部运动
行为名称为:motion_waist;
行为控制参数可包括:motor(执行运动的电机)、velocity(电机运动速度),angle(电机运动角度)等
JSON表示如下:
“motion_waist”:{“motor”:“1”,“velocity”:“2”,“angle”:“50”}
10、让机器人的眼睛运动
行为名称为:motion_eye;
行为控制参数可包括:motor(执行运动的电机)、velocity(电机运动速度),angle(电机运动角度)等
JSON表示如下:
“motion_eye”:{“motor”:“1”,“velocity”:“2”,“angle”:“50”}
11、让机器人显示表情
行为名称为:display_emotion;
行为控制参数可包括:content(有显示的表情内容)、velocity(显示速度)等
JSON表示如下:
“display_emotion”:{“content”:“happy”,“velocity”:“3”}
12、让机器人拍照
行为名称为:program_photo;
行为控制参数可包括:flash(是否打开闪光灯)等
JSON表示如下:
“program_photo”:{“flash”:“1”}
13、让机器人控制电视
行为名称为:control_tv;
行为控制参数可包括:state(例如open、close)等
JSON表示如下:
“control_tv”:{“state”:“open”}
14、让机器人控制LED灯
行为名称为:control_led;
行为控制参数可包括:state(例如open、close)、color等
JSON表示如下:
“control_led”:{“state”:“open”,“color”:“yellow”}
关于生成控制条目
在某些实施例中,可以根据感知单元设置触发条件,以及该触发条件触发的交互行为,得到控制条目,并将控制条目作为控制机器人100交互行为的数据。
图2说明根据本发明某些实施例的机器人的控制数据的生成方法的流程图,如图2所示,该方法包括:
步骤S202,根据一个或多个预设的感知单元设置用于控制机器人交互行为的触发条件;
步骤S204,根据一个或多个预设的被设置为供机器人执行的交互行为,设置所述触发条件触发的交互行为;
步骤S206,根据设置的触发条件和交互行为生成用于响应机器人感知到的信息来控制机器人交互行为的控制条目。
在上述步骤S202中,可以从预设的感知单元中选取至少一个感知单元;设置选取的感知单元的属性,其中,感知单元的属性包括感知单元的取值;根据选取的感知单元及感知单元的属性设置用于控制机器人交互行为的触发条件。在某些实施例中,还设置多个感知单元之间的关系,感知单元之间的关系包括但不限于“与”、“或”、“非”等逻辑关系;上述步骤S202,可以根据选择的感知单元及感知单元的属性、以及感知单元之间的关系设置触发条件。
上述步骤S204中,可以从预设的被设置为供机器人执行的交互行为中选取至少一个交互行为;设置选取的交互行为的属性,其中,交互行为的属性包括交互行为的一个或多个可被机器人解析以执行的动作指令以及动作指令的参数;根据选取的交互行为及交互行为的属性设置触发条件触发的交互行为。在某些实施例中,还可以设置多个交互行为的执行顺序,交互行为的执行顺序包括但不限于随机执行一个或多个交互行为,或者按预定步骤执行多个交互行为。上述步骤S204,可以根据选取的交互行为及交互行为的属性、以及上述执行顺序设置触发条件触发的交互行为。
在某些实施例中,按照预定的数据表达方式来描述触发条件和触发条件触发的交互行为。可选地,可以使用数据交换格式根据触发条件和触发条件触发的交互行为生成控制条目。数据交换格式包括但不限于以下之一或任意组合:XML、JSON或者YAML。应当理解,还可以采用其他格式生成触发条件和触发条件触发的交互行为,包括本文件提交日尚未开发出的数据表达方式。
在某些实施例中,可以设置多个控制条目,并将多个控制条目存储为数据交换格式的文档。或者,多个控制条目也可以存储在数据库中。将多个控制条目存储为数据交换格式的文档时,相邻的控制条目之间可以用预定符号分隔,以区分不同的控制条目。存储控制条目的文档可以存储在机器人100的存储器102中,控制条目的文档也可以存储在远端服务器中。
交互行为被配置为一个或多个动作指令。上述动作指令包括:用于执行其他控制条目而设置的到其他控制条目的链接,和/或用于从多个内容和/或多个参数中选取内容和/或参数而设置的到多个参数和/或多个内容的链接。例如,对于“查询天气”的动作指令,可以链接到提供天气信息的网页,从网页中获取要查询的城市的天气信息。查询到天气信息后,可以显示在机器人100的显示装置上,或者也可以通过语音播报天气信息。在某些实施例中,链接到一组动作的参数时,可以根据其他配置选择执行的动作的 参数;同样,链接到多个内容时(例如聊天的多个语料),也可以根据其他配置选择呈现的内容。
还可以设置动作指令的执行顺序,其中,执行顺序包括:随机执行一个或多个动作指令,或者按预定步骤执行多个动作指令。执行顺序可以用符号进行标记,如果没有标记,可以按照描述动作的先后顺序。同一类型的动作可以作为一个整体,动作之间的先后顺序可以进行标记,例如“先向前移动5米,点头5次,然后向后退10米”,动作指令可以表达为[move:{gr:0,m:+5;gr:2,m:-10};head{gr:1,head:5}],“gr”表示动作的执行先后顺序,取值小的动作先执行。
在某些实施例中,可以提供用于设置触发条件和交互行为的图形用户界面(GUI),图形用户界面提供设置的感知单元(例如,感知单元的名称、标识等)、可以设置的感知单元的取值、感知单元之间的逻辑关系,设置触发条件的用户可以选择感知单元、感知单元的取值以及感知单元的逻辑关系,选择设置触发条件的感知单元后,按照相应的格式生成触发条件。图形用户界面还可以提供设置的交互行为,可以是预先定义好的交互行为,在选择完交互行为之后,按照相应的格式生成交互行为。在某些实施例中,也可以直接编辑触发条件和交互行为,例如按照上述的数据交换格式、使用预先定义的感知单元以及交互行为的动作指令规范,编辑触发条件和触发条件触发的交互行为,得到控制条目。
在某些实施例中,可以从互联网抓取内容(例如网页等),对抓取的内容进行分析,得到用于设置控制条目的内容,根据这些内容设置触发条件和触发条件触发的交互行为。例如,从互联网中抓取到生病时拨打急救电话,可以根据感知单元设置“生病”的触发条件,并将该触发条件触发的交互行为设置为“拨打急救电话”。如果预先定义了“健康状况”这一感知单元,可以直接将感知单元的值设置为“生病”,构成的触发条件可以为{if(“health”:“sick”)}。机器人100可以根据感知到的数据判断用户的健康状况,确定健康状况是否为“生病”,例如,与用户进行语音聊天以了解用户的状态,以及检测用户的心率、体温等。在健康状况为“生病”时,机器人100生成的感知数据中包括可以{“health”:“sick”}。
关于使用控制条目控制机器人的交互行为
将控制条目作为控制机器人交互行为的数据之后,可以根据控制条目来控制机器人的交互行为。
图3说明根据本发明某些实施例的机器人交互行为的控制方法的流程图一,如图3所示,该方法包括:
步骤S302,获取机器人感知到的数据;
步骤S304,根据感知到的信息、至少按照预先定义的感知单元生成感知数据,其中,感知数据包括感知单元的标识和取值;
步骤S306,在存储的多个控制条目中查找与生成的感知数据匹配的控制条目;
步骤S308,如果查找到与生成的感知数据匹配的控制条目,使机器人执行查找到的控制条目中的交互行为。
在某些实施例中,机器人100通过网络与远端服务器(未示出)通信,机器人100感知至少一项数据,远端服务器从机器人100获取机器人感知到的信息,该获取包括远端服务器请求机器人100发送其感知到的信息,或者机器人感知到信息后,向远端服务器发送机器人100感知到的信息。机器人100可以周期性向远端服务器发送感知到的信息,或者在感知到的信息发生变化时向远端服务器发送感知到的信息,以降低远端服务器与机器人100之间的数据传输量。
控制条目文档可以存储在远端服务器中,远端服务器包括一个或多个处理器,以及一个或多个保存在存储器中以执行图3所示的方法的模块、程序或指令集。远端服务器可以是单一的服务器,也可以是由多个服务器组成的服务器集群。应当理解,上述的程序或指令集并不局限于在一台服务器上运行,也可以在分布式的计算资源上运行。
在某些实施例中,可以将查找到的控制条目发送给机器人100,机器人100从控制条目中读取交互行为,并执行交互行为。或者,可以将查找到的控制条目中的交互行为的数据发送给机器人100。或者,也可以对控制条目中的交互行为的数据进行解析,得到机器人100可以执行的指令,将得到的指令发送至机器人100,机器人100执行该指令。应当理解,上述方式仅为举例说明。
图4说明根据本发明某些实施例的机器人交互行为的控制方法的流程图二,如图4所示,该方法包括:
步骤S402,接收机器人的感知数据,其中,感知数据根据机器人感知到的信息、至少按照预先定义的感知单元生成,感知数据包括感知单元的标识和取值;
步骤S404,在存储的多个控制条目中查找与机器人的感知数据匹配的控制条目;
步骤S406,如果查找到与机器人的感知数据匹配的控制条目,使机器人执行查找到的控制条目中的交互行为。
如图4所示,机器人100感知至少一项信息,并根据感知到的信息和感知单元生成感知数据,将感知数据发送出去。在某些实施例中,机器人100将感知数据发送至远端服务器(未示出)。机器人100可以在生成感知数据之后发送感知数据,也可以在接收到远端服务器的请求后发送感知数据。
在某些实施例中,远端服务器中存储控制条目的文档,例如,数据交换格式的文档,或者数据库等。当然,控制条目文档可以分布式存储在多个存储空间。远端服务器可以包括一个或多个处理器,以及一个或多个保存在存储器中以执行图3所示的方法的模块、程序或指令集。
图5说明根据本发明某些实施例的机器人交互行为的控制方法的流程图三,如图5所示,该方法包括:
步骤S502,感知至少一项信息;
步骤S504,根据感知到的信息、至少按照所述预先定义的感知单元生成感知数据,其中,感知数据包括感知单元的标识和取值;
步骤S506,将生成的感知数据发送出去;
步骤S508,接收与感知数据匹配的控制条目的信息;
步骤S510,根据控制条目的信息执行该控制条目配置的交互行为。
在某些实施例中,机器人100的交互行为控制装置148执行如图5所示的方法。机器人100感知至少一项信息,按照感知数据生成策略,根据感知单元生成感知数据。机器人100生成感知数据之后,将感知数据发送至远端服务器。远端服务器中存储控制条目的文档,在存储多个控制条目中查找与机器人的感知数据匹配的控制条目,如果查找到与机器人的感知数据匹配的控制条目,将该控制条目发送至机器人100。在某些实施例中,可以将控制条目中的交互行为的动作指令发送至机器人100。
在某些实施例中,将生成的感知数据发送出去之前,还可以确定生成的感知数据的标识。确定生成的感知数据的标识后,将生成的感知数据及其标识发送出去。远端服务器在查找到与生成的感知数据匹配的控制条目后,将控制条目的信息和对应的感知数据的标识发送至控制装置148,控制条目的信息可以是控制条目本身、控制条目的标识、控制条条目配置的行为及其任意组合,但不限于此。控制装置接收控制条目的信息,并根据控制条目的信息中携带的感知数据的标识,判断接收到的控制条目的信息是否为与生成的感知数据匹配的控制条目的信息。
控制装置148可以根据控制条目的标识确定对应的控制条目,并执行控制条目中的交互行为。或者,控制装置148可以直接从远端服务器发送的控制条目中读取控制条目配置的交互行为,执行该交互行为。再者,如果远端服务器发送的是控制条目中配置的交互行为,控制装置148可以直接解析并执行该交互行为。
在某些实施例中,可以将机器人的感知数据与控制条目中的触发条件进行匹配,所述的匹配包括但不限于判断是否存在某一感知单元、比较感知单元的取值。
在某些实施例中,当查找到多个与机器人的感知数据项匹配的触发条件时,可以确定机器人的感知数据与匹配到的多个触发条件的匹配程度,至少根据匹配程度选择与生成的感知数据匹配的控制条目。作为一个例子,对于感知数据中的语音文本,可以但不限于采用编辑距离确定匹配程度,编辑距离的取值越小两个文本越相似。语音文本还可以采用正则表达式来匹配。
在某些实施例中,还可以设置控制条目的优先级,在选择控制条目时可以参考控制条目的优先级。例如,可以将控制条目分类为核心控制条目、用户控制条目以及临时控制条目,核心控制条目为优先级最高的控制条目,其次是用户控制条目,最后是临时控制条目。在查找控制条目时,可以先从核心控制条目中查找与感知数据匹配的控制条目。如果在核心控制条目中未查找到与感知数据匹配的控制条目,可以在用户控制条目中查找与感知数据匹配的控制条目。如果在用户控制条目中未查找到与感知数据匹配的控制条目,可以在临时训练库中查找与感知数据匹配的控制条目。
在某些实施例中,机器人100可以感知至少一项信息,根据感知到的信息和感知单元生成感知数据,并读取控制条目(包括但不限于从机器人100的存储器102读取),查找与生成的感知数据匹配的控制条目,如果查找到与生成的感知数据匹配的控制条目,机器人100执行查找到的控制条目中的交互行为。
在某些实施例中,控制条目的文档可以存储机器人100的存储器102和远端服务器中。机器人100感知至少一项信息,根据感知到的信息和感知单元生成感知数据,从存储器102中读取控制条目,在读取的控制条目中查找与生成的感知数据匹配的控制条目。如果查找到与生成的感知数据匹配的控制条目,机器人100执行查找到的控制条目中的交互行为;如果在读取的控制条目中没有查找到与生成的感知数据匹配的控制条目,机器人100可以将生成的感知数据发送到远端服务器,远端服务器在存储的控制条目中查找与接收到的感知数据匹配的控制条目,如果查找到与接收到的感知数据匹配的控制条目,使机器人100执行该控制条目中的交互行为。远端服务器还可以将查找到的 控制条目发送至机器人100,机器人100可以通过接口(未示出)接收控制条目,并存储接收到的控制条目。
如上所述,当查找到与感知数据匹配的控制条目时,使机器人100执行控制条目中的交互行为。当未查找到与感知数据匹配的控制条目时,可以不作交互行为,机器人100可以继续感知至少一项信息,感知何种信息可以根据预设的条件确定。在某些实施例中,当未查找到与感知数据匹配的控制条目时,可以进行语音回复或者导入互联网中的内容(例如,展示网页信息等)。当未查找到与感知数据匹配的控制条目时,可以判断感知数据是否与语音有关(例如,是否接收到用户的语音指令等),如果确定感知数据与语音有关,可以进行语音回复,或者根据语音内容在互联网中搜索相关内容,在机器人100的显示装置中呈现给用户。
在某些实施例中,可以根据机器人与用户的交互行为设置控制条目。当未查找到与机器人100的感知数据匹配的控制条目时,机器人100可以与用户进行语音聊天,在聊天过程中,机器人100分析用户的需求和意图,得到情景和该情境下机器人的交互行为,根据情景、机器人的交互行为、按照感知单元生成控制条目。例如,用户生病时,对机器人说“我生病了”,机器人100的控制条目中没有在用户生病时的交互行为,此时机器人100可以和用户进行语音交互,比如询问用户“我不清楚需要做什么”,用户可以说“帮我拨打我的私人医生的电话吧,电话号码是….”,机器人100可以拨打电话。此外,在这种情况下,机器人100分析得出用户“生病”时需要联系医生,根据分析得出的结果,机器人100可以生成控制条目,例如,触发条件为[if(health:sick)],触发条件触发的交互行为为[call{number:“//doctor_number.php”]。
下面对某些实施例的机器人的控制数据的生成装置的结构进行描述。由于机器人的控制数据的生成装置解决问题的原理与机器人交互行为的控制方法相似,因此机器人的控制数据的生成装置的实施可以参见机器人的控制数据的生成方法的实施,重复之处不再赘述。以下所使用的,术语“单元”或者“模块”可以实现预定功能的软件和/或硬件的组合。尽管以下实施例所描述的装置较佳地以软件来实现,但是硬件,或者软件和硬件的组合的实现也是可能并被构想的。
图6说明根据本发明某些实施例的机器人的控制数据的生成装置的结构框图,如图6所示,该装置包括:
触发条件设置模块602,用于根据一个或多个预设的感知单元设置用于控制机器人交互行为的触发条件,其中,感知单元被设置为控制机器人交互行为的最小单元;
交互行为设置模块604,与触发条件设置模块602相连,用于根据一个或多个预设的被设置为供机器人执行的交互行为,设置所述触发条件触发的交互行为;
生成模块606,与交互行为设置模块604相连,用于根据设置的触发条件和交互行为生成用于响应机器人感知到的信息来控制机器人交互行为的控制条目。
图7说明根据本发明某些实施例的机器人交互行为的控制装置的结构框图一,如图7所示,该装置包括:获取模块702,用于获取机器人感知到的信息;生成模块704,与获取模块702相连,用于根据感知到的信息、至少按照预先定义的感知单元生成感知数据;查找模块706,与生成模块704相连,用于查找与生成的感知数据匹配的控制条目;执行模块708,与查找模块706相连,用于当查找到与生成的感知数据匹配的控制条目时,使机器人执行查找到的控制条目中的行为。
图8说明根据本发明某些实施例的机器人交互行为的控制装置的结构框图二,如图8所示,该装置包括:感知模块802,用于感知至少一项信息;生成模块804,与感知模块802相连,用于根据感知到的信息、至少按照预先定义的感知单元生成感知数据;发送模块806,与生成模块804相连,用于将生成的感知数据发送出去;接收模块808,与发送模块806相连,用于接收与感知数据匹配的控制条目的信息;执行模块810,与接收模块808相连,用于根据所述控制条目的信息执行该控制条目中的交互行为。
图9说明根据本发明某些实施例的机器人交互行为的控制装置的结构框图三,如图9所示,该装置包括:接收模块902,用于接收机器人的感知数据;查找模块904,与接收模块902相连,用于查找与所述机器人的感知数据匹配的控制条目;执行模块906,与查找模块904相连,用于当查找到与所述机器人的感知数据对应的控制条目,使所述机器人执行查找到的控制条目中触发条件触发的行为。
图10说明根据本发明某些实施例的机器人交互行为的控制装置的结构框图四,如图10所示,该装置包括:控制条目文档1002,用于提供包含多个控制条目的控制条目文档,其中,每个控制条目包含触发条件和该触发条件触发的行为,每个触发条件由至少一个预先定义的感知单元构成;
匹配模块1004,与控制条目文档1002相连,用于将机器人的感知数据与控制条目进行匹配,以确定是否存在与所述机器人的感知数据匹配的控制条目,其中,所述感知数据根据机器人感知到的信息、至少按照预先定义的感知单元生成。
显然,本领域的技术人员应该明白,上述的本发明实施例的各模块或各步骤可以用通用的计算装置来实现,它们可以集中在单个的计算装置上,或者分布在多个计算装置 所组成的网络上,可选地,它们可以用计算装置可执行的程序代码来实现,从而,可以将它们存储在存储装置中由计算装置来执行,并且在某些情况下,可以以不同于此处的顺序执行所示出或描述的步骤,或者将它们分别制作成各个集成电路模块,或者将它们中的多个模块或步骤制作成单个集成电路模块来实现。这样,本发明实施例不限制于任何特定的硬件和软件结合。
以上所述仅为本发明的优选实施例而已,并不用于限制本发明,对于本领域的技术人员来说,本发明实施例可以有各种更改和变化。凡在本发明的精神和原则之内,所作的任何修改、等同替换、改进等,均应包含在本发明的保护范围之内。

Claims (40)

  1. 一种机器人交互行为的控制方法,其特征在于,包括:
    获取机器人感知到的信息;
    根据所述感知到的信息、至少按照预先定义的第一感知单元生成包含感知单元的标识和取值的感知数据;
    查找与生成的感知数据匹配的用于响应机器人感知到的信息来控制机器人的行为的控制条目,其中,所述控制条目包含由与所述第一感知单元相应的第二感知单元构成的触发条件和该触发条件触发的行为;以及
    如果查找到与所述生成的感知数据匹配的控制条目,使所述机器人执行查找到的控制条目中的行为。
  2. 根据权利要求1所述的方法,其特征在于,所述查找与生成的感知数据匹配的控制条目之前,还包括:
    从控制条目文档中读取控制条目,其中,所述控制条目文档记录有多个控制条目。
  3. 根据权利要求2所述的方法,其特征在于,所述控制条目文档为数据库或数据交换格式的文档。
  4. 根据权利要求1所述的方法,其特征在于,控制条目还指示行为的执行逻辑,其中,所述执行逻辑包括:随机执行一个或多个行为,或者按预定步骤执行多个行为。
  5. 根据权利要求1所述的方法,其特征在于,控制条目的触发条件触发的行为被配置为一个或多个动作指令及动作指令的参数。
  6. 根据权利要求5所述的方法,其特征在于,所述动作指令的参数包括:用于执行其他控制条目而设置的到其他控制条目的链接,和/或用于从多个内容和/或多个参数中选取内容和/或参数而设置的到多个参数和/或多个内容的链接。
  7. 根据权利要求1所述的方法,其特征在于,所述查找与生成的感知数据匹配的用于响应机器人感知到的信息来控制机器人的行为的控制条目,包括:
    将所述生成的感知数据与用于响应机器人感知到的信息来控制机器人的行为的控制条目的触发条件进行匹配,其中,将相匹配的触发条件对应的控制条目作为与所述生成的感知数据匹配的控制条目。
  8. 根据权利要求7所述的方法,其特征在于,所述查找与生成的感知数据匹配的控制条目,还包括:
    当查找到多个与所述生成的感知数据项匹配的触发条件时,确定所述生成的感知数据与匹配到的多个触发条件的匹配程度;
    至少根据所述匹配程度选择与所述生成的感知数据匹配的控制条目。
  9. 根据权利要求1所述的方法,其特征在于,所述根据所述感知到的信息、至少按照预先定义的第一感知单元生成包含感知单元的标识和取值的感知数据,包括:
    至少根据预先定义的第一感知单元分析感知到的一项信息和/或多项信息的组合,得到各个第一感知单元的取值;
    根据所述各个第一感知单元的取值生成感知数据。
  10. 根据权利要求1所述的方法,其特征在于,所述感知到的信息包括:实时感知的信息和/或历史感知的信息。
  11. 根据权利要求1至10中任一项所述的方法,其特征在于,所述机器人为室内机器人。
  12. 根据权利要求1至10中任一项所述的方法,其特征在于,所述感知单元具有一个或多个预设值。
  13. 一种机器人交互行为的控制方法,其特征在于,包括:
    感知至少一项信息;
    根据感知到的信息、至少按照预先定义的第一感知单元生成包含第一感知单元的标识和取值的感知数据;
    将生成的感知数据发送出去;
    接收与所述生成的感知数据匹配的控制条目的信息,其中,控制条目用于响应机器人感知到的信息来控制机器人的行为,控制条目包含与所述第一感知单元相应的第二感知单元构成的触发条件和该触发条件触发的行为;
    根据所述控制条目的信息执行接收到的控制条目中的行为。
  14. 根据权利要求13所述的方法,其特征在于,所述根据感知到的信息、至少按照预先定义的第一感知单元生成包含第一感知单元的标识和取值的感知数据,包括:
    至少根据预先定义的第一感知单元分析感知到的一项信息和/或多项信息的组合,得到各个第一感知单元的取值;
    根据所述各个第一感知单元的取值生成感知数据。
  15. 根据权利要求13所述的方法,其特征在于,所述感知的数据包括:实时感知的信息和/或历史感知的信息。
  16. 根据权利要求13至15中任一项所述的方法,其特征在于,所述第一感知单元和/或所述第二感知单元具有一个或多个预设值。
  17. 根据权利要求13至15中任一项所述的方法,其特征在于,将生成的感知数据发送出去之前,还包括:确定生成的感知数据的标识;
    其中,将生成的感知数据发送出去包括:将生成的感知数据及所述生成的感知数据的标识发送出去;
    其中,接收与所述生成的感知数据匹配的控制条目的信息,包括:接收控制条目的信息,并根据控制条目的信息中携带的感知数据的标识,判断接收到的控制条目的信息是否为与所述生成的感知数据匹配的控制条目的信息。
  18. 根据权利要求13至15中任一项所述的方法,其特征在于,所述控制条目的信息包括以下至少之一或任意组合:控制条目本身、控制条目的标识、控制条条目配置的行为。
  19. 根据权利要求13至15中任一项所述的方法,其特征在于,还包括:如果未接收到与所述生成的感知数据匹配的控制条目,与用户进行语音交互。
  20. 根据权利要求19所述的方法,其特征在于,还包括:根据与用户的语音交互生成控制条目。
  21. 一种机器人交互行为的控制方法,其特征在于,包括:
    接收机器人的包含感知单元的标识和取值的感知数据,其中,所述感知数据根据机器人感知到的信息、且至少按照预先定义的感知单元生成;
    查找与所述机器人的感知数据匹配的用于响应机器人感知到的信息来控制机器人的行为的控制条目,其中,控制条目包含由至少一个感知单元构成的触发条件和该触发条件触发的行为;
    如果查找到与所述机器人的感知数据匹配的控制条目,使所述机器人执行查找到的控制条目中的行为。
  22. 根据权利要求21所述的方法,其特征在于,所述控制条目还包含多个行为的执行顺序,其中,所述执行顺序包括:随机执行一个或多个行为,或者按预定步骤执行多个行为。
  23. 根据权利要求21所述的方法,其特征在于,控制条目的触发条件触发的行为被配置为一个或多个动作指令及动作指令的参数。
  24. 根据权利要求23所述的方法,其特征在于,所述动作指令的参数包括:用于执行其他控制条目而设置的到其他控制条目的链接,和/或用于从多个内容和/或多个参数中选取内容和/或参数而设置的到多个参数和/或多个内容的链接。
  25. 根据权利要求23所述的方法,其特征在于,所述感知单元具有一个或多个预设值。
  26. 根据权利要求21至25中任一项所述的方法,其特征在于,所述机器人为室内机器人。
  27. 一种机器人交互行为的控制方法,其特征在于,包括:
    提供包含多个用于响应机器人感知到的信息来控制机器人的行为的控制条目的控制条目文档,其中,每个控制条目包含由至少一个预先定义的感知单元构成的触发条件和该触发条件触发的行为;
    将机器人的包含感知单元的标识和取值的感知数据与控制条目进行匹配,以确定是否存在与所述机器人的感知数据匹配的控制条目,其中,所述机器人的感知数据根据机器人感知到的信息、至少按照预先定义的感知单元生成。
  28. 根据权利要求27所述的方法,其特征在于,所述控制条目文档为数据库或数据交换格式的文档。
  29. 根据权利要求27所述的方法,其特征在于,所述控制条目的触发条件触发的行为被配置为一个或多个动作指令及动作指令的参数。
  30. 根据权利要求29所述的方法,其特征在于,所述动作指令的参数包括:用于执行其他控制条目而设置的到其他控制条目的链接,和/或用于从多个内容和/或多个参数中选取内容和/或参数而设置的到多个参数和/或多个内容的链接。
  31. 根据权利要求29所述的方法,其特征在于,控制条目还包括触发条件触发的多个行为的执行顺序,其中,所述执行顺序包括:随机执行一个或多个行为,或者按预定步骤执行多个行为。
  32. 一种机器人交互行为的控制装置,其特征在于,包括:
    获取模块,用于获取机器人感知到的信息;
    生成模块,用于根据所述感知到的信息、至少按照预先定义的感知单元生成包括感知单元的标识和取值的感知数据;
    查找模块,用于查找与生成的感知数据匹配的用于响应机器人感知到的信息来控制机器人的行为的控制条目,其中,所述控制条目包含由至少一个感知单元构成的触发条件和该触发条件触发的行为;
    执行模块,用于当查找到与所述生成的感知数据匹配的控制条目时,使所述机器人执行查找到的控制条目中的行为。
  33. 一种机器人交互行为的控制装置,其特征在于,包括:
    感知模块,用于感知至少一项信息;
    生成模块,用于根据感知到的信息、至少按照预先定义的感知单元生成包括感知单元的标识和取值的感知数据;
    发送模块,用于将生成的感知数据发送出去;
    接收模块,用于接收与所述感知数据匹配的控制条目的信息,其中,控制条目用于响应机器人感知到的信息来控制机器人的行为,控制条目包含由至少一个感知单元构成的触发条件和该触发条件触发的行为;
    执行模块,用于执行接收到的控制条目中的行为。
  34. 一种机器人交互行为的控制装置,其特征在于,包括:
    接收模块,用于接收机器人的包括感知单元的标识和取值的感知数据,其中,所述感知数据根据机器人感知到的信息、至少按照预先定义的感知单元生成;
    查找模块,用于查找与所述机器人的感知数据匹配的用于响应机器人感知到的信息来控制机器人的行为的控制条目,其中,控制条目包含由至少一个感知单元构成的触发条件和该触发条件触发的行为;
    执行模块,用于当查找到与所述机器人的感知数据对应的控制条目时,使所述机器人执行查找到的控制条目中的行为。
  35. 一种机器人交互行为的控制装置,其特征在于,包括:
    控制条目文档,用于提供包含多个用于响应机器人感知到的信息来控制机器人的行为的控制条目的控制条目文档,其中,每个控制条目包含由至少一个预先定义的感知单元构成的触发条件和该触发条件触发的行为;
    匹配模块,用于将机器人的包括感知单元的标识和取值的感知数据与控制条目进行匹配,以确定是否存在与所述机器人的感知数据匹配的控制条目,其中,所述感知数据根据机器人感知到的信息、至少按照预先定义的感知单元生成。
  36. 一种机器人,其特征在于,包括:
    一个或多个感知装置;
    存储器,设置为存储用于响应机器人感知到的信息来控制机器人的行为的控制条目,其中,控制条目包含由至少一个预先定义的感知单元构成的触发条件和触发条件触发的行为;
    一个或多个处理器,设置为根据所述一个或多个感知装置感知到的信息生成包括感知单元的标识和取值的感知数据,查找与生成的感知数据匹配的控制条目,使机器人执行查找到的控制条目中的行为。
  37. 一种机器人,其特征在于,包括:
    一个或多个感知装置;
    接口,设置为接收用于响应机器人感知到的信息来控制机器人的行为的控制条目,其中,所述控制条目包含由至少一个预先定义的感知单元构成的触发条件和触发条件触发的行为;
    一个或多个处理器,设置为根据所述一个或多个感知装置感知到的信息生成包括感知单元的标识和取值的感知数据,查找与生成的感知数据匹配的控制条目,使机器人执行查找到的控制条目中的行为。
  38. 根据权利要求37所述的机器人,其特征在于,还包括:存储器,设置为存储接收到的控制条目。
  39. 一种机器人,其特征在于,包括:
    一个或多个感知装置;
    获取装置,设备为获取所述一个或多个感知装置感知到的信息;
    生成装置,设备为根据所述感知到的信息、至少按照预先定义的感知单元生成包括感知单元的标识和取值的感知数据;
    查找装置,设置为查找与生成的感知数据匹配的用于响应机器人感知到的信息来控制机器人的行为的控制条目,其中,所述控制条目包含由至少一个感知单元构成的触发条件和该触发条件触发的行为;
    执行装置,设置为当查找到与所述生成的感知数据匹配的控制条目时,执行查找到的控制条目中的行为。
  40. 一种机器人,其特征在于,包括:
    一个或多个感知装置;
    存储器;
    一个或多个处理器;以及
    一个或多个模块,所述一个或多个模块被存储在所述存储器中并被配置成由所述一个或多个处理器执行,所述一个或多个模块包括用于执行以下步骤的指令:
    根据所述一个或多个感知装置感知到的信息生成包括至少一个预先定义的感知单元的标识和取值的感知数据;
    查找与生成的感知数据匹配的用于响应机器人感知到的信息来控制机器人的行为的控制条目,其中,所述控制条目包含由至少一个感知单元构成的触发条件和该触发条件触发的行为;
    当查找到与所述生成的感知数据匹配的控制条目时,使机器人执行查找到的控制条目中的行为。
PCT/CN2016/087258 2015-06-26 2016-06-27 机器人交互行为的控制方法、装置及机器人 WO2016206643A1 (zh)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
CN201510363346.2 2015-06-26
CN201510363346.2A CN106325113B (zh) 2015-06-26 2015-06-26 机器人控制引擎及***
CN201510363348.1A CN106325065A (zh) 2015-06-26 2015-06-26 机器人交互行为的控制方法、装置及机器人
CN201510364661.7 2015-06-26
CN201510364661.7A CN106325228B (zh) 2015-06-26 2015-06-26 机器人的控制数据的生成方法及装置
CN201510363348.1 2015-06-26

Publications (1)

Publication Number Publication Date
WO2016206643A1 true WO2016206643A1 (zh) 2016-12-29

Family

ID=57584497

Family Applications (6)

Application Number Title Priority Date Filing Date
PCT/CN2016/087260 WO2016206645A1 (zh) 2015-06-26 2016-06-27 为机器装置加载控制数据的方法及装置
PCT/CN2016/087259 WO2016206644A1 (zh) 2015-06-26 2016-06-27 机器人控制引擎及***
PCT/CN2016/087261 WO2016206646A1 (zh) 2015-06-26 2016-06-27 使机器装置产生动作的方法及***
PCT/CN2016/087258 WO2016206643A1 (zh) 2015-06-26 2016-06-27 机器人交互行为的控制方法、装置及机器人
PCT/CN2016/087262 WO2016206647A1 (zh) 2015-06-26 2016-06-27 用于控制机器装置产生动作的***
PCT/CN2016/087257 WO2016206642A1 (zh) 2015-06-26 2016-06-27 机器人的控制数据的生成方法及装置

Family Applications Before (3)

Application Number Title Priority Date Filing Date
PCT/CN2016/087260 WO2016206645A1 (zh) 2015-06-26 2016-06-27 为机器装置加载控制数据的方法及装置
PCT/CN2016/087259 WO2016206644A1 (zh) 2015-06-26 2016-06-27 机器人控制引擎及***
PCT/CN2016/087261 WO2016206646A1 (zh) 2015-06-26 2016-06-27 使机器装置产生动作的方法及***

Family Applications After (2)

Application Number Title Priority Date Filing Date
PCT/CN2016/087262 WO2016206647A1 (zh) 2015-06-26 2016-06-27 用于控制机器装置产生动作的***
PCT/CN2016/087257 WO2016206642A1 (zh) 2015-06-26 2016-06-27 机器人的控制数据的生成方法及装置

Country Status (1)

Country Link
WO (6) WO2016206645A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108388399A (zh) * 2018-01-12 2018-08-10 北京光年无限科技有限公司 虚拟偶像的状态管理方法及***

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11220008B2 (en) * 2017-07-18 2022-01-11 Panasonic Intellectual Property Management Co., Ltd. Apparatus, method, non-transitory computer-readable recording medium storing program, and robot
JP7188950B2 (ja) 2018-09-20 2022-12-13 株式会社Screenホールディングス データ処理方法およびデータ処理プログラム
TWI735168B (zh) * 2020-02-27 2021-08-01 東元電機股份有限公司 語音控制機器人

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101618280A (zh) * 2009-06-30 2010-01-06 哈尔滨工业大学 具有人机交互功能的仿人头像机器人装置及行为控制方法
WO2011058530A1 (en) * 2009-11-16 2011-05-19 Koninklijke Philips Electronics, N.V. Human-robot shared control for endoscopic assistant robot
CN102446428A (zh) * 2010-09-27 2012-05-09 北京紫光优蓝机器人技术有限公司 基于机器人的交互式学习***及其交互方法
CN103399637A (zh) * 2013-07-31 2013-11-20 西北师范大学 基于kinect人体骨骼跟踪控制的智能机器人人机交互方法
CN104640677A (zh) * 2012-06-21 2015-05-20 睿信科机器人有限公司 训练和操作工业机器人

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001353678A (ja) * 2000-06-12 2001-12-25 Sony Corp オーサリング・システム及びオーサリング方法、並びに記憶媒体
JP4108342B2 (ja) * 2001-01-30 2008-06-25 日本電気株式会社 ロボット、ロボット制御システム、およびそのプログラム
US7089184B2 (en) * 2001-03-22 2006-08-08 Nurv Center Technologies, Inc. Speech recognition for recognizing speaker-independent, continuous speech
US6957215B2 (en) * 2001-12-10 2005-10-18 Hywire Ltd. Multi-dimensional associative search engine
WO2005050921A1 (en) * 2003-11-20 2005-06-02 Matsushita Electric Industrial Co., Ltd. Association control apparatus, association control method and service association system
JP2005193331A (ja) * 2004-01-06 2005-07-21 Sony Corp ロボット装置及びその情動表出方法
WO2006093394A1 (en) * 2005-03-04 2006-09-08 Chutnoon Inc. Server, method and system for providing information search service by using web page segmented into several information blocks
JP2007044825A (ja) * 2005-08-10 2007-02-22 Toshiba Corp 行動管理装置、行動管理方法および行動管理プログラム
US7945441B2 (en) * 2007-08-07 2011-05-17 Microsoft Corporation Quantized feature index trajectory
KR101088406B1 (ko) * 2008-06-27 2011-12-01 주식회사 유진로봇 유아교육시 로봇을 이용한 양방향 학습 시스템 및 그 운영방법
FR2946160B1 (fr) * 2009-05-26 2014-05-09 Aldebaran Robotics Systeme et procede pour editer et commander des comportements d'un robot mobile.
US20110213659A1 (en) * 2010-02-26 2011-09-01 Marcus Fontoura System and Method for Automatic Matching of Contracts in an Inverted Index to Impression Opportunities Using Complex Predicates and Confidence Threshold Values
FR2963132A1 (fr) * 2010-07-23 2012-01-27 Aldebaran Robotics Robot humanoide dote d'une interface de dialogue naturel, methode d'utilisation et de programmation de ladite interface
KR20120047577A (ko) * 2010-11-04 2012-05-14 주식회사 케이티 대화형 행동모델을 이용한 로봇 인터랙션 서비스 제공 장치 및 방법
EP2764455B1 (en) * 2011-10-05 2022-04-20 Opteon Corporation System and method for monitoring and/or controlling dynamic environments
WO2014050192A1 (ja) * 2012-09-27 2014-04-03 オムロン株式会社 デバイス管理装置及びデバイス検索方法
CN103324100B (zh) * 2013-05-02 2016-08-31 郭海锋 一种信息驱动的情感车载机器人
CN103729476A (zh) * 2014-01-26 2014-04-16 王玉娇 一种根据环境状态来关联内容的方法和***
CN103793536B (zh) * 2014-03-03 2017-04-26 陈念生 一种智能平台实现方法及装置
CN105511608B (zh) * 2015-11-30 2018-12-25 北京光年无限科技有限公司 基于智能机器人的交互方法及装置、智能机器人

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101618280A (zh) * 2009-06-30 2010-01-06 哈尔滨工业大学 具有人机交互功能的仿人头像机器人装置及行为控制方法
WO2011058530A1 (en) * 2009-11-16 2011-05-19 Koninklijke Philips Electronics, N.V. Human-robot shared control for endoscopic assistant robot
CN102446428A (zh) * 2010-09-27 2012-05-09 北京紫光优蓝机器人技术有限公司 基于机器人的交互式学习***及其交互方法
CN104640677A (zh) * 2012-06-21 2015-05-20 睿信科机器人有限公司 训练和操作工业机器人
CN103399637A (zh) * 2013-07-31 2013-11-20 西北师范大学 基于kinect人体骨骼跟踪控制的智能机器人人机交互方法

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108388399A (zh) * 2018-01-12 2018-08-10 北京光年无限科技有限公司 虚拟偶像的状态管理方法及***
CN108388399B (zh) * 2018-01-12 2021-04-06 北京光年无限科技有限公司 虚拟偶像的状态管理方法及***

Also Published As

Publication number Publication date
WO2016206647A1 (zh) 2016-12-29
WO2016206645A1 (zh) 2016-12-29
WO2016206642A1 (zh) 2016-12-29
WO2016206646A1 (zh) 2016-12-29
WO2016206644A1 (zh) 2016-12-29

Similar Documents

Publication Publication Date Title
US20220115034A1 (en) Audio response messages
JP6816925B2 (ja) 育児ロボットのデータ処理方法及び装置
US11010601B2 (en) Intelligent assistant device communicating non-verbal cues
CN106325228B (zh) 机器人的控制数据的生成方法及装置
JP2022537011A (ja) 人工知能に基づく音声駆動アニメーション方法及び装置、デバイス及びコンピュータプログラム
AU2017228574A1 (en) Apparatus and methods for providing a persistent companion device
WO2016206643A1 (zh) 机器人交互行为的控制方法、装置及机器人
US11367443B2 (en) Electronic device and method for controlling electronic device
KR102193029B1 (ko) 디스플레이 장치 및 그의 화상 통화 수행 방법
EP3866160A1 (en) Electronic device and control method thereof
CN106325065A (zh) 机器人交互行为的控制方法、装置及机器人
WO2015155977A1 (ja) 連携システム、装置、方法、および記録媒体
US20130159400A1 (en) User device, server, and operating conditions setting system
JP6798258B2 (ja) 生成プログラム、生成装置、制御プログラム、制御方法、ロボット装置及び通話システム
US20200234187A1 (en) Information processing apparatus, information processing method, and program
WO2023006033A1 (zh) 语音交互方法、电子设备及介质
CN116610777A (zh) 具有提取问答的会话式ai平台
WO2020087534A1 (en) Generating response in conversation
KR20200077936A (ko) 사용자 상태에 기초하여 반응을 제공하는 전자 장치 및 그의 동작 방법
US11731262B2 (en) Robot and method for operating the same
US20230230293A1 (en) Method and system for virtual intelligence user interaction
US20240078732A1 (en) Avatar facial expressions based on semantical context
WO2020153146A1 (ja) 情報処理装置、及び情報処理方法
US20240078731A1 (en) Avatar representation and audio generation
US12047660B2 (en) Information processing apparatus, information processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16813759

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16813759

Country of ref document: EP

Kind code of ref document: A1