WO2003061916A1 - Robot-telephone - Google Patents

Robot-telephone

Info

Publication number
WO2003061916A1
WO2003061916A1 · PCT/JP2002/009335 · JP0209335W
Authority
WO
WIPO (PCT)
Prior art keywords
unit
robot
movable
position information
operation right
Prior art date
Application number
PCT/JP2002/009335
Other languages
English (en)
Japanese (ja)
Inventor
Dairoku Sekiguchi
Masahiko Inami
Naoki Kawakami
Ichiro Kawabuchi
Susumu Tachi
Original Assignee
Toudai Tlo, Ltd.
Priority date
Filing date
Publication date
Application filed by Toudai Tlo, Ltd.
Publication of WO2003061916A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/02Constructional features of telephone sets
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63HTOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H3/00Dolls
    • A63H3/28Arrangements of sound-producing means in dolls; Means in dolls for producing sounds

Definitions

  • The present invention relates to a robot phone, which is one form of robot user interface (RUI), for human communication by synchronizing the shapes, movements, and the like of a plurality of robots placed at distant places.
  • A robot can be considered a computer with physicality; the physical existence of its body is itself a source of overwhelming presence, and through physical interaction with that body the robot can exert a great influence on people.
  • Telexistence is a concept for working and communicating in remote places by sharing the shape and movement of objects in those remote places with oneself.
  • Telexistence is a technique premised on presenting a high degree of presence to the operator, and the burden on hardware and software tends to increase in the parts that measure, transmit, and present that presence.
  • Telexistence, in which the remote robot is operated from a first-person point of view as if the operator were the remote robot itself, is effective in this respect.
  • In many cases, however, the slave robot does not have the same structure and size as a human, or a third-person (overhead) viewpoint is considered more advantageous for operation than a first-person viewpoint.
  • Instead of reconstructing the remote environment around the user, the remote robot itself is reconstructed at the user's hand, which makes operation easier and more intuitive.
  • Conventional telexistence was an environment-oriented system concerned with how to connect the remote environment and the operator closely and transparently.
  • The present invention is an object-oriented telexistence concerned with how to tightly couple the remote robot and the device at hand.
  • Of the prior approaches, (1) transmits only rotational force through three wooden rollers, (2) uses chess pieces, and (3) conveys the bulge of a balloon held in the hand, so the information presented is extremely limited. As a means of communication, therefore, these remain ambient, low-information transmission methods.
  • Sharing a robot having degrees of freedom close to those of a person enables not only the sharing of tactile information with many degrees of freedom but also the visual transmission of gesture information.
  • the following document is an example of using a stuffed toy as a user interface.
  • Hoshino et al. use stuffed animals as physical agents, and Yonezawa et al. use dolls as input interfaces for interactive music operations.
  • Another object of the present invention is to solve the problem of control system oscillation caused by communication delay in object-oriented teleexistence.
  • A robot phone is used as a user interface and comprises a robot including a movable part in a part of its body, a drive unit for driving the movable part, a position information sensor that outputs a signal indicating the position of the movable part, and a communication connection unit. The communication connection unit transmits the signal indicating the position of the movable part from the position information sensor to the other party via a communication line, receives position information corresponding to the movable part from the other party, and sends it to the drive unit. The drive unit drives the movable part based on the received position information; however, when the movable part on its own side is moved by an external force, driving based on the position information from the other party is not performed for a predetermined time.
  • A robot phone is used as a user interface and comprises a robot including a movable part in a part of its body, a drive unit that drives the movable part, a position information sensor that outputs a signal indicating the position of the movable part, a communication connection unit, and an operation right determination unit that determines whether the operation right for the robot belongs to the own side or to the other side. The communication connection unit transmits the signal indicating the position of the movable part from the position information sensor to the other party via a communication line, receives position information corresponding to the movable part from the other party, and sends it to the drive unit. When the operation right determination unit determines that the own side does not have the operation right, the movable part is driven based on the received position information.
  • A robot phone is used as a user interface and comprises a robot including a movable part in a part of its body, a drive unit that drives the movable part, a position information sensor that outputs a signal indicating the position of the movable part, a communication connection unit, an external force detection unit that detects whether an external force is applied to the movable part, and an active/passive operation separation unit that separates active operations from passive operations based on the output of the external force detection unit. Via a communication line, the communication connection unit transmits to the other party the signal indicating the position of the movable part from the position information sensor for the passive operations separated by the active/passive operation separation unit, receives position information corresponding to the movable part from the other party, and sends it to the drive unit. The drive unit drives the movable part based on the received position information.
  • A robot phone is used as a user interface and comprises a robot including a movable part in a part of its body, a drive unit that drives the movable part, a position information sensor that outputs a signal indicating the position of the movable part, a communication connection unit, and an adaptive filter that receives as inputs the signal indicating the position of the movable part from the position information sensor and the position information corresponding to the movable part received from the other party via the communication connection unit. The drive unit drives the movable part based on the output of the adaptive filter.
  • A robot phone is used as a user interface and comprises a robot including a movable part in a part of its body, a drive unit for driving the movable part, a position information sensor that outputs a signal indicating the position of the movable part, a communication connection unit, and an encoder that encodes the signal indicating the position of the movable part from the position information sensor into an audio signal in the audible range. The communication connection unit communicates over a line having an echo canceling function, and the drive unit drives the movable part based on the received position information.
  • A robot phone is used as a user interface and comprises a robot including a movable part in a part of its body, a microphone and a speaker for talking, a drive unit that drives the movable part, a position information sensor that outputs a signal indicating the position of the movable part, and a communication connection unit. The communication connection unit transmits the audio signal from the microphone to the other party via a communication line, reproduces the audio signal received from the other party through the speaker, transmits the signal indicating the position of the movable part from the position information sensor to the other party, receives position information corresponding to the movable part from the other party, and sends it to the drive unit. The drive unit drives the movable part based on the received position information.
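  • The data flow common to the configurations above can be illustrated with a minimal sketch of one exchange cycle: the communication connection unit sends the locally sensed joint positions to the other party and hands the positions received from the other party to the drive unit. The socket transport, the newline-delimited JSON framing, and the function name are assumptions made for illustration only; the patent does not prescribe a particular wire format.

```python
import json
import socket

def exchange_positions(sock: socket.socket, local_angles: list[float]) -> list[float]:
    """Send the local joint angles and return the partner's angles (drive targets)."""
    # Transmit the signal from the position information sensor to the other party.
    sock.sendall((json.dumps({"angles": local_angles}) + "\n").encode())
    # Receive one newline-delimited message holding the partner's position information.
    data = b""
    while not data.endswith(b"\n"):
        chunk = sock.recv(1024)
        if not chunk:
            raise ConnectionError("communication line closed")
        data += chunk
    return json.loads(data)["angles"]   # forwarded to the drive unit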
  • FIG. 1 is a diagram showing an example of a robot phone according to an embodiment of the present invention.
  • FIG. 2 is an explanatory diagram of a use form of the robot phone according to the embodiment of the present invention.
  • FIG. 3 is a diagram showing another example of the robot phone according to the embodiment of the present invention.
  • FIG. 4 is an explanatory diagram of a control system of the robot phone according to the embodiment of the present invention.
  • FIG. 5 is a functional block diagram of the control system according to Embodiment 1 of the present invention.
  • FIG. 6 is an operation flowchart of the control system according to the first embodiment of the present invention.
  • FIG. 7 is another operation flowchart of the control system according to the first embodiment of the present invention.
  • FIG. 8 is a functional block diagram of a control system according to Embodiment 2 of the present invention.
  • FIG. 9 is a functional block diagram of the control system according to Embodiment 3 of the present invention.
  • FIG. 10 is a diagram showing an example of a protocol according to Embodiment 3 of the present invention.
  • FIG. 11 is a functional block diagram of a control system according to Embodiment 4 of the present invention.
  • FIG. 12 is a functional block diagram of a control system according to Embodiment 5 of the present invention.
  • FIG. 13 is a functional block diagram of the control system according to Embodiment 6 of the present invention.
  • FIG. 14 is a functional block diagram of a control system according to Embodiment 7 of the present invention.
  • FIG. 15 is a functional block diagram of a control system according to Embodiment 8 of the present invention.
  • the present invention relates to a shape sharing system as one form of an object-oriented telexistence.
  • the shape sharing system is a system that synchronizes the shape of an object to share the shape with a remote location and enables interaction with a remote location.
  • “Shape” is one of the most fundamental elements in identifying and recognizing an object, and is also an important key to knowing the state of an object.
  • the shape sharing system achieves close coupling between the remote robot and the device at hand by synchronizing shapes that play an important role in object recognition.
  • The shape of the object on the user's side shows exactly the shape of the object on the remote side, so it also functions as a display.
  • Input and output are performed by the same device, realizing an intuitive operation system without switching between input and output.
  • Since interaction with the object is performed through the hand, an organ that can simultaneously take in sensation and act on the outside world, the system is essentially an interactive interface.
  • A robot that enables strong interaction with the real world is used as an interface between the real world and the information world (Robotic User Interface, RUI). An RUI has the following features.
  • Tactile information can be presented by applying force from the robot to the person.
  • Interaction via voice, such as speaking to the robot or utterances by the robot itself, is possible.
  • A robot phone has been proposed as one form of RUI.
  • A robot phone is an RUI that allows humans to communicate by synchronizing the shape, movement, and the like of multiple robots placed at remote locations. By synchronizing the shapes of the robot phones in real time, it is possible to transmit not only the shape of the object but also its movement. Also, unlike electronic representations shown on ordinary displays, it is possible to actually touch the other person to convey force, or to move an object to perform work. In other words, it is a phone that can present visual, tactile, and auditory information in an integrated manner. If both users apply force to the robot at the same time, they will feel each other's force.
  • Robots can be divided into an autonomous type, which makes its own judgments based on information from sensors and the like and operates automatically, and an operated type, which is operated by a human operator; the robot phone is categorized as the latter type.
  • Figure 1 shows a stuffed robot phone.
  • 1a and 1b are robot phones housed in stuffed bears, respectively.
  • Each comprises a motor with planetary gear reducer 13, position detecting means (potentiometer) 14, a processor 15 for controlling them, and a communication connection section 16 for performing communication through the communication line 2.
  • The speaker 12 is mounted on the stuffed toy's chest and the microphone 11 is mounted on its head. These are incorporated into the stuffed bear together with the skeleton.
  • The microphone 11 and the speaker 12 are installed facing the front of the stuffed toy, so that a user can talk and operate while facing the robot phone, obtaining the sense of being in conversation with the other party.
  • The motor with planetary gear reducer 13 and the position detecting means 14 are provided at joints of the skeleton, for example inside the right arm or head of the stuffed bear.
  • These motors with planetary gear reducers 13 and position detecting means 14 form actuators with two degrees of freedom in the right arm and two degrees of freedom in the head, for a total of four degrees of freedom. To give degrees of freedom closer to a human's and enable movement with lighter force, it is preferable to provide 11 degrees of freedom for the whole body, with 2 degrees of freedom for each limb and 3 degrees of freedom for the head. This makes it possible to output changes in the posture of the stuffed animal caused by external force, and to change its posture based on external signals.
  • The processor 15 performs bilateral control and keeps the robot phones 1a and 1b in the same posture.
  • The term bilateral means bidirectional. Bilateral control is known, for example, as a control method for transmitting the weight or reaction force (feeling of contact) received by a manipulator to an operation lever or the like.
  • The two robot phones 1a and 1b are connected to the communication network 2; a user can apply force to his or her stuffed animal to change its posture and have this reflected in the other party's robot phone via the communication network 2, so that the two always take the same posture.
  • FIG. 2 shows an example of a call and a call operation performed using the robot phones 1a and 1b according to the embodiment of the present invention.
  • Two users are talking and performing operations while facing the robot phones 1a and 1b, respectively.
  • Many users have experience playing with dolls and can easily operate the doll-shaped interface.
  • The hand of the other party's robot phone can be waved, and a YES/NO gesture can be performed by the movement of its neck.
  • If both users move the doll's hand at the same time, they can shake hands while feeling each other's force through the doll's hand. Since both users operate one shared object at the same time, depending on the state, the robot phone sometimes acts as one's own alter ego and sometimes as the other party's alter ego.
  • Fig. 3 shows another example, a snake-shaped robot phone.
  • The trunk of the body is composed of seven sections 17-1 to 17-7. The sections 17-1 to 17-7 are connected to each other by rotatable joints, so that, as a whole, the body can bend like a snake.
  • Each section is equipped with a module consisting of a motor with planetary gear mechanism 13 and a potentiometer 14.
  • six servomotors are used as actuators.
  • This snake-type robot phone can express its shape by the body itself, although there is a restriction that the operable area is within a two-dimensional plane.
  • the shape can be freely configured by touching it with the hand.
  • Control of the servomotors is realized, for example, by software on a single-board microcomputer.
  • PWM control is used to drive the motor.
  • A symmetric bilateral servo control method is adopted, and control is performed so that the positional deviation between the paired servomotors is always minimized.
  • Reference numeral 20 denotes a subtractor for calculating the position deviation, and 21a and 21b denote position command units that drive the servomotors based on the position deviation.
  • The angle signals output from the robots 1a and 1b are obtained by the potentiometers 14.
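  • The loop above can be summarized in a short sketch. This is not the patent's implementation but a minimal illustration, assuming a hypothetical hardware interface (read_angle, set_pwm) and an arbitrary gain, of how each side drives its servomotor with a PWM duty proportional to the deviation between the local potentiometer angle and the partner's last reported angle.

```python
KP = 0.8    # proportional gain mapping angle deviation to PWM duty (assumed value)

def bilateral_step(read_angle, set_pwm, partner_angle: float) -> float:
    """One symmetric bilateral control step for a single joint on one side."""
    local = read_angle()                           # potentiometer 14
    deviation = partner_angle - local              # subtractor 20
    duty = max(-1.0, min(1.0, KP * deviation))     # position command unit 21
    set_pwm(duty)                                  # PWM drive of the motor 13
    return local                                   # angle reported to the other side
```

  • Because the same loop runs on both robot phones, each side acts as master and slave at once; an external force applied to one stuffed toy appears as an angle deviation on the other side and is driven out there.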
  • a robot is a machine based on any or all of the shape, structure, and function of a living thing.
  • When the master device is operated, the slave device follows the operation without delay, so that the operator of the master device can freely control the shape of the slave device.
  • the device present at the operator's hand also serves as a display device that constantly presents the remote shape.
  • Since the control is completely symmetric bilateral control, there is no distinction as to which device is the master and which is the slave, and each can operate the other.
  • Not only position but also force is transmitted. For example, if a joint of one device is restrained by hand so that it cannot move, that restraint can be felt on the other device.
  • The line delay measurement units 22a and 22b measure the line delay and output it to the position command units 21a and 21b.
  • The position command units 21a and 21b drive the servomotors of the robot bodies 10a and 10b based on the above-mentioned positional deviation, and the control takes the measured communication delay into account.
  • FIG. 6 is a flowchart showing an outline of the processing.
  • The communication delay amount is measured (S1), and the communication delay amount is compared with a predetermined threshold value (S2).
  • The threshold is determined according to the degree of delay and oscillation allowed in the system.
  • A feedback invalid time is set according to the communication delay amount in order to avoid the adverse effects of the communication delay (S3).
  • When an operation is started on the own side (S4), the feedback from the other party is invalidated for the set time (S5).
  • FIG. 7 shows another processing example.
  • The processing in FIG. 7 does not include the step (S2) of comparing the communication delay amount with the threshold value as in FIG. 6. Therefore, in the process of FIG. 7, step S5 is executed irrespective of the communication delay amount, but the invalid period is determined according to the communication delay amount.
  • In Embodiment 1 of the present invention, when the operator on one side starts operating the robot 10, the feedback from the other side is invalidated, and the system is temporarily brought into a state where operation is enabled in only one direction (S4, S5). In other words, the movement input on the operating side is transmitted to the other party as it is, but input operations from the other party are ignored for a certain period. If normal bilateral control is a full-duplex (fully bidirectional) system, this method can be said to be half-duplex.
  • The system always measures the delay of the communication path, and can switch from normal bilateral control to the above method according to the amount of communication delay (S1, S2).
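  • A sketch of this half-duplex switching, corresponding to steps S1 to S5. The threshold value, the mapping from measured delay to the invalidation window, and the method names are assumptions chosen for illustration, not values taken from the patent.

```python
import time

DELAY_THRESHOLD = 0.15   # seconds; assumed value, set per the allowed delay/oscillation

class HalfDuplexGate:
    """Ignores the partner's feedback for a while after a local operation starts."""

    def __init__(self):
        self.half_duplex = False
        self.invalid_time = 0.0
        self.ignore_until = 0.0

    def on_delay_measured(self, delay: float) -> None:           # S1, S2
        self.half_duplex = delay > DELAY_THRESHOLD
        self.invalid_time = 2.0 * delay                          # S3 (assumed mapping)

    def on_local_operation_started(self) -> None:                # S4
        if self.half_duplex:
            self.ignore_until = time.monotonic() + self.invalid_time   # S5

    def accept_partner_feedback(self) -> bool:
        return time.monotonic() >= self.ignore_until
```

  • Position information received while accept_partner_feedback() returns False would simply not be forwarded to the drive unit.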
  • Which side the system accepts an input from is determined by which operator starts operating first. What to do when both operators start operating at almost the same time is therefore a problem. In particular, when the communication delay is large, the time interval within which the starts of operation appear simultaneous grows rapidly, so this is a critical problem (each side receives the other's operation while operating itself, and it cannot be decided which side should be given the operation right).
  • Reference numerals 23a and 23b denote operation right judging sections which receive the operation information and the time information associated with it and determine to which side the operation right is to be given, and 24a and 24b denote time sources.
  • the same reference numerals as those in the other drawings indicate the same or corresponding parts.
  • the system according to the second embodiment of the present invention solves the above problem by strictly performing time synchronization at both ends of the system.
  • the system can know the global time on each side.
  • By associating the operation information exchanged by the system with the global time, it is possible to determine which side moved first in global time.
  • However, the longer the delay time, the longer the period during which one may be moved by the other party even while one is operating.
  • The robot phones 1a and 1b may be provided with absolute time sources 24a and 24b, such as atomic clocks or GPS receivers, that can be accessed locally. This eliminates the need for time synchronization between the two ends of the system.
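  • Under this scheme, each side timestamps the start of its operation with the global time and the earlier timestamp wins. The sketch below is a minimal illustration; the exchange format, the floating-point epoch timestamps, and the tie-breaking rule based on a fixed side identifier are assumptions, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class OperationStart:
    side_id: str        # e.g. "A" or "B"
    global_time: float  # timestamp from the absolute time source 24a/24b

def holder_of_operation_right(local: OperationStart, remote: OperationStart) -> str:
    """Return the side_id of the side whose operation started earlier in global time."""
    if local.global_time != remote.global_time:
        return (local.side_id if local.global_time < remote.global_time
                else remote.side_id)
    # Exact tie: fall back to a fixed ordering of side identifiers (assumed rule).
    return min(local.side_id, remote.side_id)
```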
  • Reference numerals 25a and 25b denote inquiry/response sections that return an answer to an inquiry received from the other party, and 26a and 26b denote operation right determination units. On receiving a signal indicating that the robot 10a or 10b has been operated, the operation right determination unit 26a or 26b sends an inquiry to the other party's inquiry/response section 25a or 25b, receives the answer, determines the operation right based on it, and, when the own side has the operation right, issues drive permission for the robot 10a or 10b to the position command unit 21a or 21b.
  • the same reference numerals as those in the other drawings indicate the same or corresponding parts.
  • FIG. 10 is a schematic flowchart according to the third embodiment of the present invention.
  • The system according to the third embodiment of the invention provides a protocol that allows each end of the system to inquire about the state of the other party.
  • When an operator operates the system, the status of the other party is inquired using this inquiry protocol (S11 to S13), the answer is received from the other party (S14, S15), and the operation right is determined based on the answer (S16, S17).
  • As the delay time of the communication line becomes longer, the time from when the operator starts to operate (applies force) until the system permits the operation (until the robot actually moves) becomes longer; in other words, the longer the delay, the heavier the operation feels.
  • Therefore, the operation on the operating side may be provisionally permitted while the inquiry to the other party is still in progress (S18). However, depending on the situation, this provisional permission may have to be canceled after the robot has started to move.
  • Embodiment 3 of the present invention is effective when the communication delay is relatively small.
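  • The inquiry/response exchange of FIG. 10 can be sketched as a small state machine. The state names, the message string, and the rule for revoking a provisional grant are illustrative assumptions; the patent only specifies that the state of the other party is queried before (or while) the operation is permitted.

```python
from enum import Enum, auto

class RightState(Enum):
    IDLE = auto()
    INQUIRING = auto()     # S11-S13: asked the partner whether it is operating
    PROVISIONAL = auto()   # S18: allowed to move before the answer arrives
    GRANTED = auto()       # S16, S17: answer says the partner is idle
    DENIED = auto()

class InquiryProtocol:
    def __init__(self, allow_provisional: bool):
        self.state = RightState.IDLE
        self.allow_provisional = allow_provisional

    def on_local_operation(self) -> str:
        """Called when the local user starts to operate; returns the message to send."""
        self.state = (RightState.PROVISIONAL if self.allow_provisional
                      else RightState.INQUIRING)
        return "INQUIRE_STATUS"                       # S11

    def on_partner_answer(self, partner_is_operating: bool) -> None:
        if partner_is_operating:
            self.state = RightState.DENIED            # a provisional grant is revoked
        else:
            self.state = RightState.GRANTED           # S16, S17
```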
  • With reference to FIG. 11, another method for solving the problem when both operators start operating at about the same time will be described.
  • the system according to the fourth embodiment of the invention (see FIG. 11) is for solving the above problem.
  • 27a and 27b are operation right judging units that judge the operation right for operations input to the position command units 21a and 21b, 28a and 28b are priority tables that define in advance which side is given priority when operation rights conflict, and 29a and 29b are operation history storage tables for storing past operation histories.
  • the same reference numerals as those in the other drawings denote the same or corresponding parts.
  • When operation rights conflict, the operation of one side is preferentially selected based on the contents of the priority tables 28a and 28b.
  • In the priority tables 28a and 28b, priority information is stored in advance so that the operation right is determined uniquely. This method resolves conflicts by deliberately not constructing the system completely symmetrically.
  • Alternatively, the operation right is given to one side according to the operation histories in the operation history tables 29a and 29b; for example, the side that performed the last operation is preferentially selected.
  • Alternatively, the operation right is given to the person who operates a specific part. For example, if the robot's arm is moved on one side and the robot's neck is moved on the other, the operation right is given to the person who operated the neck (or the arm).
  • Which parts have priority, and in what order, is determined in advance. For example, priorities may be set in descending order of meaning as a gesture (neck > arm > foot), or in order of operation frequency.
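  • The three tie-breaking rules above can be combined in a short sketch. The part order neck > arm > foot follows the example in the text; the data shapes, the rule ordering, and the names are assumptions for illustration.

```python
PART_PRIORITY = {"neck": 3, "arm": 2, "foot": 1}   # descending meaning as a gesture

def resolve_conflict(local_part: str, remote_part: str,
                     priority_table: dict, history: list) -> str:
    """Decide which side ('local' or 'remote') receives the operation right."""
    # Rule: a higher-priority body part wins (operation right judging units 27a/27b).
    lp = PART_PRIORITY.get(local_part, 0)
    rp = PART_PRIORITY.get(remote_part, 0)
    if lp != rp:
        return "local" if lp > rp else "remote"
    # Rule: the side that performed the last operation is preferred (tables 29a/29b).
    if history:
        return history[-1]
    # Rule: fall back to the statically configured priority (tables 28a/28b).
    return priority_table["preferred_side"]
```

  • For example, resolve_conflict("neck", "arm", {"preferred_side": "remote"}, []) returns "local", because the neck outranks the arm in the assumed gesture ordering.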
  • In FIG. 12, reference numerals 30a and 30b denote operation right display sections that receive information on the operation right from the operation right judgment sections 26a and 26b or 27a and 27b and display whether the operation right is held.
  • In FIG. 12, the same reference numerals as those in the other drawings denote the same or corresponding parts.
  • the operation right display sections 30a and 30b can be applied to the systems shown in FIGS. 5, 8, 9 and 11.
  • The operation right display sections 30a and 30b are, for example, LEDs provided on the stuffed toy's nose. Such an LED glows blue, for example, when the own side can operate, and glows red when only the other side can operate.
  • In Embodiment 6 of the present invention, with reference to FIG. 13, another method for mitigating the effects of oscillation caused by communication delay will be described.
  • the system according to the sixth embodiment solves the above problem.
  • In FIG. 13, 31a and 31b are external force detectors that determine whether the robot's motion is active, that is, the result of driving by the position command unit 21, or passive, that is, caused by the application of an external force, and 32a and 32b are active/passive operation separators that transmit to the other party only the position information of operations judged to be passive, based on the determination results of the external force detectors 31a and 31b.
  • The same reference numerals as those in the other drawings indicate the same or corresponding parts.
  • 20a and 20b are computing units that calculate the deviation between the robot's current position and the external position command only when an output is received from the corresponding external force detector, and input it to the position command unit.
  • In symmetric bilateral control, both the master and slave actuators are controlled so that the slave follows the displacement of the master; that is, when a displacement occurs in the master, a relative displacement arises between the master and the slave.
  • Symmetric bilateral control drives the current of the motors attached to the master and slave so that this relative displacement becomes zero, thereby producing drive torque and restraint torque on both sides.
  • When force is applied to the master and the slave, a relative displacement therefore always exists between them. This method is said to offer simple control and good stability because force is not detected and fed back into the position control of the control system.
  • the system according to the sixth embodiment is intended to reduce the influence of the delay.
  • Embodiment 6 of the invention relates to external-force-transmission-type bilateral control. This distinguishes between active motion of the robot's joints (when the robot is moving according to internal control commands, or is stationary) and passive motion (when a person applies force to the robot from outside, or the robot comes into contact with an external object), and transmits the joint angle, joint velocity, or joint torque data of passive motion to the paired robot.
  • The external force detectors 31a and 31b in FIG. 13 distinguish between active operation and passive operation, and the active/passive operation separators 32a and 32b select the position information of operations determined to be passive and send it to the other party.
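  • A sketch of this separation: motion is classified as passive when the measured joint angle deviates from the internally commanded angle by more than a threshold, and only passive motion is transmitted. The deviation-based detection rule and the threshold value are assumptions for illustration; the text leaves the concrete detection method open.

```python
EXTERNAL_FORCE_THRESHOLD = 0.05   # radians; assumed value

def passive_angle_to_transmit(commanded_angle: float, measured_angle: float):
    """Return the angle to send to the partner, or None if the motion is active.

    Passive motion (external force or contact) is assumed to show up as a
    deviation between the internally commanded angle and the measured angle.
    """
    if abs(measured_angle - commanded_angle) > EXTERNAL_FORCE_THRESHOLD:
        return measured_angle    # passive: transmit position information (32a/32b)
    return None                  # active or stationary: do not transmit
```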
  • In addition, the following method can also be considered.
  • With reference to FIG. 14, another method for mitigating the effects of oscillation caused by communication delay will be described.
  • The system according to the seventh embodiment of the present invention (FIG. 14) solves the above problem.
  • 33a and 33b are adaptive filters such as LMS (Least Mean Square) and RLS (Recursive Least Square).
  • On ordinary telephone lines and Internet telephony, echo is canceled using an adaptive filter.
  • The instability of robot operation caused by transmission delay, which the present application addresses, can likewise be suppressed with an adaptive filter by treating the robot's motion signal as a waveform in the same way as voice.
  • The adaptive filters 33a and 33b record the transmitted operation and, when the signal resulting from that operation comes back from the partner robot, appropriately adjust the gain of the recorded signal and take the difference. This makes it possible to extract only the other party's operation while suppressing oscillation.
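  • A sketch of this echo-cancellation view using a normalized LMS filter: the filter learns how the locally transmitted motion comes back from the partner and subtracts that estimate, leaving only the partner's own motion. The filter length and step size are arbitrary assumed values, and numpy is used purely for convenience.

```python
import numpy as np

class LMSMotionEchoCanceller:
    """Normalized LMS filter: subtracts the echo of our own transmitted motion."""

    def __init__(self, taps: int = 32, step: float = 0.1):
        self.w = np.zeros(taps)        # adaptive coefficients (filters 33a/33b)
        self.x = np.zeros(taps)        # recent samples of the motion we transmitted
        self.step = step

    def process(self, sent_sample: float, received_sample: float) -> float:
        """Feed one transmitted and one received sample; return the residual,
        i.e. the partner's own motion with our echo removed."""
        self.x = np.roll(self.x, 1)
        self.x[0] = sent_sample
        echo_estimate = float(self.w @ self.x)
        error = received_sample - echo_estimate          # partner's motion
        norm = float(self.x @ self.x) + 1e-8
        self.w += self.step * error * self.x / norm      # NLMS coefficient update
        return error
```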
  • Reference numerals 34a and 34b denote encoders that encode the position (angle) information of the robot into an audio signal in the audible range.
  • the same reference numerals as those in the other drawings denote the same or corresponding parts.
  • Communication line 2 has an echo canceller function.
  • When a telephone line or a VoIP connection is used as the communication line 2, the line itself has an echo canceling function. Therefore, if the angle information is encoded into an audio signal in the audible range and transmitted to the other party as audio over the communication line, the echo is canceled by the communication line itself. In this case, the robot phone need only be provided with the encoders 34a and 34b.
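  • A sketch of such an encoder: each joint-angle sample is mapped to the frequency of a short audible tone burst so that it can travel over an ordinary voice channel and benefit from its echo canceller. The sample rate, burst length, frequency band, and angle range are assumed values; the patent does not specify a particular encoding.

```python
import numpy as np

SAMPLE_RATE = 8000              # Hz; typical telephone-band rate (assumed)
BURST_SAMPLES = 400             # 50 ms of audio per encoded angle sample (assumed)
F_MIN, F_MAX = 400.0, 3000.0    # audible band used for the encoding (assumed)

def encode_angle_to_tone(angle: float, angle_range=(-np.pi, np.pi)) -> np.ndarray:
    """Map one joint angle to a tone burst whose frequency encodes the angle."""
    lo, hi = angle_range
    frac = (np.clip(angle, lo, hi) - lo) / (hi - lo)       # 0..1 across the range
    freq = F_MIN + frac * (F_MAX - F_MIN)                  # angle -> frequency
    t = np.arange(BURST_SAMPLES) / SAMPLE_RATE
    return 0.5 * np.sin(2 * np.pi * freq * t)              # audio for communication line 2
```

  • On the receiving side, the dominant frequency of each burst would be estimated (for example with an FFT) and mapped back to an angle before being handed to the drive unit.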
  • The present invention is not limited to the above embodiments, and it goes without saying that various modifications made within the scope of the invention described in the claims are also included in the scope of the present invention.
  • Each of the above units or means does not necessarily denote a physical means; cases where its function is realized by software are also included. Furthermore, the function of one unit or means may be realized by two or more physical means, and the functions of two or more units or means may be realized by one physical means.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a robot-telephone enabling human communication by synchronizing the shapes, movements, and positions of a plurality of robots remote from one another. The robot-telephone serves as a user interface and comprises a robot in the form of a stuffed doll whose body includes a movable part, a microphone (11) for talking, a speaker (12), a drive unit (13) for driving the movable part, a position information sensor (14) for obtaining information on the position of the robot's movable part, and a communication connection unit (16). The communication connection unit transmits a voice signal from the microphone to the other party via a communication line, reproduces the voice signal received from the other party through the speaker, transmits to the other party a signal indicating the position of the movable part obtained by the position information sensor, receives from the other party position information relating to the robot's movable part, and sends this position information to the drive unit. The drive unit drives the movable part in accordance with the received position information. Thus, in addition to voice communication, this robot-telephone enables communication through the robot's gestures.
PCT/JP2002/009335 2002-01-21 2002-09-12 Robot-telephone WO2003061916A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2002-12218 2002-01-21
JP2002012218 2002-01-21

Publications (1)

Publication Number Publication Date
WO2003061916A1 (fr) 2003-07-31

Family

ID=27606040

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2002/009335 WO2003061916A1 (fr) 2002-01-21 2002-09-12 Robot-telephone

Country Status (1)

Country Link
WO (1) WO2003061916A1 (fr)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08318479A (ja) * 1995-05-19 1996-12-03 Nippondenso Co Ltd Remote control system
JP2001198865A (ja) * 2000-01-20 2001-07-24 Toshiba Corp Bipedal walking robot device and operation method thereof

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106371455A (zh) * 2016-08-29 2017-02-01 马登山 An intelligent interaction method and ***

Similar Documents

Publication Publication Date Title
WO2003068461A1 (fr) Robot-telephone
Kittmann et al. Let me introduce myself: I am Care-O-bot 4, a gentleman robot
US8926329B2 (en) Sign language action generating device and communication robot
JP7119896B2 (ja) Communication robot and control program for communication robot
WO2018097223A1 (fr) Robot control system, machine control system, robot control method, machine control method, and recording medium
CN106737760B (zh) A humanoid intelligent robot and human-machine communication ***
JP2003266351A (ja) Robot device and operation control method for robot device
JP4022477B2 (ja) Robot phone
Clark et al. On the role of wearable haptics for force feedback in teleimpedance control for dual-arm robotic teleoperation
Ferland et al. Natural interaction design of a humanoid robot
JP2012161851A (ja) Robot system and spatial formation recognition device used therein
JP2018126810A (ja) Robot system and robot dialogue method
Kawamura et al. Humanoids: Future robots for home and factory
Lenz et al. Bimanual telemanipulation with force and haptic feedback through an anthropomorphic avatar system
Igorevich et al. Behavioral synchronization of human and humanoid robot
Luo et al. Team Northeastern's approach to ANA XPRIZE Avatar final testing: A holistic approach to telepresence and lessons learned
Luo et al. Towards robot avatars: Systems and methods for teleinteraction at avatar xprize semi-finals
JP4022478B2 (ja) Robot phone
JP2006088276A (ja) Motion generation system
Cisneros-Limón et al. A cybernetic avatar system to embody human telepresence for connectivity, exploration, and skill transfer
JP2004034274A (ja) Conversation robot and operation method thereof
Van Breemen et al. A user-interface robot for ambient intelligent environments
WO2003061916A1 (fr) Robot-telephone
Yim et al. Design considerations of expressive bidirectional telepresence robots
WO2018157355A1 (fr) Humanoid intelligent robot and human-machine communication system

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR IE IT LU MC NL PT SE SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase