WO2011003628A2 - System and method for generating contextual behaviors of a mobile robot - Google Patents
System and method for generating contextual behaviors of a mobile robot
- Publication number
- WO2011003628A2 WO2011003628A2 PCT/EP2010/004234 EP2010004234W WO2011003628A2 WO 2011003628 A2 WO2011003628 A2 WO 2011003628A2 EP 2010004234 W EP2010004234 W EP 2010004234W WO 2011003628 A2 WO2011003628 A2 WO 2011003628A2
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot
- editing
- control
- text
- behaviors
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/30—Creation or generation of source code
- G06F8/34—Graphical or visual programming
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/004—Artificial life, i.e. computing arrangements simulating life
- G06N3/008—Artificial life, i.e. computing arrangements simulating life based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. based on robots replicating pets or humans in their appearance or behaviour
Definitions
- The present invention pertains to the field of robot programming systems. More precisely, it applies to the control of behaviors consistent with the context in which a robot, in particular one of human or animal form, moves about, expresses itself and travels, whether or not on articulated limbs.
- A robot can be called humanoid from the moment it has certain attributes of the appearance and functionality of a human: a head, a trunk, two arms, possibly two hands, two legs, two feet, etc.
- One of the features likely to give the robot a quasi-human appearance and behavior is the possibility of ensuring a strong coupling between gestural expression and oral expression.
- Achieving this result in an intuitive way would allow new groups of users to access the programming of humanoid robot behaviors. This problem has not been satisfactorily solved by the prior art.
- The invention makes it possible to insert, into a text spoken by the robot, commands for intonation and/or behaviors associated with the oral expression, said commands being expressible graphically, for example.
- The present invention discloses a system for editing and controlling the behaviors of at least one robot, comprising a module for editing a script to be reproduced by the robot, a module for sound synthesis of said script, a library of behavior control tags to be executed by the robot, a module for inserting said tags into said script, and a module for generating and controlling the behaviors of said robot, said system being characterized in that said control tag library comprises at least one tag for controlling a behavior depending on the content of the script.
- The editing and control system further comprises a module for monitoring the robot's environment, and said control tag library comprises at least one tag for controlling a behavior depending on the robot's environment.
- The control tag library comprises at least one tag selected from the group comprising wait commands and commands responding to an expected external action.
- Control tags are chosen from the group comprising graphic symbols, punctuation marks and words.
- A portion of the command tags is inserted into the script in the form of a list, each item of said list corresponding to a phrase with which a group of behaviors is associated.
- A control tag is inserted into the script between at least one opening separator and at least one closing separator, which respectively trigger the start and the end of a sequence of behaviors.
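As a concrete illustration of the separator mechanism just described, the sketch below splits a script into plain-text segments and behavior-sequence segments. The `{`/`}` separators and all names are illustrative assumptions, not the patent's actual syntax:

```python
OPEN, CLOSE = "{", "}"  # hypothetical opening/closing separators


def split_sequences(script):
    """Split a script into (text, in_sequence) pairs; in_sequence is True
    for text lying between an opening and a closing separator."""
    depth, buf, out = 0, [], []

    def flush():
        if buf:
            out.append(("".join(buf), depth > 0))
            buf.clear()

    for ch in script:
        if ch == OPEN:
            flush()          # text before the separator ends here
            depth += 1       # a behavior sequence starts
        elif ch == CLOSE:
            flush()          # text inside the sequence ends here
            depth = max(0, depth - 1)
        else:
            buf.append(ch)
    flush()
    return out
```

A controller could then synthesize the plain segments as speech while executing the flagged segments as behavior commands.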
- The behaviors to be executed by the robot are chosen from the group comprising combinations of modulations of the voices reading the script, gestures accompanying the script, and signal emissions.
- The modulations of the voices are chosen from a group comprising combinations of different languages, voices, tones, speeds and intonations.
- The script editing module communicates with the behavior control module of the robot through an electronic mail module.
- The editing and control system is able to interact with elements belonging to a group comprising physical robots, virtual robots and computers.
- The robots belonging to the commanded group each execute a part of the script determined by at least one control tag characteristic of each of the elements of the group.
- The behavior controlled by a control tag can change according to a method chosen from the group comprising random change, cyclic change or contextual change.
- The invention also discloses a method for editing and controlling the behaviors of at least one robot, comprising a step of editing a script to be reproduced by the robot, a step of sound synthesis of said script, a step of reading a library of behavior control tags to be executed by the robot, a step of inserting said tags into said script, and a step of controlling the behaviors of said robot, said method being characterized in that said control tag library comprises at least one tag for controlling a behavior depending on the content of the script.
- The invention makes it possible to create behavior libraries and to easily insert them into a script of scenes played by the robot.
- The language used is formally very close to the one used on the Internet and is therefore very easily accessible, without any prior knowledge or training.
- The invention also usefully complements French Patent Application No. 09/53434, belonging to the Applicant, relating to a system and a method for editing and controlling the behaviors of a mobile robot.
- That application provides means to have behaviors executed by a robot, said behaviors being controllable either using a specialized scripting language accessible to programmers, or graphically by using preprogrammed libraries that can be selected and inserted into a series of behavior boxes linked by events.
- The invention makes it possible to further simplify the programming interface of the robot's behaviors.
- The invention will be better understood, and its various features and advantages will emerge, from the following description of several exemplary embodiments and the appended figures, including:
- FIG. 1 is an example of programming behaviors using elements of the BML language according to a document of the prior art
- FIG. 2 is a flowchart indicating the main steps of the method of the invention in one of its embodiments
- FIG. 3 is a view of a command tag tree provided for implementing the invention in one of its embodiments
- FIG. 4 is a first example of a scenario of use of the invention in one of its embodiments
- FIG. 5 is a second example of a scenario of use of the invention in one of its embodiments.
- FIG. 6 is a third example of a scenario of use of the invention in one of its embodiments.
- FIG. 7 is a fourth example of a scenario of use of the invention in one of its embodiments.
- FIG. 1 is an example of programming behaviors using elements of the BML language according to a document of the prior art.
- BML programming is for a virtual robot and not a physical robot. It does not generate commands directly executable by a physical robot.
- The system is able to control the behavior of a robot 10, which may be a humanoid robot with two lower limbs and two upper limbs, or a robot having the appearance of a two- or four-legged animal.
- The members of the robot may be provided with motorized joints, controlled by cards, possibly allowing it to move according to commands that are either embedded and interpreted by one or more processors, or transmitted from a server in a form directly executable by the embedded processor(s), or to be developed by said processors.
- The robot is advantageously endowed with a head and a body completing its humanoid or animal form. It is also advantageously equipped with sensors enabling it to position itself in its environment and to perceive sound or movement signals coming from said environment.
- The robot can also have other capabilities for acting on its environment, such as light or sound emission sources, or the ability to act on the Internet, for example by sending an e-mail, making a phone call, or modifying the environment through communication with a home automation system.
- The editing and control system is provided with a script editor 20 into which is entered a text intended to be reproduced by the robot in the form of sound signals.
- This entry can be made directly using a simple computer keyboard or by loading into the system a text-type file (*.doc, *.txt or other) or an HTML file (optionally designated by its URL).
- These files can also be received from a remote site, for example via a messaging system.
- the system or the robot is provided with a synthesis device 30 able to interpret the text of the script editor to produce sounds, which can be either words in the case of a humanoid robot, or sounds representative of the behavior of an animal.
- the sound synthesis device can also reproduce background sounds, for example background music which, possibly, can be played on a remote computer.
- the triggering of the reading of a story can be done during the reception of an external event to the robot such as:
- The system includes a library 40 of control tags to be inserted into the script editor via an insertion module 50.
- The insertion is effected by dragging the chosen command tag into the editor (a "drag and drop" gesture with a mouse connected to the computer on which the editor is installed).
- FIG. 3 is a view of a command tag tree provided for implementing the invention in one of its embodiments.
- The different command tags are grouped, by type of behavior to be generated, into directories, some of which are viewable next to the script editor.
- Forms of reading of the script: for example, the language in which the script will be read, in the case of a humanoid robot, may be chosen (French, English, American English, German, Russian, etc.); the voice may be a male, female or child voice; the tone may be more or less grave or acute; the speed may be more or less rapid; the intonation can be chosen according to the emotion that the robot is likely to convey according to the text of the script (affection, astonishment, anger, joy, remonstrance, etc.);
- Gestures accompanying the script: for example, arm movements upwards or forwards; striking a foot on the ground; movements of the head upwards, downwards, to the right or to the left, according to the impression to be communicated coherently with the script;
- Signal emissions: for example, if the robot is equipped with LEDs, these can be activated to express strong emotions "felt" by the robot while reading the text, or to generate eye blinks adapted to the form and speed of the speech.
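The reading parameters enumerated above (language, voice, tone, speed, intonation) could be grouped into a single settings object. A minimal sketch, assuming hypothetical field names and defaults not given in the patent:

```python
from dataclasses import dataclass


@dataclass
class VoiceSettings:
    """Hypothetical grouping of the script-reading parameters listed above."""
    language: str = "fr"         # e.g. "fr", "en", "de", "ru"
    voice: str = "female"        # male, female or child voice
    tone: float = 1.0            # lower = more grave, higher = more acute
    speed: float = 1.0           # speaking-rate multiplier
    intonation: str = "neutral"  # affection, astonishment, anger, joy, ...
```

A tone or intonation tag in the script would then amount to overriding one of these fields for the text that follows.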
- The tags inserted in the script trigger the execution of the corresponding commands by the robot via a control module 60B, which can call on the module for generating the signals transmitted by the robot in response to the commands.
- one of the features of the invention is to allow the automatic generation of commands consistent with the content of the script to be read by the robot.
- A first way to achieve this coupling is to have control tags tied to punctuation: in this case, a question mark will generate a tone command consistent with the interrogative mode, and likewise for an exclamation mark. Certain words or punctuation marks may also automatically generate a tag that will command a behavior consistent with that word. As non-limiting illustrations:
- quotation marks can be linked to a tag programmed to trigger a "quote" gesture, which consists of lifting the arms and opening and closing the fingers twice; adverbs or key words such as "here", "after", "many" can be linked to a tag programmed to trigger the corresponding gesture;
- the word "hello" may be linked to a tag programmed to automatically generate an arm-lift command tag;
- the expression "I love you" can be linked to a tag programmed to automatically generate a command for the LEDs of the face, making it blush;
- the expression "drive a car" can be linked to a tag programmed to generate a movement imitating the gesture indicated in the expression;
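The illustrations above boil down to a lookup from script content to behavior tags. A minimal sketch of such an automatic tagger, where the trigger-to-tag mapping and all tag names are illustrative assumptions:

```python
# Hypothetical content-to-tag mapping, echoing the examples above.
AUTO_TAGS = {
    "?":          "interrogative_tone",
    "!":          "exclamative_tone",
    '"':          "quote_gesture",
    "hello":      "raise_arm",
    "i love you": "blush_leds",
}


def auto_tag(script):
    """Return the behavior tags triggered by the words and punctuation
    present in the script."""
    lowered = script.lower()
    return [tag for trigger, tag in AUTO_TAGS.items() if trigger in lowered]
```

In the system described, these tags would then be handed to the control module like manually inserted ones.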
- certain commands can be commands for interrupting and waiting for an external event, such as a movement in response to a question posed by the robot.
- certain commands may be dependent on the reactions of the robot to its environment, as measured by means of an environmental monitoring module 70.
- This module interfaces with sensors embedded on the robot.
- A camera can identify elements of the robot's environment, such as another robot or a human facing the robot, or simply the presence or absence of spectators, so as to play the text only in the presence of an audience.
- The sensors can be, alternatively or in addition, ultrasonic sensors that detect movement in the environment close to the robot. In this case, if the movement is the response expected by the script, into which a wait command has been inserted, the following command is executed.
- The commands of the library are transformed into the syntax of the tag language used to implement the invention from commands generated by the editing and control system disclosed by the French patent application already cited.
- The editing and control system can allow the control of several robots, near or distant. The choice of the robot called upon to play a particular passage of a script may be left to the user, who can make this choice, for example, using tabs provided for this purpose in the interface of the editing and control system. This allocation of script passages corresponding to different characters to separate robots can also be automated, by embedding within the script particular tags corresponding to each of the characters.
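The automated allocation of passages to robots can be sketched as a split on per-character tags. The `[ROBOT Name]` syntax and function name below are illustrative assumptions, not the patent's actual notation:

```python
import re


def assign_passages(script):
    """Group script passages by character using hypothetical [ROBOT Name]
    tags; returns a dict mapping each character to its passages in order."""
    parts = re.split(r"\[ROBOT (\w+)\]", script)
    # parts = [preamble, name1, text1, name2, text2, ...]
    passages = {}
    for name, text in zip(parts[1::2], parts[2::2]):
        passages.setdefault(name, []).append(text.strip())
    return passages
```

Each robot in the commanded group would then receive only the passages keyed to its character.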
- FIG. 4 is a first example of a scenario of use of the invention in one of its embodiments.
- [LANG en] -> changes the current language of the robot; the robot is thus multi-language, and the editor supports all the languages of the robot (English, French, Italian, Spanish, German, etc.);
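The [LANG xx] tag shown above can be interpreted by scanning the script and switching the current language before synthesizing the text that follows. A minimal sketch, where the `say(text, lang)` callback and the default language are assumptions:

```python
import re


def speak_with_lang(script, say, default="en"):
    """Interpret [LANG xx] tags: each tag switches the current language
    for the text that follows; say(text, lang) is a caller-supplied
    synthesis callback (hypothetical interface)."""
    parts = re.split(r"\[LANG (\w+)\]", script)
    # parts = [text0, lang1, text1, lang2, text2, ...]
    if parts[0]:
        say(parts[0], default)
    for lang, text in zip(parts[1::2], parts[2::2]):
        if text:
            say(text, lang)
```

For example, `"Hello. [LANG fr]Bonjour."` would be spoken first in the default language, then in French.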
- Fig. 5 is a second exemplary use scenario of the invention in one of its embodiments.
- a graphical tag is inserted.
- This tag has the appearance of a "smiley" or emoticon, which is now a common language used in messaging and Internet chats. This type of tag makes the editing and control system of the invention even more intuitive to use.
- FIG. 6 is a third example of a scenario of use of the invention in one of its embodiments.
- Fig. 7 is a fourth exemplary use scenario of the invention in one of its embodiments.
- Figure 7 combines graphic tags and text tags.
- The user also has the possibility of renaming the tags as he wishes, and thus of replacing them with a word of his choice. He may also decide to create a new behavior and associate it with a word of his choice. For example, he can decide that every time he writes "Hi" in the editing and control system of the invention, the robot accompanies this word with a hand movement; the behavior containing the hand movement will therefore be associated with the tag "Hi".
- A tag can be programmed to correspond to several behaviors: thus, at each reading, the behavior can change (it is taken either randomly or sequentially, or it can depend on the preceding and following words and markers, that is to say, it is contextualized).
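The three selection modes just mentioned (random, sequential, contextual) can be sketched as follows; class and mode names are illustrative, and the contextual rule here is only a toy heuristic keyed on surrounding words:

```python
import random


class MultiBehaviorTag:
    """One tag bound to several behaviors; the active behavior is chosen
    randomly, cyclically, or from the surrounding context."""

    def __init__(self, behaviors, mode="cyclic"):
        self.behaviors = list(behaviors)
        self.mode = mode
        self._next = 0  # position for cyclic selection

    def pick(self, context=""):
        if self.mode == "random":
            return random.choice(self.behaviors)
        if self.mode == "cyclic":
            behavior = self.behaviors[self._next % len(self.behaviors)]
            self._next += 1
            return behavior
        # contextual: prefer a behavior named in the surrounding text
        for behavior in self.behaviors:
            if behavior in context:
                return behavior
        return self.behaviors[0]
```

Each reading of the script then calls `pick()` anew, so the same tag can yield a different behavior each time.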
- The invention can be implemented on a commercially available computer that communicates with one or more robots via wired or wireless links.
- The generation of commands is advantageously carried out by the editing and control system disclosed by French Patent Application No. 09/53434, already cited.
- The robot or robots are advantageously equipped with a three-level command and control architecture such as that disclosed by French patent application No. 08/01956000, also belonging to the applicant. They must be equipped with software able to interpret the commands sent by the command generator of the invention.
- the examples described above are given by way of illustration of embodiments of the invention. They in no way limit the scope of the invention which is defined by the following claims.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/383,196 US9205557B2 (en) | 2009-07-10 | 2010-07-12 | System and method for generating contextual behaviors of a mobile robot |
EP10784435A EP2451617A2 (fr) | 2009-07-10 | 2010-07-12 | Systeme et procede pour generer des comportements contextuels d'un robot mobile |
CN201080035241.XA CN102596516B (zh) | 2009-07-10 | 2010-07-12 | 用于产生移动机器人的情境行为的***和方法 |
JP2012518831A JP2012532390A (ja) | 2009-07-10 | 2010-07-12 | 移動ロボットのコンテキスト動作を生成するためのシステムおよび方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR0954837 | 2009-07-10 | ||
FR0954837A FR2947923B1 (fr) | 2009-07-10 | 2009-07-10 | Systeme et procede pour generer des comportements contextuels d'un robot mobile |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2011003628A2 true WO2011003628A2 (fr) | 2011-01-13 |
WO2011003628A3 WO2011003628A3 (fr) | 2011-03-03 |
Family
ID=42111526
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2010/004234 WO2011003628A2 (fr) | 2009-07-10 | 2010-07-12 | Systeme et procede pour generer des comportements contextuels d'un robot mobile |
Country Status (6)
Country | Link |
---|---|
US (1) | US9205557B2 (fr) |
EP (1) | EP2451617A2 (fr) |
JP (1) | JP2012532390A (fr) |
CN (1) | CN102596516B (fr) |
FR (1) | FR2947923B1 (fr) |
WO (1) | WO2011003628A2 (fr) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2947923B1 (fr) * | 2009-07-10 | 2016-02-05 | Aldebaran Robotics | Systeme et procede pour generer des comportements contextuels d'un robot mobile |
JP5698614B2 (ja) * | 2011-06-22 | 2015-04-08 | インターナショナル・ビジネス・マシーンズ・コーポレーションInternational Business Machines Corporation | コンテキスト情報処理システム及び方法 |
EP3178040A4 (fr) * | 2014-08-07 | 2018-04-04 | Okinawa Institute of Science and Technology School Corporation | Apprentissage par renforcement inverse par l'estimation de rapport de densite |
US10239205B2 (en) * | 2016-06-29 | 2019-03-26 | International Business Machines Corporation | System, method, and recording medium for corpus curation for action manifestation for cognitive robots |
US11443161B2 (en) * | 2016-12-12 | 2022-09-13 | Microsoft Technology Licensing, Llc | Robot gesture generation |
CN107186706B (zh) * | 2017-06-16 | 2019-08-09 | 重庆柚瓣家科技有限公司 | 养老机器人控制模块远程化的方法及*** |
CN109521927B (zh) * | 2017-09-20 | 2022-07-01 | 阿里巴巴集团控股有限公司 | 机器人互动方法和设备 |
JP2019118992A (ja) * | 2017-12-28 | 2019-07-22 | 株式会社日立ビルシステム | ロボット装置制御システム |
US10691113B1 (en) * | 2018-02-06 | 2020-06-23 | Anthony Bergman | Robotic process control system |
CN111098301B (zh) * | 2019-12-20 | 2020-08-18 | 西南交通大学 | 一种基于场景知识图谱任务型机器人的控制方法 |
CN111723557A (zh) * | 2020-06-30 | 2020-09-29 | 北京来也网络科技有限公司 | 基于ai的语段编辑方法、装置、设备以及存储介质 |
US20230236575A1 (en) * | 2022-01-26 | 2023-07-27 | Vivek Ramdas Pai | Computer-automated scripted electronic actor control |
Family Cites Families (44)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0561637A (ja) * | 1991-09-02 | 1993-03-12 | Toshiba Corp | 音声合成メールシステム |
JPH09185607A (ja) * | 1995-12-28 | 1997-07-15 | Dainippon Screen Mfg Co Ltd | ハイパーテキストレイアウト装置 |
DE19615693C1 (de) * | 1996-04-19 | 1997-12-11 | Siemens Ag | Vorrichtung und Verfahren zur Aktionsermittlung |
US5838973A (en) * | 1996-05-03 | 1998-11-17 | Andersen Consulting Llp | System and method for interactively transforming a system or process into a visual representation |
US6259969B1 (en) * | 1997-06-04 | 2001-07-10 | Nativeminds, Inc. | System and method for automatically verifying the performance of a virtual robot |
JPH11224179A (ja) * | 1998-02-05 | 1999-08-17 | Fujitsu Ltd | 対話インタフェース・システム |
JPH11327872A (ja) * | 1998-05-18 | 1999-11-30 | Nippon Telegr & Teleph Corp <Ntt> | 電子メール提示方法及び電子メール端末及び電子メール提示プログラムを格納した記憶媒体 |
US6629087B1 (en) * | 1999-03-18 | 2003-09-30 | Nativeminds, Inc. | Methods for creating and editing topics for virtual robots conversing in natural language |
US6947893B1 (en) * | 1999-11-19 | 2005-09-20 | Nippon Telegraph & Telephone Corporation | Acoustic signal transmission with insertion signal for machine control |
JP4032273B2 (ja) * | 1999-12-28 | 2008-01-16 | ソニー株式会社 | 同期制御装置および方法、並びに記録媒体 |
JP4465768B2 (ja) * | 1999-12-28 | 2010-05-19 | ソニー株式会社 | 音声合成装置および方法、並びに記録媒体 |
JP2001353678A (ja) * | 2000-06-12 | 2001-12-25 | Sony Corp | オーサリング・システム及びオーサリング方法、並びに記憶媒体 |
JP2002127062A (ja) * | 2000-08-18 | 2002-05-08 | Nippon Telegr & Teleph Corp <Ntt> | ロボットシステム、ロボット制御信号生成装置、ロボット制御信号生成方法、記録媒体、プログラムおよびロボット |
US6724409B1 (en) * | 2000-11-16 | 2004-04-20 | Hewlett-Packard Development Company L.P. | Tree-based graphical user interface for creating and editing machine control sequences |
US7367017B2 (en) * | 2001-01-31 | 2008-04-29 | Hewlett-Packard Development Company, L.P. | Method and apparatus for analyzing machine control sequences |
JP2002268663A (ja) * | 2001-03-08 | 2002-09-20 | Sony Corp | 音声合成装置および音声合成方法、並びにプログラムおよび記録媒体 |
US7813835B2 (en) * | 2002-03-15 | 2010-10-12 | Sony Corporation | Robot behavior control system, behavior control method, and robot device |
JP2003308142A (ja) * | 2002-04-17 | 2003-10-31 | Seiko Epson Corp | メッセージ処理システム、音声信号処理システム、メッセージ処理設備、メッセージ送信端末、音声信号処理設備、メッセージ処理プログラム、音声信号処理プログラム、設備用プログラム、端末用プログラム及びメッセージのデータ構造、並びにメッセージ処理方法、音声信号処理方法及びメッセージ生成方法 |
WO2004063883A2 (fr) * | 2003-01-09 | 2004-07-29 | Evolution Robotics, Inc. | Programmation de robots et/ou systemes informatiques en fonction de signes visuels et de l'environnement |
US7707135B2 (en) * | 2003-03-04 | 2010-04-27 | Kurzweil Technologies, Inc. | Enhanced artificial intelligence language |
JP2004287016A (ja) * | 2003-03-20 | 2004-10-14 | Sony Corp | 音声対話装置及び方法並びにロボット装置 |
JP2004318862A (ja) * | 2003-03-28 | 2004-11-11 | Sony Corp | 情報提供装置及び方法、並びに情報提供システム |
CN100351789C (zh) * | 2003-03-28 | 2007-11-28 | 索尼株式会社 | 信息提供设备、方法和信息提供*** |
US7689319B2 (en) * | 2003-08-12 | 2010-03-30 | Advanced Telecommunications Research Institute International | Communication robot control system |
US7434176B1 (en) * | 2003-08-25 | 2008-10-07 | Walt Froloff | System and method for encoding decoding parsing and translating emotive content in electronic communication |
JP2005088179A (ja) * | 2003-09-22 | 2005-04-07 | Honda Motor Co Ltd | 自律移動ロボットシステム |
JP2005199373A (ja) * | 2004-01-14 | 2005-07-28 | Toshiba Corp | コミュニケーション装置及びコミュニケーション方法 |
JP2005266671A (ja) * | 2004-03-22 | 2005-09-29 | Yamaha Corp | ロボット及び音声再生方法 |
JP2006155299A (ja) * | 2004-11-30 | 2006-06-15 | Sharp Corp | 情報処理装置、情報処理プログラム、および、プログラム記録媒体 |
JP2006263858A (ja) * | 2005-03-24 | 2006-10-05 | Advanced Telecommunication Research Institute International | コミュニケーションロボット |
GB2427109B (en) * | 2005-05-30 | 2007-08-01 | Kyocera Corp | Audio output apparatus, document reading method, and mobile terminal |
JP2007044825A (ja) * | 2005-08-10 | 2007-02-22 | Toshiba Corp | 行動管理装置、行動管理方法および行動管理プログラム |
JP4815940B2 (ja) * | 2005-08-17 | 2011-11-16 | 日本電気株式会社 | ロボット制御システム、ロボット装置、およびロボット制御方法 |
JP4718987B2 (ja) * | 2005-12-12 | 2011-07-06 | 本田技研工業株式会社 | インターフェース装置およびそれを備えた移動ロボット |
US7738997B2 (en) * | 2005-12-19 | 2010-06-15 | Chyi-Yeu Lin | Robotic system for synchronously reproducing facial expression and speech and related method thereof |
US20080263164A1 (en) * | 2005-12-20 | 2008-10-23 | Koninklijke Philips Electronics, N.V. | Method of Sending Motion Control Content in a Message, Message Transmitting Device Abnd Message Rendering Device |
CN100348381C (zh) * | 2006-01-06 | 2007-11-14 | 华南理工大学 | 家政服务机器人 |
JP2007232829A (ja) * | 2006-02-28 | 2007-09-13 | Murata Mach Ltd | 音声対話装置とその方法及びプログラム |
US20080071540A1 (en) * | 2006-09-13 | 2008-03-20 | Honda Motor Co., Ltd. | Speech recognition method for robot under motor noise thereof |
JP2008168375A (ja) * | 2007-01-10 | 2008-07-24 | Sky Kk | ボディランゲージロボット、ボディランゲージロボットの制御方法及び制御プログラム |
CN101020312A (zh) * | 2007-03-13 | 2007-08-22 | 叶琛 | 基于网络功能的机器人传递行为的方法和装置 |
JP5025353B2 (ja) * | 2007-07-03 | 2012-09-12 | ニュアンス コミュニケーションズ,インコーポレイテッド | 対話処理装置、対話処理方法及びコンピュータ・プログラム |
TWI338588B (en) * | 2007-07-31 | 2011-03-11 | Ind Tech Res Inst | Method and apparatus for robot behavior series control based on rfid technology |
FR2947923B1 (fr) * | 2009-07-10 | 2016-02-05 | Aldebaran Robotics | Systeme et procede pour generer des comportements contextuels d'un robot mobile |
-
2009
- 2009-07-10 FR FR0954837A patent/FR2947923B1/fr not_active Expired - Fee Related
-
2010
- 2010-07-12 US US13/383,196 patent/US9205557B2/en not_active Expired - Fee Related
- 2010-07-12 CN CN201080035241.XA patent/CN102596516B/zh not_active Expired - Fee Related
- 2010-07-12 EP EP10784435A patent/EP2451617A2/fr not_active Ceased
- 2010-07-12 WO PCT/EP2010/004234 patent/WO2011003628A2/fr active Application Filing
- 2010-07-12 JP JP2012518831A patent/JP2012532390A/ja active Pending
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013150076A1 (fr) | 2012-04-04 | 2013-10-10 | Aldebaran Robotics | Robot apte a integrer des dialogues naturels avec un utilisateur dans ses comportements, procedes de programmation et d'utilisation dudit robot |
US10052769B2 (en) | 2012-04-04 | 2018-08-21 | Softbank Robotics Europe | Robot capable of incorporating natural dialogues with a user into the behaviour of same, and methods of programming and using said robot |
JP2015524934A (ja) * | 2012-04-04 | 2015-08-27 | アルデバラン ロボティクス | ユーザとの自然対話をロボットの挙動に組み込むことができるロボットならびに前記ロボットをプログラムする方法および使用する方法 |
CN104470686A (zh) * | 2012-06-01 | 2015-03-25 | 奥尔德巴伦机器人公司 | 用于生成被实时地执行的移动机器人的上下文行为的***和方法 |
FR2991222A1 (fr) * | 2012-06-01 | 2013-12-06 | Aldebaran Robotics | Systeme et procede pour generer des comportements contextuels d'un robot mobile executes en temps reel |
WO2013178741A1 (fr) | 2012-06-01 | 2013-12-05 | Aldebaran Robotics | Systeme et procede pour generer des comportements contextuels d'un robot mobile executes en temps reel |
EP2933067A1 (fr) | 2014-04-17 | 2015-10-21 | Aldebaran Robotics | Procédé de réalisation de dialogue multimodal entre un utilisateur et un robot humanoïde, produit de programme informatique et robot humanoïde mettant en oeuvre ce procédé |
EP2933070A1 (fr) * | 2014-04-17 | 2015-10-21 | Aldebaran Robotics | Procédés et systèmes de manipulation d'un dialogue avec un robot |
WO2015158878A1 (fr) * | 2014-04-17 | 2015-10-22 | Aldebaran Robotics | Procédés et systèmes de traitement d'un dialogue avec un robot |
AU2015248713B2 (en) * | 2014-04-17 | 2018-03-29 | Softbank Robotics Europe | Method of performing multi-modal dialogue between a humanoid robot and user, computer program product and humanoid robot for implementing said method |
US10242666B2 (en) | 2014-04-17 | 2019-03-26 | Softbank Robotics Europe | Method of performing multi-modal dialogue between a humanoid robot and user, computer program product and humanoid robot for implementing said method |
AU2018202162B2 (en) * | 2014-04-17 | 2020-01-16 | Softbank Robotics Europe | Methods and systems of handling a dialog with a robot |
US10583559B2 (en) | 2014-04-17 | 2020-03-10 | Softbank Robotics Europe | Humanoid robot with an autonomous life capability |
US10915709B2 (en) * | 2016-04-28 | 2021-02-09 | Masoud Amri | Voice-controlled system |
Also Published As
Publication number | Publication date |
---|---|
CN102596516B (zh) | 2016-04-13 |
US9205557B2 (en) | 2015-12-08 |
EP2451617A2 (fr) | 2012-05-16 |
FR2947923B1 (fr) | 2016-02-05 |
JP2012532390A (ja) | 2012-12-13 |
CN102596516A (zh) | 2012-07-18 |
WO2011003628A3 (fr) | 2011-03-03 |
FR2947923A1 (fr) | 2011-01-14 |
US20120197436A1 (en) | 2012-08-02 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080035241.X Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10784435 Country of ref document: EP Kind code of ref document: A2 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012518831 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010784435 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13383196 Country of ref document: US |