US20090306823A1 - Method and System for Robot Generation - Google Patents

Method and System for Robot Generation

Info

Publication number
US20090306823A1
Authority
US
United States
Prior art keywords
robotic
robotic device
components
environment
task
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/499,411
Other languages
English (en)
Inventor
Hansjorg Baltes
Jack Elmin Peterson
Shawn Samuel Schaerer
Xiao-Wen Terry Liu
Brian P. McKinnon
Sara Epp
Vergil Kanne
Shane Yanke
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US12/499,411
Publication of US20090306823A1
Priority to US13/399,505 (published as US9671786B2)
Legal status: Abandoned

Links

Images

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0227Control of position or course in two dimensions specially adapted to land vehicles using mechanical sensing means, e.g. for sensing treated area
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0255Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultra-sonic signals
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0259Control of position or course in two dimensions specially adapted to land vehicles using magnetic or electromagnetic means
    • G05D1/0261Control of position or course in two dimensions specially adapted to land vehicles using magnetic or electromagnetic means using magnetic plots
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0272Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising means for registering the travel distance, e.g. revolutions of wheels
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D1/0278Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using satellite positioning signals, e.g. GPS

Definitions

  • the invention relates generally to the field of robotics, and more specifically to improved methods and systems for generating robot devices.
  • a method for the automated generation of a robotic device in whole or in part comprises the steps of eliciting and receiving user input to determine a task specification for one or more tasks for the robotic device; determining a task list comprising one or more tasks based on the provided task specification; determining based on the task list provided, one or more movement components, one or more processing components, and logic components required to execute one or more tasks; and generating the logic components required to execute one or more tasks, and embedding the logic components onto a recordable medium associated with the robotic device.
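The four claimed steps (elicit a task specification, expand it into a task list, map the list to components, then generate the logic components) can be sketched as a simple pipeline. This is an illustrative sketch only; every function name, task name and component mapping below is invented for demonstration and is not taken from the patent's implementation.

```python
def elicit_task_specification(answers):
    """Turn a user's elicited answers into a task specification."""
    return {"environment": answers.get("environment", "indoor"),
            "activity": answers.get("activity", "patrol")}

def determine_task_list(spec):
    """Expand the high-level specification into concrete tasks."""
    expansions = {"patrol": ["localize", "plan_path", "navigate"],
                  "clean":  ["localize", "cover_area", "avoid_obstacles"]}
    return expansions.get(spec["activity"], [spec["activity"]])

def determine_components(task_list):
    """Map each task to movement, processing and logic components."""
    catalog = {"localize":        {"movement": "wheel_encoders", "logic": "particle_filter"},
               "plan_path":       {"logic": "a_star"},
               "navigate":        {"movement": "drive_motors", "logic": "pid_controller"},
               "cover_area":      {"logic": "coverage_planner"},
               "avoid_obstacles": {"movement": "bumper", "logic": "reactive_avoidance"}}
    return [catalog[t] for t in task_list if t in catalog]

def generate_robot(answers):
    spec = elicit_task_specification(answers)
    tasks = determine_task_list(spec)
    components = determine_components(tasks)
    # In the claimed method the logic components would now be generated and
    # embedded onto a recordable medium associated with the robotic device.
    return {"spec": spec, "tasks": tasks, "components": components}
```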
  • a method of creating a robotic device wherein an environment for the robotic device to operate in is provided, the method comprises inputting a map that defines the area of the environment; inputting a starting point indicating a position within the area that the robotic device is to start operating from; inputting one or more stopping points along a route emanating from the starting point that the robotic device is to follow; and inputting one or more tasks to be completed by the robotic device at the starting point, stopping points or along the route.
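The environment inputs enumerated above (map, starting point, stopping points, and per-point tasks) can be gathered into a single specification object. The following is an illustrative sketch only; the class and field names are invented for demonstration and do not appear in the patent.

```python
from dataclasses import dataclass, field

@dataclass
class EnvironmentSpec:
    """Illustrative container for the claimed environment inputs."""
    grid: list          # occupancy map of the area: 0 = free, 1 = obstacle
    start: tuple        # starting point (row, col) within the area
    stops: list = field(default_factory=list)   # ordered stopping points
    tasks: dict = field(default_factory=dict)   # point -> task to complete there

    def route(self):
        """The route emanating from the start through each stopping point."""
        return [self.start] + self.stops

    def task_at(self, point):
        """Task the robotic device is to complete at a given point, if any."""
        return self.tasks.get(point, "none")
```

A map-editor front end such as the one shown in FIGS. 9-13 could populate such a structure from user input.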
  • a method of specifying one or more tasks for a robotic device to complete comprises specifying whether the robotic device is a physical robotic device or a virtual robotic device; specifying a location for the robotic device to operate in; specifying an activity for the robotic device to undertake; and providing detailed information regarding the activity that the robotic device is to undertake.
  • a method of completing one or more tasks by a plurality of robotic devices operating in a team environment comprises selecting one or more of the plurality of the robotic devices to be a controlling robotic device; determining at the controlling robotic device whether new robotic specifications are required for one or more of the plurality of robotic devices; transmitting requests for new robotic specifications from the controlling robotic device to a robotic server; and receiving the new robotic specifications at the one or more of the plurality of robotic devices.
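The team method above can be sketched as follows. All names, the trivial controller-selection rule, and the capability model are invented for illustration; the patent does not prescribe them.

```python
class RoboticServer:
    """Stand-in for the robotic server that serves new specifications."""
    def __init__(self, specs):
        self.specs = specs            # robot_id -> stored specification

    def fetch_spec(self, robot_id):
        return self.specs[robot_id]

class TeamRobot:
    def __init__(self, robot_id, capabilities):
        self.robot_id = robot_id
        self.capabilities = set(capabilities)
        self.spec = None              # filled in when a new spec is received

def run_team_task(team, required, server):
    """Select a controlling robot; it requests new specifications from the
    server for any teammate lacking a required capability."""
    controller = team[0]              # simplest possible selection rule
    for robot in team:
        if not required <= robot.capabilities:   # new specification needed?
            robot.spec = server.fetch_spec(robot.robot_id)
    return controller
```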
  • FIG. 1 is a block diagram illustrating components of a robot generation system
  • FIG. 2 is a block diagram illustrating the components of a robotic device
  • FIG. 3 is a block diagram illustrating the components of the generator
  • FIG. 4 is a diagram illustrating the components of the knowledge base module
  • FIG. 6 is a flowchart illustrating the steps of a task specification method
  • FIG. 7 is a flowchart illustrating the steps of a task generation method
  • FIG. 8 is a flowchart illustrating the steps of an environment specification method
  • FIG. 9 is a screenshot of a sample map editor window
  • FIG. 10 is another screenshot of a sample map editor window
  • FIG. 11 is another screenshot of a sample map editor window
  • FIG. 12 is another screenshot of a sample map editor window
  • FIG. 13 is another screenshot of a sample map editor window
  • FIG. 14A and FIG. 14B are sample screenshots of task specification windows.
  • Reference is first made to FIG. 1 , where a block diagram illustrating the components of a robot generation system 10 is shown.
  • the robot generation system 10 is used to generate robotic devices 12 which may be real robotic devices, virtual robotic devices and robotic devices that comprise a mixture of real and virtual components referred to as mixed reality robotic devices.
  • the system 10 in an exemplary embodiment is comprised of one or more robotic devices 12 .
  • a group of one or more robotic devices 12 may be referred to as a robotic team 14 .
  • the robotic devices 12 interact with and receive instructions through a robotic server 20 .
  • the robotic server 20 is used to receive instructions regarding a robotic device 12 that is created and performs the processing required to determine the components associated with a robotic device 12 .
  • the robotic server 20 is also referred to as the robotic controller, and in an exemplary embodiment, as described below is comprised of a robot generator application 22 , a robot generation interface 24 , and a knowledge database 26 .
  • the robotic server 20 receives instructions from one or more users who are communicating with the robotic server 20 through a computing device 30 .
  • the computing device 30 interacts with the robotic server 20 through a communication network of any kind 31 .
  • the robotic device 12 operates in a defined environment and by implementing the task specification performs one or more tasks that have been defined for the device 12 .
  • the environment that the robotic device 12 operates in may include real settings, simulated settings or mixed reality settings. Examples of such real environments include, but are not limited to, home environments, office environments, parks, shopping centres, malls, processing and manufacturing plants, oil drilling and mining environments, refineries, flight and space environments, medical and microscopic environments, and any other environment that may be described with respects to its area and potential obstacles.
  • the user will generally describe the location of the environment (i.e. indoor, outdoor or both), the type of environment (i.e. virtual environment, real environment or mixed), the type of control (i.e. user controlled or self controlled).
  • the user may further specify points of interest, starting points, directions of travel and tasks for completion for a robotic device within the environment.
  • Standard distinctions include, but are not limited to: indoor, outdoor or mixed environments; virtual, physical or mixed reality; and self-controlled, user-controlled or user-guided operation. Other distinctions, such as travelling between waypoints, fully covering a given area, wandering, specific obstacles, or specific tasks, may be specified relative to a particular application.
  • Simulated settings include any virtual environment that is defined in which a virtual robotic device 12 may operate, and examples of such virtual environments include virtual environments that are used in simulations such as video games and environments that provide for training through simulations.
  • Examples of such training environments may include, but are not limited to, simulated operating room environments, simulated disaster/rescue environments, and simulated space environments.
  • Mixed reality settings may include any environment that is defined in which a real, virtual or mixed reality robotic device may operate, and examples of such mixed reality environments include, but are not limited to, real environments where a part of the environment is a virtual environment projected into or onto the real environment.
  • the tasks may include any activities or actions that are defined for the robotic device to complete within the defined environment.
  • the robotic team 14 may include any number of robotic devices 12 .
  • the robotic devices 12 that are part of a robotic team collaborate to perform tasks that have been defined for robotic devices 12 in a particular environment.
  • the robotic devices 12 within a robotic team 14 may interact with one another to complete defined tasks.
  • the interaction between robotic devices 12 may consist of the exchange of sensor data, action plans and/or control instructions, and of controlling the overall operation of the respective other robotic devices 12 .
  • the robotic server 20 is a computing device that is used to generate robotic devices, and more specifically, the firmware, hardware and software (referred to as logic components) that are associated with a robotic device 12 .
  • the robotic server 20 is engaged through the use of computing stations 30 when a user wishes to initiate the process by which a robot is generated, or to interact with an already generated robotic device 12 .
  • the robotic server 20 may be accessed by multiple computing stations 30 through a communication network 31 .
  • the server 20 and computing station may reside on the same computer, and need not be separate devices.
  • the robotic server 20 may be a web application, cloud computer, networked computer, personal computer, handheld computer or may be found on a mobile device.
  • the robotic generator application 22 is resident upon the robotic server 20 in an exemplary embodiment.
  • the robot generator application 22 in alternative embodiments may be distributed across two or more devices 12 .
  • the robotic generator application 22 therefore may be accessed by multiple computing devices.
  • the robotic generator application 22 is described in further detail with respect to FIG. 3 .
  • the robot generator application 22 when engaged allows for a description of the robotic device 12 that is to be generated to be received, such that an appropriate design solution may be created.
  • the robotic generation interface 24 is used to receive instructions from users concerning the design of a robotic device 12 and to interact with an already created robotic device 12 .
  • the generation interface 24 also generates a customized interface according to the specifications of the robotic device 12 , to allow the user to control the operation of the device 12 as required.
  • the knowledge base module 26 in an exemplary embodiment is a data storage or repository that stores multiple types of information that are used in the generation of components for the robotic devices 12 .
  • multiple types of knowledge bases are provided for, where the knowledge bases are accessed and used by the robotic generator application 22 to determine the specifications of a robot and to create the robotic device 12 .
  • the components may include mechanical components, processing components, and logic components.
  • the specification is provided in XML format, and may represent actual software code, or specifications.
  • with regard to the mechanical components associated with robotic devices, in an exemplary embodiment the specification comprises a description of the mechanical components that should be used in the assembly of the robotic device 12 .
  • the mechanical components may include, but are not limited to, wheels, joints, gears, chassis, grippers, sensor and actuators.
  • the mechanical components may be described in relation to any combination of the manufacturer, manufacturer's instructions, device type, electrical requirements, firmware interfaces, size, weight and cost.
  • the specifications are provided in a file format such as XML in an exemplary embodiment. After receiving such a specification, the physical components of the robotic device 12 may be manually assembled, or alternatively may be assembled through automated processes.
  • the robotic device 12 once assembled, then has the appropriate software/firmware that is required for its operation downloaded to the robotic device 12 .
  • the fabrication module 28 creates the software/firmware for the robotic device 12 .
  • the computing devices 30 that may engage the robotic generation server 20 may be any type of computing device that a user may be able to provide input to, and include but are not limited to laptop computers, slimline computers, mainframe computers, handheld computers, video game type console computers, cloud computers, cellular devices and any other computing device or devices that has processing capabilities and may receive input.
  • the robotic device 12 in an exemplary embodiment is comprised of one or more sensors 32 , one or more actuators 34 , one or more communication channels 36 , at least one processor 38 and power sources and memory storages (not shown).
  • the robotic device 12 is comprised of various hardware, software and firmware components (logic components) that may be stored externally, internally or on a combination of both to the robotic device 12 .
  • the sensors 32 associated with the robotic device may include visual sensors, audio sensors, and temperature sensors or any other sensors that are used to determine characteristics regarding the environment within which the robotic device 12 operates.
  • the sensed characteristics are then transmitted to the processor 38 , where they may then optionally be transmitted to devices including the robotic controller that allow for external control of the robotic device, or to other robotic devices 12 that are part of a robotic team 14 .
  • the actuators are able to perform various actions within a respective environment, and examples of such actions include movement within the environment and engaging with one or more objects within the environment.
  • the communication channels 36 allow for wired or wireless communication with external components or devices. Communications that are sent to, and received from external communication devices allow for exchange of sensor data as well as instructions to be received that allow for control of the device 12 .
  • the processor 38 associated with the robotic device is used to execute commands and receive and send instructions such that the robotic device 12 will operate in a specified manner.
  • the generator application 22 is used to specify the purpose of the creation of the robotic device 12 , and then to create the robotic device without the need of any further low-level specification or programming by a developer.
  • the generator application 22 will therefore allow for the automatic generation of the various hardware, software and firmware components that are to be associated with the robotic device 12 based on user specifications.
  • the generator application 22 comprises the following components, a task specification module 40 , a task generator module 42 , a robot generator module 44 , an interface generator module 46 , and a communication generator module 48 .
  • the various components associated with the generator application 22 are able to interact with the components associated with the knowledge module 26 as described in further detail below with reference to FIG. 4 and this therefore allows the generation of the components associated with the robotic device 12 .
  • the task specification module 40 is engaged by a user through the generator application 22 which provides the purpose and other details associated with the creation of a robotic device 12 .
  • the user provides descriptions of the tasks that are to be associated with a robotic device 12 , as described in further detail below.
  • the user engages the task specification module 40 through a user interface that may include multiple types of interaction with the user.
  • a task description associated with the creation of a robotic device 12 may first be determined based on any combination of questions and answers, multiple choice questions, pictorial engagement where the user may manipulate graphical objects that may be used to provide descriptions of the tasks.
  • the user may provide task descriptions of tasks that have been previously accomplished, or for tasks that are similar to tasks that have previously been accomplished.
  • the task specification module 40 first receives from the user descriptions of the tasks that the robotic device 12 will implement, along with the skill level associated with the user (this may range from a beginner to an experienced user). Based on the specified skill level, any exchange of communication between the user and the respective interface is adjusted to the appropriate level.
  • the functionality that is provided to the user may vary depending on the skill-level and associated privileges given to a user. In certain embodiments, advanced users may be provided with the ability to specify a higher level of detail that is used for the generation of a robotic device 12 .
  • upon receiving a description of the task specification, the task specification module 40 , as detailed below, creates a specification of the tasks that are to be performed by the robotic device 12 and any information related to constraints associated with the environment in which the robotic device 12 operates. The constraints and tasks that have been specified are then used by the task generator module 42 to generate the appropriate hardware/software specifications that are to be used by the robotic device 12 .
  • task specification details may be constructed using an expert system that employs a series of questions and choices to narrow the specification from a generalized range of possibilities to one or more focused and specific tasks.
  • learning algorithms may be used, where previous actions are examined to help determine future actions. Further, case-based reasoning algorithms may also be used.
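The expert-system narrowing described above, in which a series of questions and choices reduces a generalized range of possibilities to a specific task, can be sketched as a walk down a decision tree. The questions, choices and resulting task labels below are invented for demonstration.

```python
# Toy decision-tree "expert system": each internal node asks a question,
# each leaf string is a focused, specific task specification.
QUESTION_TREE = {
    "question": "Where will the robot operate?",
    "choices": {
        "indoor": {
            "question": "What activity?",
            "choices": {"clean":  "indoor floor-cleaning task",
                        "patrol": "indoor security-patrol task"},
        },
        "outdoor": {
            "question": "What activity?",
            "choices": {"mow":    "outdoor lawn-mowing task",
                        "survey": "outdoor survey task"},
        },
    },
}

def narrow_specification(answers, node=QUESTION_TREE):
    """Walk the tree using the user's answers until one task remains."""
    for answer in answers:
        node = node["choices"][answer]
        if isinstance(node, str):        # reached a focused, specific task
            return node
    return node["question"]              # more input is needed
```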
  • the task generator module 42 receives various inputs in order to generate the hardware/software specifications and this is described in further detail below.
  • the task generator module 42 receives a high level specification of a task that has been provided by the user. If a similar or identical task has previously been implemented, the task generator module 42 retrieves the specifications from the appropriate knowledge database. The user may provide input to further clarify or specify a pre-existing task specification. A new task may also be composed of several pre-existing tasks.
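Retrieving the specification of a similar previously implemented task can be sketched as fuzzy lookup against a library of past cases. The case library and threshold below are invented for the demonstration; only the retrieve-if-similar idea comes from the text.

```python
import difflib

# Invented library of previously implemented tasks and their stored specs.
CASE_LIBRARY = {
    "vacuum the living room": {"hardware": "diff-drive", "software": "coverage"},
    "patrol the warehouse":   {"hardware": "diff-drive", "software": "waypoint"},
}

def retrieve_specification(task, threshold=0.6):
    """Return the stored spec of the most similar past task, if close enough;
    otherwise return None so a new specification must be composed."""
    best = difflib.get_close_matches(task, CASE_LIBRARY, n=1, cutoff=threshold)
    return CASE_LIBRARY[best[0]] if best else None
```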
  • the task generator in an exemplary embodiment, as is further described with regards to FIG. 5 , receives as input the tasks that have been specified for a particular robotic device, information pertaining to the environment within which it will operate, and processing requirements.
  • Information regarding the environment within which a robotic device 12 operates includes, but is not limited to, information pertaining to the dimensions of the area, obstacles, and environmental conditions. Physical capabilities and costs may also be provided as aspects of the task specification.
  • the processing requirement determinations are made once a determination is made as to the tasks that a robotic device 12 is to implement based on the computation requirements that are associated with implementations of such tasks.
  • the respective knowledge database that the generator module 42 interacts with stores data pertaining to the processing requirements that are associated with various tasks. For example, the knowledge database 62 will specify for any particular task the type of processor required (i.e. a micro-controller, or a larger processor such as a desktop processor are two such examples).
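The processor determination described above amounts to a lookup: each task carries a processing requirement, and the robotic device needs the largest processor class demanded by any of its tasks. The task names and requirement table below are invented; the micro-controller/desktop-processor distinction is the one the text gives.

```python
# Ordering of processor classes, smallest to largest.
PROCESSOR_CLASS = {"micro-controller": 0, "embedded-cpu": 1, "desktop-cpu": 2}

# Invented stand-in for the per-task requirements stored in knowledge database 62.
TASK_REQUIREMENTS = {
    "line_following":  "micro-controller",
    "path_planning":   "embedded-cpu",
    "computer_vision": "desktop-cpu",
}

def required_processor(task_list):
    """Smallest processor class that satisfies every task on the list."""
    hardest = max(task_list, key=lambda t: PROCESSOR_CLASS[TASK_REQUIREMENTS[t]])
    return TASK_REQUIREMENTS[hardest]
```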
  • the task generator module 42 first receives the specification of the tasks, and then develops a task list.
  • the development of the task list is based on interaction with the task generator knowledge database 62 , as is described with reference to FIG. 7 . Based on the interaction with the task generator knowledge database 62 , as explained in further detail below, a task list is specified in which the tasks that a robotic device 12 executes are set out. The task generator module 42 also develops, based on interaction with the task generator knowledge database 62 , various control system functionalities. Control system functionalities refer to control of the lower-level functionality of the robotic device 12 , which may relate to movement capabilities, for example.
  • One example of determining the constraints associated with a robotic device 12 is to take the example of a robotic device that has to perform multiple similar movements on a repeated basis. Such a robot requires a high degree of stability and repeatability in its movements, as future movements may not be possible if prior movements fail.
  • the respective knowledge database and generator module determine the appropriate algorithms that are to be implemented by the robotic device 12 to implement the tasks from the tasks list. These algorithms may include, but are not limited to, localization algorithms, path planning algorithms, computer vision algorithms, pattern recognition algorithms, learning algorithms, behaviour patterns, heuristics, and case-based reasoning.
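As one concrete example of a logic component the generator might emit for the task list, a path-planning algorithm can be as simple as breadth-first search over the occupancy map of the environment. This is an intentionally minimal stand-in for the path-planning algorithms named above, not the patent's own implementation.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Shortest 4-connected path on an occupancy grid (0 = free, 1 = obstacle).
    Returns the list of cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    frontier, parent = deque([start]), {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:                       # reconstruct path back to start
            path = []
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
               and grid[nr][nc] == 0 and (nr, nc) not in parent:
                parent[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None                                # goal unreachable
```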
  • the various hardware/software/and firmware specifications are referred to as logic components.
  • the operation of the task specification module 40 is further described with respect to the task specification method 150 shown in FIG. 6 , and the interaction between the task generator module 42 and the task generator knowledge database 62 is described in further detail with respect to FIG. 7 .
  • the interface generator module 46 generates an appropriate interface that allows a user to interact with a robotic device 12 . Specifically, the interface generator 46 receives information regarding the type of interface (i.e. wired or wireless) which determines characteristics of the interface (e.g., maximum bandwidth, reliability), and a description of the type of computing device that the robotic device 12 will communicate with (i.e. a PDA, desktop computer). The type of computer device determines the screen sizes, computing power, and storage characteristics of the interface. The method by which the interface generator module 46 determines the interface components is described in further detail with regards to the interaction between the appropriate knowledge base and the interface generator module 46 .
  • the communication generator module 48 is used to generate the various communication tools that are used by the robotic devices 12 .
  • the communication generator module 48 determines the method employed by various devices 12 to communicate. The determinations in an exemplary embodiment include the specification of which protocol is used to communicate; this may include protocols including, but not limited to, UDP, RF, Morse code, vision-based communication, and natural language communication.
  • the communication generator module 48 interacts with the communication knowledge base 68 to determine the appropriate communication tools that are to be used by the robotic device 12 .
  • the knowledge base module 26 comprises a task specification knowledge database 60 , a task generator knowledge database 62 , a robot generator knowledge database 64 , an interface generator knowledge database 66 , and a communications generator knowledge database 68 .
  • the respective knowledge databases that are described in exemplary embodiments herein are described with respect to information that they contain therein that is used to generate one or more components of the robotic device 12 .
  • the task generator knowledge database 62 stores information related to task considerations, task specific information, domain information, and historical information.
  • the task generator knowledge database 62 stores the requirements associated with previous robotic systems. For example, for a robotic system 12 that was previously built, the database 62 will store information pertaining to the processing requirements for a particular task, the hardware and software requirements, and any other such information.
  • the robot generator knowledge database 64 stores information that allows the robot generator module 44 to perform its respective tasks.
  • the robot generator knowledge database 64 stores information pertaining to material considerations associated with the design of a robotic device 12 (i.e. weight and cost considerations), information regarding previous designs, environmental considerations (i.e. information pertaining to how specific robotic components may be affected by the environment), processing capabilities, and information regarding available components for any robotic devices, including the actuator and sensor components that may be used in any design.
  • the interface generator knowledge database 66 stores information pertaining to considerations that are associated with the choice of an interface, for example including information pertaining to interface designs for previously generated robotic devices 12 . Specifically, the interface generator knowledge database 66 stores information pertaining to system considerations related to the interface, including the platform on which it resides, the capabilities associated with the associated robotic device 12 and its components, physical connections between robotic device components and other information related to how the robotic device 12 may be affected by environmental conditions. With regards to system considerations, the interface generator knowledge database 66 stores information pertaining to the hardware and software associated with a particular interface. The hardware associated with a particular interface may include considerations related to the display type and information regarding components associated with the interface, including audio components, input components, and video components. With regards to software, the interface generator knowledge database 66 stores information pertaining to any GUI libraries and controls that were used in previous interface designs and that may be used in subsequent interface designs.
  • the communications generator knowledge database 68 stores information pertaining to how the robotic device 12 may communicate, and includes information pertaining to the communication devices and the associated specifications of previously designed robotic devices.
  • the communications generator knowledge database 68 , in an exemplary embodiment, stores information relating to the requirements and capabilities for communication that may be associated with specific tasks, and information relating to hardware and software considerations for the communication devices of various robotic devices 12 .
  • the knowledge database will generally store information pertaining to various components and design information that it may use to implement a task that has been specified in the task specification.
  • Various information is stored in the knowledge databases specifically related to whether specific components are suitable for a specific task, including information pertaining to whether there are any specific requirements (including pre- and post-conditions that must be satisfied in connection with the use of any component).
  • Component information may include details regarding the general suitability of components for certain devices. For example, propellers are generally suited for water-based devices. Other information regarding the components may include specific suitability details (i.e. sonar sensors with a 0.8 m range are suitable for robotic devices not larger than 1 m × 1 m × 1 m), past uses, or inferred future uses based on defined suitability and past usage.
  • General descriptions of particular devices also allow any suitable device to be selected by a user or the generator from a selection of devices. For example, a task specification may only indicate that a light-sensor device is required, allowing the generator or program to choose any device that is described as a light sensor in the system to implement this role (i.e. the cheapest light sensor available could be chosen).
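The selection described above (e.g. choosing the cheapest available component matching a described role) can be sketched as follows. This is a minimal illustrative sketch only; the catalog entries, field names, and costs are assumptions, not part of the disclosed system.

```python
# Hypothetical component catalog, as a knowledge database might describe it.
# All names and values are illustrative assumptions.
CATALOG = [
    {"name": "lux-basic", "role": "light sensor", "cost": 4.50},
    {"name": "lux-pro", "role": "light sensor", "cost": 12.00},
    {"name": "sonar-08", "role": "sonar sensor", "cost": 9.75},
]

def select_component(role, catalog, key=lambda c: c["cost"]):
    """Return the catalog entry matching `role` that minimizes `key`
    (by default, the cheapest such component), or None if none match."""
    candidates = [c for c in catalog if c["role"] == role]
    if not candidates:
        return None
    return min(candidates, key=key)

# A task specification that only names a role lets the generator pick
# any suitable device -- here, the cheapest light sensor.
cheapest = select_component("light sensor", CATALOG)
```

The same lookup generalizes to other selection criteria by swapping the `key` function, e.g. lowest weight instead of lowest cost.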
  • the number of generators that are associated with the creation of a robotic device 12 may vary and the method by which the generators are accessed is described in further detail below.
  • At step 106 , the various components, including the hardware and software components that are included in the robotic device, are defined based on the task list that has been specified.
  • Method 100 then proceeds to step 108 , where a control interface is defined.
  • the control interface is defined by the interface generator module through engaging with the interface generator knowledge database 66 .
  • the control interface allows for the user to interact with the robotic device 12 by providing further instructions to the robotic device 12 , and by receiving updates as to the status of the robotic device 12 and its interactions within the specific environment.
  • Method 100 then proceeds to step 110 , where the communication components that allow for the robotic device 12 to communicate with other external devices or other robotic devices 12 or objects within an environment are defined.
  • Method 100 has been described with respect to each module of the generator application 22 being engaged to provide one or more components associated with the robotic device. It should be noted that the robot creation method 100 operates, in an exemplary embodiment, by first determining whether any specific components require generation and, where certain components require generation, engaging the appropriate modules. Also, construction of a robotic device may comprise physical or virtual components, and an RSOC may exist in a simulated or virtual manner.
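The conditional engagement of generator modules described for method 100 can be sketched as a simple dispatch. The module names and return values below are illustrative assumptions; only the pattern (engage a generator only when its component requires generation) reflects the description above.

```python
# Illustrative generator modules; each would in practice consult its
# knowledge database. Names and outputs are assumptions.
def hardware_generator(spec):
    return "hardware defined"

def interface_generator(spec):
    return "interface defined"

def communications_generator(spec):
    return "communications defined"

GENERATORS = {
    "hardware": hardware_generator,
    "interface": interface_generator,
    "communications": communications_generator,
}

def generate_robot(task_spec, required_components):
    """Engage only the generator modules whose components require generation."""
    results = {}
    for component in required_components:
        generator = GENERATORS.get(component)
        if generator is not None:  # skip components that need no generation
            results[component] = generator(task_spec)
    return results
```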
  • the task specification method 150 is used to generate a set of specifications for the tasks that a robotic device 12 is to perform.
  • Method 150 begins at step 152 , where the user provides descriptions of the environment within which the robotic device 12 will operate. The user may provide various details regarding the environment. For example, if the user has specified their backyard (for which co-ordinates have been provided or are available through the knowledge base), the user provides landmark or constraint information associated with the environment, and may, for example, specify landmarks such as pools and flower beds. Based on the landmarks that have been defined, constraints may be specified, which would include instructions to avoid those landmarks.
  • the user may provide information regarding the navigational constraints that may be present in the particular environment, and a description of other objects that may be present in the environment (for example, landmarks). Further, the user may provide a map of the environment within which the robotic device will operate. A map may be provided through one or more methods, including a sketching or drawing program, or by uploading a map that is recognizable to the system 10 . A further description of the provision of maps is provided with reference to FIG. 8 . Maps may also be created dynamically using sensor-provided data or by accessing existing data systems containing pertinent information (e.g., Google Maps).
  • If at step 154 it is determined that the user has provided sufficient information regarding the environment, method 150 then proceeds to step 156 . Based on a task that has been specified by the user, more detailed specific information is requested from the user regarding the task. For example, where the user has specified that a robotic device is to cut grass, then further information regarding time, speed, date, length, obstacles, area and terrain must be provided. If certain information is not provided or is inconsistent, method 150 returns to step 152 , with specific questions or directions to the user for more or revised specifications.
  • the user provides information regarding the task description that the robotic device 12 is to carry out.
  • the user may provide information through more than one method, including through answers in response to multiple-choice questions, or selecting from one or more tasks already defined.
  • the input which the user provides in response to the prompts that are shown are used to form a general task specification of the various tasks/goals that the robotic device 12 is to carry out.
  • Method 150 then proceeds to step 158 .
  • a check is performed to determine whether the description that has been provided of the task is sufficient, which may include determination as to whether the appropriate location or duration information, where applicable, has been provided.
  • If it is determined at step 158 that a sufficient description has not been provided, method 150 returns to step 156 , and the user is queried for further information regarding the task description. If at step 158 it is determined that sufficient information has been provided regarding task descriptions, method 150 proceeds to step 160 .
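The query-until-sufficient loop of steps 156 and 158 can be sketched as follows. The required field names are illustrative assumptions; the patent only requires that location or duration information, where applicable, be checked.

```python
# Assumed set of fields a task description must contain to be "sufficient".
REQUIRED_FIELDS = {"task", "location", "duration"}

def is_sufficient(description):
    """Step 158: check that every required field has been provided."""
    return REQUIRED_FIELDS <= description.keys()

def collect_task_description(answers):
    """Step 156: merge successive user answers until the description is
    sufficient; return None if the answers are exhausted first."""
    description = {}
    for answer in answers:  # each answer is a dict of provided fields
        description.update(answer)
        if is_sufficient(description):
            return description
    return None

desc = collect_task_description([
    {"task": "cut grass"},
    {"location": "backyard", "duration": "45 min"},
])
```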
  • the user is asked to provide information regarding the fabrication method of the robotic device 12 .
  • the user provides information pertaining to how the device 12 should be fabricated, where applicable. For example, the user may select between various options including, but not limited to, whether the fabrication should be done automatically by a CNC machine or whether the specifications should be provided to a manufacturer who will manufacture the appropriate components.
  • a check is performed to determine whether sufficient information has been provided. If it is determined at step 168 that sufficient information has not been provided, method 150 returns to step 166 , and the user is asked to provide further information regarding the fabrication of the robotic device 12 .
  • the system 10 has received from the user a general task description that can then be used to design and develop the robot as described below. Upon receiving a task description, the system 10 then proceeds as described above to generate a task list that may be implemented by the robotic device 12 . As mentioned above, the various task specifications are determined based on interaction with the task specification knowledge database 60 , and a description is provided herein of the interaction with the respective knowledge databases.
  • the task generator module 40 , in an exemplary embodiment, defines the goals or tasks that are to be implemented based on the task specifications that have been provided, and may rank them by importance; it also defines the minimum requirements associated with the robotic device and determines the processing requirements associated with the robotic device.
  • the task generator module 40 interacts with the task generator knowledge database 62 to define the task list.
  • the task generator knowledge database 62 stores information regarding task lists that have been used for other robotic devices, and hardware and software requirements associated with specific tasks.
  • the task generator module 40 analyzes the task specification that has been provided, and if the task is known to the knowledge database, the task generator will return the processing requirements associated with completion of this particular task.
  • a check is performed to determine whether the computational resources are exhausted.
  • if the computational resources associated with method 200 have been exhausted, this indicates that a task list for the provided task specifications will never be generated.
  • Situations where the computational resources have been exhausted include, but are not limited to, cases where the associated task generator knowledge database 62 is empty (which would indicate that a task list cannot be generated), and where there is a shortage of memory for storage.
  • In such cases, method 200 proceeds to step 210 , where a failure message is generated indicating that a task list for the particular task specifications will not be generated. If at step 208 it is determined that the computational resources have not been exhausted, method 200 proceeds to step 212 .
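The failure path of steps 208-212 can be sketched as a single guard. The specific resource checks (empty knowledge database, zero memory budget) follow the examples above; the data shapes and field names are illustrative assumptions.

```python
def generate_task_list(task_spec, knowledge_db, memory_budget):
    """Return a task list, or a failure message when computational
    resources are exhausted (steps 208-212, sketched)."""
    # Step 208: an empty knowledge database or exhausted memory means a
    # task list will never be generated for this specification.
    if not knowledge_db or memory_budget <= 0:
        # Step 210: generate a failure message.
        return {"status": "failure",
                "message": "task list cannot be generated for this specification"}
    # Step 212: look up the known tasks and their processing requirements.
    tasks = [knowledge_db[t] for t in task_spec if t in knowledge_db]
    return {"status": "ok", "tasks": tasks}

result = generate_task_list(["cut grass"], {}, memory_budget=1024)
```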
  • Reference is now made to FIG. 8 , where a flowchart illustrating the steps of an environment specification method 250 is shown in an exemplary embodiment.
  • the environment specification method 250 describes in further detail the steps that are undertaken in method 150 , primarily at steps 152 and 156 .
  • the environment specification method describes how the user may specify an environment and how the user may associate certain tasks with the areas or routes in the environment.
  • the environment specification method 250 described herein describes one example of how an environment may be specified.
  • the navigation method may be based on the provision of specific points within an environment, or through allowing the robotic device 12 to navigate the environment in a wander mode, where the device 12 explores the environment without following a predefined path.
  • Method 250 then proceeds to step 260 , where the user provides a map of the environment that the robotic device 12 is to operate in.
  • Reference is now made to FIG. 9 , where a sample map editor window 300 is shown in one embodiment.
  • the map editor window 300 as described herein allows the user to specify the layout of an environment, various points within the environment, and tasks that are to be completed within the environment.
  • the map that is provided by the user at step 260 may be created by the user through providing the respective dimensions in a map specification window 302 .
  • the user may specify the dimensions of the environment in the map specification window through utilization of various tools that are provided in the tools menu 304 .
  • the user may also select the robotic device 12 for which the respective environment is being specified through use of the device selector window 306 .
  • the map editor window 300 of FIG. 9 is shown as one example of the method by which environments may be specified.
  • the user may upload maps that are either two or three-dimensional.
  • the geographic areas that are covered by these respective maps are not limited to room size dimensions, and may represent maps of any dimensions.
  • Method 250 now proceeds to step 262 , where the user may select an area upon the map to specify a particular waypoint or point of interest.
  • a point of interest may be a starting point, stopping point, or point of transition.
  • Reference is now made to FIG. 10 , where the map editor window 300 is shown after the user has selected a point of interest 310 .
  • method 250 proceeds to step 264 .
  • the user may specify a direction of travel or route that the robotic device 12 is to take from the point of interest 310 .
  • the route is specified in an exemplary embodiment; however, it should be noted that the robotic device 12 may also be required to determine a route between certain points that have been specified.
  • Reference is now made to FIG. 11 , where the map editor window 300 is shown, in which the user has specified a route 312 from the point of interest 310 that the robotic device 12 should follow.
  • Reference is now made to FIG. 12 , where a point of interest 310 is shown with two stopping points 314 .
  • the stopping points 314 are points where a robotic device 12 may perform a task or may change its direction of travel.
  • Method 250 then proceeds to step 266 , where the user may specify tasks that are to be undertaken at the points of interest 310 , the stopping points 314 or along the route 312 .
  • Reference is now made to FIG. 13 , where the map editor window 300 is shown with a task selector window 316 .
  • the task selector window 316 is used to select one or more tasks that are to be completed by the robotic device 12 .
  • At step 268 , the respective maps and tasks are transmitted to the robotic device 12 .
  • the respective information, in an exemplary embodiment, is transmitted to the robotic device through UDP. Predefined tasks may also be transmitted to the robotic device where the device is to carry out certain tasks.
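The UDP transmission of step 268 can be sketched as below. The patent specifies only that UDP is used; the host, port, and JSON payload format are assumptions made for illustration.

```python
import json
import socket

def send_to_device(payload, host="127.0.0.1", port=9999):
    """Serialize a map/task payload and send it to the robotic device as a
    single connectionless UDP datagram; returns the number of bytes sent."""
    data = json.dumps(payload).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        return sock.sendto(data, (host, port))

# Example: transmit a map identifier and its associated task list.
sent = send_to_device({"map": "backyard", "tasks": ["cut grass"]})
```

Because UDP is connectionless, the sender gets no delivery confirmation; a deployed system would likely layer acknowledgements or retries on top of this.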
  • Reference is now made to FIG. 14A and FIG. 14B , where a sample set of screen shots used when specifying a task is shown, including a task specification window 350 in one embodiment.
  • the task specification window 350 will vary depending on the tasks that are being specified.
  • the task specification window 350 shown in FIGS. 14A and 14B is used to specify a task that is to be completed for a robotic device 12 , where the tasks are selected from predefined tasks that are associated with the system.
  • FIGS. 14A and 14B show that the user has chosen a task by first specifying a location 352 .
  • robotic devices 12 may also operate as part of a robotic team 14 .
  • a robotic team refers to two or more robotic devices that exist together in the same environment.
  • the team of robotic devices 14 may exist in a physical environment, virtual environment or in a mixed environment. Within the environment, one or more of the team members may be used to control the respective other devices 12 .
  • the robotic devices 12 that may control the other devices 12 are referred to as controlling robotic devices.
  • the controlling robotic devices may be a robotic server or have access to a robotic server.
  • the controlling device monitors the respective actions, and requirements within an environment and may communicate with the robotic server when new components are required for the other robotic devices that are being controlled.
  • For example, new components may be required for the completion of a new task, including updates reflecting the new task specification.
  • the controlling device may then interact with the robotic server and specify the new task information, and the robotic server would proceed to determine the various components and subsequent modifications that may be necessary for each of the respective controlled robotic devices.
  • a controlling robotic device may also make requests to the robotic server for new components for itself.
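The controlling-device pattern described above can be sketched as follows: a controlling device forwards a new task's requirements to the robotic server, which determines which components each controlled device is missing. Class names, method names, and data shapes are all illustrative assumptions.

```python
class RobotServer:
    """Determines the components a controlled device lacks for a new task."""
    def components_for(self, device, new_task_requirements):
        # Components required by the task but not already on the device.
        return sorted(set(new_task_requirements) - set(device["components"]))

class ControllingDevice:
    """A robotic team member that monitors its peers and consults the server."""
    def __init__(self, server, controlled):
        self.server = server
        self.controlled = controlled  # list of controlled device records

    def handle_new_task(self, requirements):
        """Request the necessary modifications for every controlled device."""
        return {d["name"]: self.server.components_for(d, requirements)
                for d in self.controlled}

server = RobotServer()
controller = ControllingDevice(server, [
    {"name": "mower-1", "components": ["wheels", "blade"]},
    {"name": "scout-1", "components": ["wheels", "camera"]},
])
plan = controller.handle_new_task(["wheels", "blade", "gps"])
```

The controlling device could apply the same call to its own component record when requesting new components for itself.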

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)
US12/499,411 2007-01-12 2009-07-08 Method and System for Robot Generation Abandoned US20090306823A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US12/499,411 US20090306823A1 (en) 2007-01-12 2009-07-08 Method and System for Robot Generation
US13/399,505 US9671786B2 (en) 2007-01-12 2012-02-17 Method and system for robot generation

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US88005907P 2007-01-12 2007-01-12
PCT/CA2008/000041 WO2008083489A1 (en) 2007-01-12 2008-01-11 Method and system for robot generation
US12/499,411 US20090306823A1 (en) 2007-01-12 2009-07-08 Method and System for Robot Generation

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2008/000041 Continuation WO2008083489A1 (en) 2007-01-12 2008-01-11 Method and system for robot generation

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/399,505 Continuation US9671786B2 (en) 2007-01-12 2012-02-17 Method and system for robot generation

Publications (1)

Publication Number Publication Date
US20090306823A1 true US20090306823A1 (en) 2009-12-10

Family

ID=39608294

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/499,411 Abandoned US20090306823A1 (en) 2007-01-12 2009-07-08 Method and System for Robot Generation
US13/399,505 Active 2028-11-15 US9671786B2 (en) 2007-01-12 2012-02-17 Method and system for robot generation

Family Applications After (1)

Application Number Title Priority Date Filing Date
US13/399,505 Active 2028-11-15 US9671786B2 (en) 2007-01-12 2012-02-17 Method and system for robot generation

Country Status (4)

Country Link
US (2) US20090306823A1 (de)
EP (1) EP2117782B1 (de)
CN (2) CN101631651B (de)
WO (1) WO2008083489A1 (de)

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110184597A1 (en) * 2010-01-22 2011-07-28 Joy Mm Delaware, Inc. Device for reducing the likelihood of damage to a trailing cable
US20120078417A1 (en) * 2010-09-28 2012-03-29 International Business Machines Corporation Detecting Energy and Environmental Leaks In Indoor Environments Using a Mobile Robot
US20120150345A1 (en) * 2007-01-12 2012-06-14 Hansjorg Baltes Method and system for robot generation
US20120324415A1 (en) * 2010-01-13 2012-12-20 Kuka Laboratories Gmbh System Comprising Development Environments And Machine Controls
US20130275091A1 (en) * 2010-07-22 2013-10-17 Cogmation Robotics Inc. Non-programmer method for creating simulation-enabled 3d robotic models for immediate robotic simulation, without programming intervention
US20150120043A1 (en) * 2013-10-30 2015-04-30 Georgia Tech Research Corporation Methods and systems for facilitating interactions between a robot and user
US9037282B2 (en) 2011-06-24 2015-05-19 The Boeing Company Manufacturing control system
US20150140893A1 (en) * 2007-07-19 2015-05-21 Hydrae Limited Interacting toys
EP2835231A4 (de) * 2012-04-02 2016-02-17 Yaskawa Denki Seisakusho Kk Robotersystem und robotersteuerungsvorrichtung
US20160217409A1 (en) * 2015-01-23 2016-07-28 Center for Independent Futures Goal management system and methods of operating the same
CN106154896A (zh) * 2015-04-08 2016-11-23 广明光电股份有限公司 机器人同步控制方法
US9661477B1 (en) * 2015-03-06 2017-05-23 AI Incorporated Collaborative robotic device work group
KR101773102B1 (ko) 2011-09-14 2017-08-31 한국전자통신연구원 컴포넌트 기반 로봇 응용 소프트웨어 개발에서의 가상 컴포넌트를 이용한 컴포넌트 조합 장치 및 방법과 이에 관련된 프로그램의 기록매체
US20180104816A1 (en) * 2016-10-19 2018-04-19 Fuji Xerox Co., Ltd. Robot device and non-transitory computer readable medium
US20180200884A1 (en) * 2017-01-16 2018-07-19 Ants Technology (Hk) Limited Robot apparatus, methods and computer products
US10035259B1 (en) * 2017-03-24 2018-07-31 International Business Machines Corporation Self-assembling robotics for disaster applications
US20180246500A1 (en) * 2015-02-25 2018-08-30 Siemens Aktiengesellschaft A method for manufacturing a product according to a production plan
US10168674B1 (en) * 2013-04-22 2019-01-01 National Technology & Engineering Solutions Of Sandia, Llc System and method for operator control of heterogeneous unmanned system teams
US20190205145A1 (en) * 2017-12-28 2019-07-04 UBTECH Robotics Corp. Robot task management method, robot using the same and computer readable storage medium
US11334069B1 (en) 2013-04-22 2022-05-17 National Technology & Engineering Solutions Of Sandia, Llc Systems, methods and computer program products for collaborative agent control
US11813750B2 (en) 2017-04-19 2023-11-14 Kabushiki Kaisha Yaskawa Denki Programming support apparatus, robot system, and programming support method

Families Citing this family (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2404193B1 (de) 2009-03-02 2017-05-03 Diversey, Inc. System und verfahren zur hygieneüberwachung und -verwaltung
US8918209B2 (en) 2010-05-20 2014-12-23 Irobot Corporation Mobile human interface robot
US8918213B2 (en) 2010-05-20 2014-12-23 Irobot Corporation Mobile human interface robot
US9014848B2 (en) * 2010-05-20 2015-04-21 Irobot Corporation Mobile robot system
US8935005B2 (en) 2010-05-20 2015-01-13 Irobot Corporation Operating a mobile robot
TW201227190A (en) * 2010-12-28 2012-07-01 Hon Hai Prec Ind Co Ltd System and method for controlling robots via cloud computing
CN102571859A (zh) * 2010-12-29 2012-07-11 鸿富锦精密工业(深圳)有限公司 通过云计算控制机器人的***及方法
US8930019B2 (en) 2010-12-30 2015-01-06 Irobot Corporation Mobile human interface robot
US8818556B2 (en) * 2011-01-13 2014-08-26 Microsoft Corporation Multi-state model for robot and user interaction
CN104287676B (zh) * 2011-03-31 2017-02-15 科沃斯机器人股份有限公司 一种应用在多功能机器人中自动分离功能模块的控制装置
US10054933B2 (en) * 2012-03-27 2018-08-21 Sirqul, Inc. Controlling distributed device operations
US8868241B2 (en) * 2013-03-14 2014-10-21 GM Global Technology Operations LLC Robot task commander with extensible programming environment
KR101618585B1 (ko) * 2014-03-19 2016-05-09 (주)로보티즈 로봇 조립 장치
US9272418B1 (en) * 2014-09-02 2016-03-01 The Johns Hopkins University System and method for flexible human-machine collaboration
US9667613B1 (en) * 2014-12-10 2017-05-30 EMC IP Holding Company LLC Detecting mobile device emulation
US20160202670A1 (en) * 2015-01-08 2016-07-14 Northwestern University System and method for sequential action control for nonlinear systems
JP6844124B2 (ja) * 2016-06-14 2021-03-17 富士ゼロックス株式会社 ロボット制御システム
US10372127B2 (en) * 2016-07-18 2019-08-06 International Business Machines Corporation Drone and drone-based system and methods for helping users assemble an object
KR102634499B1 (ko) * 2017-01-27 2024-02-06 론자 리미티드 자동화된 시스템의 동적 제어
US10449671B2 (en) * 2017-04-04 2019-10-22 Toyota Research Institute, Inc. Methods and systems for providing robotic operation constraints for remote controllable robots
US11331803B2 (en) * 2017-04-17 2022-05-17 Siemens Aktiengesellschaft Mixed reality assisted spatial programming of robotic systems
CN107678842A (zh) * 2017-09-19 2018-02-09 上海思岚科技有限公司 一种用于移动机器人的定时任务的方法及***
EP3476545A1 (de) * 2017-10-27 2019-05-01 Creaholic SA Verfahren zum betrieb eines computergestützten inventars von hardwaremodulen eines robotersystems
WO2018172593A2 (es) 2018-05-25 2018-09-27 Erle Robotics, S.L Método para integrar nuevos módulos en robots modulares, y un componente de robot del mismo
US10953541B2 (en) * 2018-07-31 2021-03-23 At&T Intellectual Property I, L.P. Providing logistical support for robots
CN113168177B (zh) * 2018-10-29 2024-07-19 西门子股份公司 自主世界模型中的动态细化标记
CN109696915B (zh) * 2019-01-07 2022-02-08 上海托华机器人有限公司 一种测试方法和***
US11926048B2 (en) * 2021-05-26 2024-03-12 Amazon Technologies, Inc. Modular robotic linkages
DE102022111400A1 (de) 2022-05-06 2023-11-09 Deutsches Zentrum für Luft- und Raumfahrt e.V. Verfahren zum Vorbereiten und Ausführen von Aufgaben mithilfe eines Roboters, Roboter und Computerprogramm
CN115816441B (zh) * 2022-10-31 2023-08-08 实时侠智能控制技术有限公司 基于任务描述的机器人控制方法、装置及可读介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5084826A (en) * 1989-07-27 1992-01-28 Nachi-Fujikoshi Corp. Industrial robot system
US6266577B1 (en) * 1998-07-13 2001-07-24 Gte Internetworking Incorporated System for dynamically reconfigure wireless robot network
US20040024490A1 (en) * 2002-04-16 2004-02-05 Mclurkin James System amd methods for adaptive control of robotic devices
US20060079997A1 (en) * 2002-04-16 2006-04-13 Mclurkin James Systems and methods for dispersing and clustering a plurality of robotic devices
US20070031217A1 (en) * 2005-05-31 2007-02-08 Anil Sharma Track Spiders Robotic System
US20070061040A1 (en) * 2005-09-02 2007-03-15 Home Robots, Inc. Multi-function robotic device

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5963712A (en) * 1996-07-08 1999-10-05 Sony Corporation Selectively configurable robot apparatus
JP3919040B2 (ja) * 1997-11-30 2007-05-23 ソニー株式会社 ロボツト装置
JPH11249725A (ja) * 1998-02-26 1999-09-17 Fanuc Ltd ロボット制御装置
US6347261B1 (en) * 1999-08-04 2002-02-12 Yamaha Hatsudoki Kabushiki Kaisha User-machine interface system for enhanced interaction
EP1195231A4 (de) * 2000-03-31 2006-01-25 Sony Corp Roboter, verfahren zur steuerung einer roboteraktion, verfahren und vorrichtung zur erfassung von externer kraft
US6904335B2 (en) * 2002-08-21 2005-06-07 Neal Solomon System, method and apparatus for organizing groups of self-configurable mobile robotic agents in a multi-robotic system
JP2005088179A (ja) * 2003-09-22 2005-04-07 Honda Motor Co Ltd 自律移動ロボットシステム
JP2005103722A (ja) * 2003-09-30 2005-04-21 Toshiba Corp 協調ロボット装置、システム、およびナビゲーションロボット装置
CN100556623C (zh) * 2004-10-19 2009-11-04 松下电器产业株式会社 自动机械装置
US8200700B2 (en) * 2005-02-01 2012-06-12 Newsilike Media Group, Inc Systems and methods for use of structured and unstructured distributed data
US7236861B2 (en) * 2005-02-16 2007-06-26 Lockheed Martin Corporation Mission planning system with asynchronous request capability
US7912633B1 (en) * 2005-12-01 2011-03-22 Adept Mobilerobots Llc Mobile autonomous updating of GIS maps
KR101099808B1 (ko) * 2005-12-02 2011-12-27 아이로보트 코퍼레이션 로봇 시스템
US9195233B2 (en) * 2006-02-27 2015-11-24 Perrone Robotics, Inc. General purpose robotics operating system
US20070271002A1 (en) * 2006-05-22 2007-11-22 Hoskinson Reed L Systems and methods for the autonomous control, automated guidance, and global coordination of moving process machinery
US8073564B2 (en) * 2006-07-05 2011-12-06 Battelle Energy Alliance, Llc Multi-robot control interface
US7211980B1 (en) * 2006-07-05 2007-05-01 Battelle Energy Alliance, Llc Robotic follow system and method
US7620477B2 (en) * 2006-07-05 2009-11-17 Battelle Energy Alliance, Llc Robotic intelligence kernel
US7668621B2 (en) * 2006-07-05 2010-02-23 The United States Of America As Represented By The United States Department Of Energy Robotic guarded motion system and method
US7801644B2 (en) * 2006-07-05 2010-09-21 Battelle Energy Alliance, Llc Generic robot architecture
CN101631651B (zh) * 2007-01-12 2013-07-24 汉斯乔格·巴尔特斯 用于产生机器人的方法和***
US20090234788A1 (en) * 2007-03-31 2009-09-17 Mitchell Kwok Practical Time Machine Using Dynamic Efficient Virtual And Real Robots
US20090248200A1 (en) * 2007-10-22 2009-10-01 North End Technologies Method & apparatus for remotely operating a robotic device linked to a communications network

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5084826A (en) * 1989-07-27 1992-01-28 Nachi-Fujikoshi Corp. Industrial robot system
US6266577B1 (en) * 1998-07-13 2001-07-24 Gte Internetworking Incorporated System for dynamically reconfigure wireless robot network
US20040024490A1 (en) * 2002-04-16 2004-02-05 Mclurkin James System amd methods for adaptive control of robotic devices
US20060079997A1 (en) * 2002-04-16 2006-04-13 Mclurkin James Systems and methods for dispersing and clustering a plurality of robotic devices
US20070179669A1 (en) * 2002-04-16 2007-08-02 Mclurkin James System and methods for adaptive control of robotic devices
US20070031217A1 (en) * 2005-05-31 2007-02-08 Anil Sharma Track Spiders Robotic System
US20070061040A1 (en) * 2005-09-02 2007-03-15 Home Robots, Inc. Multi-function robotic device
US20070061043A1 (en) * 2005-09-02 2007-03-15 Vladimir Ermakov Localization and mapping system and method for a robotic device

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120150345A1 (en) * 2007-01-12 2012-06-14 Hansjorg Baltes Method and system for robot generation
US9671786B2 (en) * 2007-01-12 2017-06-06 White Magic Robotics Inc. Method and system for robot generation
US20150140893A1 (en) * 2007-07-19 2015-05-21 Hydrae Limited Interacting toys
US10037019B2 (en) * 2010-01-13 2018-07-31 Kuka Deutschland Gmbh System comprising development environments and machine controls
US20120324415A1 (en) * 2010-01-13 2012-12-20 Kuka Laboratories Gmbh System Comprising Development Environments And Machine Controls
US8989929B2 (en) * 2010-01-22 2015-03-24 Joy Mm Delaware, Inc. Device for reducing the likelihood of damage to a trailing cable
US20110184597A1 (en) * 2010-01-22 2011-07-28 Joy Mm Delaware, Inc. Device for reducing the likelihood of damage to a trailing cable
US20130275091A1 (en) * 2010-07-22 2013-10-17 Cogmation Robotics Inc. Non-programmer method for creating simulation-enabled 3d robotic models for immediate robotic simulation, without programming intervention
US20120078417A1 (en) * 2010-09-28 2012-03-29 International Business Machines Corporation Detecting Energy and Environmental Leaks In Indoor Environments Using a Mobile Robot
US9037282B2 (en) 2011-06-24 2015-05-19 The Boeing Company Manufacturing control system
KR101773102B1 (ko) 2011-09-14 2017-08-31 한국전자통신연구원 컴포넌트 기반 로봇 응용 소프트웨어 개발에서의 가상 컴포넌트를 이용한 컴포넌트 조합 장치 및 방법과 이에 관련된 프로그램의 기록매체
EP2835231A4 (de) * 2012-04-02 2016-02-17 Yaskawa Denki Seisakusho Kk Robotersystem und robotersteuerungsvorrichtung
US9662789B2 (en) 2012-04-02 2017-05-30 Kabushiki Kaisha Yaskawa Denki Robot system and robot controller
US10168674B1 (en) * 2013-04-22 2019-01-01 National Technology & Engineering Solutions Of Sandia, Llc System and method for operator control of heterogeneous unmanned system teams
US11334069B1 (en) 2013-04-22 2022-05-17 National Technology & Engineering Solutions Of Sandia, Llc Systems, methods and computer program products for collaborative agent control
US9846843B2 (en) * 2013-10-30 2017-12-19 Georgia Tech Research Corporation Methods and systems for facilitating interactions between a robot and user
US20150120043A1 (en) * 2013-10-30 2015-04-30 Georgia Tech Research Corporation Methods and systems for facilitating interactions between a robot and user
US20160217409A1 (en) * 2015-01-23 2016-07-28 Center for Independent Futures Goal management system and methods of operating the same
US10839333B2 (en) * 2015-01-23 2020-11-17 Center for Independent Futures Goal management system and methods of operating the same
US20180246500A1 (en) * 2015-02-25 2018-08-30 Siemens Aktiengesellschaft A method for manufacturing a product according to a production plan
US10816961B2 (en) * 2015-02-25 2020-10-27 Siemens Aktiengesellschaft Method for manufacturing a product according to a production plan
US9661477B1 (en) * 2015-03-06 2017-05-23 AI Incorporated Collaborative robotic device work group
CN106154896A (zh) * 2015-04-08 2016-11-23 广明光电股份有限公司 机器人同步控制方法
US20180104816A1 (en) * 2016-10-19 2018-04-19 Fuji Xerox Co., Ltd. Robot device and non-transitory computer readable medium
US10987804B2 (en) * 2016-10-19 2021-04-27 Fuji Xerox Co., Ltd. Robot device and non-transitory computer readable medium
US10661438B2 (en) * 2017-01-16 2020-05-26 Ants Technology (Hk) Limited Robot apparatus, methods and computer products
US20180200884A1 (en) * 2017-01-16 2018-07-19 Ants Technology (Hk) Limited Robot apparatus, methods and computer products
US10035259B1 (en) * 2017-03-24 2018-07-31 International Business Machines Corporation Self-assembling robotics for disaster applications
US10543595B2 (en) * 2017-03-24 2020-01-28 International Business Machines Corporation Creating assembly plans based on triggering events
US10532456B2 (en) * 2017-03-24 2020-01-14 International Business Machines Corporation Creating assembly plans based on triggering events
US10265844B2 (en) * 2017-03-24 2019-04-23 International Business Machines Corporation Creating assembly plans based on triggering events
US11813750B2 (en) 2017-04-19 2023-11-14 Kabushiki Kaisha Yaskawa Denki Programming support apparatus, robot system, and programming support method
US20190205145A1 (en) * 2017-12-28 2019-07-04 UBTECH Robotics Corp. Robot task management method, robot using the same and computer readable storage medium
US10725796B2 (en) * 2017-12-28 2020-07-28 Ubtech Robotics Corp Robot task management method, robot using the same and non-transitory computer readable storage medium

Also Published As

Publication number Publication date
US9671786B2 (en) 2017-06-06
EP2117782B1 (de) 2014-07-30
US20120150345A1 (en) 2012-06-14
CN101631651A (zh) 2010-01-20
EP2117782A1 (de) 2009-11-18
CN101631651B (zh) 2013-07-24
EP2117782A4 (de) 2013-07-24
CN103433923A (zh) 2013-12-11
WO2008083489A1 (en) 2008-07-17

Similar Documents

Publication Publication Date Title
US9671786B2 (en) Method and system for robot generation
US20170076194A1 (en) Apparatuses, methods and systems for defining hardware-agnostic brains for autonomous robots
Dodds et al. Components, curriculum, and community: Robots and robotics in undergraduate ai education
Roberts et al. Goal reasoning to coordinate robotic teams for disaster relief
Lima et al. Multi-robot systems
Roldán-Álvarez et al. Unibotics: open ROS-based online framework for practical learning of robotics in higher education
Kästner et al. Arena-rosnav 2.0: A development and benchmarking platform for robot navigation in highly dynamic environments
Olvera et al. Mapping and navigation in an unknown environment using LiDAR for mobile service robots
Kästner et al. Demonstrating Arena-Web: A Web-based Development and Benchmarking Platform for Autonomous Navigation Approaches.
Panicker et al. Exposing students to a state-of-the-art problem through a capstone project
Galtarossa Obstacle avoidance algorithms for autonomous navigation system in unstructured indoor areas
Patnaik et al. Innovations in robot mobility and control
Adongo The Turtlebot Tour Guide (TTG)
Yusof et al. Development of an Educational Virtual Mobile Robot Simulation
Ben Roummane et al. Localization and navigation of ROS-based autonomous robot in hospital environment
Muchaxo Uav Navigation System for Prescribed Fires
De Martini et al. eduMorse: an open-source framework for mobile robotics education
Kästner et al. Arena-Web: A Web-based Development and Benchmarking Platform for Autonomous Navigation Approaches
Mooers et al. Human-robot teaming for a cooperative game in a shared partially observable space
Ghazal et al. Simulation of autonomous navigation of turtlebot robot system based on robot operating system
Liang Developing Robot Programming Lab Projects
Bendell et al. Human performance with autonomous robotic teammates: research methodologies and simulations
Smith User interface and function library for ground robot navigation
Kulich et al. User's access to the robotic e-learning system SyRoTek
Sørensen et al. Hivemind

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION