EP4041503A2 - Method, computer program product and robot controller for configuring a robot-object system environment, and robot - Google Patents

Method, computer program product and robot controller for configuring a robot-object system environment, and robot

Info

Publication number
EP4041503A2
Authority
EP
European Patent Office
Prior art keywords
robot
rosu
drz
twin
digital
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP20828985.0A
Other languages
German (de)
English (en)
Inventor
Vincent Dietrich
Florian Wirnshofer
Frederik Deroo
Robert Eidenberger
Daniel MEYER-DELIUS DI VASTO
Philipp Sebastian Schmitt
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siemens AG
Original Assignee
Siemens AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens AG filed Critical Siemens AG
Publication of EP4041503A2 (fr)

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1628 Programme controls characterised by the control loop
    • B25J9/1653 Programme controls characterised by the control loop parameters identification, estimation, stiffness, accuracy, error analysis
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B25J13/08 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J13/088 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices with position, velocity or acceleration sensors
    • B25J13/089 Determining the position of the robot with reference to its environment
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1671 Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/39 Robotics, robotics to robotics hand
    • G05B2219/39032 Touch probe senses constraint known plane, derive kinematic calibration
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/40 Robotics, robotics mapping to robotics vision
    • G05B2219/40323 Modeling robot environment for sensor based robot system

Definitions

  • the invention relates to a method for configuring a robot-object system environment according to the preamble of claim 1, a computer program product for configuring a robot-object system environment according to the preamble of claim 12, a robot controller for configuring a robot-object system environment according to the preamble of claim 14, and a robot according to the preamble of claim 15.
  • a robot-object system environment within the meaning of the present invention is the environment of an automated system in which a robot, as an automatically controlled, reprogrammable, multi-purpose handling device (e.g. in industry, in service, in the field or acting autonomously) with several degrees of freedom, whose movements are programmable and, if necessary, sensor-guided, is used either stationary or mobile to carry out handling and/or manufacturing tasks on one or more objects or items.
  • the automated system can, for example, be a production or robot cell in terms of shape and design.
  • the robot-object system environment or the cell must be configured in such a way that discrepancies do not occur between the reality of the robot-object system environment and its digital representation as a CAD model with regard to the poses of the objects and of the robot.
  • the pose results from the combination of position and orientation of a free rigid body (e.g. the robot with its individual components, or an object) with 6 degrees of freedom in space. In this case one speaks of a 6D pose of the body. In common parlance, when the spatial pose of a body is to be determined and where, strictly speaking, one would have to speak of determining the pose of the body, one imprecisely speaks of determining a body position.
  • the term object position is accordingly also used in the context of the present application, and object position data are also spoken of, although the more precise terms object pose and object pose data would actually be appropriate despite their rare occurrence in common parlance.
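The 6D pose described above (three translational plus three rotational degrees of freedom) is commonly encoded as a 4x4 homogeneous transform. A minimal Python sketch, not part of the patent disclosure; the roll/pitch/yaw parameterization is one assumed convention among several:

```python
import math

def pose_matrix(position, rpy):
    """4x4 homogeneous transform for a 6D pose: position (x, y, z) plus
    orientation as roll/pitch/yaw angles -- 6 degrees of freedom in total."""
    r, p, y = rpy
    cr, sr = math.cos(r), math.sin(r)
    cp, sp = math.cos(p), math.sin(p)
    cy, sy = math.cos(y), math.sin(y)
    # Rotation block R = Rz(yaw) * Ry(pitch) * Rx(roll), row-major
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr, position[0]],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr, position[1]],
        [-sp,     cp * sr,                cp * cr,                position[2]],
        [0.0,     0.0,                    0.0,                    1.0],
    ]

# A pure translation leaves the rotation block as the identity.
T = pose_matrix([0.1, 0.2, 0.3], [0.0, 0.0, 0.0])
```

A "body position" in the imprecise sense of the text would correspond to the translation column alone; the full pose additionally carries the rotation block.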
  • FIGURE 1 shows, in a principle diagram, the conventional configuration of a robot-object system environment ROSU', designed for example as a robot cell, with regard to avoiding the discrepancies that occur between the reality of the robot-object system environment ROSU' and its digital representation.
  • the digital representation is provided by a digital robot twin DRZ', also known as a "digital twin" in technical jargon.
  • This configuration is divided into three phases, which are carried out independently of one another by manual measures by people.
  • first, the digital robot twin DRZ' is generated, for example by an engineer.
  • the engineer creates the design of a control program STP' with the help of a programming tool PGT, e.g. a "Totally Integrated Automation <TIA>" portal, and a hardware design of the robot-object system environment ROSU' with the help of a "Computer Aided Design <CAD>" program CADP.
  • the control program STP', which contains the control logic and movement commands for a robot, becomes a direct, integral part of the digital robot twin DRZ', while the hardware design, which contains the geometric data of the robot-object system environment ROSU', is stored in a data memory DSP' of the digital robot twin DRZ'.
  • This is a current state AZ 'of the digital robot twin DRZ'.
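The two components of the digital robot twin described above, a control program as an integral part and geometric data held in a data memory, can be pictured as a small container structure. A hypothetical Python sketch (all names invented for illustration; the patent does not prescribe any data layout):

```python
from dataclasses import dataclass, field

@dataclass
class DigitalRobotTwin:
    """Minimal sketch of the twin's two parts: the control program as an
    integral component, and a data memory holding the geometric data of the
    robot-object system environment (here: object name -> stored pose)."""
    control_program: list = field(default_factory=list)  # control logic + movement commands
    data_memory: dict = field(default_factory=dict)      # geometric data of the ROSU

    def update_object_pose(self, name, pose):
        """Feed a measured pose back into the twin so that the digital
        representation stays consistent with reality."""
        self.data_memory[name] = pose

twin = DigitalRobotTwin(control_program=["move_to_OB", "grip"])
twin.update_object_pose("OB", (0.1, 0.2, 0.3))
```

The `update_object_pose` hook stands in for the feedback path that the manual workflow of FIG. 1 lacks: without it, discrepancies found during synchronization are not guaranteed to reach the twin.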
  • next, the robot-object system environment ROSU', i.e. the robot cell, is built up, for example by a worker.
  • this build-up includes the real local positioning of at least one object and of the robot in the robot-object system environment ROSU', and the design of the robot-object system environment ROSU' according to the geometric data stored in the data memory DSP'.
  • the robot-object system environment ROSU' built in this way must now be synchronized with the digital robot twin DRZ' in the current state AZ'.
  • a manual synchronization SYDRZ' of the digital robot twin in the current state AZ' is therefore carried out, e.g. by an operator of the robot-object system environment ROSU'.
  • the result of this manually performed synchronization is an updated, synchronized state ASZ' of the digital robot twin DRZ'. However, manually bringing about the updated, synchronized state ASZ' does not necessarily ensure that the identified discrepancies, which should be avoided, are subsequently also taken into account in the digital robot twin DRZ'.
  • the commissioning of the robot-object system environment is a crucial task in automation and robotics; if the time for such commissioning can be shortened and the process automated, this results in a competitive advantage.
  • the object on which the invention is based is to specify a method, a computer program product and a robot controller for configuring a robot-object system environment, as well as a robot, in which discrepancies between the reality of the robot-object system environment and its digital representation as a CAD model are eliminated automatically in the course of the configuration of the robot-object system environment, without manual on-site commissioning of the robot-object system environment involving an adaptation of the CAD model to reality.
  • the object is achieved, based on the robot controller defined in the preamble of claim 14, by the features specified in the characterizing part of claim 14.
  • the object is achieved, based on the robot defined in the preamble of claim 15, by the features specified in the characterizing part of claim 15.
  • the teaching consists in configuring a robot-object system environment with at least one object and a robot for object manipulation and object detection, and in synchronizing a digital robot twin, which digitally represents the robot-object system environment and controls the robot for object manipulation on the basis of a control program, as needed for the targeted use of the robot in the robot-object system environment: in the sense of a first-level accuracy requirement, or of a first-level accuracy requirement and a second-level accuracy requirement, and accordingly in one or two stages.
  • each object in the robot-object system environment is optically detected in relation to an object pose in the course of the control program sequence.
  • each object in the robot-object system environment is recorded in relation to an object pose in the course of the control program by determining an object pose distribution, or by determining an object pose distribution and robot contact.
  • the particular advantages are that the need-based synchronization of the digital robot twin takes place either in one or in two stages, as required. This reduces the runtime of the synchronization. Furthermore, the control program is simulated for both stages with a current digital robot twin: a first-stage digital robot twin and a second-stage digital robot twin. This makes it possible to automatically determine the respective accuracy requirement, the first-level accuracy requirement and the second-level accuracy requirement, depending on the current robot-object system environment and current deviations.
  • the assessment of the current synchronization is ensured by a comparison step between simulated data and real data as well as a scanning step in which the possible distribution of object positions is estimated.
  • a meaningful assessment is made by simulating the actual robot control program for various samples, which provides a measure of the probability of failure of parts of the control program in relation to sub-process tasks.
  • the quality of the synchronization can be increased compared to the manual synchronization according to the prior art (see FIG. 1). This is shown by the fact that when the robot is operationally used in the robot-object system environment, errors and collisions during object manipulation are avoided.
  • the automatic synchronization of the digital robot twin ensures that the identified discrepancies between the reality of the robot-object system environment and its digital representation, which must be avoided, are also taken into account in the digital robot twin by means of a corresponding automatic feedback.
  • FIG. 1 shows a principle diagram of the conventional configuration of a robot-object system environment
  • FIG. 2 shows an arrangement of a robot system and a robot-object system environment
  • FIG. 3 shows a flow chart for automatic digital robot twin synchronization
  • FIG. 4 shows an illustration diagram with regard to dedicated instruction steps and loop queries of the flow chart according to FIG. 3.
  • FIG. 2 shows an arrangement of a robot system RBSY and a robot-object system environment ROSU.
  • the robot system RBSY contains a robot RB, which is part of the robot object system environment ROSU and is used there for object manipulation and object detection, as well as a robot controller RBST to control the robot RB in the robot object system environment ROSU.
  • the robot controller RBST contains a digital robot twin DRZ, which digitally represents the robot-object system environment ROSU and controls the robot RB for object manipulation, a computer program product CPP, and a configuration data memory KDSP, which in the manner shown form a functional unit for the configuration of the robot-object system environment ROSU, hereinafter also referred to as the ROSU configuration, and in this context functionally interact and/or are connected to one another to control the robot RB.
  • the robot controller RBST can, for example, either be part of the robot system RBSY as an independently designed unit that is sold separately on the market, like the robot RB, in which case the robot controller RBST and the robot RB can be, but do not have to be, from the same manufacturer, or
  • the robot controller RBST forms a structural unit with the robot RB in such a way that it is distributed with the robot RB as a bundle package.
  • the digital robot twin DRZ is, for example, again generated by an engineer, like the digital robot twin DRZ' in FIG. 1. A control program STP with a control logic and movement commands for the robot RB, on the basis of which the robot RB is controlled in the robot-object system environment ROSU during object manipulation, and a data memory DSP for storing geometric data of the robot-object system environment ROSU are contained in the digital robot twin DRZ.
  • a process requirement can be stored in the digital robot twin DRZ, which defines the accuracy requirements for parts of the control program. For example, for a robot movement in which two objects are joined together, an accuracy requirement of 0.5 mm for the objects can be defined. This is expressed as a negative value and means that a maximum penetration depth of 0.5 mm between the objects to be joined may be determined during the simulation. A positive value describes a minimum distance, so in that case no object contact is allowed.
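The sign convention described above can be sketched as a single comparison. A minimal illustrative check in Python (not part of the patent disclosure; the function name and the convention that penetration is a negative signed distance are assumptions spelled out for clarity):

```python
def meets_process_requirement(signed_distance_mm, requirement_mm):
    """Sign convention from the text: a negative requirement, e.g. -0.5 mm for
    a joining process, allows penetration up to that depth; a positive
    requirement demands at least that clearance, i.e. no object contact.
    `signed_distance_mm` is negative when the objects interpenetrate."""
    return signed_distance_mm >= requirement_mm

# Joining with a 0.5 mm accuracy requirement, expressed as -0.5:
assert meets_process_requirement(-0.3, -0.5)       # 0.3 mm penetration: allowed
assert not meets_process_requirement(-0.7, -0.5)   # penetration too deep
# Positive requirement = minimum clearance, no contact allowed:
assert meets_process_requirement(2.0, 1.0)
assert not meets_process_requirement(0.5, 1.0)
```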
  • the computer program product CPP, as a further component of the robot controller RBST, contains a non-volatile, readable memory SP, in which the processor-readable control program commands of a program module PGM performing the ROSU configuration are stored, and a processor PZ connected to the memory SP that executes the control program commands of the program module PGM for the ROSU configuration.
  • the digital robot twin DRZ, which digitally represents the robot-object system environment ROSU and which controls the robot RB for object manipulation on the basis of the control program STP, is synchronized for the targeted use of the robot RB in the robot-object system environment ROSU during object manipulation. How this synchronization takes place in detail is explained in connection with the description of FIG. 3.
  • the computer program product CPP or, more precisely, the program module PGM performing the ROSU configuration, which is preferably designed as an app or is part of a programming environment or API for the robot controller, can also be sold on the market independently of the robot controller as a separate product by a manufacturer A and then be used in any robot controller from a manufacturer B.
  • the robot-object system environment ROSU contains two objects, a first object OB and a second object OB ', which are to be manipulated and detected by the robot RB.
  • the robot RB has a gripper GF and a force sensor KS as an end effector and a camera KM and / or a 3D sensor system 3D-SS in the area of the end effector.
  • the force sensor does not necessarily have to be represented as an independent unit, but can also be emulated using other sensor systems available in the robot.
  • a red-green-blue <RGB> color sensor RGB-FS is attached in the area of the end effector for the ROSU configuration.
  • FIG. 3 shows a flow chart for performing an automatic digital robot twin synchronization SYDRZ, in contrast to the manually performed digital robot twin synchronization SYDRZ' according to FIG. 1, in which the digital robot twin DRZ is synchronized as required, in the sense of a first-level accuracy requirement or of a first-level accuracy requirement and a second-level accuracy requirement, and accordingly in one or two stages.
  • This flowchart is carried out in the computer program product by the processor PZ when executing the control program commands of the program module PGM for the ROSU configuration.
  • the starting point for performing the automatic digital robot twin synchronization SYDRZ is a current state AZ of the digital robot twin DRZ, with the sequence of the control program STP contained in the digital robot twin DRZ and the geometric data of the robot-object system environment ROSU stored in the data memory DSP.
  • the automatic digital robot twin synchronization SYDRZ now begins with each object OB, OB' in the robot-object system environment ROSU being optically detected in relation to an object pose in the course of the sequence of the control program STP.
  • each object OB, OB' in the robot-object system environment ROSU is recorded in relation to an object pose during the course of the control program STP by determining an object pose distribution, or by determining the object pose distribution and robot contact.
  • the first-level accuracy requirement ESGB is basically determined by the process and therefore results from simulation. For certain parts of the control program, the first-level accuracy requirement ESGB can still be overwritten by the process requirement.
  • the accuracy requirement results from a robot-object minimum distance, i.e., if the robot RB is at least 10 cm away from an object, then it is sufficient to determine the object position (also known as the localization of the object) with an accuracy of up to 10 cm.
  • the first level accuracy requirement ESGB is therefore scalar, ie it is described by a single value, eg 10 cm.
  • the first stage robot-object minimum distance is the shortest simulated robot-object distance over the simulated control program over all time steps and requires the simulation of the control program with the current environmental estimate.
  • the shortest distance is determined with the help of the outer shell of the robot (including attachments, end effector and temporarily added objects) and of the target object in the form of a 3D surface representation.
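The first-stage robot-object minimum distance, the shortest distance over all simulated time steps between the robot's outer shell and the target object, reduces to a nested minimum if both surfaces are approximated by sampled point sets. A simplified Python sketch (the patent works on full 3D surface representations; reducing them to point samples is an assumption made here for brevity):

```python
import math

def min_robot_object_distance(trajectory, object_points):
    """Shortest robot-object distance over all simulated time steps.

    `trajectory` is a list of time steps, each a list of 3D points sampled
    from the robot's outer shell (including attachments, end effector and
    temporarily attached objects); `object_points` are points sampled from
    the target object's 3D surface representation."""
    return min(
        math.dist(shell_point, object_point)
        for shell_points in trajectory       # robot shell at each time step
        for shell_point in shell_points
        for object_point in object_points
    )

# Two time steps: the robot approaches the object along x.
trajectory = [[(0.0, 0.0, 0.0)], [(0.5, 0.0, 0.0)]]
obj = [(1.0, 0.0, 0.0)]
```

In the scalar reading of the text, this single value is then compared against the first-level accuracy requirement; a value of 0 or below signals an intended contact process such as joining.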
  • the first level accuracy requirement ESGB can be expanded by a process requirement.
  • the process requirement for parts of the control program overwrites the existing value of the first-level accuracy requirement ESGB for the part of the control program under consideration.
  • if the minimum distance value of the first-stage robot-object minimum distance is less than or equal to 0 cm, a pick-up, deposit and/or joining process usually takes place.
  • the requirement is overwritten by heuristic values stored in the digital robot twin and / or by user inputs for the process.
  • the joining process requires, for example, a positioning accuracy of 0.5 mm. This is formulated as a negative value and means that a maximum penetration depth of 0.5 mm between the objects to be joined may be determined during the simulation.
  • the second-level accuracy requirement ZSGB is comparable to the first-level accuracy requirement ESGB, but with the difference that a second-level robot-object minimum distance is used:
  • the estimation of the robot-object minimum distance in the second stage is largely determined in the same way as that of the first-stage robot-object minimum distance.
  • the difference is that the robot-object minimum distance is determined several times for different hypotheses. The sampled particles from probabilistic pose distributions are typically referred to as hypotheses.
  • the simulation of the control program is carried out several times with highly weighted characteristic pose hypotheses, for example determined as the mean value of the components of a "Mixture of Gaussian Approximation" as an object pose in the digital twin.
  • the distance between the robot and all hypotheses is calculated.
  • the distance estimate is still formulated as scalar, only the calculation differs.
  • the minimum distance is the shortest distance in the set of all determined distances.
  • Particles with a weighting below a heuristic threshold are discarded for the calculation of the second-stage robot-object minimum distance. This is an admissible approximation because the commissioning engineer will still observe the first execution of the process; in contrast to the state of the art, however, ideally only monitoring it rather than adapting it.
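The second-stage distance computation described above, one distance per pose hypothesis, low-weight particles discarded, the scalar result being the shortest remaining distance, can be sketched directly. A minimal Python illustration (function and parameter names are invented; the robot and each hypothesis are reduced to single 3D points as a simplification):

```python
import math

def second_stage_min_distance(robot_point, hypotheses, weight_threshold=0.05):
    """Second-stage robot-object minimum distance over weighted pose hypotheses.

    `hypotheses` is a list of (weight, object_point) pairs, i.e. sampled
    particles from a probabilistic pose distribution. Particles whose weight
    falls below the heuristic threshold are discarded; the result is the
    shortest distance in the set of all remaining distances."""
    kept = [point for weight, point in hypotheses if weight >= weight_threshold]
    return min(math.dist(robot_point, point) for point in kept)

# Three hypotheses; the closest one carries negligible weight and is dropped.
hypotheses = [(0.6, (1.0, 0.0, 0.0)),
              (0.3, (2.0, 0.0, 0.0)),
              (0.01, (0.1, 0.0, 0.0))]
```

Discarding the 0.01-weight particle is exactly the approximation the text justifies by the commissioning engineer still observing the first execution.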
  • a first query loop AFS1 is run through for each object OB, OB 'in the first level ES and a second query loop AFS2 in the second level ZS.
  • a first-level uncertainty ESU is estimated in a first instruction block AWB1 for each run, in that ambient measurement data UMD are optically recorded and compared with first simulation measurement data SMD1, which are generated with the help of a first-stage digital robot twin DRZ-ES.
  • the first-stage digital robot twin DRZ-ES is initialized with the data of the digital robot twin DRZ when it is first executed.
  • the determined ambient measurement data UMD and the first-stage digital robot twin DRZ-ES are - like the digital robot twin DRZ - stored in the configuration data memory KDSP and read out by the processor PZ.
  • the first-level uncertainty ESU describes the uncertainty estimate of a pose estimate in scalar form, e.g. 3 cm.
  • the value is determined by comparing the simulated data, the first simulation measurement data SMD1, with real measurement data of the object, the environmental measurement data UMD.
  • to determine the uncertainty, a distance measure can be used; for RGB-based detection methods, heuristics or measurement models can be used.
  • in a second instruction block AWB2 of the first query loop AFS1, which is run through after the first instruction block AWB1, the first-level accuracy requirement ESGB is determined for each run.
  • loop pass conditions are checked in a first loop query SAF1. This check of the loop pass conditions is explained below.
  • a first loop condition check SAF1-a checks whether (i) the first-level uncertainty ESU meets the first-level accuracy requirement ESGB and (ii) the precondition for force-based synchronization is met; if (i) is not met and (ii) is met, the first-stage digital robot twin synchronization ES-SYDRZ is continued with a force-based synchronization in the second stage ZS.
  • in this case the transition from the first-stage digital robot twin synchronization ES-SYDRZ to the second-stage digital robot twin synchronization ZS-SYDRZ takes place. Since the force-based synchronization cannot be carried out reliably and quickly in every case, the uncertainty in the object pose must satisfy the precondition.
  • a second loop condition check SAF1-b checks whether (i) the first-stage uncertainty ESU meets the first-stage accuracy requirement ESGB and (ii) the precondition for force-based synchronization is met; if neither (i) nor (ii) is met, the first-stage uncertainty ESU is reduced with the aid of object pose estimation methods on the optical environmental measurement data UMD in a first instruction correction block AWKB1.
  • a third loop condition check SAF1-c checks whether the first-level uncertainty ESU meets the first-level accuracy requirement ESGB; if this is met, the first-stage digital robot twin synchronization ES-SYDRZ for the respective object OB, OB' can be successfully completed after running through the first stage ES, so that both the first-stage digital robot twin synchronization ES-SYDRZ and the synchronization of the digital robot twin SYDRZ have ended.
  • a fourth loop condition check SAF1-d checks whether (i) an improvement of the first-level uncertainty ESU in the first stage ES still takes place and (ii) the precondition for force-based synchronization in the second stage ZS is met; if (i) no longer takes place and (ii) is not met, the first-stage digital robot twin synchronization ES-SYDRZ is aborted and the synchronization of the digital robot twin SYDRZ is interrupted for user interaction.
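The four loop condition checks of the first loop query combine into a single decision per run. A minimal Python sketch of that control flow (the function name, string return values and the ordering of the checks are illustrative assumptions, not part of the patent disclosure):

```python
def first_loop_decision(uncertainty, requirement, force_sync_possible, improving):
    """Decision logic of the first loop query (conditions a to d):
    - accuracy requirement met                      -> first stage suffices
    - not met, force-based precondition satisfied   -> go to second stage
    - not met, no force sync, still improving       -> refine pose estimate
    - not met, no force sync, no improvement        -> abort for user interaction
    All quantities are scalar, e.g. uncertainty and requirement in cm."""
    if uncertainty <= requirement:
        return "done"
    if force_sync_possible:
        return "second_stage"
    if improving:
        return "refine_pose_estimate"
    return "abort_for_user_interaction"
```

Modeling the query as a pure function over the current scalar uncertainty makes the loop easy to test in isolation before wiring it to real sensor data.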
  • a scene of the robot-object system environment ROSU is recorded under the assumption of deviations between reality and the first-stage digital robot twin DRZ-ES.
  • the robot-object system environment ROSU is recorded with the aid of a sensor system on the robot RB.
  • this sensor system can be, for example, the camera KM and/or the 3D sensor system 3D-SS, as shown in FIG. 2.
  • it is also possible for the RGB color sensor RGB-FS to be attached to the robot RB for the optical detection of the robot-object system environment ROSU.
  • the ambient measurement data UMD are 3D image data 3D-BD generated by the 3D sensor system 3D-SS and / or sensor data SSD generated by the RGB color sensor RGB-FS.
  • the first-stage uncertainty ESU for each object OB, OB' is estimated by comparing the environmental measurement data UMD determined during the optical ROSU detection with the first simulation measurement data SMD1, which are obtained with the aid of the first-stage digital robot twin DRZ-ES.
  • in a fourth instruction step AWS4, the sequence of the control program STP for movements of the robot RB is simulated in accordance with the first-stage digital robot twin DRZ-ES.
  • a first minimum distance value MDW1 of a first robot object minimum distance that occurs in the course of the simulated control program sequence is determined for each object OB, OB '.
  • the first robot-object minimum distance is the shortest robot-object distance over all time steps of the simulated control program. This requires a simulation of the control program with the current first-stage digital robot twin DRZ-ES.
  • the shortest distance is determined with the help of the outer shell of the robot (including attachments, end effector and temporarily attached objects) and of the target object in the form of a 3D surface representation.
  • object position estimation methods are applied to the environmental measurement data UMD [cf. the European patent application (application no. 19178454.5) is cited as a reference for such object pose estimation methods].
  • the object position estimate is updated for each object OB, OB 'in the first-stage digital robot twin DRZ-ES.
  • FIG. 4 illustrates in the upper half of the figure the first-stage synchronization of the digital robot twin ES-SYDRZ for the first query loop AFS1 with the third instruction step AWS3 in the first instruction block AWB1, the fifth instruction step AWS5 in the second instruction block AWB2 and the first loop query SAF1.
  • the second-level uncertainty ZSU is estimated for each run by comparing the second simulation measurement data SMD2 with the environmental measurement data UMD.
  • the uncertainty is not viewed as a scalar, but as a distribution in the form of a list (or set) of individual weighted hypotheses, so-called particles.
  • This is a common representation for the robot RB, in particular in the case of mobile navigation, for example. This is determined by generating and evaluating new hypotheses, referred to as so-called "sampling". To evaluate the hypotheses, simulated depth values (depth images) of the object are compared with real (measured) depth values. The results of optical estimation methods can optionally be taken into account in the weighting.
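The "sampling" evaluation described above, comparing simulated depth values per hypothesis with real measured depth values and weighting accordingly, can be sketched with a Gaussian measurement model. A simplified Python illustration (each hypothesis's depth image is reduced to a single depth value, and the Gaussian model is an assumed, common choice; neither detail is prescribed by the patent):

```python
import math

def weight_hypotheses(simulated_depths, measured_depth, sigma=0.01):
    """Evaluate pose hypotheses (particles): each hypothesis predicts a depth
    value; its weight falls off with the squared deviation from the real
    measured depth under a Gaussian measurement model with std. dev. `sigma`
    (in metres). Weights are normalized to sum to 1, giving the weighted
    particle set that represents the uncertainty as a distribution."""
    weights = [math.exp(-((d - measured_depth) ** 2) / (2.0 * sigma ** 2))
               for d in simulated_depths]
    total = sum(weights)
    return [w / total for w in weights]

# Three hypotheses at 0.50 m, 0.52 m and 0.60 m against a 0.50 m measurement:
w = weight_hypotheses([0.50, 0.52, 0.60], 0.50)
```

The hypothesis matching the measurement receives the dominant weight; the low-weight particles are the ones later discarded by the threshold when the second-stage minimum distance is computed.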
  • an object pose distribution is determined and this is stored in the second-stage digital robot twin DRZ-ZS.
  • the second-stage digital robot twin DRZ-ZS is initialized, when first executed, with the data of the first-stage digital robot twin DRZ-ES, and the second simulation measurement data SMD2 are generated with the aid of the second-stage digital robot twin DRZ-ZS.
  • this second-stage digital robot twin DRZ-ZS is, like the first-stage digital robot twin DRZ-ES, the digital robot twin DRZ and the environmental measurement data UMD, also stored in the configuration data memory KDSP and read out by the processor PZ.
  • loop cycle conditions are checked in a second loop query SAF2.
  • a first loop condition check SAF2-a checks whether the second-level uncertainty ZSU meets the second-level accuracy requirement ZSGB; if this is met, the second-stage digital robot twin synchronization ZS-SYDRZ ends and thus the synchronization of the digital robot twin SYDRZ for the object under consideration has been successfully completed.
  • a second loop condition check SAF2-b checks whether the second-level uncertainty ZSU meets the second-level accuracy requirement ZSGB; if this is not met, the second-level uncertainty ZSU is reduced in a second instruction correction block AWKB2 by scanning the respective object OB, OB' in robot contact.
  • a third loop condition check SAF2-c checks whether (i) the second-level uncertainty ZSU meets the second-level accuracy requirement ZSGB and (ii) an improvement of the second-level uncertainty ZSU can still take place; if (i) is not yet fulfilled and (ii) is answered in the negative, the second-stage digital robot twin synchronization ZS-SYDRZ is aborted and the synchronization of the digital robot twin SYDRZ is interrupted for user interaction.
  • object pose hypotheses are generated (sampled), in particular taking into account physical constraints, and, as already mentioned, the ambient measurement data UMD determined during the ROSU acquisition are compared with the second simulation measurement data SMD2.
  • the second simulation measurement data SMD2 are generated by simulating several object pose hypotheses as part of the second-level uncertainty ZSU and as part of the second-level digital robot twin DRZ-ZS.
  • a possible object pose distribution is determined in a ninth instruction step AWS9 for each object OB, OB' with the help of probable object pose hypotheses, with object poses having a lower deviation between the determined environmental measurement data UMD and the second simulation measurement data SMD2 being more probable.
  • the sequence of the control program STP for movements of the robot RB is simulated with several probable object pose hypotheses as part of the second-stage uncertainty ZSU and as part of the second-stage digital robot twin DRZ-ZS.
  • a second minimum distance value MDW2, which defines the second-level accuracy requirement ZSGB, is determined for a second-level robot-object minimum distance that occurs in the course of the simulated control program flow.
  • the second-level robot-object minimum distance is determined largely in the same way as the first-level robot-object minimum distance; the difference is that the robot-object minimum distance is determined multiple times for different object pose hypotheses. The particles typically sampled from probabilistic pose distributions are referred to as object pose hypotheses.
  • the simulation of the control program is carried out several times with highly weighted characteristic pose hypotheses, which are determined, for example, as components of a mixture-of-Gaussians approximation.
  • the distance between the robot and each hypothesis is calculated.
  • the distance estimate is still formulated as a scalar; only its calculation differs.
  • the minimum distance denotes the shortest distance in the set of all determined distances. Hypotheses with a weighting below a threshold value are discarded for the calculation of the second-level robot-object minimum distance.
  • the second-level accuracy requirement ZSGB is comparable to the first-level accuracy requirement ESGB, with the difference that the second-level robot-object minimum distance is used.
  • a further dedicated instruction step AWS is also carried out when the second instruction correction block AWKB2 is run through.
  • FIGURE 4 illustrates, in the lower half of the figure, the second-stage synchronization of the digital robot twin ZS-SYDRZ for the second query loop AFS2, with the eighth instruction step AWS8 in the third instruction block AWB3, the eleventh instruction step AWS11 in the fourth instruction block AWB4, and the second loop query SAF2.
  • the execution of the automatic digital robot twin synchronization SYDRZ ends when, after the synchronization of the digital robot twin SYDRZ has been carried out as explained above, the digital robot twin DRZ, including the control program STP controlling the robot RB for object manipulation or the control program flow, the geometric data of the robot-object system environment ROSU, and/or the uncertainty information stored in the data memory DSP for the objects OB, OB' in the robot-object system environment ROSU, is brought into an updated, synchronized state ASZ.
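The decision logic of the second loop query SAF2 described in the steps above can be sketched as follows. This is a minimal illustration only; the function names, return labels, and the treatment of the uncertainty ZSU as a scalar are hypothetical assumptions, not part of the patent disclosure:

```python
def second_stage_sync(zsu, zsgb, can_still_improve, probe_object_in_contact):
    """Sketch of the second loop query SAF2 with checks SAF2-a/b/c.

    zsu  -- current second-level uncertainty (a scalar here, a simplification)
    zsgb -- second-level accuracy requirement the uncertainty must meet
    """
    while True:
        # SAF2-a: accuracy requirement met -> ZS-SYDRZ ends successfully
        if zsu <= zsgb:
            return "SYNCHRONIZATION_COMPLETE"
        # SAF2-c: requirement not met and no further improvement possible ->
        # abort and interrupt the synchronization for user interaction
        if not can_still_improve(zsu):
            return "INTERRUPTED_FOR_USER_INTERACTION"
        # SAF2-b: reduce the uncertainty by probing the object in robot
        # contact (second instruction correction block AWKB2)
        zsu = probe_object_in_contact(zsu)
```

The sketch makes explicit that SAF2-a and SAF2-b test the same condition with opposite outcomes, while SAF2-c is the only exit that requires user interaction.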
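The hypothesis weighting by comparison of the environment measurement data UMD with the simulated data SMD2, and the second-level robot-object minimum distance MDW2 over the retained hypotheses, could be outlined as below. The Gaussian-style scoring and all names are illustrative assumptions; the patent does not prescribe this particular computation:

```python
import math

def hypothesis_weights(umd, smd2_per_hypothesis):
    """Weight each object pose hypothesis by the agreement between the
    measured environment data UMD and its simulated data SMD2:
    a lower deviation yields a higher weight (hypothetical scoring)."""
    raw = []
    for smd2 in smd2_per_hypothesis:
        squared_error = sum((u - s) ** 2 for u, s in zip(umd, smd2))
        raw.append(math.exp(-squared_error))
    total = sum(raw)
    return [w / total for w in raw]  # normalized weights

def second_stage_min_distance(distances, weights, weight_threshold):
    """MDW2: the shortest of all robot-object distances determined for the
    individual pose hypotheses; hypotheses whose weight falls below the
    threshold are discarded first, as described above."""
    kept = [d for d, w in zip(distances, weights) if w >= weight_threshold]
    return min(kept)
```

Discarding low-weight hypotheses before taking the minimum keeps the distance estimate scalar while preventing implausible poses from dominating the safety margin.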

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Numerical Control (AREA)
  • Manipulator (AREA)
EP20828985.0A 2019-12-11 2020-12-11 Procédé, produit-programme informatique et commande de robot pour configurer un environnement système robot-objet, et robot Pending EP4041503A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP19215346.8A EP3834998A1 (fr) 2019-12-11 2019-12-11 Procédé, produit programme informatique et commande de robot permettant de configurer un environnement système objet-robot ainsi que robot
PCT/EP2020/085852 WO2021116459A2 (fr) 2019-12-11 2020-12-11 Procédé, produit-programme informatique et commande de robot pour configurer un environnement système robot-objet, et robot

Publications (1)

Publication Number Publication Date
EP4041503A2 true EP4041503A2 (fr) 2022-08-17

Family

ID=68886905

Family Applications (2)

Application Number Title Priority Date Filing Date
EP19215346.8A Withdrawn EP3834998A1 (fr) 2019-12-11 2019-12-11 Procédé, produit programme informatique et commande de robot permettant de configurer un environnement système objet-robot ainsi que robot
EP20828985.0A Pending EP4041503A2 (fr) 2019-12-11 2020-12-11 Procédé, produit-programme informatique et commande de robot pour configurer un environnement système robot-objet, et robot

Family Applications Before (1)

Application Number Title Priority Date Filing Date
EP19215346.8A Withdrawn EP3834998A1 (fr) 2019-12-11 2019-12-11 Procédé, produit programme informatique et commande de robot permettant de configurer un environnement système objet-robot ainsi que robot

Country Status (4)

Country Link
US (1) US20220388167A1 (fr)
EP (2) EP3834998A1 (fr)
CN (1) CN115279557A (fr)
WO (1) WO2021116459A2 (fr)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120290130A1 (en) * 2011-05-10 2012-11-15 Agile Planet, Inc. Method to Model and Program a Robotic Workcell
US9815198B2 (en) * 2015-07-23 2017-11-14 X Development Llc System and method for determining a work offset
JP6450737B2 (ja) * 2016-12-08 2019-01-09 ファナック株式会社 ロボットシステム
TWI650626B (zh) * 2017-08-15 2019-02-11 由田新技股份有限公司 基於三維影像之機械手臂加工方法及系統
CN108724190A (zh) * 2018-06-27 2018-11-02 西安交通大学 一种工业机器人数字孪生***仿真方法及装置

Also Published As

Publication number Publication date
CN115279557A (zh) 2022-11-01
WO2021116459A2 (fr) 2021-06-17
US20220388167A1 (en) 2022-12-08
EP3834998A1 (fr) 2021-06-16
WO2021116459A9 (fr) 2021-09-16

Similar Documents

Publication Publication Date Title
DE102019001948B4 (de) Steuerung und maschinelle Lernvorrichtung
DE112019002310B4 (de) Ausführen einer "peg in hole"-aufgabe mit unbekannter neigung
DE102014108287B4 (de) Schnelles Erlernen durch Nachahmung von Kraftdrehmoment-Aufgaben durch Roboter
EP1602456B1 (fr) Procédé et dispositif de commande de manipulateurs
DE102020100316B4 (de) Bestimmungsgerät
EP3688537A1 (fr) Procédé, dispositif et programme informatique pour faire fonctionner un système de commande de robot
DE19930087B4 (de) Verfahren und Vorrichtung zur Regelung der Vorhalteposition eines Manipulators eines Handhabungsgeräts
DE102017003943A1 (de) Zellensteuervorrichtung zum Optimieren von Bewegungen eines Produktionssystems, das Industriemaschinen umfasst
DE102015000587B4 (de) Roboterprogrammiervorrichtung zum Erstellen eines Roboterprogramms zum Aufnehmen eines Bilds eines Werkstücks
DE102006055917B4 (de) Industrieroboter und Verfahren zum Erkennen eines ungenau parametrierten Robotermodells
DE112016002013T5 (de) Systeme und Verfahren zur Steuerung einer Robotermanipulation
EP2216144B1 (fr) Système et procédé pour vérifier des composants et/ou des unités fonctionnelles avec un dispositif de test
WO2019192905A1 (fr) Procédé pour l'étalonnage d'un capteur de position dans un véhicule, programme informatique, support d'enregistrement, dispositif de commande et parcours d'étalonnage
DE102012009010A1 (de) Verfahren zum Erzeugen einer Bewegung eines Roboters
EP3227061A1 (fr) Procédé de simulation de mouvement pour un manipulateur
DE102012024934B4 (de) Verfahren und Programmiersystem zur erstmaligen Erstellung eines auf einem Messroboter ausführbaren Messprogramms für die Messung eines neuen Messobjekts
EP3760390A1 (fr) Exécution d'une tâche prédéterminée à l'aide d'au moins un robot
WO2017063887A1 (fr) Synchronisation de plusieurs robots
WO2021018552A1 (fr) Procédé et système de manipulation pour permettre à un objet d'être manipulé par un robot
EP3328595A2 (fr) Procédé et système pour commander un robot
DE102017216093B4 (de) Verfahren zur Parametrierung eines robotischen Manipulators
DE102008018962A1 (de) Verfahren zur Steuerung eines Roboters
EP4041503A2 (fr) Procédé, produit-programme informatique et commande de robot pour configurer un environnement système robot-objet, et robot
DE102018216561A1 (de) Verfahren, Vorrichtung und Computerprogramm zum Ermitteln einer Strategie eines Agenten
DE102020006160A1 (de) Verfahren zur Lageerkennung eines Objekts mittels eines Lageerfassungssystems, Verfahren zum Bearbeiten eines Objekts sowie Lageerfassungssystem

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20220513

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20240502