CN107564065A - Method for measuring the minimum human-robot distance in a collaborative environment - Google Patents

Method for measuring the minimum human-robot distance in a collaborative environment Download PDF

Info

Publication number
CN107564065A
Authority
CN
China
Prior art keywords
robot
human-robot
node
minimum distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710865268.5A
Other languages
Chinese (zh)
Other versions
CN107564065B (en)
Inventor
王政伟
甘亚辉
戴先中
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southeast University
Original Assignee
Southeast University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southeast University filed Critical Southeast University
Priority to CN201710865268.5A priority Critical patent/CN107564065B/en
Publication of CN107564065A publication Critical patent/CN107564065A/en
Application granted granted Critical
Publication of CN107564065B publication Critical patent/CN107564065B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a method for measuring the minimum human-robot distance in a collaborative environment. A general motion model of the robot is established from the relative position relations between the robot links and the rotation axes of each link, and from it a robot motion-node iteration data set for the minimum human-robot distance is built. At the same time, a 3D vision sensor captures image data of the collaborative workspace to track and identify the human body in the collaborative environment; the human skeleton node data are extracted and used to build a skeleton-node iteration data set for the minimum human-robot distance. From the two iteration data sets, the minimum distance between the human and the robot in the collaborative environment and the spatial coordinates of the corresponding closest points are computed iteratively. The iterative method of the present invention builds, in real time, an accurate and easily implemented human-robot distance model, computes the minimum human-robot distance and the corresponding points in real time, improves the safety of the robot, and tracks the minimum human-robot distance in the environment efficiently in real time.

Description

Method for measuring the minimum human-robot distance in a collaborative environment
Technical field
The present invention relates to the field of robot technology, and in particular to a method for measuring the minimum human-robot distance in a collaborative environment.
Background technology
To make up for the lack of flexibility and intelligence of traditional robots and to let robots adapt to broader and more complex task requirements, collaborative robot technology has emerged. Human-robot collaboration and co-fusion combine the "computing power, strength, precision and repeatability" of the robot with the "flexibility, experience, analysis and decision-making ability" of the human, so that the two complement each other.
Compared with conventional industrial robots, collaboration and co-fusion between humans and robots removes the protective isolation between them and merges the advantages of both, adding flexibility and safety and thus improving productivity. To realize such collaboration and co-fusion, computing the relative distance between the human and the robot accurately and in real time becomes an urgent problem. Building an accurate, real-time and easily implemented human-robot distance model not only improves the safety of the robot; through recognition and feature extraction of the human body it also allows the human posture and behavior to be anticipated, providing a reference for designing the robot's intelligent response strategy. Two mainstream solutions currently exist for this problem: 1) graph-based human-robot distance computation: distances are computed between pre-built convex bodies, but this method assumes the models are known in advance; in a collaborative environment the position and posture of the human body change dynamically, so the convex bodies composing the human body change as well, and building the convex-body model of the human dynamically in such a scenario is computationally expensive and unfavorable for real-time implementation; 2) human-robot distance computation based on planar images: a planar vision camera above the collaborative workspace acquires a scene image, and the minimum distance between robot and human is obtained by image segmentation and morphological processing; however, this method treats each connected region as a single whole, so it can only compute the distance between the human and the robot as two monolithic entities, the internal posture of each entity is unknown, and when the morphologically processed image is mapped back to the human-robot space the accuracy also drops significantly.
Summary of the invention
The technical problem to be solved by the invention is to overcome the deficiencies of the prior art and to provide a method for measuring the minimum human-robot distance in a collaborative environment. The method is simple and easy to implement, and the minimum-distance results obtained with it are accurate and available in real time.
The present invention adopts the following technical scheme to solve the above technical problem:
The method for measuring the minimum human-robot distance in a collaborative environment proposed by the present invention comprises the following steps:
Step S1: determine the relative pose between the robot and the 3D vision sensor using a position measurement and calibration system, and obtain the transformation between the robot base coordinate frame and the 3D vision sensor coordinate frame;
S2: according to the relative position relations between the robot links and the rotation relations of each link, establish a reference coordinate frame for each link, the origin of which is a key motion node of the robot; according to the relative pose relations between the key motion nodes of the robot, establish the generalized coordinate model of the robot; compute the calculation margin of each robot link; set the iteration order vector of the robot; read the joint angle data during robot motion in real time; and, from the joint angle data, the generalized coordinate model of the robot and the set iteration order vector of the robot, build the robot motion-node iteration data set for the minimum human-robot distance;
S3: track and identify the human body in the collaborative environment with the 3D vision sensor, extract the human skeleton node data from the human body information, and compute the safety margin of the human body and the iteration order vector of the skeleton; then build the skeleton-node iteration data set for the minimum human-robot distance from the skeleton node data and the skeleton iteration order vector;
S4: from the obtained robot motion-node iteration data set, the skeleton-node iteration data set, the calculation margins of the robot links and the safety margin of the human body, iteratively compute the minimum human-robot distance in the collaborative environment and the coordinates of the spatial points corresponding to the minimum distance.
As a further optimization of the method for measuring the minimum human-robot distance in a collaborative environment according to the present invention, in step S1 the relative position and posture between the robot coordinate frame and the sensor coordinate frame of the 3D vision sensor are calibrated by the calibration system, and the transformation matrix between the two is determined.
As a further optimization of the method for measuring the minimum human-robot distance in a collaborative environment according to the present invention, in step S2:
The relative position relation between the robot links refers to the following: in the order of connection from the robot base to the end flange, the later-connected link is translated, relative to the reference frame of the previous link in the generalized coordinate model of the robot, along the three coordinate axes of the previous link's reference frame; this relative translation is written ^(i-1)A_i = (x y z), where ^(i-1)A_i is the relative translation of the i-th link of the robot with respect to the (i-1)-th link, and x, y, z are the translations along the x, y and z axes respectively;
The rotation relation of a link refers to its rotation axis expressed in the link's reference frame, written R_i = (r_x r_y r_z), where R_i is the rotation axis of the i-th link and r_x, r_y, r_z are the components of the direction vector of the rotation axis along the three coordinate axes of the link's reference frame;
The calculation margin of each robot link is determined by the following criterion: taking the origins of the reference frames of adjacent links in the generalized coordinate model as end points, and the line connecting the two end points as the central axis of a cylinder, a cylinder is built; the calculation margin of the link is then the infimum of the radii of the cylinders between the two end faces that completely envelop the robot link;
The joint angle data during robot motion are read in real time, and from the joint angle data, the generalized coordinate model of the robot and the set iteration order vector of the robot, the robot motion-node iteration data set for the minimum human-robot distance is built, specifically as follows:
Read the joint angle data during robot motion in real time, obtain the positions of the key motion nodes of the robot at the current moment through the generalized coordinate model, establish the chain-structured key motion node data of the robot in the order of its mechanical connection from the base to the end flange, establish the iteration order vector of the robot, and construct the robot motion-node iteration data set L for the minimum human-robot distance.
As a further optimization of the method for measuring the minimum human-robot distance in a collaborative environment according to the present invention, step S3 is as follows:
S3.1: from the image data and depth information of the human-robot collaboration scene acquired by the 3D vision sensor, obtain the recognition result of the human body through image analysis and depth data processing and track the human body in the collaborative environment; segment the human body from the environment; compute the human skeleton node data with the depth information; and extract the spatial positions of the skeleton nodes through the mapping between the image and the space;
S3.2: from the human recognition result and the human skeleton node data generated by the 3D vision, determine the positions in the physical space that the recognized skeleton nodes correspond to on the actual human body, and compute the safety margin of the human body accordingly;
S3.3: from the skeleton node data extracted in S3.1, take the indices of the nodes corresponding to adjacent skeleton joints in the skeleton node data as the order marks, and compute the iteration order vector of the skeleton; the skeleton iteration order vector and the human skeleton node data together form the skeleton-node iteration data set H.
As a further optimization of the method for measuring the minimum human-robot distance in a collaborative environment according to the present invention, the iterative calculation in S4 comprises the following elements:
S4.1: input the robot motion-node iteration data set L and the skeleton-node iteration data set H, and initialize the screening matrix E; judge whether the data sets L and H are empty; if they are empty, terminate the calculation; otherwise execute S4.2;
S4.2: extract an element L_r' from the robot motion-node iteration data set L according to the iteration order vector of the robot, and an element L_h' from the skeleton-node iteration data set H according to the iteration order vector of the skeleton; L_r' and L_h' each correspond to a line segment in space, giving two spatial line segments, the one corresponding to L_r' denoted L_r and the one corresponding to L_h' denoted L_h;
S4.3: initialize the iteration constants and direction vectors; initialize s_0, s_1 with the two end points of L_h, so that s_0, s_1 are two points of the spatial segment corresponding to L_h; initialize t_0, t_1 with the two end points of L_r, so that t_0, t_1 are two points of the spatial segment corresponding to L_r; let the vectors u and v be the direction vectors of L_h and L_r respectively, and w_0 the vector between the end points of the two spatial segments; their expressions are given by formulas (1)(2)(3):
u = s_1 - s_0 (1)
v = t_1 - t_0 (2)
w_0 = s_0 - t_0 (3)
S4.4: initialize the iteration variables: let S_temp = s_j = s_0 and T_temp = t_j = t_0, where S_temp, T_temp are temporary variables of the iteration and s_j, t_j are the variables computed during the iteration;
S4.5: compute the vector coefficient c_t:
c_t = (w_0 · v) / ||v|| (4)
If c_t ∈ (-∞, 0) then t_j = t_0; if c_t ∈ (1, ∞) then t_j = t_1; if c_t ∈ [0, 1] then execute formula (5):
t_j = t_0 + c_t v (5)
Compute the vector coefficient c_s:
c_s = ((t_j - s_0) · u) / ||u|| (6)
If c_s ∈ (-∞, 0) then s_j = s_0; if c_s ∈ (1, ∞) then s_j = s_1; if c_s ∈ [0, 1] then execute formula (7):
s_j = s_0 + c_s u (7)
S4.6: compute formula (8), which is the iteration stopping condition;
(||S_temp - s_j|| ≤ δ) ∧ (||T_temp - t_j|| ≤ δ) (8)
where δ is the preset iteration threshold; if formula (8) evaluates to true, compute the minimum distance d(L_r, L_h) between L_r and L_h with formula (9), combine d(L_r, L_h) and s_j, t_j into a column vector and store it in the screening matrix E, then judge whether the data sets L and H are empty; if they are not empty, execute S4.2; if they are empty, all elements of the data sets have been iterated over, and S4.7 is executed;
d(L_r, L_h) = ||s_j - t_j|| (9)
If formula (8) evaluates to false, update the iteration variables as in formulas (10)(11):
S_temp = s_j (10)
T_temp = t_j (11)
then execute S4.5;
S4.7: compute formula (12) on the screening matrix E; the minimum human-robot distance d_min(L, H) is then:
d_min(L, H) = min { d | d ∈ C, C = [1 0 0] E } (12)
where d is an element of the matrix C; with the minimum human-robot distance d_min(L, H) obtained from formula (12), find the index corresponding to d_min(L, H) in the screening matrix E, and read out, by that index, the two spatial points corresponding to the minimum human-robot distance d_min(L, H).
As a further optimization of the method for measuring the minimum human-robot distance in a collaborative environment according to the present invention, the iteration stopping condition is: when, during the iteration over two spatial line segments, the iteration spatial points returned in two successive iterations are identical or differ by less than a given infinitesimal quantity, the iteration is stopped.
As a further optimization of the method for measuring the minimum human-robot distance in a collaborative environment according to the present invention, the key motion nodes of the robot are motion nodes of the robot chosen in advance.
Compared with the prior art, the present invention, by adopting the above technical scheme, has the following technical effects:
(1) It tracks the minimum human-robot distance in the environment efficiently in real time and solves this problem in actual human-robot collaboration: the human-robot distance and relative position relations present in actual collaboration are dissected, their essential characteristics are analysed, and a corresponding iterative computation method is proposed, which avoids the complexity of traditional theoretical analysis and the exhaustive case-by-case treatment of distance computation, simplifies the computation process and is easier to implement in a program;
(2) The method of the invention is simple, practical and general: in robot modeling, compared with traditional robot modeling methods, the modeling process is greatly simplified and the modeling difficulty is reduced; in human-body modeling, the heavy computational load of traditional methods is avoided; any structure or mechanism similar to a manipulator can be modeled and analysed in the same way;
(3) The measured minimum human-robot distance is more accurate, the computational load is small, and the real-time performance is high. Compared with traditional methods that directly build convex-body models or compute the minimum human-robot distance directly by image processing, the method improves the computing performance and substantially reduces the computational load, so the real-time requirement of human-robot collaboration can be met; moreover, owing to the inherent characteristics of the constructed key-motion-node model and kinematic-chain model of the targets, the mutual relations between the parts of the targets can be analysed accurately.
Brief description of the drawings
Fig. 1a shows one of the human-robot collaboration scenes addressed by the specific embodiment of the invention.
Fig. 1b is a schematic diagram of feature extraction from the human-robot collaboration scene addressed by the specific embodiment of the invention.
Fig. 2 is a schematic diagram of the overall structural elements of the specific embodiment of the invention.
Fig. 3a~Fig. 3f are schematic diagrams of how the robot link reference frames are established in the specific embodiment of the invention.
Fig. 4a is a schematic diagram of the robot of the specific embodiment of the invention.
Fig. 4b is a schematic diagram of the key motion nodes of the robot of the specific embodiment of the invention.
Fig. 5 is a schematic diagram of the key motion nodes of the human body of the specific embodiment of the invention.
Fig. 6 is a schematic diagram of the iterative algorithm for the minimum human-robot distance of the specific embodiment of the invention.
Fig. 7 is a flow chart of the iterative algorithm for the minimum human-robot distance of the specific embodiment of the invention.
Fig. 8a~Fig. 8f are schematic diagrams of the results of the iterative algorithm based on the human-robot distance model of the specific embodiment of the invention; the corresponding times are t=0ms, t=2344ms, t=4396ms, t=6072ms, t=8204ms and t=10000ms respectively.
The reference signs in the figures: A~G are the key motion nodes of the robot in Fig. 4b, and H is an additional node.
Embodiment
The technical scheme of the present invention is described in further detail below with reference to the accompanying drawings:
The present embodiment discloses an iterative method for computing the minimum human-robot distance in a collaborative environment, comprising the following steps:
S1: determine the relative pose between the robot and the 3D vision sensor using a position measurement and calibration system, and obtain the transformation between the robot base coordinate frame and the 3D vision sensor coordinate frame;
S2: according to the relative position relations between the robot links and the rotation relations of each link, establish a reference coordinate frame for each link, the origin of which is a key motion node of the robot; according to the relative pose relations between the key motion nodes of the robot, establish the generalized coordinate model of the robot; compute the calculation margin of each robot link; set the iteration order vector of the robot; read the joint angle data during robot motion in real time; and, from the joint angle data, the generalized coordinate model of the robot and the set iteration order vector of the robot, build the robot motion-node iteration data set for the minimum human-robot distance;
S3: track and identify the human body in the collaborative environment with the 3D vision sensor, extract the human skeleton node data from the human body information, and compute the safety margin of the human body and the iteration order vector of the skeleton; then build the skeleton-node iteration data set for the minimum human-robot distance from the skeleton node data and the skeleton iteration order vector;
S4: from the obtained robot motion-node iteration data set, the skeleton-node iteration data set, the calculation margins of the robot links and the safety margin of the human body, iteratively compute the minimum human-robot distance in the collaborative environment and the coordinates of the spatial points corresponding to the minimum distance.
Fig. 1a is one of the human-robot collaboration scenes, and Fig. 1b is a schematic diagram of feature extraction from that scene. For an operating scene in which humans and robots collaborate and co-fuse, the number of robots, or the number of manipulator arms a robot possesses, is not limited. To realize human-robot collaboration, however, it is necessary to build an accurate, real-time and easily implemented human-robot distance model; this not only improves the safety of the robot, but also allows, through recognition and feature extraction of the human body, the human posture and behavior to be anticipated, providing a reference for designing the robot's intelligent response strategy.
Fig. 2 is a schematic diagram of the overall structural elements of the method. Before the collaborative task is carried out, the first problem to solve is to determine the pose transformation between the 3D sensor and the robot body. Because the mounting position of the sensor is arbitrary, a reference coordinate frame, also called the world coordinate frame, must be chosen in the environment to describe the position of the sensor and, at the same time, the information in the environment. The transformation between the sensor coordinate frame and the world coordinate frame can be represented by a homogeneous transformation matrix. Let the homogeneous coordinates of any point P in space be P_w = (x_w y_w z_w 1)^T in the world frame and P = (x y z 1)^T in the sensor frame; the two coordinates are then related by formula (1):
P = M_s P_w, with M_s = [ R T ; 0^T 1 ] (1)
where R is a 3×3 orthogonal matrix, T is a 3D translation vector, 0 = (0 0 0)^T, and M_s is a 4×4 matrix. Taking the above into account and combining the internal parameters of the sensor, the transformation matrix between the two frames can finally be determined.
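As an illustration only (the rotation R and translation T below are placeholder values, not calibration results of the patent), formula (1) can be applied in a few lines of Python; a point measured by the sensor is expressed in the world frame with the inverse transform:

    import numpy as np

    # Placeholder extrinsics from the calibration step: world frame -> sensor frame.
    R = np.eye(3)                       # 3x3 orthogonal rotation matrix
    T = np.array([0.5, 0.0, 1.2])       # 3D translation vector (metres)
    M_s = np.eye(4)                     # the 4x4 homogeneous matrix of formula (1)
    M_s[:3, :3] = R
    M_s[:3, 3] = T

    def sensor_to_world(p_sensor):
        """Express a point measured in the sensor frame in the world frame (inverse of formula (1))."""
        p_h = np.append(p_sensor, 1.0)  # homogeneous coordinates (x y z 1)^T
        return np.linalg.solve(M_s, p_h)[:3]

    print(sensor_to_world(np.array([0.1, 0.2, 0.8])))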
When the generalized coordinate model of the robot is established, a reference frame must be established for each robot link in a unified way, in the order from the robot base to the end flange. With the robot in its initial state: the origin of the reference frame of the base link lies at the intersection of the rotation axis of the first joint and the bottom surface of the base; the origin of the reference frame of the second link lies at the intersection of the mating plane between the base and the second link and the rotation axis of the first joint; the origin of the reference frame of the third link lies at the intersection of the rotation axis of the second joint and the plane of symmetry of the third link; the origin of the reference frame of the fourth link lies at the intersection of the rotation axis of the third joint and the plane of symmetry of the fourth link; the origin of the reference frame of the fifth link lies at the intersection of the rotation axis of the fourth joint and the mating plane between the fourth and fifth links; the origin of the reference frame of the sixth link lies at the intersection of the fifth joint and the plane of symmetry of the fifth link; and the origin of the reference frame of the seventh link lies at the intersection of the central axis of the end flange and the outer flange face. The X axes of all link reference frames point towards the front of the robot, the Z axes all point vertically upwards, and the Y axes are determined by the right-handed Cartesian convention, as shown in Fig. 3a~Fig. 3f. The spatial translation relations of the links are then computed.
Determining the relative spatial translation between links means determining the translation of the later link relative to the previous link in Cartesian space, written ^(i-1)A_i = (x y z), where ^(i-1)A_i is the relative translation of the i-th link with respect to the (i-1)-th link and x, y, z are the translations along the x, y and z axes. The translation relation Transl(A_i) between links is then defined as in formula (2):
Transl(A_i) = [ 1 0 0 x ; 0 1 0 y ; 0 0 1 z ; 0 0 0 1 ] (2)
At the same time the rotation relation of each link is determined, specifically the rotation axis of the link in its Cartesian frame, written R_i = (r_x r_y r_z), where R_i is the rotation axis of the i-th link and r_x, r_y, r_z are the components of its direction vector along the three axes of the link's reference frame. For example, when r_x is -1 and the other two elements are 0, the link rotates about the negative x axis of its Cartesian frame; when r_x is 1 and the other two elements are 0, the link rotates about the positive x axis of its Cartesian frame. Taking R_i = (0 0 1) and a rotation angle θ as an example, the rotation relation is as in formula (3):
Trot(R_i, θ) = [ cosθ -sinθ 0 0 ; sinθ cosθ 0 0 ; 0 0 1 0 ; 0 0 0 1 ] (3)
Because the reference frames of all robot links are established in the same way and have the same orientation, only spatial translations exist between the link reference frames, which simplifies the modeling process and the form of the model. The generalized coordinate model of the robot established with the method defined above is then given by formulas (4)~(11):
T_A = Transl(A_1) * Trot(R_1, θ_0) (4)
T_B = T_A * Transl(A_2) * Trot(R_2, θ_1) (5)
T_C = T_B * Transl(A_3) * Trot(R_3, θ_2) (6)
T_D = T_C * Transl(A_4) * Trot(R_4, θ_3) (7)
T_E = T_D * Transl(A_5) * Trot(R_5, θ_4) (8)
T_F = T_E * Transl(A_6) * Trot(R_6, θ_5) (9)
T_G = T_F * Transl(A_7) (10)
T_H = T_D * Transl(S_Z · A_5) (11)
where S_Z = (0 0 1) is a coordinate selection operator, as in formula (12):
S_Z · A_5 = (0 0 1) · (x_5 y_5 z_5) = (0 0 z_5) (12)
The mapping between the key motion nodes of the robot and the joint angles of the robot is thus obtained. Fig. 4a is a schematic diagram of the robot of this embodiment, and Fig. 4b is a schematic diagram of its key motion nodes, where A~G are the key motion nodes of the robot and H is an additional node used to fit the shape of the robot links.
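A minimal Python sketch of evaluating formulas (4)~(10) to obtain the key motion node positions (illustrative only: the link offsets A_i, rotation axes R_i and joint angles below are placeholder values, not the parameters of the robot in the embodiment):

    import numpy as np

    def transl(a):
        """Homogeneous translation by vector a = (x, y, z), as in formula (2)."""
        M = np.eye(4)
        M[:3, 3] = a
        return M

    def trot(axis, theta):
        """Homogeneous rotation by angle theta about a unit axis (Rodrigues formula)."""
        k = np.asarray(axis, dtype=float)
        K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
        R = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
        M = np.eye(4)
        M[:3, :3] = R
        return M

    # Placeholder link offsets A_1..A_7 and rotation axes R_1..R_6 (illustrative values only)
    A = [np.array(v) for v in [(0, 0, 0.1), (0, 0, 0.2), (0, 0, 0.3),
                               (0, 0, 0.25), (0, 0, 0.2), (0, 0, 0.1), (0, 0, 0.05)]]
    R_axes = [(0, 0, 1), (0, 1, 0), (0, 1, 0), (1, 0, 0), (0, 1, 0), (1, 0, 0)]
    theta = [0.0, 0.3, -0.5, 0.2, 0.1, 0.0]          # current joint angles (rad)

    # Chain the transforms as in formulas (4)~(10); each node position is the last column.
    T = np.eye(4)
    nodes = []
    for i in range(6):
        T = T @ transl(A[i]) @ trot(R_axes[i], theta[i])
        nodes.append(T[:3, 3].copy())                 # key motion nodes A..F
    T = T @ transl(A[6])
    nodes.append(T[:3, 3].copy())                     # end-flange node G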
The calculation margin of each robot link is computed by the following criterion: taking adjacent key motion nodes in the generalized coordinate model as the centres of the end faces and their connecting line as the central axis, a cylinder is built; the calculation margin of the link is the infimum of the cylinder radii that completely envelop the robot link between the two centres.
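A minimal sketch of estimating that enveloping radius (assumptions: the link surface is available as a sampled point cloud called link_points; the names are illustrative and not part of the patent). The radius is the largest distance from the sampled surface points to the axis through the two adjacent key motion nodes:

    import numpy as np

    def envelope_radius(p0, p1, link_points):
        """Smallest cylinder radius about the axis p0-p1 that covers all sampled link points."""
        axis = p1 - p0
        axis = axis / np.linalg.norm(axis)
        rel = link_points - p0
        # distance of each point to the axis line = norm of its component orthogonal to the axis
        along = rel @ axis
        radial = rel - np.outer(along, axis)
        return float(np.linalg.norm(radial, axis=1).max())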
For the robot motion-node iteration data set: in the collaborative environment, the joint angle data of the robot are read in real time, the position of each key motion node of the robot at the current moment is obtained from the generalized coordinate model, the chain-structured feature point data of the robot are established in the order of its mechanical connection, the iteration order vector is established, and the robot motion-node iteration data set L for the minimum human-robot distance is constructed.
Meanwhile, according to the feedback data of the 3D vision sensor, the human body in the collaborative environment is recognized and tracked by analysing the colour image and segmented from the environment, and its skeleton node data are computed; the spatial positions of the skeleton nodes are extracted through the mapping between the 3D point cloud data and the colour image. From the point cloud data generated by the 3D vision and the human recognition result, combined with the point cloud information, the positions in the physical space that the recognized skeleton corresponds to on the actual human body are determined, and the human safety margin used in the computation is set accordingly.
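A minimal sketch of the image-to-space mapping used here (the pinhole intrinsics fx, fy, cx, cy below are placeholder values, not the parameters of the sensor in the embodiment): each detected skeleton joint pixel plus its depth value is back-projected to a 3D point in the sensor frame, which can then be expressed in the world frame with formula (1):

    import numpy as np

    # Placeholder pinhole intrinsics of the depth camera (focal lengths and principal point, pixels).
    fx, fy, cx, cy = 525.0, 525.0, 319.5, 239.5

    def pixel_to_point(u, v, depth):
        """Back-project image pixel (u, v) with depth (metres) to a 3D point in the sensor frame."""
        z = depth
        x = (u - cx) * z / fx
        y = (v - cy) * z / fy
        return np.array([x, y, z])

    # e.g. a skeleton joint detected at pixel (320, 240) with 1.5 m depth:
    print(pixel_to_point(320, 240, 1.5))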
The key motion node data of the human and the robot can be obtained by the above procedures, but these data are stored in no particular order; they must be processed and marked to generate iteration index information, which is then combined with the raw data to build the skeleton-node iteration data set for the minimum human-robot distance.
For the skeleton node data, the extracted skeleton nodes are shown in Fig. 5. According to the distribution and adjacency of the skeleton, the indices of the nodes corresponding to adjacent skeleton joints in the skeleton node data are taken as the order marks; taking each single bone of the recognized skeleton as a unit, the skeleton data set is constructed, and all nodes are re-ordered according to a tree structure with the head node as the root, so that the tree-structured data represent the corresponding skeleton features. Following the traversal order of the tree, the bone-node indices of the parent node and child node of each branch are stored, forming the iteration order vector. The iteration order vector and the skeleton node data together form the skeleton-node iteration data set H.
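A minimal sketch of forming the iteration order vector as parent-child index pairs by traversing the skeleton tree from the head node (the adjacency table below is an illustrative Kinect-style skeleton, not the exact node set of the embodiment):

    from collections import defaultdict, deque

    # Illustrative skeleton adjacency: child node index -> parent node index (node 0 is the head/root).
    children = defaultdict(list)
    for child, parent in {1: 0, 2: 1, 3: 2, 4: 2, 5: 3, 6: 4}.items():
        children[parent].append(child)

    def iteration_order(children, root=0):
        """Breadth-first traversal from the root; each visited edge is one bone (parent, child)."""
        order, queue = [], deque([root])
        while queue:
            node = queue.popleft()
            for c in children[node]:
                order.append((node, c))   # one spatial line segment of the data set H
                queue.append(c)
        return order

    H_order = iteration_order(children)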
After this, the minimum distance between the human and the robot in the collaborative environment and the spatial coordinates of the corresponding points are iteratively computed from the iteration data sets obtained above; the iterative computation is illustrated in Fig. 6.
From the iteration data sets, the mathematical relation for the distance between iteration elements is established. Let L_r ∈ L be any iteration element of the robot motion-node iteration data set and L_h ∈ H any iteration element of the skeleton-node iteration data set; each corresponds to a line segment in space, and t_j, s_j are arbitrary points on L_r and L_h respectively. The minimum distance between the iteration elements is then defined as in formula (13):
d(L_r, L_h) = min ||s_j - t_j||, s_j ∈ L_h, t_j ∈ L_r (13)
Further, the relation between a spatial point and a line segment is established. Let s_0, s_1, t_0, t_1 be the end points of the two line segments; then the following relations hold:
s_j = s_0 + c_s u, c_s ∈ [0, 1] (14)
t_j = t_0 + c_t v, c_t ∈ [0, 1] (15)
u = s_1 - s_0 (16)
v = t_1 - t_0 (17)
w_0 = s_0 - t_0 (18)
where c_s, c_t are vector coefficients, u and v are the direction vectors of the respective segments, and w_0 is the vector between the end points of the two segments. For a point on one segment, the following relation holds with respect to the other segment; taking s_0 as an example:
(s_0 - t_j) · v = 0 (19)
In the iterative computation, in each iteration of the point-to-segment computation, if c_t ∈ (-∞, 0) the iteration spatial point is updated with the left end point of the spatial segment; if c_t ∈ [0, 1] the updated spatial point is computed with formula (21); if c_t ∈ (1, ∞) the iteration spatial point is updated with the right end point of the spatial segment, where
c_t = (w_0 · v) / ||v|| (20)
t_j = t_0 + c_t v (21)
If ||v|| = 0 in formula (20), the spatial segment degenerates to a single point, and the distance from this degenerate point to the other segment is computed instead.
The minimum distance between any two line segments in space exists and is unique. It can be obtained after a finite number of iterations, and there exists an iteration number N such that for n ≥ N the spatial points corresponding to the minimum distance between the two segments no longer change. Based on this, the iteration stopping condition is set as: when, during the iteration over two spatial line segments, the iteration spatial points returned in two successive iterations are identical or differ by less than a given infinitesimal quantity, the iteration is stopped.
The iterative result for the minimum human-robot distance in the collaborative environment is then obtained by formula (22), and the coordinates of the corresponding closest spatial points are the values of the iteration variables when the iteration stops:
d_min(L, H) = min { d | d ∈ C, C = [1 0 0] E } (22)
The specific iterative computation flow is shown in Fig. 7. The detailed iteration steps are as follows:
Step 1: input the robot motion-node iteration data set L and the skeleton-node iteration data set H, and initialize the screening matrix E; judge whether the data sets L and H are empty; if they are empty, terminate the calculation; otherwise execute Step 2;
Step 2: extract an element L_r' from the robot motion-node iteration data set L according to the iteration order vector of the robot, and an element L_h' from the skeleton-node iteration data set H according to the iteration order vector of the skeleton; L_r' and L_h' each correspond to a line segment in space, giving two spatial line segments, the one corresponding to L_r' denoted L_r and the one corresponding to L_h' denoted L_h;
Step 3: initialize the iteration constants and direction vectors; initialize s_0, s_1 with the two end points of L_h, so that s_0, s_1 are two points of the spatial segment corresponding to L_h; initialize t_0, t_1 with the two end points of L_r, so that t_0, t_1 are two points of the spatial segment corresponding to L_r; let the vectors u and v be the direction vectors of L_h and L_r respectively, and w_0 the vector between the end points of the two spatial segments; their expressions are given by formulas (23)(24)(25):
u = s_1 - s_0 (23)
v = t_1 - t_0 (24)
w_0 = s_0 - t_0 (25)
Step 4: initialize the iteration variables: let S_temp = s_j = s_0 and T_temp = t_j = t_0, where S_temp, T_temp are temporary variables of the iteration and s_j, t_j are the variables computed during the iteration;
Step 5: compute the vector coefficient c_t:
c_t = (w_0 · v) / ||v|| (26)
If c_t ∈ (-∞, 0) then t_j = t_0; if c_t ∈ (1, ∞) then t_j = t_1; if c_t ∈ [0, 1] then execute formula (27):
t_j = t_0 + c_t v (27)
Compute the vector coefficient c_s:
c_s = ((t_j - s_0) · u) / ||u|| (28)
If c_s ∈ (-∞, 0) then s_j = s_0; if c_s ∈ (1, ∞) then s_j = s_1; if c_s ∈ [0, 1] then execute formula (29):
s_j = s_0 + c_s u (29)
Step 6: compute formula (30), which is the iteration stopping condition;
(||S_temp - s_j|| ≤ δ) ∧ (||T_temp - t_j|| ≤ δ) (30)
where δ is the preset iteration threshold; if formula (30) evaluates to true, compute the minimum distance d(L_r, L_h) between L_r and L_h with formula (31), combine d(L_r, L_h) and s_j, t_j into a column vector and store it in the screening matrix E, then judge whether the data sets L and H are empty; if they are not empty, execute Step 2; if they are empty, all elements of the data sets have been iterated over, and Step 7 is executed;
d(L_r, L_h) = ||s_j - t_j|| (31)
If formula (30) evaluates to false, update the iteration variables as in formulas (32)(33):
S_temp = s_j (32)
T_temp = t_j (33)
then execute Step 5;
Step 7: compute formula (34) on the screening matrix E; the minimum human-robot distance d_min(L, H) is then:
d_min(L, H) = min { d | d ∈ C, C = [1 0 0] E } (34)
where d is an element of the matrix C; with the minimum human-robot distance d_min(L, H) obtained from formula (34), find the index corresponding to d_min(L, H) in the screening matrix E, and read out, by that index, the two spatial points corresponding to the minimum human-robot distance d_min(L, H).
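A minimal Python sketch of the alternating-projection idea behind Steps 3~6 and the screening over the two data sets (illustrative only, not the patent's reference implementation; for clarity the coefficients are computed as standard projections c_t = ((s_j - t_0)·v)/||v||^2 rather than in the condensed notation of formulas (26)/(28), and all names are assumptions):

    import numpy as np

    def closest_points_between_segments(s0, s1, t0, t1, delta=1e-6, max_iter=100):
        """Alternating projection between segment [s0, s1] (skeleton bone) and [t0, t1] (robot link).
        Returns (distance, s_j, t_j)."""
        u, v = s1 - s0, t1 - t0
        s_j, t_j = s0.copy(), t0.copy()
        for _ in range(max_iter):
            s_prev, t_prev = s_j.copy(), t_j.copy()
            # project s_j onto the line of [t0, t1] and clamp to the segment
            c_t = 0.0 if np.dot(v, v) == 0 else np.clip(np.dot(s_j - t0, v) / np.dot(v, v), 0.0, 1.0)
            t_j = t0 + c_t * v
            # project t_j onto the line of [s0, s1] and clamp to the segment
            c_s = 0.0 if np.dot(u, u) == 0 else np.clip(np.dot(t_j - s0, u) / np.dot(u, u), 0.0, 1.0)
            s_j = s0 + c_s * u
            # stopping condition in the spirit of formula (30): both points stopped moving
            if np.linalg.norm(s_prev - s_j) <= delta and np.linalg.norm(t_prev - t_j) <= delta:
                break
        return float(np.linalg.norm(s_j - t_j)), s_j, t_j

    def minimum_human_robot_distance(L, H):
        """Screen every (robot link, skeleton bone) pair and keep the overall minimum (Steps 1, 2, 7).
        L and H are non-empty lists of segments; each segment is a pair of 3D numpy points."""
        if not L or not H:
            return None
        E = [closest_points_between_segments(s0, s1, t0, t1)
             for (s0, s1) in H for (t0, t1) in L]
        return min(E, key=lambda row: row[0])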
The results of the iterative algorithm are shown in Fig. 8a~Fig. 8f. The above iterative computation finally yields the minimum distance between the human and the robot and the coordinates of the corresponding spatial points; in addition, the detailed relative position relations between the key motion nodes of the robot and of the human, and between the kinematic chains formed by those motion nodes, are obtained during the iteration, completing the real-time computation of the minimum human-robot distance and the corresponding points in the collaborative environment.
The above is a further detailed description of the present invention in combination with specific preferred embodiments, but the specific implementation of the present invention is not limited to these descriptions. For ordinary technical personnel in the technical field of the present invention, several simple deductions or substitutions can be made without departing from the concept of the present invention, and these should all be regarded as falling within the protection scope of the present invention.

Claims (7)

1. A method for measuring the minimum human-robot distance in a collaborative environment, characterised in that it comprises the following steps:
Step S1: determine the relative pose between the robot and the 3D vision sensor using a position measurement and calibration system, and obtain the transformation between the robot base coordinate frame and the 3D vision sensor coordinate frame;
S2: according to the relative position relations between the robot links and the rotation relations of each link, establish a reference coordinate frame for each link, the origin of which is a key motion node of the robot; according to the relative pose relations between the key motion nodes of the robot, establish the generalized coordinate model of the robot; compute the calculation margin of each robot link; set the iteration order vector of the robot; read the joint angle data during robot motion in real time; and, from the joint angle data, the generalized coordinate model of the robot and the set iteration order vector of the robot, build the robot motion-node iteration data set for the minimum human-robot distance;
S3: track and identify the human body in the collaborative environment with the 3D vision sensor, extract the human skeleton node data from the human body information, and compute the safety margin of the human body and the iteration order vector of the skeleton; then build the skeleton-node iteration data set for the minimum human-robot distance from the skeleton node data and the skeleton iteration order vector;
S4: from the obtained robot motion-node iteration data set, the skeleton-node iteration data set, the calculation margins of the robot links and the safety margin of the human body, iteratively compute the minimum human-robot distance in the collaborative environment and the coordinates of the spatial points corresponding to the minimum distance.
2. The method for measuring the minimum human-robot distance in a collaborative environment according to claim 1, characterised in that in step S1 the relative position and posture between the robot coordinate frame and the sensor coordinate frame of the 3D vision sensor are calibrated by the calibration system, and the transformation matrix between the two is determined.
3. The method for measuring the minimum human-robot distance in a collaborative environment according to claim 1, characterised in that in step S2:
the relative position relation between the robot links refers to the following: in the order of connection from the robot base to the end flange, the later-connected link is translated, relative to the reference frame of the previous link in the generalized coordinate model of the robot, along the three coordinate axes of the previous link's reference frame; this relative translation is written ^(i-1)A_i = (x y z), where ^(i-1)A_i is the relative translation of the i-th link of the robot with respect to the (i-1)-th link, and x, y, z are the translations along the x, y and z axes respectively;
the rotation relation of a link refers to its rotation axis expressed in the link's reference frame, written R_i = (r_x r_y r_z), where R_i is the rotation axis of the i-th link and r_x, r_y, r_z are the components of the direction vector of the rotation axis along the three coordinate axes of the link's reference frame;
the calculation margin of each robot link is determined by the following criterion: taking the origins of the reference frames of adjacent links in the generalized coordinate model as end points, and the line connecting the two end points as the central axis of a cylinder, a cylinder is built; the calculation margin of the link is then the infimum of the radii of the cylinders between the two end faces that completely envelop the robot link;
the joint angle data during robot motion are read in real time, and from the joint angle data, the generalized coordinate model of the robot and the set iteration order vector of the robot, the robot motion-node iteration data set for the minimum human-robot distance is built, specifically as follows:
read the joint angle data during robot motion in real time, obtain the positions of the key motion nodes of the robot at the current moment through the generalized coordinate model, establish the chain-structured key motion node data of the robot in the order of its mechanical connection from the base to the end flange, establish the iteration order vector of the robot, and construct the robot motion-node iteration data set L for the minimum human-robot distance.
4. The method for measuring the minimum human-robot distance in a collaborative environment according to claim 1, characterised in that step S3 is as follows:
S3.1: from the image data and depth information of the human-robot collaboration scene acquired by the 3D vision sensor, obtain the recognition result of the human body through image analysis and depth data processing and track the human body in the collaborative environment; segment the human body from the environment; compute the human skeleton node data with the depth information; and extract the spatial positions of the skeleton nodes through the mapping between the image and the space;
S3.2: from the human recognition result and the human skeleton node data generated by the 3D vision, determine the positions in the physical space that the recognized skeleton nodes correspond to on the actual human body, and compute the safety margin of the human body accordingly;
S3.3: from the skeleton node data extracted in S3.1, take the indices of the nodes corresponding to adjacent skeleton joints in the skeleton node data as the order marks, and compute the iteration order vector of the skeleton; the skeleton iteration order vector and the human skeleton node data together form the skeleton-node iteration data set H.
5. The method for measuring the minimum human-robot distance in a collaborative environment according to claim 1, characterised in that the iterative calculation in S4 comprises the following elements:
S4.1: input the robot motion-node iteration data set L and the skeleton-node iteration data set H, and initialize the screening matrix E; judge whether the data sets L and H are empty; if they are empty, terminate the calculation; otherwise execute S4.2;
S4.2: extract an element L_r' from the robot motion-node iteration data set L according to the iteration order vector of the robot, and an element L_h' from the skeleton-node iteration data set H according to the iteration order vector of the skeleton; L_r' and L_h' each correspond to a line segment in space, giving two spatial line segments, the one corresponding to L_r' denoted L_r and the one corresponding to L_h' denoted L_h;
S4.3: initialize the iteration constants and direction vectors; initialize s_0, s_1 with the two end points of L_h, so that s_0, s_1 are two points of the spatial segment corresponding to L_h; initialize t_0, t_1 with the two end points of L_r, so that t_0, t_1 are two points of the spatial segment corresponding to L_r; let the vectors u and v be the direction vectors of L_h and L_r respectively, and w_0 the vector between the end points of the two spatial segments; their expressions are given by formulas (1)(2)(3):
u = s_1 - s_0 (1)
v = t_1 - t_0 (2)
w_0 = s_0 - t_0 (3)
S4.4: initialize the iteration variables: let S_temp = s_j = s_0 and T_temp = t_j = t_0, where S_temp, T_temp are temporary variables of the iteration and s_j, t_j are the variables computed during the iteration;
S4.5: compute the vector coefficient c_t:
c_t = (w_0 · v) / ||v|| (4)
If c_t ∈ (-∞, 0) then t_j = t_0; if c_t ∈ (1, ∞) then t_j = t_1; if c_t ∈ [0, 1] then execute formula (5):
t_j = t_0 + ((w_0 · v) / ||v||) v (5)
Compute the vector coefficient c_s:
c_s = ((t_j - s_0) · u) / ||u|| (6)
If c_s ∈ (-∞, 0) then s_j = s_0; if c_s ∈ (1, ∞) then s_j = s_1; if c_s ∈ [0, 1] then execute formula (7):
s_j = s_0 + c_s u (7)
S4.6: compute formula (8), which is the iteration stopping condition;
(||S_temp - s_j|| ≤ δ) ∧ (||T_temp - t_j|| ≤ δ) (8)
where δ is the preset iteration threshold; if formula (8) evaluates to true, compute the minimum distance d(L_r, L_h) between L_r and L_h with formula (9), combine d(L_r, L_h) and s_j, t_j into a column vector and store it in the screening matrix E, then judge whether the data sets L and H are empty; if they are not empty, execute S4.2; if they are empty, all elements of the data sets have been iterated over, and S4.7 is executed;
d(L_r, L_h) = ||s_j - t_j|| (9)
If formula (8) evaluates to false, update the iteration variables as in formulas (10)(11):
S_temp = s_j (10)
T_temp = t_j (11)
then execute S4.5;
S4.7: compute formula (12) on the screening matrix E; the minimum human-robot distance d_min(L, H) is then:
d_min(L, H) = min { d | d ∈ C, C = [1 0 0] E } (12)
where d is an element of the matrix C; with the minimum human-robot distance d_min(L, H) obtained from formula (12), find the index corresponding to d_min(L, H) in the screening matrix E, and read out, by that index, the two spatial points corresponding to the minimum human-robot distance d_min(L, H).
6. The method for measuring the minimum human-robot distance in a collaborative environment according to claim 5, characterised in that the iteration stopping condition is: when, during the iteration over two spatial line segments, the iteration spatial points returned in two successive iterations are identical or differ by less than a given infinitesimal quantity, the iteration is stopped.
7. The method for measuring the minimum human-robot distance in a collaborative environment according to claim 1, characterised in that the key motion nodes of the robot are motion nodes of the robot chosen in advance.
CN201710865268.5A 2017-09-22 2017-09-22 Method for measuring the minimum human-robot distance in a collaborative environment Active CN107564065B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710865268.5A CN107564065B (en) 2017-09-22 2017-09-22 The measuring method of man-machine minimum range under a kind of Collaborative environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710865268.5A CN107564065B (en) 2017-09-22 2017-09-22 Method for measuring the minimum human-robot distance in a collaborative environment

Publications (2)

Publication Number Publication Date
CN107564065A true CN107564065A (en) 2018-01-09
CN107564065B CN107564065B (en) 2019-10-22

Family

ID=60981574

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710865268.5A Active CN107564065B (en) 2017-09-22 2017-09-22 Method for measuring the minimum human-robot distance in a collaborative environment

Country Status (1)

Country Link
CN (1) CN107564065B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108846891A (en) * 2018-05-30 2018-11-20 广东省智能制造研究所 A kind of man-machine safety collaboration method based on three-dimensional framework detection
CN110414339A (en) * 2019-06-21 2019-11-05 武汉倍特威视***有限公司 Hearing room personnel's close contact recognition methods based on video stream data
CN110561432A (en) * 2019-08-30 2019-12-13 广东省智能制造研究所 safety cooperation method and device based on man-machine co-fusion
CN112757274A (en) * 2020-12-30 2021-05-07 华中科技大学 Human-computer cooperative operation oriented dynamic fusion behavior safety algorithm and system
CN113485424A (en) * 2021-07-19 2021-10-08 武汉中测晟图遥感技术有限公司 Design method of pole tower inspection route
US20220001544A1 (en) * 2019-09-16 2022-01-06 Tencent Technology (Shenzhen) Company Limited Auxiliary photographing device for dyskinesia analysis, and control method and apparatus for auxiliary photographing device for dyskinesia analysis

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006231497A (en) * 2005-02-28 2006-09-07 Advanced Telecommunication Research Institute International Communication robot
CN102350700A (en) * 2011-09-19 2012-02-15 华南理工大学 Method for controlling robot based on visual sense
CN104057447A (en) * 2013-03-18 2014-09-24 株式会社安川电机 Robot Picking System And Method Of Manufacturing Workpiece
CN104777775A (en) * 2015-03-25 2015-07-15 北京工业大学 Two-wheeled self-balancing robot control method based on Kinect device
CN105137973A (en) * 2015-08-21 2015-12-09 华南理工大学 Method for robot to intelligently avoid human under man-machine cooperation scene
CN106166749A (en) * 2016-06-29 2016-11-30 北京控制工程研究所 The motion track planing method of multi-arm robot is moved in a kind of space
CN106338990A (en) * 2016-08-12 2017-01-18 杭州亿恒科技有限公司 Industrial robot DH parameter calibration and zero position calibration method based on laser tracker
CN106569613A (en) * 2016-11-14 2017-04-19 中国电子科技集团公司第二十八研究所 Multi-modal man-machine interaction system and control method thereof
CN106909216A (en) * 2017-01-05 2017-06-30 华南理工大学 A kind of Apery manipulator control method based on Kinect sensor

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006231497A (en) * 2005-02-28 2006-09-07 Advanced Telecommunication Research Institute International Communication robot
CN102350700A (en) * 2011-09-19 2012-02-15 华南理工大学 Method for controlling robot based on visual sense
CN104057447A (en) * 2013-03-18 2014-09-24 株式会社安川电机 Robot Picking System And Method Of Manufacturing Workpiece
CN104777775A (en) * 2015-03-25 2015-07-15 北京工业大学 Two-wheeled self-balancing robot control method based on Kinect device
CN105137973A (en) * 2015-08-21 2015-12-09 华南理工大学 Method for robot to intelligently avoid human under man-machine cooperation scene
CN106166749A (en) * 2016-06-29 2016-11-30 北京控制工程研究所 The motion track planing method of multi-arm robot is moved in a kind of space
CN106338990A (en) * 2016-08-12 2017-01-18 杭州亿恒科技有限公司 Industrial robot DH parameter calibration and zero position calibration method based on laser tracker
CN106569613A (en) * 2016-11-14 2017-04-19 中国电子科技集团公司第二十八研究所 Multi-modal man-machine interaction system and control method thereof
CN106909216A (en) * 2017-01-05 2017-06-30 华南理工大学 A kind of Apery manipulator control method based on Kinect sensor

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
STEFAN KUHN et al.: "Fast Vision-Based Minimum Distance Determination Between Known and Unknown Objects", Intelligent Robots and Systems *
吴海彬 et al.: "Robot safe motion planning based on danger index minimization", Journal of Mechanical Engineering *
方建军: "Research status and progress of mobile picking robots", Transactions of the Chinese Society of Agricultural Engineering *
朱毅 et al.: "Research on the local minimum problem of potential field path planning in unknown environments", Acta Automatica Sinica *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108846891A (en) * 2018-05-30 2018-11-20 广东省智能制造研究所 A kind of man-machine safety collaboration method based on three-dimensional framework detection
CN108846891B (en) * 2018-05-30 2023-04-28 广东省智能制造研究所 Man-machine safety cooperation method based on three-dimensional skeleton detection
CN110414339A (en) * 2019-06-21 2019-11-05 武汉倍特威视***有限公司 Hearing room personnel's close contact recognition methods based on video stream data
CN110561432A (en) * 2019-08-30 2019-12-13 广东省智能制造研究所 safety cooperation method and device based on man-machine co-fusion
US20220001544A1 (en) * 2019-09-16 2022-01-06 Tencent Technology (Shenzhen) Company Limited Auxiliary photographing device for dyskinesia analysis, and control method and apparatus for auxiliary photographing device for dyskinesia analysis
US11945125B2 (en) * 2019-09-16 2024-04-02 Tencent Technology (Shenzhen) Company Limited Auxiliary photographing device for dyskinesia analysis, and control method and apparatus for auxiliary photographing device for dyskinesia analysis
CN112757274A (en) * 2020-12-30 2021-05-07 华中科技大学 Human-computer cooperative operation oriented dynamic fusion behavior safety algorithm and system
CN112757274B (en) * 2020-12-30 2022-02-18 华中科技大学 Human-computer cooperative operation oriented dynamic fusion behavior safety algorithm and system
CN113485424A (en) * 2021-07-19 2021-10-08 武汉中测晟图遥感技术有限公司 Design method of pole tower inspection route

Also Published As

Publication number Publication date
CN107564065B (en) 2019-10-22

Similar Documents

Publication Publication Date Title
CN107564065A (en) The measuring method of man-machine minimum range under a kind of Collaborative environment
CN109344882B (en) Convolutional neural network-based robot control target pose identification method
CN112859859B (en) Dynamic grid map updating method based on three-dimensional obstacle object pixel object mapping
CN106204718B (en) A kind of simple and efficient 3 D human body method for reconstructing based on single Kinect
CN106679648B (en) Visual inertia combination SLAM method based on genetic algorithm
CN105225269B (en) Object modelling system based on motion
Zou et al. Virtual manipulator-based binocular stereo vision positioning system and errors modelling
CN108898676B (en) Method and system for detecting collision and shielding between virtual and real objects
CN108416428B (en) Robot vision positioning method based on convolutional neural network
CN101794349B (en) Experimental system and method for augmented reality of teleoperation of robot
CN107253192A (en) It is a kind of based on Kinect without demarcation human-computer interactive control system and method
CN107564061A (en) A kind of binocular vision speedometer based on image gradient combined optimization calculates method
WO2021004416A1 (en) Method and apparatus for establishing beacon map on basis of visual beacons
CN107357427A (en) A kind of gesture identification control method for virtual reality device
CN106423656A (en) Automatic spraying system and automatic spraying method based on point cloud and image matching
CN106055091A (en) Hand posture estimation method based on depth information and calibration method
Krainin et al. Manipulator and object tracking for in hand model acquisition
CN107516326A (en) Merge monocular vision and the robot localization method and system of encoder information
CN107748569A (en) Motion control method, device and UAS for unmanned plane
CN106780631A (en) A kind of robot closed loop detection method based on deep learning
Sheng et al. Research on binocular visual system of robotic arm based on improved SURF algorithm
Chen et al. Representation of truss-style structures for autonomous climbing of biped pole-climbing robots
Ben et al. Research on visual orientation guidance of industrial robot based on cad model under binocular vision
CN103260008B (en) A kind of image position is to the projection conversion method of physical location
CN110405731A (en) A kind of quick double mechanical arms basis coordinates system scaling method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant