CN106651949A - Teleoperation method and system for grabbing objects using space mechanical arm based on simulation - Google Patents


Publication number
CN106651949A
CN106651949A · Application CN201610903204.5A
Authority
CN
China
Prior art keywords
target
mechanical arm
unit
module
pose
Prior art date
Legal status
Granted
Application number
CN201610903204.5A
Other languages
Chinese (zh)
Other versions
CN106651949B (en)
Inventor
刘传凯
王晓雪
王保丰
王镓
唐歌实
郭祥艳
卜彦龙
Current Assignee
PEOPLES LIBERATION ARMY TROOP 63920
Original Assignee
PEOPLES LIBERATION ARMY TROOP 63920
Priority date
Filing date
Publication date
Application filed by PEOPLES LIBERATION ARMY TROOP 63920
Priority to CN201610903204.5A
Publication of CN106651949A
Application granted
Publication of CN106651949B
Legal status: Active


Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
      • B25 — HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
        • B25J — MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
          • B25J19/00 — Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
          • B25J9/00 — Programme-controlled manipulators
            • B25J9/16 — Programme controls
              • B25J9/1656 — Programme controls characterised by programming, planning systems for manipulators
              • B25J9/1694 — Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
                • B25J9/1697 — Vision controlled systems
    • G — PHYSICS
      • G06 — COMPUTING; CALCULATING OR COUNTING
        • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
          • G06T2207/00 — Indexing scheme for image analysis or image enhancement
            • G06T2207/10 — Image acquisition modality
              • G06T2207/10016 — Video; Image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a simulation-based teleoperation system for target capture by a space manipulator, comprising a data source acquisition module, a middleware module, a target tracking and positioning module, an operation planning module, and a motion simulation module. Through a plug-in architecture, the system can acquire multiple different data sources in real time and reconcile mismatches between the data-acquisition and data-processing speeds. A fast tracking algorithm ensures that the camera tracks the target in real time, and a pose measurement algorithm keeps the poses of the real target and the simulated target synchronized. The invention provides a test platform for visual positioning algorithms in dynamic target capture by a space manipulator, and can present the capture process in three-dimensional visualized form to assist ground operators in controlling the capture process.

Description

A simulation-based teleoperation method and system for target capture by a space manipulator
Technical field
The present invention relates to the field of manipulator teleoperation, and in particular to a simulation-based teleoperation method and system for target capture by a space manipulator.
Background technology
The space manipulator is indispensable key equipment for space station construction and maintenance. On orbit it can complete tasks such as spacecraft assembly and disassembly, maintenance and repair, fuel transfer, satellite release and recovery, and various scientific experiments on the space station; it can reduce astronauts' extravehicular activity, avoid danger to life, and save cabin costs. Around 2022, China will complete the construction of its crewed space station engineering system, break through and master long-term crewed near-Earth spaceflight technology, carry out near-Earth space science experiments, and strengthen its capability to comprehensively develop and use space resources. With the development of the aerospace industry, the space manipulator will play an increasingly important role.
Ground teleoperation is one of the three main control modes of the space manipulator, and is an essential control method for the manipulator to perform various complex space tasks. When the space manipulator performs tasks such as capturing a spacecraft or replacing an assembly module, the vision system carried by the manipulator photographs the state of the object to be operated, and ground controllers observe the operation from the downlinked images. However, because the transmission bandwidth between the spacecraft and the ground teleoperation center is limited and the transmission incurs a certain delay, operators can neither observe the process in real time nor accurately predict the relative position of the manipulator and the object to be operated. When the manipulator and the object are close together, unforeseen collisions easily occur, damaging the manipulator, the spacecraft, or the assembly module.
Summary of the invention
The technical problem to be solved by the present invention is to provide a simulation method that uses images captured by the vision system carried by the space manipulator to measure the pose of the manipulator end-effector relative to the capture target and to predict the manipulator's trajectory, assisting ground teleoperation personnel in making control decisions, so as to solve the problem that teleoperation personnel cannot make effective decisions because they cannot predict the relative state of the manipulator and the target.
The technical scheme by which the present invention solves the above technical problem is as follows: a simulation-based teleoperation method for target capture by a space manipulator, characterized by comprising the following steps:
Step 1: the data source acquisition module collects image data of the target from data sources of corresponding types through plug-ins of multiple types, converts the data into a unified format, and sends it to the middleware module;
Step 2: the middleware module receives and stores the converted image data and forwards it to the target tracking and positioning module for processing, keeping the difference between the speed at which the data source acquisition module collects image data and the speed at which the target tracking and positioning module processes it within a preset range;
Step 3: the target tracking and positioning module processes the converted image data to obtain the pose of the target;
Step 4: the operation planning module plans the motion of each joint of the space manipulator according to the pose of the target and the acquired end-effector pose;
Step 5: the motion simulation module builds three-dimensional models of the space manipulator and the target, and updates the poses of both models in real time according to the joint motions planned by the operation planning module;
Step 6: the above steps are repeated until the manipulator completes the capture of the target in the motion simulation module.
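Steps 1 to 6 above can be sketched as a single control cycle that the caller repeats until capture. All component classes and method names below (`acquire`, `push`, `locate`, `plan`, `update`) are hypothetical stand-ins for illustration, not names from the patent.

```python
class DataSource:
    """Stub data source: yields numbered frames (stands in for a camera plug-in)."""
    def __init__(self): self.n = 0
    def acquire(self):
        self.n += 1
        return {"frame": self.n}          # step 1: unified-format frame

class Middleware:
    """Stub buffer between acquisition and processing."""
    def __init__(self): self.buf = []
    def push(self, frame): self.buf.append(frame)   # step 2: store
    def pop(self): return self.buf.pop(0)           # oldest frame first

class Tracker:
    def locate(self, frame):              # step 3: tracking + pose solving
        return {"pose": (frame["frame"], 0.0, 0.0)}

class Planner:
    def plan(self, pose):                 # step 4: joint-motion planning
        return [pose["pose"][0] * 0.1]

class Simulator:
    def __init__(self): self.state = None
    def update(self, joints, pose):       # step 5: refresh the 3-D models
        self.state = (joints, pose)

def teleoperation_cycle(source, mw, tracker, planner, sim):
    """One pass through steps 1-5; repeating it until capture is step 6."""
    mw.push(source.acquire())
    pose = tracker.locate(mw.pop())
    joints = planner.plan(pose)
    sim.update(joints, pose)
    return sim.state
```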
On the basis of the above technical scheme, the present invention can be further improved as follows.
Further, step 2 specifically comprises the following steps:
Step 2.1: the plug-in management unit manages and dispatches the plug-ins of multiple types, receives the converted image data, and stores it in the adaptive buffer unit;
Step 2.2: the adaptive buffer unit keeps the difference between the acquisition speed of the data source acquisition module and the processing speed of the target tracking and positioning module within a preset range;
Step 2.3: the data output interface unit sends the image data in the adaptive buffer to the target tracking and positioning module.
Further, step 2.2 specifically comprises the following steps:
Let F_BMAX be the maximum frame difference allowed by the buffer, F_MAX the upper limit of F_BMAX, F_Tf the drop-frame index, and F_PC the frame difference between image acquisition and image processing.
Step 2.2.1: if F_PC > F_BMAX, set F_Tf = F_Tf + 1, then go to step 2.2.2;
if F_PC < F_BMAX/2, go to step 2.2.3;
if F_BMAX/2 ≤ F_PC ≤ F_BMAX, go to step 2.2.4.
Step 2.2.2: if F_Tf ≥ F_BMAX/2, set F_BMAX = 2·F_BMAX and go to step 2.2.6; if F_Tf < F_BMAX/2, go directly to step 2.2.6.
Step 2.2.3: if F_PC ≤ 1, set F_Tf = 0; otherwise set F_Tf = (F_Tf + 1)/2 repeatedly until 2·F_Tf ≤ F_PC; then go to step 2.2.5.
Step 2.2.4: if F_Tf > 0, set F_Tf = F_Tf − 1 repeatedly until 2·F_Tf ≤ F_PC; then go to step 2.2.6.
Step 2.2.5: if F_BMAX > F_MAX, set F_BMAX = F_BMAX/2; then go to step 2.2.6.
Step 2.2.6: if F_Tf > 0, discard the first 2·F_Tf − 1 frames of image data stored in the buffer, then push the 2·F_Tf-th frame to the target tracking and positioning module, i.e. F_PC = F_PC − 2·F_Tf.
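Under one reading of steps 2.2.1 through 2.2.6, a single regulation cycle can be sketched as below. The integer arithmetic and the function and variable names are my assumptions, not part of the patent.

```python
def regulate_buffer(f_pc, f_tf, f_bmax, f_max):
    """One regulation cycle of the adaptive buffer.

    Returns (frames_consumed, new f_pc, new f_tf, new f_bmax), where
    frames_consumed = 2*f_tf frames: the first 2*f_tf - 1 are discarded
    and the 2*f_tf-th is pushed to the tracking module.
    """
    if f_pc > f_bmax:                  # step 2.2.1: processing lags badly
        f_tf += 1
        if f_tf >= f_bmax / 2:         # step 2.2.2: grow the allowed lag
            f_bmax *= 2
    elif f_pc < f_bmax / 2:            # step 2.2.3: processing is catching up
        if f_pc <= 1:
            f_tf = 0
        else:
            while 2 * f_tf > f_pc:
                f_tf = (f_tf + 1) // 2
        if f_bmax > f_max:             # step 2.2.5: shrink the allowed lag
            f_bmax //= 2
    else:                              # step 2.2.4: moderate lag
        while f_tf > 0 and 2 * f_tf > f_pc:
            f_tf -= 1
    consumed = 0
    if f_tf > 0:                       # step 2.2.6: skip 2*f_tf frames
        consumed = 2 * f_tf
        f_pc -= consumed
    return consumed, f_pc, f_tf, f_bmax
```

For example, with a capture-processing gap of 10 frames against an allowed maximum of 8, one cycle consumes 2 buffered frames and shrinks the gap to 8.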
Further, step 3 specifically comprises the following steps:
Step 3.1: receive the converted image data and pre-process it to obtain the edges of the target in the current image frame;
Step 3.2: compute the target region in the current image frame from the bounding box of the edges, where the geometric center of the target region is the position of the target and the size of the target region is the size of the target;
Step 3.3: process the target region in the current image frame with a target tracking algorithm to obtain the target region in the next image frame;
Step 3.4: apply the intensity-weighted geometric center computation to the pixels in the target region of the next image frame to obtain the target tracking position;
Step 3.5: compute the target projection position, and obtain the pose of the target in each image frame through a measurement model that minimizes the offset error between the target projection position and the target tracking position.
Further, step 3.1 specifically comprises the following steps:
Step 3.1.1: apply Gaussian filtering to the image; select a threshold T1, set pixels whose gray value exceeds T1 to 255 and pixels whose gray value is below T1 to 0, thereby binarizing the filtered image to obtain a binary image;
Step 3.1.2: process the binary image with an erosion-dilation algorithm to eliminate isolated bright points, so that only the target region in the binary image has gray value 255 and all other positions have gray value 0;
Step 3.1.3: perform edge extraction on the eroded and dilated binary image.
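A minimal pure-NumPy sketch of steps 3.1.1 and 3.1.2; the Gaussian filtering stage is omitted for brevity, and the 3×3 structuring element is an assumption on my part.

```python
import numpy as np

def binarize(img, t1):
    """Step 3.1.1: pixels brighter than t1 become 255, the rest 0."""
    return np.where(img > t1, 255, 0).astype(np.uint8)

def _stack3x3(b):
    # All nine 3x3-neighbourhood shifts of the image, edge-padded.
    p = np.pad(b, 1, mode="edge")
    h, w = b.shape
    return np.stack([p[i:i + h, j:j + w] for i in range(3) for j in range(3)])

def erode(b):
    return _stack3x3(b).min(axis=0)

def dilate(b):
    return _stack3x3(b).max(axis=0)

def remove_isolated_points(binary):
    """Step 3.1.2: morphological opening wipes out isolated bright pixels."""
    return dilate(erode(binary))
```

Opening (erosion followed by dilation) removes any bright pixel whose 3×3 neighbourhood is not fully bright, while restoring the shape of larger blobs.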
Further, the target tracking algorithm in step 3.3 is a tracking algorithm that fuses Camshift with Kalman filtering.
Further, step 3.4 is specifically: select a threshold T0; for all pixels in the target region whose gray value is greater than or equal to T0, compute the intensity-weighted average of their positions in the Gaussian-filtered image to obtain the intensity-weighted geometric center coordinates:
x_c = Σ_{(x,y)∈R} x·I(x,y) / Σ_{(x,y)∈R} I(x,y),  y_c = Σ_{(x,y)∈R} y·I(x,y) / Σ_{(x,y)∈R} I(x,y)
where x_c is the x coordinate of the intensity-weighted geometric center, y_c is its y coordinate, x and y are pixel coordinates in the image, I(x, y) is the gray value at coordinate (x, y), and R is the target region.
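The intensity-weighted center computation translates directly into NumPy. Passing the target region as a boolean mask is my convention for the sketch, not the patent's.

```python
import numpy as np

def weighted_centroid(image, mask, t0):
    """Intensity-weighted geometric center over the masked target region.

    Pixels with gray value below t0 are ignored; the rest contribute with
    weight I(x, y): x_c = sum(x*I)/sum(I), y_c = sum(y*I)/sum(I).
    """
    sel = mask & (image >= t0)
    ys, xs = np.nonzero(sel)
    w = image[ys, xs].astype(float)
    return (xs * w).sum() / w.sum(), (ys * w).sum() / w.sum()
```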
Further, step 3.5 specifically comprises the following steps:
Step 3.5.1: estimate the three-dimensional coordinates of the target in the world coordinate system;
Step 3.5.2: from these three-dimensional coordinates, compute the projected coordinates of the target in the image using a perspective projection model, obtaining the target projection position;
Step 3.5.3: compare the target projection position with the target tracking position and compute the offset error;
Step 3.5.4: according to the offset error, correct the position of the target in the world coordinate system using the least-squares method;
Step 3.5.5: repeat until convergence, i.e. until the offset error is below a previously given threshold.
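Steps 3.5.1 through 3.5.5 amount to Gauss-Newton refinement of the target position against the tracked pixel position. The sketch below assumes a simple pinhole model with the camera frame taken as the world frame, a focal length of 500 px, and a numerical Jacobian; these are illustrative choices, not the patent's measurement model (which also recovers attitude).

```python
import numpy as np

def project(p, f=500.0, c=(320.0, 240.0)):
    """Perspective projection of a 3-D point p = (x, y, z) onto the image plane."""
    x, y, z = p
    return np.array([f * x / z + c[0], f * y / z + c[1]])

def refine_position(p0, observed, iters=20, tol=1e-6):
    """Iteratively correct the 3-D position so its projection matches `observed`."""
    p = np.asarray(p0, float)
    for _ in range(iters):
        r = observed - project(p)          # offset error (step 3.5.3)
        if np.linalg.norm(r) < tol:
            break
        # Numerical Jacobian of the projection w.r.t. p (steps 3.5.2 / 3.5.4).
        J = np.empty((2, 3))
        eps = 1e-6
        for i in range(3):
            dp = np.zeros(3); dp[i] = eps
            J[:, i] = (project(p + dp) - project(p)) / eps
        p = p + np.linalg.lstsq(J, r, rcond=None)[0]  # least-squares correction
    return p
```

Note that a single 2-D observation cannot fix the depth of a 3-D point, so the refinement drives the reprojection error to zero without necessarily recovering the true position.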
Further, step 4 specifically comprises the following steps:
Step 4.1: the end-effector trajectory planning unit plans the movement path of the manipulator end-effector according to the end-effector pose computed from the manipulator configuration and the target pose obtained by the target tracking and positioning module;
Step 4.2: the joint-coordinated motion planning unit solves, by an inverse kinematics algorithm, the rotation angle and angular velocity of each manipulator joint corresponding to each discrete trajectory point of the planned end-effector path.
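Step 4.2's inverse-kinematics solve depends on the actual manipulator's geometry, which the patent does not fix. As a stand-in, here is the closed-form solution for a planar two-link arm; the unit link lengths and the elbow-down branch are arbitrary illustrative choices.

```python
import math

def two_link_ik(x, y, l1=1.0, l2=1.0):
    """Closed-form inverse kinematics for a planar 2-link arm (elbow-down)."""
    d2 = x * x + y * y
    c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    c2 = max(-1.0, min(1.0, c2))            # clamp against rounding
    q2 = math.acos(c2)
    q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2),
                                       l1 + l2 * math.cos(q2))
    return q1, q2

def plan_joint_motion(waypoints, dt=0.1):
    """Joint angles and angular rates for each discrete trajectory point."""
    qs = [two_link_ik(x, y) for x, y in waypoints]
    rates = [(0.0, 0.0)] + [((q1 - p1) / dt, (q2 - p2) / dt)
                            for (p1, p2), (q1, q2) in zip(qs, qs[1:])]
    return qs, rates
```

The angular rates here are plain finite differences between consecutive trajectory points, which matches the discrete path formulation of step 4.2.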
A simulation-based teleoperation system for target capture by a space manipulator, comprising:
a data source acquisition module, for collecting image data of the target from data sources of corresponding types through plug-ins of multiple types, converting it into a unified format, and sending it to the middleware module;
a middleware module, for receiving and storing the converted image data and forwarding it to the target tracking and positioning module for processing, keeping the difference between the speed at which the data source acquisition module collects image data and the speed at which the target tracking and positioning module processes it within a preset range;
a target tracking and positioning module, for processing the converted image data to obtain the pose of the target;
an operation planning module, for planning the motion of each joint of the space manipulator according to the pose of the target and the acquired end-effector pose;
a motion simulation module, for building three-dimensional models of the space manipulator and the target and updating the poses of both models in real time according to the joint motions planned by the operation planning module.
Further, the middleware module comprises a plug-in management unit, an adaptive buffer unit, and a unified data output interface unit:
the plug-in management unit loads, manages, and dispatches the plug-ins of multiple types, receives the converted image data, and stores it in the adaptive buffer unit;
the adaptive buffer unit keeps the difference between the acquisition speed of the data source acquisition module and the processing speed of the target tracking and positioning module within a preset range;
the unified data output interface unit sends the image data in the adaptive buffer to the target tracking and positioning module.
Further, the target tracking and positioning module comprises an image pre-processing unit, a target detection unit, a target tracking unit, a position solving unit, and a pose solving unit:
the image pre-processing unit receives the converted image data and pre-processes it to obtain the edges of the target in the current image frame;
the target detection unit computes the target region in the current image frame from the bounding box of the edges, where the geometric center of the target region is the position of the target and its size is the size of the target;
the target tracking unit processes the target region in the current image frame with a target tracking algorithm to obtain the target region in the next image frame;
the position solving unit applies the intensity-weighted geometric center computation to the pixels in the target region of the next image frame to obtain the target tracking position;
the pose solving unit obtains the pose of the target in each image frame through a measurement model that minimizes the offset error between the target projection position and the target tracking position.
Further, the operation planning module comprises an end-effector trajectory planning unit and a joint-coordinated motion planning unit:
the end-effector trajectory planning unit plans the movement path of the manipulator end-effector according to the end-effector pose and the target pose obtained by the target tracking and positioning module;
the joint-coordinated motion planning unit solves, by an inverse kinematics algorithm, the rotation angle and angular velocity of each manipulator joint corresponding to each discrete trajectory point of the planned end-effector path.
The beneficial effects of the invention are as follows. Compared with existing teleoperation simulation systems, the teleoperation simulation system of the present invention can track the relative change of the target in the images acquired by the end-effector vision system, measure and estimate the position and attitude of the capture target, and present them visually to ground operators through the simulation system to assist their decision-making. Compared with existing teleoperation methods, in which operators judge the relative pose of the manipulator and the capture target by browsing images, this greatly relieves operators of the burden of analyzing the operating state, allows them to focus their attention on controlling the space manipulator, and greatly improves the efficiency of teleoperation.
Description of the drawings
Fig. 1 shows the modules of the space manipulator target-capture teleoperation system and the data flow between them;
Fig. 2 is a flow chart of the space manipulator target-capture teleoperation method;
Fig. 3 is a flow chart of the algorithm that fuses Camshift with Kalman filtering to track a space target;
Fig. 4 shows the spatial relationship between the three-dimensional models of the space manipulator and the space target and their motion control steps.
Specific embodiments
The principle and features of the present invention are described below with reference to the accompanying drawings. The examples serve only to explain the present invention and do not limit its scope.
As shown in Fig. 1, the simulation-based teleoperation system for target capture by a space manipulator provided by the present invention consists of a data source acquisition module, a middleware module, a target tracking and positioning module, an operation planning module, and a motion simulation module. Its implementation, shown in Fig. 2, comprises the following steps:
S1: the data source acquisition module collects image data of the target from data sources of corresponding types through plug-ins of multiple types, converts the data into a unified format, and sends it to the middleware module;
S2: the middleware module receives and stores the converted image data and forwards it to the target tracking and positioning module for processing, keeping the difference between the acquisition speed of the data source acquisition module and the processing speed of the target tracking and positioning module within a preset range;
S3: the target tracking and positioning module processes the converted image data to obtain the pose of the target;
S4: the operation planning module plans the motion of each joint of the space manipulator according to the pose of the target and the end-effector pose, carrying out the capture of the target;
S5: the motion simulation module builds three-dimensional models of the space manipulator and the target and updates the poses of both models in real time according to the joint motions planned by the operation planning module;
the above steps are repeated until the manipulator completes the capture of the target in the motion simulation module.
The implementation of each module of the system is described below.
1. Implementation of the data source acquisition module
As shown in Fig. 1, the data source acquisition module includes acquisition plug-ins for multiple types of data sources, such as camera image streams, disk video streams, and network data streams. Each class of plug-in is explained as follows.
Each class of plug-in reads one type of data source: the camera image-stream plug-in can only acquire the image stream of a camera device, the disk video-stream plug-in can only acquire video files stored on disk (e.g. AVI-format video files), and the network data-stream plug-in can only read the image data stream received on a network interface.
The function of each class of plug-in is to parse the data collected from its data source into single-frame image data according to a pre-agreed data format, repackage it into image data frames of a unified format (for example, bitmap format), and store the output in the adaptive buffer unit of the middleware module.
Each class of plug-in provides the middleware module with unified data and information output formats and functional interfaces, so that all plug-ins can be managed uniformly by the middleware module. The functional interface of each plug-in mainly includes interfaces for data source description, data acquisition control, and data output buffer configuration.
2. Implementation of the middleware module
The middleware module is the intermediary between the data source acquisition module and the target tracking and positioning module, and includes the plug-in management unit, the adaptive buffer unit, and the data output interface unit. The middleware module uses the plug-in management unit to manage and dispatch the different types of plug-ins. The data output interface unit provides the target tracking and positioning module with unified data and functional interfaces, so that the module can correctly read image data from different types of data sources. The middleware module maintains an adaptive buffer: by counting the image data frames stored in the buffer, it automatically discards some image frames so that the data acquisition end and the tracking and positioning end stay as synchronized as possible. The three units are described in turn below.
The plug-in management unit reads a data source description file and uses the information in it to load and invoke plug-ins to read the various data sources. The data source description file mainly records the data source type, the data source path, and the data source output format. The data source type assists in selecting, loading, and invoking the plug-in; the data source path and output format are passed to the plug-in to help it correctly read the source and parse the data format.
The adaptive buffer unit automatically stores data and discards some data frames according to the difference between the acquisition speed and the processing speed of the target tracking and positioning module, so that the image frame currently being processed stays within a few frames of the latest image frame from the data source acquisition module, ensuring synchronization between the acquisition end and the tracking and positioning end. It does this by counting the image frames stored in the buffer and automatically choosing between reading images frame by frame and skipping frames; when frames are skipped, the skipped image frames are discarded. The specific procedure is as follows:
Let F_BMAX be the maximum frame difference allowed by the buffer, F_MAX the upper limit of F_BMAX, F_Tf the drop-frame index, and F_PC the frame difference between image acquisition and image processing.
Step 1: if F_PC > F_BMAX, set F_Tf = F_Tf + 1, then go to Step 2;
if F_PC < F_BMAX/2, go to Step 3;
if F_BMAX/2 ≤ F_PC ≤ F_BMAX, go to Step 4.
Step 2: if F_Tf ≥ F_BMAX/2, set F_BMAX = 2·F_BMAX and go to Step 6; if F_Tf < F_BMAX/2, go directly to Step 6.
Step 3: if F_PC ≤ 1, set F_Tf = 0; otherwise set F_Tf = (F_Tf + 1)/2 repeatedly until 2·F_Tf ≤ F_PC; then go to Step 5.
Step 4: if F_Tf > 0, set F_Tf = F_Tf − 1 repeatedly until 2·F_Tf ≤ F_PC; then go to Step 6.
Step 5: if F_BMAX > F_MAX, set F_BMAX = F_BMAX/2; then go to Step 6.
Step 6: if F_Tf > 0, discard the first 2·F_Tf − 1 frames of image data stored in the buffer, then push the 2·F_Tf-th frame to the target tracking and positioning module, i.e. F_PC = F_PC − 2·F_Tf.
The data output interface unit is the interface between the middleware module and the target tracking and positioning module. It is responsible for reading one image frame from the adaptive buffer unit and sending it to the target tracking and positioning module for processing. The interface between the data output interface unit and the target tracking and positioning module mainly covers the reading and transmission of image frame data, including the description of the image frame format and the reading and transmission of the frame data itself.
3. Implementation of the target tracking and positioning module
The target tracking and positioning module completes the real-time tracking and positioning of the space target mainly by processing and analyzing the image sequence collected by the data source acquisition module. It mainly comprises image pre-processing, space target detection and tracking, and space target pose solving.
Image pre-processing mainly includes Gaussian filtering, binarization, erosion-dilation, and edge extraction. The steps are: first apply Gaussian filtering to the image; select a threshold T1, set pixels whose gray value exceeds T1 to 255 and pixels whose gray value is below T1 to 0, thereby binarizing the filtered image to obtain a binary image; process the binary image with an erosion-dilation algorithm to eliminate isolated bright points, so that only the target region in the binary image has gray value 255 and all other positions have gray value 0; finally, perform edge extraction on the eroded and dilated binary image and obtain each target location from the extracted edges.
Space target detection refers to computing the position and size of the target in the image. From the edges obtained by image processing, the target region can be computed from the bounding box of the edges; the geometric center of the region is the position of the target in the image, and the size of the region is the size of the target in the image.
Space target tracking is realized by a method that fuses Camshift with Kalman filtering, as shown in Fig. 3, in the following steps:
Step 1, Kalman prediction: predict the position of the space target in the image using the Kalman method. The state transition equation of the space target imaging is X̂_{k+1} = F·X_k, where X_k is the position of the space target in the current frame, X̂_{k+1} is the predicted position in the next frame, and F is the state transition matrix. The covariance of the prediction error, P̂_{k+1}, is obtained from the covariance at time k, i.e. P̂_{k+1} = F·P_k·F^T + Q_k, where Q_k is the covariance of the random error produced by the uncertainty of the state transition. Taking the predicted position X̂_{k+1} as the target center and the size of the target region in the current frame as the target size, compute the target region S_{k+1} in the next frame.
Step 2, Camshift tracking, which mainly comprises three sub-steps: 1) compute the back-projection image of the Kalman-predicted target region using the histogram of the target template; 2) from the back-projection image of the target region, compute the center of gravity (u_x, u_y) of the back-projection by first-order and zeroth-order moment operations, i.e.
u_x = m_10/m_00,  u_y = m_01/m_00,  with m_10 = Σ_{(x,y)∈Ω} x·B(x,y), m_01 = Σ_{(x,y)∈Ω} y·B(x,y), m_00 = Σ_{(x,y)∈Ω} B(x,y),
where Ω is the coordinate set of the target region, B(x, y) is the back-projection value at (x, y), m_10 and m_01 are the first-order moments of the back-projection of the target region, and m_00 is its zeroth-order moment; 3) if the offset of the center of gravity (u_x, u_y) from the center of the target region is below a threshold, terminate the tracking; otherwise update the target region R_{k+1} centered on the center of gravity and repeat sub-steps 1) to 3). This finally yields the accurate target region and the target position Z_{k+1}.
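The first- and zeroth-order moment computation of the Camshift centroid above can be written in a few lines of NumPy; the (y0, y1, x0, x1) region convention is mine.

```python
import numpy as np

def backprojection_centroid(backproj, region):
    """Center of gravity of the back-projection inside `region` via image moments.

    region = (y0, y1, x0, x1); returns (u_x, u_y) = (m10/m00, m01/m00).
    """
    y0, y1, x0, x1 = region
    w = backproj[y0:y1, x0:x1].astype(float)
    ys, xs = np.mgrid[y0:y1, x0:x1]
    m00 = w.sum()                       # zeroth-order moment
    return (xs * w).sum() / m00, (ys * w).sum() / m00
```

In a full Camshift iteration this centroid would recenter the search window until the shift falls below the termination threshold.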
Step 3, Kalman filter update: using the target position obtained in the second step, correct the target position and the covariance matrix of the target error by the Kalman filtering method. Let the Kalman gain K_k represent the correction weight of the new target position information relative to the known target position information, H_k the observation matrix of the target, and R_k the covariance matrix of the observation noise; then
K_k = P̂_k·H_k^T·(H_k·P̂_k·H_k^T + R_k)^{-1},
so that the posterior estimates of the target position X_k and the covariance matrix P_k are
X_k = X̂_k + K_k·(Z_k − H_k·X̂_k),  P_k = (I − K_k·H_k)·P̂_k.
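The prediction and update equations of Steps 1 and 3 can be exercised with a generic Kalman step; the matrix shapes are the standard ones, and nothing here is specific to the patent's imaging model.

```python
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    """One predict-plus-update Kalman cycle for state x, covariance P, measurement z."""
    x_pred = F @ x                      # state prediction
    P_pred = F @ P @ F.T + Q            # prediction-error covariance
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)  # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)                   # posterior state
    P_new = (np.eye(len(x)) - K @ H) @ P_pred               # posterior covariance
    return x_new, P_new
```

With a scalar state, unit transition and observation, no process noise, and unit measurement noise, a measurement of 2 against a prior of 0 with unit variance is split evenly, giving a posterior estimate of 1.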
The above target tracking algorithm yields the target region, and the target tracking position is usually taken as the centre of that region. However, when the illumination in the image flickers, the tracking position computed this way is unstable and its error is large. Therefore an intensity-weighted geometric-centre computation is applied here to the pixels in the target region, which gives a more stable and accurate target tracking position. It is obtained as follows:
A threshold T_0 is set, and all pixels in the target region whose grey value is greater than or equal to T_0 are averaged, weighted by intensity, according to their positions in the Gaussian-filtered image; this yields the intensity-weighted centre, computed as
x_c = Σ_{(x,y)∈R} x·I(x,y) / Σ_{(x,y)∈R} I(x,y), y_c = Σ_{(x,y)∈R} y·I(x,y) / Σ_{(x,y)∈R} I(x,y),
where x_c is the x coordinate of the intensity-weighted geometric centre, y_c is its y coordinate, x and y are the coordinates of a pixel in the image, I(x, y) is the grey value at coordinate (x, y), and R is the target region.
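A direct transcription of this weighted-centre formula (the synthetic image, region and threshold below are illustrative):

```python
import numpy as np

def weighted_center(img, region, t0):
    """Intensity-weighted geometric centre of a region (x, y, w, h):
    pixels with grey value >= t0 are averaged, weighted by intensity."""
    x, y, w, h = region
    patch = img[y:y + h, x:x + w].astype(float)
    ys, xs = np.mgrid[y:y + h, x:x + w]    # pixel coordinates of the patch
    mask = patch >= t0                     # threshold T_0
    s = patch[mask].sum()
    xc = (xs[mask] * patch[mask]).sum() / s
    yc = (ys[mask] * patch[mask]).sum() / s
    return xc, yc

img = np.zeros((50, 50))
img[20, 30] = 200.0
img[20, 32] = 100.0        # centroid pulled 1/3 of the way toward x = 32
print(weighted_center(img, (25, 15, 10, 10), t0=50))
```

Because bright pixels dominate the average, a few flickering low-intensity pixels near the region boundary perturb the result far less than they perturb the plain geometric centre.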
The pose of the space target is solved through a measurement model that minimises the offset error between the target projection position and the target tracking position, yielding the pose of the target in space for each frame image. The target projection position refers to the projection coordinates of the target's electroluminescent lamps on the camera image plane, computed by perspective projection transformation given the coordinates of the target and of the camera's optical centre in the world coordinate system. The measurement model minimising the offset error between the target projection position and the target tracking position works as follows: first the three-dimensional coordinate position of the target in the world coordinate system is roughly estimated; then, from this three-dimensional position, the projection coordinates of the target in the image are computed using the perspective projection model, giving the target projection position; the target projection position is compared with the target tracking position obtained by tracking, and the offset error is computed; the position of the target in the world coordinate system is then corrected by the least squares method according to the offset error. This is repeated until convergence, i.e. until the offset error falls below a pre-specified threshold. The formalised process is described as follows:
Assume t and θ respectively denote the position and attitude of the target in three-dimensional space, N is the number of the target's electroluminescent lamps, û_i(t, θ) denotes the projection position of the i-th lamp and u_i its target tracking position; minimising the offset error between the target projection positions and the target tracking positions can then be expressed as
min_{t,θ} Σ_{i=1..N} ‖û_i(t, θ) − u_i‖²,
where û_i(t, θ) expresses the target projection position as a function of the target's position and attitude in three-dimensional space, i.e. the perspective projection function. This function is nonlinear and can be linearised by a Taylor series expansion:
û_i(t + Δt, θ + Δθ) ≈ û_i(t, θ) + J_i·[Δt; Δθ],
where J_i denotes the Jacobian matrix of û_i with respect to (t, θ). Letting e_i = u_i − û_i(t, θ), the increments Δt and Δθ can be solved by linear least squares. Iterating t = t + Δt and θ = θ + Δθ then updates t and θ until ‖(Δt, Δθ)‖ falls below a specified threshold.
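The iteration above is a Gauss-Newton least-squares loop. The sketch below illustrates it on a reduced problem in which the pose is only a translation t = (tx, ty, tz) of the lamp constellation and the Jacobian is formed numerically; the lamp coordinates, focal length and all numeric values are illustrative assumptions, not the patent's measurement model:

```python
import numpy as np

f = 500.0                                   # focal length in pixels (assumed)
lamps = np.array([[0.1, 0, 1], [-0.1, 0, 1], [0, 0.1, 1]])  # body frame, metres

def project(t):
    """Pinhole perspective projection of the lamps translated by t."""
    p = lamps + t                           # camera-frame points
    return f * p[:, :2] / p[:, 2:3]         # (u, v) per lamp

def refine(t, u_obs, iters=20):
    """Gauss-Newton: minimise the reprojection offset error over t."""
    for _ in range(iters):
        r = (project(t) - u_obs).ravel()    # offset error e_i
        J = np.empty((r.size, 3))           # numerical Jacobian d(proj)/dt
        for j in range(3):
            dt = np.zeros(3); dt[j] = 1e-6
            J[:, j] = ((project(t + dt) - u_obs).ravel() - r) / 1e-6
        step = np.linalg.lstsq(J, -r, rcond=None)[0]  # least-squares Δt
        t = t + step
        if np.linalg.norm(step) < 1e-9:     # converged: offset error minimal
            break
    return t

t_true = np.array([0.05, -0.02, 2.0])
u_obs = project(t_true)                     # "tracked" lamp positions
t_est = refine(np.array([0.0, 0.0, 1.5]), u_obs)
```

The full method would also parameterise the attitude θ (e.g. via Euler angles) and stack its partial derivatives into the same Jacobian; the loop structure is unchanged.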
4. Implementation description of the operation planning module
The operation planning module comprises a space manipulator end-effector trajectory planning unit and a joint-coordination planning unit for the manipulator's joints; according to the position of the space target and the requirements of the capture task, it plans the motion of each joint of the space manipulator so that the manipulator completes an asymptotic capture of the space target.
The space manipulator end-effector trajectory planning unit plans the path and trajectory along which the manipulator end-effector moves to capture the target, according to the target pose obtained by the target tracking and positioning module and the end-effector pose computed from the manipulator configuration. The joint-coordination planning unit solves, from the discrete points of the planned end path, the motion angle and angular velocity of each manipulator joint at each discrete point by an inverse kinematics algorithm.
Computing the end-effector pose from the manipulator configuration is expressed as follows. Let the joint angles be Θ = [θ_0, θ_1, θ_2, …, θ_n]. With the structural parameters fixed, namely the link offset d_i, the link length a_i and the joint axis twist angle α_i, the pose transformation between joint i and joint i+1 is expressed as M_i = R(θ_i)·T(a_i, d_i)·R(α_i), and the end-effector pose can then be computed as the product M = M_0·M_1⋯M_n, where R(θ_1, θ_2, …, θ_n) ∈ R^{3×3} and T(θ_1…θ_n, α_1…α_n, d_1…d_n, a_1…a_n) ∈ R^{3×1} respectively denote the rotation and the translation transformation between the manipulator end-effector and the base centroid.
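Chaining per-joint transforms of the form M_i = R(θ_i)·T(a_i, d_i)·R(α_i) can be sketched with standard Denavit-Hartenberg matrices (the patent's exact factorisation may differ; the two-link planar arm below is an illustrative assumption):

```python
import numpy as np

def dh(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform for one joint."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.,       sa,       ca,      d],
                     [0.,       0.,       0.,     1.]])

def fk(thetas, ds, alphas, a_s):
    """Forward kinematics: chain the per-joint transforms M_0 M_1 ... M_n."""
    T = np.eye(4)
    for th, d, al, a in zip(thetas, ds, alphas, a_s):
        T = T @ dh(th, d, a, al)
    return T          # 4x4: rotation block T[:3,:3], translation T[:3,3]

# planar two-link arm (link lengths 1 m and 0.5 m) at theta = (90deg, -90deg)
T = fk([np.pi / 2, -np.pi / 2], [0, 0], [0, 0], [1.0, 0.5])
print(T[:3, 3])       # end-effector position
```

The rotation block of the accumulated matrix corresponds to R and the translation block to T in the notation above.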
The end trajectory of the space manipulator depends not only on the coordinated motion of the manipulator's joints but is also closely related to the attitude of the manipulator's base carrier (the space station). In practical applications, the end trajectory planning function is therefore usually reduced to computing the shortest path from the end-effector to the target, i.e. a straight-line path; when the coordinated motion of the joint angles is then solved, the end trajectory of the manipulator is adjusted under the constraint of minimising the attitude disturbance of the base carrier (space station). The planning of the coordinated joint motion of the space manipulator, i.e. the process of solving by an inverse kinematics algorithm the motion angle and angular velocity of each joint at each discrete trajectory point, is described as follows:
Assume the space manipulator system has n degrees of freedom; the differential kinematics equation of its joints is
[v_e; ω_e] = J_s·[v_0; ω_0] + J_m·Θ̇,   (4-1)
where v_e, ω_e ∈ R³ are respectively the linear and angular velocity of the space manipulator end-effector, v_0, ω_0 ∈ R³ are respectively the linear and angular velocity at the centroid of the space station base, Θ ∈ Rⁿ is the vector of joint angles (n being the number of degrees of freedom of the manipulator), J_s is the Jacobian matrix related to the motion of the space station base, and J_m is the Jacobian matrix related to the motion of the manipulator.
When the space manipulator is in free-floating mode, the position and attitude of the base are uncontrolled, and the overall system formed by the space station and the manipulator satisfies conservation of linear momentum and angular momentum, i.e.
Σ_i m_i·v_i = P_0, Σ_i (I_i·ω_i + r_i × m_i·v_i) = L_0,   (4-2)
where ω_i is the angular velocity of body B_i and P_0, L_0 are the constant initial momenta (typically zero). The above can be simplified into the matrix form
H_b·[v_0; ω_0] + H_m·Θ̇ = 0.   (4-3)
Solving this for ω_0 yields
ω_0 = J_ω·Θ̇,   (4-4)
where J_ω is determined by H_b and H_m.
Substituting (4-4) into (4-1) yields the kinematics equation of the free-floating space manipulator:
[v_e; ω_e] = J_g·Θ̇,   (4-5)
where J_g is the generalised Jacobian matrix of the space manipulator; it is a function of the space station base attitude Ψ_0, the manipulator joint angles Θ, and the mass m_i and inertia I_i of each joint and of the space station. J_gv and J_ga are respectively the generalised Jacobian matrices of velocity and acceleration.
The spatial attitude angle is usually described by the Euler angles E(α, β, γ); the relation between the Cartesian attitude angular velocity and the Euler angle rates is
ω = J_E·Ė.   (4-6)
If the matrix J_E is nonsingular, the relation between the change of the space station attitude angle and the manipulator joint angles can be obtained from formulas (4-4) and (4-6): Ψ̇ = J_E⁻¹·J_ω·Θ̇.
The path planning that minimises the disturbance to the space station attitude can be posed as the optimisation problem
min_{Θ(t)} ∫ ‖Ψ̇‖ dt subject to Ψ_l ≤ Ψ ≤ Ψ_u,   (4-7)
where Ψ_l = [α_l, β_l, γ_l]ᵀ, Ψ_u = [α_u, β_u, γ_u]ᵀ, and Ψ = [α, β, γ]ᵀ denotes the attitude angle of the space station in any state as the manipulator moves from its initial state to the target state, which must be kept within the attitude range of normal space station operation. Formula (4-7) is discretised by equal-interval sampling, and the discretised optimisation problem is then solved, yielding the motion angle Θ_j and angular velocity Θ̇_j of each joint of the space manipulator at each moment.
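Ignoring the base-attitude terms for brevity (i.e. treating the base as fixed, so the J_s contribution drops out), the discretised solution of joint angles along a straight-line end path can be sketched as a resolved-rate loop; the planar two-link arm and all numeric values are illustrative assumptions:

```python
import numpy as np

L1, L2 = 1.0, 0.8                          # link lengths (assumed)

def fk2(q):
    """End-effector position of a planar 2R arm."""
    x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    return np.array([x, y])

def jac(q):
    """Manipulator Jacobian J_m of the planar 2R arm."""
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

def plan(q0, goal, steps=200):
    """Sample the straight-line end path at equal intervals and solve each
    joint increment from the pseudo-inverted Jacobian."""
    q = np.array(q0, float)
    path = [q.copy()]
    targets = np.linspace(fk2(q), goal, steps + 1)   # equal-interval samples
    for target in targets[1:]:
        dq = np.linalg.pinv(jac(q)) @ (target - fk2(q))
        q = q + dq
        path.append(q.copy())
    return np.array(path)

traj = plan([0.3, 0.6], goal=np.array([1.2, 0.6]))
print(fk2(traj[-1]))   # end-effector lands close to the goal
```

In the free-floating case, the generalised Jacobian J_g would replace jac(q), and the base attitude bound Ψ_l ≤ Ψ ≤ Ψ_u would be checked (or enforced) at each discrete step.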
5. Implementation description of the motion simulation module
The motion simulation module refers to virtual simulation software based on digital three-dimensional models of the space manipulator and the space target, covering the three-dimensional model of the space manipulator and its operation behaviour, and the three-dimensional model of the space target and its motion relative to the manipulator end-effector or the vision camera.
The shapes of the space manipulator three-dimensional model and the space target three-dimensional model in the virtual simulation software are completely or approximately consistent with the shapes of the physical space manipulator and the physical space target, and the sizes of the two classes of three-dimensional models are set in a 1:1 proportional relationship with the two classes of physical entities. The spatial relationship between the space manipulator model and the space target model and the steps of their motion control are shown in Figure 3.
The operation behaviour of the space manipulator three-dimensional model mainly refers to the motion of each manipulator joint and the coordinated motion of the base carrier (the space station) with the space manipulator; it is implemented in two steps:
In the first step, the motion of each joint of the space manipulator is set according to the joint motion angle and angular velocity sequences obtained by the operation planning module.
In the second step, the coordinated motion of the base carrier (space station) and the space manipulator is realised by computing the corresponding variation of the base carrier position and attitude from the property that the centre of gravity of the base carrier and manipulator assembly does not move, and setting the base carrier pose corresponding to each motion step of the manipulator according to this variation relation.
The motion of the space target three-dimensional model relative to the manipulator end-effector or the vision camera refers to the change of the pose relation of the space target relative to the end-effector or camera, and must account for both the motion of the space target relative to the space station (the base carrier) and the motion of the space manipulator itself. The space station (the composite of space station and space manipulator) and the space target fly in space, with the station gradually approaching the target. Assuming the flight speed of the station exceeds that of the target by Δv, the position of the target relative to the manipulator end-effector and the vision camera changes by Δv·t. At the same time, the joint motions of the space manipulator change the position and attitude of the space station, the manipulator end-effector and the vision camera, so the pose of the target relative to the end-effector and camera changes as well. Superimposing these two kinds of position and attitude change gives the absolute pose variation of the space target relative to the manipulator end-effector and the vision camera, and the pose change of the space target three-dimensional model is controlled according to this absolute variation relation.
The foregoing are merely preferred embodiments of the present invention and are not intended to limit it; any modification, equivalent substitution, improvement and the like made within the spirit and principles of the present invention shall be included within the scope of protection of the present invention.

Claims (13)

1. A simulation-based teleoperation method for capturing a target with a space manipulator, characterised by comprising the following steps:
step 1: a data source acquisition module acquires image data of the target from data sources of corresponding types through multiple types of plug-ins, uniformly converts the data into a common format and sends it to a middleware module;
step 2: the middleware module receives and stores the converted image data and sends it to a target tracking and positioning module for processing, keeping the difference between the speed at which the data source acquisition module acquires image data and the speed at which the target tracking and positioning module processes image data within a preset range;
step 3: the target tracking and positioning module processes the converted image data to obtain the pose of the target;
step 4: an operation planning module plans the motion of each joint of the space manipulator according to the pose of the target and the obtained pose of the manipulator end-effector;
step 5: a motion simulation module builds a three-dimensional model of the space manipulator and a three-dimensional model of the target, and changes in real time the pose of the space manipulator model and the pose of the target model according to the joint motions planned by the operation planning module;
step 6: the above steps are repeated until the manipulator completes the capture of the target in the motion simulation module.
2. The simulation-based teleoperation method for capturing a target with a space manipulator according to claim 1, characterised in that step 2 specifically comprises the following steps:
step 2.1: a plug-in management unit manages and allocates the multiple types of plug-ins, receives the converted image data and stores it in an adaptive buffer unit;
step 2.2: the adaptive buffer unit keeps the difference between the acquisition speed of the data source acquisition module and the processing speed of the target tracking and positioning module within the preset range;
step 2.3: a data output interface unit sends the image data in the adaptive buffer to the target tracking and positioning module.
3. The simulation-based teleoperation method for capturing a target with a space manipulator according to claim 2, characterised in that step 2.2 specifically comprises the following steps:
let the maximum frame difference allowed by the buffer be F_BMAX, the upper limit of the allowed maximum frame difference be F_MAX, the dropped-frame counter be F_Tf, and the frame difference between image acquisition and image processing be F_P-C;
step 2.2.1: if F_P-C > F_BMAX, execute F_Tf = F_Tf + 1 and jump to step 2.2.2;
if F_P-C < F_BMAX/2, jump to step 2.2.3;
if F_BMAX/2 ≤ F_P-C ≤ F_BMAX, jump to step 2.2.4;
step 2.2.2: if F_Tf ≥ F_BMAX/2, execute F_BMAX = 2·F_BMAX and jump to step 2.2.6; if F_Tf < F_BMAX/2, jump directly to step 2.2.6;
step 2.2.3: if F_P-C ≤ 1, set F_Tf = 0; otherwise execute F_Tf = (F_Tf + 1)/2 until 2·F_Tf ≤ F_P-C, and after this computation jump to step 2.2.5;
step 2.2.4: if F_Tf > 0, execute F_Tf = F_Tf − 1 until 2·F_Tf ≤ F_P-C;
step 2.2.5: if F_BMAX > F_MAX, set F_BMAX = F_BMAX/2;
step 2.2.6: if F_Tf > 0, discard the first 2·F_Tf − 1 frames of image data stored in the buffer, then push the 2·F_Tf-th frame of image data to the target tracking and positioning module, i.e. F_P-C = F_P-C − 2·F_Tf.
4. The simulation-based teleoperation method for capturing a target with a space manipulator according to claim 1, characterised in that step 3 specifically comprises the following steps:
step 3.1: receiving the converted image data and pre-processing it to obtain the edges of the target in the current image frame;
step 3.2: computing the target region in the current image frame from the bounding contour of the edges, wherein the geometric centre of the target region is the position of the target and the size of the target region is the size of the target;
step 3.3: processing the target region in the current image frame with a target tracking algorithm to obtain the target region in the next image frame;
step 3.4: applying an intensity-weighted geometric-centre computation to the pixels in the target region of the next image frame to obtain the target tracking position;
step 3.5: computing the target projection position and obtaining the pose of the target in each image frame through a measurement model that minimises the offset error between the target projection position and the target tracking position.
5. The simulation-based teleoperation method for capturing a target with a space manipulator according to claim 4, characterised in that step 3.1 specifically comprises the following steps:
step 3.1.1: Gaussian-filtering the image, selecting a threshold T_1, setting the value of pixels whose grey level exceeds T_1 to 255 and the value of pixels whose grey level is below T_1 to 0, and thereby binarising the Gaussian-filtered image to obtain a binary image;
step 3.1.2: processing the binary image with an erosion-dilation algorithm to eliminate isolated bright spots in the binary image, so that only the target region in the binary image has grey value 255 while all other positions have grey value 0;
step 3.1.3: performing edge extraction on the eroded and dilated binary image.
6. The simulation-based teleoperation method for capturing a target with a space manipulator according to claim 5, characterised in that the target tracking algorithm in step 3.3 is a tracking algorithm fusing Camshift and Kalman filtering.
7. The simulation-based teleoperation method for capturing a target with a space manipulator according to claim 4, characterised in that in step 3.4 a threshold T_0 is set, and all pixels in the target region whose grey value is greater than or equal to T_0 are subjected to an intensity-weighted averaging operation according to their positions in the Gaussian-filtered image, giving the intensity-weighted geometric-centre coordinates, computed as:
x_c = Σ_{(x,y)∈R} x·I(x,y) / Σ_{(x,y)∈R} I(x,y), y_c = Σ_{(x,y)∈R} y·I(x,y) / Σ_{(x,y)∈R} I(x,y),
wherein x_c denotes the x coordinate of the intensity-weighted geometric centre, y_c denotes its y coordinate, x and y denote the coordinates of a pixel in the image, I(x, y) denotes the grey value at coordinate (x, y), and R denotes the target region.
8. The simulation-based teleoperation method for capturing a target with a space manipulator according to claim 4, characterised in that step 3.5 specifically comprises the following steps:
step 3.5.1: estimating the three-dimensional coordinate position of the target in the world coordinate system;
step 3.5.2: computing, from this three-dimensional coordinate position, the projection coordinates of the target in the image using the perspective projection model, giving the target projection position;
step 3.5.3: comparing the target projection position with the target tracking position and computing the offset error;
step 3.5.4: correcting the position of the target in the world coordinate system by the least squares method according to the offset error;
step 3.5.5: repeating the above until convergence, i.e. until the offset error is below a pre-specified threshold.
9. The simulation-based teleoperation method for capturing a target with a space manipulator according to claim 1, characterised in that step 4 specifically comprises the following steps:
step 4.1: a manipulator end-effector trajectory planning unit plans the movement path of the manipulator end-effector according to the end-effector pose computed from the manipulator configuration and the target pose obtained by the target tracking and positioning module;
step 4.2: a manipulator joint-coordination planning unit solves, from the discrete points of the planned end-effector movement path, the motion angle and angular velocity of each manipulator joint at each discrete point by an inverse kinematics algorithm.
10. A simulation-based teleoperation system for capturing a target with a space manipulator, characterised by comprising:
a data source acquisition module, for acquiring image data of the target from data sources of corresponding types through multiple types of plug-ins, uniformly converting it into a common format and sending it to a middleware module;
the middleware module, for receiving and storing the converted image data and sending it to a target tracking and positioning module for processing, while keeping the difference between the speed at which the data source acquisition module acquires image data and the speed at which the target tracking and positioning module processes image data within a preset range;
the target tracking and positioning module, for processing the converted image data to obtain the pose of the target;
an operation planning module, for planning the motion of each joint of the space manipulator according to the pose of the target and the obtained pose of the manipulator end-effector;
a motion simulation module, for building a three-dimensional model of the space manipulator and a three-dimensional model of the target, and changing in real time the pose of the space manipulator model and the pose of the target model according to the joint motions planned by the operation planning module.
11. The simulation-based teleoperation system for capturing a target with a space manipulator according to claim 10, characterised in that the middleware module comprises a plug-in management unit, an adaptive buffer unit and a unified data output interface unit;
the plug-in management unit is for loading, managing and allocating the multiple types of plug-ins, and for receiving the converted image data and storing it in the adaptive buffer unit;
the adaptive buffer unit is for keeping the difference between the acquisition speed of the data source acquisition module and the processing speed of the target tracking and positioning module within the preset range;
the unified data output interface unit is for sending the image data in the adaptive buffer to the target tracking and positioning module.
12. The simulation-based teleoperation system for capturing a target with a space manipulator according to claim 10, characterised in that the target tracking and positioning module comprises an image pre-processing unit, a target detection unit, a target tracking unit, a position solving unit and a pose solving unit;
the image pre-processing unit is for receiving the converted image data and pre-processing it to obtain the edges of the target in the current image frame;
the target detection unit is for computing the target region in the current image frame from the bounding contour of the edges, wherein the geometric centre of the target region is the position of the target and the size of the target region is the size of the target;
the target tracking unit is for processing the target region in the current image frame with a target tracking algorithm to obtain the target region in the next image frame;
the position solving unit is for applying an intensity-weighted geometric-centre computation to the pixels in the target region of the next image frame to obtain the target tracking position;
the pose solving unit is for obtaining the pose of the target in each image frame through a measurement model that minimises the offset error between the target projection position and the target tracking position.
13. The simulation-based teleoperation system for capturing a target with a space manipulator according to claim 10, characterised in that the operation planning module comprises a manipulator end-effector trajectory planning unit and a manipulator joint-coordination planning unit;
the manipulator end-effector trajectory planning unit is for planning the movement path of the manipulator end-effector according to the end-effector pose and the target pose obtained by the target tracking and positioning module;
the manipulator joint-coordination planning unit is for solving, from the discrete points of the planned end-effector movement path, the motion angle and angular velocity of each manipulator joint at each discrete point by an inverse kinematics algorithm.
CN201610903204.5A 2016-10-17 2016-10-17 Space manipulator target capturing teleoperation method and system based on simulation Active CN106651949B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610903204.5A CN106651949B (en) 2016-10-17 2016-10-17 Space manipulator target capturing teleoperation method and system based on simulation


Publications (2)

Publication Number Publication Date
CN106651949A true CN106651949A (en) 2017-05-10
CN106651949B CN106651949B (en) 2020-05-15

Family

ID=58855840

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610903204.5A Active CN106651949B (en) 2016-10-17 2016-10-17 Space manipulator target capturing teleoperation method and system based on simulation

Country Status (1)

Country Link
CN (1) CN106651949B (en)

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107097231A (en) * 2017-07-06 2017-08-29 哈尔滨工业大学深圳研究生院 A kind of concentric tube robot precise motion control method of view-based access control model servo
CN107220601A (en) * 2017-05-18 2017-09-29 西北工业大学 A kind of target based on online Confidence arrests point prediction method
CN107696033A (en) * 2017-09-18 2018-02-16 北京控制工程研究所 A kind of space manipulator track Rolling Planning method of view-based access control model measurement
CN107932502A (en) * 2017-11-07 2018-04-20 陕西科技大学 A kind of SCARA method for planning track of robot based on binocular stereo vision
CN108656109A (en) * 2018-05-07 2018-10-16 中国科学院力学研究所 A kind of remote-operated training method and system
CN108748149A (en) * 2018-06-04 2018-11-06 上海理工大学 Based on deep learning without calibration mechanical arm grasping means under a kind of complex environment
CN108828935A (en) * 2018-05-07 2018-11-16 中国科学院力学研究所 A kind of intelligent auxiliary operation method and system of remote operation
CN109093376A (en) * 2018-08-17 2018-12-28 清华大学 A kind of multiaxis hole automation alignment methods based on laser tracker
CN109531566A (en) * 2018-11-16 2019-03-29 国网江苏省电力有限公司盐城供电分公司 A kind of robot livewire work control method based on virtual reality system
CN109583260A (en) * 2017-09-28 2019-04-05 北京猎户星空科技有限公司 A kind of collecting method, apparatus and system
WO2019080228A1 (en) * 2017-10-25 2019-05-02 南京阿凡达机器人科技有限公司 Robot object-grasping control method and apparatus
CN110216698A (en) * 2019-03-11 2019-09-10 浙江工业大学 A kind of mechanical arm remote control system based on ROS
WO2020010876A1 (en) * 2018-07-09 2020-01-16 五邑大学 Mechanical arm control method based on least squares method for use in robot experimental teaching
CN110806197A (en) * 2019-09-28 2020-02-18 上海翊视皓瞳信息科技有限公司 Gesture detecting system based on intelligent vision equipment
CN110815215A (en) * 2019-10-24 2020-02-21 上海航天控制技术研究所 Multi-mode fused rotating target approaching and stopping capture ground test system and method
CN111109417A (en) * 2019-12-23 2020-05-08 重庆大学 Route is from planning sugar-painter based on image information
CN111823225A (en) * 2020-06-04 2020-10-27 江汉大学 Visual servo three-dimensional simulation method and device
CN111890365A (en) * 2020-07-31 2020-11-06 平安科技(深圳)有限公司 Target tracking method and device, computer equipment and storage medium
CN112109075A (en) * 2019-06-20 2020-12-22 欧姆龙株式会社 Control system and control method
CN112148000A (en) * 2020-08-28 2020-12-29 上海宇航***工程研究所 In-cabin simulation platform for simulating operation scene of space maintenance robot and implementation method
CN112621789A (en) * 2020-12-08 2021-04-09 广东联航智能科技有限公司 Control system of robot for double-arm man-machine cooperative operation
CN112763253A (en) * 2020-12-28 2021-05-07 深圳市人工智能与机器人研究院 Sampling control method and device of mechanical arm and sampling system
CN112847334A (en) * 2020-12-16 2021-05-28 北京无线电测量研究所 Mechanical arm target tracking method based on visual servo
CN113103230A (en) * 2021-03-30 2021-07-13 山东大学 Human-computer interaction system and method based on remote operation of treatment robot
CN113352327A (en) * 2021-06-28 2021-09-07 深圳亿嘉和科技研发有限公司 Five-degree-of-freedom mechanical arm joint variable determination method
CN113479442A (en) * 2021-07-16 2021-10-08 上海交通大学烟台信息技术研究院 Device and method for realizing intelligent labeling of unstructured objects on production line
CN114078158A (en) * 2020-08-14 2022-02-22 边辕视觉科技(上海)有限公司 Method for automatically acquiring characteristic point parameters of target object
CN114102610A (en) * 2021-12-30 2022-03-01 浙江博采传媒有限公司 Mechanical arm simulation control method and device and storage medium
CN114770513A (en) * 2022-05-09 2022-07-22 重庆大学 Moving target tracking and grabbing method of industrial four-axis robot
CN116214549A (en) * 2023-01-19 2023-06-06 中国科学院微小卫星创新研究院 Teleoperation system and teleoperation method for space robot

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101726296A (en) * 2009-12-22 2010-06-09 哈尔滨工业大学 Vision measurement, path planning and GNC integrated simulation system for space robot
CN101733746A (en) * 2009-12-22 2010-06-16 哈尔滨工业大学 Autonomously identifying and capturing method of non-cooperative target of space robot
CN103926845A (en) * 2014-04-17 2014-07-16 哈尔滨工业大学 Ground-based simulation system for space robot visual servo to capture moving target and simulation method
CN105635648A (en) * 2014-10-28 2016-06-01 江苏绿扬电子仪器集团有限公司 Video real-time edge detection system
CN106003104A (en) * 2015-07-03 2016-10-12 中国运载火箭技术研究院 Mechanical arm planning method suitable for visual information guiding under multi-constrained condition


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
孙凯等: "《多目标跟踪的改进Camshift/卡尔曼滤波组合算法》", 《信息与控制》 *

Cited By (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107220601A (en) * 2017-05-18 2017-09-29 西北工业大学 Target capture point prediction method based on online confidence
CN107097231A (en) * 2017-07-06 2017-08-29 哈尔滨工业大学深圳研究生院 Precise motion control method for concentric tube robot based on visual servoing
CN107696033A (en) * 2017-09-18 2018-02-16 北京控制工程研究所 Rolling planning method for space manipulator trajectory based on visual measurement
CN109583260A (en) * 2017-09-28 2019-04-05 北京猎户星空科技有限公司 Collecting method, apparatus and system
WO2019080228A1 (en) * 2017-10-25 2019-05-02 南京阿凡达机器人科技有限公司 Robot object-grasping control method and apparatus
CN107932502A (en) * 2017-11-07 2018-04-20 陕西科技大学 Trajectory planning method for SCARA robot based on binocular stereo vision
CN108656109A (en) * 2018-05-07 2018-10-16 中国科学院力学研究所 Teleoperation training method and system
CN108828935B (en) * 2018-05-07 2022-05-13 中国科学院力学研究所 Intelligent auxiliary operation method and system for remote operation
CN108828935A (en) * 2018-05-07 2018-11-16 中国科学院力学研究所 Intelligent auxiliary operation method and system for remote operation
CN108748149A (en) * 2018-06-04 2018-11-06 上海理工大学 Non-calibration mechanical arm grabbing method based on deep learning in complex environment
CN108748149B (en) * 2018-06-04 2021-05-28 上海理工大学 Non-calibration mechanical arm grabbing method based on deep learning in complex environment
WO2020010876A1 (en) * 2018-07-09 2020-01-16 五邑大学 Mechanical arm control method based on least squares method for use in robot experimental teaching
CN109093376B (en) * 2018-08-17 2020-04-03 清华大学 Multi-axis hole automatic alignment method based on laser tracker
CN109093376A (en) * 2018-08-17 2018-12-28 清华大学 Multi-axis hole automatic alignment method based on laser tracker
CN109531566A (en) * 2018-11-16 2019-03-29 国网江苏省电力有限公司盐城供电分公司 Robot live-line work control method based on virtual reality system
CN109531566B (en) * 2018-11-16 2022-08-19 国网江苏省电力有限公司盐城供电分公司 Robot live-line work control method based on virtual reality system
CN110216698A (en) * 2019-03-11 2019-09-10 浙江工业大学 Mechanical arm remote control system based on ROS
CN112109075A (en) * 2019-06-20 2020-12-22 欧姆龙株式会社 Control system and control method
CN112109075B (en) * 2019-06-20 2024-05-17 欧姆龙株式会社 Control system and control method
CN110806197A (en) * 2019-09-28 2020-02-18 上海翊视皓瞳信息科技有限公司 Gesture detecting system based on intelligent vision equipment
CN110815215A (en) * 2019-10-24 2020-02-21 上海航天控制技术研究所 Multi-mode fused rotating target approaching and stopping capture ground test system and method
CN111109417A (en) * 2019-12-23 2020-05-08 重庆大学 Sugar-painting machine with route self-planning based on image information
CN111823225A (en) * 2020-06-04 2020-10-27 江汉大学 Visual servo three-dimensional simulation method and device
CN111890365A (en) * 2020-07-31 2020-11-06 平安科技(深圳)有限公司 Target tracking method and device, computer equipment and storage medium
CN114078158A (en) * 2020-08-14 2022-02-22 边辕视觉科技(上海)有限公司 Method for automatically acquiring characteristic point parameters of target object
CN112148000A (en) * 2020-08-28 2020-12-29 上海宇航系统工程研究所 In-cabin simulation platform for simulating operation scene of space maintenance robot and implementation method
CN112148000B (en) * 2020-08-28 2022-10-21 上海宇航系统工程研究所 In-cabin simulation platform for simulating operation scene of space maintenance robot and implementation method
CN112621789A (en) * 2020-12-08 2021-04-09 广东联航智能科技有限公司 Control system of robot for double-arm man-machine cooperative operation
CN112847334A (en) * 2020-12-16 2021-05-28 北京无线电测量研究所 Mechanical arm target tracking method based on visual servo
CN112763253A (en) * 2020-12-28 2021-05-07 深圳市人工智能与机器人研究院 Sampling control method and device of mechanical arm and sampling system
CN112763253B (en) * 2020-12-28 2024-03-29 深圳市人工智能与机器人研究院 Sampling control method and device for mechanical arm and sampling system
CN113103230A (en) * 2021-03-30 2021-07-13 山东大学 Human-computer interaction system and method based on remote operation of treatment robot
CN113352327A (en) * 2021-06-28 2021-09-07 深圳亿嘉和科技研发有限公司 Five-degree-of-freedom mechanical arm joint variable determination method
CN113479442A (en) * 2021-07-16 2021-10-08 上海交通大学烟台信息技术研究院 Device and method for realizing intelligent labeling of unstructured objects on production line
CN114102610A (en) * 2021-12-30 2022-03-01 浙江博采传媒有限公司 Mechanical arm simulation control method and device and storage medium
CN114770513A (en) * 2022-05-09 2022-07-22 重庆大学 Moving target tracking and grabbing method of industrial four-axis robot
CN114770513B (en) * 2022-05-09 2024-07-12 重庆大学 Industrial four-axis robot moving target tracking and grabbing method
CN116214549A (en) * 2023-01-19 2023-06-06 中国科学院微小卫星创新研究院 Teleoperation system and teleoperation method for space robot
CN116214549B (en) * 2023-01-19 2024-03-01 中国科学院微小卫星创新研究院 Teleoperation system and teleoperation method for space robot

Also Published As

Publication number Publication date
CN106651949B (en) 2020-05-15

Similar Documents

Publication Publication Date Title
CN106651949A (en) Teleoperation method and system for grabbing objects using space mechanical arm based on simulation
CN109719730B (en) Digital twin robot for flexible assembly process of circuit breaker
CN109960880B (en) Industrial robot obstacle avoidance path planning method based on machine learning
Woods et al. Autonomous science for an ExoMars Rover–like mission
CN111325768B (en) Free floating target capture method based on 3D vision and simulation learning
CN106030430A (en) Multi-sensor fusion for robust autonomous flight in indoor and outdoor environments with a rotorcraft micro-aerial vehicle (MAV)
CN112102403B (en) High-precision positioning method and system for autonomous inspection unmanned aerial vehicle in power transmission tower scene
CN109764876B (en) Multi-mode fusion positioning method of unmanned platform
CN111812978A (en) Cooperative SLAM method and system for multiple unmanned aerial vehicles
Mirolo et al. A solid modelling system for robot action planning
Rossmann erobotics: The symbiosis of advanced robotics and virtual reality technologies
CN113524173A (en) End-to-end intelligent capture method for extraterrestrial detection sample
CN112060088A (en) Non-cooperative target accurate capturing teleoperation method under variable time delay condition
Saunders et al. Parallel reinforcement learning simulation for visual quadrotor navigation
CN108151742B (en) Navigation control method and intelligent device for robot
CN108595771B (en) Spacecraft equipment field of view simulation analysis method
Castano et al. Opportunistic rover science: finding and reacting to rocks, clouds and dust devils
Sherwood et al. An integrated planning and scheduling prototype for automated Mars rover command generation
Estlin et al. Enabling autonomous rover science through dynamic planning and scheduling
Romero-Azpitarte et al. Enabling in-situ resources utilisation by leveraging collaborative robotics and astronaut-robot interaction
Zolghadr et al. Locating a two-wheeled robot using extended Kalman filter
Bahrpeyma et al. Application of Reinforcement Learning to UR10 Positioning for Prioritized Multi-Step Inspection in NVIDIA Omniverse
Borst et al. Telerobotic ground control of a space free-flyer
Covasan et al. Autonomy Challenges for the Next Generation of Mars Rovers
Liang Affecting Fundamental Transformation in Future Construction Work Through Replication of the Master-Apprentice Learning Model in Human-Robot Worker Teams

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant