CN105690371A - Space service robot-oriented hand-eye system - Google Patents


Info

Publication number
CN105690371A
CN105690371A · CN201410697687.9A
Authority
CN
China
Prior art keywords
robot
hand
eye system
mechanical arm
service robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201410697687.9A
Other languages
Chinese (zh)
Inventor
梁帆
代凤飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin University of Technology
Original Assignee
Tianjin University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin University of Technology filed Critical Tianjin University of Technology
Priority to CN201410697687.9A priority Critical patent/CN105690371A/en
Publication of CN105690371A publication Critical patent/CN105690371A/en
Pending legal-status Critical Current

Landscapes

  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a hand-eye system for a space service robot. The hand-eye system has a master-control Pioneer robot platform (1) and a robot self-positioning camera (2), and further comprises a mechanical arm (3), a mechanical gripper (4) and a miniature camera (5) attached to the mechanical arm (3), a simulated space-capsule operation panel (6) with control buttons (7) on the operation panel (6), and a power supply module (8). The invention thus provides a vision-based hand-eye system for a mobile robot that can assist an astronaut in completing tasks.

Description

A hand-eye system for a space service robot
Technical field
The present invention meets the need, under simulated space weightlessness, to use a robot to assist an astronaut in completing a given task. It solves the problem of how a service robot can autonomously recognize a target, autonomously locate it, and autonomously press a button. On the Pioneer mobile-robot platform, a vision-based visual feedback system for a mobile robot is proposed and implemented. Building on image-processing algorithms such as the Hough transform and the CamShift tracking algorithm, target image features are extracted; a vision measurement model established from the pinhole imaging principle gives the mapping between the image and the target, so that the spatial position of the target point is obtained in real time and the manipulator is controlled to complete the assigned task.
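The pinhole vision measurement model mentioned above can be sketched as follows. This is a minimal illustration, not the patented implementation; the focal length and target size used in the example are hypothetical values, not data from the patent.

```python
# Minimal sketch of a monocular pinhole measurement model: with the real
# size of the target known a priori, its apparent size in pixels yields
# depth, and an image point back-projects to a viewing ray.
# All numeric values below are hypothetical examples.

def depth_from_pinhole(focal_px: float, real_width_m: float,
                       pixel_width: float) -> float:
    """Depth Z from similar triangles: w_px / f = W_real / Z."""
    return focal_px * real_width_m / pixel_width

def pixel_to_ray(u: float, v: float, cx: float, cy: float,
                 focal_px: float) -> tuple:
    """Back-project image point (u, v) to a normalized viewing ray."""
    return ((u - cx) / focal_px, (v - cy) / focal_px, 1.0)

# Example: a 5 cm button imaged 40 px wide by a camera with an 800 px
# focal length sits at depth Z = 800 * 0.05 / 40 = 1.0 m.
z = depth_from_pinhole(800.0, 0.05, 40.0)
ray = pixel_to_ray(400.0, 300.0, 320.0, 240.0, 800.0)
```

Scaling the normalized ray by the recovered depth gives the 3-D position of the target point in the camera frame, which is the image-to-target mapping the description refers to.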
Background art
With the development of science and technology, humans will increasingly enter space. Under space weightlessness, however, an astronaut's work is subject to many restrictions. The astronaut can therefore issue job tasks and action commands through EEG and EMG biosignals and voice signals, letting a space service robot leave the capsule in the astronaut's place and complete the assigned task. To verify such multi-modal human-machine interface technology, it is necessary to develop a service robot with a certain degree of intelligence. In a simulated space environment, the service robot autonomously operates the spacecraft control panel after receiving the astronaut's instruction. How to make the service robot autonomously recognize a target, autonomously locate it, and autonomously press a button is thus the problem to be solved.
Among robots intended to serve in space there are almost no domestic or foreign patented inventions. The Spirit Mars rover, launched on June 10, 2003 from Cape Canaveral Air Force Station in the United States, and its twin Opportunity are space robots; on January 4, 2004, Spirit landed in Gusev crater in the Martian southern hemisphere. Spirit is 1.6 m long, 2.3 m wide, and 1.5 m high, and weighs 174 kg. Its "brain" is a computer able to execute about 20 million instructions per second. Its main task is to detect whether water and life exist on Mars and to analyze the Martian material composition, so as to infer whether Mars could be transformed to support life. Spirit and Opportunity, however, carried out surveys and detection after landing; they were not designed to serve astronauts inside a space capsule. A service robot oriented to space is therefore well worth developing.
Summary of the invention
The object of the invention is to design a monocular-vision-based hand-eye system for a mobile robot on the Pioneer mobile-robot platform. Digital image-processing methods such as the Hough transform and the CamShift tracking algorithm extract target image features; combined with prior knowledge and a mathematical model established from the pinhole imaging principle, the spatial position of the target is obtained in real time, completing the task of recognizing the target and guiding the manipulator to press a button. The technical content of the invention is described in detail below:
With reference to Fig. 1, the robot autonomously recognizes the target, autonomously locates it, and autonomously presses the button according to the given task. Fig. 1 is the block diagram of the position-based visual control of the system. Through image acquisition and feature extraction, the system extracts feature-point coordinates in the two-dimensional image plane. Using the geometric models of the target and the camera, it then estimates the position relationship between the target and the camera. The error between the estimated and desired positions is passed to the robot controller, which computes the manipulator control signal from this error and drives the manipulator to complete the task.
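The control loop described above can be sketched as a simple position-based visual servoing iteration. This is an illustrative sketch under the assumption of a plain proportional control law; the gain and positions are hypothetical, since the patent does not specify the controller.

```python
# Illustrative position-based visual servoing loop: the error between the
# estimated target position and the current hand position is fed to a
# proportional controller that moves the hand toward the target.
# The gain and coordinates are hypothetical, not the patent's values.

def servo_step(hand_pos, target_pos, gain=0.5):
    """One control iteration: command = gain * (target - estimate)."""
    error = [t - h for t, h in zip(target_pos, hand_pos)]
    command = [gain * e for e in error]
    new_pos = [h + c for h, c in zip(hand_pos, command)]
    return new_pos, error

hand = [0.0, 0.0, 0.0]
target = [0.2, -0.1, 0.5]   # desired button position in metres (example)
for _ in range(20):          # iterate until the position error is negligible
    hand, err = servo_step(hand, target)
```

Each iteration halves the remaining error, so the hand converges geometrically to the target; in the real system the estimate on the right-hand side would come from the camera measurement at each frame rather than from stored state.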
With the target information known, depth information is obtained from the target image captured by the camera; a monocular ranging model is established, and the physical shape and color features of the target are then extracted. Recognizing both the color features and the contour improves recognition accuracy. Before feature extraction, the target image is smoothed and filtered (for example, by dilation and erosion) according to the feature attributes. Dilation convolves the image with a kernel, generally a small filled square or disk centered on a reference point; erosion is the inverse operation, taking the minimum over the kernel area. The captured RGB image is first converted to a grayscale image with the color-space conversion function cvCvtColor. To suppress high-frequency noise that would interfere with circle detection, the image is then smoothed with the filter function cvSmooth using a 9 × 9 Gaussian kernel. Finally, circles are detected with the Hough circle transform function cvHoughCircles. The accumulator resolution for finding circle centers is set to 6, the minimum distance between two distinct circles to 120, the upper edge threshold to 36, the accumulator threshold to 85, the minimum circle radius to 3, and the maximum circle radius to 250.
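As an illustration of the Hough-transform circle detection step described above, the following simplified NumPy sketch votes only for circle centers at a known radius. It is a didactic sketch of the voting principle, not the gradient-based method implemented by cvHoughCircles, and its parameters are unrelated to the patent's cvHoughCircles settings.

```python
import numpy as np

# Simplified Hough-style circle detection: each edge point votes for every
# center that could place it on a circle of the given (known) radius; the
# accumulator maximum is taken as the detected center.
def hough_circle_center(edge_points, radius, shape, n_angles=180):
    acc = np.zeros(shape, dtype=np.int32)
    thetas = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    for x, y in edge_points:
        cx = np.round(x - radius * np.cos(thetas)).astype(int)
        cy = np.round(y - radius * np.sin(thetas)).astype(int)
        ok = (cx >= 0) & (cx < shape[1]) & (cy >= 0) & (cy < shape[0])
        np.add.at(acc, (cy[ok], cx[ok]), 1)   # cast one vote per candidate
    row, col = np.unravel_index(np.argmax(acc), acc.shape)
    return col, row   # (x, y) of the strongest center candidate

# Synthetic input: edge points of a circle centered at (50, 40), radius 20.
angles = np.linspace(0.0, 2.0 * np.pi, 60, endpoint=False)
points = [(50 + 20 * np.cos(a), 40 + 20 * np.sin(a)) for a in angles]
center = hough_circle_center(points, 20, (100, 100))
```

The real cvHoughCircles additionally searches over radii and uses edge gradients to prune the vote space, which is what its accumulator-resolution and threshold parameters control.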
Compared with the prior art, the present invention has the following advantages and effects; the structure is analyzed accordingly and appropriate data are provided.
The technical problem to be solved by the present patent is to provide a vision-based hand-eye system for a mobile robot that can assist an astronaut in completing tasks.
The system has a master-control Pioneer robot platform and further includes a mechanical arm, a miniature camera attached to the arm, a simulated space-capsule operation panel with control buttons on the panel, and a power supply module. The robot platform is a P3-DX Pioneer robot running Windows XP internally; it serves as the control core, with the corresponding mechanical and electrical structures attached peripherally to realize the overall hand-eye system. The mechanical arm is a seven-degree-of-freedom Robai manipulator, and the Cyton C++ API it provides is used to build the manipulator control platform. The miniature camera is a Canon high-definition camera less than 1 cm in diameter, connected to the robot via a USB port for acquiring and transmitting images of the operation panel. The space-capsule operation panel and its control buttons are self-developed and self-laid-out; the six buttons are in six different colors for recognition and control by the system, and the whole operation panel simulates the space-capsule environment.
Brief description of the drawings:
Fig. 1 is the visual system position control theory diagram of native system;
Fig. 2 is the overall appearance figure of native system;
Fig. 3 is the space capsule operation panel of native system;
Embodiment:
The hand-eye system for a space service robot of the present invention enables the robot to autonomously recognize the target, autonomously locate it, and autonomously press the button according to the given task.
With reference to Fig. 1, through image acquisition and feature extraction the system extracts feature-point coordinates in the two-dimensional image plane. Using the geometric models of the target and the camera, it then estimates the position relationship between the target and the camera. The error between the estimated and desired positions is passed to the robot controller, which computes the manipulator control signal from this error and drives the manipulator to complete the task.
With reference to the robot shown in Fig. 2, to meet the task requirements the main hardware of the system mounts a seven-degree-of-freedom manipulator on the Pioneer mobile-robot platform, with a small camera installed at the end of the arm to build the hand-eye system.
With reference to Fig. 3, the space-capsule operation panel has six buttons and a knob. After the robot is ordered to press a button of a given color, the detection target is identified by extracting both shape and color features, and the recognition result is calibrated.
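The color part of the dual-feature identification described above can be sketched as a nearest-reference-color classifier. The reference RGB values for the six buttons below are hypothetical placeholders; the patent does not list the actual button colors or their calibrated values.

```python
# Classify a measured button color as the nearest of six reference colors.
# The reference RGB values are hypothetical stand-ins for the panel's six
# button colors, which the patent does not specify.

REFERENCE_COLORS = {
    "red":     (255, 0, 0),
    "green":   (0, 255, 0),
    "blue":    (0, 0, 255),
    "yellow":  (255, 255, 0),
    "cyan":    (0, 255, 255),
    "magenta": (255, 0, 255),
}

def classify_color(rgb):
    """Return the reference color name with the smallest squared RGB distance."""
    def dist2(ref):
        return sum((a - b) ** 2 for a, b in zip(rgb, ref))
    return min(REFERENCE_COLORS, key=lambda name: dist2(REFERENCE_COLORS[name]))

label = classify_color((250, 30, 20))   # a slightly off-red measurement
```

In practice such a classifier would run on the mean color of the pixels inside each circle returned by the Hough detection, so that the shape and color cues jointly confirm the target.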

Claims (3)

1. A hand-eye system for a space service robot, having a master-control Pioneer robot platform (1) and a robot self-positioning camera (2), and further comprising a mechanical arm (3), a mechanical gripper (4) and a miniature camera (5) attached to the mechanical arm (3), a simulated space-capsule operation panel (6) with control buttons (7) on the operation panel (6), and a power supply module (8); the invention thereby provides a vision-based hand-eye system for a mobile robot that can assist an astronaut in completing tasks.
2. The hand-eye system for a space service robot according to claim 1, characterized in that the mechanical arm (3) has seven degrees of freedom, is mounted on the top of the host robot platform (1), and can face forward toward the front of the mechanical gripper (4).
3. The hand-eye system for a space service robot according to claim 1, characterized in that the miniature camera (5) is mounted on the top surface of the end of the mechanical arm (3), so that it can capture and recognize the forward image of the mechanical gripper (4).
CN201410697687.9A 2014-11-28 2014-11-28 Space service robot-oriented hand-eye system Pending CN105690371A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410697687.9A CN105690371A (en) 2014-11-28 2014-11-28 Space service robot-oriented hand-eye system


Publications (1)

Publication Number Publication Date
CN105690371A true CN105690371A (en) 2016-06-22

Family

ID=56294141

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410697687.9A Pending CN105690371A (en) 2014-11-28 2014-11-28 Space service robot-oriented hand-eye system

Country Status (1)

Country Link
CN (1) CN105690371A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107914272A (en) * 2017-11-20 2018-04-17 北京科技大学 A kind of method of seven freedom robot assemblies crawl target object
CN107914272B (en) * 2017-11-20 2020-06-05 北京科技大学 Method for grabbing target object by seven-degree-of-freedom mechanical arm assembly

Similar Documents

Publication Publication Date Title
US11691273B2 (en) Generating a model for an object encountered by a robot
US11034026B2 (en) Utilizing optical data to dynamically control operation of a snake-arm robot
JP6586532B2 (en) Deep machine learning method and apparatus for robot gripping
CN112634318B (en) Teleoperation system and method for underwater maintenance robot
WO2020243204A1 (en) Robotic control based on 3d bounding shape, for an object, generated using edge-depth values for the object
CN110246127A (en) Workpiece identification and localization method and system, sorting system based on depth camera
WO2022021156A1 (en) Method and apparatus for robot to grab three-dimensional object
US20180311818A1 (en) Automated personalized feedback for interactive learning applications
CN111462154A (en) Target positioning method and device based on depth vision sensor and automatic grabbing robot
CN105892633A (en) Gesture identification method and virtual reality display output device
Zhao et al. Image-based visual servoing using improved image moments in 6-DOF robot systems
EP3790710A1 (en) Robotic manipulation using domain-invariant 3d representations predicted from 2.5d vision data
CN110555404A (en) Flying wing unmanned aerial vehicle ground station interaction device and method based on human body posture recognition
CN113172659A (en) Flexible robot arm shape measuring method and system based on equivalent central point recognition
CN111624875A (en) Visual servo control method and device and unmanned equipment
CN117103277A (en) Mechanical arm sensing method based on multi-mode data fusion
CN113483664B (en) Screen plate automatic feeding system and method based on line structured light vision
Nigro et al. Assembly task execution using visual 3D surface reconstruction: An integrated approach to parts mating
CN105690371A (en) Space service robot-oriented hand-eye system
Schnaubelt et al. Autonomous assistance for versatile grasping with rescue robots
Kondaxakis et al. Real-time recognition of pointing gestures for robot to robot interaction
Sahu et al. Shape features for image-based servo-control using image moments
Peñalver et al. Multi-view underwater 3D reconstruction using a stripe laser light and an eye-in-hand camera
Nakhaeinia et al. Adaptive robotic contour following from low accuracy RGB-D surface profiling and visual servoing
CN111158489B (en) Gesture interaction method and gesture interaction system based on camera

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20160622