CN108181897A - Method for automatic path tracking of a biped robot - Google Patents

Method for automatic path tracking of a biped robot

Info

Publication number
CN108181897A
CN108181897A (application CN201711306559.7A)
Authority
CN
China
Prior art keywords
path
image
robot
center line
line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201711306559.7A
Other languages
Chinese (zh)
Inventor
庄礼鸿
章建森
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huaqiao University
Original Assignee
Huaqiao University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huaqiao University
Priority to CN201711306559.7A
Publication of CN108181897A
Legal status: Pending


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B62 LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D MOTOR VEHICLES; TRAILERS
    • B62D57/00 Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track
    • B62D57/02 Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members
    • B62D57/032 Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and legs; with alternately or sequentially lifted feet or skid
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G06T1/0014 Image feed-back for automatic industrial control, e.g. robot with camera
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G06T7/68 Analysis of geometric attributes of symmetry
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30172 Centreline of tubular or elongated structure
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30241 Trajectory

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Electromagnetism (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Multimedia (AREA)
  • Chemical & Material Sciences (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Image Analysis (AREA)

Abstract

The present invention provides a method for automatic path tracking of a biped robot, comprising: a visual sensor mounted on the biped robot collects navigation path image information of the area at the robot's feet and sends it to a processor; the processor processes the received navigation path image information, obtains the relative position relation between the robot's current pose and the navigation path, and sends it to a controller; the relative position relation comprises an angular deviation and a position deviation; the controller controls the robot to adjust its walking path according to the received angular deviation and position deviation, so as to achieve automatic tracking. The invention collects navigation path image information with a visual sensor; path information is obtained by graying the image, mean filtering, Canny edge detection and extraction of path edge coordinates, and its accuracy is improved by sectioned scanning; for path recognition at crossed paths, a slope-matching method is proposed to select the best forward path. The method therefore offers good real-time performance and interference resistance.

Description

Method for automatic path tracking of a biped robot
Technical field
The present invention relates to the technical field of robot visual navigation, and in particular to a method for automatic path tracking of a biped robot.
Background technology
Visual navigation of mobile robots is an important direction in current robotics research. Vision-based indoor navigation can be divided into three classes: map-based navigation, navigation based on map building, and map-less navigation. As service robots are promoted and enter households, robots are required to complete related service tasks independently in indoor environments, the core of which is indoor robot navigation technology. Gartshore proposed a navigation algorithm using an occupancy-grid map-building framework and feature position detection, processing RGB color image sequences online from a single camera. The algorithm first detects object contour edges in the current image frame with a Harris edge and corner detector, and scanning the edge features determines the peaks; then, considering the positions possible at arbitrary depth, the detected features are projected onto the 2D image plane, and the system's localization module can calculate the robot position from odometry data and the extracted image features. Methods based on map building need to rely on a map of the global environment as the basis for navigation decisions; this kind of navigation method runs into problems when the environment changes. Saitoh et al. proposed a corridor center-line tracking method for a wheeled mobile robot implemented with a single USB camera and a laptop; the method detects the corridor boundaries and walls with the Hough transform, and the robot moves along the center line of the corridor.
The above methods cannot enable a robot to complete related tasks in relatively complex environments. One method used by automated guided vehicle navigation systems is guide-line navigation: in practical applications, the mobile robot moves along a pre-designed geometric shape to complete search and rescue tasks. Many researchers have proposed using a vision system on an autonomous vehicle to acquire and analyze images of a guide line laid on the ground, overcoming the limitations of other sensors.
At present, the carriers of visual navigation are mainly wheeled robots with fixed camera positions. A biped robot (such as the NAO robot) moves, like a human, through the motion of two legs; its control is more difficult, and its control accuracy can hardly reach that of a wheeled robot. However, given the human pursuit of humanoid robots, future humanoid robots will move on two legs, so research on visual navigation for biped robots is also of great significance.
Invention content
The purpose of the present invention is to overcome the deficiencies of the prior art by proposing a method for automatic path tracking of a biped robot. Navigation path image information is collected by a visual sensor; path information is obtained by graying the image, mean filtering, Canny edge detection and extraction of path edge coordinates, and its accuracy is improved by sectioned scanning; for path recognition at crossed paths, a slope-matching method is proposed to select the best forward path, giving good real-time performance and interference resistance.
The technical solution adopted by the present invention to solve the technical problems is:
A method for automatic path tracking of a biped robot, comprising:
A visual sensor on the biped robot collects navigation path image information of the area at the robot's feet and sends it to a processor;
The processor processes the received navigation path image information, obtains the relative position relation between the robot's current pose and the navigation path, and sends it to a controller; the relative position relation comprises an angular deviation and a position deviation;
The controller controls the robot to adjust its walking path according to the received angular deviation and position deviation, so as to achieve automatic tracking.
Preferably, the processor processes the received navigation path image information by:
converting the RGB color image into a grayscale image;
filtering the grayscale image with a mean filter;
performing edge detection on the filtered image with the Canny edge detection algorithm;
extracting path edge coordinates to obtain the center line of the path;
obtaining the relative position relation between the robot's current pose and the navigation path from the position of the center line.
Preferably, extracting the path edge coordinates to obtain the center line of the path comprises:
obtaining the edge coordinates of the path in the image by progressive scanning, and taking the column matrix Index of the midpoint positions of each row, as follows:
Index = (I1, I2, ..., Ii)^T
where Ii = (jiL + jiR) / 2 is the midpoint of the leftmost edge column jiL and the rightmost edge column jiR of row i, and f(i, j) denotes the two-dimensional array corresponding to the binary image processed by the Canny edge detection algorithm.
Preferably, if the navigation path is a straight or curved path, the center line of the path is obtained as follows:
in the image processing region, all pixels of the first row are processed to obtain the left edge point position and the right edge point position;
from the second row on, the left and right edge point positions of the previous row in the same frame of the path image are used to limit the search ranges for the left and right edge points of the adjacent next row, so as to obtain its left and right edge point positions;
the midpoints of the left and right edge point positions are connected in sequence to obtain the center line of the path.
Preferably, if the navigation path is a crossed path, the center line of the path is obtained as follows:
the image is split into three parts: bottom, middle and top;
a center row is taken between the top and middle parts to obtain the central point of the forward path there, and the points are connected to obtain the center line of the path.
Preferably, if the navigation path is a straight path, its angular deviation and position deviation are obtained as follows:
the distance from the bottom of the center line to the center of the image bottom edge is taken as the position deviation, and the angle between the center line and the Y axis is taken as the angular deviation.
Preferably, if the navigation path is a curved path, its angular deviation and position deviation are obtained as follows:
the distance from the bottom of the center line to the center of the image bottom edge is taken as the position deviation;
the ratio of the arc length of the path center line to its chord length is calculated as the curvature; the angle β between the tangent at the front end of the curve and the Y axis is obtained, and the angle α between the line joining the midpoints of the two ends of the curve and the Y axis is obtained; the angular deviation β - α is compensated according to the curvature.
Preferably, if the navigation path is a crossed path, the angular deviation and position deviation are obtained as follows:
the forward direction at the crossed path is judged;
if the forward direction is a straight path, the angular deviation and position deviation of a straight path are obtained;
if the forward direction is a curved path, the angular deviation and position deviation of a curved path are obtained.
Preferably, the method for judging the forward direction at a crossed path comprises:
letting O be the intersection point of the center lines of the two crossed paths and the center of a square region centered on that point, so that the four center lines intersect the edges of the square; the slopes of the four center lines through point O are calculated as k1, k2, k3 and k4, and the matching rate Kl,m is defined by the formula:
Kl,m = kl / km
where l ∈ [1, 4] and m ∈ [1, 4] are integers, and l ≠ m;
the magnitudes of the matching rates Kl,m are compared, and the center line whose Kl,m value is closest to 1 is selected as the forward path.
Preferably, the visual sensor is a camera.
Compared with the prior art, the present invention has the following advantages:
(1) The present invention collects navigation path image information with a visual sensor; path information is obtained by graying the image, mean filtering, Canny edge detection and extraction of path edge coordinates, and its accuracy is improved by sectioned scanning;
(2) When extracting path edge coordinates and obtaining the path center line, the present invention treats straight paths, curved paths and crossed paths separately and optimizes the processing for each, which greatly reduces the amount of data processing, improves real-time performance, and to a certain extent improves interference resistance;
(3) For crossed paths, the present invention proposes a slope-matching method, enabling the biped robot to select the best forward path during path recognition;
(4) The present invention performs visual navigation along a navigation route laid indoors; applying this method to a biped robot can simply and effectively cope with complex indoor environments, and it has certain research significance for future service robots entering households.
Description of the drawings
Fig. 1 is the flow diagram of the present invention;
Fig. 2 is the image recognition preprocessing flow chart of the present invention;
Fig. 3 is the experiment chart of three filters and three edge detection algorithms of the present invention;
Fig. 4 is the digital image schematic diagram of the present invention;
Fig. 5 is the real-time processing schematic diagram of path recognition of the present invention;
Fig. 6 is the real-time processing schematic diagram of a crossed path of the present invention;
Fig. 7 is the straight-line model of robot path navigation of the present invention;
Fig. 8 is the curve model of robot path navigation of the present invention;
Fig. 9 is the crossed-line model of robot path navigation of the present invention;
Figure 10 shows the initial standing posture of the NAO robot of the embodiment of the present invention and its field of view;
Figure 11 shows the NAO robot of the embodiment advancing along a straight line;
Figure 12 shows the NAO robot of the embodiment advancing along a circle;
Figure 13 shows the NAO robot of the embodiment advancing along a quadrilateral;
Figure 14 shows the NAO robot of the embodiment advancing along a crossed-line path.
Specific embodiment
Referring to Fig. 1, a method for automatic path tracking of a biped robot comprises:
Step 101: a visual sensor on the biped robot collects navigation path image information of the area at the robot's feet and sends it to a processor;
Step 102: the processor processes the received navigation path image information, obtains the relative position relation between the robot's current pose and the navigation path, and sends it to a controller; the relative position relation comprises an angular deviation and a position deviation;
Step 103: the controller controls the robot to adjust its walking path according to the received angular deviation and position deviation, so as to achieve automatic tracking.
It should be noted that the processor and controller in the above steps may be integrated on the biped robot or arranged separately; this embodiment places no particular limitation on this.
The processor processes the received navigation path image information by: converting the RGB color image into a grayscale image; filtering the grayscale image with a mean filter; performing edge detection on the filtered image with the Canny edge detection algorithm; extracting path edge coordinates to obtain the center line of the path; and obtaining the relative position relation between the robot's current pose and the navigation path from the position of the center line.
The principal visual sensor for vision-based autonomous tracking is the camera: the robot acquires path information through the camera, finds the edge information of the path by image recognition, and obtains the navigation parameters through the algorithm. The image recognition preprocessing flow is shown in Fig. 2. The speed of image processing, the resistance to noise interference and the accuracy of edge extraction are therefore preconditions for obtaining the navigation parameters.
In image processing, a color image needs to be converted into a grayscale image in advance before the relevant computation and recognition can be carried out. The conversion formula from an RGB color image to a grayscale image is as follows:
Gray = 0.299 × R + 0.587 × G + 0.114 × B
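A minimal NumPy sketch of this conversion (the weights above are the standard BT.601 luminance coefficients; the function name is illustrative, and in the embodiment's own toolchain `cv2.cvtColor` would normally perform this step):

```python
import numpy as np

def rgb_to_gray(rgb):
    """Convert an H x W x 3 RGB image (uint8) to grayscale using
    Gray = 0.299*R + 0.587*G + 0.114*B, rounded to uint8."""
    rgb = rgb.astype(np.float64)
    gray = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    return np.clip(np.round(gray), 0, 255).astype(np.uint8)
```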
The main purpose of path recognition is to detect the edges of the navigation path. Common edge detection algorithms include the Sobel operator, the Canny operator and the Laplace operator. Since edge detection algorithms are mainly based on the first and second derivatives of image intensity, and derivative computation is sensitive to noise, a filter must be used to improve the noise performance of the edge detector. Image filtering suppresses the noise of the target image while preserving image detail as much as possible; it is an indispensable operation in image preprocessing, and the quality of its result directly affects the validity and reliability of subsequent image processing and analysis. Common image filters include the Gaussian filter, the mean filter and the median filter.
The mean filter method selects, for the current pixel to be processed, a template composed of several of its neighbouring pixels, and replaces the value of the original pixel with the mean value of the template:
g(x, y) = (1/M) Σ f(i, j), summed over the pixels (i, j) of the template
where (x, y) is the current pixel to be processed and M is the total number of pixels in the template, including the current pixel.
Since path extraction is not demanding on the detail in the image, mean filtering is sufficient to remove the noise interference effectively.
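A sketch of the mean filter in NumPy (a k x k box template with edge replication at the borders; the border-handling choice is an assumption, as the original does not specify it):

```python
import numpy as np

def mean_filter(img, k=3):
    """Replace each pixel with the mean of its k x k neighbourhood,
    i.e. g(x, y) = (1/M) * sum over the template, with M = k*k.
    Borders are handled by edge replication."""
    pad = k // 2
    padded = np.pad(img.astype(np.float64), pad, mode="edge")
    out = np.zeros(img.shape, dtype=np.float64)
    # accumulate the k*k shifted copies of the padded image
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return (out / (k * k)).astype(np.uint8)
```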
The rough flow of the Canny edge detection algorithm is as follows:
(1) Convolve the image with a Gaussian filter:
S[i, j] = G[i, j; σ] * I[i, j]
(2) Compute the two partial-derivative arrays P and Q with first differences:
P[i, j] ≈ (S[i, j+1] - S[i, j] + S[i+1, j+1] - S[i+1, j]) / 2
Q[i, j] ≈ (S[i, j] - S[i+1, j] + S[i, j+1] - S[i+1, j+1]) / 2
(3) Compute the amplitude and azimuth:
M[i, j] = sqrt(P[i, j]^2 + Q[i, j]^2)
θ[i, j] = arctan(Q[i, j] / P[i, j])
(4) Non-maximum suppression: thin the ridge bands in the magnitude image, i.e. keep only the points where the amplitude is locally maximal. The variation range of the gradient angle is reduced to one of four sectors; the direction and amplitude are respectively:
ξ[i, j] = Sector(θ[i, j])
N[i, j] = NMS(M[i, j], ξ[i, j])
(5) Apply a threshold: set all values below the threshold to zero to obtain the edge array of the image.
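Steps (2) and (3) above can be sketched in NumPy as follows (`gradient_first_difference` is an illustrative name; `arctan2` is used instead of `arctan` to keep the azimuth well-defined when P = 0):

```python
import numpy as np

def gradient_first_difference(S):
    """Canny-style 2x2 first-difference partial derivatives P (d/dx)
    and Q (d/dy), plus gradient amplitude M and azimuth theta (radians).
    Output arrays are one row and one column smaller than S."""
    S = S.astype(np.float64)
    P = (S[:-1, 1:] - S[:-1, :-1] + S[1:, 1:] - S[1:, :-1]) / 2.0
    Q = (S[:-1, :-1] - S[1:, :-1] + S[:-1, 1:] - S[1:, 1:]) / 2.0
    M = np.hypot(P, Q)
    theta = np.arctan2(Q, P)
    return P, Q, M, theta
```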
Fig. 3 shows the experiment chart of the three filters and three edge detection algorithms. Comparing the three filters and the three edge detection results, it can be seen that the combination of the mean filter and the Canny operator gives the ideal effect: it filters out the tile gaps and the edge interference caused by illumination well. Ideal, continuous path edge information lays the foundation for accurate navigation of the robot.
The image obtained through preprocessing (grayscale conversion, mean filtering and Canny edge detection) is a binary image containing the path edge information. The two-dimensional numerical array of this binary image is in fact its gray values, and the size of the array is the resolution of the image. If the array is denoted F(i, j), as shown in Fig. 4, then the gray value at coordinate (i, j) is f(i, j). The function f(i, j) is a mathematical model of the digital image, sometimes also called the image function. The edge coordinates of the path in the image are obtained by progressive scanning, and the column matrix Index of the midpoint positions is taken as the main basis for computing the navigation parameters, as follows:
Index = (I1, I2, ..., Ii)^T
where Ii = (jiL + jiR) / 2 is the midpoint of the leftmost edge column jiL and the rightmost edge column jiR of row i, and f(i, j) denotes the two-dimensional array corresponding to the binary image processed by the Canny edge detection algorithm.
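The progressive scan that builds the Index column matrix can be sketched as follows (a sketch under the midpoint definition above; rows without edge pixels are recorded as `None`, which is an assumption about how missing rows are handled):

```python
import numpy as np

def centerline_index(edges):
    """Scan a binary edge image (values 0/255) row by row; for each row
    containing edge pixels, record the midpoint column of the leftmost
    and rightmost edge points: I_i = (j_left + j_right) / 2."""
    index = []
    for row in edges:
        cols = np.flatnonzero(row)
        if cols.size:
            index.append((cols[0] + cols[-1]) / 2.0)
        else:
            index.append(None)  # no path detected in this row
    return index
```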
While walking in real time, the robot must guarantee that the processing of one sampled image is completed before the next image sampling period arrives, i.e. it must guarantee the real-time performance of image processing. Therefore, while recognizing the navigation path accurately, the speed of path image recognition must also be guaranteed. Recognizing one frame of the path image completes all the steps shown in Fig. 2. This guarantees the robustness of recognition to a certain extent, but also brings a large amount of data processing, causing real-time performance to decline. Considering that the navigation path image is composed of pixels row by row, the continuity of the path means that the corresponding left and right edge point positions differ little between adjacent rows. For this reason, in addition to splitting the image into upper and lower processing regions to reduce the amount of image data processing, the edge positions of the previous row in the same frame of the path image are also used to limit the range of the edge points of the adjacent next row, thereby reducing the number of pixels processed in each row and improving real-time performance. The procedure is as follows:
(1) In the image processing region, the processing shown in Fig. 2 is carried out on all pixels of the first row to obtain the left and right edge points of the path. If none are detected (the region corresponding to that row contains no path), the next row is processed, until the left and right edge point positions L1 and R1 are detected, as shown in Fig. 5.
(2) After the edge points of the first row are detected, the second row is no longer processed in full as in step (1); instead a search-window width of f pixels is determined. Processing is carried out in the position range [L1 - f, R1 + f] of that row to obtain its edge points L2 and R2; the left and right edge points of the third row are then searched for in [L2 - f, R2 + f], and so on, until the left and right edge points of each row of the navigation path in the processing region are obtained. As long as the value of f is chosen suitably, the left and right edge points of a row can be found within the interval [L1 - f, R1 + f]. If f is too small, the edge points may not be found, or the wrong ones may be found; if f is too large, finding them is guaranteed, but the amount of computation increases, which is unfavourable to real-time performance.
The above method has two advantages: 1. it greatly reduces the number of pixels that need to be filtered and edge-detected, so the amount of data processing is greatly reduced and real-time performance is improved; 2. when detecting the edge points of the next row, the candidate edges are limited to a small candidate interval, which improves interference resistance to a certain extent.
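The windowed search of steps (1) and (2) can be sketched as follows (a sketch assuming the edge detector has already produced, per row, the list of candidate edge columns; function and parameter names are illustrative):

```python
def track_edges(edge_rows, f=5):
    """Given per-row lists of candidate edge-pixel columns, find the
    left/right path edges row by row. Once the first edge pair (L1, R1)
    is found, each subsequent search is restricted to the interval
    [L_prev - f, R_prev + f], which also rejects outlying interference."""
    result = []
    window = None  # (L, R) of the previous row, None before first detection
    for cols in edge_rows:
        if window is not None:
            lo, hi = window[0] - f, window[1] + f
            cols = [c for c in cols if lo <= c <= hi]
        if len(cols) >= 2:
            window = (min(cols), max(cols))
            result.append(window)
        else:
            result.append(None)  # no valid edge pair in this row
    return result
```

Note how a stray edge column far outside the window (e.g. a tile gap) is discarded before the min/max are taken.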
The above method can greatly reduce image processing time when there is no crossed path, but at a crossing it loses path information, causing the robot to advance incorrectly. A path judgment mechanism therefore needs to be introduced into the image processing, and the above real-time scheme is not used when a crossing is encountered. Instead the image is processed in three segments: it is split into bottom, middle and top parts, as shown in Fig. 6. This processing method greatly shortens processing time, but at the cost of losing path information. In order to speed up image processing as much as possible while obtaining more accurate path information, the method of taking an intermediate value is used: as shown in Fig. 6, the line joining the midpoints of the top and middle parts differs considerably from the actual path, so a center row is taken between the top and middle parts to obtain the central point of the forward path there, and the points are connected separately. In this way processing time is reduced and the estimated path is brought closer to the actual path.
On the basis of the obtained navigation path, the path tracking models of the present invention include the following three:
(1) Straight-line path tracking model
The simple straight-line path tracking model is shown in Fig. 7. In this model the navigation path captured by the biped robot's camera is regarded as a straight line; the edge lines of the path are obtained by the image recognition algorithm, and from them the center line of the path. The distance from the bottom of the center line to the center of the image bottom edge is the position deviation d, and the angle between the center line and the Y axis is the angular deviation α.
(2) Curved path tracking model
When walking on a curved path, the best tracking mode is to keep the robot's forward direction always tangent to the curve. Unlike in the straight-line model, the angular deviation α of the curved path tracking model cannot be calculated directly from the edge positions. The curved path tracking model is shown in Fig. 8. In this model, the direction of the line joining the midpoints of the two ends of the curve is V1, and the direction of the tangent at the front end of the curve is V0; if the method of the straight-line model were followed, the angle of the navigation parameters would be off by β - α. In order to compensate for this angular deviation of β - α, the present invention defines the curvature of the path as follows: the ratio of the arc length of the path center line to its chord length is the curvature of the path, and the closer the ratio is to 1, the closer the path is to a straight line. The introduction of the curvature allows better path tracking with good robustness.
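The arc-to-chord curvature measure can be sketched directly from sampled centerline points (a sketch; `math.dist` requires Python 3.8+, and the sampling density is up to the caller):

```python
import math

def path_curvature(points):
    """Curvature measure defined in the text: the ratio of the
    centerline's arc length (sum of segment lengths) to its chord
    length (distance between the endpoints). A value near 1 means
    the path is nearly straight."""
    arc = sum(math.dist(p, q) for p, q in zip(points, points[1:]))
    chord = math.dist(points[0], points[-1])
    return arc / chord
```

For a half circle the ratio approaches π/2 ≈ 1.571, clearly separating it from a straight segment's ratio of 1.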
(3) Crossed-path tracking model
The situation of two paths crossing is often encountered in autonomous tracking, and how the robot judges and selects the forward path is a problem to be solved. This paper presents a path selection algorithm based on slope matching, which can solve the problem of choosing the forward direction at a crossed path. The crossed-line path tracking model is shown in Fig. 9. In the figure, O is the intersection point of the center lines of the two crossed paths and the center of a square region centered on that point; the four center lines intersect the edges of the square and can be individually identified as 1, 2, 3 and 4, with slopes through point O of k1, k2, k3 and k4 respectively. The matching rate Kl,m is defined as:
Kl,m = kl / km
where l ∈ [1, 4] and m ∈ [1, 4] are integers, and l ≠ m;
The closer the value of the matching rate Kl,m is to 1, the greater the possibility that the two paths form one continuous path. With this slope-matching algorithm the forward path can be selected correctly, and the robot will not be at a loss when it encounters a crossing. Assuming the path currently walked is 1, K1,2, K1,3 and K1,4 are calculated in turn and compared, and the center line whose value among K1,2, K1,3 and K1,4 is closest to 1 is selected as the forward path.
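The selection rule can be sketched as follows (the slope-ratio form of Kl,m is inferred from the "closest to 1" criterion, as the original formula did not survive; the function name is illustrative):

```python
def select_forward_path(slopes, current=0):
    """Slope-matching rule: with K(l, m) = k_l / k_m, the center line
    whose slope ratio against the current path is closest to 1 is the
    most likely continuation, so its index is returned."""
    best, best_err = None, float("inf")
    for m, k in enumerate(slopes):
        if m == current:
            continue
        err = abs(k / slopes[current] - 1.0)
        if err < best_err:
            best, best_err = m, err
    return best
```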
In the present embodiment, the autonomous tracking of the biped robot is developed on the humanoid robot NAO, developed by the French company Aldebaran, as the hardware platform.
The NAO hardware is of recent design, which guarantees the fluency of NAO's movements, and it is equipped with multiple sensors. In addition, NAO can be programmed in multiple languages such as C++, Python and Java under operating systems such as Linux, Windows or Mac OS; the graphical programming software Choregraphe is also provided, with which users are free to exercise their imagination, write programs for NAO and teach it many actions.
The present embodiment requires vision development on the NAO robot. The head of the NAO robot has two cameras with a resolution of 1280x960 at up to 30 frames per second. The path information on the ground is the primary information resource for the NAO robot's advance; since distant image information has little utility value, the bottom camera of the NAO robot is selected, and the head of the NAO robot is rotated downward by a fixed angle, so that the path information underfoot can be obtained without the field of view being covered by the robot's own feet. The initialized standing posture and camera field of view are shown in Figure 10.
Choregraphe is a graphical programming software designed by Aldebaran for the NAO robot, and is very suitable for programming enthusiasts without a software development background. Meanwhile, the NAO robot also provides multi-language development environments, such as C++, Python and Java. The present embodiment uses Python 2.7 + OpenCV 2.4 for the software development of visual path tracking on the NAO robot. Python is a popular programming language with rich and powerful libraries, with which program software of application value can be written conveniently and efficiently. OpenCV is a cross-platform computer vision library distributed under the BSD (open source) license; it runs on the Linux, Windows, Android and Mac OS operating systems. It is lightweight and efficient, consisting of a series of C functions and a small number of C++ classes, while providing interfaces for languages such as Python, Ruby and MATLAB, and implements many general-purpose algorithms in image processing and computer vision.
The present embodiment is developed in Python; part of the code and pseudocode is given below.
(1) Initializing the NAO robot
The initialization procedure of the NAO robot keeps NAO in a fixed posture, selects the bottom camera, and sets the camera's color space, resolution and other parameters.
(2) Image acquisition and preprocessing
Images are acquired with getImageRemote through the ALVideoDeviceProxy proxy. Image filtering and edge extraction use functions from the OpenCV library.
Image acquisition and preprocessing code
Path detection algorithm pseudocode
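The pseudocode listing itself was also lost from this text. Assuming it follows the row-scanning scheme described in the claims (left/right edge points per row, midpoints forming the center line, deviations from the image center and the Y axis), a minimal NumPy sketch is given below; the function name detect_path and the line-fit step are our assumptions.

```python
import numpy as np

def detect_path(edges):
    """Row-by-row scan of a Canny edge image: take the left/right edge
    columns of each row, use their midpoint as the path center line,
    then derive the position and angular deviations."""
    h, w = edges.shape
    centers = []  # (row, midpoint column)
    for i in range(h):
        cols = np.flatnonzero(edges[i])
        if cols.size >= 2:
            centers.append((i, (cols[0] + cols[-1]) / 2.0))
    if len(centers) < 2:
        return None  # no usable path in view
    rows = np.array([r for r, _ in centers], float)
    mids = np.array([c for _, c in centers], float)
    # Fit column = a*row + b; the slope a gives the angle to the Y (row) axis.
    a, b = np.polyfit(rows, mids, 1)
    position_deviation = (a * (h - 1) + b) - (w - 1) / 2.0  # at image bottom
    angle_deviation = np.degrees(np.arctan(a))
    return position_deviation, angle_deviation
```

For a straight vertical stripe centered in the image, both deviations are near zero, which is the condition under which the robot walks straight ahead.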
On the basis of the above processing, the experiments of the present embodiment lay straight-line, circular, quadrilateral and crossing paths on the ground to verify the accuracy and stability of the theory. In the present embodiment the tests use black tape about 15 mm wide pasted on a light-colored marble floor. The line-tracking experiment of the NAO robot is shown in Figure 11, the circle-tracking test in Figure 12, the quadrilateral-route tracking test in Figure 13, and the complex-path tracking test in Figure 14. The experimental result data for the various paths are shown in Table 1 below.
Table 1
As can be seen from Table 1, the NAO robot tracks relatively simple paths with good stability and accuracy, while its tracking of complex paths still has certain defects to be improved. The failures mostly stem from shaking of the camera while the NAO robot walks and from interference by ambient light intensity; further improvement requires eliminating camera shake and strengthening the camera's resistance to external interference.
The above is only a preferred embodiment of the present invention and is not intended to limit the scope of the present invention; therefore, any subtle modifications, equivalent variations and improvements made to the above embodiment according to the technical spirit of the present invention still fall within the scope of the technical solution of the present invention.

Claims (10)

  1. A method of automatic line tracking for a biped robot, characterized by comprising:
    a visual sensor on the biped robot transmitting image information of the navigation path at the robot's feet to a processor;
    the processor processing the received navigation-path image information to obtain the relative position relation between the robot's current pose and the navigation path, and sending it to a controller, the relative position relation comprising an angular deviation and a position deviation;
    the controller controlling the robot to adjust its walking path according to the received angular deviation and position deviation, thereby realizing automatic tracking.
  2. The method of automatic line tracking for a biped robot according to claim 1, characterized in that the processor processing the received navigation-path image information comprises:
    converting the RGB color image into a grayscale image;
    filtering the grayscale image using a mean-filter method;
    performing edge detection on the filtered image using the Canny edge detection algorithm;
    extracting the path edge coordinates to obtain the center line of the path;
    obtaining the relative position relation between the robot's current pose and the navigation path according to the position of the center line.
  3. The method of automatic line tracking for a biped robot according to claim 2, characterized in that extracting the path edge coordinates to obtain the center line of the path comprises:
    obtaining the edge coordinates of the path in the image by progressive (row-by-row) scanning, and taking the column matrix Idnex of the mean center-line position of each row, as follows:
    Idnex=(I1, I2, ..., Ii)^T
    where Ii is the mean column position of the edge points in row i, and f(i, j) denotes the two-dimensional array corresponding to the binary image obtained after processing by the Canny edge detection algorithm.
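The exact expression for Ii was not reproduced in the source text; under the stated reading (Ii is the mean column of the edge pixels of row i of the binary image f), the column matrix Idnex can be sketched as follows (function name center_index is ours):

```python
import numpy as np

def center_index(f):
    """For a binary edge image f (rows i, columns j), return the column
    vector Idnex whose i-th entry is the mean column position of the edge
    pixels in row i (NaN where a row contains no edge pixels)."""
    h, _ = f.shape
    idx = np.full(h, np.nan)
    for i in range(h):
        js = np.flatnonzero(f[i])
        if js.size:
            idx[i] = js.mean()
    return idx
```

Rows without edge pixels are marked NaN rather than zero so that downstream steps can skip them.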
  4. The method of automatic line tracking for a biped robot according to claim 3, characterized in that, if the navigation path is a straight-line path or a curved path, the center line of the path is obtained as follows:
    in the image processing region, all pixels of the first row are processed to obtain the left-edge point position and the right-edge point position;
    from the second row onward, the left-edge and right-edge point positions of the previous row of the same frame of path image are used to limit the range of the left-edge and right-edge point positions of the adjacent next row, so as to obtain the respective left-edge and right-edge point positions;
    the midpoints of the left-edge and right-edge point positions are connected in turn to obtain the center line of the path.
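The constrained row-by-row edge tracing of claim 4 can be sketched as below. The search margin around the previous row's edge positions is not specified in the source, so margin is an assumed tuning parameter, and the function name trace_edges is ours.

```python
import numpy as np

def trace_edges(edges, margin=10):
    """Find the left/right edge columns of the first row over the whole
    width, then for each following row search only within +/- margin of
    the previous row's edge positions; the midpoints of the left and
    right edges form the path center line."""
    h, w = edges.shape
    centerline = []  # (row, midpoint column)
    prev_l = prev_r = None
    for i in range(h):
        if prev_l is None:  # first usable row: unconstrained search
            lo_l, hi_l, lo_r, hi_r = 0, w, 0, w
        else:
            lo_l, hi_l = max(prev_l - margin, 0), min(prev_l + margin + 1, w)
            lo_r, hi_r = max(prev_r - margin, 0), min(prev_r + margin + 1, w)
        ls = np.flatnonzero(edges[i, lo_l:hi_l])
        rs = np.flatnonzero(edges[i, lo_r:hi_r])
        if ls.size == 0 or rs.size == 0:
            continue  # no edge pair found in this row
        l, r = lo_l + ls[0], lo_r + rs[-1]
        centerline.append((i, (l + r) / 2.0))
        prev_l, prev_r = l, r
    return centerline
```

Restricting each row's search window to the neighborhood of the previous row's edges is what makes the tracing robust against isolated noise pixels elsewhere in the image.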
  5. The method of automatic line tracking for a biped robot according to claim 3, characterized in that, if the navigation path is a crossing path, the center line of the path is obtained as follows:
    the image is split into three parts: bottom, middle and top;
    a central row between the top and middle parts is taken to obtain the center point of the forward path, and connecting lines are drawn to obtain the center lines of the paths respectively.
  6. The method of automatic line tracking for a biped robot according to claim 2, characterized in that, if the navigation path is a straight-line path, the angular deviation and position deviation of the straight-line path are obtained as follows:
    the distance from the center line at the bottom of the image to the center of the image is taken as the position deviation, and the angle between the center line and the Y axis is taken as the angular deviation.
  7. The method of automatic line tracking for a biped robot according to claim 6, characterized in that, if the navigation path is a curved path, the angular deviation and position deviation of the curved path are obtained as follows:
    the distance from the center line at the bottom of the image to the center of the image is taken as the position deviation;
    the ratio of the arc length of the path center line to its chord length is calculated as the curvature; the angle β between the tangent at the front end of the curve and the Y axis is obtained, and the angle α between the line connecting the midpoints of the two ends of the curve and the Y axis is obtained; the angular deviation β−α is compensated according to the curvature.
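The curved-path quantities of claim 7 can be computed on a sampled center line as below. This is a sketch under our own reading: the exact compensation rule and the front-end tangent estimate (here, the last segment of the polyline) are not spelled out in the source, and the function name curve_deviation is ours.

```python
import numpy as np

def curve_deviation(points):
    """points: center-line samples (row, col), ordered from image top to
    bottom. Returns curvature (arc length / chord length, >= 1), beta
    (angle of the front-end tangent to the Y axis), alpha (angle of the
    end-to-end chord to the Y axis), and the deviation beta - alpha."""
    p = np.asarray(points, float)
    seg = np.diff(p, axis=0)
    arc = np.linalg.norm(seg, axis=1).sum()
    chord = np.linalg.norm(p[-1] - p[0])
    curvature = arc / chord
    drow, dcol = p[-1] - p[-2]                 # tangent at the front (bottom) end
    beta = np.degrees(np.arctan2(dcol, drow))  # angle to the Y (row) axis
    drow, dcol = p[-1] - p[0]
    alpha = np.degrees(np.arctan2(dcol, drow))
    return curvature, beta, alpha, beta - alpha
```

For a straight segment the curvature is exactly 1 and β = α, so the curved-path deviation degenerates to the straight-line case of claim 6.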
  8. The method of automatic line tracking for a biped robot according to claim 7, characterized in that, if the navigation path is a crossing path, the angular deviation and position deviation are obtained as follows:
    judging the advancing direction of the crossing path;
    if the advancing direction is a straight-line path, obtaining the angular deviation and position deviation of the straight-line path;
    if the advancing direction is a curved path, obtaining the angular deviation and position deviation of the curved path.
  9. The method of automatic line tracking for a biped robot according to claim 8, characterized in that the method of judging the advancing direction of the crossing path comprises:
    letting O be the intersection point of the center lines of the two crossing paths, and taking this central point as the center of a square region, the four center lines will intersect the edges of the square; the slopes of the four center lines through point O are calculated as k1, k2, k3 and k4, and the matching rate Kl,m is defined by the formula:
    where l ∈ [1,4] is an integer, m ∈ [1,4] is an integer, and l ≠ m;
    the matching rates Kl,m are compared in size, and the center line corresponding to the Kl,m whose value is closest to 1 is selected as the advancing path.
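The defining formula for Kl,m did not survive in the source text. A plausible reading, consistent with "value closest to 1", is Kl,m = kl/km, so that the pair of branches whose slopes are most nearly equal (i.e. the two collinear halves of one continuous path) is selected. The sketch below implements that assumed formula; the function name pick_forward_branch is ours.

```python
from itertools import permutations

def pick_forward_branch(slopes):
    """slopes: [k1, k2, k3, k4], the slopes of the four center lines
    through the crossing point O. Assumes K_{l,m} = k_l / k_m and picks
    the (l, m) pair (1-based) with K closest to 1, i.e. the two most
    nearly collinear branches."""
    best, best_pair = None, None
    for l, m in permutations(range(4), 2):
        if slopes[m] == 0:
            continue  # vertical-ratio degenerate case; skip
        k = slopes[l] / slopes[m]
        score = abs(k - 1.0)
        if best is None or score < best:
            best, best_pair = score, (l + 1, m + 1)
    return best_pair
```

In practice a crossing yields two nearly collinear pairs; breaking the tie between them (for instance, by preferring the branch ahead of the robot) is outside what the source describes.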
  10. The method of automatic line tracking for a biped robot according to claim 1, characterized in that the visual sensor is a camera.
CN201711306559.7A 2017-12-11 2017-12-11 A kind of method of biped robot's automatic tracking Pending CN108181897A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711306559.7A CN108181897A (en) 2017-12-11 2017-12-11 A kind of method of biped robot's automatic tracking


Publications (1)

Publication Number Publication Date
CN108181897A true CN108181897A (en) 2018-06-19

Family

ID=62545896

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711306559.7A Pending CN108181897A (en) 2017-12-11 2017-12-11 A kind of method of biped robot's automatic tracking

Country Status (1)

Country Link
CN (1) CN108181897A (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104007761A (en) * 2014-04-30 2014-08-27 宁波韦尔德斯凯勒智能科技有限公司 Visual servo robot tracking control method and device based on pose errors
KR20150144731A (en) * 2014-06-17 2015-12-28 주식회사 유진로봇 Apparatus for recognizing location mobile robot using edge based refinement and method thereof
CN105651286A (en) * 2016-02-26 2016-06-08 中国科学院宁波材料技术与工程研究所 Visual navigation method and system of mobile robot as well as warehouse system
CN106054886A (en) * 2016-06-27 2016-10-26 常熟理工学院 Automatic guiding transport vehicle route identification and control method based on visible light image
CN106990786A (en) * 2017-05-12 2017-07-28 中南大学 The tracking method of intelligent carriage
CN106985142A (en) * 2017-04-28 2017-07-28 东南大学 A kind of double vision for omni-directional mobile robots feels tracking device and method
CN108931240A (en) * 2018-03-06 2018-12-04 东南大学 A kind of path tracking sensor and tracking method based on electromagnetic induction
CN111062968A (en) * 2019-11-29 2020-04-24 河海大学 Robot path skeleton extraction method based on edge scanning and centerline extraction


Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
HAMID REZA RIAHI BAKHTIARI, ABOLFAZL ABDOLLAHI, HANI REZAEIAN: "Semi automatic road extraction from digital images", 《THE EGYPTIAN JOURNAL OF REMOTE SENSING AND SPACE SCIENCE》 *
LI-HONG JUANG,JIAN-SEN ZHANG: "Robust visual line-following navigation system for humanoid robots", 《ARTIFICIAL INTELLIGENCE REVIEW》 *
LI-HONG JUANG,JIAN-SEN ZHANG: "Visual Tracking Control of Humanoid Robot", 《IEEE ACCESS》 *
MENG WUSHENG: "Research on path recognition for a line-following smart vehicle based on a CMOS sensor", 《Mechatronics》 *
LI LINGZHI: "Research on AGV visual navigation based on image processing", 《China Master's Theses Full-text Database, Information Science and Technology》 *
HU CHANGHUI: "Research and application of path recognition technology with digital cameras", 《Programmable Controller & Factory Automation》 *

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109093625A (en) * 2018-09-11 2018-12-28 国网山东省电力公司莱芜供电公司 A kind of straight line path visual identity method for robot cruise
CN109324615A (en) * 2018-09-20 2019-02-12 深圳蓝胖子机器人有限公司 Office building delivery control method, device and computer readable storage medium
CN109269544A (en) * 2018-09-27 2019-01-25 中国人民解放军国防科技大学 Inspection system for suspension sensor of medium-low speed magnetic suspension vehicle
CN109269544B (en) * 2018-09-27 2021-01-29 中国人民解放军国防科技大学 Inspection system for suspension sensor of medium-low speed magnetic suspension vehicle
CN111093007A (en) * 2018-10-23 2020-05-01 辽宁石油化工大学 Walking control method and device for biped robot, storage medium and terminal
CN111093007B (en) * 2018-10-23 2021-04-06 辽宁石油化工大学 Walking control method and device for biped robot, storage medium and terminal
CN109828568A (en) * 2019-02-15 2019-05-31 武汉理工大学 Ball gait optimization method is sought to the NAO robot of RoboCup match
CN109828568B (en) * 2019-02-15 2022-04-15 武汉理工大学 NAO robot ball-searching gait optimization method for RoboCup game
CN109900266A (en) * 2019-03-27 2019-06-18 小驴机器人(武汉)有限公司 A kind of quick identification positioning method and system based on RGB-D and inertial navigation
CN110084825A (en) * 2019-04-16 2019-08-02 上海岚豹智能科技有限公司 A kind of method and system based on image edge information navigation
CN110032191A (en) * 2019-04-28 2019-07-19 中北大学 A kind of human emulated robot is quickly walked tracking avoidance implementation method
CN110989581A (en) * 2019-11-26 2020-04-10 广东博智林机器人有限公司 Method and device for controlling conveyance system, computer device, and storage medium
CN110989581B (en) * 2019-11-26 2023-04-07 广东博智林机器人有限公司 Method and device for controlling conveyance system, computer device, and storage medium
CN111028275A (en) * 2019-12-03 2020-04-17 扬州后潮科技有限公司 Tracing robot PID method based on cross-correlation image positioning matching
CN111028275B (en) * 2019-12-03 2024-01-30 内蒙古汇栋科技有限公司 Image positioning matching tracking robot PID method based on cross correlation
CN111398984A (en) * 2020-03-22 2020-07-10 华南理工大学 Self-adaptive laser radar point cloud correction and positioning method based on sweeping robot
CN111398984B (en) * 2020-03-22 2022-03-29 华南理工大学 Self-adaptive laser radar point cloud correction and positioning method based on sweeping robot
CN112720408A (en) * 2020-12-22 2021-04-30 江苏理工学院 Visual navigation control method for all-terrain robot
CN112720408B (en) * 2020-12-22 2022-07-08 江苏理工学院 Visual navigation control method for all-terrain robot
WO2022156755A1 (en) * 2021-01-21 2022-07-28 深圳市普渡科技有限公司 Indoor positioning method and apparatus, device, and computer-readable storage medium
CN112631312A (en) * 2021-03-08 2021-04-09 北京三快在线科技有限公司 Unmanned equipment control method and device, storage medium and electronic equipment
CN113139987A (en) * 2021-05-06 2021-07-20 太原科技大学 Visual tracking quadruped robot and tracking characteristic information extraction algorithm thereof
CN113381667A (en) * 2021-06-25 2021-09-10 哈尔滨工业大学 Seedling searching walking system and method based on ROS and image processing
CN113381667B (en) * 2021-06-25 2022-10-04 哈尔滨工业大学 Seedling searching walking system and method based on ROS and image processing
CN113721625A (en) * 2021-08-31 2021-11-30 平安科技(深圳)有限公司 AGV control method, device, equipment and storage medium
CN113721625B (en) * 2021-08-31 2023-07-18 平安科技(深圳)有限公司 AGV control method, device, equipment and storage medium
CN114281085A (en) * 2021-12-29 2022-04-05 福建汉特云智能科技有限公司 Robot tracking method and storage medium
CN114281085B (en) * 2021-12-29 2023-06-06 福建汉特云智能科技有限公司 Robot tracking method and storage medium
CN114639163A (en) * 2022-02-25 2022-06-17 纯米科技(上海)股份有限公司 Walking program scoring method, system, electronic device and storage medium
CN114639163B (en) * 2022-02-25 2024-06-07 纯米科技(上海)股份有限公司 Scoring method and scoring system for walking program, electronic device and storage medium

Similar Documents

Publication Publication Date Title
CN108181897A (en) A kind of method of biped robot's automatic tracking
Romero-Ramirez et al. Speeded up detection of squared fiducial markers
US7407104B2 (en) Two-dimensional code detector and program thereof, and robot control information generator and robot
CN102982557B (en) Method for processing space hand signal gesture command based on depth camera
US20220319011A1 (en) Heterogeneous Image Registration Method and System
US20160012600A1 (en) Image processing method, image processing apparatus, program, storage medium, production apparatus, and method of producing assembly
CN110009680B (en) Monocular image position and posture measuring method based on circle feature and different-surface feature points
JP6973444B2 (en) Control system, information processing device and control method
CN112396656A (en) Outdoor mobile robot pose estimation method based on fusion of vision and laser radar
Ma et al. Crlf: Automatic calibration and refinement based on line feature for lidar and camera in road scenes
CN111967337A (en) Pipeline line change detection method based on deep learning and unmanned aerial vehicle images
CN106338277A (en) Baseline-based building change detection method
CN109164802A (en) A kind of robot maze traveling method, device and robot
CN112184765A (en) Autonomous tracking method of underwater vehicle based on vision
Sandy et al. Object-based visual-inertial tracking for additive fabrication
Armagan et al. Accurate Camera Registration in Urban Environments Using High-Level Feature Matching.
Hou et al. A novel mobile robot navigation method based on hand-drawn paths
Li et al. A mobile robotic arm grasping system with autonomous navigation and object detection
Juang et al. Visual tracking control of humanoid robot
Xu et al. Object detection on robot operation system
Stronger et al. Selective visual attention for object detection on a legged robot
CN114782639A (en) Rapid differential latent AGV dense three-dimensional reconstruction method based on multi-sensor fusion
Chen et al. A hierarchical visual model for robot automatic arc welding guidance
Shirai Robot vision
CN113160318B (en) Monocular camera-based air refueling taper sleeve pose measurement method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180619