CN105509748A - Navigation method and apparatus for robot - Google Patents

Navigation method and apparatus for robot

Info

Publication number
CN105509748A
CN105509748A
Authority
CN
China
Prior art keywords
cloud data
robot
data collection
point
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201511016707.2A
Other languages
Chinese (zh)
Other versions
CN105509748B (en)
Inventor
欧勇盛
邢为之
江国来
张京林
吴新宇
冯伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Institute of Advanced Technology of CAS
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Institute of Advanced Technology of CAS filed Critical Shenzhen Institute of Advanced Technology of CAS
Priority to CN201511016707.2A priority Critical patent/CN105509748B/en
Publication of CN105509748A publication Critical patent/CN105509748A/en
Application granted granted Critical
Publication of CN105509748B publication Critical patent/CN105509748B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • G01C21/206Instruments for performing navigational calculations specially adapted for indoor navigation

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The invention is applicable to the technical field of robots, and provides a navigation method and apparatus for a robot. The navigation method comprises the following steps: inputting a point cloud data set Pk and the point cloud data set X of the previous frame acquired by a somatosensory sensor into a KD-tree matcher, searching X for the corresponding data point closest to each point of Pk, and generating a point-pair set Yk = C(Pk, X), where k is the iteration count, k is initialized to 0, and P0 represents the point cloud data set of the current frame acquired by the somatosensory sensor; carrying out registration of the point cloud data set Pk with the point cloud data set X; if a preset end condition has not been reached, setting k = k + 1 and returning to the operation of inputting the point cloud data set Pk and the point cloud data set X of the previous frame acquired by the somatosensory sensor into the KD-tree matcher; and if the preset end condition has been reached, correcting the state variation of the robot according to the current point cloud data set Pk. With the navigation method and apparatus provided by the embodiments of the invention, the manufacturing cost of the robot is greatly reduced.

Description

Navigation method and apparatus for robot
Technical field
The invention belongs to the technical field of robotics, and in particular relates to a navigation method and apparatus for a robot.
Background technology
Since mobile robots first appeared at Stanford University in the 1960s, with the development of science and technology their applications have become increasingly widespread, extending from industry to fields such as home, service, entertainment, and military use. The operation of a mobile robot is based on navigation: the robot needs a map to understand the environment it is in and to locate its own position. Traditional indoor mobile robots usually navigate with laser equipment; although laser navigation is highly accurate, such equipment costs tens of thousands, which easily drives up the manufacturing cost of the mobile robot.
Summary of the invention
In view of this, the embodiments of the present invention provide a navigation method and apparatus for a robot, to solve the problem that existing robot navigation technology leads to a high manufacturing cost of the robot.
In a first aspect, a navigation method for a robot is provided, a somatosensory sensor being built into the robot, and the method comprises:
inputting a point cloud data set Pk and the point cloud data set X of the previous frame collected by the somatosensory sensor into a KD-tree matcher, searching X for the corresponding data point closest to each point of Pk, and generating a point-pair set Yk = C(Pk, X), where k is the iteration count, initialized to k = 0, and P0 is the point cloud data set of the current frame collected by the somatosensory sensor;
carrying out registration of the point cloud data set Pk with the point cloud data set X;
if a preset end condition has not been reached, setting k = k + 1 and returning to the operation of inputting the point cloud data set Pk and the point cloud data set X of the previous frame collected by the somatosensory sensor into the KD-tree matcher;
if the preset end condition has been reached, correcting the state variation of the robot according to the current point cloud data set Pk.
In a second aspect, a navigation apparatus for a robot is provided, a somatosensory sensor being built into the robot, and the apparatus comprises:
an input unit, configured to input the point cloud data set Pk and the point cloud data set X of the previous frame collected by the somatosensory sensor into a KD-tree matcher, search X for the corresponding data point closest to each point of Pk, and generate a point-pair set Yk = C(Pk, X), where k is the iteration count, initialized to k = 0, and P0 is the point cloud data set of the current frame collected by the somatosensory sensor;
a registration unit, configured to carry out registration of the point cloud data set Pk with the point cloud data set X;
a first return unit, configured to, if a preset end condition has not been reached, set k = k + 1 and return to the operation performed by the input unit;
a first correction unit, configured to, if the preset end condition has been reached, correct the state variation of the robot according to the current point cloud data set Pk.
In the embodiments of the present invention, a somatosensory sensor is used in place of laser equipment to achieve pose correction during robot navigation, which greatly reduces the manufacturing cost of the robot.
Brief description of the drawings
In order to explain the technical solutions in the embodiments of the present invention more clearly, the drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; a person of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a structural block diagram of the robot hardware system provided by an embodiment of the present invention;
Fig. 2 is a structural block diagram of the robot software system provided by an embodiment of the present invention;
Fig. 3 is an implementation flowchart of the navigation method for a robot provided by an embodiment of the present invention;
Fig. 4 is a schematic diagram of the processing modules of the navigation method for a robot provided by an embodiment of the present invention;
Fig. 5 is an implementation flowchart of the navigation method for a robot provided by another embodiment of the present invention;
Fig. 6 is a schematic diagram of point set alignment in the navigation method for a robot provided by an embodiment of the present invention;
Fig. 7 is a structural block diagram of the navigation apparatus for a robot provided by an embodiment of the present invention.
Detailed description of the embodiments
In the following description, for the purpose of illustration rather than limitation, specific details such as particular system structures and techniques are set forth in order to provide a thorough understanding of the embodiments of the present invention. However, it will be clear to those skilled in the art that the present invention can also be implemented in other embodiments without these details. In other cases, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so that unnecessary detail does not obscure the description of the invention.
Fig. 1 shows a structural block diagram of the robot hardware system provided by an embodiment of the present invention; for convenience of explanation, only the parts related to this embodiment are shown.
As shown in Fig. 1, the hardware system of the robot mainly consists of the following components:
1. An industrial control computer: it carries the operating system, processes and stores data, and provides services for the other modules in the hardware system;
2. A somatosensory sensor: it collects the color and depth information of the environment in which the robot is located;
3. Motion sensors: they obtain the motion information of the robot, and include sensors such as an encoder (code disc) and a gyroscope;
4. A drive control unit: it inputs drive signals to the bottom-level controller to drive the robot to move;
5. Other modules: these include a display module, a communication module, and the communication interfaces between the modules.
Fig. 2 shows a structural block diagram of the robot software system provided by an embodiment of the present invention; for convenience of explanation, only the parts related to this embodiment are shown.
As shown in Fig. 2, the software system of the robot mainly consists, from bottom to top, of the following components:
1. The operating system: it is installed on the industrial control computer;
2. The robot drivers;
3. The Robot Operating System (ROS);
4. The navigation function package: it is installed within the Robot Operating System.
In the embodiments of the present invention, the industrial control computer is used as the main computing and processing unit; the somatosensory sensor and the encoder send the collected data to the industrial control computer, which builds the map and localizes the mobile robot through computation. During navigation, the mobile robot is driven to move by the bottom-level embedded control system.
Based on the software and hardware architecture of the robot shown in Fig. 1 and Fig. 2, the navigation method for a robot provided by the embodiments of the present invention is elaborated next.
Fig. 3 shows the implementation flow of the navigation method for a robot provided by an embodiment of the present invention, detailed as follows:
In S301, input the point cloud data set Pk and the point cloud data set X of the previous frame collected by the somatosensory sensor into a KD (K-dimensional) tree matcher, search X for the corresponding data point closest to each point of Pk, and generate a point-pair set Yk = C(Pk, X), where k is the iteration count, initialized to k = 0, and P0 is the point cloud data set of the current frame collected by the somatosensory sensor.
The somatosensory sensor captures images at 30 frames per second. Through an iterative closest point approach, the point cloud of the current frame collected by the somatosensory sensor is continuously matched against the point cloud of the previous frame to achieve further pose correction. As shown in Fig. 4, a data point filter takes a point cloud as input and outputs another, processed point cloud; its processing may include adding descriptors, reducing the number of data points by sampling, and so on. Several data point filters may also be used at the same time, and the user can select them as required.
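As one concrete illustration of such a data point filter, the sketch below reduces the number of points by keeping one point per voxel; the voxel size and the NumPy-based implementation are assumptions of this sketch, since the patent does not fix a particular filtering method.

```python
import numpy as np

def voxel_filter(points, voxel_size=0.02):
    """Illustrative data point filter: reduce the number of data points by
    sampling, here keeping one point per voxel of side voxel_size (assumed)."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, first = np.unique(keys, axis=0, return_index=True)  # first point in each occupied voxel
    return points[np.sort(first)]
```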
Further, after S301 and before S302, the method also comprises:
detecting the point-pair set Yk with an outlier filter, and removing from Yk the point pairs that do not satisfy a preset rule.
For example, it may be judged whether the distance between the two points of a pair exceeds a certain threshold; if it does, the pair is removed.
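A minimal sketch of S301 together with this outlier filter is given below, assuming 3-D point clouds stored as NumPy arrays and using SciPy's cKDTree as the KD-tree matcher; the 0.05 m rejection threshold is an assumed value, not one given in the patent.

```python
import numpy as np
from scipy.spatial import cKDTree

def match_points(P_k, X, max_dist=0.05):
    """For every point of P_k, find the closest point of the previous frame X
    with a KD-tree, then drop pairs whose distance exceeds max_dist (the
    preset rule of the outlier filter). Returns the surviving paired subsets."""
    tree = cKDTree(X)            # KD-tree matcher built over the previous frame
    dist, idx = tree.query(P_k)  # nearest neighbour in X for every point of P_k
    keep = dist < max_dist       # reject over-distant pairs
    return P_k[keep], X[idx[keep]]
```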
In S302, carry out registration of the point cloud data set Pk with the point cloud data set X.
Specifically, first compute the eigenvector q_R = [q0 q1 q2 q3]^T corresponding to the maximum eigenvalue of the matrix Q(Σ_py), where

Q(\Sigma_{py}) = \begin{bmatrix} \mathrm{tr}(\Sigma_{py}) & \Delta^{T} \\ \Delta & \Sigma_{py} + \Sigma_{py}^{T} - \mathrm{tr}(\Sigma_{py})\, I_{3} \end{bmatrix},

Σ_py is the cross-covariance matrix of the point sets Pk and Yk, I_3 is the 3 × 3 identity matrix, Δ = [A_23 A_31 A_12]^T, and A_ij = (Σ_py − Σ_py^T)_ij.
Next, compute the rotation matrix R and the translation vector q_t from the eigenvector q_R, obtaining the registration vector q = [q_R | q_t]^T, so as to register the point cloud data set Pk with the point cloud data set X. Here

R = \begin{bmatrix} q_0^2+q_1^2-q_2^2-q_3^2 & 2(q_1 q_2 - q_0 q_3) & 2(q_1 q_3 + q_0 q_2) \\ 2(q_1 q_2 + q_0 q_3) & q_0^2+q_2^2-q_1^2-q_3^2 & 2(q_2 q_3 - q_0 q_1) \\ 2(q_1 q_3 - q_0 q_2) & 2(q_2 q_3 + q_0 q_1) & q_0^2+q_3^2-q_1^2-q_2^2 \end{bmatrix},

q_t = μ_y − R(q_R) μ_p, where μ_y and μ_p are the centroids of the point sets Yk and Pk, respectively.
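The following sketch implements this registration step under the assumption of 3-D NumPy point arrays, taking the eigenvector of Q associated with its largest eigenvalue as the rotation quaternion (the standard quaternion-based closest-point formulation); it is illustrative code, not the patent's reference implementation.

```python
import numpy as np

def register(P, Y):
    """Quaternion-based registration (S302): return R and q_t that align the
    matched points P onto Y, built from the cross-covariance Sigma_py."""
    mu_p, mu_y = P.mean(axis=0), Y.mean(axis=0)
    Sigma = (P - mu_p).T @ (Y - mu_y) / len(P)          # cross-covariance Sigma_py
    A = Sigma - Sigma.T
    Delta = np.array([A[1, 2], A[2, 0], A[0, 1]])       # [A_23, A_31, A_12]
    Q = np.empty((4, 4))
    Q[0, 0] = np.trace(Sigma)
    Q[0, 1:] = Delta
    Q[1:, 0] = Delta
    Q[1:, 1:] = Sigma + Sigma.T - np.trace(Sigma) * np.eye(3)
    _, vecs = np.linalg.eigh(Q)                         # Q is symmetric
    q0, q1, q2, q3 = vecs[:, -1]                        # eigenvector of the max eigenvalue
    R = np.array([
        [q0*q0+q1*q1-q2*q2-q3*q3, 2*(q1*q2-q0*q3),         2*(q1*q3+q0*q2)],
        [2*(q1*q2+q0*q3),         q0*q0+q2*q2-q1*q1-q3*q3, 2*(q2*q3-q0*q1)],
        [2*(q1*q3-q0*q2),         2*(q2*q3+q0*q1),         q0*q0+q3*q3-q1*q1-q2*q2],
    ])
    q_t = mu_y - R @ mu_p                               # translation vector
    return R, q_t
```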
In S303, if the preset end condition has not been reached, set k = k + 1 and return to the operation of inputting the point cloud data set Pk and the point cloud data set X of the previous frame collected by the somatosensory sensor into the KD-tree matcher.
In S304, if the preset end condition has been reached, correct the state variation of the robot according to the current point cloud data set Pk.
In the embodiments of the present invention, the preset end condition comprises:
the value of k has exceeded a preset number of iterations; or,
the error of the point cloud data set Pk with respect to the point cloud data set X is less than a preset error value.
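Putting S301–S304 together, a compact sketch of the outer iteration is shown below; it reuses the match_points and register sketches above, and the iteration budget and error threshold are assumed stand-ins for the unspecified preset end condition.

```python
import numpy as np

def icp_pose_update(P0, X, max_iters=30, tol=1e-4):
    """Outer loop of S301-S304: alternate KD-tree matching and registration
    until a preset end condition is met, accumulating the frame-to-frame
    transform that would then be used to correct the robot's state variation."""
    P_k = P0.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    for k in range(max_iters):                       # end condition 1: iteration budget
        P_sub, Y_k = match_points(P_k, X)            # S301: pairing with outlier filter
        R, t = register(P_sub, Y_k)                  # S302: registration
        P_k = P_k @ R.T + t                          # move the whole cloud
        R_total, t_total = R @ R_total, R @ t_total + t
        err = np.linalg.norm(P_sub @ R.T + t - Y_k, axis=1).mean()
        if err < tol:                                # end condition 2: error small enough
            break
    return R_total, t_total, P_k                     # S304: correct the pose from the result
```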
Further, while the robot is moving, its motion sensors can obtain the state variation of the robot per unit time. Typically, owing to factors such as slipping and skidding between the robot's wheels and a smooth floor, the confidence of the state variation obtained by the motion sensors is not high. Therefore, in the embodiments of the present invention, the method shown in Fig. 5 is used to increase the confidence of the state variation obtained by the motion sensors:
In S501, obtain the correction parameters of the robot's motion sensors, the correction parameters comprising a linear correction parameter and a rotational correction parameter.
For example, the calibration program carried by the robot's built-in gyroscope can be used; after repeated verification, the linear correction parameter and the rotational correction parameter of the gyroscope are determined.
In S502, according to the state of each movement of the robot, calculate the possible direction and magnitude of the robot's error increment during that movement, merge it with the state variation obtained by the motion sensors, and compute a revised state variation.
In S503, merge the correction parameters with the revised state variation to estimate the state variation of the robot.
The state variation finally obtained is a value with higher confidence.
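The patent leaves the exact fusion rule of S502 and S503 open; as a loose illustration only, the sketch below applies an assumed additive error increment and assumed multiplicative linear/rotational correction parameters to a planar odometry increment (dx, dy, dtheta).

```python
import numpy as np

def estimate_state_variation(raw_delta, error_increment, k_lin, k_rot):
    """Loose sketch of S502-S503 under assumed forms: raw_delta is the (dx, dy,
    dtheta) increment from the motion sensors, error_increment the per-motion
    error estimate, k_lin / k_rot the calibrated correction parameters."""
    dx, dy, dtheta = np.asarray(raw_delta) + np.asarray(error_increment)   # S502: merge
    return np.array([k_lin * dx, k_lin * dy, k_rot * dtheta])              # S503: correct
```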
Further, suppose the robot performs a scan S_ref at an initial pose P_ref, then reaches a new pose P_new after a small motion and performs another scan S_new at that pose. Usually, P_ref and P_new can be obtained from the motion sensors; however, in order to further improve the accuracy of P_ref and P_new, the points of the two scans S_ref and S_new can be aligned to obtain a more accurate relation between P_ref and P_new, which is used to correct the state variation obtained by the motion sensors, as follows:
Step 1: Initialize a preset threshold e, and initialize the parameters l = 0, d_l = d_0 = 0.
As an example, the preset threshold e can be set to 0.0001.
Step 2: For each point P_n of the robot's new scan S_new, find in the robot's initial scan S_ref the point P_r that is closest to it and whose distance is less than the preset threshold, forming the point-pair set {(P_n^l, P_r)}.
Step 3: Calculate the rotation matrix R_w and the translation matrix T of the point-pair set, and compute P_n^{l+1} = R_w P_n^l + T.
Step 4: Calculate the mean pairing distance d_{l+1} = (1/num) Σ_n ||P_r − P_n^{l+1}||, where num is the number of pairs in the point-pair set.
Step 5: If |d_l − d_{l+1}| < e, output R_w and T, and correct the state variation of the robot according to R_w and T.
Step 6: If |d_l − d_{l+1}| ≥ e, set l = l + 1 and return to Step 2.
With the above algorithm, the transformation matrices between two adjacent state variations can be obtained from the scans of two adjacent frames, yielding a corrected state variation of the robot. As shown in Fig. 6, for each point on S_new, the closest corresponding point on S_ref is found; if the distance between them is less than the set threshold, they are regarded as a matching pair. The rotation matrix R_w and the translation matrix T between the matching points are then calculated iteratively, so that the matching points become better aligned and the distance function between them is reduced.
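A sketch of Steps 1–6 is given below, reusing the register() function from the registration sketch above; the pairing distance threshold and iteration cap are assumed values not fixed in the text, and 3-D NumPy arrays are assumed for the scans.

```python
import numpy as np
from scipy.spatial import cKDTree

def align_scans(S_new, S_ref, e=1e-4, pair_dist=0.1, max_iters=50):
    """Sketch of Steps 1-6: iteratively pair each point of the new scan S_new
    with its closest point in S_ref (within a threshold), estimate R_w and T,
    and stop once the mean pairing distance d changes by less than e."""
    tree = cKDTree(S_ref)
    P = S_new.copy()
    R_total, T_total = np.eye(3), np.zeros(3)
    d_prev = 0.0                                      # Step 1: d_0 = 0
    for _ in range(max_iters):
        dist, idx = tree.query(P)                     # Step 2: closest points in S_ref
        keep = dist < pair_dist
        R_w, T = register(P[keep], S_ref[idx[keep]])  # Step 3: estimate R_w, T
        P = P @ R_w.T + T                             # P_n^{l+1} = R_w P_n^l + T
        d = np.linalg.norm(P[keep] - S_ref[idx[keep]], axis=1).mean()  # Step 4
        R_total, T_total = R_w @ R_total, R_w @ T_total + T
        if abs(d - d_prev) < e:                       # Step 5: converged, output R_w, T
            break
        d_prev = d                                    # Step 6: otherwise iterate
    return R_total, T_total                           # used to correct the pose change
```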
In the embodiments of the present invention, a somatosensory sensor is used in place of laser equipment to achieve pose correction during robot navigation, which greatly reduces the manufacturing cost of the robot; at the same time, the strategy for correcting the motion sensor measurements and the fusion of the multiple matching algorithms also help to draw the environment map more accurately and to determine the state of the robot.
It should be understood that the sequence numbers of the steps in the above embodiments do not imply an order of execution; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present invention.
Corresponding to the navigation method for a robot described in the foregoing embodiments, Fig. 7 shows a structural block diagram of the navigation apparatus for a robot provided by an embodiment of the present invention. For convenience of explanation, only the parts related to this embodiment are shown.
Referring to Fig. 7, the apparatus comprises:
an input unit 71, which inputs the point cloud data set Pk and the point cloud data set X of the previous frame collected by the somatosensory sensor into a KD-tree matcher, searches X for the corresponding data point closest to each point of Pk, and generates a point-pair set Yk = C(Pk, X), where k is the iteration count, initialized to k = 0, and P0 is the point cloud data set of the current frame collected by the somatosensory sensor;
a registration unit 72, which carries out registration of the point cloud data set Pk with the point cloud data set X;
a first return unit 73, which, if the preset end condition has not been reached, sets k = k + 1 and returns to the operation performed by the input unit;
a first correction unit 74, which, if the preset end condition has been reached, corrects the state variation of the robot according to the current point cloud data set Pk.
Optionally, the apparatus also comprises:
an acquisition unit, which obtains the correction parameters of the robot's motion sensors, the correction parameters comprising a linear correction parameter and a rotational correction parameter;
a first calculation unit, which, according to the state of each movement of the robot, calculates the possible direction and magnitude of the robot's error increment during that movement, merges it with the state variation obtained by the motion sensors, and computes a revised state variation;
an estimation unit, which merges the correction parameters with the revised state variation to estimate the state variation of the robot.
Optionally, the apparatus also comprises:
an initialization unit, which initializes a preset threshold e and initializes the parameters l = 0, d_l = d_0 = 0;
a search unit, which, for each point P_n of the robot's new scan S_new, finds in the robot's initial scan S_ref the point P_r that is closest to it and whose distance is less than the preset threshold, forming the point-pair set {(P_n^l, P_r)};
a second calculation unit, which calculates the rotation matrix R_w and the translation matrix T of the point-pair set, and computes P_n^{l+1} = R_w P_n^l + T;
a third calculation unit, which calculates d_{l+1} = (1/num) Σ_n ||P_r − P_n^{l+1}||, where num is the number of pairs in the point-pair set;
an output unit, which, if |d_l − d_{l+1}| < e, outputs R_w and T, and corrects the state variation of the robot according to R_w and T;
a second return unit, which, if |d_l − d_{l+1}| ≥ e, sets l = l + 1 and returns to the operation performed by the search unit.
Optionally, the preset end condition comprises:
the value of k has exceeded a preset number of iterations; or,
the error of the point cloud data set Pk with respect to the point cloud data set X is less than a preset error value.
Optionally, the apparatus also comprises:
a removal unit, which detects the point-pair set Yk and removes from Yk the point pairs that do not satisfy a preset rule.
Those skilled in the art can clearly understand that, for convenience and brevity of description, only the division into the above functional units and modules is used for illustration. In practical applications, the above functions may be assigned to and completed by different functional units or modules as required; that is, the internal structure of the apparatus may be divided into different functional units or modules to complete all or part of the functions described above. The functional units in the embodiments may be integrated into one processing unit, or each unit may exist physically on its own, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for ease of mutual distinction and do not limit the scope of protection of the present application. For the specific working process of the units and modules in the above system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented by electronic hardware or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the particular application and the design constraints of the technical solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such implementations should not be regarded as going beyond the scope of the present invention.
In the embodiments provided by the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the system embodiments described above are merely illustrative; for instance, the division of the modules or units is only a logical functional division, and there may be other divisions in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be ignored or not carried out. Furthermore, the mutual couplings, direct couplings, or communication connections shown or discussed may be implemented through some interfaces, and the indirect couplings or communication connections between devices or units may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network elements. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist physically on its own, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on such an understanding, the essence of the technical solution of the embodiments of the present invention, or the part that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments can still be modified, or some of their technical features can be replaced by equivalents; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention.
The above are only preferred embodiments of the present invention and are not intended to limit the present invention. Any modifications, equivalent replacements, improvements, and the like made within the spirit and principles of the present invention shall be included within the scope of protection of the present invention.

Claims (10)

1. A navigation method for a robot, characterized in that a somatosensory sensor is built into the robot, and the method comprises:
inputting a point cloud data set Pk and the point cloud data set X of the previous frame collected by the somatosensory sensor into a KD-tree matcher, searching X for the corresponding data point closest to each point of Pk, and generating a point-pair set Yk = C(Pk, X), where k is the iteration count, initialized to k = 0, and P0 is the point cloud data set of the current frame collected by the somatosensory sensor;
carrying out registration of the point cloud data set Pk with the point cloud data set X;
if a preset end condition has not been reached, setting k = k + 1 and returning to the operation of inputting the point cloud data set Pk and the point cloud data set X of the previous frame collected by the somatosensory sensor into the KD-tree matcher;
if the preset end condition has been reached, correcting the state variation of the robot according to the current point cloud data set Pk.
2. The method as claimed in claim 1, characterized in that, before inputting the point cloud data set Pk and the point cloud data set X of the previous frame collected by the somatosensory sensor into the KD-tree matcher, the method further comprises:
obtaining correction parameters of motion sensors of the robot, the correction parameters comprising a linear correction parameter and a rotational correction parameter;
according to the state of each movement of the robot, calculating the possible direction and magnitude of the robot's error increment during that movement, merging it with the state variation obtained by the motion sensors, and computing a revised state variation;
merging the correction parameters with the revised state variation to estimate the state variation of the robot.
3. The method as claimed in claim 1, characterized in that the method further comprises:
initializing a preset threshold e, and initializing the parameters l = 0, d_l = d_0 = 0;
for each point P_n of the robot's new scan S_new, finding in the robot's initial scan S_ref the point P_r that is closest to it and whose distance is less than the preset threshold, forming a point-pair set {(P_n^l, P_r)};
calculating the rotation matrix R_w and the translation matrix T of the point-pair set, and computing P_n^{l+1} = R_w P_n^l + T;
calculating d_{l+1} = (1/num) Σ_n ||P_r − P_n^{l+1}||, where num is the number of pairs in the point-pair set;
if |d_l − d_{l+1}| < e, outputting R_w and T, and correcting the state variation of the robot according to R_w and T;
if |d_l − d_{l+1}| ≥ e, setting l = l + 1 and returning to the operation of, for each point P_n of the robot's new scan S_new, finding in the robot's initial scan S_ref the point P_r that is closest to it and whose distance is less than the preset threshold, forming a point-pair set.
4. The method as claimed in claim 1, characterized in that the preset end condition comprises:
the value of k has exceeded a preset number of iterations; or,
the error of the point cloud data set Pk with respect to the point cloud data set X is less than a preset error value.
5. The method as claimed in claim 1, characterized in that, after inputting the point cloud data set Pk and the point cloud data set X of the previous frame collected by the somatosensory sensor into the KD-tree matcher, searching X for the corresponding data point closest to each point of Pk, and generating the point-pair set Yk = C(Pk, X), and before carrying out registration of the point cloud data set Pk with the point cloud data set X, the method further comprises:
detecting the point-pair set Yk and removing from Yk the point pairs that do not satisfy a preset rule.
6. A navigation apparatus for a robot, characterized in that a somatosensory sensor is built into the robot, and the apparatus comprises:
an input unit, configured to input a point cloud data set Pk and the point cloud data set X of the previous frame collected by the somatosensory sensor into a KD-tree matcher, search X for the corresponding data point closest to each point of Pk, and generate a point-pair set Yk = C(Pk, X), where k is the iteration count, initialized to k = 0, and P0 is the point cloud data set of the current frame collected by the somatosensory sensor;
a registration unit, configured to carry out registration of the point cloud data set Pk with the point cloud data set X;
a first return unit, configured to, if a preset end condition has not been reached, set k = k + 1 and return to the operation performed by the input unit;
a first correction unit, configured to, if the preset end condition has been reached, correct the state variation of the robot according to the current point cloud data set Pk.
7. The apparatus as claimed in claim 6, characterized in that the apparatus further comprises:
an acquisition unit, configured to obtain correction parameters of motion sensors of the robot, the correction parameters comprising a linear correction parameter and a rotational correction parameter;
a first calculation unit, configured to, according to the state of each movement of the robot, calculate the possible direction and magnitude of the robot's error increment during that movement, merge it with the state variation obtained by the motion sensors, and compute a revised state variation;
an estimation unit, configured to merge the correction parameters with the revised state variation to estimate the state variation of the robot.
8. The apparatus as claimed in claim 6, characterized in that the apparatus further comprises:
an initialization unit, configured to initialize a preset threshold e and initialize the parameters l = 0, d_l = d_0 = 0;
a search unit, configured to, for each point P_n of the robot's new scan S_new, find in the robot's initial scan S_ref the point P_r that is closest to it and whose distance is less than the preset threshold, forming a point-pair set {(P_n^l, P_r)};
a second calculation unit, configured to calculate the rotation matrix R_w and the translation matrix T of the point-pair set, and compute P_n^{l+1} = R_w P_n^l + T;
a third calculation unit, configured to calculate d_{l+1} = (1/num) Σ_n ||P_r − P_n^{l+1}||, where num is the number of pairs in the point-pair set;
an output unit, configured to, if |d_l − d_{l+1}| < e, output R_w and T, and correct the state variation of the robot according to R_w and T;
a second return unit, configured to, if |d_l − d_{l+1}| ≥ e, set l = l + 1 and return to the operation performed by the search unit.
9. The apparatus as claimed in claim 6, characterized in that the preset end condition comprises:
the value of k has exceeded a preset number of iterations; or,
the error of the point cloud data set Pk with respect to the point cloud data set X is less than a preset error value.
10. The apparatus as claimed in claim 6, characterized in that the apparatus further comprises:
a removal unit, configured to detect the point-pair set Yk and remove from Yk the point pairs that do not satisfy a preset rule.
CN201511016707.2A 2015-12-29 2015-12-29 Navigation method and apparatus for robot Active CN105509748B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201511016707.2A CN105509748B (en) 2015-12-29 2015-12-29 Navigation method and apparatus for robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201511016707.2A CN105509748B (en) 2015-12-29 2015-12-29 Navigation method and apparatus for robot

Publications (2)

Publication Number Publication Date
CN105509748A true CN105509748A (en) 2016-04-20
CN105509748B CN105509748B (en) 2019-03-01

Family

ID=55717903

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201511016707.2A Active CN105509748B (en) 2015-12-29 2015-12-29 Navigation method and apparatus for robot

Country Status (1)

Country Link
CN (1) CN105509748B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105955275A (en) * 2016-05-26 2016-09-21 华讯方舟科技有限公司 Robot path programming method and system
CN108053446A (en) * 2017-12-11 2018-05-18 北京奇虎科技有限公司 Localization method, device and electronic equipment based on cloud
CN108334080A (en) * 2018-01-18 2018-07-27 大连理工大学 A kind of virtual wall automatic generation method for robot navigation
CN111473785A (en) * 2020-06-28 2020-07-31 北京云迹科技有限公司 Method and device for adjusting relative pose of robot to map
CN112650250A (en) * 2020-12-23 2021-04-13 深圳市杉川机器人有限公司 Map construction method and robot

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103106688A (en) * 2013-02-20 2013-05-15 北京工业大学 Indoor three-dimensional scene rebuilding method based on double-layer rectification method
CN103247225A (en) * 2012-02-13 2013-08-14 联想(北京)有限公司 Instant positioning and map building method and equipment
CN103729882A (en) * 2013-12-30 2014-04-16 浙江大学 Point cloud relative pose estimation method based on three-dimensional curve matching
CN104236548A (en) * 2014-09-12 2014-12-24 清华大学 Indoor autonomous navigation method for micro unmanned aerial vehicle
CN104463894A (en) * 2014-12-26 2015-03-25 山东理工大学 Overall registering method for global optimization of multi-view three-dimensional laser point clouds
AU2012314067B2 (en) * 2011-09-30 2015-05-21 Oxa Autonomy Ltd Localising transportable apparatus
CN104715469A (en) * 2013-12-13 2015-06-17 联想(北京)有限公司 Data processing method and electronic device

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2012314067B2 (en) * 2011-09-30 2015-05-21 Oxa Autonomy Ltd Localising transportable apparatus
CN103247225A (en) * 2012-02-13 2013-08-14 联想(北京)有限公司 Instant positioning and map building method and equipment
CN103106688A (en) * 2013-02-20 2013-05-15 北京工业大学 Indoor three-dimensional scene rebuilding method based on double-layer rectification method
CN104715469A (en) * 2013-12-13 2015-06-17 联想(北京)有限公司 Data processing method and electronic device
CN103729882A (en) * 2013-12-30 2014-04-16 浙江大学 Point cloud relative pose estimation method based on three-dimensional curve matching
CN104236548A (en) * 2014-09-12 2014-12-24 清华大学 Indoor autonomous navigation method for micro unmanned aerial vehicle
CN104463894A (en) * 2014-12-26 2015-03-25 山东理工大学 Overall registering method for global optimization of multi-view three-dimensional laser point clouds

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105955275A (en) * 2016-05-26 2016-09-21 华讯方舟科技有限公司 Robot path programming method and system
CN105955275B (en) * 2016-05-26 2021-07-13 华讯方舟科技有限公司 Robot path planning method and system
CN108053446A (en) * 2017-12-11 2018-05-18 北京奇虎科技有限公司 Localization method, device and electronic equipment based on cloud
CN108334080A (en) * 2018-01-18 2018-07-27 大连理工大学 A kind of virtual wall automatic generation method for robot navigation
CN111473785A (en) * 2020-06-28 2020-07-31 北京云迹科技有限公司 Method and device for adjusting relative pose of robot to map
CN111473785B (en) * 2020-06-28 2020-09-25 北京云迹科技有限公司 Method and device for adjusting relative pose of robot to map
CN112650250A (en) * 2020-12-23 2021-04-13 深圳市杉川机器人有限公司 Map construction method and robot

Also Published As

Publication number Publication date
CN105509748B (en) 2019-03-01

Similar Documents

Publication Publication Date Title
US10852139B2 (en) Positioning method, positioning device, and robot
CN105509748A (en) Navigation method and apparatus for robot
US9946264B2 (en) Autonomous navigation using visual odometry
KR102257610B1 (en) EXTRINSIC CALIBRATION METHOD OF PLURALITY OF 3D LiDAR SENSORS FOR AUTONOMOUS NAVIGATION SYSTEM
CN108844553B (en) Method and device for correcting mileage in robot moving process and robot
CN106104656B (en) Map information generating systems, method and program
JP7085296B2 (en) Robot repositioning method
Liu et al. Stereo visual-inertial odometry with multiple Kalman filters ensemble
Zhang et al. Visual-inertial odometry on chip: An algorithm-and-hardware co-design approach
EP3159123A1 (en) Device for controlling driving of mobile robot having wide-angle cameras mounted thereon, and method therefor
CN107677279A (en) It is a kind of to position the method and system for building figure
CN107741234A (en) The offline map structuring and localization method of a kind of view-based access control model
CN112634451A (en) Outdoor large-scene three-dimensional mapping method integrating multiple sensors
EP3974778A1 (en) Method and apparatus for updating working map of mobile robot, and storage medium
Siagian et al. Mobile robot navigation system in outdoor pedestrian environment using vision-based road recognition
US20230264765A1 (en) Method for estimating pose of humanoid robot, humanoid robot and computer-readable storage medium
CN105469405A (en) Visual ranging-based simultaneous localization and map construction method
CN105258702A (en) Global positioning method based on SLAM navigation mobile robot
CN112837352A (en) Image-based data processing method, device and equipment, automobile and storage medium
US20230415333A1 (en) Center of mass planning method for robot, robot and computer-readable storage medium
US11420694B2 (en) Robot gait planning method and robot with the same
US20230271656A1 (en) Robot state estimation method, computer-readable storage medium, and legged robot
CN110887493A (en) Trajectory estimation method, medium, terminal and device based on local map matching
WO2023024539A1 (en) Path navigation planning method and apparatus, storage medium, and electronic device
CN110634183A (en) Map construction method and device and unmanned equipment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant