CN108332752A - Method and device for robot indoor positioning - Google Patents

Method and device for robot indoor positioning

Info

Publication number
CN108332752A
CN108332752A (application CN201810020362.5A; granted as CN108332752B)
Authority
CN
China
Prior art keywords
robot
speckle
landmark
pose
parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810020362.5A
Other languages
Chinese (zh)
Other versions
CN108332752B (en)
Inventor
王声平
张立新
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Infinite Power Development Co., Ltd.
Original Assignee
Shenzhen Water World Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Water World Co Ltd filed Critical Shenzhen Water World Co Ltd
Priority to CN201810020362.5A priority Critical patent/CN108332752B/en
Publication of CN108332752A publication Critical patent/CN108332752A/en
Application granted granted Critical
Publication of CN108332752B publication Critical patent/CN108332752B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/206 Instruments for performing navigational calculations specially adapted for indoor navigation
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention discloses a method and device for robot indoor positioning. The method for robot indoor positioning provided by the invention includes: identifying speckle landmarks pre-set on an indoor ceiling; forming a topological graph of image positions according to the speckle landmarks and changes in the robot's position; converting the image coordinates in the topological graph into corresponding positions in the robot coordinate system; and locating the robot according to the corresponding positions in the robot coordinate system. By using the speckle landmarks pre-set on the ceiling as position identification marks, the invention facilitates image recognition, reduces the complexity and computation of the front-end image processing algorithm, improves the real-time performance of the positioning system, and thereby improves the accuracy of robot indoor positioning.

Description

Method and device for robot indoor positioning
Technical field
The present invention relates to the field of robotics, and in particular to a method and device for robot indoor positioning.
Background technology
Simultaneous Localization and Mapping (SLAM) is a well-known problem in the field of robotics. A mobile robot equipped with sensors estimates a map of the environment and the position of the robot from the sensor measurement data. The sensors can be divided into external and internal sensors: external sensors measure the robot's motion relative to the external environment and include cameras, laser scanners, accelerometers and GPS; internal sensors measure the change of the robot's state (position) relative to its own previous moment and include odometers and gyroscopes. In traditional SLAM methods, such as EKF-SLAM (Extended Kalman Filter Simultaneous Localization and Mapping, i.e. simultaneous localization and mapping based on Kalman filtering), the internal sensors are used to measure the state changes of the robot and the external sensors are used to correct the measurement errors. Existing SLAM is applied in two typical scenarios: tracking, in which the robot's initial position is usually known; and global localization, in which little or even no prior information about the robot's initial position or the environment features is provided. The key problem of SLAM is therefore the posterior estimation of the robot's motion path and of the features of the surrounding environment. To solve this key problem, an appropriate model must be established so that the posterior probability can be solved.
The robotics research community has produced a variety of solutions to the key SLAM problem. Among them, the EKF-SLAM method based on the Extended Kalman Filter has gained wide acceptance and use. The EKF-SLAM method first estimates the state at a certain moment of the motion process, then feeds the (noisy) measured variables back, and finally corrects the estimate according to the feedback. In this way, the EKF-SLAM method can efficiently estimate the past, current and even future states of the motion without knowledge of the detailed nature of the robot. However, this approach also has the following disadvantages: the road-sign landmarks are usually placed on the ground and are easily disturbed by other robots passing by; simple landmark patterns are easy to process but offer no error correction, while complex landmarks provide partial error correction but their patterns are overly complicated, resulting in poor real-time performance; and because the accumulation of measurement errors and the computational complexity grow rapidly with the number of nodes, the amount of computation becomes excessive and does not easily extend to indoor positioning in large environments.
Therefore, the prior art remains to be improved.
Summary of the invention
The main object of the present invention is to provide a method and device for indoor positioning, intended to solve the technical problem in existing indoor-positioning SLAM methods that the processing of landmark data easily introduces measurement errors and makes the positioning inaccurate.
The present invention provides a method for robot indoor positioning, including:
identifying speckle landmarks pre-set on an indoor ceiling;
forming a topological graph of image positions according to the speckle landmarks and changes in the robot's pose;
converting the image coordinates in the topological graph into corresponding position information in the robot coordinate system;
locating the robot according to the corresponding position information in the robot coordinate system.
The present invention also provides a device for robot indoor positioning, including:
an identification module, configured to identify speckle landmarks pre-set on an indoor ceiling;
a forming module, configured to form a topological graph of image positions according to the speckle landmarks and changes in the robot's pose;
a conversion module, configured to convert the image coordinates in the topological graph into corresponding position information in the robot coordinate system;
a locating module, configured to locate the robot according to the corresponding position information in the robot coordinate system.
Advantageous effects of the present invention: the speckle landmarks pre-set on the ceiling are used as position identification marks, which facilitates image recognition, reduces the complexity and computation of the front-end image processing algorithm, improves the real-time performance of the positioning system, and thereby improves the accuracy of robot indoor positioning. In addition, the present invention uses infrared laser projectors on the ceiling to actively project infrared speckle; the speckle is not affected by ambient light, is also usable in dark environments, and is suitable for positioning and map building in large-scale indoor environments.
Description of the drawings
Fig. 1 is a schematic flowchart of the method for robot indoor positioning of one embodiment of the invention;
Fig. 2 is a schematic flowchart of an optimized method for robot indoor positioning of one embodiment of the invention;
Fig. 3 is a schematic flowchart of step S5 of one embodiment of the invention;
Fig. 4 is a schematic flowchart of step S6 of one embodiment of the invention;
Fig. 5 is a schematic flowchart of step S62 of one embodiment of the invention;
Fig. 6 is a schematic flowchart of step S1 of one embodiment of the invention;
Fig. 7 is a schematic flowchart of the method for robot indoor positioning of another embodiment of the invention;
Fig. 8 is a schematic diagram of the topological graph of one embodiment of the invention;
Fig. 9 is a schematic diagram of the closed-loop constraint relations in the topological graph of one embodiment of the invention;
Fig. 10 is a schematic diagram of the robot indoor positioning system of one embodiment of the invention;
Fig. 11 is a schematic structural diagram of the device for robot indoor positioning of one embodiment of the invention;
Fig. 12 is a schematic structural diagram of an optimized device for robot indoor positioning of one embodiment of the invention;
Fig. 13 is a schematic structural diagram of the determining module of one embodiment of the invention;
Fig. 14 is a schematic structural diagram of the optimization module of one embodiment of the invention;
Fig. 15 is a schematic structural diagram of the solving unit of one embodiment of the invention;
Fig. 16 is a schematic structural diagram of the identification module of one embodiment of the invention;
Fig. 17 is a schematic structural diagram of the device for robot indoor positioning of another embodiment of the invention.
The realization of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings in conjunction with the embodiments.
Detailed description of the embodiments
It should be appreciated that the specific embodiments described herein are merely illustrative of the present invention and are not intended to limit the present invention.
Referring to Fig. 1, the method for robot indoor positioning of one embodiment of the invention includes:
S1: identifying speckle landmarks pre-set on an indoor ceiling.
The speckle landmarks of this step include picture speckles pasted onto the ceiling in advance and light speckles projected onto it in advance. This embodiment preferably projects infrared speckle with an infrared laser speckle projector; the speckle is invisible to the naked eye and harmless to the human body, is not affected by ambient light, is also usable in dark environments, and is suitable for positioning and map building in large-scale indoor environments. As shown in Fig. 10, this embodiment provides a robot indoor positioning system, which includes an infrared laser speckle projector 1, a wide-angle infrared camera 3 fixed on the robot, and a robot chassis 2 equipped with an odometer. In this embodiment, multiple infrared laser speckle projectors are preferably placed uniformly indoors to project speckle landmarks onto the ceiling; the speckle landmarks are randomly distributed and each speckle landmark pattern is different, so as to facilitate identification. The number and layout of the infrared laser projectors are determined according to the projection range of each speckle projector, so as to minimize the overlap between infrared speckle regions, which simplifies identification and further simplifies computation; the field of view of the corresponding infrared camera and the height of the camera below the ceiling are considered together, so that the infrared camera has no observation dead angle, i.e. the camera can capture at least one speckle landmark anywhere in the scene. This embodiment preferably uses infrared laser projectors with a wavelength of 830 nm, so that the positioning range covered by a single infrared laser projector is as wide as possible. This embodiment preferably mounts the infrared wide-angle camera on the robot pointing perpendicular to the ceiling to perform speckle identification, so that the infrared camera has a sufficiently large field of view. A wide-angle lens with a focal length of 2.8 mm or 2.5 mm is preferred, and a narrow-band filter with a center wavelength of 830 nm is installed in front of the lens, so that only the light of the infrared speckle is imaged.
S2: forming a topological graph of image positions according to the speckle landmarks and the pose changes of the robot.
The graph-based SLAM method of this embodiment converts the abstract raw sensor measurements into a simply structured graph optimization problem. The raw measurements are replaced by edges in the graph; an edge between two nodes in the topological graph is labeled with the probability distribution of the relative robot poses and is constrained by the sensor measurements. As shown in Fig. 8, the nodes x_t denote robot poses, the nodes m_j denote landmark positions, z_t denotes the constraint between a landmark and the robot, and u_t denotes the constraint between the robot poses at time t-1 and time t, where z_t can be measured by the landmark detection sensor and u_t can be obtained from the odometer measurement. This embodiment builds a model from the constraint relations between the landmarks and the robot poses, and solves the robot localization and map building problems by graph optimization.
Let x = {x_1, ..., x_T} denote the vector of node poses in the topological graph, where x_i represents the pose of the i-th node. As shown in Fig. 9, z_ij denotes the observation constraint between node i and node j, Ω_ij denotes the information matrix of that constraint (the inverse of the error covariance matrix), and ẑ_ij(x_i, x_j) denotes the prediction, i.e. the pose transformation between node i and node j computed from the node estimates, while the corresponding measurement is given by the odometer.
S3: converting the image coordinates in the topological graph into corresponding position information in the robot coordinate system.
The pose of the robot consists of its coordinates (x, y) in the plane and its orientation θ and belongs to the SE(2) space; a landmark can be represented by its two-dimensional coordinates. The robot pose is written x_t = (x_t, y_t, θ_t)^T and a landmark position m_j = (m_jx, m_jy)^T. Each time the robot moves a certain distance or rotates by a certain angle, the odometer measurement gives the robot's current pose, and the landmarks currently detected by the robot are obtained from the image captured by the camera.
The pose of the robot is obtained from the odometer measurement data. Let the pose of the robot at time t be x_t = (x_t, y_t, θ_t)^T. If the odometer measures a travel distance Δd between t and t+1 and the turning rate of the robot in the horizontal plane is ω during this interval of length Δt, then the pose of the robot at time t+1 is computed by the dead-reckoning formula as:
x_{t+1} = x_t + Δd cos(θ_t),  y_{t+1} = y_t + Δd sin(θ_t),  θ_{t+1} = θ_t + ω Δt
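A minimal Python sketch of this dead-reckoning update, assuming the odometer reports the travel distance delta_d and turning rate omega over an interval dt (the function and variable names are illustrative, not taken from the patent):

```python
import numpy as np

def dead_reckon(pose, delta_d, omega, dt):
    """Propagate a pose (x, y, theta) by distance delta_d at turning rate omega over dt."""
    x, y, theta = pose
    x_new = x + delta_d * np.cos(theta)
    y_new = y + delta_d * np.sin(theta)
    theta_new = theta + omega * dt
    theta_new = np.arctan2(np.sin(theta_new), np.cos(theta_new))  # keep heading in (-pi, pi]
    return np.array([x_new, y_new, theta_new])
```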
The robot captures the environment in real time, determines the position of the speckle landmark in the image through speckle landmark recognition, converts it into the camera coordinate system according to the camera model, and thereby obtains the position of the speckle landmark relative to the robot, which serves as the observation. In the embodiment of the present invention, the robot performs the observation by recognizing the infrared speckle.
The image captured by the infrared camera is first normalized. The infrared image captured by the infrared camera is a grayscale image; normalizing it avoids the influence of variations in speckle intensity on the speckle landmark detection.
A binarization operation is then performed on the normalized image. Considering the illumination intensity of the projected speckle, the embodiment of the present invention chooses an adaptive threshold method to binarize the grayscale image, which can be implemented with the AdaptiveThreshold function of the OpenCV library.
Contour lines are detected in the binarized image and candidate speckle landmark objects are found. Since the individual spots of a speckle pattern are separated, in order to extract the contour of the speckle landmark pattern the present invention first applies Gaussian blur, dilation and erosion to the image so that the spots of a speckle connect, turning the speckle in the image into a square region composed of black and white. Since the speckle landmarks of this embodiment are all rectangular, with closed contours and a specific size, the target objects can be found by detecting the contour lines in the image. The embodiment of the present invention uses the FindContours function of the OpenCV library to extract the image contours; a contour is represented by storing the sequence of consecutive points that compose it. Contours whose area is too small or too large to be a speckle are rejected by computing the contour area with the ContourArea function. The speckle landmarks of this embodiment are rectangular, so their contours should also be quadrilateral, yet non-quadrilateral shapes often remain among the contours retained after rejection. The contours are therefore further processed by polygonal approximation, and the vertices of the approximating polygon are recorded. If the number of vertices of the approximating polygon is greater than 4, the polygon is not a speckle landmark to be detected and should be discarded. Since an infrared camera is used and the indoor environmental interference is small, what remains after this processing is further confirmed as a speckle landmark. Meanwhile, when the camera observes an object, the image is deformed according to the distance and orientation; the image is therefore mapped onto a plane perpendicular to the camera so that the true shape of the objects in the image is recovered. Using the GetPerspectiveTransform and WarpPerspective functions of the OpenCV function library, the image is transformed onto the plane perpendicular to the camera, so that the quadrilateral retained in the image corresponds to its true shape in the environment. Finally, the rectangle corresponding to the speckle landmark is computed with the minAreaRect function, and the center (u, v) of the rectangle is taken as the position of the speckle landmark.
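The detection pipeline described above might be sketched with the OpenCV Python API roughly as follows. The blur kernel, threshold block size and area limits are illustrative assumptions rather than values given in the patent, and the perspective rectification step (GetPerspectiveTransform/WarpPerspective) is omitted for brevity:

```python
import cv2
import numpy as np

def detect_speckle_landmarks(gray, min_area=200.0, max_area=20000.0):
    """Return the centers (u, v) of rectangular speckle landmarks in a grayscale IR image."""
    norm = cv2.normalize(gray, None, 0, 255, cv2.NORM_MINMAX)
    binary = cv2.adaptiveThreshold(norm, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                   cv2.THRESH_BINARY, 31, 5)
    # connect the individual spots of each speckle pattern into one blob
    blurred = cv2.GaussianBlur(binary, (5, 5), 0)
    kernel = np.ones((5, 5), np.uint8)
    closed = cv2.erode(cv2.dilate(blurred, kernel), kernel)
    contours, _ = cv2.findContours(closed, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centers = []
    for c in contours:
        area = cv2.contourArea(c)
        if area < min_area or area > max_area:
            continue                                  # reject contours that cannot be a landmark
        approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
        if len(approx) > 4:
            continue                                  # speckle landmarks are rectangular
        (u, v), _, _ = cv2.minAreaRect(c)             # rectangle center = landmark image position
        centers.append((u, v))
    return centers
```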
In order to obtain the positional relationship between the speckle landmark and the robot, the coordinates of the speckle landmark on the image plane are used to determine its three-dimensional position in the camera coordinate system, i.e. the pose is estimated from 2D to 3D. This embodiment uses the camera projection model:
(u, v, 1)^T ∝ M (x, y, 1)^T
to convert the landmark position from the image coordinate system to the camera coordinate system, i.e. the coordinate system corresponding to the robot. Here M is the mapping matrix formed by the camera intrinsics and distortion parameters, (u, v) is a point in the image coordinate system, and (x, y) is a point in the camera coordinate system. Since the distance from the camera to the ceiling is fixed, the coordinate position of the speckle image relative to the camera, projected onto the horizontal plane, is obtained.
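Under the assumption, made here only for illustration, that for the fixed camera-to-ceiling distance the mapping M acts as a 3x3 plane-to-plane homography between the ceiling plane and the image plane, the conversion could be sketched as:

```python
import numpy as np

def image_to_robot_plane(u, v, M):
    """Map an image point (u, v) back to planar coordinates (x, y) in the camera/robot frame.

    M is assumed to be the 3x3 mapping from plane coordinates to image coordinates,
    so the inverse mapping is applied and the result is dehomogenized.
    """
    p = np.linalg.inv(M) @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]
```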
S4: locating the robot according to the corresponding position information in the robot coordinate system.
Referring to Fig. 2, further, the method for robot indoor positioning of one embodiment of the invention includes, after step S3:
S5: determining the closed-loop constraint relation between the robot's current pose and the speckle landmarks.
In this embodiment, the pose relation of the robot between time t and time t+1 is obtained from the odometer measurement; the transformation from x_t to x_{t+1} represents the relative motion of the robot within a fixed time interval. Since the odometer measurement is affected by noise, it usually deviates slightly from the true value. The odometer measurement is a Euclidean transformation in the SE(2) group; assuming the measurement noise is white Gaussian noise, the measurement error is represented by a 3x3 symmetric matrix, the information matrix, which depends on the motion of the robot. For example, the larger the robot's motion, the larger the uncertainty. An edge between nodes x_t and x_{t+1} in the topological graph is constrained by the following two quantities: u_t, which represents the motion between the nodes, and the covariance matrix of the measurement, which is a symmetric positive-definite matrix. When the robot detects a landmark m_j at pose x_t, the corresponding edge in the graph can be modeled from the robot's current pose and the position of the landmark. The landmark measurement is expressed by two-dimensional plane coordinates (x, y). Under the assumption of white Gaussian noise, the noise can be modeled by the inverse of its covariance matrix. The model of the edge between a robot pose and a landmark is expressed by the following parameters: z_t, the position of the landmark detected by the robot at x_t, and the covariance matrix of the landmark measurement.
S6: optimizing the robot coordinates according to the closed-loop constraint relation.
The log-likelihood l_ij of an observation z_ij can be expressed as:
l_ij ∝ [z_ij - ẑ_ij(x_i, x_j)]^T Ω_ij [z_ij - ẑ_ij(x_i, x_j)]    (1)
A measurement error function e(x_i, x_j, z_ij) is defined to express the error between the expected estimate ẑ_ij(x_i, x_j) and the actual observation z_ij:
e_ij(x_i, x_j) = e(x_i, x_j, z_ij) = z_ij - ẑ_ij(x_i, x_j)    (2)
The overall measurement error function is then:
F(x) = Σ_(i,j) e_ij^T Ω_ij e_ij    (3)
The optimization goal of the graph in this embodiment is to find the node configuration x* that minimizes the log-likelihood F(x) of all measurement errors:
x* = argmin_x F(x)    (4)
Given a reasonably accurate initial robot pose x̆, the Gauss-Newton numerical optimization algorithm is used to solve formula (4): near the initial pose x̆, the error function is approximated by its first-order Taylor expansion:
e_ij(x̆_i + Δx_i, x̆_j + Δx_j) ≈ e_ij + J_ij Δx    (5)
Here J_ij is the Jacobian matrix of e_ij(x). Substituting into formula (3) gives the approximate representation of F_ij:
F_ij(x̆ + Δx) ≈ c_ij + 2 b_ij^T Δx + Δx^T H_ij Δx    (6)
where c_ij = e_ij^T Ω_ij e_ij, b_ij = J_ij^T Ω_ij e_ij and H_ij = J_ij^T Ω_ij J_ij. With this local approximation, the overall function is rewritten as:
F(x̆ + Δx) ≈ c + 2 b^T Δx + Δx^T H Δx    (7)
where c = Σ c_ij, b = Σ b_ij and H = Σ H_ij.
F(x) is minimized by solving the linear system:
H Δx* = -b    (8)
The information matrix H above is obtained by mapping the measurement errors onto the robot trajectory through the Jacobian matrices, so its structure is sparse, and its nonzero entries correspond to the observation constraints between poses.
According to equation (8), the information matrix H and the residual vector b of the system are the accumulation of a series of matrices and vectors, where each matrix and vector corresponds to one constraint, and each constraint contributes one addend C_ij to the system; the structure of the addend depends on the Jacobian of the system error function. Since the error function of a constraint depends only on the values of the two nodes it connects, the Jacobian matrix in formula (5) has the following form:
J_ij = ( 0 ... 0  A_ij  0 ... 0  B_ij  0 ... 0 )    (9)
where A_ij and B_ij are the derivatives of the error function with respect to node i and node j, respectively. From equations (6) and (9), the structure of the matrix block H_ij and of the coefficient vector b_ij is obtained as:
H_ij:  [ A_ij^T Ω_ij A_ij   A_ij^T Ω_ij B_ij ]
       [ B_ij^T Ω_ij A_ij   B_ij^T Ω_ij B_ij ],    b_ij:  [ A_ij^T Ω_ij e_ij ]
                                                           [ B_ij^T Ω_ij e_ij ]    (10)
For simplicity, the zero blocks are omitted in the formula.
Referring to Fig. 3, further, in the method for robot indoor positioning of one embodiment of the invention, step S5 includes:
S50: judging whether the speckle landmark observed at the robot's current pose and the speckle landmark observed at a specified pose of the robot are the same.
In this embodiment, a new node is inserted into the topological graph only when the robot has moved more than 0.5 m or rotated by more than 0.5 rad; as the robot roams through the indoor environment, the whole topological graph is thus constructed. The solid arrows between nodes of adjacent moments represent the odometer constraints between two nodes, and the estimation accuracy of the odometer measurement is improved by matching speckle landmarks between two consecutive frames; the dashed arrows between nodes of non-adjacent moments represent constraints from the camera observation data. The robot starts to explore the unknown region and matches its camera observations against past observations.
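A minimal sketch of this node-insertion rule; the 0.5 m and 0.5 rad thresholds follow the text above, while the helper name and pose layout are illustrative:

```python
import numpy as np

DIST_THRESHOLD = 0.5   # metres
ANGLE_THRESHOLD = 0.5  # radians

def should_insert_node(last_node_pose, current_pose):
    """Insert a new graph node only after sufficient translation or rotation."""
    dx = current_pose[0] - last_node_pose[0]
    dy = current_pose[1] - last_node_pose[1]
    dtheta = np.arctan2(np.sin(current_pose[2] - last_node_pose[2]),
                        np.cos(current_pose[2] - last_node_pose[2]))
    return np.hypot(dx, dy) > DIST_THRESHOLD or abs(dtheta) > ANGLE_THRESHOLD
```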
S51: if they are the same, determining that a closed-loop constraint relation exists.
If the same speckle landmark is observed from different locations, a closed-loop constraint relation is formed.
Referring to Fig. 4, further, in the method for robot indoor positioning of one embodiment of the invention, step S6 includes:
S60: according to the closed-loop constraint relation, defining a first measurement error function between the observation parameters at adjacent robot poses and the measured sensor parameters.
An error function between a robot pose x_t and a landmark m_j is constructed. The virtual measurement ẑ_t is the position of the landmark m_j as observed by the robot from x_t, i.e. the landmark expressed in the robot frame:
ẑ_t(x_t, m_j) = R(θ_t)^T (m_j - (x_t, y_t)^T)
The landmarks lie in a Euclidean coordinate system, so the error can be expressed directly as the difference. The error function of a landmark is:
e_t(x_t, m_j) = z_t - ẑ_t(x_t, m_j)
A measurement error function between adjacent robot poses x_t and x_{t+1} is also defined. The odometer measures in the SE(2) space; through the ⊖ operation, the measurement function can be written as:
ẑ_t(x_t, x_{t+1}) = x_{t+1} ⊖ x_t
Similarly, the first measurement error function is:
e_t(x_t, x_{t+1}) = u_t - (x_{t+1} ⊖ x_t)
where u_t is the odometer measurement.
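A small Python sketch of the landmark error term with the observed-minus-predicted sign convention used above (the function name is illustrative); the odometry error can be written analogously with the ⊖ operation defined in the next paragraph:

```python
import numpy as np

def landmark_error(pose, landmark, z):
    """Observed-minus-predicted error for a landmark observed at z = (x, y) in the robot frame."""
    x, y, theta = pose
    c, s = np.cos(theta), np.sin(theta)
    dx, dy = landmark[0] - x, landmark[1] - y
    z_hat = np.array([c * dx + s * dy, -s * dx + c * dy])  # landmark expressed in the robot frame
    return np.asarray(z) - z_hat
```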
S61: selecting the increment parameters of the pose parameters at adjacent moments during the robot's motion.
The landmark positions of this embodiment are points in Euclidean space, so the corresponding increments can be applied by direct addition. The pose parameters of the robot, however, do not belong to Euclidean space but to the SE(2) group; this space admits several parameterizations, for example a rotation matrix R(θ) together with a translation vector (x, y)^T, or a rotation angle θ together with a translation vector (x, y)^T.
Since the robot moves on a two-dimensional plane, the rotation angle θ and the translation vector (x, y)^T are selected as the increment parameters, and a ⊕ operation is defined to express the increment relation between the robot poses at adjacent moments. Because the angle does not belong to Euclidean space, the increments cannot be expressed by direct addition, and the angle must be re-normalized after each accumulation. In this patent, for a robot pose x_t = (x_t, y_t, θ_t)^T and an increment Δx = (Δx, Δy, Δθ)^T, the following operation is defined:
x_t ⊕ Δx = (x_t + Δx cos θ_t - Δy sin θ_t,  y_t + Δx sin θ_t + Δy cos θ_t,  θ_t + Δθ)^T, with the angle normalized to (-π, π].
The corresponding inverse operation is:
x_{t+1} ⊖ x_t = ((x_{t+1} - x_t) cos θ_t + (y_{t+1} - y_t) sin θ_t,  -(x_{t+1} - x_t) sin θ_t + (y_{t+1} - y_t) cos θ_t,  θ_{t+1} - θ_t)^T, again with the angle normalized.
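A Python sketch of the ⊕ composition and its inverse ⊖ as reconstructed above; this is a sketch under the stated SE(2) parameterization, not a verbatim reproduction of the patent's own formulas:

```python
import numpy as np

def wrap_angle(a):
    """Normalize an angle to (-pi, pi]."""
    return np.arctan2(np.sin(a), np.cos(a))

def oplus(pose, delta):
    """Compose pose (x, y, theta) with an increment expressed in the pose's own frame."""
    x, y, theta = pose
    dx, dy, dtheta = delta
    return np.array([x + dx * np.cos(theta) - dy * np.sin(theta),
                     y + dx * np.sin(theta) + dy * np.cos(theta),
                     wrap_angle(theta + dtheta)])

def ominus(pose_b, pose_a):
    """Relative pose of pose_b expressed in the frame of pose_a (inverse of oplus)."""
    xa, ya, ta = pose_a
    dx, dy = pose_b[0] - xa, pose_b[1] - ya
    return np.array([dx * np.cos(ta) + dy * np.sin(ta),
                     -dx * np.sin(ta) + dy * np.cos(ta),
                     wrap_angle(pose_b[2] - ta)])
```

With these helpers, the odometry prediction of the previous paragraph is simply ominus(x_t1, x_t).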
S62: solving the first measurement error function iteratively according to the increment parameters, to obtain optimized robot coordinates.
The graph-based SLAM problem ultimately reduces to solving the linear system (8). In the present invention, the nonlinear problem is solved by iterated linearization, where formula (3) is the objective of the linearized graph-based SLAM, formula (4) is its solution, and the increment parameters serve as the update step; in each iteration, the solution x* of the previous iteration is used as the initial pose of the next iteration. Most of the structure in the system is sparse, so a sparse matrix can be used to store the Hessian matrix H, and owing to the symmetry of the information matrix H only its upper half needs to be computed, which improves computational efficiency.
First, the robot poses, the coefficient vector b and the information matrix H are initialized;
for every error term e_ij with information matrix Ω_ij, the Jacobian blocks A_ij and B_ij are computed;
the nonzero blocks of the H matrix of the linear system are accumulated according to equation (10):
H_ii += A_ij^T Ω_ij A_ij,  H_ij += A_ij^T Ω_ij B_ij,  H_ji += B_ij^T Ω_ij A_ij,  H_jj += B_ij^T Ω_ij B_ij
and the coefficient vector is accumulated: b_i += A_ij^T Ω_ij e_ij,  b_j += B_ij^T Ω_ij e_ij
Finally, solving the linear system H Δx* = -b realizes the optimization of the robot coordinates.
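Putting these steps together, a deliberately compact, dense Gauss-Newton sketch for pose-only constraints might look as follows. It uses finite-difference Jacobians for brevity, whereas the patent computes A_ij and B_ij analytically, and the sparse, upper-triangular storage described above is omitted; all helper names are illustrative:

```python
import numpy as np

def numeric_jacobian(f, x, eps=1e-6):
    """Finite-difference Jacobian of f at x."""
    fx = f(x)
    J = np.zeros((fx.size, x.size))
    for k in range(x.size):
        xp = x.copy()
        xp[k] += eps
        J[:, k] = (f(xp) - fx) / eps
    return J

def gauss_newton_step(poses, edges, error_fn):
    """One H dx = -b step for a graph whose nodes are 3-DoF poses stacked in `poses` (n x 3).

    edges: iterable of (i, j, z, omega); error_fn(xi, xj, z) returns the residual e_ij.
    """
    n = poses.shape[0]
    H = np.zeros((3 * n, 3 * n))
    b = np.zeros(3 * n)
    for i, j, z, omega in edges:
        e = error_fn(poses[i], poses[j], z)
        A = numeric_jacobian(lambda xi: error_fn(xi, poses[j], z), poses[i].copy())
        B = numeric_jacobian(lambda xj: error_fn(poses[i], xj, z), poses[j].copy())
        si, sj = slice(3 * i, 3 * i + 3), slice(3 * j, 3 * j + 3)
        H[si, si] += A.T @ omega @ A        # accumulate the nonzero blocks of H (eq. 10)
        H[si, sj] += A.T @ omega @ B
        H[sj, si] += B.T @ omega @ A
        H[sj, sj] += B.T @ omega @ B
        b[si] += A.T @ omega @ e            # accumulate the coefficient vector
        b[sj] += B.T @ omega @ e
    H[0:3, 0:3] += np.eye(3) * 1e6          # anchor the first pose to fix the gauge freedom
    dx = np.linalg.solve(H, -b)
    return poses + dx.reshape(n, 3)
```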
Referring to Fig. 5, further, in the method for robot indoor positioning of one embodiment of the invention, step S62 includes:
S620: when the robot updates its pose, judging whether the function value converges during the iterative solution of the first measurement error function.
Convergence in this step means that the function value tends towards a certain fixed value; if it does, it is convergent.
S621: if so, determining that the corresponding function value is the solution of the first measurement error function.
The value approached in this step is the solution of the first measurement error function, i.e. the optimized value; the loop is then exited, and the Hessian matrix H over the pose and landmark space is obtained, at which point x* is the final result of the SLAM system.
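The convergence test of steps S620/S621 could be sketched as an outer loop around the hypothetical gauss_newton_step above; the tolerance and iteration cap are illustrative assumptions:

```python
import numpy as np

def optimize_until_converged(poses, edges, error_fn, step_fn, tol=1e-6, max_iters=50):
    """Iterate optimization steps until the total error F(x) stops decreasing noticeably."""
    def total_error(p):
        return sum(float(error_fn(p[i], p[j], z) @ omega @ error_fn(p[i], p[j], z))
                   for i, j, z, omega in edges)
    prev = total_error(poses)
    for _ in range(max_iters):
        poses = step_fn(poses, edges, error_fn)
        cur = total_error(poses)
        if abs(prev - cur) < tol:   # the function value has leveled off: converged (S621)
            break
        prev = cur                  # not converged yet: keep iterating (S620/S622)
    return poses
```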
Further, after step S620, the method includes:
S622: if not, defining, according to the closed-loop constraint relation, a second measurement error function between the observation parameters at adjacent robot poses and the measured sensor parameters.
In this embodiment, if convergence is not reached when updating the robot poses and landmark positions, a second measurement error function between the observation parameters at adjacent robot poses and the measured sensor parameters is defined according to the closed-loop constraint relation, and the iterative judgement continues.
Referring to Fig. 6, further, in the method for robot indoor positioning of one embodiment of the invention, step S1 includes:
S10: judging, from the speckle information, whether the captured speckle landmark is a known speckle landmark.
The speckle information in this step includes the speckle position, the speckle region, the speckle shape and so on. The newly captured speckle landmark is compared with previously captured speckle landmarks by template matching to judge whether it is a new landmark: if a previously captured speckle landmark provides a matching template, it is not a new speckle landmark; otherwise it is a new speckle landmark.
S11: if so, retrieving the ID information of the speckle landmark.
Every known speckle landmark has matching ID information, and different speckle landmarks are distinguished by recognizing their ID information.
S12: identifying the speckle landmark through the ID information.
In this embodiment, each different speckle landmark is assigned a unique ID, which facilitates identifying the speckle landmark and estimating the position and pose of the marker relative to the robot.
Further, after step S10, the method includes:
S13: if not, assigning corresponding ID information to the speckle landmark.
If it is a new speckle landmark, new ID information is assigned to it and stored locally for use in observation.
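Steps S10 to S13 amount to a template-matching lookup with ID assignment; a sketch is given below. The use of cv2.matchTemplate and the 0.8 match threshold are illustrative assumptions, since the patent only states that template matching is used:

```python
import cv2
import numpy as np

class SpeckleLandmarkRegistry:
    """Assign a stable ID to each distinct speckle landmark patch (steps S10-S13)."""

    def __init__(self, match_threshold=0.8):
        self.templates = {}            # landmark_id -> stored landmark patch
        self.next_id = 0
        self.match_threshold = match_threshold

    def identify(self, patch):
        """Return (landmark_id, is_new) for a rectified speckle landmark patch."""
        for landmark_id, tmpl in self.templates.items():
            if patch.shape != tmpl.shape:
                continue
            score = cv2.matchTemplate(patch, tmpl, cv2.TM_CCOEFF_NORMED)[0, 0]
            if score >= self.match_threshold:
                return landmark_id, False      # known landmark: retrieve its ID (S11/S12)
        landmark_id = self.next_id             # new landmark: assign and store a new ID (S13)
        self.next_id += 1
        self.templates[landmark_id] = patch.copy()
        return landmark_id, True
```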
Referring to Fig. 7, the method for robot indoor positioning of another embodiment of the invention includes, before step S1:
S101: calibrating the intrinsic parameters and distortion parameters of the camera.
S102: storing the mapping matrix formed by the intrinsic parameters and the distortion parameters.
The mapping matrix M formed by the calibrated camera intrinsics and distortion parameters is stored so that the landmark position can be converted from the image coordinate system to the camera coordinate system, i.e. the coordinate system corresponding to the robot.
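For step S101, the intrinsics and distortion parameters would typically be obtained with a standard chessboard calibration; a brief OpenCV sketch is given below, where the board size, square size and image list are placeholders rather than values from the patent:

```python
import cv2
import numpy as np

def calibrate_camera(image_paths, board_size=(9, 6), square_size=0.025):
    """Estimate the camera matrix and distortion coefficients from chessboard images."""
    objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2) * square_size
    obj_points, img_points, shape = [], [], None
    for path in image_paths:
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        shape = gray.shape[::-1]
        found, corners = cv2.findChessboardCorners(gray, board_size)
        if found:
            obj_points.append(objp)
            img_points.append(corners)
    _, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(
        obj_points, img_points, shape, None, None)
    return camera_matrix, dist_coeffs  # together these define the mapping matrix M of S102

# example: calibrate_camera(["calib/img01.png", "calib/img02.png", "calib/img03.png"])
```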
Referring to Fig. 11, the device for robot indoor positioning of one embodiment of the invention includes:
an identification module 1, configured to identify speckle landmarks pre-set on an indoor ceiling;
a forming module 2, configured to form a topological graph of image positions according to the speckle landmarks and changes in the robot's pose;
a conversion module 3, configured to convert the image coordinates in the topological graph into corresponding position information in the robot coordinate system;
a locating module 4, configured to locate the robot according to the corresponding position information in the robot coordinate system.
Referring to Fig. 12, further, the device for robot indoor positioning of one embodiment of the invention includes:
a determining module 5, configured to determine the closed-loop constraint relation between the robot's current pose and the speckle landmarks;
an optimization module 6, configured to optimize the robot coordinates according to the closed-loop constraint relation.
Referring to Fig. 13, further, the determining module 5 of one embodiment of the invention includes:
a first judging unit 50, configured to judge whether the speckle landmark observed at the robot's current pose and the speckle landmark observed at a specified pose of the robot are the same;
a determining unit 51, configured to determine, if they are the same, that a closed-loop constraint relation exists.
Referring to Fig. 14, further, the optimization module 6 of one embodiment of the invention includes:
a definition unit 60, configured to define, according to the closed-loop constraint relation, a first measurement error function between the observation parameters at adjacent robot poses and the measured sensor parameters;
a selection unit 61, configured to select the increment parameters of the pose parameters at adjacent moments during the robot's motion;
a solving unit 62, configured to solve the first measurement error function iteratively according to the increment parameters, to obtain optimized robot coordinates.
Referring to Fig. 15, further, the solving unit 62 of one embodiment of the invention includes:
a judging subunit 620, configured to judge, when the robot updates its pose, whether the function value converges during the iterative solution of the first measurement error function;
a determining subunit 621, configured to determine, if so, that the corresponding function value is the solution of the first measurement error function.
Preferably, the solving unit 62 further includes:
a redefining subunit 622, configured to define, if not, a second measurement error function between the observation parameters at adjacent robot poses and the measured sensor parameters according to the closed-loop constraint relation.
Referring to Fig. 16, further, the identification module 1 of one embodiment of the invention includes:
a second judging unit 10, configured to judge, from the speckle information, whether the captured speckle landmark is a known speckle landmark;
a retrieving unit 11, configured to retrieve, if so, the ID information of the speckle landmark;
an identifying unit 12, configured to identify the speckle landmark through the ID information.
Further, the identification module 1 includes:
a matching unit 13, configured to assign, if not, corresponding ID information to the speckle landmark.
Referring to Fig. 17, the device for robot indoor positioning of another embodiment of the invention includes:
a calibration module 101, configured to calibrate the intrinsic parameters and distortion parameters of the camera;
a storage module 102, configured to store the mapping matrix formed by the intrinsic parameters and the distortion parameters.
The device for robot indoor positioning provided in the above embodiments and the method for robot indoor positioning are based on the same inventive concept. Therefore, for the specific functions of the functional modules/units of each specific embodiment of the device, reference may be made to the preceding method embodiments, and the details are not repeated here.
The foregoing are merely preferred embodiments of the present invention and are not intended to limit its scope. Any equivalent structural or process transformation made using the contents of the description and the accompanying drawings of the present invention, applied directly or indirectly in other related technical fields, falls within the scope of protection of the present invention.

Claims (10)

1. A method for robot indoor positioning, characterized by comprising:
identifying speckle landmarks pre-set on an indoor ceiling;
forming a topological graph of image positions according to the speckle landmarks and changes in the robot's pose;
converting the image coordinates in the topological graph into corresponding position information in the robot coordinate system;
locating the robot according to the corresponding position information in the robot coordinate system.
2. The method for robot indoor positioning according to claim 1, characterized in that, after the step of converting the image coordinates in the topological graph into corresponding position information in the robot coordinate system, the method comprises:
determining a closed-loop constraint relation between the robot's current pose and the speckle landmarks;
optimizing the robot coordinates according to the closed-loop constraint relation.
3. The method for robot indoor positioning according to claim 2, characterized in that the step of determining a closed-loop constraint relation between the robot's current pose and the speckle landmarks comprises:
judging whether the speckle landmark observed at the robot's current pose and the speckle landmark observed at a specified pose of the robot are the same;
if they are the same, determining that a closed-loop constraint relation exists.
4. The method for robot indoor positioning according to claim 2, characterized in that the step of optimizing the robot coordinates according to the closed-loop constraint relation comprises:
defining, according to the closed-loop constraint relation, a first measurement error function between the observation parameters at adjacent robot poses and the measured sensor parameters;
selecting the increment parameters of the pose parameters at adjacent moments during the robot's motion;
solving the first measurement error function iteratively according to the increment parameters, to obtain optimized robot coordinates.
5. The method for robot indoor positioning according to claim 4, characterized in that the step of solving the first measurement error function iteratively according to the increment parameters comprises:
when the robot updates its pose, judging whether the function value converges during the iterative solution of the first measurement error function;
if so, determining that the corresponding function value is the solution of the first measurement error function;
if not, defining, according to the closed-loop constraint relation, a second measurement error function between the observation parameters at adjacent robot poses and the measured sensor parameters.
6. A device for robot indoor positioning, characterized by comprising:
an identification module, configured to identify speckle landmarks pre-set on an indoor ceiling;
a forming module, configured to form a topological graph of image positions according to the speckle landmarks and changes in the robot's pose;
a conversion module, configured to convert the image coordinates in the topological graph into corresponding position information in the robot coordinate system;
a locating module, configured to locate the robot according to the corresponding position information in the robot coordinate system.
7. The device for robot indoor positioning according to claim 6, characterized by comprising:
a determining module, configured to determine a closed-loop constraint relation between the robot's current pose and the speckle landmarks;
an optimization module, configured to optimize the robot coordinates according to the closed-loop constraint relation.
8. The device for robot indoor positioning according to claim 7, characterized in that the determining module comprises:
a first judging unit, configured to judge whether the speckle landmark observed at the robot's current pose and the speckle landmark observed at a specified pose of the robot are the same;
a determining unit, configured to determine, if they are the same, that a closed-loop constraint relation exists.
9. The device for robot indoor positioning according to claim 7, characterized in that the optimization module comprises:
a definition unit, configured to define, according to the closed-loop constraint relation, a first measurement error function between the observation parameters at adjacent robot poses and the measured sensor parameters;
a selection unit, configured to select the increment parameters of the pose parameters at adjacent moments during the robot's motion;
a solving unit, configured to solve the first measurement error function iteratively according to the increment parameters, to obtain optimized robot coordinates.
10. The device for robot indoor positioning according to claim 9, characterized in that the solving unit comprises:
a judging subunit, configured to judge, when the robot updates its pose, whether the function value converges during the iterative solution of the first measurement error function;
a determining subunit, configured to determine, if so, that the corresponding function value is the solution of the first measurement error function;
a redefining subunit, configured to define, if not, a second measurement error function between the observation parameters at adjacent robot poses and the measured sensor parameters according to the closed-loop constraint relation.
CN201810020362.5A 2018-01-09 2018-01-09 Indoor robot positioning method and device Active CN108332752B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810020362.5A CN108332752B (en) 2018-01-09 2018-01-09 Indoor robot positioning method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810020362.5A CN108332752B (en) 2018-01-09 2018-01-09 Indoor robot positioning method and device

Publications (2)

Publication Number Publication Date
CN108332752A true CN108332752A (en) 2018-07-27
CN108332752B CN108332752B (en) 2021-04-20

Family

ID=62924393

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810020362.5A Active CN108332752B (en) 2018-01-09 2018-01-09 Indoor robot positioning method and device

Country Status (1)

Country Link
CN (1) CN108332752B (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109523595A (en) * 2018-11-21 2019-03-26 南京链和科技有限公司 A kind of architectural engineering straight line corner angle spacing vision measuring method
CN109613472A (en) * 2018-12-26 2019-04-12 芜湖哈特机器人产业技术研究院有限公司 A kind of infrared top mark and its recognition methods for the navigation of indoor trackless
CN110874101A (en) * 2019-11-29 2020-03-10 哈工大机器人(合肥)国际创新研究院 Method and device for generating cleaning path of robot
CN111256689A (en) * 2020-01-15 2020-06-09 北京智华机器人科技有限公司 Robot positioning method, robot and storage medium
CN111274343A (en) * 2020-01-20 2020-06-12 北京百度网讯科技有限公司 Vehicle positioning method and device, electronic equipment and storage medium
CN111932675A (en) * 2020-10-16 2020-11-13 北京猎户星空科技有限公司 Map building method and device, self-moving equipment and storage medium
CN112099509A (en) * 2020-09-24 2020-12-18 杭州海康机器人技术有限公司 Map optimization method and device and robot


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160154408A1 (en) * 2010-09-24 2016-06-02 Irobot Corporation Systems and methods for vslam optimization
CN102135429A (en) * 2010-12-29 2011-07-27 东南大学 Robot indoor positioning and navigating method based on vision
KR20130043305A (en) * 2011-10-20 2013-04-30 충북대학교 산학협력단 Position estimation apparatus of mobile robot using indoor ceiling image
WO2014128507A2 (en) * 2013-02-22 2014-08-28 Fox Murphy Limited A mobile indoor navigation system
CN104374395A (en) * 2014-03-31 2015-02-25 南京邮电大学 Graph-based vision SLAM (simultaneous localization and mapping) method
CN106153048A (en) * 2016-08-11 2016-11-23 广东技术师范学院 A kind of robot chamber inner position based on multisensor and Mapping System
CN106802658A (en) * 2017-03-21 2017-06-06 厦门大学 Method for rapidly positioning in a kind of full-automatic high precision room

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Liu Chengrui et al.: "Analysis Method of Gyro Random Drift Based on the Correlation Coefficient AR Model", Aerospace Control and Application *
Wang Yang: "Design of a Sweeping Robot Localization Algorithm and Embedded *** Implementation", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109523595A (en) * 2018-11-21 2019-03-26 南京链和科技有限公司 A kind of architectural engineering straight line corner angle spacing vision measuring method
CN109523595B (en) * 2018-11-21 2023-07-18 南京链和科技有限公司 Visual measurement method for linear angular spacing of building engineering
CN109613472B (en) * 2018-12-26 2023-04-28 芜湖哈特机器人产业技术研究院有限公司 Infrared top mark for indoor trackless navigation and identification method thereof
CN109613472A (en) * 2018-12-26 2019-04-12 芜湖哈特机器人产业技术研究院有限公司 A kind of infrared top mark and its recognition methods for the navigation of indoor trackless
CN110874101A (en) * 2019-11-29 2020-03-10 哈工大机器人(合肥)国际创新研究院 Method and device for generating cleaning path of robot
CN111256689A (en) * 2020-01-15 2020-06-09 北京智华机器人科技有限公司 Robot positioning method, robot and storage medium
CN111256689B (en) * 2020-01-15 2022-01-21 北京智华机器人科技有限公司 Robot positioning method, robot and storage medium
CN111274343A (en) * 2020-01-20 2020-06-12 北京百度网讯科技有限公司 Vehicle positioning method and device, electronic equipment and storage medium
CN111274343B (en) * 2020-01-20 2023-11-24 阿波罗智能技术(北京)有限公司 Vehicle positioning method and device, electronic equipment and storage medium
CN112099509A (en) * 2020-09-24 2020-12-18 杭州海康机器人技术有限公司 Map optimization method and device and robot
CN112099509B (en) * 2020-09-24 2024-05-28 杭州海康机器人股份有限公司 Map optimization method and device and robot
CN111932675B (en) * 2020-10-16 2020-12-29 北京猎户星空科技有限公司 Map building method and device, self-moving equipment and storage medium
CN111932675A (en) * 2020-10-16 2020-11-13 北京猎户星空科技有限公司 Map building method and device, self-moving equipment and storage medium

Also Published As

Publication number Publication date
CN108332752B (en) 2021-04-20

Similar Documents

Publication Publication Date Title
CN108332752A (en) The method and device of robot indoor positioning
CN110070615B (en) Multi-camera cooperation-based panoramic vision SLAM method
JP5832341B2 (en) Movie processing apparatus, movie processing method, and movie processing program
JP6760114B2 (en) Information processing equipment, data management equipment, data management systems, methods, and programs
US9208395B2 (en) Position and orientation measurement apparatus, position and orientation measurement method, and storage medium
Kümmerle et al. Large scale graph-based SLAM using aerial images as prior information
CN112785702A (en) SLAM method based on tight coupling of 2D laser radar and binocular camera
WO2019136613A1 (en) Indoor locating method and device for robot
CN113870343B (en) Relative pose calibration method, device, computer equipment and storage medium
US20170116783A1 (en) Navigation System Applying Augmented Reality
Acharya et al. BIM-Tracker: A model-based visual tracking approach for indoor localisation using a 3D building model
CN110458161B (en) Mobile robot doorplate positioning method combined with deep learning
CN112233177B (en) Unmanned aerial vehicle pose estimation method and system
CN104704384A (en) Image processing method, particularly used in a vision-based localization of a device
CN112396656B (en) Outdoor mobile robot pose estimation method based on fusion of vision and laser radar
CN115388902B (en) Indoor positioning method and system, AR indoor positioning navigation method and system
Yan et al. Joint camera intrinsic and lidar-camera extrinsic calibration
CN109596121A (en) A kind of motor-driven station Automatic Targets and space-location method
KR102490521B1 (en) Automatic calibration through vector matching of the LiDAR coordinate system and the camera coordinate system
CN110120093A (en) Three-dimensional plotting method and system in a kind of room RGB-D of diverse characteristics hybrid optimization
KR20220025028A (en) Method and device for building beacon map based on visual beacon
Zhang LILO: A novel LiDAR–IMU SLAM system with loop optimization
CN114969221A (en) Method for updating map and related equipment
CN114140539A (en) Method and device for acquiring position of indoor object
CN112762929B (en) Intelligent navigation method, device and equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20190906

Address after: Room 402, 4th floor, Kanghe Sheng Building, New Energy Innovation Industrial Park, No. 1 Chuangsheng Road, Nanshan District, Shenzhen City, Guangdong Province, 518000

Applicant after: Shenzhen Infinite Power Development Co., Ltd.

Address before: 518000 Block 503,602, Garden City Digital Building B, 1079 Nanhai Avenue, Shekou, Nanshan District, Shenzhen City, Guangdong Province

Applicant before: SHENZHEN WOTE WODE CO., LTD.

TA01 Transfer of patent application right
GR01 Patent grant
GR01 Patent grant