CN105865438A - Autonomous precise positioning system based on machine vision for indoor mobile robots - Google Patents


Info

Publication number
CN105865438A
Authority
CN
China
Prior art keywords
image
ceiling
robot
mobile robot
light pattern
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510033871.8A
Other languages
Chinese (zh)
Inventor
郭杰
郭小璇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
QINGDAO TONGCHAN SOFTWARE TECHNOLOGY Co Ltd
Original Assignee
QINGDAO TONGCHAN SOFTWARE TECHNOLOGY Co Ltd
Application filed by QINGDAO TONGCHAN SOFTWARE TECHNOLOGY Co Ltd
Priority to CN201510033871.8A
Publication of CN105865438A
Legal status: Pending

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides an autonomous precise positioning system based on machine vision for indoor mobile robots. The system comprises a positioning controller inside the mobile robot, an image-acquisition camera whose optical axis is perpendicular to the ceiling plane, and a laser generator. The laser generator is arranged below the ceiling and projects a light pattern with distinctive features onto the portion of the ceiling corresponding to the robot's preset movement area. The image-acquisition camera is connected to a microprocessor through a communication interface; it acquires, in real time, the light pattern on the ceiling directly above the robot and uploads it to the microprocessor. Without altering the original appearance of the ceiling, the system uses the laser generator to project a distinctive light pattern onto it, increasing the ceiling's identifiable features, and achieves positioning by image processing and image matching. The system has a wide range of applications and high positioning accuracy.

Description

Autonomous precise positioning system for mobile robots based on machine vision
Technical field
The invention belongs to the field of mobile-robot localization, and in particular relates to an autonomous precise positioning system for mobile robots based on machine vision.
Background technology
Indoor positioning is increasingly practical and necessary in specific settings, and has broad application prospects and room for growth. In complex environments such as libraries, gymnasiums, underground garages and goods warehouses, it enables rapid localization of personnel and articles. For example, mobile robots used in automated production and warehouse management must accurately identify their current location in order to assemble, transfer and carry products precisely. At present, indoor mobile-robot localization methods generally fall into several categories: ultrasonic positioning, infrared positioning, ultra-wideband positioning, odometry, and 2D-barcode/barcode methods.
Most current ultrasonic positioning uses a reflective ranging method. The system consists of a main rangefinder and several electronic tags; the main rangefinder is mounted on the mobile robot body, while the tags are placed at fixed positions in the room. The positioning process is as follows: a host computer first sends a signal of the same frequency to each electronic tag; each tag reflects it back to the main rangefinder, from which the distance between each tag and the rangefinder, and hence the robot's coordinates, can be determined. A popular variant of ultrasonic indoor positioning mounts four ultrasonic sensors on the four sides of the mobile robot, partitions the space to be located, and forms coordinates by ultrasonic ranging; it offers strong interference immunity and high precision, and can also resolve the robot's heading. Positioning accuracy: ultrasonic positioning can reach centimetre level, which is comparatively high. Drawback: ultrasound attenuates markedly during propagation, which limits the effective positioning range.
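The reflective time-of-flight ranging just described reduces to simple arithmetic: the echo travels to the tag and back, so the one-way distance is half the round-trip time multiplied by the speed of sound. The following sketch is illustrative only (constant and function names are not from the patent):

```python
# Sketch of reflective time-of-flight (TOA) ranging as described above.
# The tag echoes the rangefinder's signal; distance = speed * time / 2.
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degC (assumed)

def toa_distance(round_trip_s: float) -> float:
    """Distance to a tag from the measured round-trip time in seconds."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

# With ranges to three or more fixed tags, the robot's position can
# then be trilaterated.
print(round(toa_distance(0.02), 3))  # a 20 ms round trip -> 3.43 m
```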
Infrared is electromagnetic radiation with wavelengths between radio waves and visible light. In a typical infrared indoor positioning system, Active Badges, the object to be located wears an electronic badge whose infrared transmitter periodically sends the object's unique ID to receivers fixed in the room; the receivers forward the data to a database over a wired network. This technique consumes considerable power, is easily blocked by indoor walls or objects, and therefore has limited practicality. Combining infrared with ultrasound also yields a convenient positioning scheme: an infrared trigger signal makes the ultrasonic transmitter at a reference point emit toward the measured point, and a basic TOA (time-of-arrival) algorithm with a timer performs ranging and positioning. This reduces power consumption on the one hand and avoids the short range of reflective ultrasonic positioning on the other, so the two technologies complement each other. Positioning accuracy: 5 to 10 m. Drawbacks: infrared is easily blocked by objects or walls during transmission and its range is short; the positioning system is complex; and its effectiveness and practicality still lag other techniques.
Odometry, also called dead reckoning, measures small increments of distance with encoders fitted to the two wheels, computes the change in the mobile robot's position and attitude, and accumulates these changes to localize the robot automatically. However, once a wheel slips while travelling, the encoder output cannot correct the resulting error; as time goes on the accumulated deviation grows and positioning accuracy degrades. Odometry is therefore suitable only over very short distances.
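The encoder-based dead reckoning described above can be sketched as the standard differential-drive pose update (a textbook formulation, not taken from the patent; all names are illustrative):

```python
import math

def odometry_step(x, y, theta, d_left, d_right, wheel_base):
    """Pose update from incremental encoder distances on the two wheels.
    Errors accumulate without bound once a wheel slips, which is why the
    text limits odometry to very short distances."""
    d = (d_left + d_right) / 2.0               # distance of the robot centre
    d_theta = (d_right - d_left) / wheel_base  # heading change
    x += d * math.cos(theta + d_theta / 2.0)
    y += d * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta

pose = (0.0, 0.0, 0.0)
pose = odometry_step(*pose, 0.10, 0.10, wheel_base=0.3)  # straight 10 cm
print(pose)
```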
The 2D-barcode/barcode localization method typically prints several 2D barcodes/barcodes on each label. Each barcode encodes two parts: a first-level code and a position-offset code. The first-level code locates the label's actual geographic position in the indoor environment; the offset code gives the offset of each barcode within the label. The first-level code is identical on all barcodes of one label, representing the label's indoor position, while the relative positions of the barcodes are expressed by their respective offsets. The scanner reads one complete barcode at each pass; when a blank area appears within the scanning range, the data on the left and right sides of the blank are spliced into one complete barcode. To determine the mobile robot's position, three data are superposed: the label position (absolute), the barcode position (relative) and the scanner position (relative), finally yielding the robot's actual indoor geographic position. The method still has shortcomings: barcode recognition is slow and unsuited to high-throughput mobile robotics; the barcode layout cannot be recognized from all directions; and its error tolerance is poor, its environmental requirements strict, and its cost high, which hinders adoption. Because one-dimensional codes append no error-correction codewords after the data codewords, a damaged symbol loses data: when a barcode wears after long service, or bends over uneven ground, the data can no longer be read correctly.
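The superposition of the three data (absolute label position, relative barcode position, relative scanner position) is plain vector addition. A hypothetical sketch, with all coordinate values invented for illustration:

```python
# Illustrative only: absolute label position + barcode offset on the
# label + scanner offset within the barcode give the floor position.
def robot_position(label_xy, code_offset, scan_offset):
    return tuple(a + b + c
                 for a, b, c in zip(label_xy, code_offset, scan_offset))

# label at (12.0, 4.0) m, barcode 5 cm/2 cm into the label,
# scanner a few millimetres into the barcode (all values hypothetical)
print(robot_position((12.0, 4.0), (0.05, 0.02), (0.003, -0.001)))
```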
An object of the invention is to overcome the deficiencies of the prior art by projecting a light pattern with identifiable features onto the ceiling with a laser generator and positioning by image-recognition technology, thereby providing a positioning system and method with high-speed reading and omnidirectional recognition.
Summary of the invention
An object of the invention is to overcome the deficiencies of the prior art and provide an autonomous precise positioning system for mobile robots based on machine vision that positions accurately, runs stably, and is convenient to implement.
The autonomous precise positioning system for mobile robots based on machine vision comprises a positioning controller arranged inside the mobile robot, an image-acquisition camera whose optical axis is perpendicular to the ceiling plane, and a laser generator, wherein:
The positioning controller comprises a microprocessor and a communication interface; the microprocessor is provided with a digital-map module, an image-processing module and an image-matching module.
The communication interface is connected with the image-acquisition camera; through it the microprocessor controls the camera to acquire images, receives the image data, and achieves precise positioning of the mobile robot via the image-processing and image-matching modules.
The laser generator is arranged below the ceiling and projects a light pattern with distinctive features onto the portion of the ceiling corresponding to the robot's preset activity area.
The image-acquisition camera is connected with the microprocessor through the communication interface; it acquires, in real time, the ceiling light pattern directly above the robot's current position and uploads it to the microprocessor.
The system further comprises a positioning display module connected with the microprocessor for displaying the robot's current position on the digital map in real time.
Further, the light the laser generator projects onto the ceiling corresponding to the robot's preset activity area may be visible light.
Further, the light the laser generator projects onto the ceiling corresponding to the robot's preset activity area may be invisible light.
Further, when the projected light is invisible, the lens of the image-acquisition camera carries an optical filter for blocking light outside the laser's emission wavelength.
Preferably, the communication interface is a network interface, a USB interface or an IEEE 1394 interface.
The autonomous precise positioning method for indoor mobile robots based on machine vision comprises the following steps:
Step a: system initialization:
S1: the laser generator projects a light pattern with distinctive features onto the ceiling corresponding to the robot's preset activity area;
S2: the digital-map module builds a digital map of the light pattern containing physical-coordinate information;
Step b: positioning:
S3: the image-acquisition camera acquires, in real time, the light-pattern image of the ceiling corresponding to the robot's position;
S4: the image-processing module pre-processes the acquired image;
S5: the image-matching module extracts the features of the image and searches the digital map for the physical coordinates matching those features;
S6: the image display shows the robot's current position on the digital map in real time.
Step S2 proceeds as follows:
A: the image-acquisition camera collects several light-pattern images of the robot's preset activity area in a set order, marking the physical ground coordinates at which each image is taken, and uploads these data to the digital-map module through the communication interface;
B: the digital-map module stitches the several light-pattern images into one complete image;
C: a coordinate origin is set, and the marked ground coordinates of each acquisition are registered in the image;
D: the formatted data of the digital map of the robot's preset activity area, containing the physical-coordinate information, are generated.
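Steps A-D build a map that pairs stored image features with physical coordinates, and the later matching step looks up the stored feature closest to the live image's feature. The abstract sketch below uses plain tuples as stand-ins for real feature descriptors; all names and data are illustrative, not the patent's map format:

```python
import math

# The "digital map" modelled as (feature_descriptor, physical_xy) pairs
# recorded during initialisation. Descriptors here are short tuples,
# a stand-in for real feature vectors such as SURF descriptors.
digital_map = [
    ((0.1, 0.9, 0.3), (0.0, 0.0)),   # coordinate origin
    ((0.8, 0.2, 0.5), (2.0, 0.0)),
    ((0.4, 0.4, 0.9), (2.0, 1.5)),
]

def locate(descriptor):
    """Return the physical coordinates of the nearest stored descriptor."""
    return min(digital_map,
               key=lambda entry: math.dist(entry[0], descriptor))[1]

print(locate((0.75, 0.25, 0.5)))  # nearest to the second entry: (2.0, 0.0)
```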
The advantages and positive effects of the invention are:
First, the laser generator projects a light pattern with identifiable features onto the ceiling, making the pattern convenient and clear to recognize; the image-acquisition camera mounted on the mobile robot collects images of the robot's preset activity area, a digital map containing physical-coordinate information is built, and positioning is achieved by image processing and image matching. The system is widely applicable and its positioning accuracy is high.
Second, the method depends on no auxiliary equipment such as 2D barcodes, barcodes or electronic tags; positioning is completed solely by acquiring and recognizing ceiling images, which improves the applicability, ease of use and effectiveness of indoor mobile-robot positioning technology.
Third, the design is reasonable: used together with the motion-control system, it makes the robot run smoothly, eliminates the pronounced jitter previously seen during operation, and gives a clear running direction, achieving an outstanding effect and significant progress in mobile-robot localization.
Brief description of the drawings
Fig. 1 is a structural diagram of the autonomous precise positioning system for mobile robots based on machine vision;
Fig. 2 is a structural diagram of embodiment 1;
Fig. 3 is the image acquired at position A in embodiment 1;
Fig. 4 is the image acquired at position B in embodiment 1.
Wherein:
1: image-acquisition camera; 2: ceiling light-pattern image; 3: image acquired at position A; 4: image acquired at position B; 5: laser generator.
Detailed description of the invention
The invention is further described below with reference to the accompanying drawings:
As shown in Fig. 1, the autonomous precise positioning system for mobile robots based on machine vision comprises a positioning controller arranged inside the mobile robot, an image-acquisition camera whose optical axis is perpendicular to the ceiling plane, and a laser generator. The positioning controller comprises a microprocessor and a communication interface; the microprocessor is provided with a digital-map module, an image-processing module and an image-matching module. The communication interface is a network interface, a USB interface or an IEEE 1394 interface. The communication interface is connected with the image-acquisition camera; through it the microprocessor controls the camera to acquire images, receives the image data, and achieves precise positioning of the mobile robot via the image-processing and image-matching modules. The laser generator is arranged below the ceiling and projects a light pattern with distinctive features onto the ceiling corresponding to the robot's preset activity area; the projected light may be visible or invisible. For example, infrared light may be emitted to form a pattern with identifiable features. When the projected light is invisible, the camera lens carries an optical filter that blocks wavelengths other than the laser generator's, so that the camera collects only the invisible light pattern the laser generator emits.
The image-acquisition camera is connected with the microprocessor through the communication interface; it acquires, in real time, the ceiling light pattern directly above the robot's current position and uploads it to the microprocessor. The system further comprises a positioning display module connected with the microprocessor for displaying the robot's current position on the digital map in real time.
The microprocessor is also connected with a motion-control module, which controls the direction and speed of the robot's movement according to the positioning information, thereby achieving navigation.
Embodiment 1:
This embodiment demonstrates the positioning process of the autonomous precise positioning system for mobile robots based on machine vision; as shown in Fig. 2, it proceeds in the following steps:
S1: laser generator 5 projects a visible light pattern 2 onto the ceiling corresponding to the robot's preset activity area;
S2: image-acquisition camera 1 collects light-pattern images of the robot's preset activity area and uploads them through the communication interface to the digital-map module, which builds a digital map containing coordinate information. Specifically: A: the image-acquisition module collects the complete image of the ceiling light pattern; B: the feature points of the image are extracted; C: a coordinate origin is set and each feature point is marked with its corresponding indoor two-dimensional physical coordinates; D: the formatted data of the digital map of the robot's preset activity area are generated.
S3: the image-acquisition camera collects, in real time, the light image corresponding to the robot's position; for example, camera 1 collects light image 3 at position A and light image 4 at position B, as shown in Fig. 3 and Fig. 4.
S4: the image-processing module performs grayscale conversion, denoising and feature-point extraction on the acquired image, as follows.
1: Grayscale conversion
The image is converted to grayscale, which minimizes the data volume while retaining the original image information to the greatest extent, improving image-processing speed. The weighted-average method is used for the conversion.
The three RGB components are weighted according to their importance and other indices; averaging them with the following weights yields a reasonable grayscale image:
F(i, j) = 0.30R(i, j) + 0.59G(i, j) + 0.11B(i, j)
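The weighted-average formula above applies per pixel. A pure-Python sketch for clarity (a production system would use an optimized library such as NumPy or OpenCV; the function name is illustrative):

```python
# Weighted-average grayscale conversion from the formula above:
# F(i, j) = 0.30*R + 0.59*G + 0.11*B, applied to each pixel.
def to_gray(rgb_image):
    """rgb_image: rows of (R, G, B) tuples -> rows of gray values."""
    return [[0.30 * r + 0.59 * g + 0.11 * b for (r, g, b) in row]
            for row in rgb_image]

# Pure red, green and blue pixels, weighted by perceptual importance:
img = [[(255, 0, 0), (0, 255, 0), (0, 0, 255)]]
print(to_gray(img))
```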
2: Image denoising and texture enhancement
The texture of the image is analysed with the Canny edge-detection algorithm. The Canny detector builds on the Sobel operator; its core is the use of two different thresholds, a low one and a high one, to decide which points belong to a contour. With good edge detection, the texture edge lines can be traced clearly.
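Canny's two-threshold (hysteresis) rule can be illustrated in one dimension: gradient values above the high threshold are edges, and values between the two thresholds survive only when connected to a strong edge. This simplified sketch is illustrative, not the patent's implementation; a real system would run the full 2-D algorithm (e.g. OpenCV's Canny):

```python
# 1-D hysteresis thresholding: strong points (>= high) are edges;
# weak points (>= low) become edges only if adjacent to an edge.
def hysteresis_1d(grad, low, high):
    edge = [g >= high for g in grad]
    changed = True
    while changed:                       # propagate along weak neighbours
        changed = False
        for i, g in enumerate(grad):
            if not edge[i] and g >= low:
                if (i > 0 and edge[i - 1]) or \
                   (i + 1 < len(grad) and edge[i + 1]):
                    edge[i] = changed = True
    return edge

# The isolated weak value 60 is dropped; 40 and 50 survive via the 90.
print(hysteresis_1d([10, 40, 90, 50, 10, 60, 20], low=30, high=80))
```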
3: Feature-point extraction
For matching features between images, we use SURF (Speeded-Up Robust Features).
SURF is implemented as follows: first, a Hessian matrix is computed at each pixel to obtain features. This matrix measures the local curvature of a function and is defined as:
H(x, y) = [ ∂²I/∂x²    ∂²I/∂x∂y ]
          [ ∂²I/∂x∂y   ∂²I/∂y²  ]
The determinant of this matrix gives the strength of the curvature, and corner points are defined as image points with high local curvature (i.e. high curvature in several directions). Since the matrix consists of second derivatives, it can be computed with Laplacian-of-Gaussian kernels of different scales σ, so the Hessian becomes a function of three variables, H(x, y, σ). A point is a scale-invariant feature when its Hessian response reaches a local maximum in the spatial domain and the scale domain simultaneously.
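The Hessian response described above can be approximated with finite differences: det(H) = Ixx·Iyy − Ixy², which is large at blob-like points with curvature in several directions. This is a simplified stand-in for SURF's box-filter approximation, for illustration only:

```python
# Finite-difference Hessian determinant of image intensity I at (x, y).
def hessian_det(I, x, y):
    Ixx = I[y][x + 1] - 2 * I[y][x] + I[y][x - 1]      # d2I/dx2
    Iyy = I[y + 1][x] - 2 * I[y][x] + I[y - 1][x]      # d2I/dy2
    Ixy = (I[y + 1][x + 1] - I[y + 1][x - 1]
           - I[y - 1][x + 1] + I[y - 1][x - 1]) / 4.0  # d2I/dxdy
    return Ixx * Iyy - Ixy ** 2

# A bright spot on a dark background gives a strong, blob-like response.
I = [[0, 0, 0],
     [0, 9, 0],
     [0, 0, 0]]
print(hessian_det(I, 1, 1))  # prints 324.0
```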
S5: the image-matching module extracts the features of the image and searches the digital map for the matching physical coordinates: the feature points of the image are first detected and extracted, then matched in the digital map.
The general problem of point-pattern matching in the plane is to decide whether two point sets match under an affine transformation. Let the first point set be the model points, with n points, and the second be the image points, with m points. Suppose the second set is obtained from the first by some affine transformation; because of noise, the relative positions of points change slightly, some points of the first set (called missing points) may have no counterpart in the second, and the second set may contain random new points (called spurious points).
The Hausdorff distance is one measure of the similarity between two point sets, defined in terms of the distances between them. Given two sets A = {a1, …, ap} and B = {b1, …, bq}, the Hausdorff distance between them is defined as
H(A, B) = max(h(A, B), h(B, A))    (1)
where
h(A, B) = max_{a∈A} min_{b∈B} ‖a − b‖    (2)
h(B, A) = max_{b∈B} min_{a∈A} ‖b − a‖    (3)
and ‖·‖ is a norm on the distances between points of A and B.
Here, formula (1) is called the bidirectional Hausdorff distance, the most basic form; h(A, B) and h(B, A) in formulas (2) and (3) are the one-way Hausdorff distances from A to B and from B to A. That is, h(A, B) takes, for each point ai in A, the distance ‖ai − bj‖ to its nearest point bj in B, and then takes the maximum of these nearest-neighbour distances as its value; h(B, A) is obtained analogously.
By formula (1), the bidirectional Hausdorff distance H(A, B) is the larger of the two one-way distances h(A, B) and h(B, A); it measures the greatest degree of mismatch between the two point sets.
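Formulas (1)-(3) translate directly into code. A minimal sketch using Euclidean distance (function names follow the formulas; the example point sets are invented):

```python
import math

# Directed distance (2)/(3): largest nearest-neighbour distance from A into B.
def h(A, B):
    return max(min(math.dist(a, b) for b in B) for a in A)

# Bidirectional Hausdorff distance (1): the worse of the two directions.
def hausdorff(A, B):
    return max(h(A, B), h(B, A))

A = [(0, 0), (1, 0)]
B = [(0, 0), (1, 0), (1, 3)]        # B has one spurious point, (1, 3)
print(h(A, B), h(B, A), hausdorff(A, B))  # 0.0 3.0 3.0
```

Note the asymmetry: every point of A lies in B, so h(A, B) = 0, but the spurious point (1, 3) drives h(B, A), and hence H(A, B), up to 3.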
Using SURF features and descriptors, scale-invariant matching can be achieved. The algorithm defines a position and a scale for each detected feature; the scale value defines the size of the window around the feature point, so that whatever the object's scale, the window contains the same visual information. This information is used for feature-point matching and determines the physical coordinates of image 3 acquired at position A and of image 4 acquired at position B.
S6: the image display shows positions A and B on the digital map; other positions are processed in the same way, so the image display can show the mobile robot's coordinates and position on the digital map in real time, providing orientation support for the motion-control system's navigation of the mobile robot.
Embodiment 2
This embodiment differs from embodiment 1 in that the light projected by laser generator 5 is invisible; the image-acquisition camera is fitted with an optical filter that blocks light of other wavelengths and passes the invisible light the laser generator emits.
The above is only a preferred embodiment of the invention and does not limit the scope of its implementation; all simple equivalent changes and modifications made according to the claims and the description of the invention still fall within the scope of the patent.

Claims (7)

1. An autonomous precise positioning system for mobile robots based on machine vision, characterised by comprising a positioning controller arranged inside the mobile robot, an image-acquisition camera whose optical axis is perpendicular to the ceiling plane, and a laser generator, wherein:
the positioning controller comprises a microprocessor and a communication interface, the microprocessor being provided with a digital-map module, an image-processing module and an image-matching module;
the communication interface is connected with the image-acquisition camera; the microprocessor controls the camera through the communication interface to acquire images, receives the image data, and achieves precise positioning of the mobile robot via the image-processing and image-matching modules;
the laser generator is arranged below the ceiling and projects a light pattern with distinctive features onto the ceiling corresponding to the robot's preset activity area;
the image-acquisition camera is connected with the microprocessor through the communication interface, acquires in real time the ceiling light pattern directly above the robot's current position, and uploads it to the microprocessor;
and the system further comprises a positioning display module connected with the microprocessor for displaying the robot's current position on the digital map in real time.
2. The autonomous precise positioning system for mobile robots based on machine vision according to claim 1, characterised in that the light the laser generator projects onto the ceiling corresponding to the robot's preset activity area is visible light.
3. The autonomous precise positioning system for mobile robots based on machine vision according to claim 1, characterised in that the light the laser generator projects onto the ceiling corresponding to the robot's preset activity area is invisible light.
4. The autonomous precise positioning system for mobile robots based on machine vision according to claim 3, characterised in that the lens of the image-acquisition camera carries an optical filter for blocking light outside the laser's emission wavelength.
5. The autonomous precise positioning system for mobile robots based on machine vision according to claim 1, characterised in that the communication interface is a network interface, a USB interface or an IEEE 1394 interface.
6. An autonomous precise positioning method for mobile robots based on machine vision, characterised by comprising the following steps:
Step a: system initialization:
S1: the laser generator projects a light pattern with distinctive features onto the ceiling corresponding to the robot's preset activity area;
S2: the digital-map module builds a digital map of the light pattern containing physical-coordinate information;
Step b: positioning:
S3: the image-acquisition camera acquires, in real time, the light-pattern image of the ceiling corresponding to the robot's position;
S4: the image-processing module pre-processes the acquired image;
S5: the image-matching module extracts the features of the image and searches the digital map for the physical coordinates matching those features;
S6: the image display shows the robot's current position on the digital map in real time.
7. The autonomous precise positioning method for indoor mobile robots based on machine vision according to claim 6, characterised in that step S2 proceeds as follows:
A: the image-acquisition camera collects several light-pattern images of the robot's preset activity area in a set order, marking the physical ground coordinates at which each image is taken, and uploads these data to the digital-map module through the communication interface;
B: the digital-map module stitches the several light-pattern images into one complete image;
C: a coordinate origin is set, and the marked ground coordinates of each acquisition are registered in the image;
D: the formatted data of the digital map of the robot's preset activity area, containing the physical-coordinate information, are generated.
CN201510033871.8A 2015-01-22 2015-01-22 Autonomous precise positioning system based on machine vision for indoor mobile robots Pending CN105865438A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510033871.8A CN105865438A (en) 2015-01-22 2015-01-22 Autonomous precise positioning system based on machine vision for indoor mobile robots


Publications (1)

Publication Number Publication Date
CN105865438A true CN105865438A (en) 2016-08-17

Family

ID=56623404


Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106370188A (en) * 2016-09-21 2017-02-01 旗瀚科技有限公司 Robot indoor positioning and navigation method based on 3D camera
CN106444774A (en) * 2016-11-01 2017-02-22 西安理工大学 Indoor lamp based mobile robot visual navigation method
CN107843258A (en) * 2017-10-17 2018-03-27 深圳悉罗机器人有限公司 Indoor locating system and method
CN108115722A (en) * 2016-11-28 2018-06-05 沈阳新松机器人自动化股份有限公司 Mobile robot precision detection system and method
CN108181610A (en) * 2017-12-22 2018-06-19 鲁东大学 Position Method for Indoor Robot and system
CN108414980A (en) * 2018-02-12 2018-08-17 东南大学 A kind of indoor positioning device based on dotted infrared laser
CN109269477A (en) * 2018-10-08 2019-01-25 塔米智能科技(北京)有限公司 A kind of vision positioning method, device, equipment and storage medium
CN109827575A (en) * 2019-01-28 2019-05-31 深圳市普渡科技有限公司 Robot localization method based on positioning identifier
CN110450167A (en) * 2019-08-27 2019-11-15 南京涵曦月自动化科技有限公司 A kind of robot infrared laser positioning motion trail planning method
CN112468736A (en) * 2020-10-26 2021-03-09 珠海市一微半导体有限公司 Ceiling vision robot capable of intelligently supplementing light and control method thereof
CN113124850A (en) * 2019-12-30 2021-07-16 北京极智嘉科技股份有限公司 Robot, map generation method, electronic device, and storage medium
CN114947652A (en) * 2019-03-21 2022-08-30 深圳阿科伯特机器人有限公司 Navigation and cleaning area dividing method and system, and moving and cleaning robot

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101082912A (en) * 2006-06-01 2007-12-05 上海杰图软件技术有限公司 Method for annotating electronic map through photograph collection having position information
US20090323121A1 (en) * 2005-09-09 2009-12-31 Robert Jan Valkenburg A 3D Scene Scanner and a Position and Orientation System
CN101957194A (en) * 2009-07-16 2011-01-26 北京石油化工学院 Rapid visual orientation and remote monitoring system and method based on embedded mobile robot
CN102818568A (en) * 2012-08-24 2012-12-12 中国科学院深圳先进技术研究院 Positioning and navigation system and method of indoor robot
CN103197679A (en) * 2013-03-22 2013-07-10 长沙理工大学 Accurate positioning method for rail-type inspection robot

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106370188A (en) * 2016-09-21 2017-02-01 旗瀚科技有限公司 Robot indoor positioning and navigation method based on 3D camera
CN106444774B (en) * 2016-11-01 2019-06-18 西安理工大学 Mobile robot visual navigation method based on indoor illumination
CN106444774A (en) * 2016-11-01 2017-02-22 西安理工大学 Indoor lamp based mobile robot visual navigation method
CN108115722A (en) * 2016-11-28 2018-06-05 沈阳新松机器人自动化股份有限公司 Mobile robot precision detection system and method
CN107843258A (en) * 2017-10-17 2018-03-27 深圳悉罗机器人有限公司 Indoor positioning system and method
CN107843258B (en) * 2017-10-17 2020-01-17 深圳悉罗机器人有限公司 Indoor positioning system and method
CN108181610A (en) * 2017-12-22 2018-06-19 鲁东大学 Indoor robot positioning method and system
CN108414980A (en) * 2018-02-12 2018-08-17 东南大学 Indoor positioning device based on dotted infrared laser
CN109269477A (en) * 2018-10-08 2019-01-25 塔米智能科技(北京)有限公司 Vision positioning method, device, equipment and storage medium
CN109827575A (en) * 2019-01-28 2019-05-31 深圳市普渡科技有限公司 Robot localization method based on positioning identifier
CN114947652A (en) * 2019-03-21 2022-08-30 深圳阿科伯特机器人有限公司 Navigation and cleaning-area division method and system, and mobile cleaning robot
CN110450167A (en) * 2019-08-27 2019-11-15 南京涵曦月自动化科技有限公司 Robot motion trajectory planning method based on infrared laser positioning
CN113124850A (en) * 2019-12-30 2021-07-16 北京极智嘉科技股份有限公司 Robot, map generation method, electronic device, and storage medium
CN113124850B (en) * 2019-12-30 2023-07-28 北京极智嘉科技股份有限公司 Robot, map generation method, electronic device, and storage medium
CN112468736A (en) * 2020-10-26 2021-03-09 珠海市一微半导体有限公司 Ceiling-vision robot with intelligent supplementary lighting and control method thereof

Similar Documents

Publication Publication Date Title
CN105865438A (en) Autonomous precise positioning system based on machine vision for indoor mobile robots
CN105865419A (en) Autonomous precise positioning system and method based on ground characteristic for mobile robot
Nissimov et al. Obstacle detection in a greenhouse environment using the Kinect sensor
CN102773862B (en) Quick and accurate locating system used for indoor mobile robot and working method thereof
CN102735235B (en) Indoor mobile robot positioning system based on two-dimensional code
Alismail et al. Automatic calibration of a range sensor and camera system
CN103411553B (en) The quick calibrating method of multi-linear structured light vision sensors
Yu et al. An autonomous restaurant service robot with high positioning accuracy
CN202702247U (en) Rapid and accurate positioning system used for indoor mobile robot
US11614743B2 (en) System and method for navigating a sensor-equipped mobile platform through an environment to a destination
CN103512579A (en) Map building method based on thermal infrared camera and laser range finder
CN101726741A (en) Apparatus and method for extracting feature information of object and apparatus and method for creating feature map
CN105865437A (en) Autonomous and accurate positioning system of mobile robot based on RFID(Radio Frequency Identification) and method thereof
CN113052903A (en) Vision and radar fusion positioning method for mobile robot
CN102788572A (en) Method, device and system for measuring attitude of engineering machinery lifting hook
CN108459597A (en) Mobile electronic device and method for handling tasks in a mission area
CN109313822B (en) Virtual wall construction method and device based on machine vision, map construction method and movable electronic equipment
Yuan et al. An automated 3D scanning algorithm using depth cameras for door detection
CN103136525A (en) High-precision positioning method for special-shaped extended target by utilizing generalized Hough transformation
CN110827361A (en) Camera group calibration method and device based on global calibration frame
CN111964680A (en) Real-time positioning method of inspection robot
CN103679087B (en) Localization method based on radio-frequency technique
CN103196440B (en) M sequence discrete-type artificial signpost arrangement method and related mobile robot positioning method
Liu et al. Intensity image-based LiDAR fiducial marker system
Davies The dramatically changing face of computer vision

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 2016-08-17