CN206832260U - A vision-based mobile robot navigation system - Google Patents

A vision-based mobile robot navigation system

Info

Publication number
CN206832260U
CN206832260U CN201720492391.2U
Authority
CN
China
Prior art keywords
image
mobile robot
lane line
navigation system
vision
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201720492391.2U
Other languages
Chinese (zh)
Inventor
文生平
陈志鸿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT
Priority to CN201720492391.2U
Application granted
Publication of CN206832260U
Status: Expired - Fee Related
Anticipated expiration

Landscapes

  • Image Analysis (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The utility model discloses a vision-based mobile robot navigation system comprising a vision sensor, an image processor, a motion-control module and a mobile robot car body. The vision sensor captures scene images, the image processor analyzes them, and the motion-control module then issues commands that control the robot's motion. Lane lines are laid out in advance in the robot's working environment, and identifiers are placed at specific positions. The mobile robot captures images of its surroundings and applies a perspective transform to obtain an image that is orthographic to the ground. The resulting image is converted to grayscale and thresholded into a binary image, which is then segmented into an image containing only the identifier and an image containing only the lane line. An approximate lane trajectory is extracted from the lane-line image and tracked, and the identifier is then recognized to obtain position and acceleration information, achieving precise navigation of the mobile robot.

Description

A vision-based mobile robot navigation system
Technical field
The utility model relates to the field of robot navigation, and in particular to a vision-based mobile robot navigation system.
Background technology
Mobile robots can effectively improve workshop logistics, reduce labor costs and raise production efficiency, and are in wide demand in industries such as automotive manufacturing, food, printing and material transport.
A key technology for mobile robots is navigation. Mobile robot navigation already takes many forms, such as inertial navigation, magnetic-tape navigation, laser navigation and visual navigation. Robots using different navigation schemes have their own characteristics, which determine the flexibility and cost of the resulting logistics system. Inertial navigation relies on gyroscopes and photoelectric encoders, is easily disturbed, and its control performance is relatively poor; magnetic-tape navigation requires tape to be laid in advance, and its maintenance and modification costs are high; laser navigation requires additional reflectors, and laser sensors are expensive to purchase and maintain. Vision-guided mobile robots therefore have high practical value and broad application prospects.
The environment in which a mobile robot operates is usually complex, and because of the robot's own dynamics the images produced during motion must be processed in real time. Building an accurate environmental map therefore involves a huge amount of data, places very high demands on the hardware, is costly and is difficult to implement. Existing vision navigation methods based on QR codes and guide strips, although simple and fast, suffer in practice because the QR codes are easily affected by dust or stains, which lowers the recognition success rate; their effectiveness leaves much room for improvement.
The content of the invention
The purpose of the utility model is to overcome the above shortcomings and deficiencies of the prior art and to provide a vision-based mobile robot navigation system that is simple, reliable, responsive in real time and easy to control.
The utility model is achieved through the following technical solutions:
A vision-based mobile robot navigation system, comprising the following components:
Vision sensor;
Image processor;
Motion-control module;
Mobile robot car body;
The vision sensor, image processor and motion-control module are carried on the mobile robot car body. The vision sensor captures scene images in real time, the image processor analyzes the captured images, and the motion-control module then issues commands that control the robot's motion according to the analysis results.
The vision sensor is a USB camera mounted centrally at the front end of the mobile robot car body.
The USB camera is additionally provided with a matching LED light source.
The image processor is an ARM Cortex-A9.
The navigation method of the vision-based mobile robot navigation system comprises the following steps:
Step 1: while the mobile robot car body is traveling, the vision sensor captures an image of the ground environment in front of it.
A dark tape is laid in advance on the light-colored ground in the scene as the lane line, and several identifiers are placed at predetermined positions along the tape. Each identifier is surrounded by a black (or dark) ring. The identifiers carry position information and speed-change information for the mobile robot car body.
Step 2: the image processor applies a perspective transform to the image obtained in Step 1, converting it into an image in which the camera of the vision sensor is orthographic to the ground.
Step 3: the image obtained in Step 2 is denoised and converted to a grayscale image. The image is then thresholded into a binary image and segmented, yielding one ROI image containing only the identifier and one ROI image containing only the lane line.
In the ROI image containing only the lane line, two non-adjacent processing regions are specified on the image. All pixels of the designated pixel rows are traversed, so the midpoint of the lane line on the designated row in each of the two regions is easily obtained; connecting the two points gives a trajectory line approximating the lane line, and this trajectory is tracked to realize navigation.
In the ROI image containing only the identifier, the predefined identifier in the image is recognized to obtain position information and acceleration-change information.
Step 1 further comprises the following sub-steps:
Step a: first determine the mounting position and attitude of the vision sensor's camera, and adjust the brightness of the camera's matching light source according to the actual scene;
Step b: according to the light-source brightness just set, adjust the parameters of the vision sensor so that clear road images can be captured;
Step c: continuously capture clear environment images in real time and pass the captured images to the image processor for the next stage of image processing, as sketched below.
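A minimal capture-loop sketch of sub-steps a-c, assuming OpenCV and a USB camera at device index 0; the device index, resolution and the process_frame placeholder are illustrative assumptions, not details from the patent:

```python
import cv2

def process_frame(frame):
    # Placeholder for the image-processing pipeline of Steps 2-3 (S200-S400 below).
    pass

cap = cv2.VideoCapture(0)                # USB camera; device index 0 is an assumption
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)   # illustrative resolution
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)

while cap.isOpened():
    ok, frame = cap.read()               # step c: continuous real-time acquisition
    if not ok:
        break
    process_frame(frame)                 # hand each frame to the image processor

cap.release()
```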
Compared with the prior art, the utility model has the following advantages and effects:
The vision sensor captures scene images, the image processor analyzes them, and the motion-control module then issues commands that control the robot's motion. Lane lines are laid out in advance in the robot's working environment and identifiers are placed at specific positions. The mobile robot captures images of its surroundings and applies a perspective transform to obtain an image orthographic to the ground. The resulting image is converted to grayscale and thresholded into a binary image, which is then segmented into an image containing only the identifier and an image containing only the lane line. An approximate lane trajectory is extracted from the lane-line image and tracked, and the identifier is then recognized to obtain position and acceleration information, realizing the navigation function. The utility model greatly simplifies the control and tracking process of the mobile robot and achieves precise navigation.
Brief description of the drawings
Fig. 1 is a flow chart of the navigation method of the vision-based mobile robot navigation system.
Fig. 2 is a schematic diagram of the calibration for the camera perspective transform.
Fig. 3 shows the lane line and an identifier.
Fig. 4 is the binary image after thresholding.
Fig. 5 is a schematic diagram of the lane-line ROI image after segmentation.
Fig. 6 is a schematic diagram of the identifier ROI image after segmentation.
Fig. 7 is a schematic diagram of the processing regions in the segmented lane-line ROI image.
Fig. 8 is the approximate lane trajectory line obtained after image processing.
Fig. 9 is a schematic structural diagram of the vision-based mobile robot navigation system.
Embodiment
The utility model is described in more detail below with reference to a specific embodiment.
As shown in Fig. 1. Step S100: capture a scene image of the preset environment in which the mobile robot is located, and apply a perspective transform to the captured image, as shown in Fig. 2, so that it becomes an image in which the camera is orthographic to the ground. In the preset environment, lane lines are laid on the road in advance and several identifiers are placed at specific positions along the lane line; each identifier has a circular outline that contrasts strongly with the identifier's background, as shown in Fig. 3.
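A minimal sketch of the perspective transform in step S100, assuming OpenCV; the four point correspondences would come from the calibration illustrated in Fig. 2, and the coordinate values below are placeholders rather than the patent's calibration data:

```python
import cv2
import numpy as np

def to_birds_eye(frame):
    # Four reference points in the camera image and their ground-plane targets
    # (placeholder values; the real ones come from the Fig. 2 calibration).
    src = np.float32([[120, 300], [520, 300], [620, 470], [20, 470]])
    dst = np.float32([[0, 0], [400, 0], [400, 400], [0, 400]])
    M = cv2.getPerspectiveTransform(src, dst)
    # Warp so that the camera appears orthographic to the ground.
    return cv2.warpPerspective(frame, M, (400, 400))
```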
In this embodiment the lane line laid in advance on the road is 3 cm wide, and the circular outline of each identifier has an outer-circle radius of 10 cm and an inner diameter of 6 cm.
Step S100 above further comprises the following sub-steps:
Step S101: determine the camera mounting position and attitude, and adjust the brightness of the camera's matching light source according to the actual scene.
In this embodiment the camera is a USB camera mounted at a vertical height of 30 cm above the ground, with the camera's optical axis at a 45° angle to the level ground.
Step S102: adjust the camera parameters according to the light-source brightness just set, so that clear road images can be captured.
Step S103: continuously capture clear environment images in real time and pass the captured images to the microprocessor for the next stage of image processing.
In this embodiment the processor is an ARM Cortex-A9.
Step S200: denoise the image obtained in step S100, convert it to a grayscale image, and then apply global thresholding to turn the image into a binary image, as shown in Fig. 4.
In this embodiment the image obtained in step S100 is smoothed with a mean filter whose kernel size is 3 × 3; a sketch of this preprocessing follows.
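A minimal sketch of the S200 preprocessing under the embodiment's parameters (3 × 3 mean filter, global threshold), assuming OpenCV. The patent does not say how the threshold value is chosen, so Otsu's method is used here as one common choice; the inverted threshold makes the dark lane line white (255) on a black background, matching the pixel values described in step S401:

```python
import cv2

def to_binary(bird_eye):
    # 3x3 mean filter for noise reduction (embodiment parameter of step S200).
    smoothed = cv2.blur(bird_eye, (3, 3))
    gray = cv2.cvtColor(smoothed, cv2.COLOR_BGR2GRAY)
    # Global thresholding; Otsu's method picks the threshold automatically
    # (an assumption, since the patent does not specify the value).
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    return binary
```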
Step S300: segment the binary image obtained in step S200 to obtain one ROI image containing only the lane line and one ROI image containing only the identifier, as shown in Figs. 5 and 6.
Step S300 further comprises the following sub-steps:
Step S301: on the binary image obtained in step S200, detect the lane-line edges with a Hough transform algorithm, then segment the image to obtain the ROI image containing only the lane line.
Step S302: on the binary image obtained in step S200, detect circles with a Hough circle transform algorithm; when a circular contour is detected in the binary image, segment the image to obtain the ROI image containing only the identifier.
In this embodiment, line detection is performed with the progressive probabilistic Hough transform; the distance resolution of the line search is set to 1 and the minimum line length to 3 cm, to shorten the detection time. A sketch is given below.
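A minimal sketch of the S301 line detection, assuming OpenCV's probabilistic Hough transform. The pixels-per-centimeter scale, the accumulator threshold and the maximum line gap are illustrative assumptions used to express the embodiment's 3 cm minimum line length in pixels:

```python
import cv2
import numpy as np

PX_PER_CM = 4  # assumed scale of the bird's-eye image (pixels per centimeter)

def detect_lane_lines(binary):
    # Probabilistic Hough transform: distance resolution 1, minimum line length 3 cm.
    lines = cv2.HoughLinesP(binary, rho=1, theta=np.pi / 180, threshold=40,
                            minLineLength=3 * PX_PER_CM, maxLineGap=5)
    # Each row is one detected segment (x1, y1, x2, y2).
    return [] if lines is None else lines.reshape(-1, 4)
```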
In this embodiment the minimum circle diameter for the Hough circle transform is set to 9 cm and the maximum circle diameter to 11 cm, to shorten the detection time; a sketch follows.
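A corresponding sketch of the S302 circle detection, again assuming OpenCV. cv2.HoughCircles takes radius bounds, so the 9-11 cm limits are halved here, and the remaining parameters (minDist, param1, param2) are illustrative assumptions:

```python
import cv2
import numpy as np

PX_PER_CM = 4  # same assumed scale as above

def detect_identifier_circle(binary):
    # Hough circle transform with the embodiment's 9-11 cm size limits
    # converted from diameters to pixel radii.
    circles = cv2.HoughCircles(binary, cv2.HOUGH_GRADIENT, dp=1,
                               minDist=20 * PX_PER_CM, param1=100, param2=20,
                               minRadius=int(9 * PX_PER_CM / 2),
                               maxRadius=int(11 * PX_PER_CM / 2))
    if circles is None:
        return None            # no identifier in this frame
    x, y, r = np.around(circles[0][0]).astype(int)
    return x, y, r             # center and radius of the detected ring
```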
Step S400: using the ROI image containing only the lane line and the ROI image containing only the identifier obtained in step S300, track the lane line for navigation and obtain the mobile robot's current position information and acceleration information.
Step S400 further comprises the following sub-steps:
Step S401: in the ROI image containing only the lane line obtained in step S301, specify two non-adjacent processing regions on the image, as shown in Fig. 7, and traverse all pixels of a particular row in each region. Because the image is binary, the gray value on the lane line is 255 and the gray value off the lane line is 0, so the midpoint of the lane line on the chosen row in each of the two regions is easily obtained. Connecting the two points gives a trajectory line approximating the lane line, as shown in Fig. 8, and this trajectory is tracked to realize navigation; a sketch follows.
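A minimal sketch of the S401 midpoint search, assuming the binary bird's-eye image is a NumPy array. The two row indices and the slope/offset outputs are illustrative assumptions; the patent only states that one row is scanned in each of the two regions and that the two midpoints are connected:

```python
import numpy as np

def lane_midpoint(binary, row):
    # On the chosen row the lane-line pixels are 255 and all others are 0.
    cols = np.flatnonzero(binary[row] == 255)
    return None if cols.size == 0 else (cols[0] + cols[-1]) / 2.0

def lane_trajectory(binary, near_row=380, far_row=60):
    # One row in each of the two processing regions (row indices are assumptions).
    near, far = lane_midpoint(binary, near_row), lane_midpoint(binary, far_row)
    if near is None or far is None:
        return None
    # The line through the two midpoints approximates the lane line; its slope
    # and lateral offset from the image center can feed the motion-control module.
    slope = (far - near) / (far_row - near_row)
    offset = near - binary.shape[1] / 2.0
    return slope, offset
```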
Step S402: in the ROI image containing only the identifier obtained in step S302, recognize the specific identifier in the image to obtain position information and acceleration-change information; one possible implementation is sketched below.
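The patent does not specify how an identifier encodes its data, so the sketch below only illustrates one possible scheme: matching the identifier ROI against a small dictionary of predefined templates. The template file names, the associated (position, speed-change) values and the 0.7 acceptance score are all hypothetical:

```python
import cv2

# Hypothetical lookup table: template image file -> ((x, y) position, speed change).
IDENTIFIER_TABLE = {
    "marker_a.png": ((2.0, 0.0), +0.1),
    "marker_b.png": ((5.0, 1.5), -0.1),
}

def read_identifier(roi):
    best_name, best_score = None, 0.0
    for name in IDENTIFIER_TABLE:
        template = cv2.imread(name, cv2.IMREAD_GRAYSCALE)
        template = cv2.resize(template, (roi.shape[1], roi.shape[0]))
        score = cv2.matchTemplate(roi, template, cv2.TM_CCOEFF_NORMED).max()
        if score > best_score:
            best_name, best_score = name, score
    # Accept the best match only above a confidence threshold (0.7 is an assumption).
    return IDENTIFIER_TABLE[best_name] if best_score > 0.7 else None
```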
In this embodiment the identifier recognition rate is high and the lane-line tracking performance is good.
As described above, the utility model can be well realized.
The embodiments of the utility model are not limited to the embodiment described above. Any other change, modification, substitution, combination or simplification made without departing from the spirit and principles of the utility model is an equivalent replacement and falls within the scope of protection of the utility model.

Claims (4)

1. A vision-based mobile robot navigation system, characterized in that it comprises the following components:
Vision sensor;
Image processor;
Motion-control module;
Mobile robot car body;
the vision sensor, the image processor and the motion-control module being carried on the mobile robot car body; the vision sensor captures scene images in real time, the image processor analyzes the captured images, and the motion-control module then issues commands that control the robot's motion according to the analysis results.
2. The vision-based mobile robot navigation system according to claim 1, characterized in that the vision sensor is a USB camera mounted centrally at the front end of the mobile robot car body.
3. The vision-based mobile robot navigation system according to claim 2, characterized in that the USB camera is additionally provided with a matching LED light source.
4. The vision-based mobile robot navigation system according to claim 3, characterized in that the image processor is an ARM Cortex-A9.
CN201720492391.2U 2017-05-05 2017-05-05 A vision-based mobile robot navigation system Expired - Fee Related CN206832260U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201720492391.2U CN206832260U (en) 2017-05-05 2017-05-05 A vision-based mobile robot navigation system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201720492391.2U CN206832260U (en) 2017-05-05 2017-05-05 A vision-based mobile robot navigation system

Publications (1)

Publication Number Publication Date
CN206832260U true CN206832260U (en) 2018-01-02

Family

ID=60770386

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201720492391.2U Expired - Fee Related CN206832260U (en) 2017-05-05 2017-05-05 A vision-based mobile robot navigation system

Country Status (1)

Country Link
CN (1) CN206832260U (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109032125A (en) * 2018-05-31 2018-12-18 上海工程技术大学 A kind of air navigation aid of vision AGV
CN108898589A (en) * 2018-06-19 2018-11-27 南通大学 The quick-fried pearl intelligent detecting method of filter stick based on high speed machines vision
CN108898589B (en) * 2018-06-19 2022-06-07 南通大学 Filter rod bead explosion intelligent detection method based on high-speed machine vision
CN110414511A (en) * 2019-07-30 2019-11-05 深圳市普渡科技有限公司 Cooperate sign and system
CN110414511B (en) * 2019-07-30 2022-05-03 深圳市普渡科技有限公司 Cooperative sign recognition method and system for robot

Similar Documents

Publication Publication Date Title
CN107421540A (en) A kind of Mobile Robotics Navigation method and system of view-based access control model
CN112184818B (en) Vision-based vehicle positioning method and parking lot management system applying same
CN102789234B (en) Robot navigation method and robot navigation system based on color coding identifiers
CN103064417B (en) A kind of Global localization based on many sensors guiding system and method
CN106774313B (en) A kind of outdoor automatic obstacle-avoiding AGV air navigation aid based on multisensor
CN105700532B (en) The Intelligent Mobile Robot navigator fix control method of view-based access control model
CN110379168B (en) Traffic vehicle information acquisition method based on Mask R-CNN
CN206832260U (en) A kind of Navigation System for Mobile Robot of view-based access control model
CN104023228A (en) Self-adaptive indoor vision positioning method based on global motion estimation
CN108364466A (en) A kind of statistical method of traffic flow based on unmanned plane traffic video
CN110108269A (en) AGV localization method based on Fusion
JP2007316685A (en) Traveling path boundary detection device and traveling path boundary detection method
JP4967758B2 (en) Object movement detection method and detection apparatus
CN106444774B (en) Vision navigation method of mobile robot based on indoor illumination
CN116901089A (en) Multi-angle vision distance robot control method and system
JP6916975B2 (en) Sign positioning system and program
Mutka et al. A low cost vision based localization system using fiducial markers
CN110727269A (en) Vehicle control method and related product
CN107436610A (en) A kind of vehicle and robot delivery navigation methods and systems of intelligent outdoor environment
CN114782639A (en) Rapid differential latent AGV dense three-dimensional reconstruction method based on multi-sensor fusion
CN112530270B (en) Mapping method and device based on region allocation
CN108665473B (en) Visual guidance and visual odometer multiplexing method
CN203077301U (en) Real-time detection device for positions and angles of wheel type motion robot
Aqel et al. Estimation of image scale variations in monocular visual odometry systems
JP2020076714A (en) Position attitude estimation device

Legal Events

Date Code Title Description
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20180102

Termination date: 20210505