CN203053428U - Detection device of azimuth of optical imaging type wheeled mobile robot - Google Patents


Info

Publication number
CN203053428U
CN203053428U · CN 201220669496U
Authority
CN
China
Prior art keywords
mobile robot
image processor
wheeled mobile
ground
relative
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
CN 201220669496
Other languages
Chinese (zh)
Inventor
高宏
王庆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
UNIS CO Ltd
Original Assignee
UNIS CO Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by UNIS CO Ltd filed Critical UNIS CO Ltd
Priority to CN 201220669496 priority Critical patent/CN203053428U/en
Application granted granted Critical
Publication of CN203053428U publication Critical patent/CN203053428U/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The utility model relates to a device for detecting the azimuth of an optical-imaging wheeled mobile robot, and belongs to the technical field of motion positioning. The body of the robot carries a light source, a camera and an image processor. The camera photographs the ground and transmits the images to the image processor, which measures the change in relative position and orientation between each pair of adjacent frames and converts these values into the change in the robot's position and heading relative to the ground. Finally, all the increments are accumulated to give the robot's position and heading relative to its starting point. The method and device are simple in structure, convenient to use and low in cost; in principle the result is unaffected by elapsed time or by estimates of the robot's structural parameters, so the device offers very high long-term accuracy and real practical value.

Description

Detection device for the azimuth of an optical-imaging wheeled mobile robot
Technical field
The utility model relates to a device for detecting the azimuth of an optical-imaging wheeled mobile robot, and belongs to the technical field of motion positioning.
Background technology
With social development and scientific and technological progress, robots have found ever wider application in production and daily life. Wheeled mobile robots are widely used in industry, agriculture, the military, hospitals, homes, space exploration and other fields because they are light, carry large payloads, have simple mechanisms, are relatively easy to drive and control, travel fast, are maneuverable and work efficiently.
To complete an assigned task, a wheeled mobile robot must first perceive its current position and heading in real time while moving, continually compare them with the target position and heading, and adjust its drive mechanism accordingly until it reaches the target. Methods for detecting a mobile robot's position and heading fall into two classes: autonomous methods and reference-based methods. Reference-based methods determine position and heading from external references or signals such as magnetic strips, road signs and beacons (e.g. GPS signals), and therefore require those references or signals to be installed and maintained. Autonomous methods need no external references: they determine position and heading from the robot's motion relative to the ground surface, with the entire detection device mounted inside the robot. Because they rely on no external information during operation, they are largely immune to external interference.
Two autonomous detection methods are in common use:
(1) Measurement based on inertial sensors. Following Newton's law of inertia, accelerometers and gyros mounted inside the mobile robot measure its linear and angular acceleration relative to the ground, and the measurements are integrated to obtain the distance travelled and the change in heading. The drawbacks are that any constant acceleration error produces a position error proportional to the square of time, so even a tiny bias grows without bound and long-term accuracy is very poor; a long initial alignment is needed before each use; and inertial measurement equipment is complex and expensive.
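The quadratic error growth just described can be demonstrated in a few lines of numerical integration. This is an illustrative sketch, not part of the patent; the bias value is hypothetical:

```python
def position_error_from_bias(bias, t, dt=0.01):
    """Doubly integrate a constant acceleration bias (m/s^2) over t seconds
    with forward-Euler steps, returning the accumulated position error."""
    v = 0.0  # accumulated velocity error
    x = 0.0  # accumulated position error
    for _ in range(round(t / dt)):
        v += bias * dt  # first integration: bias becomes velocity error
        x += v * dt     # second integration: velocity becomes position error
    return x

# A 0.01 m/s^2 bias is negligible at first, but the position error
# roughly quadruples whenever the elapsed time doubles:
e10 = position_error_from_bias(0.01, 10)  # about 0.5 m after 10 s
e20 = position_error_from_bias(0.01, 20)  # about 2.0 m after 20 s
```

This is why the patent stresses that inertial dead-reckoning error "increases in time and infinitely": the error is proportional to t², not t.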
(2) Measurement based on encoders. Encoders mounted in the wheeled robot's drive system measure the rotation angles of the left and right driving wheels; the distance travelled by each wheel is derived from its rotation angle, and the robot's position and heading are then estimated from the two distances. The encoder-based method is comparatively simple in structure and low in cost, but because the actual diameters of the left and right wheels are never exactly equal and the effective track width also varies over time, its constant errors and measurement errors likewise accumulate, and its long-term accuracy is also very poor. If a driving wheel slips, the encoder-based method fails completely.
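The encoder-based dead reckoning described above is usually implemented as differential-drive odometry. The sketch below (standard textbook geometry, not taken from the patent) shows why the estimated wheel circumference and track width enter every pose update, so that any error in those estimates accumulates without bound:

```python
import math

def encoder_update(x, y, theta, d_left, d_right, track):
    """Advance the pose estimate (x, y, theta) by one sampling interval.

    d_left, d_right: distances travelled by the wheels, computed from encoder
    counts times the *estimated* wheel circumference; track: the *estimated*
    wheel spacing. Errors in both estimates bias every update.
    """
    d_center = (d_left + d_right) / 2.0        # distance of the body centre
    d_theta = (d_right - d_left) / track       # heading change
    # Move along the average heading of the interval:
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta += d_theta
    return x, y, theta

# Driving straight: both wheels advance 1 m, heading unchanged.
pose = encoder_update(0.0, 0.0, 0.0, 1.0, 1.0, 0.5)
```

If the true track width differs from the 0.5 m passed in, every turn is over- or under-rotated, which is exactly the accumulating heading error the paragraph describes.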
Both autonomous methods yield indirect measurements. The inertial result is calculated from the directly measured acceleration, and depends not only on that acceleration but also on the square of the elapsed time. The encoder result is calculated from the directly measured angular displacement, and depends additionally on parameters such as the drive-train gear ratio, the driving-wheel diameters and the wheel spacing. Estimation errors in the elapsed time and in these parameters therefore propagate into the position and heading results and accumulate continuously, so the error of both methods keeps growing and their long-term accuracy degrades.
Summary of the invention
The purpose of the utility model is to provide a device for detecting the azimuth of an optical-imaging wheeled mobile robot that overcomes the prior-art problem of positioning accuracy degrading as measurement errors accumulate without bound over time, and thereby to improve the long-term accuracy of position and heading detection for wheeled mobile robots.
The detection device for the azimuth of an optical-imaging wheeled mobile robot proposed by the utility model comprises:
a light source for illuminating the ground, fixed to the body of the wheeled mobile robot;
a camera for photographing the ground and sending the image signal to the image processor, fixed to the body of the wheeled mobile robot;
an image processor, connected to the camera, which receives the image signal output by the camera, preprocesses it, measures the change in relative position and orientation between each pair of adjacent frames, converts these values into the change in the robot's position and heading relative to the ground, and finally accumulates all the position and heading increments to obtain the robot's position and heading relative to its starting point.
The advantages of the proposed device are as follows. The measurement result depends only on the change in the robot's position and orientation relative to the ground; it is independent of the measurement time and of estimates of the robot's structural parameters, so numerous error sources are eliminated in principle. The position and heading result is the accumulation of many direct measurements whose errors are random; as the robot's displacement grows, the number of direct measurements grows with it, and the accumulated measurement error tends to zero. The method therefore has high long-term accuracy, solving the problem of existing detection techniques whose accumulating errors degrade long-term accuracy. Moreover, the device is simple in structure, convenient to use and inexpensive, and has real practical value.
Description of drawings
Fig. 1 is a schematic diagram of the structure of the detection device of the utility model.
Fig. 2 is a schematic diagram of the relationship between the coordinate system of the detection method and the ground coordinate system.
Fig. 3 shows the images acquired by the camera of the measuring device, where (a) is the first frame of a sampling interval and (b) is the second frame.
In Fig. 1, 1 is the wheeled mobile robot body, 2 is the image processor, 3 is the camera and 4 is the light source.
Embodiment
The structure of the detection device proposed by the utility model is shown in Fig. 1 and comprises:
a light source 4 for illuminating the ground, fixed to the body 1 of the wheeled mobile robot;
a camera 3 for photographing the ground and sending the image signal to the image processor, fixed to the body of the wheeled mobile robot;
an image processor 2, connected to the camera, which receives the image signal output by camera 3, preprocesses it, measures the change in relative position and orientation between each pair of adjacent frames, converts these values into the change in the robot's position and heading relative to the ground, and finally accumulates all the position and heading increments to obtain the robot's position and heading relative to its starting point.
In the detection method, the image processor receives the ground images output by the camera. Two feature points M and N are selected in the first frame, and their coordinates (X_M0, Y_M0) and (X_N0, Y_N0) in the first frame and (X_M1, Y_M1) and (X_N1, Y_N1) in the second frame are measured. The processor computes the angle Δθ_1 between the line through (X_M0, Y_M0) and (X_N0, Y_N0) and the line through (X_M1, Y_M1) and (X_N1, Y_N1), together with the displacement of point M relative to the body, ΔX_1 = X_M1 − X_M0 and ΔY_1 = Y_M1 − Y_M0. From Δθ_1, ΔX_1 and ΔY_1 it calculates the translation increments Δg_1 and Δh_1 of the robot relative to the ground at the time of the second frame. Finally, accumulating the translation increments Δg_i and Δh_i and the rotation increments Δθ_i over i consecutive sampling periods gives the position of the body relative to the origin of the ground coordinate system at the i-th sampling instant, g = Δg_1 + Δg_2 + … + Δg_i and h = Δh_1 + Δh_2 + … + Δh_i, and the azimuth θ = Δθ_1 + Δθ_2 + … + Δθ_i. Each translation increment Δg, Δh depends on the position X, Y of ground point M in the body XOY coordinate system at the previous instant and its change ΔX, ΔY, and on the rotation angle θ of the body XOY system relative to the ground xoy system at the previous instant and its change Δθ.
In Fig. 3, (a) is the first frame of a sampling interval and (b) is the second. M and N are two feature points on the ground; between the two adjacent digital frames their coordinates change from (X_M0, Y_M0) and (X_N0, Y_N0) to (X_M1, Y_M1) and (X_N1, Y_N1). α is the inclination of line MN in the first frame and β its inclination in the second frame. The deflection angle Δθ_1 of line MN in the body XOY coordinate system is exactly the rotation Δθ_1 of the body relative to the ground, and can be computed from the angle-between-two-lines formula:
Δθ_1 = arctan[ |tan α − tan β| / (1 + tan α · tan β) ]
where tan α = (Y_N0 − Y_M0)/(X_N0 − X_M0) and tan β = (Y_N1 − Y_M1)/(X_N1 − X_M1).
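The rotation between frames can be computed directly from the two feature points. The sketch below (hypothetical helper name, not from the patent) uses atan2 of the MN segment direction; for segments that rotate less than 90° this agrees with the tan-based formula above, while also handling vertical segments, where the slope is undefined:

```python
import math

def heading_change(m0, n0, m1, n1):
    """Angle (radians) by which segment MN rotated between two frames.

    m0, n0: (x, y) of points M and N in the first frame;
    m1, n1: the same points in the second frame.
    """
    alpha = math.atan2(n0[1] - m0[1], n0[0] - m0[0])  # inclination, frame 1
    beta = math.atan2(n1[1] - m1[1], n1[0] - m1[0])   # inclination, frame 2
    d = beta - alpha
    return math.atan2(math.sin(d), math.cos(d))       # wrap to (-pi, pi]

# MN horizontal in frame 1, rotated by 30 degrees in frame 2:
dtheta = heading_change((0.0, 0.0), (1.0, 0.0),
                        (0.0, 0.0), (math.cos(math.pi / 6), math.sin(math.pi / 6)))
```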
Within one sampling interval, the change in point M's X coordinate is ΔX_1 = X_M1 − X_M0 and the change in its Y coordinate is ΔY_1 = Y_M1 − Y_M0.
If the body XOY coordinate system coincides with the ground xoy coordinate system at the initial position, the image processor computes the translation increments Δg_1 and Δh_1 of the body from X_M0, ΔX_1, Y_M0, ΔY_1 and Δθ_1. Measuring and computing Δg_i and Δh_i continuously (i = 1, 2, 3, …), the position and heading of the robot body at the i-th sampling instant are:
g = Δg_1 + Δg_2 + … + Δg_i
h = Δh_1 + Δh_2 + … + Δh_i
θ = Δθ_1 + Δθ_2 + … + Δθ_i
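The three running sums above amount to a simple accumulation loop. A minimal sketch with illustrative names and hypothetical increment values, not taken from the patent:

```python
def accumulate(increments):
    """Sum per-interval increments (dg, dh, dtheta) into the pose relative
    to the starting point, mirroring the three sums g, h and theta."""
    g = h = theta = 0.0
    for dg, dh, dtheta in increments:
        g += dg
        h += dh
        theta += dtheta
    return g, h, theta

# Three hypothetical sampling intervals:
total_pose = accumulate([(0.10, 0.00, 0.01),
                         (0.10, 0.02, 0.01),
                         (0.05, 0.00, -0.02)])
```

Because each increment is a direct measurement with random error, the accumulated error averages out as the number of intervals grows, which is the basis of the long-term accuracy claim above.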
In the position and heading detection device of the utility model, the illumination source 4 is a TLHG520 infrared light-emitting diode from Vishay Semiconductors, camera 3 is a CCD (Charge Coupled Device) digital camera with a resolution of 2048 × 1536, image processor 2 is a TMS320 digital signal processor from TI, and the image sampling rate is set to 6 frames per second.
The working principle of the position and heading detection device of the utility model is as follows:
Illumination source 4 lights the ground, bringing out its textural features and enhancing ground detail.
Because the image displacement seen by camera 3 is proportional to the true ground displacement, the camera must first be calibrated. Five reference points with known coordinate values are marked on the ground, located at the four corners and the centre of camera 3's field of view. Image processor 2 extracts, by image processing, the coordinates of each reference point in the camera 3 coordinate system and compares them with the points' coordinates in the ground coordinate system to obtain the proportional relationship between the two systems.
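One simple way to realize the calibration step described above, assuming the "proportionate relationship" reduces to a single metres-per-pixel scale (the patent does not spell out the computation; the point values below are hypothetical):

```python
def estimate_scale(image_pts, ground_pts):
    """Metres per pixel, fitted by least squares over matched point pairs.

    image_pts, ground_pts: corresponding (x, y) points in the camera image
    (pixels) and on the ground (metres); the first pair is the common origin.
    """
    ix0, iy0 = image_pts[0]
    gx0, gy0 = ground_pts[0]
    num = den = 0.0
    for (ix, iy), (gx, gy) in zip(image_pts[1:], ground_pts[1:]):
        d_img = ((ix - ix0) ** 2 + (iy - iy0) ** 2) ** 0.5  # pixel distance
        d_gnd = ((gx - gx0) ** 2 + (gy - gy0) ** 2) ** 0.5  # ground distance
        num += d_img * d_gnd
        den += d_img * d_img
    return num / den  # least-squares slope of ground distance vs pixels

# Centre plus four corners of a 2048x1536 image viewing a 0.4 m x 0.3 m patch:
scale = estimate_scale(
    [(1024, 768), (0, 0), (2048, 0), (0, 1536), (2048, 1536)],
    [(0.2, 0.15), (0.0, 0.0), (0.4, 0.0), (0.0, 0.3), (0.4, 0.3)],
)
```

A full calibration would also correct lens distortion and any perspective tilt; this sketch covers only the uniform-scale case.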
Camera 3 photographs the ground at 6 frames per second and sends the digital image signal to image processor 2. The processor first filters the digital image to remove noise, then enhances it to bring out ground feature points. It then selects ground feature points, measures the change in position ΔX_i and ΔY_i of the same feature points between two successive frames and the change Δθ_i in the body's azimuth relative to the ground coordinate system, and calculates the translation increments Δg_i and Δh_i of the body relative to the ground. Accumulating Δg_i, Δh_i and Δθ_i separately yields the robot's current position (g, h) and azimuth θ relative to the origin of the ground coordinate system.

Claims (1)

1. A detection device for the azimuth of an optical-imaging wheeled mobile robot, characterized in that the device comprises:
a light source for illuminating the ground, fixed to the body of the wheeled mobile robot;
a camera for photographing the ground and sending the image signal to the image processor, fixed to the body of the wheeled mobile robot;
an image processor, connected to the camera, which receives the image signal output by the camera, preprocesses it, measures the change in relative position and orientation between each pair of adjacent frames, converts these values into the change in the robot's position and heading relative to the ground, and finally accumulates all the position and heading increments to obtain the robot's position and heading relative to its starting point.
CN 201220669496 2012-12-06 2012-12-06 Detection device of azimuth of optical imaging type wheeled mobile robot Expired - Lifetime CN203053428U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201220669496 CN203053428U (en) 2012-12-06 2012-12-06 Detection device of azimuth of optical imaging type wheeled mobile robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 201220669496 CN203053428U (en) 2012-12-06 2012-12-06 Detection device of azimuth of optical imaging type wheeled mobile robot

Publications (1)

Publication Number Publication Date
CN203053428U true CN203053428U (en) 2013-07-10

Family

ID=48736372

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201220669496 Expired - Lifetime CN203053428U (en) 2012-12-06 2012-12-06 Detection device of azimuth of optical imaging type wheeled mobile robot

Country Status (1)

Country Link
CN (1) CN203053428U (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102980555A (en) * 2012-12-06 2013-03-20 紫光股份有限公司 Method and device for detecting direction of optical imaging type wheeled mobile robot
CN112783147A (en) * 2019-11-11 2021-05-11 科沃斯机器人股份有限公司 Trajectory planning method and device, robot and storage medium
US20210170934A1 (en) * 2019-01-21 2021-06-10 Kevin Arnold Morran Mobile Lighting System
CN113551651A (en) * 2016-01-05 2021-10-26 上海筑邦测控科技有限公司 Inclination angle sensor based on drop hammer position video identification technology



Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant
CX01 Expiry of patent term

Granted publication date: 20130710
