CN110400348B - Method for detecting and calibrating steering of vibrating wheel of double-cylinder unmanned equipment based on vision - Google Patents


Info

Publication number: CN110400348B (application CN201910554219.9A)
Authority: CN (China)
Prior art keywords: steering angle, image, double, steering, vibrating wheel
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Other languages: Chinese (zh)
Other versions: CN110400348A
Inventors: 谢辉, 孙一铭, 周扬
Current and original assignee: Tianjin University (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Filing: application filed by Tianjin University; priority to CN201910554219.9A; publication of application CN110400348A, then grant and publication of CN110400348B

Classifications

    • G PHYSICS → G06 COMPUTING; CALCULATING OR COUNTING → G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration → G06T5/80 Geometric correction
    • G06T7/00 Image analysis → G06T7/10 Segmentation; Edge detection → G06T7/11 Region-based segmentation
    • G06T7/10 Segmentation; Edge detection → G06T7/13 Edge detection
    • G06T7/10 Segmentation; Edge detection → G06T7/136 Involving thresholding
    • G06T7/10 Segmentation; Edge detection → G06T7/168 Involving transform domain methods
    • G06T7/60 Analysis of geometric attributes → G06T7/66 Image moments or centre of gravity
    • G06T7/70 Determining position or orientation of objects or cameras → G06T7/73 Using feature-based methods
    • G06T2207/00 Indexing scheme for image analysis or enhancement → G06T2207/20 Special algorithmic details → G06T2207/20048 Transform domain processing → G06T2207/20061 Hough transform

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a vision-based method for detecting and calibrating the steering of the vibrating wheel of double-cylinder unmanned equipment. An industrial three-proofing camera is used together with color features arranged on the cross beam at the front of the vibrating wheel of the double-cylinder unmanned equipment. Visual perception first detects the steering angle of the vibrating wheel, and the obtained angle is then numerically filtered and calibrated, improving the stability and precision of steering angle detection. The technique innovates on both the physical structure and the method principle, and solves the inaccurate and unstable steering angle detection of double-cylinder unmanned equipment caused in the prior art by sensor installation problems, signal fluctuation, GPS signal drift and the like.

Description

Method for detecting and calibrating steering of vibrating wheel of double-cylinder unmanned equipment based on vision
Technical Field
The invention belongs to the field of unmanned engineering machinery, relates to an image processing technology, and particularly relates to a method for detecting and calibrating steering of a vibrating wheel of double-cylinder unmanned equipment based on vision.
Background
In recent years, with the continuous progress of artificial intelligence and unmanned-driving technology, modern engineering machinery has been developing rapidly towards automation, informatization and intelligence. Double-cylinder engineering machines play an important role in construction such as road and dam building, but the working environment there is often very severe, and field constructors must perform heavy, repetitive, high-intensity driving of double-cylinder equipment for long periods. On the one hand, the continuous and tedious work reduces the constructors' efficiency, so rolling construction quality cannot be guaranteed; on the other hand, to meet the compaction requirements of construction work, the vibration excitation mechanism of the machine generates severe vibration that causes physical discomfort to the operators. To protect the health of constructors, reduce the operating cost of double-cylinder equipment and improve its working efficiency, realizing unmanned operation of double-cylinder equipment is therefore of great significance.
Take a road roller as an example. In current research on unmanned road roller technology, accurate steering control of the vehicle is a key problem. Accurate steering control at the next moment is achieved by detecting the steering angle of the vehicle at the current moment with a specific technique, then computing and correcting the steering angle for the current trajectory by combining the preset desired trajectory with heading, position and other sensor information. Detection of the vehicle's steering angle at the present moment is therefore of great importance.
For a three- or four-wheeled vehicle, the required steering angle is that of the front wheels, which perform the steering; but in an articulated steering structure such as a road roller, the front vibrating wheel and the rear body are connected by a central articulated joint. The vehicle steering angle detected here is therefore the steering angle of the front vibrating wheel of the road roller.
In the prior art, two methods are mainly used to detect the steering angle of the front vibrating wheel of a road roller.
The first is the mechanical angle-measuring method: an angle sensor is installed at the articulation between the front and rear bodies of the road roller, together with a specific mounting bracket and a link mechanism. One end of the link mechanism is connected to the angle sensor, and the other end is fixed through a specific bracket to the beam around the vibrating wheel; the angle sensor is mounted at the articulation through its bracket, and the articulation is rigidly connected to the rear body. During steering, the angle sensor then gives the steering angle of the vibrating wheel. However, this method is easily disturbed by sensor installation errors, loose mechanical mounting and connections, play between the link mechanism and the front body, damage to the sensor's internal precision components under long-term working conditions, sensor signal fluctuation and the like, so the detected steering angle of the vibrating wheel is inaccurate.
The second is the GPS angle-calculation method. One dual-antenna GPS set can provide the orientation and position of the vehicle, so the rotation angle of the vibrating wheel can be obtained by computing two heading angles with two dual-antenna GPS sets. Concretely, one GPS antenna is first arranged at a corresponding position on each of the front and rear bodies of the road roller to form a dual-antenna set, giving a heading angle S1 during steering whose direction lies along the line connecting the two antennas. A second pair of GPS antennas is then arranged collinearly along the longitudinal axis of the rear body, giving a heading angle S2 whose direction is always longitudinally collinear with the rear body. The included angle between the S1 and S2 directions can then be computed, and that angle is the steering angle of the vibrating wheel at the current moment. This method is expensive, since two dual-antenna GPS sets are required; it is very sensitive to the installation positions of the antennas; and GPS is prone to unstable satellite reception, signal fluctuation, signal drift and the like, so the detected steering angle of the vibrating wheel is inaccurate.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a method for detecting and calibrating the steering of a vibrating wheel of double-cylinder unmanned equipment based on vision.
The technical scheme adopted by the invention for solving the technical problem is as follows:
A method for detecting the steering of a vibrating wheel of double-cylinder unmanned equipment based on vision comprises the following steps:
(1) Collecting an original image;
(2) Extracting a region of interest (ROI);
(3) Carrying out distortion correction on the extracted ROI area;
(4) Segmenting the characteristic region according to a color threshold;
(5) Carrying out image morphological processing;
(6) Hough circle detection or feature contour detection;
(7) If the previous step was Hough circle detection, calculate the steering angle of the vibrating wheel; if it was feature contour detection, first calculate the centroid of the feature contour and then calculate the steering angle of the vibrating wheel;
(8) Filtering the steering angle numerical value;
(9) The steering angle is stably output.
The steering angle obtained in step (9) is further calibrated to obtain a calibrated steering angle.
In the calibration, the real steering angle is acquired from the articulation-angle sensor of real double-cylinder unmanned equipment, a mapping between the real steering angle and the steering angle obtained in step (9) is established to form a calibration table, the relevant value interval is looked up in the calibration table, and geometric calibration is realized by data interpolation.
In the interpolation method, the visually detected steering angle of the vibrating wheel is denoted α, and temp_alpha = α/5 is recorded as the intermediate quantity to be calibrated. temp_alpha is then rounded down to obtain the largest integer less than or equal to it, denoted insert_A, with insert_B = insert_A + 1. Then [insert_A·5, insert_B·5] is the interpolation interval containing the visually detected steering angle α; f(insert_A·5) is the steering angle value of the real double-cylinder unmanned equipment corresponding to insert_A·5, and f(insert_B·5) is the value corresponding to insert_B·5. Interpolation is performed over this interval with the corresponding values, using the following formula:
f(α) = f(insert_A·5) + [f(insert_B·5) − f(insert_A·5)] · (α − insert_A·5) / 5
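The interpolation just described can be sketched in Python. The calibration table below is invented for illustration (in practice it would come from the articulation-angle sensor); `calibrate` performs the 5-degree bracket lookup and linear interpolation.

```python
import math

# Hypothetical calibration table sampled every 5 degrees over the 55-125
# degree steering range: maps the vision-detected angle to the reference
# angle from the articulation-angle sensor. Values here are made up.
CALIB = {a: a + 0.4 * math.sin(math.radians(a)) for a in range(55, 130, 5)}

def calibrate(alpha: float) -> float:
    """Linear interpolation inside the 5-degree bracket containing alpha
    (valid for 55 <= alpha < 125 with this table)."""
    temp_alpha = alpha / 5.0
    insert_a = math.floor(temp_alpha)      # lower grid index
    insert_b = insert_a + 1                # upper grid index
    a5, b5 = insert_a * 5, insert_b * 5    # interpolation interval [a5, b5]
    fa, fb = CALIB[a5], CALIB[b5]
    return fa + (fb - fa) * (alpha - a5) / 5.0
```

At a grid point the table value is returned exactly; between grid points the result moves linearly between the two bracketing table values.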
Moreover, the original image acquisition in step (1) fixes an industrial camera at the transverse middle position of the body of the double-cylinder unmanned equipment, with shockproofing provided by a hard anti-vibration spring buffer or an anti-vibration rubber gasket.
In addition, the distortion correction of the extracted ROI in step (3) combines the internal and external parameters of the camera, obtaining the real image img from the distorted image imgD using the relation img(U, V) = imgD(Ud, Vd).
Moreover, the image morphology processing in step (5) first eliminates undesired feature regions by erosion and then applies dilation to restore the desired feature regions.
Moreover, the method for calculating the steering angle of the vibrating wheel through Hough circle detection is as follows: two circular markers whose size is consistent with the width of the cross beam are symmetrically arranged, about the longitudinal center line of the equipment, on the front cross beam of the vibrating wheel of the double-cylinder unmanned equipment. The two largest symmetric feature circles are obtained by Hough circle detection, and from the image coordinates of their two centers the slope of the line connecting them is computed and denoted k1. The horizontal x-axis of the image is taken as the reference line with slope k2 = 0, and θ is the included angle of the two lines, computed by:
tan θ = |(k1 − k2) / (1 + k1·k2)| = |k1|
The steering angle when the vibrating wheel of the double-cylinder unmanned equipment is not steering is set to 90°, and the visually computed steering angle α of the vibrating wheel is obtained from:
α = θ + 90°.
The method for calculating the steering angle of the vibrating wheel through feature contour detection is as follows: two markers whose size is consistent with the width of the beam are symmetrically arranged, about the longitudinal center line of the equipment, on the front beam of the vibrating wheel of the double-cylinder unmanned equipment. The contour information of the markers contained in the image is first found; all contours are then traversed and the image moments of each contour computed, giving the contour centroid position (Cx, Cy):
Cx = M10 / M00, Cy = M01 / M00
where Cx is the abscissa of the contour centroid on the image; Cy is its ordinate; M00 is the zero-order image moment, representing the area of the region enclosed by the contour; M10 is the first-order image moment of the contour points in the x direction; and M01 is the first-order image moment of the contour points in the y direction;
The slope of the line connecting the two stably output feature-contour centroids is computed and denoted k1; the horizontal x-axis of the image is taken as the reference line with slope k2 = 0, and θ is the included angle of the two lines, computed by:
tan θ = |(k1 − k2) / (1 + k1·k2)| = |k1|
The steering angle when the vibrating wheel of the double-cylinder unmanned equipment is not steering is set to 90°, and the visually computed steering angle α of the vibrating wheel is then obtained from:
α = θ + 90°.
Moreover, the steering angle numerical filtering adopts a recursive average (moving average) filter.
The invention has the advantages and positive effects that:
the invention realizes the steering detection of the vibrating wheel of the double-cylinder unmanned equipment by visual perception. Compared with the prior art, the invention implements the steering angle detection from a brand-new angle. Specifically, the stable image acquisition can be ensured by adopting an industrial three-proofing (waterproof, shockproof and dustproof) vision camera, and the influence of equipment installation precision, satellite signal loss, signal fluctuation, signal drift and the like on the aspects of physical structure and signal receiving is avoided; in the aspect of algorithm processing, stable and reliable detection can be realized primarily by detecting features through a visual algorithm, and high-precision steering angle detection can be realized after filtering and calibration. In conclusion, the invention is innovated on the basis of the physical structure and method principle, effectively reduces the equipment cost of engineering research and development, avoids the problems of inaccuracy and instability of steering angle detection of the double-cylinder unmanned equipment caused by the problems of sensor installation, signal fluctuation, GPS signal drift and the like in the prior art, and brings great convenience for unmanned engineering development.
Drawings
FIG. 1 is a schematic flow chart of a detection method in example 1 of the present invention;
FIG. 2 is a schematic flow chart of a detection method in embodiment 2 of the present invention;
FIG. 3 is a schematic flowchart of a calibration method in embodiment 3 of the present invention;
fig. 4 is a functional module connection diagram of the present invention.
Detailed Description
For the purpose of promoting a better understanding of the objects, aspects and advantages of the present disclosure, reference is made to the following detailed description taken in conjunction with the accompanying drawings. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not to be construed as limiting the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example 1
Fig. 1 is a schematic flow chart of the vision-based method for detecting the steering of the vibrating wheel of double-cylinder unmanned equipment according to embodiment 1 of the present invention; the method can be executed by a vision-based device for detecting the steering of the vibrating wheel. The method specifically comprises the following steps:
1. Collect an original image.
The image acquisition equipment comprises at least one industrial camera fixed at the transverse middle position of the road roller body. Because the road roller vibrates while moving, shockproofing is provided by a hard anti-vibration spring buffer or an anti-vibration rubber gasket. The acquisition angle of the image acquisition equipment is adjusted so that the vibrating wheel and the front cross beam of the road roller remain completely and clearly visible in the imaging field even at the left and right steering limits.
2. Region of interest (ROI) extraction.
There are generally two cases for the selection of the region of interest (ROI): a) the location of the ROI in the image is known; b) the location of the ROI in the image is unknown. The ROI extraction in the present invention belongs to the known-position case, so we extract the ROI directly from the original image acquired in the previous step. For an acquired image, an image pixel coordinate system (uOv coordinate system) is generally established with the upper-left corner of the image as origin; the horizontal direction is denoted u and the vertical direction v. Let the image size be w (width) by h (height). In this coordinate system, the four corner coordinates of the image are (0, 0), (w, 0), (w, h) and (0, h).
On the real road roller, the vibrating wheel is surrounded by a ring of metal structure with a cross beam directly in front, and the beam turns together with the vibrating wheel. The principle for extracting the ROI is: find a region that guarantees the features to be detected on the cross beam are fully visible even when the front cross beam of the vibrating wheel is at its left and right steering limits, and take it as the ROI. Under this principle, the ROI has width w′ and height h′, with corner coordinates (0, 0), (w′, 0), (w′, h′) and (0, h′). The ROI is extracted by directly cropping the original image along these four corner coordinates, yielding the required region of interest.
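Cropping by the four corner coordinates amounts to slicing rows and columns of the image array. A minimal sketch on a toy row-major image (the helper name `extract_roi` is ours, not the patent's):

```python
def extract_roi(image, w_roi, h_roi):
    """Crop the ROI whose corners are (0,0), (w',0), (w',h'), (0,h') in
    the uOv pixel coordinate system (origin at the top-left corner).
    `image` is a row-major list of rows of pixel values."""
    return [row[:w_roi] for row in image[:h_roi]]

# A 4x4 toy image; the ROI keeps the top-left block of width 3, height 2.
img = [[r * 10 + c for c in range(4)] for r in range(4)]
roi = extract_roi(img, w_roi=3, h_roi=2)
```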
3. Carry out distortion correction on the extracted ROI.
The imaging process of an image acquisition device such as a camera is essentially a sequence of coordinate transformations. Points in space are first converted from the world coordinate system to the camera coordinate system, then projected onto the imaging plane, i.e. the image physical coordinate system, and finally the data on the imaging plane are converted to the image plane, i.e. the image pixel coordinate system. However, limits of lens manufacturing accuracy and variations in the assembly process introduce distortion into the original image.
The distortion of the lens is classified into radial distortion and tangential distortion. Radial distortion is distortion distributed along the radius of the lens, which is generated because rays are more curved away from the center of the lens than near the center, and is more pronounced in a typical inexpensive lens. The tangential distortion is generated because the lens itself is not parallel to the camera sensor plane (imaging plane) or the image plane, which is often caused by mounting deviations of the lens attached to the lens module.
The principle of distortion correction in detail: undistorted coordinates (U, V) in the image pixel coordinate system (uOv coordinate system) fall, after radial and tangential distortion, on (Ud, Vd) in the same coordinate system. That is, the relationship between the real image img and the distorted image imgD is img(U, V) = imgD(Ud, Vd).
The ROI obtained in the previous step is a distorted image; to obtain an undistorted image, the mapping is derived through a distortion model. The relationship between the real image img and the distorted image imgD is img(U, V) = imgD(Ud, Vd), from which every img(U, V) can be found. In this process, the internal parameters of the camera are obtained by a camera calibration method, such as Zhang Zhengyou's calibration method, and the external parameters are obtained from the camera's installation position. Distortion correction is then realized by combining the internal and external parameters of the camera.
For an image, U and V are integers, since they are the pixel coordinates making up the image. When correcting the distorted image imgD(Ud, Vd) into the undistorted image img(U, V), the computed (Ud, Vd) are generally not integers, so interpolation is needed; for example, nearest-neighbor interpolation or bilinear interpolation can be used.
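The lookup img(U, V) = imgD(Ud, Vd) with nearest-neighbor interpolation can be sketched as below. The distortion model is passed in as a function; in a real pipeline it would be built from the intrinsics and distortion coefficients obtained by Zhang's calibration (e.g. via OpenCV's `cv2.undistort`).

```python
def undistort_nearest(img_d, distort, w, h):
    """img(U, V) = imgD(Ud, Vd): for each undistorted pixel (u, v), look
    up the distorted source location given by `distort` and round it to
    the nearest pixel. Out-of-bounds sources are left as 0."""
    out = [[0] * w for _ in range(h)]
    for v in range(h):
        for u in range(w):
            ud, vd = distort(u, v)
            ui, vi = round(ud), round(vd)
            if 0 <= vi < h and 0 <= ui < w:
                out[v][u] = img_d[vi][ui]
    return out
```

With an identity distortion model the output reproduces the input; a radial model would instead map (u, v) outward or inward along the radius from the principal point.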
4. The feature regions are segmented according to a color threshold.
In the invention, two red markers whose size is consistent with the width of the cross beam are symmetrically arranged on the beam about the center line of the road roller's cross beam. Red is chosen as the feature color because the roller itself is yellow; other distinctive colors such as white or blue may also be used. The requirements on the feature color are: it must be clearly distinguishable from the colors of the road roller and its parts, and from the colors of the road environment around the roller. During visual detection, the color space of the image collected by the acquisition device is first converted from RGB to HSV. A red color threshold under HSV is then set, so that the red region is extracted and the pixel values of non-red regions are set to 0. This accomplishes segmenting the feature region according to the color threshold.
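A minimal per-pixel sketch of the RGB→HSV conversion and red thresholding using only the standard library. The hue/saturation/value thresholds are illustrative assumptions, not the patent's values; note that red wraps around hue 0, so two hue bands must be checked.

```python
import colorsys

def is_red_hsv(r, g, b):
    """Convert an RGB pixel (0-255 per channel) to HSV and test it
    against an illustrative red threshold."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    hue_deg = h * 360
    red_hue = hue_deg <= 15 or hue_deg >= 345   # red wraps around 0 deg
    return red_hue and s >= 0.5 and v >= 0.3

def segment(pixels):
    """Keep red pixels, set non-red pixels to 0, as in 'the pixel value
    of a non-red area is set to 0'."""
    return [(r, g, b) if is_red_hsv(r, g, b) else (0, 0, 0)
            for (r, g, b) in pixels]
```

In practice this whole step is `cv2.cvtColor` followed by `cv2.inRange` with two hue ranges; the sketch just makes the per-pixel logic explicit.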
5. Image morphology processing.
In the image processing of this embodiment, the surrounding working environment of the road roller, natural illumination and other influences can cause a series of unexpected features to be extracted from the image obtained in the preceding steps. Image morphology processing is therefore required: the unexpected feature regions are first eroded away, and the remaining desired feature regions are then dilated. This realizes the morphological opening operation, smoothing the contours of the feature regions and eliminating unwanted artifacts such as burrs.
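Morphological opening (erosion then dilation) on a binary mask can be sketched with a 3×3 square structuring element; in practice `cv2.morphologyEx` with `cv2.MORPH_OPEN` does this in one call.

```python
def erode(img):
    """Binary erosion with a 3x3 square element; borders become 0."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = int(all(img[y + dy][x + dx]
                                for dy in (-1, 0, 1) for dx in (-1, 0, 1)))
    return out

def dilate(img):
    """Binary dilation with the same 3x3 element."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            out[y][x] = int(any(img[y + dy][x + dx]
                                for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                                if 0 <= y + dy < h and 0 <= x + dx < w))
    return out

def opening(img):
    """Opening: erosion followed by dilation; removes speckle noise
    while roughly preserving large regions."""
    return dilate(erode(img))
```

An isolated single-pixel "burr" is erased entirely, while a large solid region survives the round trip.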
6. Hough circle detection.
As described in step 4 above, two red markers whose size is consistent with the width of the cross beam are symmetrically arranged on the front cross beam of the road roller's vibrating wheel about the longitudinal centerline of the roller. Since this example uses the Hough circle detection scheme, the red markers are circular, with a diameter no greater than the width of the cross beam. Through the preceding steps, the acquired image is reduced to a binary image containing only the circular features; Hough circle detection is then performed, and after parameter tuning the two largest symmetric feature circles are obtained, realizing feature detection. The parameter tuning process is not described in detail in this example.
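Hough circle detection itself (in practice `cv2.HoughCircles`) accumulates votes over many edge pixels and is beyond a short sketch, but its per-circle output is a center. As a toy stand-in producing the same kind of output, the center of the circle through three edge points follows from the perpendicular-bisector equations:

```python
def circle_center_from_3_points(p1, p2, p3):
    """Center of the circle through three non-collinear points, solving
    the two perpendicular-bisector line equations as a 2x2 system."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a, b = x2 - x1, y2 - y1           # normal of bisector of p1-p2
    c, d = x3 - x1, y3 - y1           # normal of bisector of p1-p3
    e = a * (x1 + x2) / 2 + b * (y1 + y2) / 2
    f = c * (x1 + x3) / 2 + d * (y1 + y3) / 2
    det = a * d - b * c               # zero iff the points are collinear
    return (d * e - b * f) / det, (a * f - c * e) / det
```

Real Hough detection is robust to the many noisy edge pixels a binary mask contains; this three-point solve is only meant to show where a circle center comes from geometrically.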
7. Calculate the steering angle of the vibrating wheel.
The Hough circle detection of the previous step yields two stably output feature circles, whose center image coordinates are given directly by its output. From the two center coordinates, the slope of the line connecting them can be computed and denoted k1; the horizontal x-axis of the image is taken as the reference line with slope k2, and obviously k2 = 0. Let θ be the included angle of the two lines, computed as follows:
tan θ = |(k1 − k2) / (1 + k1·k2)| = |k1|
After computing the arc tangent of tan θ, the radian value is converted to degrees to obtain θ. The steering angle of the road roller's vibrating wheel when not steering is set to 90°, with a maximum steering range of ±35°, i.e. 55° to 125°; left turns are counted as negative and right turns as positive. The visually computed steering angle α of the vibrating wheel is then:
α=θ+90°
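The angle computation of step 7 can be sketched as follows. `math.atan2` is used instead of the absolute-value slope formula so that the sign of θ distinguishes the two turning directions; which sign corresponds to "left" depends on the camera orientation and the image y-axis direction, so that convention is an assumption here.

```python
import math

def steering_angle(c1, c2):
    """alpha = theta + 90 deg, where theta is the signed angle between
    the line through the two feature centers and the image x-axis
    (reference slope k2 = 0). Straight ahead gives theta = 0, alpha = 90."""
    (x1, y1), (x2, y2) = c1, c2
    theta = math.degrees(math.atan2(y2 - y1, x2 - x1))
    return theta + 90.0
```

Two centers on a horizontal line give the 90° straight-ahead value; a tilted line shifts α inside the 55°–125° range described above.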
8. Steering angle numerical filtering.
After the visual detection above, the steering angle of the vibrating wheel can be detected, but the visually fed-back angle value is computed from the included angle between the center-connecting line given by Hough circle detection and the image x direction. Hough circle detection suffers a series of detection instabilities that cause the output angle value to fluctuate, so numerical filtering is needed to guarantee stable angle output. This embodiment uses a recursive average filter (also called a moving average filter) to achieve stable output of the vibrating wheel's steering angle.
The recursive average filtering method in detail: the continuously output steering angles from visual perception are treated as continuous samples, and the N most recent samples form a queue of fixed length N. Each new sample is placed at the tail of the queue and the sample at the head is discarded (first-in, first-out); the arithmetic mean of the N samples in the queue is then taken as the new filtered result. This filter suppresses periodic interference well, gives high smoothness, and suits systems with high-frequency oscillation.
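The recursive average (moving average) filter just described, sketched with a fixed-length FIFO queue; N = 3 in the example is an illustrative choice.

```python
from collections import deque

class RecursiveAverageFilter:
    """Fixed-length FIFO of the last N samples; each update pushes the
    newest angle, drops the oldest (first-in first-out), and returns
    the arithmetic mean of the queue."""
    def __init__(self, n=5):
        self.buf = deque(maxlen=n)  # deque handles the FIFO drop itself

    def update(self, angle):
        self.buf.append(angle)
        return sum(self.buf) / len(self.buf)
```

Until the queue fills, the mean is taken over however many samples have arrived; afterwards it is always over exactly N samples.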
By implementing the above steps, vision-based steering detection of the vibrating wheel of double-cylinder unmanned equipment is realized.
Example 2
Fig. 2 is a schematic flow chart of the vision-based method for detecting the steering of the vibrating wheel of double-cylinder unmanned equipment according to embodiment 2 of the present invention. The technical solution of this embodiment builds on the previous embodiment and further improves and optimizes the Hough circle detection of step 6 and the vibrating-wheel steering angle calculation of step 7. The method specifically comprises the following steps:
1. Collect an original image.
2. Region of interest (ROI) extraction.
3. Carry out distortion correction on the extracted ROI.
4. Segment the feature regions according to color thresholds.
5. Image morphology processing.
6. Feature contour detection.
Contours are typical features of image objects and can be thought of simply as curves joining consecutive points (along a boundary) that share the same color or gray level. They are useful in shape analysis and in object detection and recognition. To identify contours more accurately, the original image must first be binarized; the level differences at connected edge points are then used to extract sets of region points with strong structural features, and such a point set is likely to be an object's contour. The output of feature contour detection is a list of all detected contours, each consisting of a series of points.
Compared with Hough circle detection, feature contour detection encloses the feature shapes more stably. Because of the camera's mounting and shooting angle, the circular feature markers arranged on the roller's cross beam appear elliptical in the image; Hough circle detection therefore struggles to enclose them stably, and its detected circle centers jitter constantly, which in turn makes the detected angle jitter. Contour detection, by contrast, outputs stably regardless of the feature's shape — which also means the markers need not be circular and may be of any shape. Spurious contours that appear during detection can be filtered out by thresholding the area of the closed contour region. In this way, stable contour detection is achieved.
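The area-based filtering of spurious contours can be sketched in plain Python, with a contour as a list of (x, y) points and its enclosed area computed by the shoelace formula (in practice OpenCV's `cv2.findContours` and `cv2.contourArea` would supply both; the names and thresholds below are illustrative):

```python
def contour_area(contour):
    """Area enclosed by a closed polygonal contour (shoelace formula)."""
    n = len(contour)
    s = 0.0
    for i in range(n):
        x1, y1 = contour[i]
        x2, y2 = contour[(i + 1) % n]  # wrap around to close the contour
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def filter_contours(contours, min_area):
    """Keep only contours whose enclosed area reaches min_area,
    discarding small spurious detections."""
    return [c for c in contours if contour_area(c) >= min_area]

square = [(0, 0), (10, 0), (10, 10), (0, 10)]  # marker-like contour, area 100
speck = [(0, 0), (1, 0), (1, 1)]               # tiny spurious contour, area 0.5
kept = filter_contours([square, speck], min_area=50)
```

Only the marker-sized contour survives the area threshold; the speck is rejected.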
7. Calculate the centroid of the feature contour.
In image processing, computer vision, and related fields, an image moment is a particular weighted average of the intensities of an image's pixels. Step 6 finds the contour information contained in the image, and each contour detected in this embodiment is a closed region. Traversing all contours and computing the image moments of each, the centroid position (Cx, Cy) of each object is obtained:
Cx = M10 / M00, Cy = M01 / M00
where Cx is the abscissa of the contour centroid on the image; Cy is the ordinate of the contour centroid on the image; M00 is the 0th-order image moment, representing the area of the region enclosed by the contour; M10 is the 1st-order image moment of the contour points in the x direction; and M01 is the 1st-order image moment of the contour points in the y direction. In this way the centroid of each extracted feature contour is obtained and can be marked on the image for the subsequent centroid-connecting-line calculation.
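The moments and centroid can be sketched directly on a binary image (pure Python for illustration; OpenCV's `cv2.moments` computes the same quantities for a contour or mask):

```python
def centroid_from_moments(binary):
    """Compute M00, M10, M01 over a binary image (rows of 0/1 values)
    and return the centroid (Cx, Cy) = (M10/M00, M01/M00)."""
    m00 = m10 = m01 = 0.0
    for y, row in enumerate(binary):
        for x, v in enumerate(row):
            m00 += v      # 0th-order moment: area of the region
            m10 += x * v  # 1st-order moment in the x direction
            m01 += y * v  # 1st-order moment in the y direction
    return m10 / m00, m01 / m00

# A 3x3 block of ones whose geometric center is at (x=2, y=1):
img = [[0, 1, 1, 1, 0],
       [0, 1, 1, 1, 0],
       [0, 1, 1, 1, 0]]
cx, cy = centroid_from_moments(img)
```

For this symmetric block the centroid lands exactly on the block's center.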
8. Calculate the steering angle of the vibrating wheel.
From the feature contour detection and centroid calculation in the previous step, two stably output feature contours and their corresponding centroids are obtained. The slope of the line connecting the two centroid coordinates is calculated from the two stably output feature-contour centroids and denoted k1; the horizontal axis of the image in the x direction is taken as the reference line, whose slope is denoted k2, and obviously k2 is 0. Let θ be the angle between the two lines; the calculation formula is as follows:
tan θ = (k1 − k2) / (1 + k1·k2)
Taking the arctangent of tan θ and converting the radian value to degrees gives the value of θ. The steering angle of the roller's vibrating wheel when not steering is set to 90°, and the maximum steering range is ±35°, i.e. 55° to 125°. Left turns are recorded as negative and right turns as positive. The vibrating-wheel steering angle α obtained by visual calculation is then given by the following formula:
α=θ+90°
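The angle calculation above can be sketched as follows — the signed arctangent of the centroid-line slope, since the reference slope k2 is 0 (the sign convention for image coordinates is an assumption for illustration):

```python
import math

def steering_angle(p1, p2):
    """Steering angle alpha = theta + 90 deg, where theta is the signed
    angle between the line through centroids p1, p2 and the image x-axis."""
    (x1, y1), (x2, y2) = p1, p2
    k1 = (y2 - y1) / (x2 - x1)           # slope of the centroid line
    theta = math.degrees(math.atan(k1))  # k2 = 0, so tan(theta) = k1
    return theta + 90.0                  # 90 deg means no steering

# A horizontal centroid line means the wheel is not steering:
assert steering_angle((0, 0), (10, 0)) == 90.0
alpha = steering_angle((0.0, 0.0), (10.0, 2.0))  # slightly sloped line
```

A slope of 0.2 corresponds to θ ≈ 11.31°, giving α ≈ 101.31°, i.e. a right turn under the stated sign convention.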
9. Filter the steering angle value.
After the visual detection in the previous steps, the steering angle of the vibrating wheel can be detected, but the visually fed-back angle value is computed from the angle between the centroid-connecting line of the detected feature contours and the x direction of the image. Although feature contour detection already yields a relatively stable numerical output, the visual output angle is still numerically filtered to further improve data reliability. This embodiment adopts a recursive average filter (also called a moving average filter) to obtain a stable output of the vibrating-wheel steering angle.
By carrying out the steps above, this embodiment achieves stable vision-based steering detection of the vibrating wheel of the double-cylinder unmanned equipment.
Example 3
Fig. 3 is a schematic flow chart of a vision-based method for calibrating the steering of the vibrating wheel of double-cylinder unmanned equipment according to embodiment 3 of the present invention. The method may be performed by a vision-based device for calibrating the steering of the vibrating wheel of double-cylinder unmanned equipment, which may be implemented in hardware and/or software. The method specifically comprises the following steps:
1. Make a calibration table from the set steering angles and the visually measured steering angles.
The actual steering range of the roller's vibrating wheel is physically limited. If the vibrating wheel's neutral (non-steering) position is recorded as 90°, the left and right steering limits are 55° and 125° respectively. Within this range, the roller is steered manually while the visually detected angle is monitored; when the visually detected steering angle stabilizes near 55°, 60°, 65°, 70°, 75°, 80°, 85°, 90°, 95°, 100°, 105°, 110°, 115°, 120°, and 125°, the actual steering angle of the roller's vibrating wheel is measured with an angle-measuring tool and recorded for each point. The resulting series of corresponding angles completes the calibration table.
2. Input the visually detected steering angle of the roller's vibrating wheel, and read the calibration table storing the calibration values.
The visually detected steering output value of the vibrating wheel can be obtained according to embodiment 1 and embodiment 2, and this angle value serves as the calibration input. The calibration table may be stored in a specific format such as .xml, .json, or .yml, and is read during the calibration process.
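Storing and reading the table in JSON, for example, is straightforward with the standard library (the file location, keys, and values below are illustrative, not from the patent):

```python
import json
import os
import tempfile

# Illustrative calibration table: visually detected angle -> measured true angle.
table = {"85": 84.2, "90": 90.1, "95": 95.7}

path = os.path.join(tempfile.gettempdir(), "calibration.json")
with open(path, "w") as f:
    json.dump(table, f)      # store the calibration table on disk

with open(path) as f:
    loaded = json.load(f)    # read it back during the calibration process
```

An XML or YAML store would work the same way with the corresponding parser.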
3. Calibrate by interpolation and output the calibrated steering angle value.
Calibration by interpolation is realized for the input angle value in combination with the calibration table that was read in; the process is as follows. Since the calibration table entries are spaced 5° apart, define the visually detected vibrating-wheel steering angle (per embodiments 1 and 2) as α, and let temp_α = α/5 be the intermediate quantity to be calibrated. Rounding temp_α down gives the largest integer less than or equal to temp_α, denoted insert_A, and let insert_B = insert_A + 1; the interval [insert_A·5, insert_B·5] is then the interpolation interval containing the steering angle α given by visual detection. Let f(insert_A·5) be the true roller steering angle value corresponding to insert_A·5, and f(insert_B·5) the true roller steering angle value corresponding to insert_B·5. Interpolating over this interval with the corresponding true values gives the following formula:
f(α) = f(insert_A·5) + [(α − insert_A·5) / 5] · [f(insert_B·5) − f(insert_A·5)]
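This table-lookup linear interpolation can be sketched as follows (the true-angle values in the table are illustrative, not measured data from the patent; the upper boundary α = 125° is not handled in this sketch):

```python
import math

# Calibration table at 5-degree steps: visual angle -> true measured angle
# (values are illustrative only).
calib = {55: 54.0, 60: 59.2, 65: 64.5, 70: 69.8, 75: 74.9,
         80: 80.1, 85: 85.0, 90: 90.3, 95: 95.2, 100: 100.4,
         105: 105.1, 110: 110.0, 115: 114.8, 120: 119.9, 125: 124.6}

def calibrate(alpha):
    """Linearly interpolate between the two neighbouring table entries."""
    temp = alpha / 5.0
    insert_a = math.floor(temp)          # largest integer <= alpha/5
    insert_b = insert_a + 1
    lo, hi = insert_a * 5, insert_b * 5  # interpolation interval bounds
    f_lo, f_hi = calib[lo], calib[hi]
    return f_lo + (alpha - lo) * (f_hi - f_lo) / 5.0

calibrated = calibrate(92.5)  # halfway between the 90 and 95 entries
```

An input halfway between two table points maps to the midpoint of the two stored true angles.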
In this way, calibration of the visually detected roller steering angle is achieved.
Example 4
The method described above can be realized by several functional modules connected in sequence: an image acquisition module, an image distortion correction module, a detection processing module, a feature calculation module, a numerical filtering module, and a data calibration module.
The image acquisition module mainly provides the source of raw data for visual inspection. By adjusting the acquisition angle of the image acquisition device, the vibrating wheel and the front cross beam of the roller are kept completely and clearly within the imaging field of view even at the left and right steering limits. The module also sends the image on to the subsequent modules.
The image distortion correction module mainly corrects the distortion of the image supplied by the acquisition device. The imaging process of an image acquisition device such as a camera is essentially a sequence of coordinate-system transformations: points in space are first converted from the world coordinate system to the camera coordinate system, then projected onto the imaging plane (the image physical coordinate system), and finally the data on the imaging plane are converted to the image plane (the image pixel coordinate system). Lens manufacturing tolerances and assembly variations, however, introduce distortion into the original image. The image distortion correction module therefore combines the internal and external parameters of the image acquisition device with the coordinate-system conversion formulas to correct the distortion of the image pixels.
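A minimal sketch of the radial part of the standard lens distortion model, which correction must invert (the coefficients are illustrative; a full implementation such as OpenCV's `cv2.undistort` uses the calibrated intrinsics plus tangential terms as well):

```python
def distort_point(x, y, k1, k2):
    """Forward radial distortion of a normalized image point:
    x_d = x * (1 + k1*r^2 + k2*r^4), and likewise for y."""
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor

def undistort_point(xd, yd, k1, k2, iters=10):
    """Invert the radial model by fixed-point iteration, recovering the
    ideal (undistorted) point as distortion correction does."""
    x, y = xd, yd
    for _ in range(iters):
        r2 = x * x + y * y
        factor = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / factor, yd / factor
    return x, y

xd, yd = distort_point(0.3, 0.2, k1=-0.1, k2=0.01)
x, y = undistort_point(xd, yd, k1=-0.1, k2=0.01)
```

For mild distortion the fixed-point iteration converges to the original point within a few steps.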
The detection processing module mainly processes the distortion-corrected image. Through the combined use of image processing methods such as color-threshold feature extraction, edge detection, morphological processing, and contour extraction, it obtains a stable detection of the expected contour features, providing the raw data for the subsequent feature calculation.
The feature calculation module and the numerical filtering module take the raw data output by the detection processing module, remove irrelevant features based on contour area, calculate the slope of the feature-point connecting line and the steering angle, and apply a filtering algorithm (such as recursive mean filtering, Kalman filtering, or variants thereof) to the result, eliminating numerical jumps caused by false detections and jitter and achieving stable detection and output of the steering angle.
The data calibration module mainly establishes the correspondence between the visually detected steering angle value and the real steering angle value of the roller: the calibration relation is determined by making a calibration table, and numerical calibration of the visually detected steering angle is realized by interpolation, yielding a stable and reliable real steering angle value for the roller.
The above description is only a preferred embodiment of the present invention, and it should be noted that, for those skilled in the art, it is possible to make various changes and modifications without departing from the concept of the present invention, and these are all within the scope of the present invention.

Claims (8)

1. A vision-based method for detecting the steering of the vibrating wheel of double-cylinder unmanned equipment, characterized by comprising the following steps:
(1) Collecting an original image;
(2) Extracting a region of interest (ROI);
(3) Carrying out distortion correction on the extracted ROI area;
(4) Segmenting the characteristic region according to the color threshold;
(5) Processing the image morphology;
(6) Hough circle detection or feature contour detection;
(7) If the last step is Hough circle detection, calculating a steering angle of the vibrating wheel; if the last step is the feature contour detection, firstly calculating the centroid of the feature contour and then calculating the steering angle of the vibration wheel;
(8) Filtering the steering angle numerical value;
(9) Stably outputting a steering angle;
the method for calculating the steering angle of the vibrating wheel through Hough circle detection comprises the following steps: two circular markers whose length and width match the width of the cross beam are symmetrically arranged on the front cross beam of the vibrating wheel of the double-cylinder unmanned equipment along the longitudinal center line of the double-cylinder unmanned equipment; the two largest symmetric feature circles are obtained by Hough circle detection; the slope of the line connecting the two circle-center image coordinates is calculated from those coordinates and denoted k1; the horizontal axis of the image in the x direction is taken as the reference line, whose slope is denoted k2, with k2 being 0; and θ is the angle between the two lines, calculated by the following formula:
tan θ = (k1 − k2) / (1 + k1·k2)
the steering angle of the vibrating wheel of the double-cylinder unmanned equipment when the vibrating wheel does not steer is set to be 90 degrees, and the steering angle alpha of the vibrating wheel obtained through visual calculation is calculated through the following formula:
α=θ+90°;
the method for calculating the steering angle of the vibrating wheel through feature contour detection comprises the following steps: two markers whose length and width match the width of the cross beam are symmetrically arranged on the front cross beam of the vibrating wheel of the double-cylinder unmanned equipment along the longitudinal center line of the double-cylinder unmanned equipment; the marker contour information contained in the image is first found, then all contours are traversed and the image moments of each contour are calculated, obtaining the centroid position (Cx, Cy) of each contour:
Cx = M10 / M00, Cy = M01 / M00
where Cx is the abscissa of the centroid of the profile on the image; cy is the ordinate of the outline centroid on the image; m00 is 0-order image moment and represents the area of a region surrounded by the outline; m10 represents the 1 st order moment of the image of a point on the contour in the x direction; m01 represents the 1 st order moment of the image of the point on the contour in the y direction;
calculating the slope of a connecting line of coordinates of the two centroids according to the two feature profile centroids which are stably output, and marking the slope as k1, wherein a horizontal axis in the x direction of the image is taken as a reference line, the slope of the reference line is marked as k2, k2 is 0, and theta is an included angle of the two lines, and the calculation formula is as follows:
tan θ = (k1 − k2) / (1 + k1·k2)
the steering angle when the vibrating wheel of the double-cylinder unmanned equipment is not steered is set to be 90 degrees, and then the steering angle alpha of the vibrating wheel obtained through visual calculation can be calculated through the following formula:
α=θ+90°。
2. The vision-based method for detecting the steering of the vibrating wheel of the double-cylinder unmanned equipment according to claim 1, characterized in that: the steering angle obtained in step (9) is calibrated to obtain the calibrated steering angle.
3. The vision-based method for detecting the steering of the vibrating wheel of the double-cylinder unmanned equipment according to claim 2, characterized in that: the calibration acquires the real steering angle from the articulation angle sensor of the real double-cylinder unmanned equipment, establishes a mapping between the real steering angle and the steering angle obtained in step (9) to form a calibration table, searches the calibration table for the corresponding value interval, and realizes geometric calibration by data interpolation.
4. The vision-based method for detecting the steering of the vibrating wheel of the double-cylinder unmanned equipment according to claim 3, characterized in that: the interpolation defines the visually detected vibrating-wheel steering angle as α, records temp_α = α/5 as the intermediate quantity to be calibrated, rounds temp_α down to obtain the largest integer less than or equal to temp_α, denoted insert_A, and sets insert_B = insert_A + 1; [insert_A·5, insert_B·5] is then the interpolation interval containing the steering angle α given by visual detection; f(insert_A·5) is recorded as the real double-cylinder unmanned equipment steering angle value corresponding to insert_A·5, and f(insert_B·5) as the real double-cylinder unmanned equipment steering angle value corresponding to insert_B·5; interpolation is performed over this interval with the corresponding values according to the following formula:
f(α) = f(insert_A·5) + [(α − insert_A·5) / 5] · [f(insert_B·5) − f(insert_A·5)]
5. The vision-based method for detecting the steering of the vibrating wheel of the double-cylinder unmanned equipment according to claim 1, characterized in that: the original image acquisition of step (1) fixes an industrial camera at the transverse center of the vehicle body of the double-cylinder unmanned equipment, using a shockproof hard spring buffer or a shockproof rubber gasket for vibration isolation.
6. The vision-based method for detecting the steering of the vibrating wheel of the double-cylinder unmanned equipment according to claim 1, characterized in that: the distortion correction of the extracted ROI in step (3) combines the internal and external parameters of the camera to realize camera distortion correction, obtaining the real image img from the distorted image imgD by the formula img(U, V) = imgD(Ud, Vd).
7. The vision-based method for detecting the steering of the vibrating wheel of the double-cylinder unmanned equipment according to claim 1, characterized in that: the image morphology processing of step (5) first eliminates undesired feature regions by erosion, and then restores the desired feature regions that remain by dilation.
8. The vision-based method for detecting the steering of the vibrating wheel of the double-cylinder unmanned equipment according to claim 1, characterized in that: the numerical filtering of the steering angle adopts a recursive average filtering method.
CN201910554219.9A 2019-06-25 2019-06-25 Method for detecting and calibrating steering of vibrating wheel of double-cylinder unmanned equipment based on vision Active CN110400348B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910554219.9A CN110400348B (en) 2019-06-25 2019-06-25 Method for detecting and calibrating steering of vibrating wheel of double-cylinder unmanned equipment based on vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910554219.9A CN110400348B (en) 2019-06-25 2019-06-25 Method for detecting and calibrating steering of vibrating wheel of double-cylinder unmanned equipment based on vision

Publications (2)

Publication Number Publication Date
CN110400348A CN110400348A (en) 2019-11-01
CN110400348B true CN110400348B (en) 2022-12-06

Family

ID=68323519

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910554219.9A Active CN110400348B (en) 2019-06-25 2019-06-25 Method for detecting and calibrating steering of vibrating wheel of double-cylinder unmanned equipment based on vision

Country Status (1)

Country Link
CN (1) CN110400348B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103255755A (en) * 2013-04-28 2013-08-21 河海大学 Lossless method for fast evaluating filling compaction quality of soil building stones in real time and evaluating device thereof
CN103321128A (en) * 2013-07-03 2013-09-25 中联重科股份有限公司 Method and system for preventing sinking of vibrating wheel of vibrating road roller and engineering machinery
WO2013151266A1 (en) * 2012-04-04 2013-10-10 Movon Corporation Method and system for lane departure warning based on image recognition
CN103850241A (en) * 2014-02-20 2014-06-11 天津大学 Earth and rockfill dam milling excitation frequency and excitation force real-time monitoring system and monitoring method
CN104503240A (en) * 2014-12-23 2015-04-08 福建船政交通职业学院 Ergonomic dynamic design method based on chaotic recognition
CN204530394U (en) * 2015-03-17 2015-08-05 广东华路交通科技有限公司 A kind of road roller data collecting system

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013151266A1 (en) * 2012-04-04 2013-10-10 Movon Corporation Method and system for lane departure warning based on image recognition
CN103255755A (en) * 2013-04-28 2013-08-21 河海大学 Lossless method for fast evaluating filling compaction quality of soil building stones in real time and evaluating device thereof
CN103321128A (en) * 2013-07-03 2013-09-25 中联重科股份有限公司 Method and system for preventing sinking of vibrating wheel of vibrating road roller and engineering machinery
CN103850241A (en) * 2014-02-20 2014-06-11 天津大学 Earth and rockfill dam milling excitation frequency and excitation force real-time monitoring system and monitoring method
CN104503240A (en) * 2014-12-23 2015-04-08 福建船政交通职业学院 Ergonomic dynamic design method based on chaotic recognition
CN204530394U (en) * 2015-03-17 2015-08-05 广东华路交通科技有限公司 A kind of road roller data collecting system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Development and Prospect of Intelligent Pavement; WANG Linbing et al.; China Journal of Highway and Transport; 2019-04-15 (No. 04); pp. 1-4 *
Practical Research on Automated Remote Monitoring of Asphalt Pavement Construction Quality; YU Dongxiao et al.; Journal of Highway and Transportation Research and Development (Applied Technology Edition); 2015-09-15 (No. 09); pp. 126-128 *

Also Published As

Publication number Publication date
CN110400348A (en) 2019-11-01

Similar Documents

Publication Publication Date Title
CN109684921B (en) Road boundary detection and tracking method based on three-dimensional laser radar
CN108805934B (en) External parameter calibration method and device for vehicle-mounted camera
CN107203973B (en) Sub-pixel positioning method for center line laser of three-dimensional laser scanning system
WO2019105044A1 (en) Method and system for lens distortion correction and feature extraction
CN110398979B (en) Unmanned engineering operation equipment tracking method and device based on vision and attitude fusion
CN107895375B (en) Complex road route extraction method based on visual multi-features
CN109001757B (en) Parking space intelligent detection method based on 2D laser radar
CN110378957B (en) Torpedo tank car visual identification and positioning method and system for metallurgical operation
CN107392849B (en) Target identification and positioning method based on image subdivision
CN106599760B (en) Method for calculating running area of inspection robot of transformer substation
CN107358628B (en) Linear array image processing method based on target
CN112486207A (en) Unmanned aerial vehicle autonomous landing method based on visual identification
CN107066970B (en) Visual positioning method, device and system for AGV (automatic guided vehicle)
JP2010191661A (en) Traveling path recognition device, automobile, and traveling path recognition method
CN111415376A (en) Automobile glass sub-pixel contour extraction method and automobile glass detection method
CN105973265A (en) Mileage estimation method based on laser scanning sensor
CN102706291A (en) Method for automatically measuring road curvature radius
CN112017249A (en) Vehicle-mounted camera roll angle obtaining and mounting angle correcting method and device
CN109883396A (en) A kind of air level line width measurement method based on image processing techniques
CN112729318A (en) AGV fork truck is from moving SLAM navigation of fixed position
CN115861352A (en) Monocular vision, IMU and laser radar data fusion and edge extraction method
CN110400348B (en) Method for detecting and calibrating steering of vibrating wheel of double-cylinder unmanned equipment based on vision
CN117746343A (en) Personnel flow detection method and system based on contour map
CN112330667B (en) Morphology-based laser stripe center line extraction method
CN113763279A (en) Accurate correction processing method for image with rectangular frame

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant