CN115509122B - Online optimization control method and system for unmanned line marking vehicle based on machine vision navigation - Google Patents
- Publication number
- CN115509122B (application CN202211451943.7A)
- Authority
- CN
- China
- Prior art keywords
- vehicle
- error
- waterline
- deviation
- change rate
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 238000000034 method Methods 0.000 title claims abstract description 96
- 238000005457 optimization Methods 0.000 title claims abstract description 16
- 230000008859 change Effects 0.000 claims abstract description 116
- 238000012544 monitoring process Methods 0.000 claims abstract description 30
- 238000001514 detection method Methods 0.000 claims abstract description 25
- 230000008569 process Effects 0.000 claims description 20
- 230000006870 function Effects 0.000 claims description 15
- 238000006243 chemical reaction Methods 0.000 claims description 12
- 238000005070 sampling Methods 0.000 claims description 12
- 238000010606 normalization Methods 0.000 claims description 9
- 238000013507 mapping Methods 0.000 claims description 8
- 238000004364 calculation method Methods 0.000 claims description 7
- 238000005260 corrosion Methods 0.000 claims description 6
- 230000007797 corrosion Effects 0.000 claims description 6
- 230000009466 transformation Effects 0.000 claims description 5
- 238000013461 design Methods 0.000 abstract description 6
- 238000012706 support-vector machine Methods 0.000 description 10
- 238000010276 construction Methods 0.000 description 6
- 230000000007 visual effect Effects 0.000 description 6
- 238000010586 diagram Methods 0.000 description 5
- 230000007704 transition Effects 0.000 description 5
- 239000010426 asphalt Substances 0.000 description 4
- 238000005192 partition Methods 0.000 description 4
- 230000004044 response Effects 0.000 description 4
- 238000004891 communication Methods 0.000 description 3
- 239000000446 fuel Substances 0.000 description 2
- 239000011159 matrix material Substances 0.000 description 2
- 238000003909 pattern recognition Methods 0.000 description 2
- 238000003672 processing method Methods 0.000 description 2
- 238000000638 solvent extraction Methods 0.000 description 2
- 230000002194 synthesizing effect Effects 0.000 description 2
- 230000009286 beneficial effect Effects 0.000 description 1
- 230000008901 benefit Effects 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 238000000605 extraction Methods 0.000 description 1
- 238000001914 filtration Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000007723 transport mechanism Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B11/00—Automatic controllers
- G05B11/01—Automatic controllers electric
- G05B11/36—Automatic controllers electric with provision for obtaining particular characteristics, e.g. proportional, integral, differential
- G05B11/42—Automatic controllers electric with provision for obtaining particular characteristics, e.g. proportional, integral, differential for obtaining a characteristic which is both proportional and time-dependent, e.g. P. I., P. I. D.
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/34—Smoothing or thinning of the pattern; Morphological operations; Skeletonisation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/10—Internal combustion engine [ICE] based vehicles
- Y02T10/40—Engine management systems
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Automation & Control Theory (AREA)
- Computing Systems (AREA)
- Databases & Information Systems (AREA)
- Evolutionary Computation (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Software Systems (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The invention discloses an online optimization control method and system for an unmanned line marking vehicle based on machine vision navigation, belonging to the field of system control. The method comprises the following steps: collecting a waterline image of the road surface; performing waterline detection to obtain the waterline; based on the detected waterline, selecting a monitoring point at the vehicle's current position and calculating the vehicle's running deviation and running deviation change rate, then selecting a monitoring point at the vehicle's predicted position and calculating the predicted running deviation and predicted deviation change rate; designing nonlinear incremental PID control from the predicted running deviation and predicted deviation change rate; and learning the parameters of the nonlinear PID control increment within each region. The method and system realize machine-vision real-time navigation of the line marking vehicle, and design a feedforward controller and a nonlinear incremental PID controller from the tracking error and error change rate, thereby realizing path tracking control of the line marking vehicle based on machine vision navigation.
Description
Technical Field
The invention belongs to the field of system control, and particularly relates to an unmanned line marking vehicle online optimization control method and system based on machine vision navigation.
Background
In highway marking construction, a waterline is usually drawn manually on the road surface, and marking is then performed manually along this waterline. The disclosed system instead adopts machine vision technology: a computer automatically identifies the waterline and controls the marking equipment to perform marking construction along it, i.e., navigation construction based on machine vision; a construction schematic is shown in FIG. 1.
Existing marking-vehicle control based on machine vision navigation has the following problems:
advanced image processing methods are time-consuming and can hardly meet the real-time control requirement of a marking vehicle; they attend only to the gray value of each pixel and struggle to account jointly for foreground and background information;
model-based control methods are unsuitable: the line marking vehicle's structure is too complex for an accurate motion model to be obtained, so a model-based control law for the vehicle cannot be designed;
image-based vehicle navigation reveals only how far the construction vehicle deviates from the planned route and can hardly supply accurate global vehicle coordinates, whereas vehicle models generally output the vehicle's position information;
for such problems, model-free methods such as PID control and fuzzy control are generally adopted, but they rely heavily on manual experience and are hard to optimize, while most optimization methods require an accurate model of the controlled object.
Disclosure of Invention
To solve the problems in the prior art, the invention discloses an online optimization control method for an unmanned line marking vehicle based on machine vision navigation. It realizes machine-vision real-time navigation of the line marking vehicle, designs a feedforward controller and a nonlinear PID controller from the tracking error and error change rate, designs an online learning method for the nonlinear PID controller's parameters, and realizes path tracking control of the line marking vehicle based on machine vision navigation.
The invention adopts the following scheme:
An online optimization control method for an unmanned line marking vehicle based on machine vision navigation comprises the following steps:
s1, collecting a water line image of a road surface;
s2, carrying out waterline detection based on the waterline image to obtain a waterline;
s3, based on the detected waterline, selecting a monitoring point of the current position of the vehicle, and calculating the running deviation of the vehicleRate of change of deviation from runningSelecting the monitoring point of the predicted position of the vehicle, and calculating the predicted running deviation of the vehicleAnd predicting the rate of change of deviation;
S4, according to the predicted operation deviation of the vehicleAnd predicting the rate of change of deviationObtaining a predicted feedforward control quantityBased on running deviation of the vehicleRate of change of deviation from runningObtaining an incremental nonlinear PID control law for a vehicle;
S5, controlling increment on the nonlinear PID in the regionParameter (2) of、、And performing online learning.
The step S2 specifically includes the following steps:
S201, performing gray-scale stretching on each pixel point of the waterline image, the stretching formula being formula (1):
where the three coefficients are stretching factors, m and n denote the m-th row and n-th column of the image, and the stretched quantity is the gray value of the pixel at row m, column n; after the gray scale of the waterline image is stretched, several HARR-like features are selected;
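Formula (1) itself is not reproduced in this text, so the sketch below is a hypothetical stand-in: a piecewise-linear gray stretch with three stretching factors a, b, c (and illustrative breakpoints `low`, `high`), which expands the mid-range contrast where the waterline sits.

```python
import numpy as np

def gray_stretch(img, low=60, high=180, a=0.5, b=2.0, c=0.5):
    """Piecewise-linear gray-level stretch (a hypothetical stand-in for
    the patent's formula (1)). Mid-range pixels in [low, high] are
    expanded by gain b; the dark and bright tails are compressed by
    gains a and c. Output is clipped back to 8-bit range."""
    img = img.astype(np.float64)
    out = np.where(img < low, a * img,
          np.where(img <= high, a * low + b * (img - low),
                   a * low + b * (high - low) + c * (img - high)))
    return np.clip(out, 0, 255).astype(np.uint8)
```

With the defaults, a mid-gray pixel of 100 is mapped to 110 while the tails saturate, increasing the waterline/road contrast before the HARR-like features are computed.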
S202, for the i-th HARR-like feature, the feature value is calculated as in formula (2):
where the two sums are respectively the pixel sum of the i-th HARR-like feature's waterline region and the pixel sum of its road-surface region, both calculated from the gray values of the stretched gray image;
after the HARR-like feature description of each pixel point is obtained, each HARR-like feature is normalized, the normalization formula being formula (3):
where the normalized HARR-like features are computed from the gray mean and the mean of the squared gray values within the detection window;
S203, identifying the feature vector of each pixel point and judging whether it conforms to the straight-line feature; if so, the pixel point is set to 1, otherwise to 0, binarizing the gray image into a binary image B;
the binary image B is subjected to dilation and erosion to obtain a new binary image B', and the edge image E of the waterline is obtained using formula (6):
where the two sides of formula (6) are respectively the pixel value at row m, column n of the new binary image B' and the pixel value at row m, column n of the edge image E;
S204, identifying straight lines in the edge image E using the HOUGH transform:
the coordinates (m, n) of pixels with value 1 are found in the new binary image B'; the HOUGH transform yields the parameter pair (ρ, θ), and formula (7) gives the corresponding straight-line expression;
where ρ and θ are the radius and angle detected by the HOUGH transform;
a pair (ρ, θ) represents a straight line: given (ρ, θ), a point that satisfies formula (7) lies on the straight line represented by formula (7);
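Formula (7) is presumably the standard polar line parameterization ρ = n·cosθ + m·sinθ (row m, column n). A minimal accumulator-voting sketch of the Hough transform, at 1-degree resolution, is shown below; it is an illustration, not the patent's implementation.

```python
import numpy as np

def hough_line(edge):
    """Each edge pixel (m, n) votes for every (rho, theta) satisfying
    rho = n*cos(theta) + m*sin(theta); the accumulator peak gives the
    dominant line's (rho, theta) parameters."""
    H, W = edge.shape
    thetas = np.deg2rad(np.arange(-90, 90))        # 1-degree resolution
    diag = int(np.ceil(np.hypot(H, W)))            # max possible |rho|
    acc = np.zeros((2 * diag + 1, thetas.size), dtype=int)
    ms, ns = np.nonzero(edge)
    for m, n in zip(ms, ns):
        rhos = np.round(n * np.cos(thetas) + m * np.sin(thetas)).astype(int) + diag
        acc[rhos, np.arange(thetas.size)] += 1     # one vote per theta bin
    r, t = np.unravel_index(acc.argmax(), acc.shape)
    return r - diag, thetas[t]
```

In practice `cv2.HoughLines` on the edge image E performs the same accumulation far faster; the sketch only shows where the (ρ, θ) pair of S204 comes from.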
s205, eliminating interference straight lines;
the interference line removing method specifically comprises the following steps:
S205-1: selecting the several longest straight lines whose lengths exceed a threshold length;
S205-2: the straight-line parameters (ρ, θ) detected in the binary images B' of consecutive frames must satisfy formula (8):
where the two bounds in formula (8) are respectively the maximum allowable deviations of the angle and radius, the remaining quantities are the continuity tolerance threshold and the image sampling period;
S205-3: if more than one straight line still satisfies S205-1 and S205-2, the straight line with the highest mean pixel value along it is selected as the detected waterline.
The step S3 specifically includes the following steps: the height and width of the edge image E containing the waterline are H and W;
the intersection of the horizontal line at height y with the waterline is selected as the monitoring point (x, y) of the vehicle's current position, and the vehicle's running deviation e and running deviation change rate ec are:
the intersection of a horizontal line at the prediction height with the waterline is selected as the monitoring point of the vehicle's predicted position, giving the predicted running deviation and predicted deviation change rate:
where W is the width of the edge image E.
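With the waterline given by its HOUGH parameters (ρ, θ), the monitoring-point computation of S3 can be sketched as follows. The metres-per-pixel calibration factor g (mentioned later for the running-error prediction unit) defaults to 1; the exact deviation formulas are images in the source, so this is a hedged reading: deviation is the signed offset of the waterline crossing from the image centre, and its change rate is a first difference over the sampling period.

```python
import math

def waterline_x(rho, theta, y):
    # column where the line rho = x*cos(theta) + y*sin(theta)
    # crosses the horizontal scan line at image row y
    return (rho - y * math.sin(theta)) / math.cos(theta)

def run_deviation(rho, theta, y, W, g=1.0):
    """Lateral running deviation at scan row y: signed distance between
    the waterline crossing and the image centre W/2, scaled by g (the
    actual length represented by each pixel, from camera calibration)."""
    return (waterline_x(rho, theta, y) - W / 2.0) * g

def deviation_rate(e_now, e_prev, T):
    # deviation change rate as a first difference over sampling period T
    return (e_now - e_prev) / T
```

Evaluating `run_deviation` at a second scan row closer to the top of the image gives the predicted deviation for the look-ahead point in the same way.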
S4 specifically comprises the following steps:
S401, calculating the vehicle's feedforward control quantity from the predicted running deviation and predicted deviation change rate:
where the result is the calculated feedforward control quantity, and the two coefficients multiplying the predicted deviation and predicted deviation change rate are the two parameters of predictive control;
S402, within the regions of the e-ec plane, obtaining the vehicle's incremental nonlinear PID control law from the running deviation e and running deviation change rate ec;
Step S402 specifically includes the following steps: constructing a non-uniform division of the error e and the error change rate ec based on a Gaussian function, where the e-ec plane takes the error e as its horizontal axis and the error change rate ec as its vertical axis; the error e and error change rate ec refer to the vehicle's running deviation and running deviation change rate. The nonlinear division method adopted for the e-ec plane is:
when computing the division of the error change rate ec, the range bound is the maximum absolute value of ec, and when computing the division of the error e, it is the maximum absolute value of e; the equal uniform division points of the error e or error change rate ec are mapped to non-uniform division points, and a nonlinearity adjustment factor controls the degree of the mapping;
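The Gaussian mapping itself is an image in the source; one plausible stand-in with the same ingredients (range bound X, uniform division points, a nonlinearity adjustment factor) is sketched below. It warps uniform points toward the origin so that regions near zero error are finer, which matches the stated intent of the non-uniform division.

```python
import numpy as np

def nonuniform_points(X, n, lam=2.0):
    """Illustrative Gaussian-based warp for the non-uniform e-ec
    partition: 2n+1 uniform division points in [-X, X] are pulled
    toward the origin, so regions near zero error are finer. lam is
    the nonlinearity adjustment factor; the patent's exact formula is
    not reproduced here."""
    x = np.linspace(-X, X, 2 * n + 1)      # equal uniform division points
    scale = 1.0 - np.exp(-lam)             # normalise so |x| = X maps to X
    return np.sign(x) * X * (1 - np.exp(-lam * (x / X) ** 2)) / scale
```

Applying this to both axes (with each axis's own maximum absolute value as X) yields the grid of regions sketched in FIG. 5 and FIG. 6.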
PID control is performed within each divided region, and the nonlinear PID control increment is:
where the increment denotes the nonlinear PID control increment of the region at sampling time k;
k denotes the sampling time; each region has its own proportional coefficient, integral coefficient and differential coefficient;
the nonlinear PID control weighted-average increment is calculated from the nonlinear PID control increments of all regions:
where each region has an incremental control-law weight, radii for the error e and the error change rate ec, and a centre;
in the increment factor, one coefficient is its maximum value and the other is an offset;
a further term describes the degree to which the deviation and the deviation change rate depart from the origin, with a scaling factor;
the method thus gives different increment factors for different errors e and error change rates ec.
The increment factor is designed to supply controller increments under different error e and error change rate ec conditions; it improves the system's response speed and reduces the complexity of the optimization process.
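The per-region controller and its blending can be sketched as follows. The region weighting (Gaussian in the distance to each region centre) and the increment factor (growing with the distance of (e, ec) from the origin, saturating at a maximum plus an offset) are hedged readings of the description; the constants and the region record layout are illustrative, since the actual formulas are images in the source.

```python
import math

def incremental_pid(e_hist, kp, ki, kd):
    """Standard incremental PID on the last three error samples:
    du = kp*(e_k - e_{k-1}) + ki*e_k + kd*(e_k - 2*e_{k-1} + e_{k-2})."""
    e2, e1, e0 = e_hist[-3], e_hist[-2], e_hist[-1]
    return kp * (e0 - e1) + ki * e0 + kd * (e0 - 2 * e1 + e2)

def weighted_increment(e_hist, ec, regions):
    """Weighted average of per-region increments: Gaussian weights in
    the normalised distance from (e, ec) to each region centre, then
    scaled by an increment factor alpha that grows with the distance of
    (e, ec) from the origin. Each region r is a dict with centre
    ('ce', 'cec'), radii ('re', 'rec') and gains ('kp', 'ki', 'kd')."""
    e = e_hist[-1]
    num = den = 0.0
    for r in regions:
        w = math.exp(-(((e - r['ce']) / r['re']) ** 2
                       + ((ec - r['cec']) / r['rec']) ** 2))
        num += w * incremental_pid(e_hist, r['kp'], r['ki'], r['kd'])
        den += w
    alpha_max, beta, sigma = 1.0, 0.1, 0.5       # illustrative constants
    alpha = alpha_max * (1 - math.exp(-sigma * (e * e + ec * ec))) + beta
    return alpha * num / den
```

The smooth weights make the overall control law continuous across region boundaries, while the factor alpha gives large corrections when (e, ec) is far from the origin and gentle ones near it.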
The step S5 specifically includes the following steps:
when a divided region contains a number of (e, ec) pairs, online learning is performed on the parameters (the proportional, integral and differential coefficients) of the nonlinear PID control increment within that region;
the parameters of the nonlinear PID control increment are learned using the supervised Hebb learning rule:
where the learning rate of each region is adjusted by an online adjustment rule:
in which each region has a matching coefficient that adjusts the learning-rate range, and two weight coefficients.
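A common form of the supervised Hebb rule for PID gains is sketched below: each weight moves by the product of learning rate, tracking error, control output and its own input term, after which the weights are normalised. The patent's exact update and its learning-rate adjustment rule are images in the source, so this is only an illustrative sketch of the named technique.

```python
def hebb_update(w, eta, e, u, x):
    """Supervised Hebb update for the three PID gains of one region:
    w_i <- w_i + eta * e(k) * u(k) * x_i(k), where x_i are the three
    PID input terms (proportional, integral, differential parts of the
    error), followed by L1 normalisation of the weights."""
    w = [wi + eta * e * u * xi for wi, xi in zip(w, x)]
    s = sum(abs(wi) for wi in w) or 1.0       # avoid division by zero
    return [wi / s for wi in w]
```

Run once per sampling instant for whichever region currently contains (e, ec), this adapts that region's gains online without any model of the marking vehicle.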
The online optimization control system for an unmanned line marking vehicle based on machine vision navigation comprises a vision sensor, a waterline detection unit, a running-error prediction unit, a predictive controller, a nonlinear incremental PID controller and an online learning rule unit;
the vision sensor collects a waterline image of the road surface;
the waterline detection unit performs waterline detection based on the waterline image to obtain the waterline;
the running-error prediction unit selects a monitoring point at the vehicle's current position based on the detected waterline and calculates the vehicle's running deviation e and running deviation change rate ec, then selects a monitoring point at the vehicle's predicted position and calculates the predicted running deviation and predicted deviation change rate.
The predictive controller predicts the vehicle's feedforward control quantity from the predicted running deviation and predicted deviation change rate;
the nonlinear incremental PID controller, within the nonlinear division regions of the e-ec plane, obtains the vehicle's incremental nonlinear PID control law from the running deviation e and running deviation change rate ec;
the online learning rule unit learns the parameters of the nonlinear PID control increment within each region.
The working process of the waterline detection unit specifically comprises the following steps:
S201, performing gray-scale stretching on each pixel point of the waterline image, the stretching formula being formula (1):
where the three coefficients are stretching factors, m and n denote the m-th row and n-th column of the image, and the stretched quantity is the gray value of the pixel at row m, column n; after the gray scale of the waterline image is stretched, several HARR-like features are selected to express the straight-line feature of each pixel point;
S202, for the i-th HARR-like feature, the feature value is calculated as in formula (2):
where the two sums are respectively the pixel sum of the i-th HARR-like feature's waterline region and the pixel sum of its road-surface region, both calculated from the gray values of the stretched gray image;
after the HARR-like feature description of each pixel point is obtained, each HARR-like feature is normalized, the normalization formula being formula (3):
where the i-th normalized HARR-like feature is computed from the gray mean and the mean of the squared gray values within the detection window;
the feature vector of each pixel point is constructed from its normalized HARR-like features:
S203, identifying the feature vector of each pixel point and judging whether it conforms to the straight-line feature; if so, the pixel point is set to 1, otherwise to 0, binarizing the gray image into a binary image B;
the binary image B is subjected to dilation and erosion to obtain a new binary image B', and the edge image E of the waterline is obtained using formula (6),
where the two sides of formula (6) are respectively the pixel value at row m, column n of the new binary image B' and the pixel value at row m, column n of the edge image E;
S204, identifying straight lines in the edge image E using the HOUGH transform:
the coordinates (m, n) of pixels with value 1 are found in the new binary image B'; the HOUGH transform yields the parameter pair (ρ, θ), and formula (7) gives the corresponding straight-line expression;
where ρ and θ are the radius and angle detected by the HOUGH transform;
a pair (ρ, θ) represents a straight line: given (ρ, θ), a point that satisfies formula (7) lies on the straight line represented by formula (7);
s205, eliminating interference straight lines;
the interference line removing method specifically comprises the following steps:
S205-1, selecting the several longest straight lines whose lengths exceed the threshold length;
S205-2, the straight-line parameters (ρ, θ) detected in the binary images B' of consecutive frames must satisfy formula (8):
where the two bounds are respectively the maximum allowable deviations of the angle and radius, and the remaining quantities are the continuity tolerance threshold and the image sampling period;
S205-3, if two or more straight lines remain after S205-1 and S205-2, the straight line with the highest mean pixel value along it is selected as the detected waterline.
The working process of the running-error prediction unit is specifically as follows: the height and width of the edge image E containing the waterline are H and W; the intersection of the horizontal line at height y with the waterline is selected as the monitoring point (x, y) of the vehicle's current position, and the vehicle's running deviation e and running deviation change rate ec are calculated as:
where the scale factor is the actual length represented by each pixel, obtained by camera calibration;
the intersection of a horizontal line at the prediction height with the waterline is selected as the monitoring point of the vehicle's predicted position, giving the predicted running deviation and predicted deviation change rate:
The predictive controller calculates the vehicle's feedforward control quantity as:
where the result is the predictive controller's feedforward control quantity and the two coefficients are the predictive controller's two parameters;
the nonlinear incremental PID controller, within the nonlinear division regions of the e-ec plane, obtains the vehicle's incremental nonlinear PID control law from the running deviation e and running deviation change rate ec;
a non-uniform division of the error e and the error change rate ec is constructed based on a Gaussian function, where the e-ec plane takes the error e as its horizontal axis and the error change rate ec as its vertical axis; the error and error change rate refer to the vehicle's running deviation and running deviation change rate;
the e-ec plane nonlinear division method is as follows:
when computing the division of the error change rate ec, the range bound is the maximum absolute value of ec, and when computing the division of the error e, it is the maximum absolute value of e; the equal division points of the error e or error change rate ec are mapped to non-uniform division points, and a nonlinearity adjustment factor controls the degree of the mapping.
According to the ranges of the error e and the error change rate ec, the e-ec plane is divided non-uniformly, and the set of divided regions is recorded;
PID control is performed by the nonlinear incremental PID controller within each divided region, the nonlinear PID control increment being:
where the increment denotes the nonlinear PID control increment of the region at sampling time k;
k denotes the sampling time; each region has its own proportional coefficient, integral coefficient and differential coefficient;
the nonlinear PID control weighted-average increment is calculated from the nonlinear PID control increments of all regions:
where each region has an incremental control-law weight, radii for the error e and the error change rate ec, and a centre;
the incremental nonlinear PID control law at time k is calculated from the incremental control-law weights as:
where the increment factor is defined as follows:
in which one coefficient is the maximum value of the increment factor and the other is an offset;
a further term describes the degree to which the vehicle's running deviation e and running deviation change rate ec depart from the origin;
the method thus gives different increment factors for different errors e and error change rates ec.
The increment factor supplies controller increments under different running deviation and deviation change rate conditions; it improves the system's response speed and reduces the complexity of the optimization process.
The working process of the online learning rule unit is specifically as follows:
when a divided region contains a number of (e, ec) pairs, online learning is performed on the parameters (the proportional, integral and differential coefficients) of the nonlinear PID control increment within that region. The parameters of the nonlinear PID control increment are learned using the supervised Hebb learning rule:
where each region has its own learning rate; to improve learning efficiency, the learning rate is adjusted by an online adjustment rule, namely:
in which each region has a matching coefficient that adjusts the learning-rate range, and two weight coefficients.
Compared with the prior art, the invention has the beneficial effects that:
The application discloses an online optimization control method for an unmanned line marking vehicle based on machine vision navigation. It adopts new HARR-like features and an image processing method to realize machine-vision real-time navigation of the line marking vehicle, designs a feedforward controller and a nonlinear PID controller from the tracking error and error change rate, and discloses an online learning method for the nonlinear PID controller's parameters, realizing path tracking control of the line marking vehicle based on machine vision navigation.
The application discloses a new HARR-like feature for extracting the straight-line feature of the waterline, which facilitates later waterline detection; it not only meets the real-time requirement of image detection but also fully considers the background information around the waterline, improving the success rate of waterline detection.
Based on the detected waterline edge, the waterline in each image frame is extracted with the HOUGH transform, and a new interference-waterline filtering method is proposed, making waterline identification robust. According to the respective ranges of the tracking error and error change rate, the e-ec plane is divided into regions, an incremental PID controller is designed in each region, and a nonlinear incremental PID controller is finally constructed, achieving fast trajectory tracking of the marking vehicle and improving its tracking precision. The parameters of the nonlinear PID controller are adjusted with the supervised Hebb learning rule, realizing online optimization of the nonlinear PID controller. A feedforward predictive controller is designed from the predicted tracking deviation, improving the waterline adaptability of marking-vehicle control as well as its control precision and anti-interference capability.
Drawings
FIG. 1 is a schematic view of a line marking vehicle construction;
FIG. 2 is a schematic of the HARR-like features;
FIG. 3 is a schematic of an operating deviation and predicted deviation calculation;
FIG. 4 is an online optimizing control system of an unmanned line marking vehicle based on machine vision navigation;
FIG. 5 e-ec plane non-linear division diagram;
FIG. 6 e-ec plane division area schematic diagram;
fig. 7 is a water line image schematic.
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the accompanying drawings, and it is to be understood that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The on-line optimizing control method of the unmanned line marking vehicle based on the machine vision navigation comprises the following steps:
S1, a waterline image of the road surface is collected to obtain a data sample; the waterline image is shown in FIG. 7:
A gray-level image of the asphalt pavement is acquired by the vision sensor mounted on the line marking vehicle shown in FIG. 1. A waterline has been drawn on the pavement, so waterline information is present in the image. The vision sensor is mounted 35-45 cm above the asphalt pavement, which ensures a field-of-view length of 30-40 cm in the travel direction of the line marking vehicle. To protect the vision sensor against interference from external light, a shading device is generally fitted over it.
S2, carrying out waterline detection based on the waterline image to obtain a waterline;
the step S2 specifically includes the following steps:
S201, gray-scale stretching is performed on each pixel point of the waterline image; the stretching formula is shown as formula (1):
wherein the stretch factors are empirically set constants, and m, n denote the m-th row and n-th column of the image, i.e. the gray value of the pixel in the m-th row and n-th column is stretched. After the gray values of the waterline image are stretched, the straight-line characteristic of each pixel point is described by HARR-like features, which are shown in FIG. 2. In this embodiment, 6 HARR-like features are selected according to the engineering conditions; many such features are possible, and the different regions within the HARR-like features M1-M6 are laid out according to the actual situation. Assuming the width of the waterline region is 2w, where w is half the waterline width, the widths of the road-surface region and the transition region in the HARR-like features M1-M3 are w. In the HARR-like features M4-M6, the width of each region is 2w, and the height of all HARR-like features is 4w-6w. For the rotated features M2, M3, M5 and M6, the rotation angle is 15 degrees, determined by the on-site waterline characteristics.
Because the boundary of the waterline region is easily blurred, a transition region is arranged between the road-surface region and the waterline region in this embodiment to prevent the blurred waterline boundary from interfering with feature extraction; that is, the transition region is ignored when the features are extracted.
S202, the HARR-like features highlight the straight-line feature of the waterline on the road surface, which facilitates the subsequent waterline identification. For the i-th HARR-like feature, the calculation method is as shown in formula (2):
In this embodiment, Q = 6; the pixel sum of the waterline region of the i-th HARR-like feature and the pixel sum of its road-surface region are both calculated from the gray values of the stretched gray image;
After the HARR-like feature description of each pixel point is obtained, each HARR-like feature is normalized; the normalization formula is shown as formula (3):
wherein the normalized HARR-like feature is computed from the mean gray level and the mean squared gray level within the detection window;
Based on the 6 normalized HARR-like features (see formula (3)), the feature vector of each pixel point is constructed:
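Formulas (1)-(4) appear only as images in the original publication, so the sketch below is illustrative, not the patent's exact computation: a simple linear gray stretch stands in for formula (1), an upright HARR-like feature of type M1 (waterline region of width 2w flanked by ignored transition strips and road-surface regions of width w) stands in for formula (2), and a variance-based normalization over the detection window stands in for formula (3). All parameter values are assumptions.

```python
import numpy as np

def gray_stretch(img, low=50.0, high=200.0):
    """Stand-in for formula (1): values below `low` map to 0, values above
    `high` map to 255, with a linear stretch in between."""
    out = (img.astype(np.float64) - low) * 255.0 / (high - low)
    return np.clip(out, 0.0, 255.0)

def haar_m1_response(img, m, n, w):
    """Stand-in for formula (2), feature M1 at pixel (m, n): waterline
    region of width 2w centered on column n, road-surface regions of
    width w on each side, transition strips of width w ignored.
    Response = (pixel sum of waterline region) - (pixel sum of road regions)."""
    rows = slice(m, m + 4 * w)            # feature height in the 4w-6w range
    water = img[rows, n - w:n + w].sum()
    road = (img[rows, n - 3 * w:n - 2 * w].sum()
            + img[rows, n + 2 * w:n + 3 * w].sum())
    return water - road

def normalize_feature(f, window):
    """Stand-in for formula (3): normalize a feature value by the standard
    deviation derived from the mean and mean-square gray level of the
    detection window."""
    mu = window.mean()
    mu_sq = (window.astype(np.float64) ** 2).mean()
    sigma = np.sqrt(max(mu_sq - mu * mu, 1e-12))
    return f / (sigma * window.size)
```

With a bright vertical stripe in a dark image, the M1 response is strongly positive on the stripe, which is what makes the subsequent pixel classification separable.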
S203, to detect the waterline in the image, the feature vector of each pixel point is classified to judge whether it conforms to the straight-line feature. If it does, the pixel is set to 1; otherwise it is set to 0, which binarizes the gray image and yields the binary image B. The innovation of the method is that the image binarization problem is turned into a pattern-recognition problem on the features.
The feature vector is classified by the SVM method; the kernel function is formula (5):
Formula (5) is the kernel function in the SVM classification method; it maps the feature vector into a higher dimension, and the SVM then judges whether the feature is a straight-line feature, performing binary classification. A feature vector is constructed for each pixel point from the HARR-like features, and the SVM judges whether the pixel point represented by each feature vector lies on a straight line;
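Formula (5) itself is shown only in the drawings; a common kernel choice for such SVM classifiers is the Gaussian (RBF) kernel. The sketch below shows only the decision step, with hypothetical support vectors and coefficients standing in for a trained model; it is not the patent's trained classifier.

```python
import numpy as np

def rbf_kernel(a, b, gamma=0.5):
    """Gaussian (RBF) kernel, one common choice for the kernel in
    formula (5); the patent's exact kernel is not reproduced in the text."""
    return np.exp(-gamma * np.sum((a - b) ** 2))

def svm_decision(x, support_vectors, coeffs, bias, gamma=0.5):
    """Kernel-SVM decision: sum_i alpha_i * y_i * K(sv_i, x) + b.
    The pixel is labeled 1 (on the line) if the value is positive, else 0."""
    s = sum(c * rbf_kernel(sv, x, gamma)
            for sv, c in zip(support_vectors, coeffs))
    return 1 if s + bias > 0 else 0
```

Applying `svm_decision` to every pixel's feature vector produces the binary image B described in S203.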
After the binary image B is obtained, dilation and erosion are applied to B to obtain a new binary image, and the edge image E of the waterline is obtained by formula (6),
wherein formula (6) relates the pixel value in the m-th row and n-th column of the new binary image to the pixel value in the m-th row and n-th column of the edge image E;
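Formula (6) is not reproduced in the text. One plausible reading of the dilation-erosion step followed by edge extraction is a morphological closing (dilate, then erode, to fill small holes) followed by subtraction of the eroded result, i.e. the inner morphological gradient. The sketch below implements that reading with a 3x3 structuring element; the patent's actual operator may differ.

```python
import numpy as np

def dilate(b):
    """Binary dilation with a 3x3 structuring element (zero padding)."""
    p = np.pad(b, 1)
    out = np.zeros_like(b)
    for dm in range(3):
        for dn in range(3):
            out |= p[dm:dm + b.shape[0], dn:dn + b.shape[1]]
    return out

def erode(b):
    """Binary erosion with a 3x3 structuring element (zero padding)."""
    p = np.pad(b, 1)
    out = np.ones_like(b)
    for dm in range(3):
        for dn in range(3):
            out &= p[dm:dm + b.shape[0], dn:dn + b.shape[1]]
    return out

def waterline_edge(binary_b):
    """Dilation-erosion cleanup followed by edge extraction:
    E = B' - erode(B'), one plausible reading of formula (6)."""
    b_new = erode(dilate(binary_b))      # morphological closing
    return b_new - erode(b_new)
```

On a solid 3x3 block the result is the 8-pixel boundary ring, which is the one-pixel-wide edge the Hough step needs.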
S204, straight lines in the edge image E are identified by the Hough transform:
The coordinates of the pixels with value 1 are found in the new binary image, and the Hough transform yields the straight-line expression corresponding to formula (7);
wherein the radius and angle (the variable representation of a straight line) are those detected by the Hough transform.
A (radius, angle) pair represents a straight line; given such a pair, if a point satisfies formula (7), the point lies on the straight line represented by formula (7);
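A minimal Hough accumulator illustrating S204: each edge pixel at (x, y) votes for every (radius, angle) pair satisfying the line parameterization rho = x·cos(theta) + y·sin(theta) of formula (7), and the accumulator peak gives the detected line. This is the textbook transform, not the patent's tuned implementation.

```python
import numpy as np

def strongest_line(edge, n_theta=180):
    """Return (rho, theta) of the strongest line in a binary edge image,
    where rho = x*cos(theta) + y*sin(theta) as in formula (7)."""
    ys, xs = np.nonzero(edge)
    diag = int(np.ceil(np.hypot(edge.shape[0], edge.shape[1])))
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((2 * diag + 1, n_theta), dtype=int)
    for x, y in zip(xs, ys):
        # each edge pixel votes once per theta at its corresponding rho
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[rhos + diag, np.arange(n_theta)] += 1
    r, t = np.unravel_index(acc.argmax(), acc.shape)
    return r - diag, thetas[t]
```

For a horizontal edge row at y = 5, the peak lands at rho = 5 with theta near pi/2, matching the parameterization.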
S205, owing to noise in the image, multiple straight lines are detected, and the interfering straight lines must be removed to obtain the real straight line. The interference-line removal method is specifically as follows:
S205-1: the 3-5 longest straight lines whose length meets the threshold length are selected (the length is greater than or equal to 1/4 of the image height; in this embodiment the threshold length is 1/4 of the image height);
S205-2: because the waterline is continuous, the straight-line parameters detected in two successive frames satisfy the requirement of formula (8), namely:
wherein the two thresholds are respectively the maximum deviations of angle and radius allowed by continuity, and the index is the sampling instant of the image.
S205-3: if multiple straight lines (2 or more) still remain after S205-1 and S205-2, the straight line with the highest average pixel value along it is selected as the detected waterline, because the waterline is generally white and has the highest brightness in the image.
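The three filtering rules of S205 can be sketched as follows. The tuple layout of a detected line (rho, theta, length, mean brightness) and the threshold values are illustrative assumptions; formula (8) is represented by the per-parameter continuity check against the previous frame.

```python
def filter_lines(lines, img_h, prev_line=None, d_theta=0.1, d_rho=10.0):
    """Interference-line removal (S205) with hypothetical thresholds.
    `lines` holds (rho, theta, length, mean_brightness) tuples.
    1) keep at most the 5 longest lines with length >= img_h / 4;
    2) keep lines consistent with the previous frame (formula (8));
    3) if several remain, pick the brightest (the waterline is white)."""
    kept = sorted((l for l in lines if l[2] >= img_h / 4.0),
                  key=lambda l: l[2], reverse=True)[:5]
    if prev_line is not None:
        kept = [l for l in kept
                if abs(l[0] - prev_line[0]) <= d_rho
                and abs(l[1] - prev_line[1]) <= d_theta]
    if not kept:
        return None
    return max(kept, key=lambda l: l[3])   # highest mean brightness wins
```

In the example below the short line and the line inconsistent with the previous frame are discarded, and the brightest survivor is returned.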
S3, based on the detected waterline, a monitoring point at the current position of the vehicle is selected and the running deviation and the running-deviation change rate of the vehicle are calculated; a monitoring point at the predicted position of the vehicle is selected and the predicted running deviation and the predicted deviation change rate of the vehicle are calculated;
The detected waterline is shown in FIG. 3. H and W are respectively the height and width of the edge image E (the same as the height and width of the waterline image). The intersection of the horizontal line at 0.7H with the waterline is taken as the monitoring point of the current position of the vehicle; substituting 0.7H for y in formula (7) gives the abscissa of this monitoring point. The running deviation and the running-deviation change rate of the vehicle are then:
0.7H is the value of y chosen in this embodiment. In general, the distance between the two horizontal lines in FIG. 3 should be at least the distance the vehicle can travel in one control cycle, but the two horizontal lines must not be too close to the upper and lower boundaries of the image. According to the set vehicle speed and the field of view of the vision sensor, this embodiment selects the intersection of the horizontal line at 0.7H with the waterline as the monitoring point of the current position of the vehicle;
wherein the scale factor, obtained by camera calibration, represents the actual length per pixel (the actual distance represented by one pixel unit). In this embodiment the predicted position is taken at 0.3H; the intersection of the horizontal line at 0.3H with the waterline is taken as the monitoring point of the predicted position of the vehicle, giving the predicted running deviation and the predicted deviation change rate of the vehicle:
This division is reasonable because the vehicle speed is approximately 40 m/min and the effective field of view of the vision sensor is 20-30 cm.
As seen in FIG. 3, the error at the current position is to the left, but at the predicted position it is to the right. Therefore the error must not be over-adjusted at the current position, and the current adjustment can be fine-tuned using the prediction error.
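A sketch of the S3 deviation calculation under stated assumptions: the waterline abscissa at a given image row follows from formula (7), the deviation is measured from the vertical center line of the image and scaled by the calibrated length per pixel, and the change rates are taken as backward differences between frames. The patent's exact expressions appear only in the drawings.

```python
import math

def line_x_at(rho, theta, y):
    """Abscissa of the detected waterline (formula (7)) at row y:
    x = (rho - y*sin(theta)) / cos(theta)."""
    return (rho - y * math.sin(theta)) / math.cos(theta)

def deviations(rho, theta, H, W, k_px, e_prev, ep_prev, T):
    """Running deviation at 0.7H and predicted deviation at 0.3H.
    k_px is the calibrated length per pixel and T the image sampling
    period; measuring from the image center W/2 and using backward
    differences for the change rates are assumptions of this sketch."""
    e = (line_x_at(rho, theta, 0.7 * H) - W / 2.0) * k_px
    ep = (line_x_at(rho, theta, 0.3 * H) - W / 2.0) * k_px
    ec = (e - e_prev) / T
    ecp = (ep - ep_prev) / T
    return e, ec, ep, ecp
```

For a vertical waterline (theta = 0) offset 10 pixels from center, both monitoring points report the same deviation, as expected.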
S4, the predicted feedforward control quantity is obtained from the predicted running deviation and the predicted deviation change rate of the vehicle, and the incremental nonlinear PID control law of the vehicle is obtained from the running deviation and the running-deviation change rate of the vehicle;
Step S4 specifically includes the following steps:
S401, the feedforward control quantity of the vehicle is predicted from the predicted running deviation and the predicted deviation change rate:
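The feedforward formula itself is not reproduced in the text. A PD-type combination of the predicted deviation and its change rate, with the two predictor parameters the system description mentions, is one plausible form; the gain values here are purely illustrative.

```python
def feedforward(e_pred, ec_pred, k1=0.4, k2=0.1):
    """Hypothetical predictive feedforward control quantity: a PD-type
    combination of the predicted running deviation and its change rate.
    k1 and k2 stand in for the two predictor parameters; their values
    are illustrative only."""
    return k1 * e_pred + k2 * ec_pred
```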
S402, in the regions of the e-ec plane, the incremental nonlinear PID control law of the vehicle is obtained from the running deviation and the running-deviation change rate of the vehicle. The error e and the error change rate ec of the system follow Gaussian distributions, so a non-uniform division method for the error and the error change rate is constructed based on a Gaussian function; the e-ec plane takes the error e as the horizontal axis and the error change rate ec as the vertical axis. Here the error and the error change rate refer to the running deviation and the running-deviation change rate of the vehicle. The e-ec plane is divided by a nonlinear division method:
When the division of the error change rate ec is calculated, the maximum used is that of |ec|; when the division of the error e is calculated, the maximum used is that of |e|. The equally spaced uniform division points of the error e or the error change rate ec are mapped to non-uniform division points, with a factor adjusting the degree of nonlinearity; the non-uniform division diagram is shown in FIG. 5.
According to the ranges of the error e and the error change rate ec, the e-ec plane is divided non-uniformly, and the set of divided regions (each square in FIG. 6 is one region) is recorded, as shown in FIG. 6:
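A sketch of the Gaussian-based non-uniform division: equally spaced division points are mapped through a Gaussian-type function so that the regions are finer near zero, where Gaussian-distributed errors concentrate, and coarser toward the extremes. The specific mapping below is an assumption; the patent's formula appears only in the drawings.

```python
import numpy as np

def nonuniform_points(max_abs, n, alpha=2.0):
    """Map 2n+1 equally spaced division points of e (or ec) on
    [-max_abs, max_abs] to non-uniform points via a Gaussian-type mapping
    v = sign(u) * s * (1 - exp(-alpha*(u/max_abs)**2)),
    with s chosen so the endpoints stay at +/- max_abs. This compresses
    points toward zero, giving finer regions where errors cluster."""
    u = np.linspace(-max_abs, max_abs, 2 * n + 1)   # uniform points
    scale = max_abs / (1.0 - np.exp(-alpha))
    return np.sign(u) * scale * (1.0 - np.exp(-alpha * (u / max_abs) ** 2))
```

The mapping is odd, strictly increasing, preserves the endpoints, and pulls interior points toward the origin.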
In each divided region, PID control is performed by a nonlinear incremental PID controller; the nonlinear PID control increment is:
wherein the increment denotes the nonlinear PID control increment of the region at the sampling time;
the subscripts denote the row and column of the region, and each region has its own proportional coefficient, integral coefficient and differential coefficient;
Based on the nonlinear PID control increments of all regions, the weighted-average nonlinear PID control increment is calculated:
wherein each region has an incremental control-law weight, a radius in the error e and the error change rate ec (the square in FIG. 6), and a center.
Combining the above incremental control-law weights, the incremental nonlinear PID control law at the sampling time is calculated as:
wherein the maximum value of the increment factor and an offset appear, together with a flexibility factor describing the degree to which the running deviation and the running-deviation change rate deviate from the origin.
This method allows different e and ec to have different increment factors.
The increment factor under different running-deviation and deviation-change-rate conditions improves the response speed of the system and reduces the complexity of the optimization process.
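The per-region incremental PID with an increment factor can be sketched as below, under stated assumptions: the standard incremental PID form Δu = Kp·Δe + Ki·e + Kd·Δ²e, a region lookup on the non-uniform partitions, and an increment factor that grows with the distance of (e, ec) from the origin. The cross-region weighted averaging described above is omitted for brevity, and all gain and factor values are illustrative.

```python
import numpy as np

def region_index(x, points):
    """Index of the partition interval (along one axis) that x falls into."""
    return int(np.clip(np.searchsorted(points, x) - 1, 0, len(points) - 2))

def nonlinear_pid_increment(e_hist, ec, e_pts, ec_pts, Kp, Ki, Kd,
                            du_max=1.0, beta=0.2, lam=1.0):
    """Incremental PID using the gains of the (e, ec) region, scaled by an
    increment factor du_max*(1 - exp(-lam*(e^2 + ec^2))) + beta, which is
    an assumed form: larger deviations from the origin get larger steps."""
    e_k, e_k1, e_k2 = e_hist            # e(k), e(k-1), e(k-2)
    i = region_index(e_k, e_pts)
    j = region_index(ec, ec_pts)
    du = (Kp[i][j] * (e_k - e_k1) + Ki[i][j] * e_k
          + Kd[i][j] * (e_k - 2 * e_k1 + e_k2))
    factor = du_max * (1.0 - np.exp(-lam * (e_k ** 2 + ec ** 2))) + beta
    return factor * du
```

With unit proportional gains and zero integral/differential gains, the increment reduces to the factor times the error step, which makes the scaling easy to verify.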
S5, online learning is performed on the parameters of the nonlinear PID control increment within each region;
S5 specifically comprises: when a sample pair falls in a divided region, the parameters of the nonlinear PID control increment of that region are learned online.
The parameters of the incremental nonlinear PID control are learned using the supervised Hebb learning rule:
wherein each region has its own learning rate. To improve the learning efficiency, the learning rate is adjusted based on an online adjustment rule, namely:
wherein each region has a matching coefficient that adjusts the range of the learning rate, together with two weight coefficients. The two weight coefficients may be given manually based on historical data.
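The supervised Hebb update of S5 can be sketched with the classic single-neuron adaptive-PID rule as a stand-in, since the patent's exact update and learning-rate rule appear only in the drawings: each gain moves in proportion to the product of the teacher signal (the error), the control output, and its own PID input term. The learning-rate adjustment form below is hypothetical.

```python
def hebb_update(K, e_hist, u, etas):
    """Supervised Hebb update of the regional PID parameters (Kp, Ki, Kd),
    using the classic single-neuron adaptive-PID rule as a stand-in:
    K_i += eta_i * e * u * x_i, where x_i are the incremental PID inputs."""
    e_k, e_k1, e_k2 = e_hist
    x = (e_k - e_k1, e_k, e_k - 2 * e_k1 + e_k2)   # P, I, D input terms
    return [k + eta * e_k * u * xi for k, eta, xi in zip(K, etas, x)]

def adjust_rate(eta, e_k, c=0.1, w1=1.0, w2=0.5):
    """Hypothetical online learning-rate adjustment: the rate grows with
    |e| so large errors learn faster, shaped by the matching coefficient c
    and the two weight coefficients w1, w2 given from historical data."""
    return eta * (w1 + w2 * c * abs(e_k))
```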
As shown in fig. 4, the online optimizing control system of the unmanned line marking vehicle based on machine vision navigation comprises a vision sensor, a waterline detection unit, an operation error prediction unit, a prediction controller, a nonlinear increment PID controller and an online learning rule unit;
The vision sensor collects a waterline image of the road surface (FIG. 7) and acquires a data sample:
A gray-level image of the asphalt pavement is acquired by the vision sensor mounted on the line marking vehicle shown in FIG. 1. A waterline has been drawn on the pavement in advance, so waterline information is present in the image. The vision sensor is mounted 35-45 cm above the asphalt pavement, which ensures a field-of-view length of 30-40 cm in the travel direction of the line marking vehicle. To protect the vision sensor against interference from external light, a shading device is generally fitted over it.
The waterline detection unit performs waterline detection based on the waterline image to obtain a waterline;
the working process of the waterline detection unit specifically comprises the following steps:
S201, gray-scale stretching is performed on each pixel point of the waterline image; the stretching formula is shown as formula (1):
wherein the stretch factors are empirically set constants, and m, n denote the m-th row and n-th column of the image, i.e. the gray value of the pixel in the m-th row and n-th column is stretched. After the gray values of the waterline image are stretched, the straight-line characteristic of each pixel point is described by HARR-like features, which are shown in FIG. 2:
In this embodiment, 6 HARR-like features are selected according to the engineering conditions; many such features are possible, and the different regions within the HARR-like features M1-M6 are laid out according to the actual situation. Assuming the width of the waterline region is 2w, where w is half the waterline width, the widths of the road-surface region and the transition region in the HARR-like features M1-M3 are w. In the HARR-like features M4-M6, the width of each region is 2w, and the height of all HARR-like features is 4w-6w. For the rotated features M2, M3, M5 and M6, the rotation angle is 15 degrees, determined by the on-site waterline characteristics.
S202, the HARR-like features highlight the straight-line feature of the waterline on the road surface, which facilitates the subsequent waterline identification. For the i-th HARR-like feature, the calculation method is as shown in formula (2):
In this embodiment, Q = 6; the pixel sum of the waterline region of the i-th HARR-like feature and the pixel sum of the road-surface region of the i-th HARR-like feature are both calculated from the gray values of the stretched gray image;
After the HARR-like (straight-line) feature description of each pixel point is obtained, each HARR-like feature is normalized; the normalization formula is shown as formula (3):
wherein the normalized HARR-like feature is computed from the mean gray level and the mean squared gray level within the detection window;
Based on the 6 normalized HARR-like features (see formula (3)), the feature vector of each pixel point is constructed:
S203, to detect the waterline in the image, the feature vector of each pixel point is classified to judge whether it conforms to the straight-line feature. If it does, the pixel is set to 1; otherwise it is set to 0, which binarizes the gray image and yields the binary image B. The innovation of the method is that the image binarization problem is turned into a pattern-recognition problem on the features.
The constructed feature vector is classified by the SVM method; the kernel function is as follows:
Formula (5) is the kernel function to be designed in the SVM classification method; it maps the feature vector into a higher dimension, and the SVM then judges whether the feature is a straight-line feature, performing binary classification. A straight-line feature vector is constructed for each pixel point from the HARR-like features, and the SVM judges whether the pixel point represented by each feature vector lies on a straight line;
After the binary image is obtained, dilation and erosion are applied to the binary image B to obtain a new binary image, and the edge image E of the waterline is obtained by formula (6);
wherein formula (6) relates the pixel value in the m-th row and n-th column of the new binary image to the pixel value in the m-th row and n-th column of the edge image E;
S204, straight lines in the edge image E are identified by the Hough transform:
The coordinates of the pixels with value 1 are found in the new binary image, and the Hough transform yields the straight-line expression corresponding to formula (7);
wherein the radius and angle (the variable representation of a straight line) are those detected by the Hough transform.
A (radius, angle) pair represents a straight line; given such a pair, if a point satisfies formula (7), the point lies on the straight line represented by formula (7);
S205, owing to noise in the image, multiple straight lines are detected, and the interfering straight lines must be removed to obtain the real straight line. The interference-line removal method is specifically as follows:
S205-1, the 3-5 longest straight lines whose length meets the threshold length are selected (the length is greater than or equal to 1/4 of the image height; in this embodiment the threshold length is 1/4 of the image height);
S205-2, because the waterline is continuous, the straight-line parameters detected in two successive frames satisfy the requirement of formula (8), namely:
wherein the two thresholds are respectively the maximum deviations of angle and radius allowed by continuity, and the index is the sampling instant of the image.
S205-3, if multiple straight lines still remain after S205-1 and S205-2, the straight line with the highest average pixel value along it is selected as the detected waterline, because the waterline is generally white and has the highest brightness in the image.
The operation-error prediction unit selects a monitoring point at the current position of the vehicle based on the detected waterline and calculates the running deviation and the running-deviation change rate of the vehicle; it selects a monitoring point at the predicted position of the vehicle and calculates the predicted running deviation and the predicted deviation change rate of the vehicle;
The detected waterline is shown in FIG. 3:
As shown in FIG. 3, H and W are respectively the height and width of the edge image E (the same as the height and width of the waterline image). The intersection of the horizontal line at 0.7H with the waterline is taken as the monitoring point of the current position of the vehicle; substituting 0.7H for y in formula (7) gives the abscissa of this monitoring point. The running deviation and the running-deviation change rate of the vehicle are then:
0.7H is the value chosen in this embodiment. The distance between the two horizontal lines in FIG. 3 should be at least the distance the vehicle can travel in one control cycle, but the two horizontal lines must not be too close to the upper and lower boundaries of the image. According to the set vehicle speed and the field of view of the vision sensor, this embodiment takes the intersection of the horizontal line at 0.7H with the waterline as the monitoring point of the current position of the vehicle;
wherein the scale factor, obtained by camera calibration, represents the actual length per pixel (the actual distance represented by one pixel unit). The intersection of the horizontal line at 0.3H with the waterline is taken as the monitoring point of the predicted position of the vehicle, giving the predicted running deviation and the predicted deviation change rate of the vehicle:
This division is reasonable because the vehicle speed is approximately 40 m/min and the effective field of view of the vision sensor is 20-30 cm.
As seen in FIG. 3, the error at the current position is to the left, but at the predicted position it is to the right. The error must not be over-adjusted at the current position, and the current adjustment can be fine-tuned using the prediction error.
The predictive controller predicts the feedforward control quantity of the vehicle from the predicted running deviation and the predicted deviation change rate;
the feedforward control quantity of the predictive controller is determined by its two parameters;
The nonlinear incremental PID controller works in the nonlinear division regions of the e-ec plane and obtains the incremental nonlinear PID control law of the vehicle from the running deviation and the running-deviation change rate of the vehicle;
The error e and the error change rate ec of the system follow Gaussian distributions, so a non-uniform division method for the error e and the error change rate ec is constructed based on a Gaussian function; the e-ec plane takes the error e as the horizontal axis and the error change rate ec as the vertical axis. Here the error and the error change rate refer to the running deviation and the running-deviation change rate of the vehicle;
The e-ec plane is divided by a nonlinear division method, as follows:
When the division of the error change rate ec is calculated, the maximum used is that of |ec|; when the division of the error e is calculated, the maximum used is that of |e|. The equally spaced uniform division points of the error e or the error change rate ec are mapped to non-uniform division points, with a factor adjusting the degree of nonlinearity; the non-uniform division diagram is shown in FIG. 5.
According to the ranges of the error e and the error change rate ec, the e-ec plane is divided non-uniformly, and the set of divided regions (each square in FIG. 6 is one region) is recorded, as shown in FIG. 6:
In each divided region, PID control is performed by a nonlinear incremental PID controller; the nonlinear PID control increment is:
wherein the increment denotes the nonlinear PID control increment of the region at the sampling time;
the subscripts denote the row and column of the region, and each region has its own proportional coefficient, integral coefficient and differential coefficient.
Based on the nonlinear PID control increments of all regions, the weighted-average nonlinear PID control increment is calculated:
wherein each region has an incremental control-law weight, a radius in the error e and the error change rate ec (the square in FIG. 6), and a center.
Combining the above incremental control-law weights, the incremental nonlinear PID control law at the sampling time is calculated as:
wherein the increment factor is defined as follows:
with a maximum value of the increment factor and an offset;
a flexibility factor describes the degree to which the running deviation and the running-deviation change rate of the vehicle deviate from the origin;
This method allows the error change rate ec to have different increment factors for different errors e.
The increment factor under different running-deviation and deviation-change-rate conditions improves the response speed of the system and reduces the complexity of the optimization process.
The online learning rule unit performs online learning on the parameters of the nonlinear PID control increment within each region, using the supervised Hebb learning rule to learn the parameters of the nonlinear PID control increment;
the process of the online learning rule unit specifically comprises: when a sample pair falls in a divided region, the parameters of the nonlinear PID control increment of that region are learned online. The parameters of the incremental nonlinear PID control are learned using the supervised Hebb learning rule:
wherein each region has its own learning rate. To improve the learning efficiency, the learning rate is adjusted based on an online adjustment rule, namely:
wherein each region has a matching coefficient that adjusts the range of the learning rate, together with two weight coefficients. The two weight coefficients may be given manually based on historical data.
This part adjusts the controller parameters of each region online and realizes online adjustment of the learning rate of each region.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that: that the invention as claimed requires more features than are expressly recited in each claim. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules or units or groups of devices in the examples disclosed herein may be arranged in a device as described in this embodiment, or alternatively may be located in one or more devices different from the devices in this example. The modules in the foregoing examples may be combined into one module or may be further divided into multiple sub-modules.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. Modules or units or groups in embodiments may be combined into one module or unit or group and may furthermore be divided into sub-modules or sub-units or sub-groups. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
Furthermore, some of the described embodiments are described herein as a method or combination of method elements that can be performed by a processor of a computer system or by other means of performing the described functions. A processor having the necessary instructions for carrying out the method or method elements thus forms a means for carrying out the method or method elements. Further, the elements of the apparatus embodiments described herein are examples of the following apparatus: the apparatus is used to implement the functions performed by the elements for the purpose of carrying out the invention.
The various techniques described herein may be implemented in connection with hardware or software or, alternatively, with a combination of both. Thus, the methods and apparatus of the present invention, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium, wherein, when the program is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the invention.
In the case of program code execution on programmable computers, the computing device will generally include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. Wherein the memory is configured to store program code; the processor is configured to perform the method of the invention according to instructions in said program code stored in the memory.
By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media store information such as computer readable instructions, data structures, program modules or other data. Communication media typically embody computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism, and include any information delivery media. Combinations of any of the above are also included within the scope of computer readable media.
As used herein, unless otherwise specified, the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object merely indicates that different instances of like objects are being referred to, and is not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of this description, will appreciate that other embodiments can be devised which do not depart from the scope of the invention as described herein. Furthermore, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the appended claims. The present invention has been disclosed in an illustrative rather than a restrictive sense, and the scope of the present invention is defined by the appended claims.
Claims (6)
1. The on-line optimization control method of the unmanned line marking vehicle based on machine vision navigation is characterized by comprising the following steps:
s1, collecting a water line image of a road surface;
s2, carrying out waterline detection based on the waterline image to obtain a waterline;
s3, based on the detected waterline, selecting a monitoring point of the current position of the vehicle, calculating the running deviation e (k) and the running deviation change rate ec (k) of the vehicle, selecting a monitoring point of the predicted position of the vehicle, and calculating the predicted running deviation e of the vehicle p (k) And predicted deviation change rate ec p (k);
s4, based on the predicted running deviation e of the vehicle p (k) And predicted deviation change rate ec p (k) Obtaining a predicted feedforward control quantity u p (k) Obtaining an incremental nonlinear PID control law u (k) of the vehicle based on the operation deviation e (k) and the operation deviation change rate ec (k) of the vehicle;
s5, online learning parameters in the incremental nonlinear PID control law u (k) in the region;
the step S3 specifically includes the following steps:
the height and the width of the edge image E where the waterline is located are H and W respectively;
the intersection point of the horizontal line at height y with the waterline is selected as the monitoring point (X_c, y) of the current position of the vehicle, and the running deviation e(k) and the running deviation change rate ec(k) of the vehicle are:
where the actual length represented by each pixel is obtained by camera calibration;
the intersection point of the horizontal line at height y' with the waterline is selected as the monitoring point (X_p, y') of the predicted position of the vehicle, giving the predicted running deviation e_p(k) and the predicted deviation change rate ec_p(k):
where W is the width of the edge image E;
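The monitoring-point computation in step S3 can be sketched as follows. Because the claim's formulas appear only as images, this is a hedged reconstruction: the deviation is assumed to be measured from the image centre line W/2, the per-pixel scale is given the hypothetical name `gamma`, and the waterline column is taken as the first edge pixel on the monitoring row.

```python
import numpy as np

def waterline_x_at(edge_img, y):
    """Column X where the waterline crosses the horizontal line at height y.

    Assumes edge_img is a binary edge image (nonzero on the waterline);
    returns the column index of the first edge pixel on row y, or None.
    """
    cols = np.flatnonzero(edge_img[y])
    return int(cols[0]) if cols.size else None

def run_deviation(edge_img, y, gamma, prev_e=0.0):
    """Lateral deviation e(k) and its change rate ec(k) at monitoring row y.

    gamma is the real-world length per pixel (from camera calibration).
    Measuring the deviation from the image centre line W/2 is an
    assumption; the patent's exact formula is shown only as an image.
    """
    H, W = edge_img.shape
    x_c = waterline_x_at(edge_img, y)
    e = gamma * (x_c - W / 2.0)
    ec = e - prev_e  # discrete change rate between successive samples
    return e, ec
```

The same routine applied at the row y' of the predicted position yields e_p(k) and ec_p(k).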
s4 specifically comprises the following steps:
s401, according to the predicted operation deviation e of the vehicle p (k) And predicted deviation change rate ec p (k) Calculating a feedforward control quantity u of the vehicle p (k):
u p (k) In order to calculate the feedforward control amount,andare two parameters of predictive control;
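A minimal sketch of the feedforward quantity of step S401. The claim states only that u_p(k) depends on e_p(k), ec_p(k), and two predictive-control parameters; the linear combination below and the parameter names `k1`, `k2` are assumptions, since the formula itself is an image in the source.

```python
def feedforward_control(e_p, ec_p, k1=0.6, k2=0.3):
    """Predicted feedforward control quantity u_p(k).

    A linear combination of the predicted deviation and its change rate
    is assumed here; k1 and k2 stand in for the two predictive-control
    parameters of the claim (values are illustrative only).
    """
    return k1 * e_p + k2 * ec_p
```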
s402, in the area of the e-ec plane non-linear division, based on the operation deviation e (k) and the operation deviation change rate of the vehicle
ec (k) obtaining an incremental nonlinear PID control law u (k) of the vehicle;
s402 specifically includes the following steps: constructing a non-uniform division method of an error e and an error change rate ec based on a Gaussian function, wherein an e-ec plane takes the error e as a horizontal axis and the error change rate ec as a vertical axis; the error e and the error change rate ec refer to the running deviation e (k) and the running deviation change rate ec (k) of the vehicle; the non-linear division method adopted by the e-ec plane division is as follows:
when calculating the division of the error change rate ec, X_max is EC_max; when calculating the division of the error e, X_max is E_max; E_max and EC_max are respectively the maxima of the absolute values of the error e and the error change rate ec,
x_i ∈ {x_1, x_2, …, x_n} are the uniform division points of the error e or the error change rate ec, and their mapped values are the non-uniform division points; τ is the nonlinearity degree adjusting factor;
after the e-ec plane is non-uniformly divided, the set of divided regions is denoted {R_ij};
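The Gaussian-based mapping from uniform to non-uniform division points can be sketched as below. The claim's mapping formula is shown only as an image, so the particular Gaussian form used here is an assumption; it merely illustrates the claimed behaviour, i.e. a mapping bounded by X_max whose nonlinearity is tuned by τ and which packs division points more densely near the origin.

```python
import math

def gaussian_partition(x, x_max, tau):
    """Map a uniform division point x to a non-uniform one.

    One plausible Gaussian form (assumed, not the patent's exact formula):
        x_hat = sign(x) * x_max * (1 - exp(-x**2 / (2 * tau**2)))
    tau is the nonlinearity degree adjusting factor; larger tau gives a
    mapping closer to linear, smaller tau concentrates points near 0.
    """
    sign = 1.0 if x >= 0 else -1.0
    return sign * x_max * (1.0 - math.exp(-x * x / (2.0 * tau * tau)))
```

Applying this to the uniform points of e (with X_max = E_max) and of ec (with X_max = EC_max) produces the grid of regions {R_ij}.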
PID control is performed within each divided region R_ij, and the nonlinear PID control increment Δu_ij(k) is:
where Δu_ij(k) represents the nonlinear PID control increment of region R_ij at time k;
k denotes the sampling instant; the three coefficients are respectively the proportional, integral, and differential coefficients within region R_ij, where the superscript p denotes proportional (scale) and the subscripts i, j denote the row and column of the region;
based on the nonlinear PID control increments Δu_ij(k) of all regions R_ij, the weighted-average nonlinear PID control increment is calculated:
where w_ij is the incremental control law weight of region R_ij, r_ej and r_eci are the radii of region R_ij along the error e and error change rate ec axes, and e_j, ec_i are the centre of region R_ij;
the incremental nonlinear PID control law u(k) at time k is then calculated as:
u(k) = u(k-1) + ξ(e(k), ec(k)) · Δu(k)   (15)
where ξ(e(k), ec(k)) > 0 is the increment factor, defined as:
where ξ_max > 0 is the maximum increment factor and ξ_0 > 0 is an offset;
the factor describes the degree to which the error e and the error change rate ec deviate from the origin, and s > 0 is a scaling factor;
in this way, different e and ec yield different increment factors.
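The weighted-average increment and the control law of equation (15) can be sketched as follows. The Gaussian weight form w_ij and the exact expression for ξ(e, ec) appear only as images in the claim, so both are assumptions here: the weight is taken as a Gaussian centred on each region's centre (e_j, ec_i) with radii (r_ej, r_eci), and ξ is taken as an offset ξ_0 plus a term growing with distance from the origin, scaled by s.

```python
import math

def weighted_pid_increment(e, ec, regions):
    """Weighted-average nonlinear PID increment over the e-ec regions.

    regions: list of (e_j, ec_i, r_ej, r_eci, du_ij) tuples.
    A Gaussian weight centred on each region is assumed.
    """
    num = den = 0.0
    for e_j, ec_i, r_ej, r_eci, du in regions:
        w = math.exp(-((e - e_j) / r_ej) ** 2 - ((ec - ec_i) / r_eci) ** 2)
        num += w * du
        den += w
    return num / den if den else 0.0

def increment_factor(e, ec, xi_max=1.0, xi_0=0.1, s=1.0):
    """Increment factor xi(e, ec) > 0: xi_0 offset plus a term that grows
    with the distance of (e, ec) from the origin (one plausible form)."""
    return xi_0 + xi_max * (1.0 - math.exp(-(e * e + ec * ec) / s))

def control_law(u_prev, e, ec, regions):
    """Equation (15): u(k) = u(k-1) + xi(e, ec) * Delta u(k)."""
    return u_prev + increment_factor(e, ec) * weighted_pid_increment(e, ec, regions)
```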
2. The on-line optimizing control method for the unmanned line marking vehicle based on machine vision navigation as claimed in claim 1,
the step S2 specifically includes the following steps:
S201, carrying out gray stretching on the pixel points (m, n) of the waterline image, with the stretching formula as in formula (1):
where λ > 1 is the stretch factor; m, n denote the mth row and nth column of the image; I(m, n) represents the gray value of the pixel at the mth row and nth column; after the gray levels of the waterline image are stretched, Q HARR-like features are selected;
S202, for the ith HARR-like feature h_i, the calculation method is as in formula (2):
where the two sums are respectively the pixel sum of the waterline region and the pixel sum of the road-surface region of the ith HARR-like feature, both calculated from the gray values I(m, n) of the stretched gray image; after the HARR-like feature description of each pixel point is obtained, each HARR-like feature is normalized, the normalization formula being as in formula (3):
where i = 1, 2, …, Q indexes the normalized HARR features, and mn and sqmn are respectively the mean gray value and the mean squared gray value within the detection window;
a feature vector F(m, n) of the pixel point (m, n) is constructed from the normalized HARR-like features:
S203, identifying the feature vector F(m, n) of each pixel point (m, n) and judging whether it conforms to the straight-line feature; if so, the pixel is set to 1, otherwise to 0, realizing binarization of the gray image and yielding a binary image B;
the binary image B is subjected to dilation-erosion processing to obtain a new binary image Bn, and the edge image E of the waterline is obtained by formula (6):
E(m,n) = Bn(m,n) - Bn(m,n-1)   (6)
where Bn(m, n) represents the pixel value at the mth row and nth column of the new binary image Bn, and E(m, n) represents the pixel value at the mth row and nth column of the edge image E;
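The dilation-erosion step and the horizontal difference of formula (6) can be sketched as below. Formula (6) is given explicitly in the claim; the 3×3 structuring element and the hand-rolled morphology are assumptions made to keep the sketch self-contained.

```python
import numpy as np

def dilate(b):
    """Minimal 3x3 binary dilation: OR over each pixel's neighbourhood."""
    p = np.pad(b, 1)  # zero padding outside the image
    out = np.zeros_like(b)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out |= p[1 + dy:1 + dy + b.shape[0], 1 + dx:1 + dx + b.shape[1]]
    return out

def erode(b):
    """Minimal 3x3 binary erosion: AND over each pixel's neighbourhood."""
    p = np.pad(b, 1, constant_values=1)  # ones padding so borders survive
    out = np.ones_like(b)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out &= p[1 + dy:1 + dy + b.shape[0], 1 + dx:1 + dx + b.shape[1]]
    return out

def waterline_edges(binary_img):
    """Edge image per formula (6): dilation then erosion (a closing, which
    fills small gaps in B), followed by E(m,n) = Bn(m,n) - Bn(m,n-1)."""
    bn = erode(dilate(binary_img.astype(np.uint8)))
    e = np.zeros(bn.shape, dtype=int)
    e[:, 1:] = bn[:, 1:].astype(int) - bn[:, :-1].astype(int)
    return e
```

The left edge of the waterline stripe shows up as +1 and the right edge as -1 in E.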
S204, identifying straight lines in the edge image E by the HOUGH transform method:
finding the coordinates (x, y) of the pixels with value 1 in the new binary image Bn, obtaining θ and ρ of each straight line by the HOUGH transform, and obtaining the straight-line expression corresponding to θ and ρ from formula (7);
x = ρ/cos θ - y tan θ   (7)
where θ and ρ are respectively the angle and the radius detected by the HOUGH transform;
a pair (θ, ρ) represents a straight line; given a pair (θ, ρ), if (x, y) satisfies formula (7), then (x, y) lies on the straight line represented by formula (7);
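Formula (7) is the standard Hough normal form ρ = x·cos θ + y·sin θ rearranged for x, so testing whether a pixel lies on a detected line reduces to one comparison; the tolerance parameter below is an assumption for finite-precision coordinates.

```python
import math

def on_line(x, y, theta, rho, tol=1e-6):
    """True if (x, y) lies on the line of formula (7),
    x = rho/cos(theta) - y*tan(theta), checked in the equivalent
    normal form rho = x*cos(theta) + y*sin(theta) to avoid dividing
    by cos(theta) when theta is near pi/2."""
    return abs(x * math.cos(theta) + y * math.sin(theta) - rho) < tol
```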
s205, eliminating interference straight lines;
the interference line removing method specifically comprises the following steps:
S205-1: selecting the several longest straight lines whose lengths meet a threshold length;
S205-2: the line parameters θ and ρ detected in two successive frames of the binary image Bn must satisfy formula (8):
where Δ = [θ_k - θ_{k-1}, ρ_k - ρ_{k-1}]^T; α and β are respectively the maximum deviations allowed for the angle θ and the radius ρ; δ is the threshold allowed by continuity; k is the sampling instant of the image;
S205-3: if more than one straight line still remains after steps S205-1 and S205-2, selecting the line with the highest mean pixel value along it as the detected waterline.
3. The on-line optimizing control method for the unmanned line marking vehicle based on machine vision navigation as claimed in claim 1,
the step S5 specifically includes the following steps:
if the pair (e(k), ec(k)) falls within the divided region R_ij, the parameters of the nonlinear PID control increment Δu_ij(k) within region R_ij are learned online;
the parameters of the nonlinear PID control increment Δu_ij(k) are learned using the supervised Hebb learning rule:
where η_ij is the learning rate within region R_ij, adjusted according to the online adjustment rule η_ij(k):
η_ij(k) = ψ_ij · (1 - exp(-υ_ij · ec²(k) - v_ij · e²(k)))   (17)
where ψ_ij is the matching coefficient within region R_ij, used to adjust the learning-rate range, and υ_ij, v_ij are two weight coefficients within region R_ij.
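The learning-rate adjustment rule of equation (17) is stated explicitly in the claim and can be sketched directly; only the parameter values in the usage below are illustrative.

```python
import math

def learning_rate(e, ec, psi, upsilon, v):
    """Online learning-rate rule, equation (17):
        eta_ij(k) = psi_ij * (1 - exp(-upsilon_ij*ec(k)**2 - v_ij*e(k)**2))
    eta is 0 when both errors vanish and saturates towards psi as the
    errors grow, so learning is fastest when the vehicle is far off track.
    """
    return psi * (1.0 - math.exp(-upsilon * ec * ec - v * e * e))
```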
4. The unmanned line marking vehicle online optimization control system based on machine vision navigation is characterized by comprising a vision sensor, a waterline detection unit, an operation error prediction unit, a prediction controller, a nonlinear increment PID controller and an online learning rule unit;
a vision sensor collects a water line image of a road surface;
the waterline detection unit performs waterline detection based on the waterline image to obtain a waterline;
the operation error prediction unit selects a monitoring point at the current position of the vehicle based on the detected waterline, calculates the running deviation e(k) and the running deviation change rate ec(k) of the vehicle, selects a monitoring point at the predicted position of the vehicle, and calculates the predicted running deviation e_p(k) and the predicted deviation change rate ec_p(k);
the predictive controller calculates the feedforward control quantity u_p(k) from the predicted running deviation e_p(k) and the predicted deviation change rate ec_p(k);
the nonlinear incremental PID controller obtains the incremental nonlinear PID control law u(k) of the vehicle in the nonlinearly divided regions of the e-ec plane, based on the running deviation e(k) and the running deviation change rate ec(k) of the vehicle;
the online learning rule unit learns the parameters of the nonlinear PID control increment Δu_ij(k) within each region;
the specific working process of the operation error prediction unit comprises the following steps: the height and width of the edge image E where the waterline is located are H and W respectively; the intersection point of the horizontal line at height y with the waterline is selected as the monitoring point (X_c, y) of the current position of the vehicle, and the running deviation e(k) and the running deviation change rate ec(k) of the vehicle are calculated as:
where the actual length represented by each pixel is obtained by camera calibration;
the intersection point of the horizontal line at height y' with the waterline is selected as the monitoring point (X_p, y') of the predicted position of the vehicle, giving the predicted running deviation e_p(k) and the predicted deviation change rate ec_p(k):
the feedforward control quantity u_p(k) of the vehicle predicted by the predictive controller is:
where u_p(k) is the feedforward control quantity of the predictive controller, and the two remaining coefficients are the two parameters of the predictive controller;
the working process of the nonlinear incremental PID controller specifically comprises the following steps:
constructing a non-uniform division of the error e and the error change rate ec based on a Gaussian function, wherein the e-ec plane takes the error e as the horizontal axis and the error change rate ec as the vertical axis; the error and the error change rate refer to the running deviation e(k) and the running deviation change rate ec(k) of the vehicle;
the e-ec plane nonlinear division method comprises the following steps:
when calculating the division of the error change rate ec, X_max is EC_max; when calculating the division of the error e, X_max is E_max; E_max and EC_max are respectively the maxima of the absolute values of the error e and the error change rate ec, wherein
x_i ∈ {x_1, x_2, …, x_n} are the uniform division points of the error e or the error change rate ec, and their mapped values are the non-uniform division points; τ is the nonlinearity degree adjusting factor;
according to the ranges of the error e and the error change rate ec, the e-ec plane is non-uniformly divided, and the set of divided regions is denoted {R_ij};
within each divided region R_ij, the nonlinear PID control increment Δu_ij(k) is:
where Δu_ij(k) represents the nonlinear PID control increment of region R_ij at time k;
k denotes the sampling instant; the three coefficients are respectively the proportional, integral, and differential coefficients within region R_ij, where the superscript p denotes proportional (scale) and the subscripts i, j denote the row and column of the region;
based on the nonlinear PID control increments Δu_ij(k) of all regions R_ij, the weighted-average nonlinear PID control increment is calculated:
where w_ij is the incremental control law weight of region R_ij, r_ej and r_eci are the radii of region R_ij along the error e and error change rate ec axes, and e_j, ec_i are the centre of region R_ij;
based on the incremental control law weights w_ij, the incremental nonlinear PID control law u(k) at time k is calculated as:
u(k) = u(k-1) + ξ(e(k), ec(k)) · Δu(k)   (15)
where ξ(e(k), ec(k)) > 0 is the increment factor, defined as follows:
where ξ_max > 0 is the maximum increment factor and ξ_0 > 0 is an offset;
s > 0 is the scaling factor.
5. The on-line optimizing control system of the unmanned line marking vehicle based on machine vision navigation as claimed in claim 4,
the working process of the waterline detection unit specifically comprises the following steps:
S201, carrying out gray stretching on the pixel points (m, n) of the waterline image, with the stretching formula as in formula (1):
where λ > 1 is the stretch factor; m, n denote the mth row and nth column of the image; I(m, n) represents the gray value of the pixel at the mth row and nth column; after the gray levels of the waterline image are stretched, Q HARR-like features are selected to express the straight-line feature of each pixel point;
S202, for the ith HARR-like feature h_i, the calculation method is as in formula (2):
where the two sums are respectively the pixel sum of the waterline region and the pixel sum of the road-surface region of the ith HARR-like feature, both calculated from the gray values I(m, n) of the stretched gray image;
after the HARR-like feature description of each pixel point is obtained, each HARR-like feature is normalized, the normalization formula being as in formula (3):
where mn and sqmn are respectively the mean gray value and the mean squared gray value within the detection window;
a feature vector F(m, n) of the pixel point (m, n) is constructed from the normalized HARR-like features:
S203, identifying the feature vector F(m, n) of each pixel point (m, n) and judging whether it conforms to the straight-line feature; if so, the pixel is set to 1, otherwise to 0, realizing binarization of the gray image and yielding a binary image B;
the binary image B is subjected to dilation-erosion processing to obtain a new binary image Bn, and the edge image E of the waterline is obtained by formula (6),
E(m,n) = Bn(m,n) - Bn(m,n-1)   (6)
where Bn(m, n) represents the pixel value at the mth row and nth column of the new binary image Bn, and E(m, n) represents the pixel value at the mth row and nth column of the edge image E;
S204, identifying straight lines in the edge image E by the HOUGH transform method:
finding the coordinates (x, y) of the pixels with value 1 in the new binary image Bn, obtaining θ and ρ by the HOUGH transform, and obtaining the straight-line expression corresponding to θ and ρ from formula (7);
x = ρ/cos θ - y tan θ   (7)
where θ and ρ are respectively the angle and the radius detected by the HOUGH transform;
a pair (θ, ρ) represents a straight line; given a pair (θ, ρ), if (x, y) satisfies formula (7), then (x, y) lies on the straight line represented by formula (7);
s205, eliminating interference straight lines;
the interference line removing method specifically comprises the following steps:
S205-1, selecting the several longest straight lines whose lengths meet a threshold length;
S205-2, the line parameters θ and ρ detected in two successive frames of the binary image Bn must satisfy formula (8):
where Δ = [θ_k - θ_{k-1}, ρ_k - ρ_{k-1}]^T; α and β are respectively the maximum deviations allowed for the angle θ and the radius ρ; δ is the threshold allowed by continuity; k is the sampling instant of the image;
S205-3, if 2 or more straight lines still remain after steps S205-1 and S205-2, selecting the line with the highest mean pixel value along it as the detected waterline.
6. The on-line optimizing control system of the unmanned line marking vehicle based on machine vision navigation as claimed in claim 4,
the working process of the online learning rule unit specifically comprises the following steps:
if the pair (e(k), ec(k)) falls within the divided region R_ij, the parameters of the nonlinear PID control increment Δu_ij(k) within region R_ij are learned online;
the parameters of the nonlinear PID control increment Δu_ij(k) are learned using the supervised Hebb learning rule:
where η_ij is the learning rate within region R_ij, adjusted according to the online adjustment rule η_ij(k):
η_ij(k) = ψ_ij · (1 - exp(-υ_ij · ec²(k) - v_ij · e²(k)))   (17)
where ψ_ij is the matching coefficient within region R_ij, used to adjust the learning-rate range, and υ_ij, v_ij are two weight coefficients within region R_ij.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211451943.7A CN115509122B (en) | 2022-11-21 | 2022-11-21 | Online optimization control method and system for unmanned line marking vehicle based on machine vision navigation |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211451943.7A CN115509122B (en) | 2022-11-21 | 2022-11-21 | Online optimization control method and system for unmanned line marking vehicle based on machine vision navigation |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115509122A CN115509122A (en) | 2022-12-23 |
CN115509122B true CN115509122B (en) | 2023-03-21 |
Family
ID=84513924
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211451943.7A Active CN115509122B (en) | 2022-11-21 | 2022-11-21 | Online optimization control method and system for unmanned line marking vehicle based on machine vision navigation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115509122B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115760850B (en) * | 2023-01-05 | 2023-05-26 | 长江勘测规划设计研究有限责任公司 | Method for recognizing water level without scale by utilizing machine vision |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113960921A (en) * | 2021-10-19 | 2022-01-21 | 华南农业大学 | Visual navigation control method and system for orchard tracked vehicle |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011032208A1 (en) * | 2009-09-15 | 2011-03-24 | The University Of Sydney | A system and method for autonomous navigation of a tracked or skid-steer vehicle |
AU2013332262B2 (en) * | 2012-10-17 | 2017-08-03 | WATSON, Diane Lee | Vehicle for line marking |
CN106527119B (en) * | 2016-11-03 | 2019-07-23 | 东华大学 | Derivative-precedence PID system based on fuzzy control |
CN109176519A (en) * | 2018-09-14 | 2019-01-11 | 北京遥感设备研究所 | A method of improving the Robot Visual Servoing control response time |
CN110398979B (en) * | 2019-06-25 | 2022-03-04 | 天津大学 | Unmanned engineering operation equipment tracking method and device based on vision and attitude fusion |
AU2020104234A4 (en) * | 2020-12-22 | 2021-03-11 | Qingdao Agriculture University | An Estimation Method and Estimator for Sideslip Angle of Straight-line Navigation of Agricultural Machinery |
CN112706835B (en) * | 2021-01-07 | 2022-04-19 | 济南北方交通工程咨询监理有限公司 | Expressway unmanned marking method based on image navigation |
CN113296518A (en) * | 2021-05-25 | 2021-08-24 | 山东交通学院 | Unmanned driving system and method for formation of in-place heat regeneration unit |
CN114942641A (en) * | 2022-06-06 | 2022-08-26 | 仲恺农业工程学院 | Road bridge autonomous walking marking system controlled by multiple sensor data fusion stereoscopic vision |
CN115082701B (en) * | 2022-08-16 | 2022-11-08 | 山东高速集团有限公司创新研究院 | Multi-water-line cross identification positioning method based on double cameras |
2022
- 2022-11-21 CN CN202211451943.7A patent/CN115509122B/en active Active
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113960921A (en) * | 2021-10-19 | 2022-01-21 | 华南农业大学 | Visual navigation control method and system for orchard tracked vehicle |
Also Published As
Publication number | Publication date |
---|---|
CN115509122A (en) | 2022-12-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113221905B (en) | Semantic segmentation unsupervised domain adaptation method, device and system based on uniform clustering and storage medium | |
CN109375235B (en) | Inland ship freeboard detection method based on deep reinforcement neural network | |
CN115509122B (en) | Online optimization control method and system for unmanned line marking vehicle based on machine vision navigation | |
Li et al. | Model-based online learning with kernels | |
CN103886325B (en) | Cyclic matrix video tracking method with partition | |
CN112348849A (en) | Twin network video target tracking method and device | |
CN108614994A (en) | A kind of Human Head Region Image Segment extracting method and device based on deep learning | |
CN111860439A (en) | Unmanned aerial vehicle inspection image defect detection method, system and equipment | |
CN112818873B (en) | Lane line detection method and system and electronic equipment | |
CN110889332A (en) | Lie detection method based on micro expression in interview | |
CN112298194B (en) | Lane changing control method and device for vehicle | |
CN111325711A (en) | Chromosome split-phase image quality evaluation method based on deep learning | |
CN112016463A (en) | Deep learning-based lane line detection method | |
CN112184655A (en) | Wide and thick plate contour detection method based on convolutional neural network | |
CN113516853B (en) | Multi-lane traffic flow detection method for complex monitoring scene | |
Jiang et al. | Dfnet: Semantic segmentation on panoramic images with dynamic loss weights and residual fusion block | |
CN113989613A (en) | Light-weight high-precision ship target detection method coping with complex environment | |
CN111259827A (en) | Automatic detection method and device for water surface floating objects for urban river supervision | |
CN107766798A (en) | A kind of Remote Sensing Target detection method based on cloud computing storage and deep learning | |
CN112802005A (en) | Automobile surface scratch detection method based on improved Mask RCNN | |
CN114581486A (en) | Template updating target tracking algorithm based on full convolution twin network multilayer characteristics | |
CN116630748A (en) | Rare earth electrolytic tank state multi-parameter monitoring method based on fused salt image characteristics | |
CN116476863A (en) | Automatic driving transverse and longitudinal integrated decision-making method based on deep reinforcement learning | |
CN116977902B (en) | Target tracking method and system for on-board photoelectric stabilized platform of coastal defense | |
CN102592125A (en) | Moving object detection method based on standard deviation characteristic |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||