CN115509122B - Online optimization control method and system for unmanned line marking vehicle based on machine vision navigation - Google Patents

Online optimization control method and system for unmanned line marking vehicle based on machine vision navigation

Info

Publication number
CN115509122B
CN115509122B (application CN202211451943.7A)
Authority
CN
China
Prior art keywords
vehicle
error
waterline
deviation
change rate
Prior art date
Legal status
Active
Application number
CN202211451943.7A
Other languages
Chinese (zh)
Other versions
CN115509122A (en)
Inventor
辛公锋
石磊
龙关旭
王福海
潘为刚
王目树
李一鸣
秦石铭
张文亮
靳华磊
张泽军
康超
李帆
胡朋
潘立平
Current Assignee
Innovation Research Institute Of Shandong Expressway Group Co ltd
Original Assignee
Innovation Research Institute Of Shandong Expressway Group Co ltd
Priority date
Filing date
Publication date
Application filed by Innovation Research Institute Of Shandong Expressway Group Co ltd
Priority: CN202211451943.7A
Publication of CN115509122A
Application granted
Publication of CN115509122B

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B11/00 - Automatic controllers
    • G05B11/01 - Automatic controllers electric
    • G05B11/36 - Automatic controllers electric with provision for obtaining particular characteristics, e.g. proportional, integral, differential
    • G05B11/42 - Automatic controllers electric for obtaining a characteristic which is both proportional and time-dependent, e.g. P.I., P.I.D.
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/20 - Image preprocessing
    • G06V10/34 - Smoothing or thinning of the pattern; Morphological operations; Skeletonisation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/40 - Extraction of image or video features
    • G06V10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 - Arrangements for image or video recognition or understanding using classification, e.g. of video objects
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/50 - Context or environment of the image
    • G06V20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T - CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 - Road transport of goods or passengers
    • Y02T10/10 - Internal combustion engine [ICE] based vehicles
    • Y02T10/40 - Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Automation & Control Theory (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses an online optimization control method and system for an unmanned line marking vehicle based on machine vision navigation, belonging to the field of system control. The method comprises the following steps: collecting a waterline image of the road surface; performing waterline detection to obtain the waterline; based on the detected waterline, selecting a monitoring point at the vehicle's current position to calculate the vehicle's running deviation and running deviation change rate, and selecting a monitoring point at the vehicle's predicted position to calculate the vehicle's predicted running deviation and predicted deviation change rate; designing a nonlinear incremental PID control according to the predicted running deviation and predicted deviation change rate; and learning the parameters of the nonlinear PID control increments within each region online. The method and system realize real-time machine vision navigation of the line marking vehicle and design a feedforward controller and a nonlinear incremental PID controller according to the tracking error and error change rate, thereby realizing path tracking control of the line marking vehicle based on machine vision navigation.

Description

Online optimization control method and system for unmanned line marking vehicle based on machine vision navigation
Technical Field
The invention belongs to the field of system control, and particularly relates to an online optimization control method and system for an unmanned line marking vehicle based on machine vision navigation.
Background
In highway marking construction, a waterline is generally drawn manually on the road surface, and marking is then carried out manually along this waterline. The present system adopts machine vision technology: a computer automatically identifies the waterline and controls the marking equipment to carry out marking construction along it, i.e., construction navigated by machine vision; a construction schematic is shown in FIG. 1.
Existing systems have the following problems in marking vehicle control based on machine vision navigation:
Existing advanced image processing methods are time-consuming and can hardly meet the real-time control requirements of a marking vehicle; they attend only to the gray value of each pixel and can hardly account for foreground and background information as a whole.
Existing model-based control methods are unsuitable: the structure of the line marking vehicle is too complex for an accurate motion model to be obtained, so a model-based method cannot be used to design its control law.
The vehicle navigation adopts image navigation, which only indicates how far the construction vehicle deviates from the planned route and can hardly provide accurate global vehicle coordinates, whereas the output assumed by vehicle modeling is generally the position of the vehicle.
For such problems, model-free control methods such as PID control and fuzzy control are generally adopted, but these methods rely too much on manual experience and are difficult to optimize, while most optimization methods require an accurate model of the controlled object.
Disclosure of Invention
In order to solve the problems in the prior art, the invention discloses an online optimization control method for an unmanned line marking vehicle based on machine vision navigation, which realizes real-time machine vision navigation of the marking vehicle, designs a feedforward controller and a nonlinear PID controller according to the tracking error and error change rate, designs an online learning method for the parameters of the nonlinear PID controller, and realizes path tracking control of the marking vehicle based on machine vision navigation.
The invention adopts the following scheme:
the on-line optimizing control method of the unmanned line marking vehicle based on the machine vision navigation comprises the following steps:
s1, collecting a water line image of a road surface;
s2, carrying out waterline detection based on the waterline image to obtain a waterline;
s3, based on the detected waterline, selecting a monitoring point of the current position of the vehicle, and calculating the running deviation of the vehicle
Figure 976467DEST_PATH_IMAGE001
Rate of change of deviation from running
Figure 300132DEST_PATH_IMAGE002
Selecting the monitoring point of the predicted position of the vehicle, and calculating the predicted running deviation of the vehicle
Figure 17553DEST_PATH_IMAGE003
And predicting the rate of change of deviation
Figure 819287DEST_PATH_IMAGE004
S4, according to the predicted operation deviation of the vehicle
Figure 305763DEST_PATH_IMAGE005
And predicting the rate of change of deviation
Figure 800329DEST_PATH_IMAGE006
Obtaining a predicted feedforward control quantity
Figure 5045DEST_PATH_IMAGE007
Based on running deviation of the vehicle
Figure 610470DEST_PATH_IMAGE008
Rate of change of deviation from running
Figure 685874DEST_PATH_IMAGE009
Obtaining an incremental nonlinear PID control law for a vehicle
Figure 85762DEST_PATH_IMAGE010
S5, controlling increment on the nonlinear PID in the region
Figure 512195DEST_PATH_IMAGE011
Parameter (2) of
Figure 384293DEST_PATH_IMAGE012
Figure 579782DEST_PATH_IMAGE013
Figure 681731DEST_PATH_IMAGE014
And performing online learning.
The step S2 specifically includes the following steps:
S201, performing gray stretching on each pixel (m, n) of the waterline image according to the stretching formula (1), where a, b and c are the stretching factors, (m, n) denotes the pixel in the m-th row and n-th column of the image, and g(m, n) is the gray value of that pixel; after the gray stretching of the waterline image, Q HARR-like features are selected;
S202, the i-th HARR-like feature F_i (i = 1, ..., Q) is calculated as formula (2):

F_i = S_i^line - S_i^road    (2)

where S_i^line is the pixel sum of the waterline region of the i-th HARR-like feature, S_i^road is the pixel sum of the road-surface region of the i-th HARR-like feature, and both are calculated from the gray values g'(m, n) of the stretched gray image;
after the HARR-like feature description of each pixel is obtained, each HARR-like feature is normalized according to the normalization formula (3), where F̃_i is the normalized HARR feature and μ and μ₂ are, respectively, the mean gray value and the mean squared gray value within the detection window;
the normalized HARR-like features form the feature vector V(m, n) = [F̃_1, ..., F̃_Q] of pixel (m, n);
S203, the feature vector V(m, n) of each pixel (m, n) is classified to judge whether it conforms to the straight-line characteristic; if so, the pixel is set to 1, otherwise to 0, which binarizes the gray image into a binary image B;
the binary image B is dilated and eroded to obtain a new binary image B', and the edge image E of the waterline is obtained by formula (6), where B'(m, n) denotes the pixel value of the m-th row, n-th column of B' and E(m, n) denotes the pixel value of the m-th row, n-th column of E;
S204, identifying straight lines in the edge image E by the HOUGH transform method:
the coordinates (x, y) of the pixels with value 1 are found in the new binary image B', and the HOUGH transform yields (ρ, θ), whose corresponding straight-line expression is obtained by formula (7):

ρ = x·cosθ + y·sinθ    (7)

where ρ and θ are the radius and angle detected by the HOUGH transform;
a pair (ρ, θ) represents one straight line; given a pair (ρ, θ), a point (x, y) lies on the straight line represented by formula (7) if it satisfies formula (7);
S205, eliminating interference lines;
the interference-line elimination method specifically comprises:
S205-1: selecting the several longest lines whose lengths meet the threshold length;
S205-2: keeping the lines whose parameters (ρ_t, θ_t) detected in the binary images B' of consecutive frames satisfy the continuity requirement of formula (8):

|θ_t - θ_{t-1}| ≤ Δθ,  |ρ_t - ρ_{t-1}| ≤ Δρ    (8)

where Δθ and Δρ are, respectively, the maximum allowed deviations of angle and radius, i.e. the thresholds permitted by continuity, and t is the sampling instant of the image;
S205-3: if more than one straight line still meets the requirements after S205-1 and S205-2, the line with the highest mean pixel value along it is selected as the detected waterline.
The step S3 specifically includes the following steps: the height and width of the edge image E in which the waterline lies are, respectively, H and W; the intersection of the horizontal line at height y with the waterline is selected as the monitoring point (x_y, y) of the vehicle's current position, and the vehicle's running deviation e(t) and running deviation change rate ec(t) are given by formula (9), where k_c, obtained by camera calibration, is the actual length represented by each pixel.
The intersection of the horizontal line at height y_p with the waterline is selected as the monitoring point (x_p, y_p) of the vehicle's predicted position, giving the vehicle's predicted running deviation e_p(t) and predicted deviation change rate ec_p(t) by formula (10), where W is the width of the edge image E.
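The following sketch illustrates S3 together with the feedforward quantity of S401 below, under the assumption that the deviation is the horizontal offset of the waterline from the image centerline scaled by the calibration factor k_c; formulas (9) and (10) are given only as images in the original, so the expressions here are an illustrative reconstruction rather than the patent's exact formulas:

```python
import numpy as np

def line_x_at(rho, theta, y):
    """x where the Hough line rho = x*cos(theta) + y*sin(theta) crosses height y.
    Assumes a near-vertical waterline (theta near 0), so cos(theta) != 0."""
    return (rho - y * np.sin(theta)) / np.cos(theta)

def deviations(rho, theta, W, H, k_c, prev_e, prev_ep, y_cur=0.7, y_pred=0.3):
    """Running deviation e, ec at the current monitoring point (0.7*H in the
    embodiment) and predicted deviation e_p, ec_p at the predicted point (0.3*H)."""
    x_cur = line_x_at(rho, theta, y_cur * H)
    x_pred = line_x_at(rho, theta, y_pred * H)
    e = k_c * (x_cur - W / 2.0)    # offset from image centerline, scaled to length
    ep = k_c * (x_pred - W / 2.0)
    ec = e - prev_e                # change over one sampling period (assumed form)
    ecp = ep - prev_ep
    return e, ec, ep, ecp

def feedforward(ep, ecp, k1=0.4, k2=0.1):
    """Predictive feedforward u_f built from the two predicted quantities;
    k1, k2 are the two predictive-control parameters (values here arbitrary)."""
    return k1 * ep + k2 * ecp
```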
S4 specifically comprises the following steps:
S401, calculating the feedforward control quantity u_f(t) of the vehicle from the vehicle's predicted running deviation e_p(t) and predicted deviation change rate ec_p(t):

u_f(t) = k_1·e_p(t) + k_2·ec_p(t)

where u_f(t) is the calculated feedforward control quantity and k_1 and k_2 are the two parameters of the predictive control;
S402, in the regions of the e-ec plane, obtaining the vehicle's incremental nonlinear PID control law u(t) from the running deviation e(t) and running deviation change rate ec(t).
Step S402 specifically includes the following steps: a non-uniform division of the error e and the error change rate ec is constructed based on a Gaussian function, the e-ec plane taking the error e as its horizontal axis and the error change rate ec as its vertical axis; the error e and the error change rate ec refer to the vehicle's running deviation e(t) and running deviation change rate ec(t). The nonlinear division adopted for the e-ec plane maps the uniform division points through a Gaussian-based formula: when the division of the error change rate ec is calculated, X_max is EC_max, and when the division of the error e is calculated, X_max is E_max, where E_max and EC_max are, respectively, the maxima of the absolute values of the error e and the error change rate ec; x_k are the uniform division points of the error e or the error change rate ec, x'_k are the non-uniform division points after the mapping, and σ is the nonlinearity-degree adjusting factor.
After the e-ec plane is divided non-uniformly, the set of divided regions is denoted {R_{r,c}}, where r and c index the row and column of a region. Within each divided region R_{r,c} a PID control is performed, and the nonlinear PID control increment is:

Δu_{r,c}(t) = k_p^{r,c}·[e(t) - e(t-1)] + k_i^{r,c}·e(t) + k_d^{r,c}·[e(t) - 2e(t-1) + e(t-2)]

where Δu_{r,c}(t) denotes the nonlinear PID control increment of region R_{r,c} at time t when (e, ec) falls in R_{r,c}; t denotes the sampling time; and k_p^{r,c}, k_i^{r,c} and k_d^{r,c} denote, respectively, the proportional, integral and differential coefficients within region R_{r,c}.
Based on the nonlinear PID control increments Δu_{r,c}(t) of all regions, the weighted average nonlinear PID control increment is calculated:

Δu(t) = Σ_{r,c} w_{r,c}·Δu_{r,c}(t) / Σ_{r,c} w_{r,c}

where w_{r,c} is the incremental control-law weight of region R_{r,c}, computed from the radii of region R_{r,c} in the error e and the error change rate ec and from the center of region R_{r,c}.
The incremental nonlinear PID control law u(t) at time t is:

u(t) = u(t-1) + λ(t)·Δu(t)

where λ(t) is the incremental factor, defined in terms of λ_max, the maximum value of the incremental factor, an offset, a quantity d(t) describing the degree to which the deviation and the deviation change rate deviate from the origin, and a scaling factor.
In this way, different errors e and error change rates ec are given different incremental factors. The purpose of designing λ(t) is to provide a controller increment factor for each combination of the error e and the error change rate ec, which improves the response speed of the system and reduces the complexity of the optimization process.
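The structure of S402 can be sketched as follows. Since the weight formula and the incremental-factor definition are given only as images, the Gaussian region weights and the saturating incremental factor below are assumptions; the class name, gains and constants are likewise illustrative:

```python
import numpy as np

class NonlinearIncrementalPID:
    """Sketch of the regional incremental PID of S402 under stated assumptions."""

    def __init__(self, e_edges, ec_edges, kp0=0.5, ki0=0.05, kd0=0.1,
                 lam_max=1.0, offset=0.1, beta=2.0):
        self.e_edges = np.asarray(e_edges, dtype=float)    # non-uniform e division
        self.ec_edges = np.asarray(ec_edges, dtype=float)  # non-uniform ec division
        R, C = len(self.e_edges) - 1, len(self.ec_edges) - 1
        self.kp = np.full((R, C), kp0)  # per-region proportional coefficients
        self.ki = np.full((R, C), ki0)  # per-region integral coefficients
        self.kd = np.full((R, C), kd0)  # per-region differential coefficients
        self.ce = 0.5 * (self.e_edges[:-1] + self.e_edges[1:])     # centers along e
        self.cec = 0.5 * (self.ec_edges[:-1] + self.ec_edges[1:])  # centers along ec
        self.re = 0.5 * np.diff(self.e_edges)    # region radii along e
        self.rec = 0.5 * np.diff(self.ec_edges)  # region radii along ec
        self.lam_max, self.offset, self.beta = lam_max, offset, beta
        self.e1 = self.e2 = 0.0   # e(t-1), e(t-2)
        self.u = 0.0              # u(t-1)

    def step(self, e, ec):
        # Incremental PID term of every region R_{r,c} (vectorized over the grid).
        du = (self.kp * (e - self.e1) + self.ki * e
              + self.kd * (e - 2 * self.e1 + self.e2))
        # Assumed Gaussian weight of each region, built from its center and radius.
        we = np.exp(-((e - self.ce) / self.re) ** 2)[:, None]
        wec = np.exp(-((ec - self.cec) / self.rec) ** 2)[None, :]
        w = we * wec
        du_avg = float((w * du).sum() / w.sum())
        # Assumed incremental factor lam(t): offset plus a term growing with the
        # distance d(t) of (e, ec) from the origin of the e-ec plane.
        d = np.hypot(e / self.e_edges[-1], ec / self.ec_edges[-1])
        lam = self.offset + self.lam_max * (1.0 - np.exp(-self.beta * d))
        self.u += lam * du_avg    # u(t) = u(t-1) + lam(t) * du(t)
        self.e2, self.e1 = self.e1, e
        return self.u
```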
The step S5 specifically includes the following steps:
when a number of pairs (e, ec) fall within a divided region R_{r,c}, the parameters k_p^{r,c}, k_i^{r,c}, k_d^{r,c} of the nonlinear PID control increment Δu_{r,c} within region R_{r,c} are learned online;
the parameters of the nonlinear PID control increment Δu_{r,c} are learned using the supervised Hebb learning rule of formula (16), where η^{r,c} is the learning rate within region R_{r,c}; the learning rate η^{r,c} is adjusted based on the online adjustment rule of formula (17), where c^{r,c} is a matching coefficient within region R_{r,c} used to adjust the range of the learning rate, and the other two quantities in formula (17) are two weight coefficients within region R_{r,c}.
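For illustration, a minimal sketch of a supervised Hebb update for one region's gains in the standard single-neuron adaptive PID form; formulas (16) and (17) are given only as images, so the exact update and the learning-rate adjustment rule here are assumptions:

```python
def hebb_update(gains, e_hist, u, eta=(0.2, 0.1, 0.1)):
    """Supervised Hebb learning of one region's PID gains (standard form, assumed).
    gains: [kp, ki, kd]; e_hist: (e(t), e(t-1), e(t-2)); u: last control output."""
    e0, e1, e2 = e_hist
    x = (e0 - e1, e0, e0 - 2 * e1 + e2)  # proportional, integral, derivative inputs
    for i in range(3):
        # dk = eta * e(t) * u(t) * x_i(t): the gain grows when its input is
        # correlated with the applied control and the remaining tracking error.
        gains[i] += eta[i] * e0 * u * x[i]
    return gains
```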
The unmanned line marking vehicle online optimizing control system based on machine vision navigation comprises a vision sensor, a waterline detection unit, an operation error prediction unit, a predictive controller, a nonlinear incremental PID controller and an online learning rule unit.
The vision sensor collects the waterline image of the road surface;
the waterline detection unit performs waterline detection based on the waterline image to obtain the waterline;
the operation error prediction unit, based on the detected waterline, selects a monitoring point at the vehicle's current position and calculates the vehicle's running deviation e(t) and running deviation change rate ec(t), then selects a monitoring point at the vehicle's predicted position and calculates the vehicle's predicted running deviation e_p(t) and predicted deviation change rate ec_p(t);
the predictive controller predicts the predicted feedforward control quantity u_f(t) of the vehicle from the predicted running deviation e_p(t) and predicted deviation change rate ec_p(t);
the nonlinear incremental PID controller, within the nonlinear division regions of the e-ec plane, obtains the vehicle's incremental nonlinear PID control law u(t) from the running deviation e(t) and running deviation change rate ec(t);
the online learning rule unit learns the parameters of the nonlinear PID control increment Δu_{r,c} within each region.
The working process of the waterline detection unit specifically comprises the following steps:
S201, performing gray stretching on each pixel (m, n) of the waterline image according to the stretching formula (1), where a, b and c are the stretching factors, (m, n) denotes the pixel in the m-th row and n-th column of the image, and g(m, n) is the gray value of that pixel; after the gray stretching of the waterline image, Q HARR-like features are selected to express the straight-line characteristic of each pixel;
S202, the i-th HARR-like feature F_i (i = 1, ..., Q) is calculated as formula (2):

F_i = S_i^line - S_i^road    (2)

where S_i^line is the pixel sum of the waterline region of the i-th HARR-like feature, S_i^road is the pixel sum of the road-surface region of the i-th HARR-like feature, and both are calculated from the gray values g'(m, n) of the stretched gray image;
after the HARR-like feature description of each pixel is obtained, each HARR-like feature is normalized according to the normalization formula (3), where F̃_i is the i-th normalized HARR feature and μ and μ₂ are, respectively, the mean gray value and the mean squared gray value within the detection window;
the normalized HARR-like features form the feature vector V(m, n) = [F̃_1, ..., F̃_Q] of pixel (m, n);
S203, the feature vector V(m, n) of each pixel (m, n) is classified to judge whether it conforms to the straight-line characteristic; if so, the pixel is set to 1, otherwise to 0, which binarizes the gray image into a binary image B;
the binary image B is dilated and eroded to obtain a new binary image B', and the edge image E of the waterline is obtained by formula (6), where B'(m, n) denotes the pixel value of the m-th row, n-th column of B' and E(m, n) denotes the pixel value of the m-th row, n-th column of E;
S204, identifying straight lines in the edge image E by the HOUGH transform method:
the coordinates (x, y) of the pixels with value 1 are found in the new binary image B', and the HOUGH transform yields (ρ, θ), whose corresponding straight-line expression is obtained by formula (7):

ρ = x·cosθ + y·sinθ    (7)

where ρ and θ are the radius and angle detected by the HOUGH transform;
a pair (ρ, θ) represents one straight line; given a pair (ρ, θ), a point (x, y) lies on the straight line represented by formula (7) if it satisfies formula (7);
S205, eliminating interference lines;
the interference-line elimination method specifically comprises:
S205-1, selecting the several longest lines whose lengths meet the threshold length;
S205-2, keeping the lines whose parameters (ρ_t, θ_t) detected in the binary images B' of consecutive frames satisfy formula (8):

|θ_t - θ_{t-1}| ≤ Δθ,  |ρ_t - ρ_{t-1}| ≤ Δρ    (8)

where Δθ and Δρ are, respectively, the maximum allowed deviations of angle and radius, i.e. the thresholds permitted by continuity, and t is the sampling instant of the image;
S205-3, if 2 or more straight lines still qualify after S205-1 and S205-2, the line with the highest mean pixel value along it is selected as the detected waterline.
The working process of the operation error prediction unit specifically comprises the following steps: the height and width of the edge image E of the waterline are, respectively, H and W. The intersection of the horizontal line at height y with the waterline is selected as the monitoring point (x_y, y) of the vehicle's current position, and the vehicle's running deviation e(t) and running deviation change rate ec(t) are calculated by formula (9), where k_c, obtained by camera calibration, is the actual length represented by each pixel.
The intersection of the horizontal line at height y_p with the waterline is selected as the monitoring point (x_p, y_p) of the vehicle's predicted position, giving the vehicle's predicted running deviation e_p(t) and predicted deviation change rate ec_p(t) by formula (10).
The predictive controller calculates the feedforward control quantity u_f(t) of the vehicle as:

u_f(t) = k_1·e_p(t) + k_2·ec_p(t)

where u_f(t) is the feedforward control quantity of the predictive controller and k_1 and k_2 are the two parameters of the predictive controller.
The nonlinear incremental PID controller, within the nonlinear division regions of the e-ec plane, obtains the vehicle's incremental nonlinear PID control law u(t) from the running deviation e(t) and running deviation change rate ec(t).
A non-uniform division of the error e and the error change rate ec is constructed based on a Gaussian function, the e-ec plane taking the error e as its horizontal axis and the error change rate ec as its vertical axis; the error and error change rate refer to the vehicle's running deviation e(t) and running deviation change rate ec(t). The nonlinear division of the e-ec plane maps the uniform division points through a Gaussian-based formula: when the division of the error change rate ec is calculated, X_max is EC_max, and when the division of the error e is calculated, X_max is E_max, where E_max and EC_max are, respectively, the maxima of the absolute values of the error e and the error change rate ec; x_k are the uniform division points of the error e or the error change rate ec, x'_k are the non-uniform division points after the mapping, and σ is the nonlinearity-degree adjusting factor.
According to the ranges of the error e and the error change rate ec, the e-ec plane is divided non-uniformly, and the set of divided regions is denoted {R_{r,c}}. In each divided region R_{r,c} the nonlinear incremental PID controller performs a PID control, and the nonlinear PID control increment is:

Δu_{r,c}(t) = k_p^{r,c}·[e(t) - e(t-1)] + k_i^{r,c}·e(t) + k_d^{r,c}·[e(t) - 2e(t-1) + e(t-2)]

where Δu_{r,c}(t) denotes the nonlinear PID control increment of region R_{r,c} at time t when (e, ec) falls in R_{r,c}; t denotes the sampling time; and k_p^{r,c}, k_i^{r,c} and k_d^{r,c} denote, respectively, the proportional coefficient, integral coefficient and differential coefficient within region R_{r,c}, the superscripts r and c indexing the row and column of the region.
Based on the nonlinear PID control increments Δu_{r,c}(t) of all regions, the weighted average nonlinear PID control increment is calculated:

Δu(t) = Σ_{r,c} w_{r,c}·Δu_{r,c}(t) / Σ_{r,c} w_{r,c}

where w_{r,c} is the incremental control-law weight of region R_{r,c}, computed from the radii of region R_{r,c} in the error e and the error change rate ec and from the center of region R_{r,c}.
Based on the incremental control-law weights w_{r,c}, the incremental nonlinear PID control law u(t) at time t is calculated as:

u(t) = u(t-1) + λ(t)·Δu(t)

where λ(t) is the incremental factor, defined in terms of λ_max, the maximum value of the incremental factor, an offset, a quantity d(t) describing the degree to which the vehicle's running deviation e(t) and running deviation change rate ec(t) deviate from the origin, and a scaling factor.
In this way, different errors e and error change rates ec are given different incremental factors. λ(t) is the controller increment factor for the different conditions of the running deviation e(t) and running deviation change rate ec(t); its function is to improve the response speed of the system and to reduce the complexity of the optimization process.
The working process of the online learning rule unit specifically comprises the following steps:
when a number of pairs (e, ec) fall within a divided region R_{r,c}, the parameters k_p^{r,c}, k_i^{r,c}, k_d^{r,c} of the nonlinear PID control increment Δu_{r,c} within region R_{r,c} are learned online. The parameters of the nonlinear PID control increment Δu_{r,c} are learned using the supervised Hebb learning rule of formula (16), where η^{r,c} is the learning rate within region R_{r,c}. To improve learning efficiency, the learning rate η^{r,c} is adjusted based on the online adjustment rule of formula (17), where c^{r,c} is a matching coefficient within region R_{r,c} that adjusts the range of the learning rate, and the other two quantities in formula (17) are two weight coefficients within region R_{r,c}.
Compared with the prior art, the invention has the following beneficial effects:
The application discloses an online optimization control method for an unmanned line marking vehicle based on machine vision navigation, which adopts a new HARR-like feature and image processing method to realize real-time machine vision navigation of the line marking vehicle, designs a feedforward controller and a nonlinear PID controller according to the tracking error and error change rate, and discloses an online learning method for the parameters of the nonlinear PID controller, realizing path tracking control of the line marking vehicle based on machine vision navigation.
The application discloses a new HARR-like feature used to extract the straight-line feature of the waterline, which facilitates the later waterline detection; it not only meets the real-time requirement of image detection but also fully considers the background information around the waterline, improving the success rate of waterline detection.
Based on the detected waterline edge, the waterline in each frame is extracted by the HOUGH transform, and a new interference-waterline filtering method is proposed, making waterline identification robust. The e-ec plane is divided into regions according to the respective conditions of the tracking error and the error change rate, an incremental PID controller is designed in each region, and a nonlinear incremental PID controller is finally constructed, which achieves fast trajectory tracking of the marking vehicle and improves its tracking precision. The parameters of the nonlinear PID controller are adjusted with the supervised Hebb learning rule, realizing online optimization of the nonlinear PID controller. A feedforward predictive controller is designed from the predicted tracking deviation, improving the adaptability of the marking vehicle control to the waterline and enhancing the control precision and anti-interference capability of the marking vehicle.
Drawings
FIG. 1 is a schematic view of the line marking vehicle construction;
FIG. 2 shows the HARR-like features;
FIG. 3 is a schematic of the running deviation and predicted deviation calculation;
FIG. 4 shows the online optimizing control system of the unmanned line marking vehicle based on machine vision navigation;
FIG. 5 is a schematic of the nonlinear division of the e-ec plane;
FIG. 6 is a schematic of the division regions of the e-ec plane;
FIG. 7 is a schematic of the waterline image.
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the accompanying drawings, and it is to be understood that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The on-line optimizing control method of the unmanned line marking vehicle based on the machine vision navigation comprises the following steps:
s1, collecting a water line image of a road surface to obtain a data sample, wherein the water line image is shown in FIG. 7:
the gray level image of the asphalt pavement is acquired through the vision sensor arranged on the line marking vehicle in the figure 1, a water line is drawn on the pavement, and water line information is arranged in the image. The vision sensor is arranged at the height of 35-45cm away from the asphalt pavement and ensures that the length of a view field in the advancing direction of the scribing vehicle is 30-40cm. In order to ensure that the visual sensor is resistant to the interference of external light, certain shading equipment is generally arranged outside the visual sensor.
S2, carrying out waterline detection based on the waterline image to obtain a waterline;
the step S2 specifically includes the following steps:
s201, aiming at pixel points on the water line image
Figure 726984DEST_PATH_IMAGE321
Performing gray scale stretching, wherein the stretching formula is shown as formula (1):
Figure 666121DEST_PATH_IMAGE322
wherein the content of the first and second substances,
Figure 271546DEST_PATH_IMAGE323
Figure 346949DEST_PATH_IMAGE324
Figure 746838DEST_PATH_IMAGE325
is a stretch factor; m, n represents the mth row and nth column of the image;
Figure 173271DEST_PATH_IMAGE326
is shown as
Figure 310948DEST_PATH_IMAGE327
Go to the first
Figure 506437DEST_PATH_IMAGE328
After gray value of the row pixel points is subjected to gray value stretching of the gray value water line image, the linear characteristic of each pixel point is expressed based on the HARR-like characteristic, and the HARR-like characteristic is shown in figure 2: in the embodiment, 6 HARR-like features are selected according to engineering conditions, the number of the HARR-like features is large, and the HARR-like features are designed in the HARR-like features M1-M6 according to actual conditions to divide different regions. Assuming that the width of the water line region is 2w, where w is half the width of the water line, the width of the road surface region and the transition region in the HARR-like features M1-M3 is w. In the HARR-like features M4-M6, the width of each region is 2w, and the height of all HARR-like features is 4w-6w. For the rotation features M2, M3, M5, M6, the rotation angle is 15 degrees, which is determined by the field waterline characteristics.
Because the boundary of the waterline area is easy to be fuzzy, in the embodiment, the transition area is arranged between the road surface area and the waterline area, so that the fuzzy interference of the waterline boundary on the feature extraction is avoided, namely, when the feature is extracted, the transition area is not considered, and the transition area is ignored.
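A minimal sketch of one axis-aligned HARR-like response computed with an integral image for speed; the geometry follows the embodiment above (waterline region 2w wide, road regions w wide, transition regions ignored), while the function names are illustrative:

```python
import numpy as np

def integral_image(gray):
    """Summed-area table with a zero border, so box sums need no special cases."""
    ii = np.zeros((gray.shape[0] + 1, gray.shape[1] + 1), dtype=np.float64)
    ii[1:, 1:] = gray.cumsum(axis=0).cumsum(axis=1)
    return ii

def box_sum(ii, top, left, h, w):
    """Pixel sum of the h x w box with top-left corner (top, left)."""
    return ii[top + h, left + w] - ii[top, left + w] - ii[top + h, left] + ii[top, left]

def harr_axis_aligned(ii, m, n, w, height):
    """Response at pixel (m, n): waterline-region sum minus road-region sum
    (formula (2)), skipping the transition regions of width w on both sides.
    The caller must keep the whole window inside the image."""
    top = m - height // 2
    line = box_sum(ii, top, n - w, height, 2 * w)       # waterline region, 2w wide
    road = (box_sum(ii, top, n - 3 * w, height, w)      # left road region, w wide
            + box_sum(ii, top, n + 2 * w, height, w))   # right road region, w wide
    return line - road
```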
S202, the HARR-like features are used to highlight the straight-line feature of the waterline on the road surface, facilitating the later waterline identification. The i-th HARR-like feature F_i (i = 1, ..., Q) is calculated as formula (2):

F_i = S_i^line - S_i^road    (2)

In this embodiment Q = 6; S_i^line is the pixel sum of the waterline region of the i-th HARR-like feature, S_i^road is the pixel sum of the road-surface region of the i-th HARR-like feature, and both are calculated from the gray values g'(m, n) of the stretched gray image.
After the HARR-like feature description of each pixel is obtained, each HARR-like feature is normalized according to the normalization formula (3), where F̃_i is the normalized HARR feature and μ and μ₂ are, respectively, the mean gray value and the mean squared gray value within the detection window.
Based on the 6 normalized HARR-like features (see formula (3)), the feature vector V(m, n) = [F̃_1, ..., F̃_6] of pixel (m, n) is constructed.
S203, to detect the waterline in the image, the feature vector V(m, n) of each pixel (m, n) is classified to judge whether it conforms to the straight-line characteristic; if so, the pixel is set to 1, otherwise to 0, which binarizes the gray image into a binary image B. The innovation of the method is that the image binarization problem is turned into a pattern recognition problem on the features.
The feature vectors V(m, n) are classified by the SVM method with the kernel function of formula (5), built from the covariance matrix Σ of the feature set. Formula (5) is the kernel function of the SVM classification method; it maps the feature vector V(m, n) to a higher dimension, and the SVM judges whether the feature is a straight-line feature, performing a two-class classification. A feature vector is constructed for each pixel according to the HARR-like features, and the SVM then judges whether the pixel represented by each feature vector lies on a straight line.
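A sketch of the S203 classification under assumptions: since formula (5) is available only as an image, the kernel below is a Mahalanobis-style RBF built from the covariance matrix Σ of the feature set, plugged into scikit-learn's SVC as a callable kernel; the kernel form itself and all names are illustrative:

```python
import numpy as np
from sklearn.svm import SVC

def make_mahalanobis_kernel(features):
    """Kernel K(v1, v2) = exp(-(v1-v2)^T Sigma^-1 (v1-v2)), with Sigma the
    covariance matrix of the training feature set (assumed form of formula (5))."""
    sigma_inv = np.linalg.pinv(np.cov(features, rowvar=False))

    def kernel(A, B):
        # Pairwise (a-b)^T Sigma^-1 (a-b) for all rows of A against all rows of B.
        d = A[:, None, :] - B[None, :, :]
        q = np.einsum('ijk,kl,ijl->ij', d, sigma_inv, d)
        return np.exp(-q)
    return kernel

# Usage sketch: X is the (n_pixels, 6) matrix of normalized HARR-like feature
# vectors, y the 0/1 labels (pixel on the waterline or not).
# clf = SVC(kernel=make_mahalanobis_kernel(X)).fit(X, y)
# B = clf.predict(X_new)   # binarized decision for new pixels
```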
after obtaining the binary image B, carrying out expansion corrosion treatment on the binary image B to obtain a new binary image
Figure 534093DEST_PATH_IMAGE354
Obtaining an edge image E of the waterline by adopting a formula (6),
Figure 566771DEST_PATH_IMAGE355
Figure 625994DEST_PATH_IMAGE356
representing a new binary image
Figure 136740DEST_PATH_IMAGE357
To (1) a
Figure 965019DEST_PATH_IMAGE358
Go to the first
Figure 434178DEST_PATH_IMAGE359
Pixel values of the columns;
Figure 715118DEST_PATH_IMAGE360
represents the edge image E
Figure 295135DEST_PATH_IMAGE361
The rows of the image data are, in turn,
Figure 977920DEST_PATH_IMAGE362
pixel values of the columns;
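An OpenCV sketch of this cleanup step; taking the mask minus its erosion as the edge image is an assumption, since formula (6) is available only as an image:

```python
import cv2
import numpy as np

def waterline_edges(B):
    """B: uint8 binary mask (0/1) from the SVM step. Dilate then erode to close
    small gaps, then take the mask minus its erosion as the edge image E."""
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
    Bp = cv2.erode(cv2.dilate(B, kernel), kernel)   # dilation-erosion cleanup
    E = Bp - cv2.erode(Bp, kernel)                  # boundary pixels of B'
    return Bp, E
```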
s204, identifying a straight line in the edge image E by adopting a HOUGH conversion method:
from new binary images
Figure 612120DEST_PATH_IMAGE363
Where the coordinates of the pixel having the pixel value of 1 are found to be
Figure 380356DEST_PATH_IMAGE364
Obtained by HOUGH conversion
Figure 498485DEST_PATH_IMAGE365
Obtained by the formula (7)
Figure 504618DEST_PATH_IMAGE366
A corresponding linear expression;
Figure 863049DEST_PATH_IMAGE367
wherein the content of the first and second substances,
Figure 321844DEST_PATH_IMAGE368
respectively, the radius and angle (variable representation of straight lines) detected by the HOUGH transform.
A group of
Figure 243663DEST_PATH_IMAGE368
Representing a straight line, given a set
Figure 635462DEST_PATH_IMAGE369
If at all
Figure 617324DEST_PATH_IMAGE370
Satisfies the formula (7)
Figure 88713DEST_PATH_IMAGE371
On the straight line represented by formula (7);
s205, due to the influence of noise in the image, a plurality of straight lines are detected, and the interference straight lines are removed to obtain real straight lines. The interference line removing method specifically comprises the following steps:
s205-1: selecting 3-5 straight lines with the longest length, wherein the length of the straight lines meets the threshold length (the length is greater than or equal to 1/4 of the image height, and the threshold length is 1/4 of the image height in the embodiment);
s205-2: because the waterline is continuous, the linear parameters detected by the two frames of images before and after the waterline are continuous
Figure 548645DEST_PATH_IMAGE372
The requirement of formula (8) is satisfied, namely:
Figure 794949DEST_PATH_IMAGE373
wherein, the first and the second end of the pipe are connected with each other,
Figure 682134DEST_PATH_IMAGE374
Figure 115521DEST_PATH_IMAGE375
are respectively as
Figure 379143DEST_PATH_IMAGE376
The maximum allowable deviation angle and radius,
Figure 479954DEST_PATH_IMAGE377
is a threshold value that is allowed for continuity,
Figure 272461DEST_PATH_IMAGE378
is the sampling instant of the image.
S205-3: if a plurality of straight lines are still satisfied after passing through S205-1 and S205-2 (there are 2 or 2 straight lines in a line), the straight line with the highest pixel average value on the straight line is selected as the detected waterline, because the waterline is generally white and the brightness is highest in the image.
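For S204, OpenCV's standard Hough transform produces the (ρ, θ) pairs directly from the edge image; the resolution and vote-threshold values below are illustrative:

```python
import cv2
import numpy as np

def detect_lines(E, rho_res=1.0, theta_res=np.pi / 180, votes=80):
    """Run the standard Hough transform on the 0/1 edge image E and return
    a list of (rho, theta) pairs, one per detected straight line."""
    lines = cv2.HoughLines((E * 255).astype(np.uint8), rho_res, theta_res, votes)
    return [] if lines is None else [tuple(l[0]) for l in lines]
```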
S3, based on the detected waterline, selecting a monitoring point at the vehicle's current position and calculating the vehicle's running deviation e(t) and running deviation change rate ec(t); selecting a monitoring point at the vehicle's predicted position and calculating the vehicle's predicted running deviation e_p(t) and predicted deviation change rate ec_p(t).
The detected waterline is shown in FIG. 3; H and W are, respectively, the height and width of the edge image E (the same as the height and width of the waterline image). The intersection of the horizontal line at 0.7H with the waterline is taken as the monitoring point of the vehicle's current position; substituting y = 0.7H into formula (7) gives the abscissa x_y of the current-position monitoring point. The vehicle's running deviation e(t) and running deviation change rate ec(t) are then given by formula (9).
0.7H is the value of y in this embodiment. In general, the distance between the two horizontal lines in FIG. 3 should be at least the distance the vehicle can travel in one control cycle, but the two horizontal lines must not be too close to the upper and lower boundaries of the image; according to the set vehicle speed and the field of view of the vision sensor, this embodiment selects the intersection of the horizontal line at 0.7H with the waterline as the monitoring point of the current position.
In formula (9), k_c, obtained by camera calibration, is the actual length represented by each pixel (the actual distance represented by one pixel unit). In this embodiment, y_p takes the value 0.3H: the intersection of the horizontal line at 0.3H with the waterline is taken as the monitoring point of the vehicle's predicted position, giving the vehicle's predicted running deviation e_p(t) and predicted deviation change rate ec_p(t) by formula (10).
This division is reasonable because the vehicle speed is approximately 40 m/min and the effective field of view of the vision sensor is 20-30 cm.
As seen in FIG. 3, the error at the current position is to the left, while that at the predicted position is to the right. Therefore the error must not be over-corrected at the current position, and the prediction error can be used to fine-tune the current adjustment.
S4, according to the predicted operation deviation of the vehicle
Figure 536584DEST_PATH_IMAGE393
And predicting the rate of change of deviation
Figure 543854DEST_PATH_IMAGE394
Obtaining a predicted feedforward control quantity
Figure 148142DEST_PATH_IMAGE395
Based on running deviation of the vehicle
Figure 899060DEST_PATH_IMAGE396
Rate of change of deviation from running
Figure 803563DEST_PATH_IMAGE397
Obtaining an incremental nonlinear PID control law for a vehicle
Figure 981734DEST_PATH_IMAGE398
Step S4 specifically includes the following steps:
s401, according to the predicted operation deviation of the vehicle
Figure 870056DEST_PATH_IMAGE399
And predicting the rate of change of deviation
Figure 424665DEST_PATH_IMAGE400
Predicting a feedforward control amount of the vehicle
Figure 443393DEST_PATH_IMAGE401
Figure 792466DEST_PATH_IMAGE402
Figure 902505DEST_PATH_IMAGE403
In order to predict the amount of feedforward control,
Figure 260805DEST_PATH_IMAGE404
and
Figure 874320DEST_PATH_IMAGE405
are two parameters of predictive control;
s402, in the area of the e-ec plane, based on the running deviation of the vehicle
Figure 394294DEST_PATH_IMAGE406
Rate of change of deviation from running
Figure 929312DEST_PATH_IMAGE407
Obtaining an incremental nonlinear PID control law for a vehicle
Figure 825724DEST_PATH_IMAGE408
The error e or the error change rate ec of the system conforms to Gaussian distribution, so that the non-uniform division method of the error and the error change rate is constructed based on a Gaussian function, and the e-ec plane takes the error e as the horizontal axis and the error change rate ec as the vertical axis; error or rate of change of error refers to deviation in the operation of the vehicle
Figure 824904DEST_PATH_IMAGE409
Rate of change of deviation from running
Figure 984621DEST_PATH_IMAGE410
(ii) a The e-ec plane division adopts a nonlinear division method:
When the division of the error change rate ec is calculated, X_max is EC_max; when the division of the error e is calculated, X_max is E_max. E_max and EC_max are, respectively, the maxima of the absolute values of the error e and of the error change rate ec. x_i ∈ {x_1, …, x_n} are the equal (uniform) division points of the error e or the error change rate ec, x̂_i are the non-uniform division points after the mapping, and τ is the nonlinearity degree adjustment factor. The non-uniform partitioning diagram is shown in fig. 5.
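One Gaussian-based mapping with the stated properties (uniform points pulled toward the origin, endpoints preserved, nonlinearity set by τ) is sketched below; the exact expression of formula (12) is an image in the source, so this form is an assumption:

import numpy as np

def nonuniform_partition(x_max, n, tau=2.0):
    # Assumed Gaussian-style warp for formula (12): uniform points x_i on
    # [-x_max, x_max] are pulled toward the origin (finer regions for small
    # errors), normalised so the extreme points stay at +/- x_max; tau is
    # the nonlinearity degree adjustment factor.
    x = np.linspace(-x_max, x_max, n)                      # uniform points x_i
    g = 1.0 - np.exp(-tau * (x / x_max) ** 2)              # Gaussian-based warp
    return np.sign(x) * x_max * g / (1.0 - np.exp(-tau))   # mapped points

e_edges = nonuniform_partition(x_max=10.0, n=9)    # division of the error e
ec_edges = nonuniform_partition(x_max=5.0, n=9)    # division of the change rate ec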
According to the ranges of the error e and the error change rate ec, the e-ec plane is divided non-uniformly; the set of divided regions (each square in fig. 6 is one region) is denoted {R_ij}, as shown in fig. 6:
Within each divided region R_ij, PID control is carried out by the nonlinear incremental PID controller; the nonlinear PID control increment Δu_ij(k) is given by formula (13),
where Δu_ij(k) denotes the nonlinear PID control increment of region R_ij at time k; k denotes the sampling instant; k_p^ij denotes the proportional coefficient within region R_ij (p denotes proportional, i the row, j the column); k_i^ij denotes the integral coefficient, and k_d^ij denotes the differential coefficient;
Based on the nonlinear PID control increments Δu_ij(k) of all regions R_ij, the weighted average nonlinear PID control increment is calculated by formula (14),
where w_ij is the incremental control law weight of region R_ij, r_ej and r_eci are the radii of region R_ij in the error e and in the error change rate ec (each square in fig. 6), and (e_j, ec_i) is the center of region R_ij.
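A sketch of the region-weighted aggregation follows; the canonical incremental PID form for formula (13) and the Gaussian fall-off of the weight w_ij for formula (14) are assumptions built from the stated ingredients (region centers, radii and per-region gains):

import numpy as np

def region_increment(e, e1, e2, kp, ki, kd):
    # Canonical incremental PID, assumed for formula (13):
    # delta_u = kp*(e(k)-e(k-1)) + ki*e(k) + kd*(e(k)-2*e(k-1)+e(k-2))
    return kp * (e - e1) + ki * e + kd * (e - 2.0 * e1 + e2)

def weighted_increment(e, ec, e_hist, centers, radii, gains):
    # centers[i][j] = (e_j, ec_i), radii[i][j] = (r_ej, r_eci),
    # gains[i][j] = (kp, ki, kd). The weight w_ij falls off with the
    # Gaussian distance of (e, ec) from each region centre (assumed
    # form of formula (14)); the result is the weighted average increment.
    e1, e2 = e_hist                          # e(k-1), e(k-2)
    num, den = 0.0, 0.0
    for i, row in enumerate(centers):
        for j, (ej, eci) in enumerate(row):
            r_ej, r_eci = radii[i][j]
            w = np.exp(-((e - ej) / r_ej) ** 2 - ((ec - eci) / r_eci) ** 2)
            num += w * region_increment(e, e1, e2, *gains[i][j])
            den += w
    return num / max(den, 1e-12)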
Synthesizing the above incremental control law weights w_ij, the incremental nonlinear PID control law u(k) at time k is calculated as:
u(k) = u(k-1) + ξ(e(k), ec(k))·Δu(k)   (15)
where ξ(e(k), ec(k)) > 0 is an incremental factor,
in which ξ_max > 0 is the maximum value of the incremental factor and ξ_0 > 0 is an offset; the exponential term describes the degree to which the running deviation e(k) and the running deviation change rate ec(k) deviate from the origin, and s > 0 is a scaling factor. This definition gives different increment factors for different e and ec.
ξ(e(k), ec(k)) supplies the controller increment factor under different running-deviation and deviation-change-rate conditions; its role is to improve the response speed of the system and to reduce the complexity of the optimization process.
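The increment factor and the update of formula (15) can be sketched as follows; the exponential form of ξ is an assumption assembled from its stated ingredients (maximum ξ_max, offset ξ_0, distance of (e, ec) from the origin, scaling factor s):

import math

def increment_factor(e, ec, xi_max=1.0, xi_0=0.1, s=5.0):
    # Assumed form of the incremental factor xi(e(k), ec(k)) > 0: it grows
    # from the offset xi_0 toward the maximum xi_max as the point (e, ec)
    # moves away from the origin of the e-ec plane; s is the scaling factor.
    dist2 = (e * e + ec * ec) / (s * s)
    return xi_0 + (xi_max - xi_0) * (1.0 - math.exp(-dist2))

def control_law(u_prev, e, ec, delta_u):
    # Formula (15): u(k) = u(k-1) + xi(e(k), ec(k)) * delta_u(k)
    return u_prev + increment_factor(e, ec) * delta_u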
S5, performing online learning on the parameters k_p^ij, k_i^ij, k_d^ij of the nonlinear PID control increment Δu_ij(k) within each region;
S5 specifically comprises the following steps: a number of pairs (e(k), ec(k)) fall within a divided region R_ij; the parameters k_p^ij, k_i^ij, k_d^ij of the nonlinear PID control increment Δu_ij(k) within region R_ij are then learned online.
The parameters of the nonlinear PID control increment Δu_ij(k) are learned with the supervised Hebb learning rule (formula (16)),
where η_ij is the learning rate within region R_ij. To improve learning efficiency, the learning rate η_ij is adjusted based on the online adjustment rule η_ij(k), namely:
η_ij(k) = ψ_ij·(1-exp(-υ_ij·ec²(k)-v_ij·e²(k)))   (17)
where ψ_ij is the matching coefficient within region R_ij, which adjusts the range of the learning rate, and υ_ij, v_ij are two weight coefficients within region R_ij. The two weight coefficients may be given manually based on historical data.
As shown in fig. 4, the online optimizing control system of the unmanned line marking vehicle based on machine vision navigation comprises a vision sensor, a waterline detection unit, an operation error prediction unit, a prediction controller, a nonlinear increment PID controller and an online learning rule unit;
The vision sensor collects a waterline image of the road surface (fig. 7) and acquires a data sample:
as shown in fig. 1, a gray-level image of the asphalt pavement is acquired through the vision sensor mounted on the line marking vehicle; a waterline drawn in advance is present on the pavement, and the image must contain this waterline information. The vision sensor is mounted 35-45 cm above the asphalt pavement, which ensures that the field of view along the advancing direction of the marking vehicle is 30-40 cm long. To make the vision sensor resistant to interference from external light, a shading device is generally arranged around it.
The waterline detection unit performs waterline detection based on the waterline image to obtain a waterline;
the working process of the waterline detection unit specifically comprises the following steps:
S201, performing gray-scale stretching on each pixel point I(m,n) of the waterline image, with the stretching formula shown in formula (1),
where λ > 1 is the stretching factor, m and n denote the m-th row and the n-th column of the image, and I(m,n) denotes the gray value of the pixel point at row m, column n. After the gray values of the waterline image are stretched, the straight-line characteristic of each pixel point is expressed based on HARR-like features, as shown in fig. 2:
In this embodiment, 6 HARR-like features are selected according to the engineering conditions; more HARR-like features are possible, and the regions inside features M1-M6 are laid out according to the actual situation. Assuming the width of the waterline region is 2w, where w is half the waterline width, the widths of the road-surface region and of the transition region in features M1-M3 are w. In features M4-M6 the width of each region is 2w, and the height of all HARR-like features is 4w-6w. For the rotated features M2, M3, M5 and M6 the rotation angle is 15 degrees, determined by the on-site waterline characteristics.
S202, the HARR-like features are used to highlight the straight-line feature of the waterline on the road surface, which facilitates the later waterline identification. The i-th HARR-like feature h_i is calculated as in formula (2).
In the present embodiment Q = 6; the two pixel sums in formula (2) are, respectively, the pixel sum of the waterline region and the pixel sum of the road-surface region of the i-th HARR-like feature, both computed from the gray values I(m,n) of the stretched gray image;
After the HARR-like (straight-line) feature description of each pixel point is obtained, each HARR-like feature is normalized, with the normalization formula as in formula (3),
where ĥ_i is the normalized HARR-like feature and mn and sqmn are, respectively, the mean of the gray values and the mean of the squared gray values within the detection window.
Based on the 6 normalized HARR-like features (see formula (3)), the feature vector F(m,n) = (ĥ_1, …, ĥ_6) of pixel point (m,n) is constructed (formula (4)).
S203, in order to detect the waterline in the image, the feature vector F(m,n) of each pixel point (m,n) is classified to determine whether it conforms to the straight-line feature; if so, the pixel point is set to 1, otherwise to 0, realizing binarization of the gray image and yielding a binary image B. The innovation of the method is that the binarization problem of the image is turned into a pattern recognition problem on the features.
The constructed feature vectors are identified with an SVM whose kernel function is given by formula (5),
where Σ is the covariance matrix of the feature set and Σ^-1 is its inverse. Formula (5) is the kernel function designed for the SVM classification method: it maps the feature vector F(m,n) to a higher dimension, and the SVM judges by two-class classification whether the feature is a straight-line feature. A straight-line feature vector is constructed for each pixel point from the HARR-like features, and the SVM then judges whether the pixel represented by each straight-line feature vector lies on a straight line.
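A sketch of the two-class SVM with a Mahalanobis-style kernel built from the covariance matrix Σ of the feature set follows; the kernel expression is an assumption matching the stated ingredients of formula (5), and the training data here are toy values:

import numpy as np
from sklearn.svm import SVC

def mahalanobis_rbf(S_inv):
    # Assumed kernel of formula (5): K(u, v) = exp(-(u-v)^T S^-1 (u-v)),
    # with S the covariance matrix of the HARR-like feature set.
    def k(X, Y):
        d = X[:, None, :] - Y[None, :, :]
        return np.exp(-np.einsum('abi,ij,abj->ab', d, S_inv, d))
    return k

# Toy data: F_train holds (N, 6) feature vectors, y_train marks waterline pixels.
F_train = np.random.rand(200, 6)
y_train = (F_train[:, 0] > 0.5).astype(int)
S_inv = np.linalg.inv(np.cov(F_train, rowvar=False) + 1e-6 * np.eye(6))
clf = SVC(kernel=mahalanobis_rbf(S_inv)).fit(F_train, y_train)
binary = clf.predict(F_train)   # 1 -> pixel set to 1 in the binary image B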
After the binary image is obtained, dilation and erosion are applied to the binary image B to obtain a new binary image Bn, and the edge image E of the waterline is obtained with formula (6):
E(m,n) = Bn(m,n) - Bn(m,n-1)   (6)
where Bn(m,n) denotes the pixel value at row m, column n of the new binary image Bn, and E(m,n) denotes the pixel value at row m, column n of the edge image E;
S204, identifying straight lines in the edge image E with the HOUGH transform method:
the coordinates (x, y) of the pixels whose value is 1 are found in the new binary image Bn; θ and ρ of each straight line are obtained by the HOUGH transform, and the straight-line expression corresponding to θ and ρ is obtained with formula (7):
x = ρ/cosθ - y·tanθ   (7)
where θ and ρ are, respectively, the angle and the radius detected by the HOUGH transform (the variable representation of a straight line). A pair (θ, ρ) represents one straight line; given a pair (θ, ρ), if (x, y) satisfies formula (7), then (x, y) lies on the straight line represented by formula (7);
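This step maps directly onto OpenCV's standard Hough transform, as sketched below; cv2.HoughLines returns (ρ, θ) pairs in the same parameterisation, from which formula (7) gives x for any row y:

import cv2
import numpy as np

def detect_lines(edge_image, rho_res=1.0, theta_res=np.pi / 180, votes=80):
    # HOUGH transform on the 0/1 edge image E; returns a list of (theta, rho).
    lines = cv2.HoughLines((edge_image * 255).astype(np.uint8),
                           rho_res, theta_res, votes)
    if lines is None:
        return []
    return [(float(theta), float(rho)) for rho, theta in lines[:, 0]]

def x_on_line(y, theta, rho):
    # Formula (7): x = rho / cos(theta) - y * tan(theta)
    return rho / np.cos(theta) - y * np.tan(theta)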
S205, owing to noise in the image, several straight lines are detected; the interference lines must be removed to obtain the real one. The interference-line removal specifically comprises the following steps:
S205-1, selecting the 3-5 longest straight lines whose length meets the threshold length (the length is greater than or equal to 1/4 of the image height; in this embodiment the threshold length is 1/4 of the image height);
S205-2, because the waterline is continuous, the straight-line parameters (θ, ρ) detected in two consecutive frame images satisfy the requirement of formula (8), namely a continuity constraint on Δ = [θ_k - θ_{k-1}, ρ_k - ρ_{k-1}]^T,
where α and β are, respectively, the maximum allowed deviations of the angle θ and the radius ρ, δ is the threshold allowed for continuity, and k is the sampling instant of the image.
S205-3, if several straight lines still meet the requirements after S205-1 and S205-2, the straight line with the highest mean pixel value along it is selected as the detected waterline, because the waterline is generally white and is the brightest object in the image.
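The three filtering rules S205-1 to S205-3 can be sketched as follows; writing the continuity test of formula (8) as two component-wise bounds is an assumption, the original equation image being unrecoverable:

def pick_waterline(lines, prev, H, alpha=0.1, beta=10.0, keep=5):
    # lines: [(theta, rho, length, mean_brightness)] for the current frame;
    # prev: (theta, rho) retained from the previous frame, or None.
    # S205-1: longest lines whose length reaches the threshold (H / 4).
    cand = [l for l in sorted(lines, key=lambda l: -l[2])[:keep] if l[2] >= H / 4]
    # S205-2: frame-to-frame continuity, assumed reading of formula (8).
    if prev is not None:
        th0, rh0 = prev
        cand = [l for l in cand
                if abs(l[0] - th0) <= alpha and abs(l[1] - rh0) <= beta]
    # S205-3: the waterline is white, so take the brightest remaining line.
    return max(cand, key=lambda l: l[3]) if cand else None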
The operation error prediction unit selects a monitoring point of the current vehicle position based on the detected waterline and calculates the running deviation e(k) and the running deviation change rate ec(k) of the vehicle; it then selects a monitoring point of the predicted vehicle position and calculates the predicted running deviation e_p(k) and the predicted deviation change rate ec_p(k). The detected waterline is shown in fig. 3.
As shown in fig. 3, H and W are, respectively, the height and the width of the edge image E (the same as the height and width of the waterline image). The intersection of the horizontal line at 0.7H with the waterline is taken as the monitoring point of the current vehicle position; substituting y = 0.7H into formula (7) yields the abscissa X_c of the current monitoring point. (0.7H is the value used in this embodiment: the distance between the two horizontal lines in fig. 3 should be at least the distance the vehicle can travel in one control cycle, yet neither horizontal line may lie too close to the upper or lower image boundary; according to the set vehicle speed and the field of view of the vision sensor, this embodiment uses 0.7H.) The running deviation e(k) and the running deviation change rate ec(k) of the vehicle are then given by formula (9),
where κ, obtained by camera calibration, represents the actual length represented by each pixel (the real-world distance covered by one pixel unit). The intersection of the horizontal line at 0.3H with the waterline is taken as the monitoring point of the predicted vehicle position, which yields the predicted running deviation e_p(k) and the predicted deviation change rate ec_p(k) of formula (10). This division is reasonable because the vehicle speed is approximately 40 m/min and the effective field of view of the vision sensor is 20-30 cm.
As seen in fig. 3, the error at the current position is to the left, while at the predicted position it is to the right. The error therefore must not be over-corrected at the current position, and the current adjustment can be fine-tuned using the prediction error.
The prediction controller predicts the feedforward control quantity u_p(k) of the vehicle from the predicted running deviation e_p(k) and the predicted deviation change rate ec_p(k) by formula (11),
where u_p(k) is the feedforward control quantity of the prediction controller and the two remaining coefficients in formula (11) are the two parameters of the prediction controller;
The nonlinear incremental PID controller obtains, within the nonlinearly divided regions of the e-ec plane, the incremental nonlinear PID control law u(k) of the vehicle based on the running deviation e(k) and the running deviation change rate ec(k).
The error e and the error change rate ec of the system conform to a Gaussian distribution, so the non-uniform division of the error e and the error change rate ec is constructed based on a Gaussian function; the e-ec plane takes the error e as the horizontal axis and the error change rate ec as the vertical axis. Error and error change rate here refer to the running deviation e(k) and the running deviation change rate ec(k) of the vehicle. The e-ec plane is divided with the nonlinear division method of formula (12).
When the division of the error change rate ec is calculated, X_max is EC_max; when the division of the error e is calculated, X_max is E_max. E_max and EC_max are, respectively, the maxima of the absolute values of the error e and of the error change rate ec; x_i ∈ {x_1, …, x_n} are the equal (uniform) division points of the error e or the error change rate ec, x̂_i are the non-uniform division points after the mapping, and τ is the nonlinearity degree adjustment factor. The non-uniform partitioning diagram is shown in fig. 5.
According to the ranges of the error e and the error change rate ec, the e-ec plane is divided non-uniformly; the set of divided regions (each square in fig. 6 is one region) is denoted {R_ij}, as shown in fig. 6:
Within each divided region R_ij, PID control is performed by the nonlinear incremental PID controller; the nonlinear PID control increment Δu_ij(k) is given by formula (13),
where Δu_ij(k) denotes the nonlinear PID control increment of region R_ij at time k; k denotes the sampling instant; k_p^ij denotes the proportional coefficient within region R_ij (p denotes proportional, i the row, j the column); k_i^ij denotes the integral coefficient, and k_d^ij denotes the differential coefficient.
Based on the nonlinear PID control increments Δu_ij(k) of all regions R_ij, the weighted average nonlinear PID control increment is calculated by formula (14),
where w_ij is the incremental control law weight of region R_ij, r_ej and r_eci are the radii of region R_ij in the error e and in the error change rate ec (each square in fig. 6), and (e_j, ec_i) is the center of region R_ij.
Synthesizing the above incremental control law weights w_ij, the incremental nonlinear PID control law u(k) at time k is calculated as:
u(k) = u(k-1) + ξ(e(k), ec(k))·Δu(k)   (15)
where ξ(e(k), ec(k)) > 0 is an incremental factor,
in which ξ_max > 0 is the maximum value of the incremental factor and ξ_0 > 0 is an offset; the exponential term describes the degree to which the running deviation e(k) and the running deviation change rate ec(k) of the vehicle deviate from the origin, and s > 0 is a scaling factor.
This definition gives different increment factors for different errors e and error change rates ec. ξ(e(k), ec(k)) supplies the controller increment factor under different running-deviation and deviation-change-rate conditions; its role is to improve the response speed of the system and to reduce the complexity of the optimization process.
The online learning rule unit performs online learning of the parameters k_p^ij, k_i^ij, k_d^ij of the nonlinear PID control increment Δu_ij(k) within each region, using the supervised Hebb learning rule to learn the parameters of the nonlinear PID control increment Δu_ij(k);
The process of the online learning rule unit is specifically as follows: a number of pairs (e(k), ec(k)) fall within a divided region R_ij; the parameters k_p^ij, k_i^ij, k_d^ij of the nonlinear PID control increment Δu_ij(k) within region R_ij are then learned online. The supervised Hebb learning rule is used to learn the parameters of the nonlinear PID control increment Δu_ij(k) (formula (16)),
where η_ij is the learning rate within region R_ij. To improve learning efficiency, the learning rate η_ij is adjusted based on the online adjustment rule η_ij(k), namely:
η_ij(k) = ψ_ij·(1-exp(-υ_ij·ec²(k)-v_ij·e²(k)))   (17)
where ψ_ij is the matching coefficient within region R_ij, which adjusts the range of the learning rate, and υ_ij, v_ij are two weight coefficients within region R_ij. The two weight coefficients may be given manually based on historical data.
This unit adjusts the controller parameters of each region online and realizes online adjustment of the learning rate of each region.
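Pulling the units together, one control cycle can be skeletonised as below, in the order the system description gives (error prediction, prediction controller, nonlinear incremental PID controller, online learning rule unit); every helper is one of the illustrative sketches above, not the patent's exact formulas:

def control_cycle(theta, rho, state):
    # One cycle: detected waterline -> errors -> u_p(k) and u(k) -> learning.
    # state carries u(k-1), e(k-1), e(k-2), e_p(k-1), the region layout and
    # the per-region gains; the helpers are the sketches defined earlier.
    e, ec, e_p, ec_p = monitoring_errors(        # operation error prediction
        theta, rho, state['H'], state['W'], state['kappa'],
        state['e1'], state['ep1'])
    u_p = predicted_feedforward(e_p, ec_p)       # prediction controller
    du = weighted_increment(e, ec, (state['e1'], state['e2']),
                            state['centers'], state['radii'], state['gains'])
    u = control_law(state['u1'], e, ec, du)      # formula (15)
    i, j = state['active_region']                # region R_ij holding (e, ec)
    state['gains'][i][j] = hebb_update(          # online learning rule unit
        state['gains'][i][j], e, state['e1'], state['e2'], u)
    state.update(u1=u, e2=state['e1'], e1=e, ep1=e_p)
    return u + u_p                               # total steering command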
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules or units or groups of devices in the examples disclosed herein may be arranged in a device as described in this embodiment, or alternatively may be located in one or more devices different from the devices in this example. The modules in the foregoing examples may be combined into one module or may be further divided into multiple sub-modules.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. Modules or units or groups in embodiments may be combined into one module or unit or group and may furthermore be divided into sub-modules or sub-units or sub-groups. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
Furthermore, some of the described embodiments are described herein as a method or combination of method elements that can be performed by a processor of a computer system or by other means of performing the described functions. A processor having the necessary instructions for carrying out the method or method elements thus forms a means for carrying out the method or method elements. Further, the elements of the apparatus embodiments described herein are examples of the following apparatus: the apparatus is used to implement the functions performed by the elements for the purpose of carrying out the invention.
The various techniques described herein may be implemented in connection with hardware or software or, alternatively, with a combination of both. Thus, the methods and apparatus of the present invention, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium, wherein, when the program is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the invention.
In the case of program code execution on programmable computers, the computing device will generally include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. Wherein the memory is configured to store program code; the processor is configured to perform the method of the invention according to instructions in said program code stored in the memory.
By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media store information such as computer readable instructions, data structures, program modules or other data. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. Combinations of any of the above are also included within the scope of computer readable media.
As used herein, unless otherwise specified the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of this description, will appreciate that other embodiments can be devised which do not depart from the scope of the invention as described herein. Furthermore, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the appended claims. The present invention has been disclosed in an illustrative rather than a restrictive sense, and the scope of the present invention is defined by the appended claims.

Claims (6)

1. The on-line optimization control method of the unmanned line marking vehicle based on machine vision navigation is characterized by comprising the following steps:
S1, collecting a waterline image of a road surface;
S2, performing waterline detection based on the waterline image to obtain a waterline;
S3, based on the detected waterline, selecting a monitoring point of the current position of the vehicle, calculating the running deviation e(k) and the running deviation change rate ec(k) of the vehicle, selecting a monitoring point of the predicted position of the vehicle, and calculating the predicted running deviation e_p(k) and the predicted deviation change rate ec_p(k) of the vehicle;
S4, based on the predicted running deviation e_p(k) and the predicted deviation change rate ec_p(k) of the vehicle, obtaining a predicted feedforward control quantity u_p(k), and obtaining an incremental nonlinear PID control law u(k) of the vehicle based on the operation deviation e(k) and the operation deviation change rate ec(k) of the vehicle;
S5, performing online learning of the parameters of the incremental nonlinear PID control law u(k) within each region;
the step S3 specifically includes the following steps:
the height and the width of the edge image E where the waterline is located are H and W respectively;
the intersection point of the horizontal line at height y with the waterline is selected as the monitoring point (X_c, y) of the current position of the vehicle, and the running deviation e(k) and the running deviation change rate ec(k) of the vehicle are given by formula (9),
where κ, obtained by camera calibration, represents the actual length represented by each pixel;
the intersection of the horizontal line at height y' with the waterline is selected as the monitoring point (X_p, y') of the predicted position of the vehicle, yielding the predicted running deviation e_p(k) and the predicted deviation change rate ec_p(k) of formula (10),
where W is the width of the edge image E;
S4 specifically comprises the following steps:
S401, according to the predicted running deviation e_p(k) and the predicted deviation change rate ec_p(k) of the vehicle, calculating the feedforward control quantity u_p(k) of the vehicle by formula (11),
where u_p(k) is the calculated feedforward control quantity and the two remaining coefficients in formula (11) are the two parameters of the predictive control;
S402, in the nonlinearly divided regions of the e-ec plane, obtaining the incremental nonlinear PID control law u(k) of the vehicle based on the operation deviation e(k) and the operation deviation change rate ec(k);
S402 specifically includes the following steps: constructing a non-uniform division method of the error e and the error change rate ec based on a Gaussian function, wherein the e-ec plane takes the error e as the horizontal axis and the error change rate ec as the vertical axis; the error e and the error change rate ec refer to the running deviation e(k) and the running deviation change rate ec(k) of the vehicle; the e-ec plane is divided with the nonlinear division method of formula (12),
in which, when the division of the error change rate ec is calculated, X_max is EC_max, and when the division of the error e is calculated, X_max is E_max; E_max and EC_max are, respectively, the maxima of the absolute values of the error e and of the error change rate ec; x_i ∈ {x_1, x_2, …, x_n} are the equal division points of the error e or the error change rate ec, x̂_i are the non-uniform division points after mapping, and τ is the nonlinearity degree adjustment factor;
after the e-ec plane is divided non-uniformly, the set of divided regions is denoted {R_ij};
within each divided region R_ij, PID control is performed, with the nonlinear PID control increment Δu_ij(k) given by formula (13),
where Δu_ij(k) denotes the nonlinear PID control increment of region R_ij at time k; k denotes the sampling instant; k_p^ij denotes the proportional coefficient within region R_ij, where p denotes proportional, i denotes the row and j denotes the column; k_i^ij denotes the integral coefficient within region R_ij, and k_d^ij denotes the differential coefficient within region R_ij;
based on the nonlinear PID control increments Δu_ij(k) of all regions R_ij, the weighted average nonlinear PID control increment is calculated by formula (14),
where w_ij is the incremental control law weight of region R_ij, r_ej and r_eci are the radii of region R_ij in the error e and in the error change rate ec, and (e_j, ec_i) is the center of region R_ij;
and calculating an incremental nonlinear PID control law u (k) at the moment k as follows:
u(k)=u(k-1)+ξ(e(k),ec(k))·Δu(k) (15)
where ξ(e(k), ec(k)) > 0 is the incremental factor, defined by an exponential function of e(k) and ec(k),
in which ξ_max > 0 is the maximum increment factor and ξ_0 > 0 is an offset; the exponential term describes the degree of deviation of the error e and the error change rate ec from the origin, and s > 0 is a scaling factor;
the method can enable different e and ec to have different increment factors.
2. The on-line optimizing control method for the unmanned line marking vehicle based on machine vision navigation as claimed in claim 1,
the step S2 specifically includes the following steps:
S201, performing gray stretching on the pixel points (m, n) of the waterline image, with the stretching formula as in formula (1),
where λ > 1 is the stretching factor; m, n denote the m-th row and the n-th column of the image; I(m,n) denotes the gray value of the pixel point at row m, column n; after the gray level of the waterline image is stretched, Q HARR-like features are selected;
S202, the i-th HARR-like feature h_i is calculated as in formula (2),
where the two pixel sums in formula (2) are, respectively, the pixel sum of the waterline region and the pixel sum of the road-surface region of the i-th HARR-like feature, both computed from the gray values I(m,n) of the stretched gray image; after the HARR-like feature description of each pixel point is obtained, each HARR-like feature is normalized, with the normalization formula as in formula (3),
where i = 1, 2, …, Q, ĥ_i is the normalized HARR feature, and mn and sqmn are, respectively, the means of the gray values and of the squared gray values within the detection window;
constructing the feature vector F(m,n) of the pixel point (m,n) based on the normalized HARR-like features (formula (4));
S203, identifying the feature vector F(m,n) of each pixel point (m,n), judging whether the feature vector of the pixel point (m,n) accords with the straight-line feature; if so, setting the pixel point to 1, otherwise to 0, realizing binarization of the gray image and obtaining a binary image B;
performing dilation-erosion treatment on the binary image B to obtain a new binary image Bn, and obtaining the edge image E of the waterline with formula (6):
E(m,n)=Bn(m,n)-Bn(m,n-1) (6)
bn (m, n) represents the pixel value of the mth row and nth column of the new binary image Bn; e (m, n) represents a pixel value of an mth row and an nth column of the edge image E;
S204, identifying a straight line in the edge image E by adopting the HOUGH transform method:
finding out the coordinates (x, y) of the pixels with pixel value 1 from the new binary image Bn, obtaining θ and ρ of each straight line through the HOUGH transform, and obtaining the straight-line expression corresponding to θ and ρ through formula (7);
x=ρ/cosθ-y tanθ (7)
wherein θ and ρ are respectively the angle and the radius detected by the HOUGH transform;
a pair (θ, ρ) represents a straight line; given a pair (θ, ρ), if (x, y) satisfies formula (7), then (x, y) is on the straight line represented by formula (7);
S205, eliminating interference straight lines;
the interference-line removal method specifically comprises the following steps:
S205-1: selecting a plurality of straight lines with the longest length, wherein the length of the straight lines meets the threshold length;
S205-2: the straight-line parameters θ, ρ detected in two consecutive frames of the binary image Bn meet the requirement of formula (8),
wherein Δ = [θ_k - θ_{k-1}, ρ_k - ρ_{k-1}]^T, α and β are, respectively, the maximum deviation angle and radius allowed for θ and ρ, δ is the threshold allowed for continuity, and k is the sampling instant of the image;
S205-3: if more than one straight line still meets the requirements after S205-1 and S205-2, selecting the straight line with the highest pixel mean value along it as the detected waterline.
3. The on-line optimizing control method for the unmanned line marking vehicle based on machine vision navigation as claimed in claim 1,
the step S5 specifically includes the following steps:
a number of pairs (e(k), ec(k)) fall within a divided region R_ij; the parameters k_p^ij, k_i^ij, k_d^ij of the nonlinear PID control increment Δu_ij(k) within region R_ij are then learned online;
the supervised Hebb learning rule is used to learn the parameters of the nonlinear PID control increment Δu_ij(k) (formula (16)),
wherein η_ij is the learning rate within region R_ij; the learning rate η_ij is adjusted based on the online adjustment rule η_ij(k):
η_ij(k) = ψ_ij·(1-exp(-υ_ij·ec²(k)-v_ij·e²(k)))   (17)
wherein ψ_ij is the matching coefficient within region R_ij, which adjusts the learning rate range, and υ_ij, v_ij are two weight coefficients within region R_ij.
4. The unmanned line marking vehicle online optimization control system based on machine vision navigation is characterized by comprising a vision sensor, a waterline detection unit, an operation error prediction unit, a prediction controller, a nonlinear increment PID controller and an online learning rule unit;
a vision sensor collects a water line image of a road surface;
the waterline detection unit performs waterline detection based on the waterline image to obtain a waterline;
the operation error prediction unit selects a monitoring point of the current position of the vehicle based on the detected waterline, calculates the operation deviation e(k) and the operation deviation change rate ec(k) of the vehicle, selects a monitoring point of the predicted position of the vehicle, and calculates the predicted operation deviation e_p(k) and the predicted deviation change rate ec_p(k) of the vehicle;
the prediction controller calculates the feedforward control quantity u_p(k) from the predicted operation deviation e_p(k) and the predicted deviation change rate ec_p(k) of the vehicle;
The method comprises the steps that a non-linear increment PID controller obtains an incremental non-linear PID control law u (k) of a vehicle in a non-linear division area of an e-ec plane based on the operation deviation e (k) and the operation deviation change rate ec (k) of the vehicle;
on-line learning rule unit for nonlinear PID control increment delta u in region ij (k) Learning the parameters;
the specific working process of the operation error prediction unit comprises the following steps: the height and the width of the edge image E where the waterline is located are H and W respectively; the intersection point of the horizontal line at height y with the waterline is selected as the monitoring point (X_c, y) of the current position of the vehicle, and the operation deviation e(k) and the operation deviation change rate ec(k) of the vehicle are calculated by formula (9),
where κ, obtained by camera calibration, represents the actual length represented by each pixel;
the intersection point of the horizontal line at height y' with the waterline is selected as the monitoring point (X_p, y') of the predicted position of the vehicle, yielding the predicted operation deviation e_p(k) and the predicted deviation change rate ec_p(k) of formula (10);
the feedforward control quantity u_p(k) of the vehicle predicted by the prediction controller is given by formula (11),
where u_p(k) is the feedforward control quantity of the prediction controller and the two remaining coefficients in formula (11) are the two parameters of the prediction controller;
the working process of the nonlinear incremental PID controller specifically comprises the following steps:
constructing a non-uniform division method of the error e and the error change rate ec based on a Gaussian function, wherein the e-ec plane takes the error e as the horizontal axis and the error change rate ec as the vertical axis; the error and the error change rate refer to the operation deviation e(k) and the operation deviation change rate ec(k) of the vehicle;
the e-ec plane is divided with the nonlinear division method of formula (12),
in which, when the division of the error change rate ec is calculated, X_max is EC_max, and when the division of the error e is calculated, X_max is E_max; E_max and EC_max are, respectively, the maxima of the absolute values of the error e and of the error change rate ec; x_i ∈ {x_1, x_2, …, x_n} are the equal uniform division points of the error e or the error change rate ec, x̂_i are the non-uniform division points after mapping, and τ is the nonlinearity degree adjustment factor;
according to the ranges of the error e and the error change rate ec, the e-ec plane is divided non-uniformly, and the set of divided regions is denoted {R_ij};
within each divided region R_ij, the nonlinear PID control increment Δu_ij(k) is given by formula (13),
where Δu_ij(k) denotes the nonlinear PID control increment of region R_ij at time k; k denotes the sampling instant; k_p^ij denotes the proportional coefficient within region R_ij, where p denotes proportional, i denotes the row and j denotes the column; k_i^ij denotes the integral coefficient within region R_ij, and k_d^ij denotes the differential coefficient within region R_ij;
based on the nonlinear PID control increments Δu_ij(k) of all regions R_ij, the weighted average nonlinear PID control increment is calculated by formula (14),
where w_ij is the incremental control law weight of region R_ij, r_ej and r_eci are the radii of region R_ij in the error e and in the error change rate ec, and (e_j, ec_i) is the center of region R_ij;
based on the incremental control law weights w_ij, the incremental nonlinear PID control law u(k) at time k is calculated as:
u(k)=u(k-1)+ξ(e(k),ec(k))·Δu(k) (15)
where ξ(e(k), ec(k)) > 0 is an incremental factor, defined by an exponential function of e(k) and ec(k),
in which ξ_max > 0 is the maximum increment factor and ξ_0 > 0 is an offset; the exponential term describes the degree of deviation of the error e and the error change rate ec from the origin, and s > 0 is the scaling factor.
5. The on-line optimizing control system of the unmanned line marking vehicle based on machine vision navigation as claimed in claim 4,
the working process of the waterline detection unit specifically comprises the following steps:
S201, performing gray stretching on the pixel points I(m,n) of the waterline image, with the stretching formula as in formula (1),
where λ > 1 is the stretching factor; m, n denote the m-th row and the n-th column of the image; I(m,n) denotes the gray value of the pixel point at row m, column n; after the gray level of the waterline image is stretched, Q HARR-like features are selected to express the straight-line feature of each pixel point;
S202, the i-th HARR-like feature h_i is calculated as in formula (2),
where the two pixel sums in formula (2) are, respectively, the pixel sum of the waterline region and the pixel sum of the road-surface region of the i-th HARR-like feature, both computed from the gray values I(m,n) of the stretched gray image;
after the HARR-like feature description of each pixel point is obtained, each HARR-like feature is normalized, with the normalization formula as in formula (3),
where ĥ_i is the i-th normalized HARR feature, and mn and sqmn are, respectively, the means of the gray values and of the squared gray values within the detection window;
constructing the feature vector F(m,n) of the pixel point (m,n) based on the normalized HARR-like features (formula (4));
S203, identifying the feature vector F(m,n) of each pixel point (m,n), judging whether the feature vector of the pixel point (m,n) accords with the straight-line feature; if so, setting the pixel point to 1, otherwise to 0, realizing binarization of the gray image and thus obtaining a binary image B;
performing dilation-erosion treatment on the binary image B to obtain a new binary image Bn, and obtaining the edge image E of the waterline with formula (6),
E(m,n)=Bn(m,n)-Bn(m,n-1) (6)
bn (m, n) represents the pixel value of the mth row and nth column of the new binary image Bn; e (m, n) represents the pixel value of the m-th row, n-th column of the edge image E;
S204, identifying a straight line in the edge image E by adopting the HOUGH transform method:
finding out the coordinates (x, y) of the pixels with pixel value 1 from the new binary image Bn, obtaining θ and ρ through the HOUGH transform, and obtaining the straight-line expression corresponding to θ and ρ through formula (7);
x=ρ/cosθ-ytanθ (7)
wherein θ and ρ are respectively the angle and the radius detected by the HOUGH transform;
a set of θ, ρ represents a straight line, and given a set of θ, ρ, if (x, y) satisfies formula (7), it means that (x, y) is on the straight line represented by formula (7);
S205, eliminating interference straight lines;
the interference-line removal method specifically comprises the following steps:
S205-1, selecting a plurality of straight lines with the longest length, the length meeting the threshold length;
S205-2, the straight-line parameters θ, ρ detected in two consecutive frames of the binary image Bn satisfy formula (8),
wherein Δ = [θ_k - θ_{k-1}, ρ_k - ρ_{k-1}]^T, α and β are, respectively, the maximum deviation angle and radius allowed for θ and ρ, δ is the threshold allowed for continuity, and k is the sampling instant of the image;
S205-3, if 2 or more straight lines still remain after S205-1 and S205-2, selecting the straight line with the highest pixel mean value along it as the detected waterline.
6. The on-line optimizing control system of the unmanned line marking vehicle based on machine vision navigation as claimed in claim 4,
the working process of the online learning rule unit specifically comprises the following steps:
a number of pairs (e(k), ec(k)) fall within a divided region R_ij; the parameters k_p^ij, k_i^ij, k_d^ij of the nonlinear PID control increment Δu_ij(k) within region R_ij are then learned online;
the supervised Hebb learning rule is used to learn the parameters of the nonlinear PID control increment Δu_ij(k) (formula (16)),
wherein η_ij is the learning rate within region R_ij; the learning rate η_ij is adjusted based on the online adjustment rule η_ij(k):
η_ij(k) = ψ_ij·(1-exp(-υ_ij·ec²(k)-v_ij·e²(k)))   (17)
wherein ψ_ij is the matching coefficient within region R_ij, which adjusts the learning rate range, and υ_ij, v_ij are two weight coefficients within region R_ij.
CN202211451943.7A 2022-11-21 2022-11-21 Online optimization control method and system for unmanned line marking vehicle based on machine vision navigation Active CN115509122B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211451943.7A CN115509122B (en) 2022-11-21 2022-11-21 Online optimization control method and system for unmanned line marking vehicle based on machine vision navigation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211451943.7A CN115509122B (en) 2022-11-21 2022-11-21 Online optimization control method and system for unmanned line marking vehicle based on machine vision navigation

Publications (2)

Publication Number Publication Date
CN115509122A CN115509122A (en) 2022-12-23
CN115509122B true CN115509122B (en) 2023-03-21

Family

ID=84513924

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211451943.7A Active CN115509122B (en) 2022-11-21 2022-11-21 Online optimization control method and system for unmanned line marking vehicle based on machine vision navigation

Country Status (1)

Country Link
CN (1) CN115509122B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115760850B (en) * 2023-01-05 2023-05-26 长江勘测规划设计研究有限责任公司 Method for recognizing water level without scale by utilizing machine vision

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113960921A (en) * 2021-10-19 2022-01-21 华南农业大学 Visual navigation control method and system for orchard tracked vehicle

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011032208A1 (en) * 2009-09-15 2011-03-24 The University Of Sydney A system and method for autonomous navigation of a tracked or skid-steer vehicle
AU2013332262B2 (en) * 2012-10-17 2017-08-03 WATSON, Diane Lee Vehicle for line marking
CN106527119B (en) * 2016-11-03 2019-07-23 东华大学 Derivative-precedence PID system based on fuzzy control
CN109176519A (en) * 2018-09-14 2019-01-11 北京遥感设备研究所 A method of improving the Robot Visual Servoing control response time
CN110398979B (en) * 2019-06-25 2022-03-04 天津大学 Unmanned engineering operation equipment tracking method and device based on vision and attitude fusion
AU2020104234A4 (en) * 2020-12-22 2021-03-11 Qingdao Agriculture University An Estimation Method and Estimator for Sideslip Angle of Straight-line Navigation of Agricultural Machinery
CN112706835B (en) * 2021-01-07 2022-04-19 济南北方交通工程咨询监理有限公司 Expressway unmanned marking method based on image navigation
CN113296518A (en) * 2021-05-25 2021-08-24 山东交通学院 Unmanned driving system and method for formation of in-place heat regeneration unit
CN114942641A (en) * 2022-06-06 2022-08-26 仲恺农业工程学院 Road bridge autonomous walking marking system controlled by multiple sensor data fusion stereoscopic vision
CN115082701B (en) * 2022-08-16 2022-11-08 山东高速集团有限公司创新研究院 Multi-water-line cross identification positioning method based on double cameras

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113960921A (en) * 2021-10-19 2022-01-21 华南农业大学 Visual navigation control method and system for orchard tracked vehicle

Also Published As

Publication number Publication date
CN115509122A (en) 2022-12-23

Similar Documents

Publication Publication Date Title
CN113221905B (en) Semantic segmentation unsupervised domain adaptation method, device and system based on uniform clustering and storage medium
CN109375235B (en) Inland ship freeboard detection method based on deep reinforcement neural network
CN115509122B (en) Online optimization control method and system for unmanned line marking vehicle based on machine vision navigation
Li et al. Model-based online learning with kernels
CN103886325B (en) Cyclic matrix video tracking method with partition
CN112348849A (en) Twin network video target tracking method and device
CN108614994A (en) A kind of Human Head Region Image Segment extracting method and device based on deep learning
CN111860439A (en) Unmanned aerial vehicle inspection image defect detection method, system and equipment
CN112818873B (en) Lane line detection method and system and electronic equipment
CN110889332A (en) Lie detection method based on micro expression in interview
CN112298194B (en) Lane changing control method and device for vehicle
CN111325711A (en) Chromosome split-phase image quality evaluation method based on deep learning
CN112016463A (en) Deep learning-based lane line detection method
CN112184655A (en) Wide and thick plate contour detection method based on convolutional neural network
CN113516853B (en) Multi-lane traffic flow detection method for complex monitoring scene
Jiang et al. Dfnet: Semantic segmentation on panoramic images with dynamic loss weights and residual fusion block
CN113989613A (en) Light-weight high-precision ship target detection method coping with complex environment
CN111259827A (en) Automatic detection method and device for water surface floating objects for urban river supervision
CN107766798A (en) A kind of Remote Sensing Target detection method based on cloud computing storage and deep learning
CN112802005A (en) Automobile surface scratch detection method based on improved Mask RCNN
CN114581486A (en) Template updating target tracking algorithm based on full convolution twin network multilayer characteristics
CN116630748A (en) Rare earth electrolytic tank state multi-parameter monitoring method based on fused salt image characteristics
CN116476863A (en) Automatic driving transverse and longitudinal integrated decision-making method based on deep reinforcement learning
CN116977902B (en) Target tracking method and system for on-board photoelectric stabilized platform of coastal defense
CN102592125A (en) Moving object detection method based on standard deviation characteristic

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant