CN108133471B - Robot navigation path extraction method and device based on artificial bee colony algorithm - Google Patents


Info

Publication number
CN108133471B
CN108133471B (application CN201611076483.9A)
Authority
CN
China
Prior art keywords
crop row
crop
image
center line
points
Prior art date
Legal status
Active
Application number
CN201611076483.9A
Other languages
Chinese (zh)
Other versions
CN108133471A (en)
Inventor
孟庆宽
孙文彬
王继广
Current Assignee
Tianjin Vocational And Technical Normal University
Original Assignee
Tianjin Vocational And Technical Normal University
Priority date
Filing date
Publication date
Application filed by Tianjin Vocational And Technical Normal University filed Critical Tianjin Vocational And Technical Normal University
Priority to CN201611076483.9A priority Critical patent/CN108133471B/en
Publication of CN108133471A publication Critical patent/CN108133471A/en
Application granted granted Critical
Publication of CN108133471B publication Critical patent/CN108133471B/en


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 — Image acquisition modality
    • G06T 2207/10004 — Still image; Photographic image
    • G06T 2207/30 — Subject of image; Context of image processing
    • G06T 2207/30241 — Trajectory

Landscapes

  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a method and device for extracting the navigation path of an agricultural mobile robot under natural illumination conditions, based on an artificial bee colony algorithm. The method comprises the following steps: collecting images of farmland crops; graying the crop images; segmenting the processed gray image into a binary image; performing gray vertical projection on the top and bottom of the binary image to obtain the crop row region range; detecting crop row feature points within the crop row region range by a vertical projection method based on a moving window; establishing a crop row center line solving model according to the characteristics of the crop rows in the image, and searching the crop row feature points within the crop row region range with an artificial bee colony algorithm to determine the crop row center lines; and determining the navigation path between two crop rows from the center lines of the two adjacent crop rows closest to the image center line. The method improves the detection precision and speed of the navigation path and its adaptability to natural illumination, and overcomes the low accuracy, poor real-time performance and sensitivity to illumination change of traditional navigation path extraction methods for agricultural mobile robots.

Description

Robot navigation path extraction method and device based on artificial bee colony algorithm
Technical Field
The invention relates to the technical field of navigation of agricultural mobile robots, in particular to a method and a device for extracting a navigation path of an agricultural mobile robot based on an artificial bee colony algorithm under a natural illumination condition.
Background
Using agricultural mobile robots to complete automated field navigation operations can effectively improve farmland operation efficiency, reduce production costs, and keep workers out of harsh environments such as high temperature, high humidity and hazardous conditions. At present, research on agricultural mobile robot navigation focuses mainly on two modes: machine vision and satellite navigation (GNSS). Machine-vision-based mobile robots offer high working efficiency, good real-time performance and low system cost, and have become a hot spot in precision agriculture research at home and abroad. Navigation path identification is the prerequisite for autonomous navigation of an agricultural mobile robot; extracting the crop row center line and navigation path quickly and accurately can effectively improve the robot's working efficiency and operation precision. For early-stage crops, because mechanized sowing leaves the crop rows approximately parallel and crop growth along a row is continuous, each row presents an approximately straight overall trend, so the crop row center line, and hence the navigation path, can be extracted by processing farmland crop images.
the farmland environment is complicated changeable, has unstructured and open characteristics, and the main factors that influence the discernment of agricultural mobile robot vision navigation route include: natural lighting variations and the performance of the path recognition algorithm itself. For the problem of natural illumination, some students propose a navigation information acquisition method based on illumination stability and an illumination-independent graph so as to reduce the influence of illumination change on navigation path extraction. Some scholars select 2G-R-B color factors, H components and Cr components to perform graying processing on an image in RGB, HIS and YCrCb color spaces respectively so as to improve the adaptability of image segmentation to illumination change. However, R, G, B three components are coupled to each other and change with the change of illumination intensity, so the 2G-R-B color factor has low adaptability to the illumination change; the H component and the RGB color space are converted into nonlinearity, so that image distortion is easily caused, and an error is generated on an image processing result; the YCrCb color space lacks a description of the green component and is not suitable for processing farm green crop images. In the aspect of crop row center line (straight line) fitting method selection, some scholars adopt Hough transformation to identify crop rows, the Hough transformation has strong robustness, but the calculation is complex, and the detection precision is limited. Some scholars extract the crop row straight line by using a least square method, and the method has high detection speed, but the least square method is sensitive to strong noise points and poor in robustness. 
More recently, researchers have proposed crop row line extraction algorithms based on random methods, which have low computational complexity and good real-time performance, but whose line detection precision depends heavily on the selection of the random points. Others extract the crop skeleton with a maximum-square method and obtain the crop row line by line fitting; since this algorithm must search for the maximum square at every target pixel, its computational cost and running time increase, making it difficult to meet the real-time requirements of a high-speed navigation operation system.
Disclosure of Invention
The invention grays the crop image with a normalized color factor to reduce the influence of illumination change on navigation path identification; by establishing a crop row center line solving model and using the artificial bee colony algorithm to extract the crop row lines and the navigation line, it improves the accuracy and speed of navigation path recognition.
Technical problem to be solved
The invention aims to solve the technical problem of how to extract the navigation path under the natural illumination condition, and improve the accuracy, real-time property and adaptability of the navigation path identification of the agricultural mobile robot.
(II) technical scheme
In order to solve the technical problem, in a first aspect, the invention provides an agricultural mobile robot navigation path extraction method based on an artificial bee colony algorithm under natural illumination conditions, the method comprising the following steps:
s01, collecting crop images; the camera forms an included angle of 60-70 degrees with the horizontal direction, and the vertical height from the ground is about 1.2-1.4 m;
s02, use of
Figure 153831DEST_PATH_IMAGE001
Carrying out gray processing on the collected crop image by the factor, and converting the color crop image into a gray image;
s03, performing image segmentation by adopting a maximum inter-class variance method, and converting the gray level image into a binary image;
s04, obtaining edge position information of the crop row at the top edge and the bottom edge of the image by adopting a vertical projection method, and forming a crop row area range by connecting edge points through straight lines;
s05, extracting the characteristic points of the crop rows in the crop row area range by using a vertical projection method based on a moving window;
s06, establishing a crop row center line solving model according to the characteristics of crop rows in the image, and extracting crop row center lines in the crop row area range through an artificial bee colony algorithm;
and S07, determining a navigation path between the two crop row central lines according to the two crop row central lines closest to the image central line.
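Step S07 reduces to a simple geometric operation: once the two crop row center lines nearest the image center line are known, the navigation path can be taken as their midline. A minimal illustrative sketch (the function and its endpoint representation are assumptions, not from the patent):

```python
def navigation_path(left_line, right_line):
    """Midline between two crop-row center lines, each given as a pair of
    endpoints ((x_top, y_top), (x_bottom, y_bottom)) in image coordinates."""
    (lx1, ly1), (lx2, ly2) = left_line
    (rx1, ry1), (rx2, ry2) = right_line
    top = ((lx1 + rx1) / 2.0, (ly1 + ry1) / 2.0)      # midpoint at image top
    bottom = ((lx2 + rx2) / 2.0, (ly2 + ry2) / 2.0)   # midpoint at image bottom
    return top, bottom

# e.g. two vertical rows at x = 100 and x = 300 in a 480-row image
path = navigation_path(((100, 0), (100, 480)), ((300, 0), (300, 480)))
```

For approximately parallel rows this midline is the lane the robot should track.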
Preferably, the step S02 specifically includes the following steps:
(1) converting the crop image from the RGB color space to the YCrCb color space; per the standard ITU-R BT.601 conversion:

Y = 0.299R + 0.587G + 0.114B
Cr = 0.500R - 0.419G - 0.081B
Cb = -0.169R - 0.331G + 0.500B    (1)

(2) constructing an illumination-independent Cg component on the basis of the YCrCb color space, where Cg corresponds to the difference between the green signal and the luminance:

Cg = G - Y    (2)

(3) processing Cg according to ITU-R BT.601-6 to obtain formula (3) [given in the patent as an equation image, not reproduced];
(4) normalizing the Cr, Cb and Cg components respectively to obtain the normalized components [formulas (4)-(6), given as equation images];
(5) graying the color image with the normalized factor.
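The patent's exact normalized factor is given only as an equation image, so the following sketch is an approximation under stated assumptions: it grays the image with the plain Cg = G - Y component (ITU-R BT.601 luminance) followed by min-max normalization, which preserves the green-versus-background contrast the method relies on:

```python
import numpy as np

def cg_gray(img_rgb):
    """Gray an RGB image via the illumination-robust Cg = G - Y component.
    Approximation only: the patent's normalized factor is not reproduced."""
    rgb = img_rgb.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b   # BT.601 luminance
    cg = g - y                              # green minus luminance
    rng = cg.max() - cg.min()               # min-max normalize to [0, 255]
    if rng == 0:
        rng = 1.0
    return np.round((cg - cg.min()) / rng * 255).astype(np.uint8)
```

Green vegetation pixels map bright and soil pixels dark, so a subsequent Otsu segmentation can separate crop from background.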
Preferably, the step S06 specifically includes the following steps:
(1) establishing a crop row center line solving model according to the characteristics of the crop rows in the image: a farmland crop row is approximately straight in shape, so its line equation can be determined from two feature points. Let V denote the crop row feature point data space obtained in step S05, and let P1(x1, y1) and P2(x2, y2) be two points in V; the crop row center line equation can then be expressed as:

(y - y1)(x2 - x1) = (x - x1)(y2 - y1)    (7)

The number of feature points within distance d of the line is counted and used as the standard for evaluating the line; by adjusting P1 and P2, the line containing the most feature points is selected as the crop row center line at that position, where d takes values in the range (1, 5);
(2) dividing the crop row feature points obtained in step S05 into upper and lower parts, with 1/2 of the image height as the boundary; taking the feature points of the upper half as candidate starting points of the crop row center line and those of the lower half as candidate end points, and establishing arrays to store the candidate starting points and end points respectively;
(3) according to the crop row center line solving model, randomly selecting 1 candidate starting point and 1 candidate end point to form a honey source of the artificial bee colony algorithm, representing one candidate crop row center line. Multiple honey sources are initialized to form multiple candidate crop row lines; the number of feature points within a given distance of each candidate line is counted and used as the fitness function for evaluating it, and after multiple search rounds of the artificial bee colony algorithm the candidate line with the maximum fitness is selected as the crop row center line.
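The fitness evaluation and search described above can be sketched as a simplified artificial-bee-colony-style loop. This is an illustrative reduction (function names, the mutation rule and the abandonment limit are assumptions): each food source is one (start, end) point pair, its fitness is the number of feature points within distance d of the line, and a source that stops improving is abandoned and re-scouted at random:

```python
import math
import random

def inliers(p1, p2, points, d=3.0):
    """Fitness: number of feature points within distance d of line p1-p2."""
    (x1, y1), (x2, y2) = p1, p2
    length = math.hypot(x2 - x1, y2 - y1) or 1e-9
    return sum(abs((x2 - x1) * (y1 - y) - (x1 - x) * (y2 - y1)) / length <= d
               for x, y in points)

def abc_centerline(starts, ends, points, n_sources=10, n_iter=50, limit=5, seed=0):
    """Search (start, end) pairs for the line covering the most feature points."""
    rng = random.Random(seed)
    new = lambda: (rng.choice(starts), rng.choice(ends))
    sources = [new() for _ in range(n_sources)]
    trials = [0] * n_sources
    best = max(sources, key=lambda s: inliers(s[0], s[1], points))
    for _ in range(n_iter):
        for i, (s, e) in enumerate(sources):
            # employed-bee step: perturb one endpoint of the food source
            cand = (rng.choice(starts), e) if rng.random() < 0.5 else (s, rng.choice(ends))
            if inliers(cand[0], cand[1], points) > inliers(s, e, points):
                sources[i], trials[i] = cand, 0
            else:
                trials[i] += 1
            if trials[i] > limit:          # scout step: abandon a stale source
                sources[i], trials[i] = new(), 0
        best = max(sources + [best], key=lambda s: inliers(s[0], s[1], points))
    return best
```

The full algorithm also distinguishes onlooker bees that pick sources with probability proportional to fitness; the sketch keeps only the improve-or-abandon core.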
The invention also provides an agricultural mobile robot navigation path extraction device based on the artificial bee colony algorithm under natural illumination conditions, the device comprising:
(1) an image graying processing module, which converts the color crop image into a gray image using the normalized color factor;
(2) the image segmentation module is used for converting the gray level image into a binary image by a maximum inter-class variance method;
(3) the crop row region range determining module is used for determining the crop row region range through a vertical projection method according to the binary image;
(4) the crop row characteristic point detection module is used for extracting crop row characteristic points in the crop row region range through a vertical projection method based on a moving window according to the crop row region range;
(5) the crop row center line extraction module is used for establishing a crop row center line solving model according to the characteristics of crop rows in the image and extracting the crop row center lines in the range of the crop row area through an artificial bee colony algorithm;
(6) and the navigation path determining module is used for determining a navigation path between the two crop row center lines according to the two crop row center lines closest to the image center line.
Preferably, the image graying processing module includes:
(1) converting the crop image from the RGB color space to the YCrCb color space; per the standard ITU-R BT.601 conversion:

Y = 0.299R + 0.587G + 0.114B
Cr = 0.500R - 0.419G - 0.081B
Cb = -0.169R - 0.331G + 0.500B    (8)

(2) constructing an illumination-independent Cg component on the basis of the YCrCb color space, where Cg corresponds to the difference between the green signal and the luminance:

Cg = G - Y    (9)

(3) processing Cg according to ITU-R BT.601-6 to obtain formula (10) [given in the patent as an equation image, not reproduced];
(4) normalizing the Cg, Cr and Cb components respectively to obtain the normalized components [formulas (11)-(13), given as equation images];
(5) graying the color crop image with the normalized factor.
Preferably, the crop row centerline extraction module comprises:
(1) establishing a crop row center line solving model according to the characteristics of the crop rows in the image: a farmland crop row is approximately straight in shape, so its line equation can be determined from two feature points. Let V denote the crop row feature point data space obtained by the crop row feature point detection module, and let P1(x1, y1) and P2(x2, y2) be two points in V; the crop row center line equation can then be expressed as:

(y - y1)(x2 - x1) = (x - x1)(y2 - y1)    (14)

The number of feature points within distance d of the line is counted and used as the standard for evaluating the line; by adjusting P1 and P2, the line containing the most feature points is selected as the crop row center line at that position, where d takes values in the range (1, 5);
(2) dividing the crop row feature points obtained by the crop row feature point detection module into upper and lower parts, with 1/2 of the image height as the boundary; taking the feature points of the upper half as candidate starting points of the crop row center line and those of the lower half as candidate end points, and establishing arrays to store the candidate starting points and end points respectively;
(3) according to the crop row center line solving model, randomly selecting 1 candidate starting point and 1 candidate end point to form a honey source of the artificial bee colony algorithm, representing one candidate crop row center line. Multiple honey sources are initialized to form multiple candidate crop row lines; the number of feature points within a given distance of each candidate line is counted and used as the fitness function for evaluating it, and after multiple search rounds of the artificial bee colony algorithm the candidate line with the maximum fitness is selected as the crop row center line.
(III) advantageous effects
Aiming at the shortcomings of existing navigation path identification technology for agricultural mobile robots, the invention provides a method and device for extracting the navigation path of an agricultural mobile robot under natural illumination conditions, based on an artificial bee colony algorithm. The collected farmland crop image is grayed, and the resulting gray image is segmented into a binary image; edge position information of the crop rows at the top and bottom edges of the image is obtained by a vertical projection method, and the edge points are connected by straight lines to form the crop row region range; crop row feature points within that range are extracted by a vertical projection method based on a moving window, and according to the crop row center line solving model, the crop row center lines are extracted within the region range by the artificial bee colony algorithm, from which the navigation path is obtained. The method reduces the influence of illumination change on navigation path identification, improves the reliability of navigation path extraction, and overcomes the poor real-time performance, low accuracy and illumination sensitivity of existing navigation path extraction for agricultural mobile robots.
Drawings
Fig. 1 is a schematic flow chart of an agricultural mobile robot navigation path extraction method based on an artificial bee colony algorithm under a natural light condition according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of a crop row image according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of an image after crop row image gray processing according to an embodiment of the present invention.
Fig. 4 is a schematic diagram of a binary image obtained by segmenting a grayscale image according to an embodiment of the present invention.
Fig. 5 is a schematic diagram of a crop row gray scale vertical projection according to an embodiment of the present invention.
Fig. 6 is a schematic diagram of detecting positions of crop rows nearest to the left and right sides of the center line of the image at the top edge and the bottom edge of the image according to an embodiment of the present invention.
Fig. 7 is a schematic diagram of crop row area range detection according to an embodiment of the present invention.
Fig. 8 is a schematic diagram illustrating detection of characteristic points of crop rows within a crop row region according to an embodiment of the present invention.
Fig. 9 is a schematic diagram of a crop row centerline solution model feature according to an embodiment of the present invention.
Fig. 10 is a schematic diagram of crop row center line and navigation path extraction according to an embodiment of the present invention.
Fig. 11 is a schematic structural diagram of an agricultural mobile robot navigation path extraction device based on an artificial bee colony algorithm under a natural light condition according to an embodiment of the present invention.
Detailed Description
The following embodiments are intended only to illustrate the technical solution of the invention more clearly and do not limit its scope; they use corn images captured under natural illumination during the cultivation stage as the example for detailed description.
Fig. 1 shows a schematic flow chart of an agricultural mobile robot navigation path extraction method based on an artificial bee colony algorithm under natural light conditions according to an embodiment of the present invention, and as shown in fig. 1, the method includes the following specific steps.
And S01, collecting crop images. The height and the angle of a camera arranged on the agricultural mobile robot are adjusted to form an included angle of 60-70 degrees between the camera and the horizontal direction, the vertical height from the ground is 1.2-1.4 m, and the obtained image is shown in figure 2.
And S02, graying the crop image. The invention constructs an illumination-independent Cg component on the basis of the YCrCb color model, normalizes the Cg, Cr and Cb components, and grays the color image with the normalized factor, as shown in FIG. 3;
the color image collected by the camera is in RGB format; the crop image is converted from the RGB color space to the YCrCb color space as shown in formula (1) (the standard ITU-R BT.601 conversion):

Y = 0.299R + 0.587G + 0.114B
Cr = 0.500R - 0.419G - 0.081B
Cb = -0.169R - 0.331G + 0.500B    (1)

the Cg component corresponds to the difference between the green signal and the luminance:

Cg = G - Y    (2)

Cg is processed according to the ITU-R BT.601-6 standard to obtain formula (3) [given in the patent as an equation image, not reproduced]; the Cg, Cr and Cb components are then normalized respectively to obtain the normalized components [formulas (4)-(6), given as equation images], and the color image is grayed with the normalized factor.
S03, performing image segmentation by using the maximum inter-class variance method, and converting the grayscale image into a binary image, as shown in fig. 4.
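The maximum inter-class variance (Otsu) method used in S03 can be computed directly from the image histogram. A minimal NumPy sketch (illustrative, not the patent's code):

```python
import numpy as np

def otsu_threshold(gray):
    """Threshold maximizing the inter-class variance (Otsu's method)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()                   # gray-level probabilities
    omega = np.cumsum(p)                    # class-0 (background) weight
    mu = np.cumsum(p * np.arange(256))      # class-0 weighted mean
    mu_t = mu[-1]                           # global mean gray level
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    return int(np.argmax(np.nan_to_num(sigma_b)))

def binarize(gray):
    """0/255 binary image: bright (crop) pixels become white."""
    return np.where(gray > otsu_threshold(gray), 255, 0).astype(np.uint8)
```

On the grayed crop image this yields white (255) crop pixels against a black (0) soil background, as required by the projection step S04.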
And S04, acquiring edge position information of the crop row at the top edge and the bottom edge of the image by adopting a vertical projection method, and connecting edge points through straight lines to form a crop row area range. Usually, a farmland image acquired by a vision sensor comprises a plurality of rows of crops, and the embodiment adopts a vertical projection method based on a binary image to detect two crop row region ranges nearest to the central line of the image;
the detection steps of the crop row region range are as follows:
(1) the binary image contains only two kinds of pixels: white pixels with gray value 255 representing crop information, and black pixels with gray value 0 representing soil background information. In the image coordinate system, the upper left corner of the image is the origin of coordinates, rightward is the positive direction of the horizontal axis, and downward is the positive direction of the vertical axis. Within the projection areas set in the binary image, the sum of the gray values of each column of pixels is calculated, along with the average gray value over all pixels of the projection area, as shown in fig. 5. The projection areas are the upper-half 1/3 area and the lower-half 1/3 area of the binary image;
(2) in the projection area of the upper half 1/3 of the binary image, the projection direction is vertically upward, and the column gray sums are projected to the top edge of the image. From left to right, the gray sum of each column j is compared with the average gray value t: if it reaches t, the column value is set to t; otherwise it is set to 0, where j is an integer greater than or equal to 1 denoting the image column. The value of column j is then compared with that of column j+1 to judge the left and right edge points of the crop row at the top edge of the image: if the value of column j is smaller than that of column j+1, j is a left edge point of the crop row; if it is larger, j is a right edge point; if they are equal, no processing is performed;
(3) in the projection area of the lower half 1/3 of the image, the projection direction is vertically downward, and the column gray sums are projected to the bottom edge of the image. From left to right, the difference between the image height and the gray sum of column j is calculated, along with the difference T between the image height value and the average gray value, and the two are compared in turn: if the column difference is larger than T, the column value is set to the image height value; otherwise it is set to T, where j is an integer greater than or equal to 1 denoting the image column. The value of column j is then compared with that of column j+1 to judge the left and right edge points of the crop row at the bottom edge of the image: if the value of column j is greater than that of column j+1, j is a left edge point of the crop row; if it is smaller, j is a right edge point; if they are equal, no processing is performed;
(4) calculating the difference value between the left edge and the right edge of the crop row, comparing the difference value with a preset crop row width threshold value, if the difference value is greater than the crop row width threshold value, retaining the information of the left edge and the right edge of the crop row, and otherwise, rejecting the information of the left edge and the right edge of the crop row;
(5) according to the information of the left side edge and the right side edge of the crop row, the position areas of the crop row at the top edge and the bottom edge of the image can be determined, and 2 crop row position areas which are located at the left side and the right side of the image center line and are closest to the image center line are found at the top edge and the bottom edge of the image respectively by taking the image center line as a boundary, as shown in fig. 6;
(6) according to the positions of the closest crop rows on the left side and the right side of the central line of the image, connecting the left edge point and the right edge point of the crop row on the same side at the bottom edge of the image with the left edge point and the right edge point of the crop row on the top edge of the image respectively by using straight lines to form a crop row area range, as shown in fig. 7;
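Connecting the top-edge and bottom-edge points with straight lines means that, for every image row, the region's left and right bounds follow a linear interpolation between the two detected edge columns. A small sketch (function name and endpoint convention are assumptions):

```python
def region_bounds(top_left, top_right, bottom_left, bottom_right, height):
    """Left/right x-bounds of the crop row region for each image row,
    linearly interpolated from the edge columns detected at the top
    (row 0) and bottom (row height - 1) of the image."""
    bounds = []
    for row in range(height):
        f = row / (height - 1)                        # 0 at top, 1 at bottom
        left = top_left + f * (bottom_left - top_left)
        right = top_right + f * (bottom_right - top_right)
        bounds.append((left, right))
    return bounds

# a row region leaning left: top edges at columns 90-110, bottom at 80-130
bounds = region_bounds(90, 110, 80, 130, height=5)
```

The feature-point search of step S05 can then be restricted to columns between these bounds at each row.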
in particular, for an image of M x N pixels, let M represent the image width and N the image height, let f(i, j) represent the gray value of the pixel at position (i, j), let Sum(j) be the gray sum of the j-th column of pixels, let T be the average gray value of the pixels in the projection area, and let R be the width threshold of the crop row to be detected. Sum(j) and T are expressed as:

Sum(j) = Σ_{i=1}^{H} f(i, j),  (j = 1, 2, 3, ..., M)    (7)

T = (1/M) Σ_{j=1}^{M} Sum(j)    (8)

where M represents the image width and H represents the height of the projection area, i.e. the number of pixels in one vertically projected column;
the position area detection algorithm for the crop rows at the bottom edge of the image is as follows:
1. the projection direction is vertically downward; calculate the column pixel sums S(j) (j = 1, 2, …, M) in the projection area (in this embodiment, the lower 1/3 of the image) and the average gray value T; as shown in fig. 5, the gray curve is the projection curve and the gray horizontal straight line is the average gray value; establish and initialize a two-dimensional array A[m][n] (n = 0, 1) for storing the crop row number and position information, wherein A[m][0] stores the left edge information of the m-th crop row and A[m][1] stores the right edge information of the m-th crop row; initialize the temporary variables m = 1, j = 1;
2. compare S(j) with T from left to right in the projection area: if S(j) >= T, set S(j) = T; otherwise set S(j) = 0;
3. if S(j) < S(j+1), then A[m][0] = j represents a crop row left edge; if S(j) > S(j+1), then A[m][1] = j represents a crop row right edge; if S(j) = S(j+1), it is not a crop row edge and no processing is done;
4. calculate the difference between the right and left edges of the crop row, namely Ds = A[m][1] − A[m][0]; if Ds >= R, the crop row position information is valid and is retained; if Ds < R, the crop row position information is invalid and is deleted;
5. m = m + 1, j = j + 1, and the processes of steps (2)–(5) are repeated; when j > M, i.e. all image columns have been projected from left to right, the search stops and the procedure ends; this is the termination condition for detecting the crop row position areas at the bottom edge of the image;
6. in A[m][n], select the crop row positions closest to the left and right sides of the image center line at the bottom edge of the image, as shown in FIG. 6;
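Steps 2–5 above can be sketched as follows, assuming the projection sums S and average T from equations (7) and (8); the function name and toy curve are illustrative, not the patent's code:

```python
import numpy as np

def detect_row_regions(S, T, R):
    """Find (left, right) column pairs of crop rows from a projection curve.

    S: per-column projection sums, T: their average, R: width threshold."""
    # step 2: clamp the projection curve to the two levels T and 0
    B = np.where(np.asarray(S) >= T, T, 0)
    rows, left = [], None
    # step 3: a 0 -> T transition is a left edge, T -> 0 a right edge
    for j in range(len(B) - 1):
        if B[j] < B[j + 1]:
            left = j
        elif B[j] > B[j + 1] and left is not None:
            # step 4: keep the pair only if it is at least R columns wide
            if j - left >= R:
                rows.append((left, j))
            left = None
    return rows

S = [0, 0, 900, 950, 980, 0, 0, 40, 0]    # toy projection curve
rows = detect_row_regions(S, T=300, R=2)  # -> [(1, 4)]
```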
the position area detection algorithm for the crop rows at the top edge of the image is as follows:
1. the projection direction is vertically upward; calculate the column pixel sums S(j) (j = 1, 2, …, M) in the projection area (in this embodiment, the upper 1/3 of the image) and the average gray value T; as shown in fig. 5, the gray curve is the projection curve and the gray horizontal straight line is the average gray value; establish and initialize a two-dimensional array A[m][n] (n = 0, 1) for storing the crop row number and position information, wherein A[m][0] stores the left edge information of the m-th crop row and A[m][1] stores the right edge information of the m-th crop row; initialize the temporary variables m = 1, j = 1;
2. compare S(j) with T from left to right in the projection area: if S(j) >= T, set S(j) = T; otherwise set S(j) = 0;
3. if S(j) < S(j+1), then A[m][0] = j represents a crop row left edge; if S(j) > S(j+1), then A[m][1] = j represents a crop row right edge; if S(j) = S(j+1), it is not a crop row edge and no processing is done;
4. calculate the difference between the right and left edges of the crop row, namely Ds = A[m][1] − A[m][0]; if Ds >= R, the crop row position information is valid and is retained; if Ds < R, the crop row position information is invalid and is deleted;
5. m = m + 1, j = j + 1, and the processes of steps (2)–(5) are repeated; when j > M, i.e. all image columns have been projected from left to right, the search stops and the procedure ends; this is the termination condition for detecting the crop row position areas at the top edge of the image;
6. in A[m][n], select the crop row positions closest to the left and right sides of the image center line at the top edge of the image, as shown in FIG. 6;
according to the positions of the crop rows closest to the left and right sides of the image center line, the left and right edge points of each crop row at the bottom edge of the image are connected by straight lines with the left and right edge points of the same crop row at the top edge of the image, forming the crop row area range, as shown in fig. 7.
S05, extracting the characteristic points of the crop rows in the crop row area range by using a vertical projection method based on a moving window, as shown in FIG. 8;
the detection steps of the crop row characteristic points in the crop row area range are as follows:
(1) dividing a horizontal strip of a certain height at the top of the image according to the crop row area range; the horizontal strip and the straight lines of the left and right side edges of the crop row area range form a window;
(2) the projection direction is vertically downward, the window is scanned from left to right column by column, and the sum of the gray levels of the pixels in each column and the average gray level of all the pixels in the window are calculated;
(3) comparing the sum of the gray levels of the column pixels with the average gray level value from left to right in the window in sequence, if the sum of the gray levels of the column pixels is larger than or equal to the average gray level value, setting the sum of the gray levels of the column pixels as the average gray level value, otherwise, setting the sum of the gray levels of the column pixels as 0;
(4) comparing the sum of the column pixels of the jth column with the sum of the column pixels of the (j + 1) th column, and judging the left edge point and the right edge point of the crop row according to the magnitude of the sum of the column pixels of the jth column and the sum of the column pixels of the (j + 1) th column; if the sum of the pixels of the j-th column is less than the sum of the pixels of the j + 1-th column, j is the left edge point of the crop row; if the sum of the pixels of the jth column is larger than the sum of the pixels of the jth +1 th column, taking j as a right edge point of the crop row; if the sum of the pixels of the jth column is equal to the sum of the pixels of the (j + 1) th column, no processing is carried out; wherein j is a positive integer greater than or equal to 1 and represents a column of image pixels;
(5) comparing and judging according to the difference value of the left edge point and the right edge point of the crop row and a width threshold value of a preset crop row, if the difference value is larger than the preset crop row width threshold value, the left edge point and the right edge point of the crop row are crop row boundary feature points, and if not, the left edge point and the right edge point are false feature points, and removing;
(6) calculating the middle point of the left edge point and the right edge point of the crop row, and taking the middle point as the characteristic point of the crop row;
(7) the horizontal stripe moves downwards by one pixel, the process is repeated, and the characteristic point of the crop row in the window is calculated until the horizontal stripe reaches the bottommost end of the crop row area;
in particular, for a frame of M×N pixels, M represents the image width and N represents the image height; let f(i, j) represent the gray value of the pixel at position (i, j) in the image, S(j) be the sum of the gray values of the j-th column of pixels, and T be the average gray value of the pixels in the window; the width threshold of the crop row is set as R. S(j) and T are expressed as:

S(j) = Σ_{i=1}^{h} f(i, j), (j = l, l+1, …, r) (9)

T = (1/W) Σ_{j=l}^{r} S(j) (10)

In the above formulas, l represents the abscissa of the straight line at the left edge of the window, r represents the abscissa of the straight line at the right edge of the window, W represents the width of the window area, expressed as W = r − l + 1, and h represents the height of the window for vertical projection, i.e. the number of pixels in a column for vertical projection; the window is composed of the horizontal strip and the straight lines of the left and right side edges of the crop row area range, and to simplify calculation, when h is small the window is approximated as a rectangle;
the crop row feature point detection algorithm based on the moving window is as follows:
1. divide a horizontal strip of height h at the top of the binary image; the horizontal strip and the left and right side edges of the crop row region form a window;
2. calculate the sum of pixels S(j) in each column of the window and the average gray value T; compare S(j) with T from left to right in the window: if S(j) >= T, set S(j) = T; otherwise set S(j) = 0;
3. if S(j) < S(j+1), column j represents a candidate edge point on the left side of the crop row; if S(j) > S(j+1), column j represents a candidate edge point on the right side of the crop row; if the distance between a right-side candidate edge point and the corresponding left-side candidate edge point is greater than the set threshold R, the edge point pair is considered a pair of valid crop row edge points; otherwise they are false edge points and are removed;
4. calculate the midpoint of the left and right edge points of the crop row and take this point as a crop row feature point;
5. the horizontal strip is moved one pixel down and the process is repeated until the strip reaches the bottom of the crop row area.
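A minimal sketch of the per-window procedure (steps 1–4 for one strip position); the function name and the toy strip are assumptions, not the patent's implementation:

```python
import numpy as np

def window_feature_points(strip, R):
    """Feature points of one horizontal strip: midpoints of valid
    left/right edge-point pairs found by vertical projection."""
    S = np.asarray(strip, dtype=np.int64).sum(axis=0)
    T = S.mean()
    B = np.where(S >= T, T, 0)                  # clamp the projection curve
    points, left = [], None
    for j in range(len(B) - 1):
        if B[j] < B[j + 1]:                     # candidate left edge
            left = j
        elif B[j] > B[j + 1] and left is not None:
            if j - left > R:                    # reject false edge pairs
                points.append((left + j) // 2)  # midpoint = feature point
            left = None
    return points

strip = np.zeros((3, 10), dtype=np.uint8)
strip[:, 4:8] = 255                             # one crop row in columns 4-7
pts = window_feature_points(strip, R=2)         # -> [5]
```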
S06, establishing a crop row center line solving model according to the characteristics of crop rows in the image, and extracting crop row center lines in the crop row region range through an artificial bee colony algorithm, as shown in FIG. 10, wherein a gray solid line represents the crop row center lines, and a gray dotted line is a navigation path;
the method specifically comprises the following steps:
(1) a crop row center line solving model is established according to the characteristics of the crop rows in the image: since farmland crop rows are approximately straight in shape, the line equation can be determined from two crop row feature points in the image, as shown in figure 9. Let V denote the crop row feature point data space obtained according to said step S05, and let (xi, yi) and (xj, yj) be two points in V; then the crop row center line equation can be expressed as:

(y − yi) / (yj − yi) = (x − xi) / (xj − xi) (11)

The number of feature points within distance d of the straight line is counted and used as the standard for evaluating the quality of the line; by adjusting (xi, yi) and (xj, yj), the straight line containing the most feature points is selected as the crop row center line at that position, the value range of d being (1, 5);
(2) dividing the crop row feature points in the crop row area range obtained in the step S05 into an upper part and a lower part, with 1/2 of the image height as the boundary; the crop row feature points of the upper half are taken as candidate starting points of the crop row center line and those of the lower half as candidate end points, and arrays are established to store the candidate starting points and candidate end points respectively;
(3) and solving the model according to the crop row center line, and randomly selecting 1 candidate starting point and 1 candidate ending point to form a honey source of the artificial bee colony algorithm to represent a candidate crop row center line. Initializing a plurality of honey sources to form a plurality of candidate crop row straight lines, counting the number of feature points within a certain range from the candidate straight lines, taking the feature points as fitness functions for evaluating the goodness and badness of the candidate straight lines, and selecting the candidate straight line with the maximum fitness function as a crop row central line through multiple searches of an artificial bee colony algorithm;
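The inlier-counting criterion used as the fitness of a candidate line can be sketched as follows (illustrative names; distance threshold d as in the text; assumes the two points are distinct):

```python
import math

def fitness(p, q, points, d=2.0):
    """Number of feature points within distance d of the line through
    p = (xi, yi) and q = (xj, yj): the line-quality criterion."""
    (xi, yi), (xj, yj) = p, q
    a, b = yj - yi, xi - xj            # line a*x + b*y + c = 0
    c = -(a * xi + b * yi)
    norm = math.hypot(a, b)
    return sum(1 for (x, y) in points
               if abs(a * x + b * y + c) / norm <= d)

points = [(0, 0), (1, 1.1), (2, 1.9), (5, 0)]
n = fitness((0, 0), (3, 3), points)    # 3 of the 4 points lie near y = x
```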
the search model of the artificial bee colony algorithm comprises four constituent elements — honey sources, leading bees, follower bees and scout bees — and two behaviors, recruiting bees and abandoning honey sources; the numbers of leading bees and follower bees in the algorithm are equal to the number of honey sources. The basic principle is as follows:
let D be the dimension of the problem to be solved; the position of a honey source represents a potential solution of the problem, and the position of honey source i is expressed as Xi = (xi1, xi2, …, xiD); then the mathematical model of the artificial bee colony algorithm is:

xid = Ld + rand(0, 1)(Ud − Ld) (12)

vid = xid + φ(xid − xkd) (13)

pi = fiti / Σ_{n=1}^{NP} fitn (14)

In the above formulas, Ud and Ld respectively represent the upper and lower limits of the search space, d being one dimension of the solution; vid represents a new honey source generated in the search phase in the vicinity of honey source i; φ is a random number in [−1, 1]; k ∈ {1, 2, …, NP}, k ≠ i; pi represents the probability of the follower bee selecting the i-th honey source; fiti is the fitness of the i-th honey source, and NP is the number of solutions. If the quality of a honey source is not improved after limit cycles, the honey source is abandoned, the corresponding leading bee is converted into a scout bee, and the scout bee generates a new honey source according to equation (12). The specific algorithm design for detecting the crop row center line is as follows:
1. using the vertical projection method, obtain the number N of crop rows and their area ranges; if N >= 1 (the image contains at least one crop row), detect the crop feature points in the strip areas; otherwise end the program;
2. divide the crop row feature points into an upper part and a lower part with 1/2 of the image height as the boundary, and establish two arrays to store, respectively, the feature points of the upper half and the lower half of the num-th crop row;
3. initialize the crop row count variable num = 1, the distance threshold d, the number of honey sources m (the numbers of leading bees and follower bees are the same as the number of honey sources), the local search threshold limit and the maximum iteration number C; establish the fitness function fit, where fit represents the number of feature points within distance d of the straight line;
4. randomly select one point each from the upper-half and lower-half feature point arrays, (xi, yi) and (xj, yj), to form a honey source Xi representing a potential solution; initialize the m honey sources using equation (12) and calculate the fitness of all potential solutions;
5. the leading bees perform a neighborhood search according to equation (13) to generate a new honey source vi and its fitness is calculated; if the fitness of vi is higher than that of Xi, then Xi = vi; otherwise Xi remains unchanged;
6. calculate the selection probability pi of Xi according to equation (14); the follower bees select honey sources according to pi and perform a neighborhood search using equation (13) to generate a new solution vi, whose fitness is calculated; if the fitness of vi is higher than that of Xi, then Xi = vi; otherwise Xi remains unchanged;
7. after limit cycles, if the fitness of Xi has not changed, the solution is abandoned, the corresponding leading bee is converted into a scout bee, and a new solution generated according to equation (12) replaces the current solution Xi;
8. save the current optimal solution and judge whether the maximum iteration number C has been reached; if so, output the optimal result (xi, yi), (xj, yj) and calculate the crop row line equation from the two points as y = kx + b, where k = (yj − yi)/(xj − xi) and b = yi − k·xi; otherwise, return to step (5);
9. if num + 1 > N, all crop row area ranges have been traversed and the program ends; otherwise num = num + 1 and return to step (4);
in the algorithm, the number of honey sources and the numbers of leading bees and follower bees are all m = 30, the local search threshold limit = 10, the maximum iteration number C = 50, and the straight-line distance threshold d = 2.
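Under the parameters just listed, the search of steps 4–8 can be sketched as a generic artificial bee colony loop; the index-based neighborhood move and all names are assumptions, not the patent's exact design:

```python
import random

def abc_line_search(starts, ends, fit, m=30, limit=10, C=50):
    """Artificial bee colony search over candidate (start, end) pairs.

    starts / ends: upper- and lower-half candidate feature points;
    fit(p, q): fitness of the line through p and q (e.g. inlier count)."""
    def random_source():                        # equation (12): random init
        return (random.randrange(len(starts)), random.randrange(len(ends)))

    def neighbor(x):                            # equation (13): move one index
        i, j = x
        if random.random() < 0.5:
            i = max(0, min(len(starts) - 1, i + random.choice((-1, 1))))
        else:
            j = max(0, min(len(ends) - 1, j + random.choice((-1, 1))))
        return (i, j)

    def score(x):
        return fit(starts[x[0]], ends[x[1]])

    def greedy(i, v):                           # keep the better solution
        if score(v) > score(sources[i]):
            sources[i], trials[i] = v, 0
        else:
            trials[i] += 1

    sources = [random_source() for _ in range(m)]
    trials = [0] * m
    best = max(sources, key=score)
    for _ in range(C):
        for i in range(m):                      # leading (employed) bees
            greedy(i, neighbor(sources[i]))
        fits = [score(s) for s in sources]
        total = sum(fits) or 1
        for _ in range(m):                      # follower (onlooker) bees
            r, acc, pick = random.random() * total, 0.0, 0
            for pick, ft in enumerate(fits):    # roulette wheel, equation (14)
                acc += ft
                if acc >= r:
                    break
            greedy(pick, neighbor(sources[pick]))
        for i in range(m):                      # scout bees abandon stale sources
            if trials[i] > limit:
                sources[i], trials[i] = random_source(), 0
        best = max(sources + [best], key=score)
    return starts[best[0]], ends[best[1]]

# toy usage: the left pair of candidates lies on the well-supported line x = 0
random.seed(0)
starts = [(0, 0), (40, 0)]
ends = [(0, 50), (40, 50)]
points = [(0, y) for y in range(0, 51, 10)]
count = lambda p, q: sum(1 for (x, _) in points
                         if abs(x - p[0]) <= 2 and abs(x - q[0]) <= 2)
p, q = abc_line_search(starts, ends, count, m=6, limit=5, C=20)
```

In a full implementation the fitness would be the inlier count within distance d of the line through the two selected feature points, and the search would run once per crop row.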
S07, extracting the navigation path. The navigation path equation is calculated from the two crop row lines closest to the image center line, as shown in fig. 10, where the gray dashed line represents the navigation path. Let the farmland crop image size be M×N, where M represents the image width and N represents the image height; the ordinate of the image top edge points is 0 and the ordinate of the image bottom edge points is N. The two points determining each crop row line are calculated by the artificial bee colony algorithm. The specific steps for calculating the navigation path equation are as follows:
(1) let the crop row line on the left side of the image center line be determined by the two points (x1, y1) and (x2, y2); the intersections of this line with the top and bottom edges of the image are (xL0, 0) and (xLN, N), wherein

xL0 = x1 − y1(x2 − x1)/(y2 − y1), xLN = x1 + (N − y1)(x2 − x1)/(y2 − y1)

(2) let the crop row line on the right side of the image center line pass through the two points (x3, y3) and (x4, y4); the intersections of this line with the top and bottom edges of the image are (xR0, 0) and (xRN, N), wherein

xR0 = x3 − y3(x4 − x3)/(y4 − y3), xRN = x3 + (N − y3)(x4 − x3)/(y4 − y3)

(3) calculate the midpoints between the two crop row lines, ((xL0 + xR0)/2, 0) and ((xLN + xRN)/2, N), and solve the navigation path equation y = kx + b from these two points, wherein

k = N / ((xLN + xRN)/2 − (xL0 + xR0)/2), b = −k(xL0 + xR0)/2
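Steps (1)–(3) reduce to straight-line geometry and can be sketched as follows (function and point names are illustrative):

```python
def edge_x(p1, p2, y):
    """Abscissa where the line through p1 and p2 crosses image row y."""
    (x1, y1), (x2, y2) = p1, p2
    return x1 + (y - y1) * (x2 - x1) / (y2 - y1)

def navigation_path(left_pts, right_pts, N):
    """Navigation line y = k*x + b through the midpoints of the two crop
    row lines at the top (y = 0) and bottom (y = N) image edges."""
    xt = (edge_x(*left_pts, 0) + edge_x(*right_pts, 0)) / 2   # top midpoint
    xb = (edge_x(*left_pts, N) + edge_x(*right_pts, N)) / 2   # bottom midpoint
    k = N / (xb - xt)
    return k, -k * xt

# left row through (100,0)-(160,480), right row through (300,0)-(280,480)
k, b = navigation_path(((100, 0), (160, 480)), ((300, 0), (280, 480)), N=480)
# k = 24.0, b = -4800.0
```

The slope form fails when the midline is vertical (xb == xt); a robust implementation would special-case that geometry.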
fig. 11 shows an agricultural mobile robot navigation path extraction device based on an artificial bee colony algorithm under natural lighting conditions, which is provided by the embodiment of the invention and comprises:
an image graying processing module M01 for converting the color crop image into a gray image by means of the 2CgNor-CrNor-CbNor factor;
the image segmentation module M02 is used for converting the gray-scale image into a binary image by a maximum inter-class variance method;
the crop row region range determining module M03 is used for determining the crop row region range through a vertical projection method according to the binary image;
the crop row characteristic point detection module M04 is used for extracting crop row characteristic points in the crop row region range by a vertical projection method based on a moving window according to the crop row region range;
the crop row center line extraction module M05 is used for establishing a crop row center line solving model according to the characteristics of crop rows in the image, and extracting the crop row center lines in the crop row area range through an artificial bee colony algorithm;
the navigation path determining module M06 is used for determining a navigation path between the two crop row center lines according to the two crop row center lines closest to the image center line;
the above-mentioned apparatuses and the above-mentioned methods are in a one-to-one correspondence, and the details of the implementation of the above-mentioned apparatuses will not be described in detail in this embodiment.
Compared with the prior art, the navigation path identification method is rapid and accurate and has strong adaptability to natural illumination. The normalized 2CgNor-CrNor-CbNor factor is used to gray the crop image, reducing the influence of illumination changes on navigation path identification; by establishing a crop row center line solving model and using the artificial bee colony algorithm to extract the crop row lines and the navigation line, the accuracy and speed of navigation path recognition are improved.
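As a hedged illustration of the graying step, the sketch below uses the BT.601 Cr/Cb definitions, a Cg component of the YCgCr type, and a min-max normalization; the patent's exact coefficients and normalization are those of its equations (2)–(8) and may differ from these assumptions:

```python
import numpy as np

def gray_2cg_cr_cb(rgb):
    """Grayscale a color crop image with a 2*CgNor - CrNor - CbNor factor.

    Cr/Cb follow BT.601; Cg (green minus luminance) and the per-component
    min-max normalization are assumptions, not the patent's equations."""
    R, G, B = (rgb[..., c].astype(np.float64) for c in range(3))
    Cr = 0.439 * R - 0.368 * G - 0.071 * B + 128
    Cb = -0.148 * R - 0.291 * G + 0.439 * B + 128
    Cg = -0.317 * R + 0.439 * G - 0.122 * B + 128

    def nor(X):                              # assumed min-max normalization
        span = X.max() - X.min()
        return (X - X.min()) / span if span else np.zeros_like(X)

    g = 2 * nor(Cg) - nor(Cr) - nor(Cb)
    return np.clip(g, 0, 1)

# green vegetation pixel should come out brighter than a brown soil pixel
img = np.array([[[60, 160, 50], [120, 90, 70]]], dtype=np.uint8)
gray = gray_2cg_cr_cb(img)
```

The point of the factor is that green (crop) pixels score high on Cg and low on Cr/Cb, so vegetation stands out from soil regardless of overall brightness.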
Although the present invention has been described in detail with reference to examples, it should be understood by those skilled in the art that various combinations, modifications and equivalents may be made without departing from the spirit and scope of the invention as defined in the claims.

Claims (4)

1. An agricultural mobile robot navigation path extraction method based on an artificial bee colony algorithm under natural illumination conditions is characterized by comprising the following steps:
s01, collecting crop images; the camera forms an included angle of 60-70 degrees with the horizontal direction, and the vertical height from the ground is about 1.2-1.4 m;
s02, carrying out graying processing on the collected crop image by utilizing the 2CgNor-CrNor-CbNor factor, and converting the color crop image into a gray image;
s03, performing image segmentation by adopting a maximum inter-class variance method, and converting the gray level image into a binary image;
s04, obtaining edge position information of the crop row at the top edge and the bottom edge of the image by adopting a vertical projection method, and forming a crop row area range by connecting edge points through straight lines;
s05, extracting the characteristic points of the crop rows in the crop row area range by using a vertical projection method based on a moving window;
s06, establishing a crop row center line solving model according to the characteristics of crop rows in the image, and extracting crop row center lines in the crop row area range through an artificial bee colony algorithm; the method specifically comprises the following steps:
(1) establishing a crop row center line solving model according to the characteristics of crop rows in the image, wherein the crop row center line solving model is that the farmland crop rows are represented as approximate straight lines in shape, and the equation can be determined according to two crop row feature points in the image; let V represent the crop row feature point data space obtained according to the step S05, and let (xi, yi) and (xj, yj) be two points in V; then the crop row center line equation can be expressed as:

(y − yi) / (yj − yi) = (x − xi) / (xj − xi)

counting the number of the feature points within the range of d from the straight line, which is used as a standard for evaluating the quality of the straight line; by adjusting (xi, yi) and (xj, yj), the straight line containing the most feature points is selected as the crop row center line at that position, the value range of d being (1, 5);
(2) dividing the crop row feature points in the crop row area range obtained in the step S05 into an upper part and a lower part with 1/2 of the image height as the boundary, taking the upper-half crop row feature points as candidate starting points of the crop row center line and the lower-half crop row feature points as candidate end points of the crop row center line, and establishing arrays to respectively store the candidate starting points and the candidate end points of the crop row center line;
(3) solving a model according to a crop row center line, randomly selecting 1 candidate starting point and 1 candidate end point to form a honey source of an artificial bee colony algorithm, representing a candidate crop row center line, initializing a plurality of honey sources to form a plurality of candidate crop row straight lines, counting the number of feature points within a certain range from the candidate straight lines, taking the feature points as fitness functions for evaluating the goodness and badness of the candidate straight lines, and selecting the candidate straight line with the maximum fitness function as the crop row center line through multiple searches of the artificial bee colony algorithm;
and S07, determining a navigation path between the two crop row central lines according to the two crop row central lines closest to the image central line.
2. The method according to claim 1, wherein the step S02 specifically comprises the steps of:
(1) converting the crop image from RGB color space to YCrCb color space:
Figure FDA0003175814520000021
(2) constructing a Cg component irrelevant to illumination on the basis of a YCrCb color space, wherein the Cg component corresponds to the difference between a green signal and a brightness signal:
Figure FDA0003175814520000022
(3) processing Cg according to ITU-R BT.601-6 to obtain formula (4):
Figure FDA0003175814520000023
(4) respectively carrying out normalization processing on the Cr, Cb and Cg components to obtain the CgNor, CrNor and CbNor components:
Figure FDA0003175814520000031
Figure FDA0003175814520000032
Figure FDA0003175814520000033
(5) carrying out graying processing on the color crop image by using the 2CgNor-CrNor-CbNor factor.
3. The utility model provides an agricultural mobile robot navigation route extraction element based on artifical bee colony algorithm under natural lighting condition which characterized in that includes:
(1) image graying processing module using 2CgNor-CrNor-CbNorThe factor converts the color crop image into a gray image;
(2) the image segmentation module is used for converting the gray level image into a binary image by a maximum inter-class variance method;
(3) the crop row region range determining module is used for determining the crop row region range through a vertical projection method according to the binary image;
(4) the crop row characteristic point detection module is used for extracting crop row characteristic points in the crop row region range through a vertical projection method based on a moving window according to the crop row region range;
(5) a crop row centerline extraction module to:
establishing a crop row center line solving model according to the characteristics of crop rows in an image, wherein the crop row center line solving model is that farmland crop rows are expressed as approximate straight lines in form, and the equation can be determined according to two crop row feature points in the image; let V represent the crop row feature point data space obtained according to the crop row feature point detection module, and let (xi, yi) and (xj, yj) be two points in V; then the crop row center line equation can be expressed as:

(y − yi) / (yj − yi) = (x − xi) / (xj − xi)

counting the number of feature points within the range of d from the straight line, which is used as a standard for evaluating the quality of the straight line; by adjusting (xi, yi) and (xj, yj), the straight line containing the most feature points is selected as the crop row center line at that position, the value range of d being (1, 5);
dividing the crop row feature points in the crop row area range obtained by the crop row feature point detection module into an upper part and a lower part with 1/2 of the image height as the boundary, taking the upper-half crop row feature points as candidate starting points of the crop row center line and the lower-half crop row feature points as candidate end points of the crop row center line, and establishing arrays to respectively store the candidate starting points and the candidate end points of the crop row center line;
solving the model according to the crop row center line, randomly selecting 1 candidate starting point and 1 candidate end point to form a honey source of the artificial bee colony algorithm, representing a candidate crop row center line, initializing a plurality of honey sources to form a plurality of candidate crop row straight lines, counting the number of characteristic points within a certain range from the candidate straight lines, taking the characteristic points as a fitness function for evaluating the quality of the candidate straight lines, and selecting the candidate straight line with the maximum fitness function as the crop row center line through multiple searches of the artificial bee colony algorithm;
(6) and the navigation path determining module is used for determining a navigation path between the two crop row center lines according to the two crop row center lines closest to the image center line.
4. The apparatus according to claim 3, wherein the image graying processing module specifically includes:
(1) converting the crop image from RGB color space to YCrCb color space:
Y = 0.299R + 0.587G + 0.114B
Cr = 0.500R - 0.419G - 0.081B + 128
Cb = -0.169R - 0.331G + 0.500B + 128
(2) constructing a Cg component irrelevant to illumination on the basis of a YCrCb color space, wherein the Cg component corresponds to the difference between a green signal and brightness:
Cg = G - Y
(3) Cg is processed according to ITU-R BT.601-6 to yield (11):
Cg = -0.299R + 0.413G - 0.114B + 128    (11)
(4) respectively carrying out normalization processing on the Cr, Cb and Cg components to obtain the CgNor, CrNor and CbNor components:
CgNor = (Cg - Cgmin) / (Cgmax - Cgmin)
CrNor = (Cr - Crmin) / (Crmax - Crmin)
CbNor = (Cb - Cbmin) / (Cbmax - Cbmin)
(5) using the 2CgNor - CrNor - CbNor factor to perform graying processing on the color crop image.
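Steps (1)-(5) of the graying module can be sketched end to end as below. The full-range BT.601 coefficients and the min-max normalization are assumptions chosen for illustration, as is the function name:

```python
import numpy as np

def excess_green_gray(img_rgb):
    """Gray a color crop image with the 2*CgNor - CrNor - CbNor factor.
    img_rgb: H x W x 3 uint8 array in (R, G, B) channel order."""
    r, g, b = [img_rgb[..., i].astype(float) for i in range(3)]
    # Step (1): BT.601 luma and chroma (full-range form, offset 128)
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cr = 0.500 * r - 0.419 * g - 0.081 * b + 128.0
    cb = -0.169 * r - 0.331 * g + 0.500 * b + 128.0
    # Steps (2)-(3): green-minus-luma component
    cg = g - y + 128.0

    def norm(c):
        # Step (4): min-max normalize to [0, 1]
        lo, hi = c.min(), c.max()
        return (c - lo) / (hi - lo) if hi > lo else np.zeros_like(c)

    # Step (5): combine; green vegetation scores high, soil scores low
    gray = 2 * norm(cg) - norm(cr) - norm(cb)
    # rescale to 0..255 for display / later thresholding
    return (255 * norm(gray)).astype(np.uint8)
```

On a pure-green pixel next to a soil-colored pixel the factor separates them to the extremes of the gray range, which is what makes the crop rows stand out for the subsequent feature-point detection.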
CN201611076483.9A 2016-11-30 2016-11-30 Robot navigation path extraction method and device based on artificial bee colony algorithm Active CN108133471B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611076483.9A CN108133471B (en) 2016-11-30 2016-11-30 Robot navigation path extraction method and device based on artificial bee colony algorithm

Publications (2)

Publication Number Publication Date
CN108133471A CN108133471A (en) 2018-06-08
CN108133471B true CN108133471B (en) 2021-09-17

Family

ID=62387328

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611076483.9A Active CN108133471B (en) 2016-11-30 2016-11-30 Robot navigation path extraction method and device based on artificial bee colony algorithm

Country Status (1)

Country Link
CN (1) CN108133471B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107067430B (en) * 2017-04-13 2020-04-21 河南理工大学 Wheat field crop row detection method based on feature point clustering
CN108901540A (en) * 2018-06-28 2018-11-30 重庆邮电大学 Fruit tree light filling and fruit thinning method based on artificial bee colony fuzzy clustering algorithm
CN111931789B (en) * 2020-07-28 2024-05-14 江苏大学 Linear crop row extraction method suitable for different illumination, crop density and growth backgrounds
CN112395984B (en) * 2020-11-18 2022-09-16 河南科技大学 Method for detecting seedling guide line of unmanned agricultural machine
CN112712534B (en) * 2021-01-15 2023-05-26 山东理工大学 Corn rhizome navigation datum line extraction method based on navigation trend line
CN113111892B (en) * 2021-05-12 2021-10-22 中国科学院地理科学与资源研究所 Crop planting row extraction method based on unmanned aerial vehicle image

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2131184A1 (en) * 2008-06-02 2009-12-09 CNH Belgium N.V. Crop particle discrimination methods and apparatus
CN101604166A (en) * 2009-07-10 2009-12-16 杭州电子科技大学 A kind of method for planning path for mobile robot based on particle swarm optimization algorithm
CN101807252A (en) * 2010-03-24 2010-08-18 中国农业大学 Crop row center line extraction method and system
CN102999757A (en) * 2012-11-12 2013-03-27 中国农业大学 Leading line extracting method
CN104866820A (en) * 2015-04-29 2015-08-26 中国农业大学 Farm machine navigation line extraction method based on genetic algorithm and device thereof

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Mobile robot path planning using artificial bee colony and evolutionary programming; Contreras-Cruz M A et al.; Applied Soft Computing Journal; 2015-02-10; pp. 319-328 *
Application of artificial bee colony algorithm in mobile robot path planning (in Chinese); Li Zhujuan; Computer Simulation; 2012-12-31; Vol. 29, No. 12; pp. 247-250 *

Similar Documents

Publication Publication Date Title
CN108133471B (en) Robot navigation path extraction method and device based on artificial bee colony algorithm
CN111640157B (en) Checkerboard corner detection method based on neural network and application thereof
Wang et al. Image segmentation of overlapping leaves based on Chan–Vese model and Sobel operator
CN107516077B (en) Traffic sign information extraction method based on fusion of laser point cloud and image data
CN109903331B (en) Convolutional neural network target detection method based on RGB-D camera
CN107492094A (en) A kind of unmanned plane visible detection method of high voltage line insulator
Jin et al. Corn plant sensing using real‐time stereo vision
CN113160192A (en) Visual sense-based snow pressing vehicle appearance defect detection method and device under complex background
CN111915704A (en) Apple hierarchical identification method based on deep learning
CN111753577A (en) Apple identification and positioning method in automatic picking robot
CN108509928A (en) For Cold region apple jujube garden field pipe operation vision guided navigation path extraction method
CN112784869B (en) Fine-grained image identification method based on attention perception and counterstudy
CN112766155A (en) Deep learning-based mariculture area extraction method
CN111724354B (en) Image processing-based method for measuring wheat ear length and wheat ear number of multiple wheat plants
CN113450402B (en) Navigation center line extraction method for vegetable greenhouse inspection robot
CN109190452B (en) Crop row identification method and device
CN116740758A (en) Bird image recognition method and system for preventing misjudgment
Tu et al. An efficient crop row detection method for agriculture robots
CN116977960A (en) Rice seedling row detection method based on example segmentation
Xiang et al. PhenoStereo: a high-throughput stereo vision system for field-based plant phenotyping-with an application in sorghum stem diameter estimation
CN115049689A (en) Table tennis identification method based on contour detection technology
CN117079125A (en) Kiwi fruit pollination flower identification method based on improved YOLOv5
CN113421301B (en) Method and system for positioning central area of field crop
CN115995017A (en) Fruit identification and positioning method, device and medium
CN115451965A (en) Binocular vision-based relative heading information detection method for transplanting system of rice transplanter

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant