CN108133471B - Robot navigation path extraction method and device based on artificial bee colony algorithm - Google Patents
Robot navigation path extraction method and device based on artificial bee colony algorithm
- Publication number: CN108133471B (application CN201611076483.9A)
- Authority: CN (China)
- Prior art keywords: crop row, crop, image, center line, points
- Legal status: Active (assumed by Google Patents; not a legal conclusion)
Classifications
- G06T2207/10004 — Still image; photographic image (G06T2207/10 — image acquisition modality)
- G06T2207/30241 — Trajectory (G06T2207/30 — subject of image; context of image processing)
Abstract
The invention discloses an agricultural mobile robot navigation path extraction method and device based on an artificial bee colony algorithm under natural illumination conditions, wherein the method comprises the following steps: collecting images of farmland crops; carrying out gray processing on the crop image; dividing the processed gray level image into a binary image; respectively carrying out gray level vertical projection on the top and the bottom of the binary image to obtain a crop row region range; detecting the characteristic points of the crop rows in the range of the crop row area by adopting a vertical projection method based on a moving window; establishing a crop row center line solving model according to the characteristics of crop rows in the image, and performing optimization search on crop row feature points in a crop row area range through an artificial bee colony algorithm to determine the crop row center line; determining a navigation path between two crop rows according to the center lines of two adjacent crop rows closest to the center line of the image; the method improves the detection precision and speed of the navigation path and the adaptability to natural illumination, and solves the problems of low accuracy, poor real-time performance, sensitivity to natural illumination change and the like of the traditional agricultural mobile robot navigation path extraction method.
Description
Technical Field
The invention relates to the technical field of navigation of agricultural mobile robots, in particular to a method and a device for extracting a navigation path of an agricultural mobile robot based on an artificial bee colony algorithm under a natural illumination condition.
Background
The use of agricultural mobile robots for automatic field navigation operations can effectively improve farmland operation efficiency, reduce production costs, and keep workers out of harsh environments such as high temperature, high humidity and hazardous conditions. At present, research on agricultural mobile robot navigation mainly focuses on two modes: machine vision and satellite navigation (GNSS). Mobile robots based on machine vision have the advantages of high working efficiency, good real-time performance and low system cost, and have become a research hotspot in precision agriculture at home and abroad. Navigation path identification is the premise of autonomous navigation operation of an agricultural mobile robot: extracting the crop row center line and the navigation path quickly and accurately can effectively improve the working efficiency and operation precision of the mobile robot. For early-stage crops, because crop rows are sown approximately parallel by machine and crop growth along the row is continuous, the rows present an approximately straight overall trend, so the crop row center line can be extracted by processing farmland crop images to obtain a navigation path;
The farmland environment is complex and changeable, with unstructured and open characteristics. The main factors affecting visual navigation path identification for agricultural mobile robots are natural illumination variation and the performance of the path recognition algorithm itself. For the illumination problem, some scholars have proposed navigation information acquisition methods based on illumination stability and illumination-invariant images to reduce the influence of illumination change on navigation path extraction. Others select the 2G-R-B color factor, the H component and the Cr component to gray an image in the RGB, HSI and YCrCb color spaces respectively, so as to improve the adaptability of image segmentation to illumination change. However, the R, G and B components are coupled to each other and vary with illumination intensity, so the 2G-R-B color factor adapts poorly to illumination change; the conversion between the H component and the RGB color space is nonlinear, which easily causes image distortion and introduces errors into the image processing result; and the YCrCb color space lacks a description of the green component, so it is not suitable for processing images of green farmland crops. As for the choice of crop row center line (straight line) fitting method, some scholars use the Hough transform to identify crop rows; the Hough transform is robust, but its computation is complex and its detection precision limited. Others extract the crop row line with the least squares method, which is fast, but least squares is sensitive to strong noise points and lacks robustness.
More recently, scholars have proposed crop row line extraction algorithms based on random methods, which have low computational complexity and good real-time performance, but their line detection precision depends to a great extent on the selection of the random points. Others extract the crop skeleton with the maximum-square method and obtain the crop row line by straight-line fitting; this algorithm must search for the maximum square at every target pixel, which increases computational complexity and time consumption, making it difficult to meet the real-time requirements of a high-speed navigation operation system.
Disclosure of Invention
The invention performs gray processing on the crop image with a normalized Cg factor to reduce the influence of illumination change on navigation path identification; by establishing a crop row center line solving model, it uses the artificial bee colony algorithm to extract the crop row lines and the navigation line, improving the accuracy and speed of navigation path recognition.
Technical problem to be solved
The invention aims to solve the technical problem of how to extract the navigation path under the natural illumination condition, and improve the accuracy, real-time property and adaptability of the navigation path identification of the agricultural mobile robot.
(II) technical scheme
In order to solve the technical problem, in a first aspect, the invention provides an agricultural mobile robot navigation path extraction method based on an artificial bee colony algorithm under natural illumination conditions, the method comprising the following steps:
s01, collecting crop images; the camera forms an included angle of 60-70 degrees with the horizontal direction, and the vertical height from the ground is about 1.2-1.4 m;
s02, use ofCarrying out gray processing on the collected crop image by the factor, and converting the color crop image into a gray image;
s03, performing image segmentation by adopting a maximum inter-class variance method, and converting the gray level image into a binary image;
s04, obtaining edge position information of the crop row at the top edge and the bottom edge of the image by adopting a vertical projection method, and forming a crop row area range by connecting edge points through straight lines;
s05, extracting the characteristic points of the crop rows in the crop row area range by using a vertical projection method based on a moving window;
s06, establishing a crop row center line solving model according to the characteristics of crop rows in the image, and extracting crop row center lines in the crop row area range through an artificial bee colony algorithm;
and S07, determining a navigation path between the two crop row central lines according to the two crop row central lines closest to the image central line.
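Step S07 does not spell out how the path between the two center lines is formed; one plausible reading (an assumption, not stated in this excerpt) is that the navigation path is the midline of the two nearest crop row center lines, each line represented by its start and end points. A minimal sketch under that assumption:

```python
def navigation_path(left_line, right_line):
    # Each line is ((x_start, y_start), (x_end, y_end)) in image coordinates.
    # Assumed interpretation: the navigation path joins the midpoints of the
    # corresponding endpoints of the two nearest crop row center lines.
    (l1, l2), (r1, r2) = left_line, right_line
    mid = lambda p, q: ((p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0)
    return (mid(l1, r1), mid(l2, r2))
```

For two vertical center lines at x = 0 and x = 10, the resulting path runs down the middle at x = 5.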
Preferably, the step S02 specifically includes the following steps:
(1) converting the collected color crop image from the RGB color space into the YCrCb color space;
(2) constructing, on the basis of the YCrCb color space, a Cg component that is insensitive to illumination, wherein the Cg component corresponds to the difference between the green signal and the luminance: Cg = G − Y;
(3) processing Cg according to ITU-R BT.601-6 to obtain formula (3): Cg = −0.299R + 0.413G − 0.114B;
(4) respectively carrying out normalization processing on the Cr, Cb and Cg components to obtain the normalized Cg component.
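The graying of the steps above can be sketched as follows. The BT.601 luma coefficients are standard; the patent's exact normalization formula is not shown in this excerpt, so min-max scaling to [0, 255] is used here as one common choice (an assumption):

```python
import numpy as np

def cg_gray(rgb: np.ndarray) -> np.ndarray:
    """Gray a color crop image with an illumination-insensitive Cg factor (sketch)."""
    r = rgb[..., 0].astype(np.float64)
    g = rgb[..., 1].astype(np.float64)
    b = rgb[..., 2].astype(np.float64)
    # Luminance per ITU-R BT.601, then Cg = G - Y (difference of green and luma)
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cg = g - y
    # Min-max normalization to [0, 255]; the patent's normalization is not shown
    lo, hi = cg.min(), cg.max()
    if hi > lo:
        cg = (cg - lo) / (hi - lo) * 255.0
    else:
        cg = np.zeros_like(cg)
    return cg.astype(np.uint8)
```

A pure-green pixel maps to the brightest gray level and a pure-red pixel to the darkest, which is the intended emphasis of vegetation over background.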
Preferably, the step S06 specifically includes the following steps:
(1) a crop row center line solving model is established according to the characteristics of the crop rows in the image: the farmland crop rows are approximately straight in shape, so a line equation can be determined from two feature points of a crop row in the image. Let V denote the crop row feature point data space obtained in step S05, and let P1(x1, y1) and P2(x2, y2) be two points in V; the crop row center line equation can then be expressed as:
(y − y1)(x2 − x1) = (x − x1)(y2 − y1)
the number of feature points within distance d of this straight line is counted and used as the standard for evaluating the quality of the line; by adjusting P1 and P2, the straight line containing the most feature points is selected as the crop row center line at that position, where d takes values in (1, 5);
(2) dividing the crop row feature points obtained in step S05 within the crop row area range into an upper part and a lower part, with 1/2 of the image height as the boundary; the feature points of the upper half serve as candidate starting points of the crop row center line and those of the lower half as candidate end points, and arrays are established to store the candidate starting points and candidate end points respectively;
(3) according to the crop row center line solving model, 1 candidate starting point and 1 candidate end point are randomly selected to form a honey source of the artificial bee colony algorithm, representing one candidate crop row center line. A number of honey sources are initialized to form a set of candidate crop row lines; the number of feature points within a certain range of each candidate line is counted and used as the fitness function for evaluating the quality of that candidate line, and after multiple searches of the artificial bee colony algorithm the candidate line with the maximum fitness is selected as the crop row center line.
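The honey-source search of step (3) can be sketched as a simplified artificial bee colony loop: a honey source is a (start index, end index) pair, fitness is the count of feature points within distance d of the line through those two points, and exhausted sources are abandoned by scouts. The perturbation scheme and the omission of the onlooker phase are simplifications for brevity, not the patent's exact procedure:

```python
import math
import random

def point_line_dist(p, a, b):
    # Perpendicular distance from point p to the line through a and b
    (x, y), (x1, y1), (x2, y2) = p, a, b
    num = abs((y2 - y1) * x - (x2 - x1) * y + x2 * y1 - y2 * x1)
    den = math.hypot(x2 - x1, y2 - y1)
    return num / den if den else math.hypot(x - x1, y - y1)

def abc_centerline(starts, ends, points, d=2.0, n_sources=10, n_iter=50, limit=5, seed=0):
    """Pick the candidate line (start, end) covering the most feature points."""
    rng = random.Random(seed)

    def fitness(src):
        a, b = starts[src[0]], ends[src[1]]
        return sum(1 for p in points if point_line_dist(p, a, b) <= d)

    # Initialize honey sources: random (start, end) index pairs
    sources = [(rng.randrange(len(starts)), rng.randrange(len(ends)))
               for _ in range(n_sources)]
    fits = [fitness(s) for s in sources]
    trials = [0] * n_sources
    for _ in range(n_iter):
        # Employed-bee phase: perturb one coordinate of each source
        for i in range(n_sources):
            if rng.random() < 0.5:
                cand = (rng.randrange(len(starts)), sources[i][1])
            else:
                cand = (sources[i][0], rng.randrange(len(ends)))
            f = fitness(cand)
            if f > fits[i]:
                sources[i], fits[i], trials[i] = cand, f, 0
            else:
                trials[i] += 1
        # Scout phase: abandon sources that stopped improving
        for i in range(n_sources):
            if trials[i] > limit:
                sources[i] = (rng.randrange(len(starts)), rng.randrange(len(ends)))
                fits[i], trials[i] = fitness(sources[i]), 0
    best = max(range(n_sources), key=lambda i: fits[i])
    return starts[sources[best][0]], ends[sources[best][1]], fits[best]
```

With candidate starts/ends forming a vertical row of feature points, the search settles on the line that covers all of them.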
The invention also provides an agricultural mobile robot navigation path extraction device based on the artificial bee colony algorithm under natural illumination conditions, which comprises:
(1) an image graying processing module, which uses the normalized Cg factor to convert the color crop image into a gray image;
(2) the image segmentation module is used for converting the gray level image into a binary image by a maximum inter-class variance method;
(3) the crop row region range determining module is used for determining the crop row region range through a vertical projection method according to the binary image;
(4) the crop row characteristic point detection module is used for extracting crop row characteristic points in the crop row region range through a vertical projection method based on a moving window according to the crop row region range;
(5) the crop row center line extraction module is used for establishing a crop row center line solving model according to the characteristics of crop rows in the image and extracting the crop row center lines in the range of the crop row area through an artificial bee colony algorithm;
(6) and the navigation path determining module is used for determining a navigation path between the two crop row center lines according to the two crop row center lines closest to the image center line.
Preferably, the image graying processing module includes:
(1) converting the collected color crop image from the RGB color space into the YCrCb color space;
(2) constructing, on the basis of the YCrCb color space, a Cg component that is insensitive to illumination, wherein the Cg component corresponds to the difference between the green signal and the luminance: Cg = G − Y;
(3) processing Cg according to ITU-R BT.601-6 to obtain formula (10): Cg = −0.299R + 0.413G − 0.114B;
(4) respectively carrying out normalization processing on the Cg, Cr and Cb components to obtain the normalized Cg component.
Preferably, the crop row centerline extraction module comprises:
(1) a crop row center line solving model is established according to the characteristics of the crop rows in the image: the farmland crop rows are approximately straight in shape, so a line equation can be determined from two feature points of a crop row in the image. Let V denote the crop row feature point data space obtained by the crop row feature point detection module, and let P1(x1, y1) and P2(x2, y2) be two points in V; the crop row center line equation can then be expressed as:
(y − y1)(x2 − x1) = (x − x1)(y2 − y1)
the number of feature points within distance d of this straight line is counted and used as the standard for evaluating the quality of the line; by adjusting P1 and P2, the straight line containing the most feature points is selected as the crop row center line at that position, where d takes values in (1, 5);
(2) dividing the crop row feature points within the crop row area range obtained by the crop row feature point detection module into an upper part and a lower part, with 1/2 of the image height as the boundary; the feature points of the upper half serve as candidate starting points of the crop row center line and those of the lower half as candidate end points, and arrays are established to store the candidate starting points and candidate end points respectively;
(3) according to the crop row center line solving model, 1 candidate starting point and 1 candidate end point are randomly selected to form a honey source of the artificial bee colony algorithm, representing one candidate crop row center line. A number of honey sources are initialized to form a set of candidate crop row lines; the number of feature points within a certain range of each candidate line is counted and used as the fitness function for evaluating the quality of that candidate line, and after multiple searches of the artificial bee colony algorithm the candidate line with the maximum fitness is selected as the crop row center line.
(III) advantageous effects
The invention provides an agricultural mobile robot navigation path extraction method and device based on an artificial bee colony algorithm under a natural illumination condition, aiming at the defects of the existing agricultural mobile robot navigation path identification technology. The method comprises the steps of carrying out gray processing on collected farmland crop images, and segmenting the processed gray images to convert the gray images into binary images; acquiring edge position information of the crop row on the top edge and the bottom edge of the image by adopting a vertical projection method, and connecting edge points through straight lines to form a crop row area range; extracting crop row characteristic points in the crop row region range by using a vertical projection method based on a moving window, solving a model according to a crop row center line, extracting the crop row center line in the crop row region range by using an artificial bee colony algorithm, and further acquiring a navigation path; the method reduces the influence of illumination change on navigation path identification, improves the reliability of navigation path extraction, and solves the problems of poor real-time performance, low accuracy, sensitivity to illumination change and the like of navigation path extraction of the agricultural mobile robot.
Drawings
Fig. 1 is a schematic flow chart of an agricultural mobile robot navigation path extraction method based on an artificial bee colony algorithm under a natural light condition according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of a crop row image according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of an image after crop row image gray processing according to an embodiment of the present invention.
Fig. 4 is a schematic diagram of a binary image obtained by segmenting a grayscale image according to an embodiment of the present invention.
Fig. 5 is a schematic diagram of a crop row gray scale vertical projection according to an embodiment of the present invention.
Fig. 6 is a schematic diagram of detecting positions of crop rows nearest to the left and right sides of the center line of the image at the top edge and the bottom edge of the image according to an embodiment of the present invention.
Fig. 7 is a schematic diagram of crop row area range detection according to an embodiment of the present invention.
Fig. 8 is a schematic diagram illustrating detection of characteristic points of crop rows within a crop row region according to an embodiment of the present invention.
Fig. 9 is a schematic diagram of a crop row centerline solution model feature according to an embodiment of the present invention.
Fig. 10 is a schematic diagram of crop row center line and navigation path extraction according to an embodiment of the present invention.
Fig. 11 is a schematic structural diagram of an agricultural mobile robot navigation path extraction device based on an artificial bee colony algorithm under a natural light condition according to an embodiment of the present invention.
Detailed Description
The following embodiments are merely used to illustrate the technical solution of the present invention more clearly and do not limit its scope; a corn image captured under natural illumination during the cultivation period is selected as the example for the detailed description.
Fig. 1 shows a schematic flow chart of an agricultural mobile robot navigation path extraction method based on an artificial bee colony algorithm under natural light conditions according to an embodiment of the present invention, and as shown in fig. 1, the method includes the following specific steps.
And S01, collecting crop images. The height and the angle of a camera arranged on the agricultural mobile robot are adjusted to form an included angle of 60-70 degrees between the camera and the horizontal direction, the vertical height from the ground is 1.2-1.4 m, and the obtained image is shown in figure 2.
And S02, carrying out crop image graying processing. The invention constructs an illumination-insensitive Cg component on the basis of the YCrCb color model, normalizes the Cg, Cr and Cb components to obtain the normalized Cg component, and grays the color image with the normalized Cg factor, as shown in fig. 3;
The color image collected by the camera is in RGB format; the crop image is converted from the RGB color space into the YCrCb color space according to formula (1):
Y = 0.299R + 0.587G + 0.114B, Cr = 0.713(R − Y), Cb = 0.564(B − Y)  (1)
The Cg component corresponds to the difference between the green signal and the luminance:
Cg = G − Y  (2)
Processing Cg according to the ITU-R BT.601-6 standard gives formula (3):
Cg = −0.299R + 0.413G − 0.114B  (3)
The Cg, Cr and Cb components are respectively normalized to obtain the normalized Cg component.
S03, performing image segmentation by using the maximum inter-class variance method, and converting the grayscale image into a binary image, as shown in fig. 4.
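The maximum inter-class variance (Otsu) segmentation of step S03 can be implemented directly over the 256-level gray histogram; a minimal sketch:

```python
import numpy as np

def otsu_threshold(gray: np.ndarray) -> int:
    """Threshold maximizing the between-class variance over 256 gray levels."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    total = hist.sum()
    sum_all = np.dot(np.arange(256), hist)
    best_t, best_var = 0, -1.0
    w0 = sum0 = 0.0
    for t in range(256):
        w0 += hist[t]            # weight of the background class
        if w0 == 0:
            continue
        w1 = total - w0          # weight of the foreground class
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mu0 = sum0 / w0
        mu1 = (sum_all - sum0) / w1
        var = w0 * w1 * (mu0 - mu1) ** 2   # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def binarize(gray: np.ndarray) -> np.ndarray:
    # Crop pixels (above threshold) -> 255, background -> 0
    return np.where(gray > otsu_threshold(gray), 255, 0).astype(np.uint8)
```

On a bimodal image the threshold lands between the two gray clusters, separating crop from soil background.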
And S04, acquiring edge position information of the crop row at the top edge and the bottom edge of the image by adopting a vertical projection method, and connecting edge points through straight lines to form a crop row area range. Usually, a farmland image acquired by a vision sensor comprises a plurality of rows of crops, and the embodiment adopts a vertical projection method based on a binary image to detect two crop row region ranges nearest to the central line of the image;
the detection steps of the crop row region range are as follows:
(1) the binary image contains only two kinds of pixels, white and black: white pixels with gray value 255 represent crop information, and black pixels with gray value 0 represent soil background information. In the image coordinate system, the upper left corner of the image is the origin of coordinates, the horizontal axis is positive to the right and the vertical axis is positive downward. Within the projection area set in the binary image, the sum of the gray values of each column of pixels is calculated, and the average over all columns of the projection area is determined, as shown in fig. 5. The projection areas are the top 1/3 and the bottom 1/3 of the binary image;
(2) in the projection area of the upper 1/3 of the binary image, the projection direction is vertically upward, projecting the column gray sums toward the top edge of the image. From left to right, each column gray sum S(j) is compared with the average value T: if S(j) > T, the column value is set to T, otherwise it is set to 0, where j ≥ 1 is the column index. Then the value of column j is compared with that of column j + 1 to judge the left and right edge points of the crop row at the top edge of the image: if the value at column j is smaller than that at column j + 1, j is a left edge point of a crop row; if it is larger, j is a right edge point; if the two are equal, no processing is done;
(3) in the projection area of the lower 1/3 of the image, the projection direction is vertically downward, projecting the column gray sums toward the bottom edge of the image. From left to right, the difference D(j) between the image height and each column gray sum is calculated, together with the difference T' between the image height and the average value T. Each D(j) is compared with T': if D(j) > T', the column value is set to the image height, otherwise it is set to T', where j ≥ 1 is the column index. Then the value of column j is compared with that of column j + 1 to judge the left and right edge points of the crop row at the bottom edge of the image: if the value at column j is greater than that at column j + 1, j is a left edge point of a crop row; if it is smaller, j is a right edge point; if the two are equal, no processing is done;
(4) calculating the difference value between the left edge and the right edge of the crop row, comparing the difference value with a preset crop row width threshold value, if the difference value is greater than the crop row width threshold value, retaining the information of the left edge and the right edge of the crop row, and otherwise, rejecting the information of the left edge and the right edge of the crop row;
(5) according to the information of the left side edge and the right side edge of the crop row, the position areas of the crop row at the top edge and the bottom edge of the image can be determined, and 2 crop row position areas which are located at the left side and the right side of the image center line and are closest to the image center line are found at the top edge and the bottom edge of the image respectively by taking the image center line as a boundary, as shown in fig. 6;
(6) according to the positions of the closest crop rows on the left side and the right side of the central line of the image, connecting the left edge point and the right edge point of the crop row on the same side at the bottom edge of the image with the left edge point and the right edge point of the crop row on the top edge of the image respectively by using straight lines to form a crop row area range, as shown in fig. 7;
Specifically, for an image of M×N pixels, M denotes the image width, N the image height, f(i, j) the gray value of the pixel at position (i, j), S(j) the sum of the gray values of the j-th column of pixels, T the average of the column gray sums over the projection area, and R the width threshold of the crop row to be detected. S(j) and T are expressed as:
S(j) = Σ_{i=1}^{h} f(i, j),  T = (1/M) Σ_{j=1}^{M} S(j)
wherein M denotes the image width and h denotes the height of the projection area, i.e. the number of pixel rows projected vertically;
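The quantities S(j) and T reduce to a column-wise sum and its mean; a two-line sketch (assuming `binary` holds the 0/255 gray values of the projection area, as in the text):

```python
import numpy as np

def column_projection(binary: np.ndarray):
    """S(j): gray sum of column j in the projection area; T: mean of the S(j)."""
    s = binary.sum(axis=0).astype(np.float64)   # one value per column, shape (M,)
    t = s.mean()                                # average column gray sum
    return s, t
```

For a 2×2 area with one all-white column, S = [510, 0] and T = 255.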
the position area detection algorithm of the crop row at the bottom edge of the image is as follows:
1. The projection direction is vertically downward. Calculate the column gray sums S(j) (j = 1, 2, …, M) of the projection area (in this embodiment, the lower 1/3 of the image) and the average value T; in fig. 5 the curve is the projection curve and the horizontal line marks the average value. Establish and initialize a two-dimensional array A for storing the number and position information of the crop rows, where A[m][0] stores the left-edge information of the m-th crop row and A[m][1] stores its right-edge information; initialize the temporary variables m = 1, j = 1;
2. Compare the column values with the average from left to right and clip them as described in step (3) above;
3. If S(j) > S(j + 1), then A[m][0] = j marks a crop row left edge; if S(j) < S(j + 1), then A[m][1] = j marks a crop row right edge; if S(j) = S(j + 1), it is not a crop row edge and no processing is done;
4. Calculate the difference between the right and left edges of the crop row, Ds = A[m][1] − A[m][0]; if Ds ≥ R, the crop row position information is valid and is retained; if Ds < R, it is invalid and is deleted;
5. Set m = m + 1, j = j + 1 and repeat steps 2-5; when j = M, stop the search and end the procedure, since j = M indicates that all image columns have been projected from left to right, which is the termination condition for detecting the crop row position areas at the bottom edge of the image;
6. Select in A[m][n] the crop row positions closest to the left and right sides of the image center line at the bottom edge of the image, as shown in fig. 6;
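The clip-and-compare edge search of steps 1-6 can be sketched as follows. For brevity the clipped projection is reduced to a 0/1 indicator, so rising and falling transitions mark the left and right edge points in the source's convention (the edge index is the column just before the transition); this is a simplified reading, not the patent's exact bookkeeping:

```python
import numpy as np

def crop_row_edges(binary: np.ndarray, row_width_threshold: int):
    """Detect (left, right) edge column pairs of crop rows in the bottom-1/3 area."""
    h = binary.shape[0]
    area = binary[2 * h // 3:, :]               # bottom-third projection area
    s = area.sum(axis=0).astype(np.float64)     # column gray sums S(j)
    t = s.mean()                                # average value T
    clipped = np.where(s > t, 1, 0)             # 1 = crop column, 0 = background
    rows, left = [], None
    for j in range(len(clipped) - 1):
        if clipped[j] < clipped[j + 1]:         # rising edge -> left edge at j
            left = j
        elif clipped[j] > clipped[j + 1] and left is not None:
            if j - left >= row_width_threshold: # width filter against noise
                rows.append((left, j))          # falling edge -> right edge at j
            left = None
    return rows
```

A synthetic binary image with one white band in its bottom third yields a single (left, right) pair straddling that band.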
the detection algorithm of the position area of the crop row on the top edge of the image is as follows:
1. The projection direction is vertically upward. Calculate the column gray sums S(j) (j = 1, 2, 3, …, M) of the projection area (in this embodiment, the upper 1/3 of the image) and the average value T; in fig. 5 the curve is the projection curve and the horizontal line marks the average value. Establish and initialize a two-dimensional array A for storing the number and position information of the crop rows, where A[m][0] stores the left-edge information of the m-th crop row and A[m][1] stores its right-edge information; initialize the temporary variables m = 1, j = 1;
2. Compare S(j) with T from left to right and clip the column values as described in step (2) above;
3. if S_j > T, then A[m][0] = j, the left edge of a crop row; if S_j < T, then A[m][1] = j, the right edge of a crop row; if S_j = T, the column is not a crop row edge and no processing is done;
4. calculate the difference between the right and left edges of the crop row, namely Ds = A[m][1] − A[m][0]; if Ds >= R, the crop row position information is valid and is retained; if Ds < R, the crop row position information is invalid and is deleted;
5. set m = m + 1, j = j + 1 and repeat steps (2)–(5); when j > M, stop the search and end the procedure; j > M indicates that all column pixels of the image have been projected from left to right, which serves as the termination condition for detecting the crop row position areas at the top edge of the image;
6. from A[m][n], select the crop row positions closest to the left and right sides of the image center line at the top edge of the image, as shown in FIG. 6;
according to the positions of the crop rows closest to the left and right sides of the image center line, the left and right edge points of each crop row at the bottom edge of the image are connected by straight lines to the left and right edge points of the same crop row at the top edge of the image, forming the crop row area range, as shown in fig. 7.
S05, extracting the characteristic points of the crop rows in the crop row area range by using a vertical projection method based on a moving window, as shown in FIG. 8;
the detection steps of the crop row characteristic points in the crop row area range are as follows:
(1) divide a horizontal strip of a certain height at the top of the image according to the crop row area range; the horizontal strip, together with the straight left and right edges of the crop row area range, forms a window;
(2) with the projection direction vertically downward, scan the window column by column from left to right, and calculate the sum of the gray levels of the pixels in each column and the average gray level of all pixels in the window;
(3) compare the sum of the gray levels of each column of pixels with the average gray value, from left to right within the window; if the column sum is greater than or equal to the average gray value, set it to the average gray value, otherwise set it to 0;
(4) compare the column pixel sum of the j-th column with that of the (j + 1)-th column, and judge the left and right edge points of the crop row according to their relative magnitudes: if the sum of the j-th column is less than that of the (j + 1)-th column, j is a left edge point of the crop row; if the sum of the j-th column is greater than that of the (j + 1)-th column, j is a right edge point of the crop row; if the two sums are equal, no processing is done; here j is a positive integer greater than or equal to 1 and denotes a column of image pixels;
(5) compare the difference between the left and right edge points of the crop row with a preset crop row width threshold; if the difference is greater than the preset threshold, the left and right edge points are crop row boundary feature points; otherwise they are false feature points and are removed;
(6) calculate the midpoint of the left and right edge points of the crop row and take it as the crop row feature point;
(7) move the horizontal strip down by one pixel and repeat the above process, calculating the crop row feature points in the window, until the horizontal strip reaches the bottom of the crop row area;
in particular, for an image of M × N pixels, where M represents the image width and N represents the image height, let f(i, j) denote the gray level of the pixel at position (i, j), S_j the sum of the gray levels of the j-th column of pixels, and T the average gray value of the pixels in the window; the crop row width threshold is set as R; S_j and T are expressed as:

S_j = Σ f(i, j), summed over the h rows of the window,  T = (1 / (x_r − x_l)) Σ_{j = x_l}^{x_r} S_j
in the above formulas, x_l represents the abscissa of the straight left edge of the window, x_r represents the abscissa of the straight right edge of the window, W represents the width of the window area, expressed as W = x_r − x_l, and h represents the height of the window for vertical projection, i.e., the number of pixels in a projected column; the window is bounded by the horizontal strip and the straight left and right edges of the crop row area range, and to simplify calculation, under the condition that h is small, the window is approximated as a rectangle;
the detection algorithm for the crop row feature points within the crop row area range is as follows:
1. divide a horizontal strip of height h at the top of the binary image; the horizontal strip, together with the left and right edges of the crop row region, forms a window;
2. calculate the sum of pixels S_j in each column of the window and the average gray value T; compare S_j with T from left to right within the window: if S_j >= T, then S_j = T; otherwise S_j = 0;
3. if S_j < S_{j+1}, column j indicates a candidate edge point on the left side of a crop row; if S_j > S_{j+1}, column j indicates a candidate edge point on the right side of a crop row; if the distance between the right-side and left-side candidate edge points is greater than the set threshold R, the edge point pair is considered a valid crop row edge; otherwise the points are false edge points and are removed;
4. calculate the midpoints of the left and right edge points of the crop row, and take these points as the crop row feature points;
5. move the horizontal strip down by one pixel and repeat the process until the horizontal strip reaches the bottom of the crop row area.
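The moving-window steps 1–5 above can be sketched as follows (illustrative only; the helper name, the edge-transition convention, and the choice of the strip's vertical midpoint as the feature point's ordinate are assumptions):

```python
import numpy as np

def row_feature_points(binary, x_l, x_r, h=10, width_thresh=5):
    """Slide a strip of height h down the region [x_l, x_r) and return one
    (row, col) feature point per valid left/right edge pair: the midpoint of
    the pair found in the thresholded column projection of the window."""
    points = []
    H = binary.shape[0]
    for top in range(0, H - h + 1):               # strip moves down one pixel
        s = binary[top:top + h, x_l:x_r].sum(axis=0).astype(float)
        t = s.mean()                              # average gray value T
        s = np.where(s >= t, t, 0.0)              # clamp: S_j >= T -> T, else 0
        left = None
        for j in range(len(s) - 1):
            if s[j] < s[j + 1]:                   # rising step: left edge point
                left = j + 1
            elif s[j] > s[j + 1] and left is not None:
                if j - left + 1 >= width_thresh:  # wide enough to be a real row
                    points.append((top + h // 2, x_l + (left + j) // 2))
                left = None                       # falling step: right edge
    return points
```

A run to the window border without a falling step is discarded, matching the requirement that every feature point come from a complete left/right edge pair.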
S06, establishing a crop row center line solving model according to the characteristics of crop rows in the image, and extracting crop row center lines in the crop row region range through an artificial bee colony algorithm, as shown in FIG. 10, wherein a gray solid line represents the crop row center lines, and a gray dotted line is a navigation path;
the method specifically comprises the following steps:
(1) a crop row center line solving model is established according to the characteristics of the crop rows in the image: since farmland crop rows are approximately straight in shape, the line equation can be determined from two crop row feature points in the image, as shown in figure 9. Let V denote the crop row feature point data space obtained in step S05, and let (x_i, y_i) and (x_j, y_j) be two points in V; then the crop row center line equation can be expressed as:

(y − y_i) / (y_j − y_i) = (x − x_i) / (x_j − x_i)
count the number of feature points within distance d of the straight line and use it as the criterion for evaluating the quality of the line; by adjusting (x_i, y_i) and (x_j, y_j), select the line containing the most feature points as the crop row center line at that position, where d takes values in the range (1, 5);
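The point-counting criterion above can be sketched as a fitness function (illustrative; the function name is an assumption, and the point-to-line distance is the standard perpendicular-distance formula):

```python
import math

def line_fitness(p1, p2, points, d=2.0):
    """Number of feature points whose perpendicular distance to the line
    through p1 and p2 is at most d -- the criterion used to rank candidate
    crop-row center lines."""
    (x1, y1), (x2, y2) = p1, p2
    a, b = y2 - y1, x1 - x2               # line in the form a*x + b*y + c = 0
    c = -(a * x1 + b * y1)
    norm = math.hypot(a, b)
    if norm == 0:                         # degenerate case: p1 == p2
        return 0
    return sum(1 for x, y in points if abs(a * x + b * y + c) / norm <= d)
```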
(2) divide the crop row feature points within the crop row area range obtained in step S05 into upper and lower parts, with 1/2 of the image height as the boundary; take the crop row feature points of the upper half as candidate starting points of the crop row center line and those of the lower half as candidate end points of the crop row center line, and establish arrays to store the candidate starting points and candidate end points respectively;
(3) according to the crop row center line solving model, randomly select 1 candidate starting point and 1 candidate end point to form a honey source of the artificial bee colony algorithm, representing a candidate crop row center line. Initialize a number of honey sources to form multiple candidate crop row lines; count the number of feature points within a certain range of each candidate line and use it as the fitness function for evaluating the candidate lines; through repeated searches of the artificial bee colony algorithm, select the candidate line with the maximum fitness as the crop row center line;
the search model of the artificial bee colony algorithm comprises four constituent elements — honey sources, leading bees, following bees and scout bees — and 2 behaviors: recruiting bees and abandoning a honey source; the numbers of leading bees and following bees in the algorithm are equal to the number of honey sources. The basic principle is as follows:
let D be the dimension of the problem to be solved; the position of a honey source represents a potential solution of the problem, and the position of honey source i is expressed as X_i = (x_i1, x_i2, …, x_iD). The mathematical model of the artificial bee colony algorithm is:

x_ij = L_j + rand(0, 1) × (U_j − L_j)    (12)

v_ij = x_ij + φ_ij × (x_ij − x_kj)    (13)

P_i = fit_i / (fit_1 + fit_2 + … + fit_NP)    (14)
in the above formulas, U_j and L_j respectively represent the upper and lower limits of the search space for the j-th dimension of the solution; v_ij represents a new honey source generated in the neighborhood of honey source i during the search phase; φ_ij is a random number in [−1, 1]; k ∈ {1, 2, …, NP}, k ≠ i; P_i represents the probability that a following bee selects the i-th honey source; fit_i is the fitness of the i-th honey source; and NP is the number of solutions. If the quality of a honey source has not improved after limit cycles, the honey source is abandoned, the corresponding leading bee is converted into a scout bee, and the scout bee generates a new honey source according to equation (12). The specific algorithm design for detecting the crop row center line is as follows:
1. use the vertical projection method to obtain the number N of crop rows and their area ranges; if N ≥ 1 (the image contains at least 1 crop row), detect the crop feature points in the strip area; otherwise end the program;
2. divide the crop row feature points into upper and lower parts with 1/2 of the image height as the boundary, and establish two arrays for storing the feature points of the upper half and the lower half of the strip crop row, respectively;
3. initialize the crop row count variable num = 1, the distance threshold d, the number of honey sources m (the numbers of leading bees and following bees are the same as the number of honey sources), the local search threshold limit and the maximum iteration number C; establish the fitness function fit, where fit represents the number of feature points within distance d of a straight line;
4. randomly select 2 points, one from each of the two arrays, to form a honey source; initialize the honey sources using equation (12), each representing a potential solution x_i, and calculate the fitness of all the potential solutions;
5. the leading bees perform a neighborhood search according to equation (13) to generate a new honey source v_i, and the fitness of the new honey source is calculated; if the fitness of v_i is higher than that of x_i, then x_i = v_i; otherwise x_i remains unchanged;
6. calculate the selection probability P_i of x_i according to equation (14); the following bees select honey sources according to P_i and perform a neighborhood search using equation (13) to generate a new solution v_i, whose fitness is calculated; if the fitness of v_i is higher than that of x_i, then x_i = v_i; otherwise x_i remains unchanged;
7. after limit cycles, if the fitness of x_i has not changed, the solution is abandoned, the corresponding leading bee is converted into a scout bee, and a new solution is generated according to equation (12) to replace the current one;
8. save the current optimal solution, and judge whether the maximum iteration number C has been reached; if so, output the optimal result (x_i, y_i), (x_j, y_j); the crop row line equation y = kx + b can then be calculated from the two points, where k = (y_j − y_i) / (x_j − x_i) and b = y_i − k·x_i;
9. if num + 1 > N, all crop row areas have been traversed and the program ends; otherwise num = num + 1 and return to step (4);
in the algorithm, the number of honey sources and the numbers of leading bees and following bees are all m = 30, the local search threshold is limit = 10, the maximum iteration number is C = 50, and the line distance threshold is d = 2.
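Restricted to the honey-source search itself, steps 1–9 can be sketched as follows (illustrative, non-limiting; encoding each honey source as a pair of indices into the candidate start/end arrays, the rounding in the neighborhood move, and the simplified roulette selection are assumptions adapting equations (12)–(14) to this discrete problem):

```python
import random

def abc_centerline(starts, ends, fitness, m=30, limit=10, max_iter=50):
    """Artificial bee colony search over (start, end) index pairs.
    fitness(i, j) scores the line through starts[i] and ends[j], e.g. the
    number of feature points within distance d of it."""
    def rand_solution():                          # eq. (12): random position
        return [random.randrange(len(starts)), random.randrange(len(ends))]

    def neighbor(sol, other):                     # eq. (13): perturb one dim
        j = random.randrange(2)
        hi = (len(starts) if j == 0 else len(ends)) - 1
        phi = random.uniform(-1, 1)
        v = sol[:]
        v[j] = min(hi, max(0, round(sol[j] + phi * (sol[j] - other[j]))))
        return v

    sols = [rand_solution() for _ in range(m)]
    fits = [fitness(*s) for s in sols]
    trials = [0] * m
    best_sol, best_fit = sols[0][:], fits[0]
    for _ in range(max_iter):
        for i in range(m):                        # leading-bee phase
            v = neighbor(sols[i], sols[random.randrange(m)])
            fv = fitness(*v)
            if fv > fits[i]:
                sols[i], fits[i], trials[i] = v, fv, 0
            else:
                trials[i] += 1
        total = sum(fits) or 1
        for i in range(m):                        # following-bee phase, eq. (14)
            if random.random() < fits[i] / total:
                v = neighbor(sols[i], sols[random.randrange(m)])
                fv = fitness(*v)
                if fv > fits[i]:
                    sols[i], fits[i], trials[i] = v, fv, 0
        for s, f in zip(sols, fits):              # remember the global best
            if f > best_fit:
                best_sol, best_fit = s[:], f
        for i in range(m):                        # scout-bee phase
            if trials[i] > limit:                 # abandon stagnant source
                sols[i] = rand_solution()
                fits[i] = fitness(*sols[i])
                trials[i] = 0
    return starts[best_sol[0]], ends[best_sol[1]]
```

Tracking the global best separately ensures the answer survives even if the scout phase later abandons the corresponding honey source.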
And S07, extracting the navigation path. The navigation path equation is calculated from the two crop row lines closest to the image center line, as shown in fig. 10, where the gray dashed line represents the navigation path. Let the farmland crop image be of size M × N, where M represents the image width and N represents the image height; the ordinate of the top edge points of the image is 0 and the ordinate of the bottom edge points is N − 1. The 2 points determining each crop row line are calculated by the artificial bee colony algorithm, and the navigation path equation is calculated by the following specific steps:
(1) let the crop row line on the left side of the image center line be determined by the two points (x_1, y_1) and (x_2, y_2); the intersections of this line with the top and bottom edges of the image are (x_lt, 0) and (x_lb, N − 1), where x_lt and x_lb are obtained by substituting y = 0 and y = N − 1 into the line equation;
(2) let the crop row line on the right side of the image center line be determined by the two points (x_3, y_3) and (x_4, y_4); the intersections of this line with the top and bottom edges of the image are (x_rt, 0) and (x_rb, N − 1), obtained in the same way;
(3) calculate the midpoints between the two crop row lines, ((x_lt + x_rt) / 2, 0) and ((x_lb + x_rb) / 2, N − 1), and solve the navigation path equation using these 2 points.
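Step (3) — the midline of the two crop row lines — can be sketched as follows (illustrative; assumes neither crop row line is horizontal, and the function name is not from the patent):

```python
def navigation_path(left_line, right_line, height):
    """Midline between the two crop-row lines closest to the image center
    line; each line is given as two (x, y) points found by the bee colony
    search. Returns the path's intersections with the top (y = 0) and
    bottom (y = height - 1) image edges."""
    def x_at(line, y):                    # abscissa of the line at ordinate y
        (x1, y1), (x2, y2) = line
        return x1 + (x2 - x1) * (y - y1) / (y2 - y1)
    y_top, y_bot = 0, height - 1
    top_mid = ((x_at(left_line, y_top) + x_at(right_line, y_top)) / 2, y_top)
    bot_mid = ((x_at(left_line, y_bot) + x_at(right_line, y_bot)) / 2, y_bot)
    return top_mid, bot_mid
```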
fig. 11 shows an agricultural mobile robot navigation path extraction device based on an artificial bee colony algorithm under natural lighting conditions, which is provided by the embodiment of the invention and comprises:
an image graying processing module M01 for converting the color crop image into a gray image through the 2CgNor − CrNor − CbNor factor;
the image segmentation module M02 is used for converting the gray-scale image into a binary image by a maximum inter-class variance method;
the crop row region range determining module M03 is used for determining the crop row region range through a vertical projection method according to the binary image;
the crop row characteristic point detection module M04 is used for extracting crop row characteristic points in the crop row region range by a vertical projection method based on a moving window according to the crop row region range;
the crop row center line extraction module M05 is used for establishing a crop row center line solving model according to the characteristics of crop rows in the image, and extracting the crop row center lines in the crop row area range through an artificial bee colony algorithm;
the navigation path determining module M06 is used for determining a navigation path between the two crop row center lines according to the two crop row center lines closest to the image center line;
the modules of the above device correspond one-to-one to the steps of the above method, and the implementation details of the device are not described again in this embodiment.
Compared with the prior art, the navigation path identification method is fast and accurate and has strong adaptability to natural illumination. Graying the crop image with the normalized 2CgNor − CrNor − CbNor factor reduces the influence of illumination changes on navigation path identification; by establishing a crop row center line solving model and using the artificial bee colony algorithm to extract the crop row lines and the navigation line, the accuracy and speed of navigation path recognition are improved.
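The 2CgNor − CrNor − CbNor graying can be sketched as follows (illustrative, non-limiting; the BT.601-style coefficient values and the min–max normalization used here are assumptions, since the patent's exact formulas are not reproduced in this text):

```python
import numpy as np

def gray_2cg_cr_cb(rgb):
    """Illumination-tolerant graying via the 2*CgNor - CrNor - CbNor factor.
    rgb: H x W x 3 array in RGB order. Returns an H x W gray image in [0, 1]
    that is brightest where pixels are most green (crop rows)."""
    r, g, b = (rgb[..., k].astype(float) for k in range(3))
    y = 0.299 * r + 0.587 * g + 0.114 * b        # luminance
    cr = 0.500 * r - 0.419 * g - 0.081 * b       # red-difference chroma
    cb = -0.169 * r - 0.331 * g + 0.500 * b      # blue-difference chroma
    cg = g - y                                   # green signal minus luminance

    def nor(c):                                  # min-max normalize to [0, 1]
        rng = c.max() - c.min()
        return (c - c.min()) / rng if rng else np.zeros_like(c)

    return nor(2 * nor(cg) - nor(cr) - nor(cb))  # emphasize green, rescale
```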
Although the present invention has been described in detail with reference to examples, it should be understood by those skilled in the art that various combinations, modifications and equivalents may be made without departing from the spirit and scope of the invention as defined in the claims.
Claims (4)
1. An agricultural mobile robot navigation path extraction method based on an artificial bee colony algorithm under natural illumination conditions is characterized by comprising the following steps:
s01, collecting crop images; the camera forms an included angle of 60-70 degrees with the horizontal direction, and the vertical height from the ground is about 1.2-1.4 m;
s02, utilizing the 2CgNor-CrNor-CbNor factor to carry out gray processing on the collected crop image, converting the color crop image into a gray image;
s03, performing image segmentation by adopting a maximum inter-class variance method, and converting the gray level image into a binary image;
s04, obtaining edge position information of the crop row at the top edge and the bottom edge of the image by adopting a vertical projection method, and forming a crop row area range by connecting edge points through straight lines;
s05, extracting the characteristic points of the crop rows in the crop row area range by using a vertical projection method based on a moving window;
s06, establishing a crop row center line solving model according to the characteristics of crop rows in the image, and extracting crop row center lines in the crop row area range through an artificial bee colony algorithm; the method specifically comprises the following steps:
(1) establishing a crop row center line solving model according to the characteristics of crop rows in the image, wherein the crop row center line solving model is that the farmland crop rows are represented as approximately straight lines in form, and an equation of the crop row center line solving model can be determined according to two crop row feature points in the image; V is set to represent the crop row feature point data space obtained according to step S05, and (x_i, y_i) and (x_j, y_j) are two points in V; then the crop row center line equation can be expressed as:
counting the number of feature points within the range d from the straight line, which is used as a standard for evaluating the quality of the straight line; by adjusting (x_i, y_i) and (x_j, y_j), selecting the straight line containing the most feature points as the crop row center line at that position, wherein the value range of d is (1, 5);
(2) dividing the crop row feature points within the crop row area range obtained in step S05 into upper and lower parts, with 1/2 of the image height as the boundary, taking the crop row feature points of the upper half as candidate starting points of the crop row center line and those of the lower half as candidate end points of the crop row center line, and establishing arrays to respectively store the candidate starting points and the candidate end points of the crop row center line;
(3) solving a model according to a crop row center line, randomly selecting 1 candidate starting point and 1 candidate end point to form a honey source of an artificial bee colony algorithm, representing a candidate crop row center line, initializing a plurality of honey sources to form a plurality of candidate crop row straight lines, counting the number of feature points within a certain range from the candidate straight lines, taking the feature points as fitness functions for evaluating the goodness and badness of the candidate straight lines, and selecting the candidate straight line with the maximum fitness function as the crop row center line through multiple searches of the artificial bee colony algorithm;
and S07, determining a navigation path between the two crop row central lines according to the two crop row central lines closest to the image central line.
2. The method according to claim 1, wherein the step S02 specifically comprises the steps of:
(1) converting the crop image from RGB color space to YCrCb color space:
(2) constructing a Cg component irrelevant to illumination on the basis of a YCrCb color space, wherein the Cg component corresponds to the difference between a green signal and a brightness signal:
(3) processing Cg according to ITU-R BT.601-6 to obtain formula (4):
(4) respectively carrying out normalization processing on the Cr, Cb and Cg components to obtain the CgNor, CrNor and CbNor components:
(5) using the 2CgNor-CrNor-CbNor factor to perform graying processing on the color crop image.
3. An agricultural mobile robot navigation path extraction device based on an artificial bee colony algorithm under natural illumination conditions, characterized by comprising:
(1) an image graying processing module, which uses the 2CgNor-CrNor-CbNor factor to convert the color crop image into a gray image;
(2) the image segmentation module is used for converting the gray level image into a binary image by a maximum inter-class variance method;
(3) the crop row region range determining module is used for determining the crop row region range through a vertical projection method according to the binary image;
(4) the crop row characteristic point detection module is used for extracting crop row characteristic points in the crop row region range through a vertical projection method based on a moving window according to the crop row region range;
(5) a crop row centerline extraction module to:
establishing a crop row center line solving model according to the characteristics of crop rows in an image, wherein the crop row center line solving model is that farmland crop rows are expressed as approximately straight lines in form, and an equation of the crop row center line solving model can be determined according to two crop row feature points in the image; V is set to represent the crop row feature point data space obtained by the crop row feature point detection module, and (x_i, y_i) and (x_j, y_j) are two points in V; then the crop row center line equation can be expressed as:
counting the number of feature points within the range d from the straight line, using the number as a standard for evaluating the quality of the straight line; by adjusting (x_i, y_i) and (x_j, y_j), selecting the straight line containing the most feature points as the crop row center line at that position, wherein the value range of d is (1, 5);
dividing the crop row feature points within the crop row area range obtained by the crop row feature point detection module into upper and lower parts, with 1/2 of the image height as the boundary, taking the crop row feature points of the upper half as candidate starting points of the crop row center line and those of the lower half as candidate end points of the crop row center line, and establishing arrays to respectively store the candidate starting points and the candidate end points of the crop row center line;
solving the model according to the crop row center line, randomly selecting 1 candidate starting point and 1 candidate end point to form a honey source of the artificial bee colony algorithm, representing a candidate crop row center line, initializing a plurality of honey sources to form a plurality of candidate crop row straight lines, counting the number of characteristic points within a certain range from the candidate straight lines, taking the characteristic points as a fitness function for evaluating the quality of the candidate straight lines, and selecting the candidate straight line with the maximum fitness function as the crop row center line through multiple searches of the artificial bee colony algorithm;
(6) and the navigation path determining module is used for determining a navigation path between the two crop row center lines according to the two crop row center lines closest to the image center line.
4. The apparatus according to claim 3, wherein the image graying processing module specifically includes:
(1) converting the crop image from RGB color space to YCrCb color space:
(2) constructing a Cg component irrelevant to illumination on the basis of a YCrCb color space, wherein the Cg component corresponds to the difference between a green signal and brightness:
(3) Cg is processed according to ITU-R BT.601-6 to yield formula (11):
(4) respectively carrying out normalization processing on the Cr, Cb and Cg components to obtain the CgNor, CrNor and CbNor components:
(5) using the 2CgNor-CrNor-CbNor factor to perform graying processing on the color crop image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611076483.9A CN108133471B (en) | 2016-11-30 | 2016-11-30 | Robot navigation path extraction method and device based on artificial bee colony algorithm |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108133471A CN108133471A (en) | 2018-06-08 |
CN108133471B true CN108133471B (en) | 2021-09-17 |
Family
ID=62387328
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201611076483.9A Active CN108133471B (en) | 2016-11-30 | 2016-11-30 | Robot navigation path extraction method and device based on artificial bee colony algorithm |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108133471B (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107067430B (en) * | 2017-04-13 | 2020-04-21 | 河南理工大学 | Wheat field crop row detection method based on feature point clustering |
CN108901540A (en) * | 2018-06-28 | 2018-11-30 | 重庆邮电大学 | Fruit tree light filling and fruit thinning method based on artificial bee colony fuzzy clustering algorithm |
CN111931789B (en) * | 2020-07-28 | 2024-05-14 | 江苏大学 | Linear crop row extraction method suitable for different illumination, crop density and growth backgrounds |
CN112395984B (en) * | 2020-11-18 | 2022-09-16 | 河南科技大学 | Method for detecting seedling guide line of unmanned agricultural machine |
CN112712534B (en) * | 2021-01-15 | 2023-05-26 | 山东理工大学 | Corn rhizome navigation datum line extraction method based on navigation trend line |
CN113111892B (en) * | 2021-05-12 | 2021-10-22 | 中国科学院地理科学与资源研究所 | Crop planting row extraction method based on unmanned aerial vehicle image |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2131184A1 (en) * | 2008-06-02 | 2009-12-09 | CNH Belgium N.V. | Crop particle discrimination methods and apparatus |
CN101604166A (en) * | 2009-07-10 | 2009-12-16 | 杭州电子科技大学 | A kind of method for planning path for mobile robot based on particle swarm optimization algorithm |
CN101807252A (en) * | 2010-03-24 | 2010-08-18 | 中国农业大学 | Crop row center line extraction method and system |
CN102999757A (en) * | 2012-11-12 | 2013-03-27 | 中国农业大学 | Leading line extracting method |
CN104866820A (en) * | 2015-04-29 | 2015-08-26 | 中国农业大学 | Farm machine navigation line extraction method based on genetic algorithm and device thereof |
Non-Patent Citations (2)
Title |
---|
Mobile robot path planning using artificial bee colony and evolutionary programming; Contreras-Cruz M A et al.; Applied Soft Computing Journal; 2015-02-10; pp. 319-328 *
Application of the artificial bee colony algorithm in mobile robot path planning; Li Zhujuan; Computer Simulation; 2012-12-31; vol. 29, no. 12, pp. 247-250 *
Also Published As
Publication number | Publication date |
---|---|
CN108133471A (en) | 2018-06-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108133471B (en) | Robot navigation path extraction method and device based on artificial bee colony algorithm | |
CN111640157B (en) | Checkerboard corner detection method based on neural network and application thereof | |
Wang et al. | Image segmentation of overlapping leaves based on Chan–Vese model and Sobel operator | |
CN107516077B (en) | Traffic sign information extraction method based on fusion of laser point cloud and image data | |
CN109903331B (en) | Convolutional neural network target detection method based on RGB-D camera | |
CN107492094A (en) | A kind of unmanned plane visible detection method of high voltage line insulator | |
Jin et al. | Corn plant sensing using real‐time stereo vision | |
CN113160192A (en) | Visual sense-based snow pressing vehicle appearance defect detection method and device under complex background | |
CN111915704A (en) | Apple hierarchical identification method based on deep learning | |
CN111753577A (en) | Apple identification and positioning method in automatic picking robot | |
CN108509928A (en) | For Cold region apple jujube garden field pipe operation vision guided navigation path extraction method | |
CN112784869B (en) | Fine-grained image identification method based on attention perception and counterstudy | |
CN112766155A (en) | Deep learning-based mariculture area extraction method | |
CN111724354B (en) | Image processing-based method for measuring wheat ear length and wheat ear number of multiple wheat plants | |
CN113450402B (en) | Navigation center line extraction method for vegetable greenhouse inspection robot | |
CN109190452B (en) | Crop row identification method and device | |
CN116740758A (en) | Bird image recognition method and system for preventing misjudgment | |
Tu et al. | An efficient crop row detection method for agriculture robots | |
CN116977960A (en) | Rice seedling row detection method based on example segmentation | |
Xiang et al. | PhenoStereo: a high-throughput stereo vision system for field-based plant phenotyping-with an application in sorghum stem diameter estimation | |
CN115049689A (en) | Table tennis identification method based on contour detection technology | |
CN117079125A (en) | Kiwi fruit pollination flower identification method based on improved YOLOv5 | |
CN113421301B (en) | Method and system for positioning central area of field crop | |
CN115995017A (en) | Fruit identification and positioning method, device and medium | |
CN115451965A (en) | Binocular vision-based relative heading information detection method for transplanting system of rice transplanter |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||