CN115497067A - Path identification and planning method for nursery patrol intelligent vehicle - Google Patents

Path identification and planning method for nursery patrol intelligent vehicle Download PDF

Info

Publication number
CN115497067A
CN115497067A CN202211334834.7A CN202211334834A CN115497067A CN 115497067 A CN115497067 A CN 115497067A CN 202211334834 A CN202211334834 A CN 202211334834A CN 115497067 A CN115497067 A CN 115497067A
Authority
CN
China
Prior art keywords
nursery
image
gray
value
image data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211334834.7A
Other languages
Chinese (zh)
Inventor
金鑫俊
丁祎
张勇
刘薇
孙勇智
李津蓉
程莉莉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Lover Health Science and Technology Development Co Ltd
Original Assignee
Zhejiang Lover Health Science and Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Lover Health Science and Technology Development Co Ltd filed Critical Zhejiang Lover Health Science and Technology Development Co Ltd
Priority to CN202211334834.7A priority Critical patent/CN115497067A/en
Publication of CN115497067A publication Critical patent/CN115497067A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/004Artificial life, i.e. computing arrangements simulating life
    • G06N3/006Artificial life, i.e. computing arrangements simulating life based on simulated virtual individual or collective life forms, e.g. social simulations or particle swarm optimisation [PSO]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/73Deblurring; Sharpening
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/26Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/30Noise filtering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/36Applying a local operator, i.e. means to operate on image points situated in the vicinity of a given point; Non-linear local filtering operations, e.g. median filtering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/774Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20172Image enhancement details
    • G06T2207/20192Edge enhancement; Edge preservation
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Molecular Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Nonlinear Science (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a path identification and planning method of a nursery patrol intelligent vehicle, which comprises the following steps: acquiring nursery garden image data information in an intelligent vehicle inspection range by using a camera; calculating the gray value of the nursery image information, and optimizing the nursery image data information subjected to gray processing; performing image edge enhancement on the nursery garden image with the optimized edge by using a Sobel operator; and identifying the nursery road environment, identifying the background, the driving road, the nursery stock and sundries to obtain two-dimensional plane image coordinates of the position of the nursery stock, converting the two-dimensional plane image coordinates into three-dimensional coordinates of the inspection intelligent vehicle in a three-dimensional coordinate system, and planning the walking path of the inspection intelligent vehicle by adopting a Monte Carlo particle optimization algorithm. According to the method, the relevant paths are identified in real time through image processing by the aid of data acquisition of the sensors and Deep Learning of Deep Learning, path planning is carried out, intelligent decision making is carried out through a cloud, intelligent management of the nursery is achieved, and nursery stocks can grow better.

Description

Path identification and planning method for nursery patrol intelligent vehicle
Technical Field
The invention belongs to the technical field of agricultural intelligent inspection path planning, and particularly relates to a path identification and planning method for a nursery inspection intelligent vehicle.
Background
The forestry nursery is an important material foundation for effectively restraining desertification diffusion, and the cultivation of nursery seedlings is important. As a large-scale planting area, a forestry nursery needs to be manually inspected frequently, the current nursery management combination information technology degree is low, the management mode is backward, the growth period of seedlings in the nursery is long, the influence of climate change and plant diseases and insect pests is serious, the industry suffers from low productivity and unstable production quality for a long time, and the manual inspection service is heavy and needs a large amount of manpower and material resources. The traditional extensive planting mode is no longer suitable for the current market environment and social development. The industry urgently needs to combine informatization means to optimize a management structure, improve the risk resistance and optimize an industrial structure in response to more and more intense market competition.
Therefore, an inspection intelligent vehicle with autonomous path identification and path planning is urgently needed to replace manual inspection, and by means of intelligence and flexibility of the inspection intelligent vehicle, a large amount of manpower is saved, efficiency is improved, and the development trend of forestry management from extensive to fine informatization is met.
Disclosure of Invention
Aiming at the defects, the invention provides a path identification and planning method of a nursery patrol intelligent vehicle. The method comprises the steps of collecting data by using sensors such as a camera and a laser radar, adopting an autonomous navigation technology, deep Learning, an internet of things technology, instant positioning, map construction and the like, collecting nursery environment data in real time, identifying relevant paths in real time through image processing, planning the paths, and making intelligent decisions through a cloud. In addition, the virtual simulation technology is adopted to perform path optimization simulation on the patrol intelligent vehicle, and the simulation result is used for the maneuver control of the patrol intelligent vehicle, so that the intelligent management of the nursery is realized, and the nursery stock can grow better.
The invention provides the following technical scheme: a path identification and planning method for a nursery patrol intelligent vehicle comprises the following steps:
s1: acquiring nursery garden image data information in an intelligent vehicle inspection range by using a camera;
s2: preprocessing the acquired nursery image information, calculating the gray value of the nursery image information, and optimizing the nursery image data information subjected to gray processing by adopting an edge preserving optimal filtering algorithm;
s3: adopting a Sobel operator to carry out image edge enhancement on the nursery garden image after the edge optimization in the step S2;
s4: and (3) identifying the background, the driving road, the seedlings and the sundries of the nursery road environment of the nursery image data information after the image edge enhancement in the step (3) by adopting a yolov3 algorithm of a deep convolutional neural network to obtain two-dimensional plane image coordinates of the position where the nursery seedlings are located, converting the two-dimensional plane image coordinates into three-dimensional coordinates of the patrol intelligent vehicle in a three-dimensional coordinate system, and planning the walking path of the patrol intelligent vehicle by adopting a Monte Carlo particle optimization algorithm.
Further, the formula for calculating the gray value of the nursery image information in the step S2 is as follows:
Figure BDA0003914336410000021
wherein GARY (i, j) represents a gray value of the pixel (i, j) in the nursery image information, R (i, j) represents a value of a red primary color of the pixel (i, j) in an RGB color channel, B (i, j) represents a value of a blue primary color of the pixel in the RGB color channel, and G (i, j) represents a value of a green primary color of the pixel in the RGB color channel.
Further, in the step S2, an edge preserving optimization filtering algorithm is adopted to optimize the nursery image data information after the gray processing, including the following steps:
s21: adopting the step S2 to calculate the gray level value GARY (i, j) of the nursery image information to uniformly replace the value R (i, j) of the red primary color, the value G (i, j) of the green primary color and the value B (i, j) of the blue primary color in the RGB color channels RGB (R (i, j), G (i, j), B (i, j)) acquired in the step S1, and obtaining new gray level image data RGB (GARY (i, j), GARY (i, j) and GARY (i, j));
s22: constructing the ith image data vector x in the gray-scale image in the step S21 i Gaussian noise vector observation model:
Figure BDA0003914336410000022
wherein v is i Representing a noiseless wavelet coefficient of the ith image, and n i I.i.d Gaussian noise representing the independent and uniform distribution of the ith image;
s23: a vector V of non-noisy data in the wavelet domain is constructed and a noisy output vector corresponding to the non-noisy data in the wavelet domain is calculated
Figure BDA0003914336410000023
With image data vectors in the input grey-scale imageMinimum mean square error f between X MSE
V=[v 0 ,v 1 ,...,v N-1 ] T ,X=[x 0 ,x 1 ,...,x N-1 ] T
Figure BDA0003914336410000024
Figure BDA0003914336410000025
Where N is the size of the sub-band, v i Are the wavelet coefficients of a noise-free image,
Figure BDA0003914336410000026
is the threshold wavelet coefficient of the noisy image;
s24: constructing a noise standard deviation sigma calculation model of absolute deviation, and constructing a calculation model of an image data denoising threshold thr (sigma) in the gray-scale image according to the calculated noise standard deviation of the absolute deviation:
Figure BDA0003914336410000031
s25: judging the minimum mean square error f calculated in the step S23 MSE The relation between the image data denoising threshold thr (sigma) in the gray-scale image constructed in the step S24, and then the ith image data vector x in the gray-scale image is output i Filtering denoised value of
Figure BDA0003914336410000032
Figure BDA0003914336410000033
S26: image data x in the grayscale image output by N-1 i Filtering denoised value of
Figure BDA0003914336410000034
And forming the optimized nursery image data information after gray processing.
Further, the noise standard deviation σ calculation model of the absolute deviation constructed in the step S24 is as follows:
Figure BDA0003914336410000035
wherein the content of the first and second substances,
Figure BDA0003914336410000036
for calculating said noisy output vector
Figure BDA0003914336410000037
The median value of (a).
Further, in the step S3, a Sobel operator is adopted to perform image edge enhancement on the nursery image after the edge is optimized in the step S2, and the method includes the following steps:
s31: constructing a horizontal convolution kernel model G of pixel neighborhood pixel calculation of the gray scale image x And a vertical convolution kernel G y Model:
G x =[f(i+1,j-1)+2f(i+1,j)+f(i+1,j+1)]-[f(i-1,j-1)+2f(i-1,j)+f(i+1,j+1)];
G y =[f(i-1,j-1)+2f(i,j+1)+f(i+1,j+1)]-[f(i-1,j-1)+2f(i-1,j)+f(i+1,j-1)];
s32: constructing the horizontal convolution kernel model G in the step S31 x And the vertical convolution kernel G y The models are converted into a horizontal gradient matrix form and a vertical gradient matrix form which are subjected to convolution calculation with a matrix A formed by the nursery image data information after the gray processing optimized in the step S2, and the horizontal gradient G of the nursery image data after the gray processing optimized in the step S2 at the pixel (i, j) is obtained x And a vertical gradient G y
Figure BDA0003914336410000041
Figure BDA0003914336410000042
S33: the horizontal gradient G of the optimized gray-scale processed nursery image data at the pixel (i, j) is calculated according to the step S32 x And a vertical gradient G y Calculating the actual gradient G of the optimized gray-scale processed nursery image data calculated in the step S32 at the pixel (i, j) g
Figure BDA0003914336410000043
S34: judging the actual gradient G calculated in the step S33 g Whether the value is larger than the edge point determination threshold G thr If yes, judging the pixel (i, j) as an edge point; otherwise, eliminating the pixel, and repeating the steps S31-S33 to complete the image edge enhancement of the nursery image data after the gray processing optimized in the step S2.
Further, the edge point determination threshold G in the step S34 thr The calculation formula of (a) is as follows:
G thr =0.65∑ ij G g
further, the step S4 of planning the traveling route of the intelligent inspection vehicle by using a monte carlo particle optimization algorithm includes the following steps:
s41: constructing a t-time ith particle for converting nursery seedlings into the patrol intelligent vehicle under a three-dimensional coordinate system
Figure BDA0003914336410000044
Initial position calculation model:
Figure BDA0003914336410000045
wherein H isIth particle at time t-1
Figure BDA0003914336410000046
Moving to the ith particle at time t
Figure BDA0003914336410000047
A walking steering angle matrix of an initial position, wherein L is the ith particle at the t-1 moment
Figure BDA0003914336410000048
Moving to the ith particle at time t
Figure BDA0003914336410000049
The travel distance of the initial position, i is the ith particle at the moment of t-1
Figure BDA00039143364100000410
Moving to the ith particle at time t
Figure BDA00039143364100000411
The walking distance L is subject to normal distribution U (0,L), and L is the walking distance of each walk of the nursery patrol intelligent vehicle;
Figure BDA0003914336410000051
let the ith particle at time t
Figure BDA0003914336410000052
Inertial weight factor of
Figure BDA0003914336410000053
S42: when t =1, the ith particle is calculated
Figure BDA0003914336410000054
Initial position deviation value d H
Figure BDA0003914336410000055
S43: performing optimization iteration on the positions of the M particles in the time slot T, and judging the Housdov distance of the ith particle in the M particles
Figure BDA0003914336410000056
Whether the particle size is smaller than the ith particle calculated in the step S42
Figure BDA0003914336410000057
Initial position deviation value d H If less than, the prior probability density is used
Figure BDA0003914336410000058
Will be provided with
Figure BDA0003914336410000059
As the ith particle initial position at the t-th time; otherwise to measure density
Figure BDA00039143364100000510
Setting the initial position of the ith particle at the t-th time as the observed value z of the ith particle at the t-th time t
Figure BDA00039143364100000511
Wherein m is t Is the target position of the ith particle;
s44: outputting optimized target positions of M particles in time slot T
Figure BDA00039143364100000512
According to the optimized target position
Figure BDA00039143364100000513
And the walking path is used as the walking path of the intelligent patrol vehicle at the moment t.
Further, the density is measured in the step S43
Figure BDA00039143364100000514
The calculation formula of (c) is as follows:
Figure BDA00039143364100000515
wherein σ r Is observed value z of ith particle at the t-th time t The ith particle at time t
Figure BDA00039143364100000516
The covariance between the initial positions is,
Figure BDA00039143364100000517
to return to the ith particle at time t
Figure BDA00039143364100000518
Probability density of initial position.
Further, the distance of the ith particle in the step S43 is Housdov
Figure BDA00039143364100000519
The calculation formula of (a) is as follows:
Figure BDA00039143364100000520
wherein A is a first finite point set comprising a plurality of M particles, B is a second finite point set comprising a plurality of M particles, A = { a = 1 ,a 2 ,...,a p },B={b 1 ,b 2 ,...,b q },1<i<p≤M,1<i<q≤M,
Figure BDA0003914336410000061
Is the maximum distance from the ith particle in the first finite point set a to the nearest point of the ith particle in the second finite point set B,
Figure BDA0003914336410000062
Figure BDA0003914336410000063
is the maximum distance from the ith particle in the second finite point set B to the closest point of the ith particle in the first finite point set a,
Figure BDA0003914336410000064
further, in the step S43, the target positions of the M particles after optimization in the time slot T
Figure BDA0003914336410000065
The calculation formula of (c) is as follows:
Figure BDA0003914336410000066
the invention has the beneficial effects that:
1. according to the path identification and planning method of the intelligent nursery patrol inspection vehicle, provided by the invention, the edge-preserving optimized filtering algorithm is adopted, the nursery image data information after gray processing is optimized, the image detail characteristics can be kept as much as possible, the noise of the target image is inhibited, the influence of the noise is effectively removed, and the edge detail information in the nursery image is kept.
2. According to the method, the camera is adopted to collect the nursery image data information in the intelligent vehicle inspection range in real time, the gray processing, edge information retention and de-noising, image edge enhancement and Deep Learning yolov3 algorithm of the Deep convolutional neural network of the collected image data are used for identifying and distinguishing the targets of nursery stocks and non-nursery stocks in the steps S1-S3, and finally the Monte Carlo particle optimization algorithm is adopted to plan the inspection path of the inspection intelligent vehicle, so that the defect of poor robustness caused by the fact that the data are measured and calculated only by a sensor and are easily affected by outdoor factors such as weather, illumination and the like is overcome, the accuracy of basic data for training reinforcement Learning and the universality of a data set are improved, and the robustness of path identification and planning is enhanced.
3. According to the method, the Sobel operator is adopted to carry out image edge enhancement on the nursery image with the optimized edge, the situation that the image edge is fuzzy possibly caused by image filtering is avoided, and the Sobel operator is adopted to carry out image edge enhancement on the nursery image with the optimized edge, so that useful information can be enhanced by edge enhancement processing, and useless information is inhibited, the visual effect of the image is improved, and the distinguishability of the effective components of the image is improved.
4. Before the Monte Carlo particle optimization algorithm is used for planning the walking path of the patrol intelligent vehicle, after the distinction and identification of nursery seedlings and non-seedlings identified by the yolov3 algorithm of the deep convolutional neural network are carried out, the situation that a local planning navigation line obtained only based on the yolov3 visual algorithm is formed by fitting each navigation point in an image and takes a pixel coordinate system as a reference coordinate point is limited in the actual application process is avoided, the two-dimensional plane image coordinate of the position of the nursery seedlings is obtained and is converted into the three-dimensional coordinate of the patrol intelligent vehicle in the three-dimensional coordinate system, and further the three-dimensional moving coordinate of the patrol intelligent vehicle planned by the Monte Carlo particle optimization algorithm can be subjected to real-time path control and adjustment.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the embodiments of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
The invention will be described in more detail hereinafter on the basis of embodiments and with reference to the drawings. Wherein:
fig. 1 is a schematic flow chart of a path identification and planning method of a nursery patrol inspection intelligent vehicle provided by the invention;
fig. 2 is a schematic flow chart of optimizing the nursery image data information subjected to gray scale processing in step S2 in the method provided by the present invention;
fig. 3 is a schematic flow chart of image edge enhancement performed on the nursery image after edge optimization by using a Sobel operator in step S3 in the method provided by the present invention;
FIG. 4 is a schematic diagram of three-dimensional coordinates of a three-dimensional coordinate system of the intelligent inspection vehicle converted by the method provided by the invention;
fig. 5 is a schematic flow chart of planning a routing inspection path of the smart vehicle by adopting a monte carlo particle optimization algorithm in the step S4 in the method provided by the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
At present, the nursery garden in China is roughly selected in open areas with the characteristics of leeward and sunny exposure, good drainage, higher terrain and flat terrain, and is close to a transportation hub which is convenient for transportation of materials such as railways, roads or water paths. Meanwhile, the land has to meet the conditions of proper weather conditions such as proper sunlight intensity, no invasion of cold wind and the like, moderate soil fertility requirements, proximity to large water bodies such as rivers, lakes and ponds, reservoirs and the like. The nursery garden intelligent patrol vehicle works outdoors, and is more complex than an indoor environment, and the sensor measuring and calculating data is easily influenced by outdoor factors such as weather, illumination and the like, so that the robustness of a common identification algorithm is poor. Therefore, a rich data set containing more factors as much as possible needs to be acquired, and the anti-interference performance of the image algorithm is enhanced in a mode of training a model.
As shown in fig. 1, the path identification and planning method for the nursery patrol inspection intelligent vehicle provided by the invention comprises the following steps:
s1: acquiring nursery garden image data information in an intelligent vehicle inspection range by using a camera;
s2: preprocessing the acquired nursery image information, calculating the gray value of the nursery image information, and optimizing the nursery image data information subjected to gray processing by adopting an edge preserving optimal filtering algorithm;
s3: performing image edge enhancement on the nursery garden image with the edge optimized in the step S2 by using a Sobel operator;
s4: and (3) identifying the background, the driving road, the seedlings and the sundries of the nursery road environment of the nursery image data information after the image edge enhancement in the step (3) by adopting a yolov3 algorithm of a deep convolutional neural network to obtain two-dimensional plane image coordinates of the position of the nursery seedlings, converting the two-dimensional plane image coordinates into three-dimensional coordinates of the patrol intelligent vehicle under a three-dimensional coordinate system, and planning the walking path of the patrol intelligent vehicle by adopting a Monte Carlo particle optimization algorithm.
The method provided by the invention has the advantages that when the data set is collected in the step S1, the inspection vehicle is manually and remotely controlled to walk to simulate the normal operation state of the inspection vehicle, the resolution of an image collected by the camera is 1920 pixels x 1080 pixels, and the collection frame rate is 70f/S. In order to ensure the richness of the road data set, data acquisition is carried out under the changes of various environmental characteristics, weather and light rays at different times. And (3) randomly selecting l0000 images from the acquired image data to manufacture a nursery garden road data set, wherein 8000 images are used for a training set, 1000 images are used for a verification set, and 1000 images are used for a test set.
Due to the possible imperfect conditions of a camera imaging system, a transmission mode, a storage device and the like, pollution caused by various noises is inevitable in the process of acquiring and converting images, so that the quality of the images is reduced to a certain extent, and the acquired images are different from the original images, so that the processing result is influenced. Aiming at stronger time-varying property of outdoor environment, and simultaneously, carrying out gray level processing on the RGB image in order to reduce data processing amount and save memory. When the numbers of R, G, B channels are equal, the color image presents a gray color, wherein the value of each pixel color in the gray image is the gray value, and the range is 0-255. Since the gray value of each pixel in the gray image only needs one byte, the data amount is greatly reduced. Therefore, as a preferred embodiment of the present invention, in the step S2, the color image is processed into a gray-scale image by using a weighted average method, and the calculation formula for calculating the gray-scale value of the nursery image information in the step S2 is as follows:
Figure BDA0003914336410000081
wherein GARY (i, j) represents a gray value of the pixel (i, j) in the nursery image information, R (i, j) represents a value of a red primary color of the pixel (i, j) in the RGB color channel, B (i, j) represents a value of a blue primary color of the pixel in the RGB color channel, and G (i, j) represents a value of a green primary color of the pixel in the RGB color channel.
However, since the grayscale map cannot effectively remove the influence of noise, it is necessary to suppress noise of the target image while retaining the detail features of the image as much as possible. The traditional median filtering and local averaging method is easy to influence the fuzzy edge of the image. Therefore, a more suitable edge-preserving optimized filtering algorithm is adopted, and therefore, as another preferred embodiment of the present invention, as shown in fig. 2, the step S2 of optimizing the nursery image data information after performing the gray processing by using the edge-preserving optimized filtering algorithm comprises the following steps:
s21: calculating a gray level value GARY (i, j) of the nursery image information by adopting the step S2 to uniformly replace the value R (i, j) of the red primary color, the value G (i, j) of the green primary color and the value B (i, j) of the blue primary color in RGB color channels RGB (R (i, j), G (i, j), B (i, j)) acquired in the step S1, so as to obtain new gray level image data RGB (GARY (i, j), GARY (i, j) and GARY (i, j));
s22: constructing the ith image data vector x in the gray-scale image in the step S21 i Gaussian noise vector observation model:
Figure BDA0003914336410000091
wherein v is i Representing a noiseless wavelet coefficient of the ith image, and n i I.i.d Gaussian noise which represents that the ith image is independent and uniformly distributed;
s23: a vector V of non-noisy data in the wavelet domain is constructed and a noisy output vector corresponding to the non-noisy data in the wavelet domain is calculated
Figure BDA0003914336410000092
And the transmissionMinimum mean square error f between image data vectors X in an incoming grayscale image MSE
V=[v 0 ,v 1 ,...,v N-1 ] T ,X=[x 0 ,x 1 ,...,x N-1 ] T
Figure BDA0003914336410000093
Figure BDA0003914336410000094
Where N is the size of the sub-band, v i Are the wavelet coefficients of a noise-free image,
Figure BDA0003914336410000095
is a threshold wavelet coefficient of a noisy image;
s24: constructing a noise standard deviation sigma calculation model of absolute deviation, and constructing a calculation model of an image data denoising threshold thr (sigma) in a gray image according to the calculated noise standard deviation of the absolute deviation:
Figure BDA0003914336410000096
s25: the minimum mean square error f calculated in the step S23 is judged MSE And the relation between the image data denoising threshold thr (sigma) in the gray-scale image constructed in the step S24, and then the ith image data vector x in the gray-scale image is output i Filtering denoised value of
Figure BDA0003914336410000097
Figure BDA0003914336410000101
S26: image data x in a grayscale image output by N-1 i Filtering denoised value of
Figure BDA0003914336410000102
And forming the optimized nursery image data information after gray processing.
According to the method, the noise of the target image is eliminated, and the detail characteristics are kept as much as possible.
Further preferably, the noise standard deviation σ calculation model of the absolute deviation constructed in the step S24 is as follows:
Figure BDA0003914336410000103
wherein the content of the first and second substances,
Figure BDA0003914336410000104
to calculate a noisy output vector
Figure BDA0003914336410000105
The median value of (a).
Image edge blurring may occur through image filtering, and edge enhancement processing is needed to enhance useful information and inhibit useless information, so that the visual effect of an image is improved, and the resolution of effective components of the image is improved. The invention adopts Sobel operator to enhance the image edge. Therefore, as another preferred embodiment of the present invention, as shown in fig. 3, the image edge enhancement of the nursery image after the step S2 of optimizing the edge by using Sobel operator in the step S3 includes the following steps:
s31: construction of gray-scale image pixel neighborhood pixel calculation horizontal convolution kernel model G x And a vertical convolution kernel G y Model:
G x =[f(i+1,j-1)+2f(i+1,j)+f(i+1,j+1)]-[f(i-1,j-1)+2f(i-1,j)+f(i+1,j+1)];
G y =[f(i-1,j-1)+2f(i,j+1)+f(i+1,j+1)]-[f(i-1,j-1)+2f(i-1,j)+f(i+1,j-1)];
s32: constructing a horizontal convolution kernel model G in the step S31 x And a vertical convolution kernel G y The models are converted into moments formed by the nursery image data information after the gray level processing optimized in the step S2The matrix A is in a horizontal gradient matrix form and a vertical gradient matrix form which are subjected to convolution calculation, and the horizontal gradient G of the nursery garden image data at the pixel (i, j) after the gray level processing optimized in the step S2 is obtained x And a vertical gradient G y
Figure BDA0003914336410000111
Figure BDA0003914336410000112
Horizontal gradient G x Vertical gradient G y And S2, all matrixes A formed by the optimized nursery image data information subjected to gray level processing in the step S are 3 x 3 matrixes;
s33: according to the horizontal gradient G of the optimized gray-scale processed nursery image data calculated in the step S32 at the pixel (i, j) x And a vertical gradient G y Calculating the actual gradient G of the optimized gray-scale processed nursery image data at pixel (i, j) calculated in step S32 g
Figure BDA0003914336410000113
S34: judging the actual gradient G calculated in the step S33 g Whether the value is larger than the edge point determination threshold G thr If yes, judging the pixel (i, j) as an edge point; otherwise, the pixel is eliminated, and the steps S31-S33 are repeated to finish the image edge enhancement of the nursery image data after the gray level processing optimized in the step S2.
Further preferably, the edge point determination threshold G in step S34 thr The calculation formula of (a) is as follows:
G thr =0.65∑ ij G g
as another preferred embodiment of the present invention, in step S4, the yolov3 algorithm of the deep convolutional neural network is used to identify the nursery road environment, identify the background, the driving road, the seedlings and the sundries, and the semantic segmentation labeling tool labelme is used to manually label the road environment data set, so as to label different categories with different colors. According to the characteristics of the nursery environment and the autonomous operation requirement of the inspection intelligent vehicle, objects in the environment can be divided into 4 categories with small correlation, as shown in table 1, the categories are as follows: background, driving roads, seedlings and sundries.
TABLE 1
Figure BDA0003914336410000114
The electric poles and the indication boards, as well as the weeds and equipment and other interferents are usually arranged among the seedlings, and some seedlings and non-seedling objects with certain common characteristics are difficult to rapidly identify by using the traditional method in the prior art, such as target unstructured, the weeds are similar to the leaves in color, and the weeds are difficult to distinguish by using a color segmentation method. Meanwhile, some traditional methods are greatly influenced under different illumination, for example, sunlight illumination positions are different in one day, shading at different angles of tree shade is frequently changed, and the like. The deep convolutional neural network yolov3 algorithm adopted by the invention can carry out semantic segmentation on the complex image, and the processing efficiency of the image algorithm is greatly improved.
Modern forestry nursery planting nursery stock is mostly the seedling, and the trunk is shorter than adult trees, for making the illumination that receives more sufficient, the crown struts the angle degree big. For the top advantage that convenient management and sapling have and root system growth distribution to consider, same type nursery stock is mostly parallel planting, and the width of row and planting density between the trees can satisfy general machinery condition of traveling. The same type of nursery stock is planted at regular intervals along a straight line, and the height of the tree is slightly different from the shape of the crown of the tree. In other words, the trunk in the semi-structured environment can be used as a reference object for positioning, which is very beneficial for the detection of the driving path and the generation of the route.
The method utilizes YOLOv3 trained by the data set to filter telegraph poles and indication boards among seedlings, weeds, equipment and other interferents, identifies some seedling targets and non-seedling targets with certain common characteristics, and outputs data coordinates of a boundary box.
Meanwhile, a semantic segmentation model suitable for the complex road environment is built by using a convolutional neural network, and a Mask-R-CNN instance segmentation algorithm is used.
And (3) performing a reference line fitting algorithm on the seedling rows on two sides by using the reference points of the seedling boundary frame identified by yolov 3. The algorithm respectively judges whether the number of the reference points of the nursery stocks on the two sides meets at least three-point fitting standards of a least square method, namely, a straight line can be fitted by three coordinate points and more than three coordinate points. Because the fruit trees in the nursery are regularly planted, each row is as straight as possible, and the interval between the rows is large, the actual use requirement can be approximately met. If the number of the extractable nursery stock reference points is less than three when the nursery stock is lost, connecting the two nearest reference points as nursery stock row reference lines.
As shown in fig. 4, the coordinates of the two-dimensional plane image of the position of the nursery stock obtained after the processing are converted into the three-dimensional coordinates of the inspection intelligent vehicle in the three-dimensional coordinate system, the locally planned navigation line obtained by the visual algorithm is formed by fitting each navigation point in the image, and the coordinate points are based on the pixel coordinate system, so that the coordinate conversion is needed due to the limitation in the actual application process.
Converting the world coordinates of the navigation point in the world coordinates into coordinates in the camera coordinate system by constructing a conversion matrix model of the world coordinate system and the camera coordinate system as follows:
Figure BDA0003914336410000121
wherein, the world coordinate of the navigation point in the world coordinate system is (X, Y, Z), and the coordinate of the navigation point in the camera coordinate system is (X) c ,Y c ,Z c )
One point P in real space can be converted from a matrix to pixel coordinates.
Figure BDA0003914336410000131
Z C As depth value of camera, X u 、Y v Coordinates of points in the image; f is the focal length of the camera, u 0 ,v 0 Is a principal point coordinate relative to the imaging plane, whose value is about half of the image pixel; r is a 3 × 3 rotation matrix; t is a translation vector of 3 × 1; x, Y, Z is the coordinate value of the actual point in the world coordinate system.
As another preferred embodiment of the present invention, as shown in fig. 5, the step S4 adopts a monte carlo particle optimization algorithm to plan the walking route of the inspection intelligent vehicle, and includes the following steps:
s41: constructing a t-time ith particle for converting nursery seedlings into a three-dimensional coordinate system of the intelligent patrol car
Figure BDA0003914336410000132
Initial position calculation model:
Figure BDA0003914336410000133
wherein H is the ith particle at the time of t-1
Figure BDA0003914336410000134
Moving to the ith particle at time t
Figure BDA0003914336410000135
A walking steering angle matrix of an initial position, wherein L is the ith particle at the t-1 moment
Figure BDA0003914336410000136
Moving to the ith particle at time t
Figure BDA0003914336410000137
The travel distance of the initial position, i is the ith particle at the moment of t-1
Figure BDA0003914336410000138
Moving to the ith time of tParticles
Figure BDA0003914336410000139
The walking distance L is subject to normal distribution U (0,L), and L is the walking distance of each walk of the intelligent nursery garden patrol car;
Figure BDA00039143364100001310
let the ith particle at time t
Figure BDA00039143364100001311
By an inertial weight factor of
Figure BDA00039143364100001312
S42: when t =1, the ith particle is calculated
Figure BDA00039143364100001313
Initial position deviation value d H
Figure BDA00039143364100001314
S43: performing optimization iteration on the positions of the M particles in the time slot T, and judging the Housdov distance of the ith particle in the M particles
Figure BDA00039143364100001315
Whether the number of particles is less than the ith particle calculated in the step S42
Figure BDA00039143364100001316
Initial position deviation value d H If less than, the prior probability density is used
Figure BDA0003914336410000141
Will be provided with
Figure BDA0003914336410000142
As the ith particle initial position at the t-th time; otherwise to measure density
Figure BDA0003914336410000143
Setting the initial position of the ith particle at the t-th time as the observed value z of the ith particle at the t-th time t
Figure BDA0003914336410000144
Wherein m is t Is the target position of the ith particle;
s44: outputting optimized target positions of M particles in time slot T
Figure BDA0003914336410000145
According to the optimized target position
Figure BDA0003914336410000146
And the walking path of the patrol intelligent vehicle is used as t time.
The particles represent the optimal walking points at the time T, namely the optimal walking point at the time T of the ith particle is connected to the optimal walking point at the time T +1 of the ith particle to form an optimal walking path in the time slot T. M particles show that the intelligent vehicle of patrolling and examining can be in any one position of patrolling and examining the within range at the moment of t.
Further preferably, the density is measured in the step S43
Figure BDA0003914336410000147
The calculation formula of (a) is as follows:
Figure BDA0003914336410000148
wherein σ r Is observed value z of ith particle at t time t And the ith particle at time t
Figure BDA0003914336410000149
The covariance between the initial positions is,
Figure BDA00039143364100001410
to return to the ith particle at time t
Figure BDA00039143364100001411
Probability density of initial position.
Further preferably, the distance of the ith particle in the step S43 is Housdov
Figure BDA00039143364100001412
The calculation formula of (a) is as follows:
Figure BDA00039143364100001413
wherein A is a first finite point set comprising a plurality of M particles, B is a second finite point set comprising a plurality of M particles, A = { a = 1 ,a 2 ,...,a p },B={b 1 ,b 2 ,...,b q },1<i<p≤M,1<i<q≤M,
Figure BDA00039143364100001414
Is the maximum distance from the ith particle in the first finite point set a to the nearest point of the ith particle in the second finite point set B,
Figure BDA00039143364100001415
Figure BDA00039143364100001416
is the maximum distance from the ith particle in the second finite point set B to the closest point of the ith particle in the first finite point set a,
Figure BDA0003914336410000151
||a i -b i i is the ith particle a in the first finite point set i With the ith particle b in the second finite point set i Euclidean distance between them, and in the same way, | | b i -a i I is the ith particle b in the second finite point set i With the ith particle a in the first finite point set i In betweenThe euclidean distance.
Further preferably, the target positions of the M particles after optimization in the time slot T in the step S43
Figure BDA0003914336410000152
The calculation formula of (a) is as follows:
Figure BDA0003914336410000153
in general, in feature matching based on classical particle filtering, when the weight becomes large, we can control to participate in calculating the measurement density
Figure BDA0003914336410000154
As observed value z of the ith particle t Weight of the particles of the estimated position
Figure BDA0003914336410000155
And passes the Hausdorff distance of the ith particle
Figure BDA0003914336410000156
And the ith particle as a threshold
Figure BDA0003914336410000157
Initial position deviation value d H Comparing, once the h-Housdov distance of the ith particle
Figure BDA0003914336410000158
The value becomes too large to control. This will result in filter divergence. Due to the ith particle as a threshold
Figure BDA0003914336410000159
Initial position deviation value d H Is the maximum distance between each particle and the target and can therefore be controlled. Thus, by defining the Hausdorff distance for the ith particle
Figure BDA00039143364100001510
The maximum value is in a reasonable range, and we can avoid the Housdov distance of the ith particle
Figure BDA00039143364100001511
The calculation result is inaccurate due to large data dispersion caused by the increase of the size.
It should be noted that the above-mentioned numbers of the embodiments of the present invention are merely for description, and do not represent the merits of the embodiments. And the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, apparatus, article, or method that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, apparatus, article, or method. Without further limitation, an element defined by the phrase "comprising a … …" does not exclude the presence of another identical element in a process, apparatus, article, or method that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.
While the invention has been described with reference to a preferred embodiment, various changes may be made and equivalents may be substituted for elements thereof without departing from the scope of the invention. In particular, the technical features mentioned in the embodiments can be combined in any manner as long as there is no technical solution conflict. It is intended that the invention not be limited to the particular embodiments disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.

Claims (10)

1. A path identification and planning method for a nursery patrol intelligent vehicle is characterized by comprising the following steps:
s1: collecting nursery image data information in an intelligent vehicle inspection range by using a camera;
s2: preprocessing the acquired nursery image information, calculating the gray value of the nursery image information, and optimizing the nursery image data information subjected to gray processing by adopting an edge preserving optimal filtering algorithm;
s3: performing image edge enhancement on the nursery garden image with the edge optimized in the step S2 by adopting a Sobel operator;
s4: and (3) identifying the background, the driving road, the seedlings and the sundries of the nursery road environment of the nursery image data information after the image edge enhancement in the step (3) by adopting a yolov3 algorithm of a deep convolutional neural network to obtain two-dimensional plane image coordinates of the position where the nursery seedlings are located, converting the two-dimensional plane image coordinates into three-dimensional coordinates of the patrol intelligent vehicle in a three-dimensional coordinate system, and planning the walking path of the patrol intelligent vehicle by adopting a Monte Carlo particle optimization algorithm.
2. The method for path identification and planning of a nursery inspection intelligent vehicle according to claim 1, wherein the formula for calculating the gray-scale value of the nursery image information in the step S2 is as follows:
Figure FDA0003914336400000011
wherein GARY (i, j) represents a gray value of the pixel (i, j) in the nursery image information, R (i, j) represents a value of a red primary color of the pixel (i, j) in an RGB color channel, B (i, j) represents a value of a blue primary color of the pixel in the RGB color channel, and G (i, j) represents a value of a green primary color of the pixel in the RGB color channel.
3. The method for path identification and planning of a nursery inspection intelligent vehicle according to claim 1, wherein in the step S2, an edge preserving optimization filter algorithm is adopted to optimize the nursery image data information after gray processing, and the method comprises the following steps:
s21: adopting the step S2 to calculate the gray level value GARY (i, j) of the nursery image information to uniformly replace the value R (i, j) of the red primary color, the value G (i, j) of the green primary color and the value B (i, j) of the blue primary color in the RGB color channels RGB (R (i, j), G (i, j), B (i, j)) acquired in the step S1, and obtaining new gray level image data RGB (GARY (i, j), GARY (i, j) and GARY (i, j));
s22: constructing the ith image data vector x in the gray-scale image in the step S21 i Gaussian noise vector observation model:
Figure FDA0003914336400000012
i =0,1,2, · N; wherein v is i Representing a noiseless wavelet coefficient of the ith image, and n i I.i.d Gaussian noise which represents that the ith image is independent and uniformly distributed;
s23: a vector V of non-noisy data in the wavelet domain is constructed and a noisy output vector corresponding to the non-noisy data in the wavelet domain is calculated
Figure FDA00039143364000000211
Minimum mean square error f between the input gray scale image and image data vector X MSE
V=[v 0 ,v 1 ,...,v N-1 ] T ,X=[x 0 ,x 1 ,...,x N-1 ] T
Figure FDA0003914336400000021
Figure FDA0003914336400000022
Where N is the size of the sub-band, v i Are the wavelet coefficients of a noise-free image,
Figure FDA0003914336400000023
is a threshold wavelet coefficient of a noisy image;
s24: constructing a noise standard deviation sigma calculation model of absolute deviation, and constructing a calculation model of an image data denoising threshold thr (sigma) in the gray image according to the calculated noise standard deviation of the absolute deviation:
Figure FDA0003914336400000024
s25: judging the minimum mean square error f calculated in the step S23 MSE The relation between the image data denoising threshold thr (sigma) in the gray-scale image constructed in the step S24, and then the ith image data vector x in the gray-scale image is output i Filtering denoised value of
Figure FDA0003914336400000025
Figure FDA0003914336400000026
S26: image data x in the grayscale image output by N-1 i Filtering denoised value of
Figure FDA0003914336400000027
And forming the optimized nursery image data information after gray processing.
4. The method for path identification and planning of a nursery patrol inspection intelligent vehicle according to claim 3, wherein the noise standard deviation σ calculation model of absolute deviation constructed in the step S24 is as follows:
Figure FDA0003914336400000028
wherein the content of the first and second substances,
Figure FDA0003914336400000029
for calculating said noisy output vector
Figure FDA00039143364000000210
The median value of (a).
5. The path identification and planning method for the nursery inspection intelligent vehicle according to claim 1, wherein in the step S3, the image edge enhancement is performed on the nursery image after the edge optimization in the step S2 by using a Sobel operator, and the method comprises the following steps:
s31: constructing a horizontal convolution kernel model G of pixel neighborhood pixel calculation of the gray scale image x And a vertical convolution kernel G y Model:
G x =[f(i+1,j-1)+2f(i+1,j)+f(i+1,j+1)]-[f(i-1,j-1)+2f(i-1,j)+f(i+1,j+1)];
G y =[f(i-1,j-1)+2f(i,j+1)+f(i+1,j+1)]-[f(i-1,j-1)+2f(i-1,j)+f(i+1,j-1)];
s32: constructing the horizontal convolution kernel model G in the step S31 x And the vertical convolution kernel G y The models are converted into a horizontal gradient matrix form and a vertical gradient matrix form which are subjected to convolution calculation with a matrix A formed by the nursery image data information after the gray level processing optimized in the step S2, and the horizontal gradient matrix form and the vertical gradient matrix form are obtainedS2, the horizontal gradient G of the nursery image data subjected to gray level processing after optimization in the step (i, j) is formed at the pixel (i, j) x And a vertical gradient G y
Figure FDA0003914336400000031
Figure FDA0003914336400000032
S33: the horizontal gradient G of the optimized gray-scale processed nursery image data at the pixel (i, j) is calculated according to the step S32 x And a vertical gradient G y Calculating the actual gradient G of the optimized gray-scale processed nursery image data calculated in step S32 at pixel (i, j) g
Figure FDA0003914336400000033
S34: judging the actual gradient G calculated in the step S33 g Whether the value is larger than the edge point determination threshold G thr If yes, judging the pixel (i, j) as an edge point; otherwise, eliminating the pixel, and repeating the steps S31-S33 to complete the image edge enhancement of the nursery image data after the gray processing optimized in the step S2.
6. The method for path identification and planning of a nursery patrol inspection intelligent vehicle according to claim 5, wherein the threshold value G for edge point identification in the step S34 thr The calculation formula of (c) is as follows:
G thr =0.65∑ ij G g
7. the method for identifying and planning the path of the intelligent nursery inspection vehicle according to claim 1, wherein a Monte Carlo particle optimization algorithm is adopted in the S4 step to plan the walking route of the intelligent nursery inspection vehicle, and the method comprises the following steps:
s41: constructing a t-time ith particle for converting nursery seedlings into the patrol intelligent vehicle under a three-dimensional coordinate system
Figure FDA0003914336400000041
Initial position calculation model:
Figure FDA0003914336400000042
wherein H is the ith particle at the time of t-1
Figure FDA0003914336400000043
Moving to the ith particle at time t
Figure FDA0003914336400000044
A walking steering angle matrix of an initial position, wherein L is the ith particle at the t-1 moment
Figure FDA0003914336400000045
Moving to the ith particle at time t
Figure FDA0003914336400000046
The travel distance of the initial position, i is the ith particle at the time of t-1
Figure FDA0003914336400000047
Moving to the ith particle at time t
Figure FDA0003914336400000048
The walking distance L is subject to normal distribution U (0,L), and L is the walking distance of each walk of the nursery patrol intelligent vehicle;
Figure FDA0003914336400000049
let the ith particle at time t
Figure FDA00039143364000000410
By an inertial weight factor of
Figure FDA00039143364000000411
S42: when t =1, the i-th particle is calculated
Figure FDA00039143364000000412
Initial position deviation value d H
Figure FDA00039143364000000413
S43: performing optimization iteration on the positions of the M particles in the time slot T, and judging the Housdov distance of the ith particle in the M particles
Figure FDA00039143364000000414
Whether the particle size is smaller than the ith particle calculated in the step S42
Figure FDA00039143364000000415
Initial position deviation value d H If less than, the prior probability density is used
Figure FDA00039143364000000416
Will be provided with
Figure FDA00039143364000000417
As the ith particle initial position at the t-th time; otherwise to measure density
Figure FDA00039143364000000422
Setting the initial position of the ith particle at the t-th time as the observed value z of the ith particle at the t-th time t
Figure FDA00039143364000000419
Wherein m is t Is the target position of the ith particle;
s45: outputting optimized target positions of M particles in time slot T
Figure FDA00039143364000000420
According to the optimized target position
Figure FDA00039143364000000421
And the walking path is used as the walking path of the intelligent patrol vehicle at the moment t.
8. The method for path identification and planning for a nursery inspection intelligent vehicle according to claim 7, wherein the density measurement in step S43
Figure FDA0003914336400000051
The calculation formula of (a) is as follows:
Figure FDA0003914336400000052
wherein σ r Is the observed value z of the ith particle at the t-th time t The ith particle at time t
Figure FDA0003914336400000053
The covariance between the initial positions is,
Figure FDA0003914336400000054
to return to the ith particle at time t
Figure FDA0003914336400000055
Probability density of initial position.
9. A nursery routing inspection according to claim 7The method for identifying and planning the path of the intelligent vehicle is characterized in that the Hausdorff distance of the ith particle in the step S43
Figure FDA0003914336400000056
The calculation formula of (a) is as follows:
Figure FDA0003914336400000057
wherein, A is a first finite point set containing a plurality of M particles, B is a second finite point set containing a plurality of M particles, A = { a = 1 ,a 2 ,...,a p },B={b 1 ,b 2 ,...,b q },1<i<p≤M,1<i<q≤M,
Figure FDA0003914336400000058
Is the maximum distance from the ith particle in the first finite point set a to the nearest point of the ith particle in the second finite point set B,
Figure FDA0003914336400000059
Figure FDA00039143364000000510
is the maximum distance from the ith particle in the second set of finite points B to the closest point of the ith particle in the first set of finite points a,
Figure FDA00039143364000000511
10. the method for path identification and planning for a nursery inspection intelligent vehicle according to claim 7, wherein the target positions of the M particles after optimization in the time slot T in the S43 step
Figure FDA00039143364000000512
The calculation formula of (c) is as follows:
Figure FDA00039143364000000513
CN202211334834.7A 2022-10-28 2022-10-28 Path identification and planning method for nursery patrol intelligent vehicle Pending CN115497067A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211334834.7A CN115497067A (en) 2022-10-28 2022-10-28 Path identification and planning method for nursery patrol intelligent vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211334834.7A CN115497067A (en) 2022-10-28 2022-10-28 Path identification and planning method for nursery patrol intelligent vehicle

Publications (1)

Publication Number Publication Date
CN115497067A true CN115497067A (en) 2022-12-20

Family

ID=85115182

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211334834.7A Pending CN115497067A (en) 2022-10-28 2022-10-28 Path identification and planning method for nursery patrol intelligent vehicle

Country Status (1)

Country Link
CN (1) CN115497067A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115826655A (en) * 2023-02-16 2023-03-21 山西农业大学 Nursery stock fertilization control system based on machine vision
CN117146832A (en) * 2023-10-31 2023-12-01 北京佳格天地科技有限公司 Agricultural machinery automatic driving control method and system integrating wireless communication and artificial intelligence

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115826655A (en) * 2023-02-16 2023-03-21 山西农业大学 Nursery stock fertilization control system based on machine vision
CN117146832A (en) * 2023-10-31 2023-12-01 北京佳格天地科技有限公司 Agricultural machinery automatic driving control method and system integrating wireless communication and artificial intelligence
CN117146832B (en) * 2023-10-31 2024-01-02 北京佳格天地科技有限公司 Agricultural machinery automatic driving control method and system integrating wireless communication and artificial intelligence

Similar Documents

Publication Publication Date Title
CN115497067A (en) Path identification and planning method for nursery patrol intelligent vehicle
Malambo et al. Automated detection and measurement of individual sorghum panicles using density-based clustering of terrestrial lidar data
CN109886155B (en) Single-plant rice detection and positioning method, system, equipment and medium based on deep learning
CN110427818A (en) The deep learning satellite data cloud detection method of optic that high-spectral data is supported
CN114239756B (en) Insect pest detection method and system
CN111160127A (en) Remote sensing image processing and detecting method based on deep convolutional neural network model
CN112285710A (en) Multi-source remote sensing reservoir water storage capacity estimation method and device
CN111723711A (en) Plianes and object-oriented mulching film information extraction method and system
CN115880487A (en) Forest laser point cloud branch and leaf separation method based on deep learning method
CN115063437B (en) Mangrove canopy visible light image index feature analysis method and system
CN114998728A (en) Method and system for predicting cotton leaf area index by multi-source remote sensing of unmanned aerial vehicle
CN112434569A (en) Thermal imaging system of unmanned aerial vehicle
CN117392627A (en) Corn row line extraction and plant missing position detection method
CN113569772A (en) Remote sensing image farmland instance mask extraction method, system, equipment and storage medium
CN115294562B (en) Intelligent sensing method for operation environment of plant protection robot
CN116739739A (en) Loan amount evaluation method and device, electronic equipment and storage medium
CN117011694A (en) Forest tree growth parameter prediction method based on cascade circulation network
CN106204596A (en) A kind of panchromatic wave-band remote sensing image cloud detection method of optic estimated with fuzzy hybrid based on Gauss curve fitting function
CN113537140B (en) Weed detection method based on deep neural network
CN115879817A (en) Regional carbon reduction amount evaluation method and device, electronic equipment and storage medium
CN118279737A (en) Weed detection method based on YOLOX deep learning
CN117994443B (en) Garden design method based on landscape garden simulation
CN111768039B (en) Animal home domain estimation method based on active learning
CN116052141B (en) Crop growth period identification method, device, equipment and medium
CN117011702A (en) Tea garden identification method and device based on satellite images

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination