CN115497067A - Path identification and planning method for nursery patrol intelligent vehicle - Google Patents
- Publication number
- CN115497067A (application number CN202211334834.7A)
- Authority
- CN
- China
- Prior art keywords
- nursery
- image
- gray
- value
- image data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06V20/588—Recognition of the road, e.g. of lane markings; recognition of the vehicle driving pattern in relation to the road
- G06N3/006—Artificial life based on simulated virtual individual or collective life forms, e.g. particle swarm optimisation [PSO]
- G06N3/08—Learning methods (neural networks)
- G06T5/73—Deblurring; sharpening
- G06V10/26—Segmentation of patterns in the image field
- G06V10/30—Noise filtering
- G06V10/36—Applying a local operator, e.g. median filtering
- G06V10/44—Local feature extraction, e.g. edge, contour, or corner detection
- G06V10/774—Generating sets of training patterns
- G06V10/82—Recognition using neural networks
- G06T2207/20081—Training; learning
- G06T2207/20084—Artificial neural networks [ANN]
- G06T2207/20192—Edge enhancement; edge preservation
- Y02T10/40—Engine management systems
Abstract
The invention provides a path identification and planning method for a nursery patrol intelligent vehicle, which comprises the following steps: acquiring nursery image data within the intelligent vehicle's inspection range with a camera; calculating the gray values of the nursery image information and optimizing the gray-processed nursery image data; performing image edge enhancement on the edge-optimized nursery image with the Sobel operator; and identifying the nursery road environment (background, driving road, nursery stock, and sundries) to obtain the two-dimensional plane image coordinates of the nursery stock, converting them into three-dimensional coordinates of the patrol intelligent vehicle in a three-dimensional coordinate system, and planning the vehicle's walking path with a Monte Carlo particle optimization algorithm. Through sensor data acquisition and deep learning, the method identifies the relevant paths in real time by image processing, plans the path, and makes intelligent decisions through the cloud, realizing intelligent management of the nursery so that nursery stock can grow better.
Description
Technical Field
The invention belongs to the technical field of agricultural intelligent inspection path planning, and particularly relates to a path identification and planning method for a nursery inspection intelligent vehicle.
Background
The forestry nursery is an important material foundation for curbing the spread of desertification, and the cultivation of nursery seedlings is important. As a large planting area, a forestry nursery needs frequent manual inspection. At present, nursery management makes little use of information technology, and the management mode is backward; the growth period of seedlings in the nursery is long and is seriously affected by climate change and by plant diseases and insect pests, so the industry has long suffered from low productivity and unstable production quality, while manual inspection is heavy work that requires a large amount of manpower and material resources. The traditional extensive planting mode is no longer suited to the current market environment and social development. To respond to increasingly fierce market competition, the industry urgently needs informatization to optimize its management structure, improve its risk resistance, and optimize its industrial structure.
Therefore, an inspection intelligent vehicle with autonomous path identification and path planning is urgently needed to replace manual inspection. By virtue of the vehicle's intelligence and flexibility, a large amount of manpower is saved, efficiency is improved, and the development of forestry management from extensive practice toward fine, informatized practice is supported.
Disclosure of Invention
To address these defects, the invention provides a path identification and planning method for a nursery patrol intelligent vehicle. The method collects data with sensors such as a camera and a laser radar, and employs autonomous navigation, deep learning, Internet of Things technology, and simultaneous localization and mapping to collect nursery environment data in real time, identify the relevant paths in real time through image processing, plan the paths, and make intelligent decisions through the cloud. In addition, virtual simulation is used to optimize the patrol intelligent vehicle's path, and the simulation result is used for the vehicle's maneuver control, realizing intelligent management of the nursery so that nursery stock can grow better.
The invention provides the following technical scheme: a path identification and planning method for a nursery patrol intelligent vehicle comprises the following steps:
s1: acquiring nursery garden image data information in an intelligent vehicle inspection range by using a camera;
s2: preprocessing the acquired nursery image information, calculating the gray value of the nursery image information, and optimizing the gray-processed nursery image data information with an edge-preserving optimized filtering algorithm;
s3: adopting a Sobel operator to carry out image edge enhancement on the nursery garden image after the edge optimization in the step S2;
s4: identifying, with the yolov3 algorithm of a deep convolutional neural network, the background, driving road, seedlings, and sundries in the nursery road environment of the edge-enhanced nursery image data of step S3 to obtain the two-dimensional plane image coordinates of the nursery seedlings, converting those coordinates into three-dimensional coordinates of the patrol intelligent vehicle in a three-dimensional coordinate system, and planning the walking path of the patrol intelligent vehicle with a Monte Carlo particle optimization algorithm.
Further, the gray value of the nursery image information in step S2 is calculated by the weighted average formula:
GARY(i,j) = 0.299 × R(i,j) + 0.587 × G(i,j) + 0.114 × B(i,j)
wherein GARY(i,j) represents the gray value of pixel (i,j) in the nursery image information, R(i,j) represents the red primary-color value of pixel (i,j) in the RGB color channels, G(i,j) the green primary-color value, and B(i,j) the blue primary-color value.
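For illustration only, the grayscale step can be sketched in Python; the 0.299/0.587/0.114 luminance weights are the conventional weighted-average coefficients and are an assumption of this sketch, since the patent's exact formula is given in its figures:

```python
import numpy as np

def to_gray(rgb: np.ndarray) -> np.ndarray:
    """Weighted-average grayscale conversion of an H x W x 3 RGB image.

    The 0.299/0.587/0.114 weights are the conventional luminance
    coefficients (assumed here; the patent's figure holds the exact formula).
    """
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    gray = 0.299 * r + 0.587 * g + 0.114 * b
    return np.clip(gray, 0, 255).astype(np.uint8)
```

Each pixel of the resulting gray image occupies a single byte, which is the memory saving noted later in the description.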
Further, optimizing the gray-processed nursery image data information in step S2 with the edge-preserving optimized filtering algorithm comprises the following steps:
s21: use the gray value GARY(i,j) of the nursery image information calculated in step S2 to uniformly replace the red primary-color value R(i,j), the green primary-color value G(i,j), and the blue primary-color value B(i,j) in the RGB color channels RGB(R(i,j), G(i,j), B(i,j)) acquired in step S1, obtaining new gray image data RGB(GARY(i,j), GARY(i,j), GARY(i,j));
s22: construct, for the ith image data vector x_i in the gray image of step S21, the Gaussian-noise observation model:
x_i = v_i + n_i
wherein v_i represents the noise-free wavelet coefficient of the ith image datum and n_i represents the independent, identically distributed (i.i.d.) Gaussian noise of the ith image datum;
s23: construct the vector V of noise-free data in the wavelet domain and calculate the minimum mean square error f_MSE between the thresholded noisy output vector V̂ and the noise-free wavelet coefficients:
f_MSE = (1/N) Σ_{i=1}^{N} (v̂_i - v_i)²
where N is the size of the sub-band, v_i is a wavelet coefficient of the noise-free image, and v̂_i is the corresponding thresholded wavelet coefficient of the noisy image;
s24: construct the absolute-deviation noise standard deviation σ calculation model and, from the calculated noise standard deviation, construct the calculation model of the denoising threshold thr(σ) for the image data in the gray image:
thr(σ) = σ √(2 ln N);
s25: judge the relation between the minimum mean square error f_MSE calculated in step S23 and the image-data denoising threshold thr(σ) constructed in step S24, and then output the filtered, denoised value x̂_i of the ith image data vector x_i in the gray image;
s26: the filtered, denoised values x̂_i (i = 0, 1, ..., N-1) of the image data x_i in the gray image form the optimized gray-processed nursery image data information.
Further, the absolute-deviation noise standard deviation σ calculation model constructed in step S24 is as follows:
σ = Median(|v̂_i|) / 0.6745
wherein Median(·) calculates the median value of the noisy output vector V̂.
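A minimal sketch of the wavelet-domain denoising rule of steps S22-S26, assuming the standard median-absolute-deviation estimator, the universal threshold thr(σ) = σ√(2 ln N), and soft thresholding (the patent's exact thresholding rule may differ):

```python
import numpy as np

def mad_sigma(coeffs: np.ndarray) -> float:
    """Noise standard deviation from the median absolute deviation of the
    (assumed finest-scale) wavelet coefficients: sigma = median(|c|) / 0.6745."""
    return float(np.median(np.abs(coeffs)) / 0.6745)

def universal_threshold(coeffs: np.ndarray) -> float:
    """Denoising threshold thr(sigma) = sigma * sqrt(2 ln N), N = subband size."""
    return mad_sigma(coeffs) * float(np.sqrt(2.0 * np.log(coeffs.size)))

def soft_threshold(coeffs: np.ndarray, thr: float) -> np.ndarray:
    """Shrink coefficients toward zero: small (noise-dominated) coefficients
    vanish while large (edge-carrying) coefficients survive, which is how
    edge detail is preserved while noise is suppressed."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - thr, 0.0)
```

Applying `soft_threshold` with `universal_threshold` to each wavelet sub-band and inverting the transform yields the denoised gray image.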
Further, performing image edge enhancement with the Sobel operator in step S3 on the edge-optimized nursery image of step S2 comprises the following steps:
s31: construct the horizontal convolution kernel model G_x and the vertical convolution kernel model G_y for the pixel-neighborhood calculation of the gray image:
G_x = [f(i+1,j-1) + 2f(i+1,j) + f(i+1,j+1)] - [f(i-1,j-1) + 2f(i-1,j) + f(i-1,j+1)];
G_y = [f(i-1,j+1) + 2f(i,j+1) + f(i+1,j+1)] - [f(i-1,j-1) + 2f(i,j-1) + f(i+1,j-1)];
s32: convert the horizontal convolution kernel model G_x and the vertical convolution kernel model G_y constructed in step S31 into horizontal- and vertical-gradient matrix forms, and convolve them with the matrix A formed by the optimized gray-processed nursery image data of step S2, obtaining the horizontal gradient G_x and the vertical gradient G_y of that image data at pixel (i,j):
G_x = [[-1, -2, -1], [0, 0, 0], [+1, +2, +1]] * A;  G_y = [[-1, 0, +1], [-2, 0, +2], [-1, 0, +1]] * A
(where * denotes two-dimensional convolution);
s33: from the horizontal gradient G_x and the vertical gradient G_y at pixel (i,j) calculated in step S32, calculate the actual gradient G_g of the optimized gray-processed nursery image data at pixel (i,j):
G_g = √(G_x² + G_y²);
S34: judging the actual gradient G calculated in the step S33 g Whether the value is larger than the edge point determination threshold G thr If yes, judging the pixel (i, j) as an edge point; otherwise, eliminating the pixel, and repeating the steps S31-S33 to complete the image edge enhancement of the nursery image data after the gray processing optimized in the step S2.
Further, the edge point determination threshold G in the step S34 thr The calculation formula of (a) is as follows:
G thr =0.65∑ i ∑ j G g 。
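Steps S31-S34 can be sketched as follows; the wrap-around border handling and the use of 0.65 times the mean gradient magnitude as G_thr are simplifying assumptions of this sketch:

```python
import numpy as np

# Sobel kernels in the document's (i, j) = (row, column) convention:
# G_x differences along rows, G_y along columns.
SOBEL_X = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], dtype=float)
SOBEL_Y = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)

def sobel_edges(gray: np.ndarray, ratio: float = 0.65) -> np.ndarray:
    """Boolean edge map: gradient magnitude above ratio * mean magnitude."""
    g = gray.astype(float)
    gx = np.zeros_like(g)
    gy = np.zeros_like(g)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            # shifted[r, c] == g[r + di, c + dj] (borders wrap for simplicity)
            shifted = np.roll(np.roll(g, -di, axis=0), -dj, axis=1)
            gx += SOBEL_X[di + 1, dj + 1] * shifted
            gy += SOBEL_Y[di + 1, dj + 1] * shifted
    mag = np.hypot(gx, gy)           # actual gradient G_g = sqrt(Gx^2 + Gy^2)
    return mag > ratio * mag.mean()  # edge-point determination threshold
```

On a sharp vertical intensity step, only the columns adjacent to the step survive the threshold, which is the edge-enhancement behavior described above.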
Further, planning the walking path of the patrol intelligent vehicle with the Monte Carlo particle optimization algorithm in step S4 comprises the following steps:
s41: construct the initial-position calculation model of the ith particle at time t after the nursery seedlings are converted into the three-dimensional coordinate system of the patrol intelligent vehicle,
wherein H is the walking steering-angle matrix by which the ith particle at time t-1 moves to the initial position of the ith particle at time t, l is the travel distance by which the ith particle at time t-1 moves to the initial position of the ith particle at time t, l obeys the uniform distribution U(0, L), and L is the walking distance of each walk of the nursery patrol intelligent vehicle;
s43: perform optimization iteration on the positions of the M particles within time slot T, and judge whether the Hausdorff distance of the ith particle among the M particles is smaller than the initial-position deviation value d_H of the ith particle calculated in step S42; if it is smaller, take that position, with the prior probability density, as the initial position of the ith particle at time t; otherwise, with the measurement density, set the initial position of the ith particle at time t to the observed value z_t of the ith particle at time t,
wherein m_t is the target position of the ith particle;
s44: output the optimized target positions of the M particles within time slot T, and take the optimized target positions as the walking path of the patrol intelligent vehicle at time t,
wherein σ_r is the covariance between the observed value z_t of the ith particle at time t and the initial position of the ith particle at time t, and the measurement density is the probability density of returning to the initial position of the ith particle at time t.
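As an illustrative sketch of the particle propagation in step S41, with the steering-angle matrix H simplified to a random planar heading and the travel distance drawn as l ~ U(0, L):

```python
import numpy as np

def propagate(x_prev: np.ndarray, L: float, rng: np.random.Generator) -> np.ndarray:
    """Move M planar particles one step: each travels a distance l ~ U(0, L)
    at a random heading, a simplified stand-in for the steering-angle matrix H.

    x_prev: M x 2 array of particle positions at time t-1.
    """
    m = x_prev.shape[0]
    l = rng.uniform(0.0, L, size=m)              # per-particle travel distance
    theta = rng.uniform(-np.pi, np.pi, size=m)   # per-particle steering angle
    step = np.stack([l * np.cos(theta), l * np.sin(theta)], axis=1)
    return x_prev + step
```

No particle can move farther than L per step, mirroring the bounded walking distance of the patrol vehicle.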
Further, the Hausdorff distance d_H of the ith particle in step S43 is calculated as follows:
d_H(A, B) = max( h(A, B), h(B, A) )
wherein A is a first finite point set of the M particles and B is a second finite point set of the M particles, A = {a_1, a_2, ..., a_p}, B = {b_1, b_2, ..., b_q}, 1 < i < p ≤ M, 1 < i < q ≤ M, h(A, B) is the maximum distance from the ith particle in the first finite point set A to the closest point of the second finite point set B, and h(B, A) is the maximum distance from the ith particle in the second finite point set B to the closest point of the first finite point set A.
Further, in step S43, the optimized target positions of the M particles within time slot T are calculated by the corresponding calculation model.
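The symmetric Hausdorff distance used as the gating quantity in step S43 can be sketched with its standard definition d_H(A, B) = max(h(A, B), h(B, A)), matching the directed distances described above:

```python
import numpy as np

def hausdorff(a: np.ndarray, b: np.ndarray) -> float:
    """Symmetric Hausdorff distance between finite point sets A (p x d)
    and B (q x d)."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)  # p x q pairwise
    h_ab = d.min(axis=1).max()  # farthest A-point from its nearest B-point
    h_ba = d.min(axis=0).max()  # farthest B-point from its nearest A-point
    return float(max(h_ab, h_ba))
```

Comparing this value against the deviation value d_H decides whether a particle keeps its prior-propagated position or is reset to the observation z_t.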
The invention has the following beneficial effects:
1. The edge-preserving optimized filtering algorithm optimizes the gray-processed nursery image data: it preserves image detail features as far as possible while suppressing noise in the target image, effectively removing the influence of noise and retaining the edge detail information in the nursery image.
2. The camera collects nursery image data within the intelligent vehicle's inspection range in real time; gray processing, edge-preserving denoising, and image edge enhancement (steps S1-S3) are followed by the yolov3 algorithm of the deep convolutional neural network to identify and distinguish nursery-stock and non-nursery-stock targets, and finally the Monte Carlo particle optimization algorithm plans the inspection path. This overcomes the poor robustness of relying on sensor measurement alone, which is easily affected by outdoor factors such as weather and illumination; it improves the accuracy of the basic training data and the universality of the data set, and strengthens the robustness of path identification and planning.
3. The Sobel operator performs image edge enhancement on the edge-optimized nursery image, avoiding the edge blurring that image filtering may cause; the edge-enhancement processing strengthens useful information and suppresses useless information, improving the visual effect of the image and the distinguishability of its effective components.
4. Before the Monte Carlo particle optimization algorithm plans the walking path, the yolov3 algorithm of the deep convolutional neural network distinguishes and identifies nursery stock and non-stock targets. This avoids the limitation of a local planning navigation line obtained from the yolov3 visual algorithm alone, which is fitted from the navigation points in the image with the pixel coordinate system as the reference. The two-dimensional plane image coordinates of the nursery stock are obtained and converted into three-dimensional coordinates of the patrol intelligent vehicle in the three-dimensional coordinate system, so that the three-dimensional movement coordinates planned by the Monte Carlo particle optimization algorithm can be controlled and adjusted in real time.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the embodiments of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
The invention will be described in more detail hereinafter on the basis of embodiments and with reference to the drawings. Wherein:
fig. 1 is a schematic flow chart of a path identification and planning method of a nursery patrol inspection intelligent vehicle provided by the invention;
fig. 2 is a schematic flow chart of optimizing the nursery image data information subjected to gray scale processing in step S2 in the method provided by the present invention;
fig. 3 is a schematic flow chart of image edge enhancement performed on the nursery image after edge optimization by using a Sobel operator in step S3 in the method provided by the present invention;
FIG. 4 is a schematic diagram of three-dimensional coordinates of a three-dimensional coordinate system of the intelligent inspection vehicle converted by the method provided by the invention;
fig. 5 is a schematic flow chart of planning a routing inspection path of the smart vehicle by adopting a monte carlo particle optimization algorithm in the step S4 in the method provided by the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
At present, nursery gardens in China are generally sited in open areas that are leeward and sunny, well drained, and relatively high and flat, close to transport hubs such as railways, roads, or waterways that are convenient for transporting materials. The site must also offer suitable climatic conditions such as adequate sunlight and no cold-wind intrusion, moderately fertile soil, and proximity to large water bodies such as rivers, lakes, ponds, and reservoirs. The nursery patrol intelligent vehicle works outdoors, an environment more complex than an indoor one; the sensor measurement data are easily affected by outdoor factors such as weather and illumination, so the robustness of a common identification algorithm is poor. Therefore, a rich data set covering as many such factors as possible needs to be acquired, and the anti-interference capability of the image algorithm is strengthened by training a model.
As shown in fig. 1, the path identification and planning method for the nursery patrol inspection intelligent vehicle provided by the invention comprises the following steps:
s1: acquiring nursery garden image data information in an intelligent vehicle inspection range by using a camera;
s2: preprocessing the acquired nursery image information, calculating the gray value of the nursery image information, and optimizing the nursery image data information subjected to gray processing by adopting an edge preserving optimal filtering algorithm;
s3: performing image edge enhancement on the nursery garden image with the edge optimized in the step S2 by using a Sobel operator;
s4: and (3) identifying the background, the driving road, the seedlings and the sundries of the nursery road environment of the nursery image data information after the image edge enhancement in the step (3) by adopting a yolov3 algorithm of a deep convolutional neural network to obtain two-dimensional plane image coordinates of the position of the nursery seedlings, converting the two-dimensional plane image coordinates into three-dimensional coordinates of the patrol intelligent vehicle under a three-dimensional coordinate system, and planning the walking path of the patrol intelligent vehicle by adopting a Monte Carlo particle optimization algorithm.
In the method provided by the invention, when the data set is collected in step S1, the inspection vehicle is remotely controlled to walk so as to simulate its normal operating state; the camera collects images at a resolution of 1920 × 1080 pixels and a frame rate of 70 f/s. To ensure the richness of the road data set, data are collected at different times under varying environmental characteristics, weather, and lighting. From the acquired image data, 10000 images are randomly selected to build the nursery road data set: 8000 images for the training set, 1000 for the validation set, and 1000 for the test set.
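The 8000/1000/1000 split described above can be sketched as follows (the file names are hypothetical placeholders):

```python
import random

def split_dataset(paths, seed=42):
    """Shuffle 10000 image paths and split into train/val/test sets of
    8000/1000/1000, matching the data-set construction described above."""
    paths = list(paths)
    random.Random(seed).shuffle(paths)
    return paths[:8000], paths[8000:9000], paths[9000:10000]
```

A fixed seed keeps the split reproducible across training runs.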
Because of possible imperfections in the camera imaging system, the transmission mode, the storage device and the like, contamination by various noises is inevitable while images are acquired and converted, which degrades image quality to some extent; the acquired image then differs from the original and affects the processing result. Given the strongly time-varying outdoor environment, and in order to reduce the amount of data to be processed and save memory, the RGB image is converted to gray scale. When the values of the R, G and B channels are equal, the color image appears gray; the value of each pixel in the gray-scale image is its gray value, in the range 0-255. Since each pixel of a gray-scale image needs only one byte, the data volume is greatly reduced. Therefore, as a preferred embodiment of the present invention, in the step S2 the color image is converted to a gray-scale image by a weighted average method, and the calculation formula for the gray value of the nursery image information in the step S2 is as follows:
wherein GRAY(i, j) represents the gray value of the pixel (i, j) in the nursery image information, R(i, j) represents the value of the red primary color of the pixel (i, j) in the RGB color channels, B(i, j) the value of its blue primary color, and G(i, j) the value of its green primary color.
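A minimal sketch of the weighted-average gray conversion. The patent's formula image is not reproduced on this page, so the standard 0.299/0.587/0.114 luminance weights used here are our assumption:

```python
import numpy as np

def to_gray(rgb):
    """Weighted-average grayscale conversion of an H x W x 3 RGB array.
    The BT.601-style weights are assumed; the patent's own coefficients
    are in a formula image that is not reproduced here."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.299 * r + 0.587 * g + 0.114 * b

# A pixel with equal R, G and B channels maps to (nearly) that same value,
# matching the text's remark that equal channels appear gray.
gray = to_gray(np.array([[[128, 128, 128]]], dtype=np.float64))
```

One byte per pixel then suffices for storage, as the text notes.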
However, since gray-scale conversion cannot remove the influence of noise, the noise of the target image must be suppressed while retaining as much of the image's detail features as possible. Traditional median filtering and local averaging tend to blur image edges. A more suitable edge-preserving optimized filtering algorithm is therefore adopted. Accordingly, as another preferred embodiment of the present invention, as shown in fig. 2, optimizing the gray-processed nursery image data information with the edge-preserving optimized filtering algorithm in the step S2 comprises the following steps:
s21: using the gray value GRAY(i, j) of the nursery image information calculated in the step S2 to uniformly replace the red value R(i, j), the green value G(i, j) and the blue value B(i, j) in the RGB color channels RGB(R(i, j), G(i, j), B(i, j)) acquired in the step S1, obtaining new gray-scale image data RGB(GRAY(i, j), GRAY(i, j), GRAY(i, j));
s22: constructing a Gaussian-noise observation model of the ith image data vector x_i in the gray-scale image of the step S21:
x_i = v_i + n_i, i = 0, 1, 2, ..., N;
wherein v_i represents the noise-free wavelet coefficient of the ith image datum, and n_i represents independent, identically distributed (i.i.d.) Gaussian noise of the ith image datum;
s23: constructing the noise-free data vector V in the wavelet domain, and calculating the minimum mean square error f_MSE between the thresholded noisy output vector V̂ corresponding to the noise-free data in the wavelet domain and the image data vector X of the input gray-scale image:
f_MSE = (1/N) Σ_{i=0}^{N} (v̂_i − v_i)²;
wherein N is the size of the sub-band, v_i are the wavelet coefficients of the noise-free image, and v̂_i are the thresholded wavelet coefficients of the noisy image;
s24: constructing the absolute-deviation noise standard deviation σ calculation model, and, from the calculated absolute-deviation noise standard deviation, constructing the calculation model of the denoising threshold thr(σ) for the image data in the gray-scale image:
s25: judging the relation between the minimum mean square error f_MSE calculated in the step S23 and the image data denoising threshold thr(σ) in the gray-scale image constructed in the step S24, and then outputting the filtered, denoised value x̂_i of the ith image data vector x_i in the gray-scale image;
S26: the filtered, denoised values x̂_i of the image data vectors x_i output from the gray-scale image together form the optimized, gray-processed nursery image data information.
In this way, the noise of the target image is eliminated while its detail features are preserved as far as possible.
Further preferably, the absolute-deviation noise standard deviation σ calculation model constructed in the step S24 is as follows:
wherein the median value of the noisy output vector is used in the calculation.
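The thresholding core of steps S22-S25 can be sketched as follows. The patent's exact formulas are images, so the classic median-absolute-deviation noise estimate (factor 0.6745), the universal threshold σ·√(2 ln N) and soft thresholding used here are our assumptions about the standard forms being referenced:

```python
import numpy as np

def mad_sigma(coeffs):
    """Robust noise estimate from the median absolute deviation (S24);
    0.6745 is the standard MAD-to-sigma factor for Gaussian noise."""
    return np.median(np.abs(coeffs)) / 0.6745

def universal_threshold(coeffs):
    """thr(sigma) = sigma * sqrt(2 ln N): the classic universal
    denoising threshold over N coefficients."""
    n = coeffs.size
    return mad_sigma(coeffs) * np.sqrt(2.0 * np.log(n))

def soft_threshold(coeffs):
    """Shrink coefficients toward zero (S25): values below the
    threshold are treated as noise and zeroed, larger values are
    kept with reduced magnitude."""
    thr = universal_threshold(coeffs)
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - thr, 0.0)
```

On a set of mostly small coefficients with one large outlier, the small (noise-like) values are zeroed while the large (edge-like) value survives, which is the edge-preserving behavior the text claims.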
Image filtering may blur image edges, so edge enhancement processing is needed to strengthen useful information and suppress useless information, improving the visual effect of the image and the distinguishability of its effective components. The invention adopts the Sobel operator for image edge enhancement. Therefore, as another preferred embodiment of the present invention, as shown in fig. 3, performing image edge enhancement with the Sobel operator in the step S3 on the nursery image whose edges were optimized in the step S2 comprises the following steps:
s31: constructing the horizontal convolution kernel model G_x and the vertical convolution kernel model G_y computed from the pixel neighborhood of the gray-scale image:
G_x = [f(i+1, j-1) + 2f(i+1, j) + f(i+1, j+1)] - [f(i-1, j-1) + 2f(i-1, j) + f(i-1, j+1)];
G_y = [f(i-1, j+1) + 2f(i, j+1) + f(i+1, j+1)] - [f(i-1, j-1) + 2f(i, j-1) + f(i+1, j-1)];
s32: converting the horizontal convolution kernel model G_x and the vertical convolution kernel model G_y constructed in the step S31 into horizontal-gradient and vertical-gradient matrix forms convolved with the matrix A formed from the optimized, gray-processed nursery image data information of the step S2, to obtain the horizontal gradient G_x and the vertical gradient G_y of that image data at the pixel (i, j):
the horizontal gradient G_x, the vertical gradient G_y and every matrix A formed from the optimized, gray-processed nursery image data information of the step S2 are 3 × 3 matrices;
s33: according to the horizontal gradient G_x and the vertical gradient G_y of the optimized, gray-processed nursery image data at the pixel (i, j) calculated in the step S32, calculating the actual gradient G_g of that image data at the pixel (i, j):
S34: judging whether the actual gradient G_g calculated in the step S33 is larger than the edge point determination threshold G_thr; if so, judging the pixel (i, j) to be an edge point; otherwise, eliminating the pixel and repeating the steps S31-S33 to finish the image edge enhancement of the nursery image data after the gray processing optimized in the step S2.
Further preferably, the edge point determination threshold G_thr in the step S34 is calculated as follows:
G_thr = 0.65 Σ_i Σ_j G_g.
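Steps S31-S34 can be sketched as follows. Two details are our assumptions rather than the patent's: the gradient magnitude is taken as √(G_x² + G_y²) (the S33 formula is an image), and the 0.65 factor is applied to the mean gradient rather than the raw double sum so the threshold stays on the scale of a single pixel's gradient:

```python
import numpy as np

def sobel_edges(img):
    """Sobel gradient magnitude and edge mask (sketch of S31-S34)."""
    img = img.astype(np.float64)
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            # Horizontal kernel G_x: row i+1 neighborhood minus row i-1.
            gx[i, j] = (img[i+1, j-1] + 2*img[i+1, j] + img[i+1, j+1]) \
                     - (img[i-1, j-1] + 2*img[i-1, j] + img[i-1, j+1])
            # Vertical kernel G_y: column j+1 neighborhood minus column j-1.
            gy[i, j] = (img[i-1, j+1] + 2*img[i, j+1] + img[i+1, j+1]) \
                     - (img[i-1, j-1] + 2*img[i, j-1] + img[i+1, j-1])
    g = np.sqrt(gx**2 + gy**2)     # actual gradient G_g (S33, assumed form)
    thr = 0.65 * g.mean()          # edge-point threshold (S34, mean assumed)
    return g, g > thr

# A vertical step edge in a small test image triggers the edge mask.
step = np.zeros((5, 5))
step[:, 3:] = 255.0
grad, edges = sobel_edges(step)
```

A production version would vectorize the convolution, but the loop mirrors the per-pixel description in the text.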
As another preferred embodiment of the present invention, in the step S4 the yolov3 algorithm of the deep convolutional neural network is used to identify the nursery road environment, distinguishing the background, the driving road, the seedlings and the sundries, and the semantic segmentation labeling tool labelme is used to manually label the road environment data set, marking different categories with different colors. According to the characteristics of the nursery environment and the autonomous operation requirements of the inspection intelligent vehicle, objects in the environment are divided into 4 weakly correlated categories, as shown in table 1: background, driving roads, seedlings and sundries.
TABLE 1
Utility poles and signboards, as well as weeds, equipment and other interfering objects, are usually present among the seedlings, and some seedling and non-seedling objects share common characteristics that traditional methods struggle to identify quickly: the targets are unstructured, and weeds are so similar in color to leaves that color segmentation can hardly separate them. Meanwhile, some traditional methods are strongly affected by changing illumination; the position of the sunlight varies through the day, and the shading cast by tree shade at different angles changes constantly. The deep convolutional neural network yolov3 algorithm adopted by the invention can perform semantic segmentation on such complex images and greatly improves the processing efficiency of the image algorithm.
Modern forestry nurseries mostly plant saplings, whose trunks are shorter than those of adult trees and whose crowns spread at wide angles so as to receive more light. For convenience of management, and considering the apical dominance of saplings and the growth distribution of their root systems, nursery stock of the same type is mostly planted in parallel rows, with row widths and planting densities that satisfy general machine-travel conditions. Stock of the same type is planted at regular intervals along a straight line, with only slight differences in tree height and crown shape. In other words, the trunks in this semi-structured environment can serve as reference objects for positioning, which greatly benefits the detection of the driving path and the generation of the route.
The method uses YOLOv3 trained on the data set to filter out the utility poles, signboards, weeds, equipment and other interfering objects among the seedlings, identifies the seedling and non-seedling targets that share common characteristics, and outputs the coordinates of the bounding boxes.
Meanwhile, a semantic segmentation model suited to the complex road environment is built with a convolutional neural network, using the Mask R-CNN instance segmentation algorithm.
A reference-line fitting algorithm is then applied to the seedling rows on both sides, using the reference points of the seedling bounding boxes identified by yolov3. The algorithm checks, for each side, whether the number of seedling reference points meets the least-squares minimum of three points, i.e. a straight line is fitted from three or more coordinate points. Because the fruit trees in the nursery are planted regularly, each row as straight as possible and with wide spacing between rows, this approximately meets the practical requirements. If seedlings are missing and fewer than three reference points can be extracted, the two nearest reference points are connected to serve as the seedling-row reference line.
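The fitting rule above can be sketched directly; the function name and the (x, y) point format are ours:

```python
import numpy as np

def fit_row_line(points):
    """Fit a seedling-row reference line from bounding-box reference
    points. With three or more points a least-squares line is fitted;
    with exactly two (the missing-seedling fallback), the line through
    them is used; fewer than two points cannot define a line."""
    pts = np.asarray(points, dtype=np.float64)
    if len(pts) >= 3:
        # Least-squares fit y = slope * x + intercept.
        slope, intercept = np.polyfit(pts[:, 0], pts[:, 1], 1)
    elif len(pts) == 2:
        # Fallback: connect the two nearest reference points.
        (x0, y0), (x1, y1) = pts
        slope = (y1 - y0) / (x1 - x0)
        intercept = y0 - slope * x0
    else:
        raise ValueError("need at least two reference points")
    return slope, intercept
```

Vertical rows (infinite slope) would need a parametric fit instead; the sketch assumes rows are expressed in a coordinate frame where that does not occur.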
As shown in fig. 4, the two-dimensional plane image coordinates of the positions of the nursery stock obtained after the above processing are converted into three-dimensional coordinates in the three-dimensional coordinate system of the inspection intelligent vehicle. The locally planned navigation line obtained by the visual algorithm is fitted from the navigation points in the image, and these coordinate points are expressed in the pixel coordinate system, so coordinate conversion is required for practical application.
The world coordinates of a navigation point are converted into coordinates in the camera coordinate system by constructing a conversion matrix model between the world coordinate system and the camera coordinate system as follows:
wherein the world coordinates of the navigation point in the world coordinate system are (X, Y, Z), and its coordinates in the camera coordinate system are (X_c, Y_c, Z_c). A point P in real space can thus be converted by the matrix into pixel coordinates. Z_c is the depth value of the camera; (u, v) are the coordinates of the point in the image; f is the focal length of the camera; (u_0, v_0) are the coordinates of the principal point relative to the imaging plane, whose values are about half the image size in pixels; R is a 3 × 3 rotation matrix; t is a 3 × 1 translation vector; X, Y and Z are the coordinate values of the actual point in the world coordinate system.
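The conversion the text describes follows the standard pinhole model; the transformation matrix itself is an image in the patent, so this sketch, with a skew-free intrinsic model and a function signature of our own, is an assumption about the standard form being used:

```python
import numpy as np

def world_to_pixel(p_world, R, t, f, u0, v0):
    """World -> camera -> pixel coordinates under a pinhole model.
    R (3x3 rotation) and t (3-vector translation) are the extrinsics;
    f, u0, v0 the intrinsics named in the text."""
    # Camera coordinates (Xc, Yc, Zc) of the navigation point.
    p_cam = R @ np.asarray(p_world, dtype=np.float64) + t
    Xc, Yc, Zc = p_cam
    # Perspective division onto the image plane, then principal-point shift.
    u = f * Xc / Zc + u0
    v = f * Yc / Zc + v0
    return u, v, Zc  # Zc is the camera depth value

# A point on the optical axis projects to the principal point.
u, v, zc = world_to_pixel([0.0, 0.0, 2.0], np.eye(3), np.zeros(3),
                          1000.0, 960.0, 540.0)
```

With the 1920 × 1080 camera from the data-collection step, (u_0, v_0) ≈ (960, 540), matching the "about half of the image pixels" remark.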
As another preferred embodiment of the present invention, as shown in fig. 5, planning the walking route of the inspection intelligent vehicle with the Monte Carlo particle optimization algorithm in the step S4 comprises the following steps:
s41: constructing the initial-position calculation model of the ith particle at the time t in the three-dimensional coordinate system of the patrol inspection intelligent vehicle into which the nursery seedlings are converted:
wherein H is the walking steering-angle matrix from the ith particle at the time t-1 to the initial position of the ith particle at the time t, L is the walking distance from the ith particle at the time t-1 to the initial position of the ith particle at the time t, the walking distance L obeys the distribution U(0, l), and l is the distance the nursery patrol intelligent vehicle walks in each step;
S43: performing optimization iteration on the positions of the M particles within the time slot T, and judging whether the Hausdorff distance of the ith of the M particles is smaller than the initial-position deviation value d_H of the ith particle calculated in the step S42; if it is smaller, taking that position as the initial position of the ith particle at the time t with the prior probability density; otherwise, setting the initial position of the ith particle at the time t with the measurement density to the observed value z_t of the ith particle at the time t,
wherein m_t is the target position of the ith particle;
s44: outputting the optimized target positions of the M particles within the time slot T, and taking the path through the optimized target positions as the walking path of the patrol inspection intelligent vehicle at the time t.
The particles represent the optimal walking points at the time t: connecting the optimal walking point of the ith particle at the time t to its optimal walking point at the time t+1 forms the optimal walking path within the time slot T. The M particles reflect that, at the time t, the inspection intelligent vehicle may be at any position within the inspection range.
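A 2-D sketch of one particle propagation step of S41. Two points are our assumptions: U(0, l) is read as a uniform walk-distance distribution, and the steering-angle matrix H is represented as a planar rotation applied to a forward step:

```python
import numpy as np

def propagate_particle(pos, heading, step_limit, rng):
    """Advance one particle from its t-1 position (sketch of S41):
    draw a walk distance L ~ U(0, l) and move it along the current
    heading; the rotation H is a 2-D stand-in for the patent's
    steering-angle matrix."""
    L = rng.uniform(0.0, step_limit)          # walk distance L ~ U(0, l)
    c, s = np.cos(heading), np.sin(heading)
    H = np.array([[c, -s], [s, c]])           # steering rotation
    return np.asarray(pos, dtype=np.float64) + H @ np.array([L, 0.0])

# With heading 0 the particle moves straight ahead by at most step_limit.
rng = np.random.default_rng(0)
p = propagate_particle([0.0, 0.0], 0.0, 1.0, rng)
```

Propagating all M particles this way and then applying the S43 acceptance test gives the Monte Carlo iteration the text outlines.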
Further preferably, the measurement density in the step S43 is calculated as follows:
wherein σ_r is the covariance between the observed value z_t of the ith particle at the time t and the initial position of the ith particle at the time t, and the remaining factor is the probability density of returning to the initial position of the ith particle at the time t.
Further preferably, the Hausdorff distance of the ith particle in the step S43 is calculated as follows:
wherein A is a first finite point set containing the M particles, B is a second finite point set containing the M particles, A = {a_1, a_2, ..., a_p}, B = {b_1, b_2, ..., b_q}, 1 < i < p ≤ M, 1 < i < q ≤ M; the directed distance from A to B is the maximum distance from a particle of the first finite point set A to its nearest particle of the second finite point set B, and the directed distance from B to A is the maximum distance from a particle of the second finite point set B to its nearest particle of the first finite point set A; ||a_i − b_i|| is the Euclidean distance between the ith particle a_i of the first finite point set and the ith particle b_i of the second finite point set, and likewise ||b_i − a_i|| is the Euclidean distance between the ith particle b_i of the second finite point set and the ith particle a_i of the first finite point set.
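The symmetric Hausdorff distance described above, the maximum of the two directed max-min distances, can be sketched directly:

```python
import numpy as np

def hausdorff(A, B):
    """Symmetric Hausdorff distance between two finite point sets."""
    A = np.asarray(A, dtype=np.float64)
    B = np.asarray(B, dtype=np.float64)
    # Pairwise Euclidean distances, shape (len(A), len(B)).
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)
    h_ab = d.min(axis=1).max()  # farthest A-point from its nearest B-point
    h_ba = d.min(axis=0).max()  # farthest B-point from its nearest A-point
    return max(h_ab, h_ba)
```

Identical point sets give distance 0, and two single points give their Euclidean distance, which matches the definition in the text.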
Further preferably, the optimized target positions of the M particles within the time slot T in the step S43 are calculated as follows:
In general, in feature matching based on classical particle filtering, when the weights grow large, the weights of the particles that participate in calculating the measurement density, i.e. that estimate the position from the observed value z_t of the ith particle, can be controlled, and the Hausdorff distance of the ith particle is compared with the initial-position deviation value d_H of the ith particle used as a threshold; once the Hausdorff distance of the ith particle becomes too large, it can no longer be controlled, and the filter diverges. Because the threshold d_H is the maximum distance between each particle and the target, it can be controlled. Thus, by confining the maximum Hausdorff distance of the ith particle to a reasonable range, the inaccurate calculation results caused by the large data dispersion that arises as the Hausdorff distance grows can be avoided.
It should be noted that the above-mentioned numbers of the embodiments of the present invention are merely for description, and do not represent the merits of the embodiments. And the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, apparatus, article, or method that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, apparatus, article, or method. Without further limitation, an element defined by the phrase "comprising a … …" does not exclude the presence of another identical element in a process, apparatus, article, or method that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.
While the invention has been described with reference to a preferred embodiment, various changes may be made and equivalents may be substituted for elements thereof without departing from the scope of the invention. In particular, the technical features mentioned in the embodiments can be combined in any manner as long as there is no technical solution conflict. It is intended that the invention not be limited to the particular embodiments disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.
Claims (10)
1. A path identification and planning method for a nursery patrol intelligent vehicle is characterized by comprising the following steps:
s1: collecting nursery image data information in an intelligent vehicle inspection range by using a camera;
s2: preprocessing the acquired nursery image information, calculating the gray value of the nursery image information, and optimizing the gray-processed nursery image data information by adopting an edge-preserving optimized filtering algorithm;
s3: performing image edge enhancement on the nursery garden image with the edge optimized in the step S2 by adopting a Sobel operator;
s4: identifying the background, the driving road, the seedlings and the sundries of the nursery road environment in the nursery image data information after the image edge enhancement of the step S3 by adopting the yolov3 algorithm of a deep convolutional neural network to obtain the two-dimensional plane image coordinates of the positions of the nursery seedlings, converting the two-dimensional plane image coordinates into three-dimensional coordinates in the three-dimensional coordinate system of the patrol inspection intelligent vehicle, and planning the walking path of the patrol inspection intelligent vehicle by adopting a Monte Carlo particle optimization algorithm.
2. The method for path identification and planning of a nursery inspection intelligent vehicle according to claim 1, wherein the formula for calculating the gray-scale value of the nursery image information in the step S2 is as follows:
wherein GRAY(i, j) represents the gray value of the pixel (i, j) in the nursery image information, R(i, j) represents the value of the red primary color of the pixel (i, j) in the RGB color channels, B(i, j) the value of its blue primary color, and G(i, j) the value of its green primary color.
3. The method for path identification and planning of a nursery inspection intelligent vehicle according to claim 1, wherein in the step S2, an edge preserving optimization filter algorithm is adopted to optimize the nursery image data information after gray processing, and the method comprises the following steps:
s21: using the gray value GRAY(i, j) of the nursery image information calculated in the step S2 to uniformly replace the red value R(i, j), the green value G(i, j) and the blue value B(i, j) in the RGB color channels RGB(R(i, j), G(i, j), B(i, j)) acquired in the step S1, obtaining new gray-scale image data RGB(GRAY(i, j), GRAY(i, j), GRAY(i, j));
s22: constructing a Gaussian-noise observation model of the ith image data vector x_i in the gray-scale image of the step S21:
x_i = v_i + n_i, i = 0, 1, 2, ..., N;
wherein v_i represents the noise-free wavelet coefficient of the ith image datum, and n_i represents independent, identically distributed (i.i.d.) Gaussian noise of the ith image datum;
s23: constructing the noise-free data vector V in the wavelet domain, and calculating the minimum mean square error f_MSE between the thresholded noisy output vector V̂ corresponding to the noise-free data in the wavelet domain and the image data vector X of the input gray-scale image:
f_MSE = (1/N) Σ_{i=0}^{N} (v̂_i − v_i)²;
wherein N is the size of the sub-band, v_i are the wavelet coefficients of the noise-free image, and v̂_i are the thresholded wavelet coefficients of the noisy image;
s24: constructing a noise standard deviation sigma calculation model of absolute deviation, and constructing a calculation model of an image data denoising threshold thr (sigma) in the gray image according to the calculated noise standard deviation of the absolute deviation:
s25: judging the relation between the minimum mean square error f_MSE calculated in the step S23 and the image data denoising threshold thr(σ) in the gray-scale image constructed in the step S24, and then outputting the filtered, denoised value x̂_i of the ith image data vector x_i in the gray-scale image.
4. The method for path identification and planning of a nursery patrol inspection intelligent vehicle according to claim 3, wherein the noise standard deviation σ calculation model of absolute deviation constructed in the step S24 is as follows:
5. The path identification and planning method for the nursery inspection intelligent vehicle according to claim 1, wherein in the step S3, the image edge enhancement is performed on the nursery image after the edge optimization in the step S2 by using a Sobel operator, and the method comprises the following steps:
s31: constructing the horizontal convolution kernel model G_x and the vertical convolution kernel model G_y computed from the pixel neighborhood of the gray-scale image:
G_x = [f(i+1, j-1) + 2f(i+1, j) + f(i+1, j+1)] - [f(i-1, j-1) + 2f(i-1, j) + f(i-1, j+1)];
G_y = [f(i-1, j+1) + 2f(i, j+1) + f(i+1, j+1)] - [f(i-1, j-1) + 2f(i, j-1) + f(i+1, j-1)];
s32: converting the horizontal convolution kernel model G_x and the vertical convolution kernel model G_y constructed in the step S31 into horizontal-gradient and vertical-gradient matrix forms convolved with the matrix A formed from the optimized, gray-processed nursery image data information of the step S2, to obtain the horizontal gradient G_x and the vertical gradient G_y of that image data at the pixel (i, j):
S33: according to the horizontal gradient G_x and the vertical gradient G_y of the optimized, gray-processed nursery image data at the pixel (i, j) calculated in the step S32, calculating the actual gradient G_g of that image data at the pixel (i, j):
S34: judging whether the actual gradient G_g calculated in the step S33 is larger than the edge point determination threshold G_thr; if so, judging the pixel (i, j) to be an edge point; otherwise, eliminating the pixel and repeating the steps S31-S33 to complete the image edge enhancement of the nursery image data after the gray processing optimized in the step S2.
6. The method for path identification and planning of a nursery patrol inspection intelligent vehicle according to claim 5, wherein the threshold value G for edge point identification in the step S34 thr The calculation formula of (c) is as follows:
G_thr = 0.65 Σ_i Σ_j G_g.
7. The method for path identification and planning of a nursery patrol inspection intelligent vehicle according to claim 1, wherein the Monte Carlo particle optimization algorithm is adopted in the step S4 to plan the walking route of the nursery patrol intelligent vehicle, comprising the following steps:
s41: constructing the initial-position calculation model of the ith particle at the time t in the three-dimensional coordinate system of the patrol inspection intelligent vehicle into which the nursery seedlings are converted:
wherein H is the walking steering-angle matrix from the ith particle at the time t-1 to the initial position of the ith particle at the time t, L is the walking distance from the ith particle at the time t-1 to the initial position of the ith particle at the time t, the walking distance L obeys the distribution U(0, l), and l is the distance the nursery patrol intelligent vehicle walks in each step;
S43: performing optimization iteration on the positions of the M particles within the time slot T, and judging whether the Hausdorff distance of the ith of the M particles is smaller than the initial-position deviation value d_H of the ith particle calculated in the step S42; if it is smaller, taking that position as the initial position of the ith particle at the time t with the prior probability density; otherwise, setting the initial position of the ith particle at the time t with the measurement density to the observed value z_t of the ith particle at the time t,
wherein m_t is the target position of the ith particle;
8. The method for path identification and planning of a nursery patrol inspection intelligent vehicle according to claim 7, wherein the measurement density in the step S43 is calculated as follows:
9. The method for path identification and planning of a nursery patrol inspection intelligent vehicle according to claim 7, wherein the Hausdorff distance of the ith particle in the step S43 is calculated as follows:
wherein A is a first finite point set containing the M particles, B is a second finite point set containing the M particles, A = {a_1, a_2, ..., a_p}, B = {b_1, b_2, ..., b_q}, 1 < i < p ≤ M, 1 < i < q ≤ M; the directed distance from A to B is the maximum distance from a particle of the first finite point set A to its nearest particle of the second finite point set B, and the directed distance from B to A is the maximum distance from a particle of the second finite point set B to its nearest particle of the first finite point set A,
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211334834.7A CN115497067A (en) | 2022-10-28 | 2022-10-28 | Path identification and planning method for nursery patrol intelligent vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115497067A true CN115497067A (en) | 2022-12-20 |
Family
ID=85115182
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211334834.7A Pending CN115497067A (en) | 2022-10-28 | 2022-10-28 | Path identification and planning method for nursery patrol intelligent vehicle |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115497067A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115826655A (en) * | 2023-02-16 | 2023-03-21 | 山西农业大学 | Nursery stock fertilization control system based on machine vision |
CN117146832A (en) * | 2023-10-31 | 2023-12-01 | 北京佳格天地科技有限公司 | Agricultural machinery automatic driving control method and system integrating wireless communication and artificial intelligence |
CN117146832B (en) * | 2023-10-31 | 2024-01-02 | 北京佳格天地科技有限公司 | Agricultural machinery automatic driving control method and system integrating wireless communication and artificial intelligence |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||