CN117765051B - Afforestation maintenance monitoring and early warning system and method


Info

Publication number
CN117765051B
CN117765051B
Authority
CN
China
Prior art keywords
preset direction
pixel points
gray
preset
value
Prior art date
Legal status
Active
Application number
CN202410034894.XA
Other languages
Chinese (zh)
Other versions
CN117765051A (en)
Inventor
相阳
杨宜东
刘伟
Current Assignee
Jining Municipal Garden Maintenance Center
Original Assignee
Jining Municipal Garden Maintenance Center
Priority date
Filing date
Publication date
Application filed by Jining Municipal Garden Maintenance Center
Priority to CN202410034894.XA
Publication of CN117765051A
Application granted
Publication of CN117765051B
Legal status: Active

Landscapes

  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of landscaping monitoring and early warning, in particular to a landscaping maintenance monitoring and early warning system and method. The method comprises the following steps: acquiring a gray image of a garden to be analyzed; traversing the gray image in different preset directions with a sliding window to obtain the window areas in each preset direction; obtaining the gradient values of the pixel points in each window area according to the gray differences of the window areas adjacent to it in the direction perpendicular to the corresponding preset direction; determining the characteristic pixel points corresponding to each preset direction based on the gradient values and the gray values of the pixel points in the gray image; obtaining the tightness degree corresponding to each preset direction according to the position distribution of its characteristic pixel points; determining a weight for each preset direction based on the tightness degree, and constructing the corresponding gray level co-occurrence matrix from these weights; and judging whether to give an early warning based on the gray level co-occurrence matrix. The invention improves the reliability of anomaly early warning.

Description

Afforestation maintenance monitoring and early warning system and method
Technical Field
The invention relates to the technical field of landscaping monitoring and early warning, in particular to a system and a method for landscaping maintenance monitoring and early warning.
Background
Artificial gardens are forests formed through artificial measures; planning and planting are generally carried out according to people's demands. The planted trees have high adaptability and good genetic quality, are uniformly distributed across the forest land, and have a uniform and reasonable stand structure, so the planting exhibits certain rules and directions. Detecting the growth condition of the trees in a garden is indispensable for reasonably planning and adjusting landscaping. Existing methods generally judge whether the growth of the trees is abnormal by analyzing the texture changes of the trees in garden images, and give an early warning in time if an abnormal condition occurs.
In the prior art, when judging the rationality of tree planting and planning in a garden, an aerial artificial-forest image is first acquired and a gray level co-occurrence matrix (GLCM) is established; the corresponding characteristic quantities are then computed to analyze the distribution texture of the garden trees, and the texture information of the image is judged from the magnitude of the characteristic quantities. When the characteristic values of the GLCM are computed, the gray level co-occurrence matrices in the four directions of 0 degrees, 45 degrees, 90 degrees and 135 degrees are obtained first, and the corresponding characteristic quantities are synthesized from the four matrices, generally by directly taking their average value. In an aerial garden image, however, the texture is distributed with certain rules and directions and has a complete spatial structure, so taking the average value distorts the analysis of the image texture characteristics and thereby reduces the reliability of anomaly early warning.
Disclosure of Invention
In order to solve the problem that the existing method does not consider the differences between directions when evaluating the texture distribution of garden images, which makes the reliability of anomaly early warning low, the invention aims to provide a landscaping maintenance monitoring and early warning system and method. The adopted technical scheme is as follows:
In a first aspect, the invention provides a landscaping maintenance monitoring and early warning method, which comprises the following steps:
Acquiring a gray image of a garden to be analyzed;
Traversing the gray image in different preset directions with a sliding window to obtain the window areas in each preset direction; obtaining the gradient values of the pixel points in each window area in each preset direction according to the gray differences of the window areas adjacent to each window area in the direction perpendicular to the corresponding preset direction; determining the characteristic pixel points corresponding to each preset direction based on the gradient values and the gray values of the pixel points in the gray image;
Obtaining the tightness degree corresponding to each preset direction according to the position distribution of the characteristic pixel points corresponding to each preset direction; determining the weight of the gray level co-occurrence matrix corresponding to each preset direction based on the tightness degree;
Constructing a gray level co-occurrence matrix corresponding to the gray image based on the weights; and judging whether to give an early warning based on the gray level co-occurrence matrix.
In a second aspect, the invention provides a landscaping maintenance monitoring and early warning system, which comprises a memory and a processor, wherein the processor executes a computer program stored in the memory to realize the landscaping maintenance monitoring and early warning method.
Preferably, obtaining the gradient values of the pixel points in each window area in each preset direction according to the gray differences of the window areas adjacent to each window area in the direction perpendicular to the corresponding preset direction includes:
for any preset direction:
Marking the direction perpendicular to the preset direction as the characteristic direction; marking any window area in the preset direction as the area to be analyzed;
Taking the average gray value of all pixel points in the area to be analyzed as the gray average value corresponding to the area to be analyzed; and determining the absolute value of the difference value between the gray average values corresponding to the two window areas adjacent to the area to be analyzed in the characteristic direction as the gradient value of the pixel points in the area to be analyzed.
Preferably, determining the feature pixel point corresponding to each preset direction based on the gradient value and the gray value of the pixel point in the gray image includes:
for any preset direction:
Obtaining the maximum gradient value of all pixel points in the gray level image in the preset direction; determining the absolute value of the difference between the gradient value of each pixel point in the gray level image and the maximum gradient value in the preset direction as a first difference corresponding to each pixel point;
Obtaining the maximum value of gray average values corresponding to all window areas in the preset direction; determining the absolute value of the difference between the gray value of each pixel point in the gray image and the maximum value of the gray average value as a second difference corresponding to each pixel point;
And if the first difference corresponding to a pixel point in the gray image is smaller than a preset first threshold value and the second difference corresponding to the pixel point is smaller than a preset second threshold value, determining the corresponding pixel point as a characteristic pixel point corresponding to the preset direction.
Preferably, obtaining the tightness degree corresponding to each preset direction according to the position distribution of the characteristic pixel points corresponding to each preset direction includes:
for any preset direction:
Performing straight line fitting on the characteristic pixel points corresponding to the preset direction to obtain a target straight line; marking the characteristic pixel points positioned outside the target straight line as discrete pixel points;
screening reference pixel points from all the discrete pixel points according to the relative positions of the discrete pixel points and the target straight line;
And determining the ratio of the number of the reference pixel points to the number of the discrete pixel points as the tightness degree corresponding to the preset direction.
Preferably, screening the reference pixel points from all the discrete pixel points according to the relative positions of the discrete pixel points and the target straight line includes:
Respectively connecting every two adjacent discrete pixel points to obtain corresponding straight line segments; and judging whether the included angle between the straight line segment and the target straight line is smaller than a preset angle threshold value, and if so, determining the corresponding discrete pixel point as a reference pixel point.
Preferably, determining the weight of the gray level co-occurrence matrix corresponding to each preset direction based on the tightness degree includes:
calculating the sum of the tightness degrees corresponding to all preset directions; and determining the ratio of the tightness degree corresponding to each preset direction to the sum value as the weight of the gray level co-occurrence matrix corresponding to each preset direction.
The invention has at least the following beneficial effects:
According to the invention, it is considered that the gray values of the pixel points corresponding to vegetation in the gray image of the garden to be analyzed are generally large, that the vegetation tops appear as clusters in the gray image, and that a certain gradient exists at the boundary between vegetation and ground. The characteristic pixel points corresponding to each preset direction are determined based on these characteristics, the position distribution of the characteristic pixel points in each preset direction is then analyzed separately to obtain the weight of the gray level co-occurrence matrix corresponding to that direction, and the gray level co-occurrence matrix corresponding to the gray image of the garden to be analyzed is constructed from these weights. Compared with the existing gray level co-occurrence matrix constructed directly from the average over the preset directions, this matrix describes the texture characteristics of the gray image of the garden to be analyzed more accurately; evaluating the vegetation growth condition of the garden to be analyzed and judging whether to give an early warning based on the constructed gray level co-occurrence matrix therefore improves the reliability of anomaly early warning.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions and advantages of the prior art, the following description will briefly explain the drawings used in the embodiments or the description of the prior art, and it is obvious that the drawings in the following description are only some embodiments of the invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of a landscaping maintenance monitoring and early warning method according to an embodiment of the present invention.
Detailed Description
In order to further explain the technical means adopted by the invention to achieve its intended aim and their effects, the landscaping maintenance monitoring and early warning system and method provided by the invention are described in detail below with reference to the accompanying drawing and the preferred embodiment.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The specific scheme of the landscaping maintenance monitoring and early warning system and method provided by the invention is described below with reference to the drawing.
An embodiment of a landscaping maintenance monitoring and early warning method comprises the following steps:
The specific scene addressed by this embodiment is as follows: a gray level co-occurrence matrix can be used to analyze the texture characteristics of an image, and the characteristic quantities calculated from it reflect the texture characteristics of the corresponding image. When texture analysis is performed on the gray image of the garden to be analyzed, different forest types show different texture characteristics: natural forest textures are distributed in patches and clusters, while the textures of an artificially constructed garden follow certain rules and directions, are uniform, and have a complete spatial structure. Because of these different texture characteristics, the characteristic values calculated by the GLCM algorithm differ between directions, so when the algorithm synthesizes the characteristic values by averaging them over all set directions, a large deviation arises between the detection result and the actual situation. Therefore, this embodiment analyzes the contribution of each direction of the artificial-forest image to the characteristic values of the gray level co-occurrence matrix, so as to obtain optimal characteristic values that reflect the texture characteristics of the image and to improve the reliability of anomaly early warning.
This embodiment provides a landscaping maintenance monitoring and early warning method which, as shown in Fig. 1, comprises the following steps:
Step S1, acquiring a gray image of the garden to be analyzed.
In this embodiment, an aerial image of the garden to be analyzed is first obtained; the aerial image is a color image (i.e., an RGB image). It should be noted that the garden to be analyzed in this embodiment is an artificial garden, that is, its texture distribution follows certain rules. The aerial images in this embodiment are taken under good weather conditions rather than on rainy or strongly windy days, so that the quality of the collected images is not affected by the weather.
In this embodiment, the acquired aerial image of the garden to be analyzed is converted to grayscale to obtain a corresponding gray image, and the gray image is filtered to reduce the influence of noise on the subsequent analysis; the filtered image is recorded as the gray image of the garden to be analyzed. Image graying and filtering are prior art and are not repeated here.
Thus, a gray image of the garden to be analyzed is obtained.
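Step S1 does not fix a particular graying or filtering operator. The following is a minimal Python sketch of this step using OpenCV; the Gaussian kernel size and the image path parameter are illustrative assumptions, not values prescribed by the embodiment.

import cv2

def acquire_gray_image(path):
    """Load an aerial color image of the garden and return a denoised gray image."""
    bgr = cv2.imread(path)                        # OpenCV reads color images in BGR order
    if bgr is None:
        raise FileNotFoundError(path)
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)  # graying treatment
    gray = cv2.GaussianBlur(gray, (5, 5), 0)      # filtering to suppress noise (assumed 5x5 kernel)
    return gray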
Step S2, traversing the gray image in different preset directions with a sliding window to obtain the window areas in each preset direction; obtaining the gradient values of the pixel points in each window area in each preset direction according to the gray differences of the window areas adjacent to each window area in the direction perpendicular to the corresponding preset direction; and determining the characteristic pixel points corresponding to each preset direction based on the gradient values and the gray values of the pixel points in the gray image.
When the contrast characteristic quantity of the garden to be analyzed is computed with the GLCM algorithm, the texture coarseness differs between directions in the gray image of the garden to be analyzed, so the contrast corresponding to different directions also differs; if the average of the characteristic values over all directions is taken as the image contrast, it cannot reflect the local texture characteristics of the garden to be analyzed. This embodiment therefore first combines the gray changes of the pixel points of the gray image of the garden to be analyzed in the different preset directions and assigns an appropriate weight to each preset direction, that is, it evaluates how much the texture characteristics of each preset direction contribute to the total characteristic quantity of the image, and then constructs the gray level co-occurrence matrix based on these weights, so that the final characteristic quantities reflect the texture characteristics of the gray image of the garden to be analyzed more accurately.
In this embodiment there are four preset directions, namely 0 degrees, 45 degrees, 90 degrees and 135 degrees; in specific applications, the practitioner can set the preset directions according to the specific situation. One preset direction is taken as an example below; the other preset directions can be analyzed with the same method.
For any preset direction:
A sliding window of a preset size slides over the gray image of the garden to be analyzed in the preset direction; each slide yields a window area, and after all pixel points of the gray image have been traversed, a number of window areas are obtained in the preset direction. It should be noted that in this embodiment there is no overlapping area between window areas in the same preset direction, so the sliding step of the sliding window is set according to the size of the sliding window. In this embodiment the preset size is 3×3, that is, the sliding window is 3×3; in specific applications, the practitioner can set the preset size according to the specific situation.
The direction perpendicular to the preset direction is marked as the characteristic direction, and any window area in the preset direction is marked as the area to be analyzed. The average gray value of all pixel points in the area to be analyzed is taken as the gray mean of the area to be analyzed, and the absolute value of the difference between the gray means of the two window areas adjacent to the area to be analyzed in the characteristic direction is determined as the gradient value of the pixel points in the area to be analyzed.
If the preset direction is the 0-degree direction, that is, the horizontal direction, the direction perpendicular to it is the vertical direction, so the characteristic direction is the vertical direction. The two window areas adjacent to the area to be analyzed in the characteristic direction are then the window area directly above it and the window area directly below it, and the absolute value of the difference between the gray means of these two window areas is calculated. Because the sliding window is small, the gradient values of all pixels within one window area can be regarded as approximately equal, so this absolute value is taken as the gradient value of every pixel point in the area to be analyzed.
Each window area in the preset direction is analyzed in this way to obtain the gradient values of the pixel points in each window area.
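As an illustration of the window traversal and gradient calculation described above, the sketch below partitions the gray image into non-overlapping 3×3 windows and assigns every window the absolute difference between the gray means of its two neighbours in the characteristic direction. The diagonal neighbour offsets used for the 45-degree and 135-degree directions are an assumed discretisation; the embodiment only states that the characteristic direction is perpendicular to the preset direction.

import numpy as np

WIN = 3  # preset sliding-window size (3x3 in this embodiment)

# Offsets, in window rows/columns, of the neighbours lying in the characteristic
# (perpendicular) direction of each preset direction. The diagonal offsets for
# 45 and 135 degrees are assumptions made for illustration.
PERP_OFFSETS = {0: (1, 0), 45: (1, 1), 90: (0, 1), 135: (1, -1)}

def window_means(gray):
    """Gray mean of every non-overlapping WIN x WIN window (edge pixels that do not fill a window are truncated)."""
    h, w = gray.shape
    h, w = h - h % WIN, w - w % WIN
    blocks = gray[:h, :w].astype(float).reshape(h // WIN, WIN, w // WIN, WIN)
    return blocks.mean(axis=(1, 3))

def window_gradients(means, direction):
    """Gradient value shared by all pixels of a window: the absolute difference between the
    gray means of its two neighbours in the characteristic direction; border windows that
    lack one of the two neighbours are left at zero."""
    dr, dc = PERP_OFFSETS[direction]
    rows, cols = means.shape
    grad = np.zeros_like(means)
    for i in range(rows):
        for j in range(cols):
            i1, j1, i2, j2 = i - dr, j - dc, i + dr, j + dc
            if 0 <= i1 < rows and 0 <= j1 < cols and 0 <= i2 < rows and 0 <= j2 < cols:
                grad[i, j] = abs(means[i1, j1] - means[i2, j2])
    return grad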
In the gray image, the gray values of the pixel points corresponding to vegetation are generally large, a certain gradient exists at the boundary between vegetation and ground, and a large number of such boundaries, namely edges, exist in the image. Since this embodiment analyzes the vegetation growth condition in the garden to be analyzed, it screens pixel points out of the gray image of the garden to be analyzed based on their gray values and gradient values. Specifically, the maximum gradient value of all pixel points of the gray image in the preset direction is obtained, and the absolute value of the difference between the gradient value of each pixel point in the preset direction and this maximum gradient value is determined as the first difference corresponding to that pixel point. The maximum of the gray means of all window areas in the preset direction is obtained, and the absolute value of the difference between the gray value of each pixel point and this maximum gray mean is determined as the second difference corresponding to that pixel point. If the first difference of a pixel point is smaller than a preset first threshold value and its second difference is smaller than a preset second threshold value, the pixel point is determined as a characteristic pixel point corresponding to the preset direction. In this embodiment, the preset first threshold and the preset second threshold are both 10; in specific applications, the practitioner can set them according to the specific situation.
By adopting the method provided by the embodiment, the pixel points in the gray level image of the garden to be analyzed are screened, and the characteristic pixel points corresponding to each preset direction are determined.
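A sketch of this screening rule follows, reusing the window_means and window_gradients helpers from the previous sketch (names introduced here purely for illustration); the thresholds of 10 follow the embodiment.

def feature_pixel_mask(gray, direction, t1=10.0, t2=10.0):
    """Boolean mask of the characteristic pixel points for one preset direction."""
    means = window_means(gray)
    grads = window_gradients(means, direction)
    grad_px = np.kron(grads, np.ones((WIN, WIN)))   # expand window gradients to pixel resolution
    h, w = grad_px.shape
    g = gray[:h, :w].astype(float)
    first_diff = np.abs(grad_px - grads.max())      # distance to the maximum gradient value
    second_diff = np.abs(g - means.max())           # distance to the maximum window gray mean
    return (first_diff < t1) & (second_diff < t2)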
Step S3, obtaining the corresponding tightness degree of each preset direction according to the position distribution of the characteristic pixel points corresponding to each preset direction; and determining the weight of the gray level co-occurrence matrix corresponding to each preset direction based on the tightness.
The characteristic pixel points corresponding to each preset direction have now been determined. One preset direction is again taken as an example below; the other preset directions can be processed with the same method.
For any preset direction:
The characteristic pixel points corresponding to the preset direction are distributed discretely in the gray image of the garden to be analyzed; the more they are distributed along one direction, the more regular and the tighter their distribution, so this embodiment evaluates the tightness degree corresponding to the preset direction based on the distribution of the characteristic pixel points. Specifically, a straight line is fitted to all characteristic pixel points corresponding to the preset direction, and the fitted line is recorded as the target straight line; fitting a straight line to a set of points is prior art and is not described in detail here. The characteristic pixel points located outside the target straight line are marked as discrete pixel points, that is, the characteristic pixel points that do not lie on the target straight line are taken as the discrete pixel points. Every two adjacent discrete pixel points are connected to obtain the corresponding straight line segments; if the included angle between a straight line segment and the target straight line is smaller than a preset angle threshold value, the corresponding discrete pixel points are determined as reference pixel points, and a number of reference pixel points are obtained in this way. The preset angle threshold in this embodiment is 5 degrees; in specific applications, the practitioner can set it according to the specific situation. The more reference pixel points there are, the closer the distribution direction of the discrete pixel points is to the direction of the target straight line, that is, the tighter the distribution of the characteristic pixel points corresponding to the preset direction, so the ratio of the number of reference pixel points to the number of discrete pixel points is determined as the tightness degree corresponding to the preset direction.
The tightness degree corresponding to each preset direction can be obtained in this way. The smaller the tightness degree, the more severely the characteristic pixel points deviate, and the smaller the contribution of that preset direction should be when the gray level co-occurrence matrix is subsequently calculated, so the smaller the weight it should be given. Based on this, the embodiment next determines the weight of the gray level co-occurrence matrix corresponding to each preset direction from its tightness degree.
Specifically, the sum of the tightness degrees corresponding to all preset directions is calculated, and for each preset direction the ratio of its tightness degree to this sum is determined as the weight of the gray level co-occurrence matrix corresponding to that preset direction.
So far, according to the contribution degree of each preset direction, the weight of the gray level co-occurrence matrix corresponding to each preset direction is determined, and the weight is used for constructing the gray level co-occurrence matrix corresponding to the gray level image of the garden to be analyzed later.
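The following sketch turns the tightness calculation and the weight normalisation into code. Two points are left open by the embodiment and are therefore assumptions here: "adjacent" discrete pixel points are taken as consecutive points after sorting along the fitted line, and both endpoints of a segment whose angle to the target line is below the threshold are counted as reference pixel points.

def tightness(points, angle_thresh_deg=5.0):
    """Tightness degree of one preset direction from an (N, 2) array of (row, col)
    coordinates of its characteristic pixel points."""
    ys, xs = points[:, 0].astype(float), points[:, 1].astype(float)
    k, b = np.polyfit(xs, ys, 1)                      # target straight line y = k*x + b
    on_line = np.isclose(ys, k * xs + b, atol=0.5)
    disc = points[~on_line].astype(float)             # discrete pixel points: off the target line
    if len(disc) < 2:
        return 1.0                                    # nothing deviates from the line
    disc = disc[np.argsort(disc[:, 1] + k * disc[:, 0])]   # order the points roughly along the line
    line_angle = np.arctan(k)
    is_ref = np.zeros(len(disc), dtype=bool)
    for i in range(len(disc) - 1):
        dy, dx = disc[i + 1] - disc[i]                # segment joining adjacent discrete points
        diff = abs(np.arctan2(dy, dx) - line_angle) % np.pi
        diff = min(diff, np.pi - diff)                # undirected angle between segment and line
        if np.degrees(diff) < angle_thresh_deg:
            is_ref[i] = is_ref[i + 1] = True          # both endpoints become reference points
    return is_ref.sum() / len(disc)

def direction_weights(tight_by_dir):
    """Weight of each direction's GLCM: its tightness divided by the sum over all directions."""
    total = sum(tight_by_dir.values())
    if total == 0:
        return {d: 1.0 / len(tight_by_dir) for d in tight_by_dir}  # fall back to equal weights
    return {d: t / total for d, t in tight_by_dir.items()}

For example, the points of one direction d can be collected as np.argwhere(feature_pixel_mask(gray, d)), and the four tightness values can then be passed to direction_weights.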
Step S4, constructing a gray level co-occurrence matrix corresponding to the gray image based on the weights; and judging whether to give an early warning based on the gray level co-occurrence matrix.
To eliminate the influence of direction when calculating the characteristic values of the gray level co-occurrence matrix, the average over all preset directions is usually taken, but the texture characteristics in different directions of the gray image of the garden to be analyzed differ. In the existing construction of the gray level co-occurrence matrix, the average of the characteristic quantities in the four preset directions is taken as the element value of the final matrix, that is, the weights of the preset directions are equal, and since there are four preset directions each weight is 0.25. This embodiment instead calculates the weight of the gray level co-occurrence matrix corresponding to each preset direction and substitutes these weights for the equal weights used in the existing construction, that is, the calculated weights replace the original values of 0.25, and a new gray level co-occurrence matrix is thereby obtained. The method of obtaining a gray level co-occurrence matrix itself is prior art and is not described in detail here.
Compared with the existing method, in which the weight of every preset direction is the same, the gray level co-occurrence matrix obtained by this embodiment takes into account the differences in texture between the pixel points in different directions, so it better reflects the texture distribution in the gray image of the garden to be analyzed, and the analysis of the vegetation growth condition in the garden to be analyzed based on this gray level co-occurrence matrix is more accurate.
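As a sketch of the construction itself, the code below computes a distance-1, symmetric, normalised gray level co-occurrence matrix for each preset direction and combines them with the tightness-based weights in place of the conventional equal weights of 0.25. The quantisation to 16 gray levels and the pixel-offset table are assumptions made for illustration; the embodiment only requires the standard GLCM construction with the new weights.

# (row, col) pixel offsets of the distance-1 neighbour for each preset direction
ANGLE_OFFSETS = {0: (0, 1), 45: (-1, 1), 90: (-1, 0), 135: (-1, -1)}

def glcm(gray, direction, levels=16):
    """Normalised symmetric GLCM of the gray image for one preset direction."""
    q = (gray.astype(np.int64) * levels // 256).clip(0, levels - 1)   # quantise to `levels` gray levels
    dr, dc = ANGLE_OFFSETS[direction]
    h, w = q.shape
    r0, r1 = max(0, -dr), min(h, h - dr)
    c0, c1 = max(0, -dc), min(w, w - dc)
    a = q[r0:r1, c0:c1].ravel()                      # every pixel ...
    b = q[r0 + dr:r1 + dr, c0 + dc:c1 + dc].ravel()  # ... paired with its neighbour at the offset
    m = np.zeros((levels, levels), dtype=float)
    np.add.at(m, (a, b), 1)
    m += m.T                                         # make the co-occurrence symmetric
    return m / m.sum()

def weighted_glcm(gray, weights):
    """Combine the per-direction GLCMs using the tightness-based weights."""
    return sum(w * glcm(gray, d) for d, w in weights.items())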
Statistics are calculated from the gray level co-occurrence matrix corresponding to the gray image of the garden to be analyzed in order to describe the texture characteristics of the garden: the angular second moment reflects the uniformity of the gray image of the garden to be analyzed; the entropy reflects the complexity of its texture; the homogeneity (uniformity) reflects the smoothness of its gray distribution; and the contrast reflects the coarseness of its texture. The calculation of the angular second moment, entropy, homogeneity and contrast is prior art and is not repeated here.
So far, the corresponding statistics have been calculated from the gray level co-occurrence matrix of the gray image of the garden to be analyzed, so that the texture characteristics of the garden are described accurately. In a specific application, corresponding threshold values can be set, and whether the vegetation growth condition in the garden to be analyzed is abnormal is judged from the relationship between the calculated statistics and the thresholds; if it is abnormal, an early warning is given so that the vegetation maintenance staff are reminded in time to carry out subsequent planning and adjustment.
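A sketch of this final step follows: the standard texture statistics are computed from the weighted co-occurrence matrix and compared against preset normal ranges. The statistic formulas are the usual GLCM definitions; the threshold ranges in the commented example are placeholders, since the embodiment leaves their values to the practitioner.

def glcm_statistics(p):
    """Angular second moment, entropy, homogeneity and contrast of a normalised GLCM p."""
    i, j = np.indices(p.shape)
    eps = 1e-12                                   # avoid log(0) for empty matrix cells
    return {
        "asm": float((p ** 2).sum()),
        "entropy": float(-(p * np.log(p + eps)).sum()),
        "homogeneity": float((p / (1.0 + (i - j) ** 2)).sum()),
        "contrast": float(((i - j) ** 2 * p).sum()),
    }

def needs_warning(stats, normal_ranges):
    """Raise an early warning when any statistic leaves its preset normal range."""
    return any(not (lo <= stats[name] <= hi) for name, (lo, hi) in normal_ranges.items())

# Example with placeholder thresholds:
# stats = glcm_statistics(weighted_glcm(gray, direction_weights(tight_by_dir)))
# alarm = needs_warning(stats, {"contrast": (0.2, 2.5), "entropy": (1.0, 3.5)})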
In this method, it is considered that the gray values of the pixel points corresponding to vegetation in the gray image of the garden to be analyzed are generally large, that the vegetation tops appear as clusters in the gray image, and that a certain gradient exists at the boundary between vegetation and ground. Based on these characteristics, the characteristic pixel points corresponding to each preset direction are determined, the position distribution of the characteristic pixel points in each preset direction is analyzed to obtain the weight of the gray level co-occurrence matrix corresponding to that direction, and the gray level co-occurrence matrix corresponding to the gray image of the garden to be analyzed is constructed from these weights. Compared with the existing gray level co-occurrence matrix constructed directly from the average over the preset directions, this matrix describes the texture characteristics of the gray image of the garden to be analyzed more accurately; evaluating the vegetation growth condition of the garden to be analyzed and judging whether to give an early warning based on the constructed gray level co-occurrence matrix therefore improves the reliability of anomaly early warning.
An embodiment of a landscaping maintenance monitoring and early warning system:
The landscaping maintenance monitoring and early warning system comprises a memory and a processor, wherein the processor executes a computer program stored in the memory so as to realize the landscaping maintenance monitoring and early warning method.
Since an embodiment of the landscaping maintenance monitoring and early warning method has already been described above, the method is not described again in this embodiment.

Claims (5)

1. A landscaping maintenance monitoring and early warning method, characterized by comprising the following steps:
Acquiring a gray image of a garden to be analyzed;
Traversing the gray image in different preset directions with a sliding window to obtain the window areas in each preset direction; obtaining the gradient values of the pixel points in each window area in each preset direction according to the gray differences of the window areas adjacent to each window area in the direction perpendicular to the corresponding preset direction; determining the characteristic pixel points corresponding to each preset direction based on the gradient values and the gray values of the pixel points in the gray image;
obtaining the tightness degree corresponding to each preset direction according to the position distribution of the characteristic pixel points corresponding to each preset direction; determining the weight of the gray level co-occurrence matrix corresponding to each preset direction based on the tightness degree;
constructing a gray level co-occurrence matrix corresponding to the gray image based on the weights; judging whether to give an early warning based on the gray level co-occurrence matrix;
The step of obtaining the gradient values of the pixel points in each window area in each preset direction according to the gray differences of the window areas adjacent to each window area in the direction perpendicular to the corresponding preset direction comprises the following steps:
for any preset direction:
Marking the direction perpendicular to the preset direction as the characteristic direction; marking any window area in the preset direction as the area to be analyzed;
Taking the average gray value of all pixel points in the area to be analyzed as the gray average value corresponding to the area to be analyzed; determining the absolute value of the difference value between gray average values corresponding to two window areas adjacent to the area to be analyzed in the characteristic direction as the gradient value of the pixel points in the area to be analyzed;
Based on the gradient value and the gray value of the pixel point in the gray image, determining the feature pixel point corresponding to each preset direction comprises the following steps:
for any preset direction:
Obtaining the maximum gradient value of all pixel points in the gray level image in the preset direction; determining the absolute value of the difference between the gradient value of each pixel point in the gray level image and the maximum gradient value in the preset direction as a first difference corresponding to each pixel point;
Obtaining the maximum value of gray average values corresponding to all window areas in the preset direction; determining the absolute value of the difference between the gray value of each pixel point in the gray image and the maximum value of the gray average value as a second difference corresponding to each pixel point;
And if the first difference corresponding to a pixel point in the gray image is smaller than a preset first threshold value and the second difference corresponding to the pixel point is smaller than a preset second threshold value, determining the corresponding pixel point as a characteristic pixel point corresponding to the preset direction.
2. The landscaping maintenance monitoring and early warning method according to claim 1, wherein obtaining the tightness degree corresponding to each preset direction according to the position distribution of the characteristic pixel points corresponding to each preset direction comprises:
for any preset direction:
Performing straight line fitting on the characteristic pixel points corresponding to the preset direction to obtain a target straight line; marking the characteristic pixel points positioned outside the target straight line as discrete pixel points;
screening reference pixel points from all the discrete pixel points according to the relative positions of the discrete pixel points and the target straight line;
And determining the ratio of the number of the reference pixel points to the number of the discrete pixel points as the tightness degree corresponding to the preset direction.
3. The landscaping maintenance monitoring and early warning method according to claim 2, characterized in that the step of screening reference pixel points from all discrete pixel points according to the relative positions of the discrete pixel points and the target straight line comprises the following steps:
Respectively connecting every two adjacent discrete pixel points to obtain corresponding straight line segments; and judging whether the included angle between the straight line segment and the target straight line is smaller than a preset angle threshold value, and if so, determining the corresponding discrete pixel point as a reference pixel point.
4. The landscaping maintenance monitoring and early warning method according to claim 1, wherein determining the weight of the gray level co-occurrence matrix corresponding to each preset direction based on the tightness degree comprises:
calculating the sum of the tightness degrees corresponding to all preset directions; and determining the ratio of the tightness degree corresponding to each preset direction to the sum value as the weight of the gray level co-occurrence matrix corresponding to each preset direction.
5. A landscaping maintenance monitoring and early warning system comprising a memory and a processor, wherein the processor executes a computer program stored in the memory to implement the landscaping maintenance monitoring and early warning method according to any one of claims 1 to 4.
CN202410034894.XA 2024-01-10 2024-01-10 Afforestation maintenance monitoring and early warning system and method Active CN117765051B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410034894.XA CN117765051B (en) 2024-01-10 2024-01-10 Afforestation maintenance monitoring and early warning system and method


Publications (2)

Publication Number Publication Date
CN117765051A CN117765051A (en) 2024-03-26
CN117765051B true CN117765051B (en) 2024-06-07

Family

ID=90318413

Country Status (1)

Country Link
CN (1) CN117765051B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117974639B (en) * 2024-03-28 2024-06-14 山东正为新材料科技有限公司 Sealant abnormal state detection method based on image data
CN118072025A (en) * 2024-04-17 2024-05-24 中建安装集团有限公司 Terrace surface intelligent detection method based on image processing

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105405148A (en) * 2015-11-17 2016-03-16 中国科学院遥感与数字地球研究所 Remote sensing tomentosa recognition method in combination with tree shadow feature
CN105469098A (en) * 2015-11-20 2016-04-06 中北大学 Precise LiDAR data ground object classification method based on adaptive characteristic weight synthesis
EP3499457A1 (en) * 2017-12-15 2019-06-19 Samsung Display Co., Ltd System and method of defect detection on a display
CN109726705A (en) * 2019-01-24 2019-05-07 中国科学院地理科学与资源研究所 Extracting method, device and the electronic equipment of mangrove information
CN113673441A (en) * 2021-08-23 2021-11-19 王彬 Quantitative variation texture-driven high-resolution remote sensing image classification method
CN114529549A (en) * 2022-04-25 2022-05-24 南通东德纺织科技有限公司 Cloth defect labeling method and system based on machine vision
CN114998721A (en) * 2022-05-06 2022-09-02 南京信息工程大学 Method for extracting mangrove wetland by using long-short term memory neural network
CN114998323A (en) * 2022-07-19 2022-09-02 南通飞旋智能科技有限公司 Deformed steel bar abnormity determination method based on attention mechanism
CN115049668A (en) * 2022-08-16 2022-09-13 江苏众联管业有限公司 Steel strip roll mark identification method based on feature extraction
CN115601347A (en) * 2022-11-01 2023-01-13 南通海驹钢结构有限公司(Cn) Steel plate surface defect detection method based on gray texture analysis
CN115691026A (en) * 2022-12-29 2023-02-03 湖北省林业科学研究院 Intelligent early warning monitoring management method for forest fire prevention
CN116563704A (en) * 2023-04-27 2023-08-08 重庆英卡电子有限公司 Multispectral vision epidemic wood identification method, multispectral vision epidemic wood identification system, multispectral vision epidemic wood identification equipment and multispectral vision epidemic wood storage medium
CN116596905A (en) * 2023-05-29 2023-08-15 杭州垣奕信息技术发展有限公司 Method for detecting surface defects of integrated circuit chip

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Xin Zhang et al., "A Study for Texture Feature Extraction of High-Resolution Satellite Images Based on a Direction Measure and Gray Level Co-Occurrence Matrix Fusion Algorithm", Sensors, 2017-06-22, pp. 1-15 *
Wu Qingtao, Cao Zaihui, Shi Jinfa, "Image retrieval based on improved color histogram and gray level co-occurrence matrix", Journal of Graphics (图学学报), 2017-08-15 (No. 04), pp. 543-548 *
Liu Tianshi et al., "Research on a texture feature extraction algorithm fusing a direction measure and the gray level co-occurrence matrix", Science Technology and Engineering (科学技术与工程), 2014-11-30, Vol. 14, No. 32, pp. 271-275 *

Also Published As

Publication number Publication date
CN117765051A (en) 2024-03-26


Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant