CN113971759A - Device and method for automatically identifying weeds among soybean plants based on computer vision - Google Patents

Device and method for automatically identifying weeds among soybean plants based on computer vision

Info

Publication number
CN113971759A
Authority
CN
China
Prior art keywords
soybean
fluorescence
mirror
camera
pair
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111245172.1A
Other languages
Chinese (zh)
Other versions
CN113971759B (en)
Inventor
苏文浩
黄青阳
盛基
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Agricultural University
Original Assignee
China Agricultural University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Agricultural University filed Critical China Agricultural University
Priority to CN202111245172.1A
Publication of CN113971759A
Application granted
Publication of CN113971759B
Legal status: Active
Anticipated expiration

Classifications

    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 60/00 Technologies relating to agriculture, livestock or agroalimentary industries
    • Y02P 60/14 Measures for saving energy, e.g. in green houses

Landscapes

  • Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)
  • Soil Working Implements (AREA)
  • Catching Or Destruction (AREA)

Abstract

The invention discloses a device and a method for automatically identifying weeds among soybean plants based on computer vision. The device comprises a darkroom system built from structural profiles (sectional material); the outside of the darkroom system is covered by a black light-absorbing curtain; a grayscale camera connected to a computer is fixed at the top of the darkroom interior; six blue LED lamps connected to a transformer are fixed in the middle of the darkroom interior; and six mirrors with suitable inclination angles are fixed at the bottom of the darkroom interior. Building on a systematic crop signaling technique, the fluorescence images of the soybean crop and the weeds captured simultaneously from seven viewing angles are analysed, and a computer-vision method for real-time identification based on the three-dimensional geometric appearance of the soybean marker is established. The position of the soybean stem in the field is identified accurately, and automatic weeding can be achieved when the device is combined with a weeding implement.

Description

Device and method for automatically identifying weeds among soybean plants based on computer vision
Technical Field
The invention relates to the field of weeding, in particular to a device and a method for automatically identifying weeds among soybean plants based on computer vision.
Background
Weed damage is one of the main biological disasters after soybean seedlings emerge, and inter-plant weeds damage soybean seedlings most severely. About 95% of the total weed emergence occurs within three weeks after the soybeans are sown, so early weed control in soybean fields is particularly important.
At present, weed control generally relies on herbicides, manual weeding and mechanical inter-row cultivation. Herbicide application is currently the main means of controlling weeds in soybean fields, but it readily causes problems such as agricultural product safety risks and environmental pollution. Manual weeding is a flexible and accurate way of removing inter-plant weeds, but labour costs are high and efficiency is low, so it is unsuitable for large-scale cultivation. Traditional mechanical cultivation can loosen the soil and remove inter-row weeds in the early growth stage of large-scale soybean plantings, but it cannot remove inter-plant weeds. Weed control therefore remains a weak link in the modernization of Chinese agriculture, and no mature intelligent inter-plant weeding machine has yet been applied in agricultural practice in China. Real-time automatic identification of crops and weeds in the early growth stage of soybean is thus of great significance for increasing soybean yield and income while reducing herbicide use.
Traditional visual crop-detection methods cannot reliably distinguish occluded crops from high-density weeds. In existing crop signaling techniques the marking process is time-consuming and tedious, and the marker being washed off by sprinkler irrigation or natural rainfall cannot be prevented.
An effective solution to the problems in the related art has not been proposed yet.
Disclosure of Invention
Aiming at the defects of the prior art, the invention aims to provide a device for automatically identifying weeds among soybean plants based on computer vision.
In order to achieve the above purpose, the invention adopts the following technical scheme:
a soybean inter-plant weed automatic identification device based on computer vision comprises: a darkroom system 1, a gray camera 3, a computer 4, a blue LED lamp 5, a transformer 6 and a mirror 7;
the darkroom system is built by sectional materials, a black light absorption curtain 2 covers the darkroom system, a gray camera is fixed at the top end of the interior of the darkroom system and connected with a computer, six blue LED lamps are fixed in the middle of the interior of the darkroom system and connected with a transformer, and a mirror with six inclined angles suitable for adjustment is fixed at the bottom end of the interior of the darkroom system.
Each mirror is parallel to the corresponding mirror on the other side. The image in the mirror shows cotyledons, leaves and stems, solving the occlusion problem.
Further, the grayscale camera is mounted 600-1460 mm above the ground, which ensures a larger field of view and allows the crop growth environment to be monitored more comprehensively.
Further, the grayscale camera has a 520 nm filter lens.
Further, the images of weeds and soybean seedlings can be separated within the same picture.
Further, the lower edges of four of the six mirrors 7 form an angle of 35° with the horizontal.
A computer-vision-based method for automatically identifying weeds among soybean plants uses the above identification device and comprises the following steps:
step 1, clean the soybean seeds with distilled water, dry them, soak them in a fluorescein solution at an effective dose, and sow the soaked seeds;
step 2, after the soybean seeds have developed cotyledons, the identification device can be used to identify weeds automatically (the fluorescence pictures are best about two weeks after the soybean seedlings start growing); the inclination angle of the blue LED lamps 5 is adjusted so that light falls evenly on the soybean seedlings 8 without producing reflections off the mirrors 7;
step 3, the voltage of the transformer 6 is adjusted to a suitable value, and the brightness of the blue LED lamps 5 changes with it so that the light illuminating the plants is stable;
step 4, the positions of the mirrors 7 and the camera are adjusted, the computer 4 is operated, the exposure and gain of the grayscale camera 3 are set and kept unchanged, and the relative positions of the blue LED lamps, the camera and the mirrors are finally fixed;
step 5, when the device travels in the field, the camera shoots in real time and the picture in which the soybean seedling is at the centre of the mirrors is captured by a corresponding algorithm;
and step 6, using an algorithm, lines are connected across the seven viewing angles captured by the camera and their intersection points are computed; weed interference of any number and at any position can be handled and the soybean seedlings can be located accurately in the field.
In step 1, after the seeds are dried, they are soaked in a 120 ppm fluorescein solution for 48 or 96 hours in a darkroom environment.
Step 6 is as follows:
S61, binarize the images of the seven viewing angles captured by the camera;
S62, define regions of interest from the binary image;
S63, identify the valid regions of interest among them;
S64, compute the centroid (mass point) of each valid region of interest;
and S65, draw the corresponding straight lines from the centroids of the valid regions of interest and solve for the true position of the soybean seedling root.
On the basis of the above scheme, step 6 is specifically as follows:
the computer 4 performs median filtering and image sharpening based on MATLAB, extracts the fluorescence signal from the image, sets a threshold according to the difference in grey values to extract a binary image of the soybean cotyledons, and removes noise by opening and closing operations;
regions with weak fluorescence signals are removed:
first the areas of all fluorescence regions are calculated; then each pair of centrally symmetric fluorescence regions is traversed, and if the ratio of the area of each fluorescence region in the pair to the maximum of the six region areas is less than 0.3, the pair of fluorescence regions is judged invalid; after the three centrally symmetric pairs have been traversed in this way, the remaining valid regions are traversed, and if the ratio of the smaller to the larger area within a pair of valid regions is less than 0.5, the smaller region of the pair is invalid and the larger region is valid;
the soybean seedling root is located from the valid fluorescence regions:
if the fluorescence signals in a pair of centrally symmetric regions of interest are both valid, a line is drawn connecting the centroids of the pair. If one region of a centrally symmetric pair is invalid, a perpendicular can still be drawn according to the plane-mirror imaging principle, the physical position lying on the straight line through the image position perpendicular to the intersection of the mirror plane and the ground. The three pairs of fluorescence regions of interest are traversed in this way and the corresponding lines are drawn.
If three straight lines are finally obtained, the true position of the soybean seedling root is the centroid of the three pairwise intersection points of the lines. If two straight lines are obtained, the true position of the soybean seedling root is their intersection point. If only one straight line is obtained, a vertical centre line is drawn, and the intersection of the line with the centre line is the true position of the soybean seedling root.
The invention has the following beneficial effects:
(1) When the soybean seeds are soaked in fluorescein solution at an effective dose, the fluorescence signal of the cotyledons differs clearly from that of the weeds. The fluorescence excited by the blue LED lamps can be captured by a grayscale camera fitted with a 520 nm filter lens. The black light-absorbing curtain covering the darkroom system keeps ambient light out, eliminates external interference with the fluorescence, and effectively avoids the marker being washed off by sprinkler irrigation or natural rainfall. The method is simple to operate, highly practical, and can treat a large number of seeds at the same time.
(2) By observing the growth of soybean seedlings, it was found that the stem grows essentially straight during the first 30 days, so the centroid of the cotyledons and the root position can be considered to lie on the same vertical line. The cotyledon centroid identified by the algorithm can therefore be taken as an approximation of the position where the root enters the soil, which simplifies the algorithm.
(3) If only the camera were used to observe the soybean seedlings from a top view and take fluorescence pictures, not all parts of the seedlings could be observed. By combining the camera with the six mirrors, the fluorescence pictures can be analysed from seven viewing angles and the position of the soybean seedlings in the field can be located accurately.
Drawings
The invention has the following drawings:
FIG. 1 is a schematic overall structure of an embodiment of the present invention;
FIG. 2 is a schematic top view of the interior of an embodiment of the present invention;
FIG. 3 is a disassembled view of the blue LED lamp of the present invention;
FIG. 4 is a schematic side view of the present invention;
FIG. 5 is a block flow diagram of the identification method of the present invention;
FIG. 6 is a diagram illustrating the result of binarization and opening/closing operation;
FIG. 7 is a schematic diagram of the result of identifying the active area;
FIG. 8 is a schematic view of the plane-mirror imaging principle;
FIG. 9 is a schematic diagram of soybean seedlings located using different numbers of lines;
FIG. 10 shows representative fluorescence images of some soybean plants taken with the present invention.
Reference numerals: 1. darkroom system; 2. black light-absorbing curtain; 3. grayscale camera; 4. computer; 5. blue LED lamp; 6. transformer; 7. mirror; 8. soybean seedling; 9. weeds.
Detailed Description
The invention is described in further detail below with reference to the accompanying figures 1-10 and examples.
The first embodiment is as follows:
The computer-vision-based device for automatically identifying weeds among soybean plants comprises a darkroom system 1 built from structural profiles. The outside of the darkroom system 1 is covered with a black light-absorbing curtain 2, a grayscale camera 3 connected to a computer 4 is fixed at the top of the darkroom interior, six blue LED lamps 5 connected to a transformer 6 are fixed in the middle of the darkroom interior, and six mirrors 7 with suitably adjusted inclination angles are fixed at the bottom of the darkroom interior.
In the scheme of the invention, the inclination angle of the blue LED lamps 5 is adjusted so that light falls evenly on the soybean seedlings 8 without producing reflections off the mirrors 7; the voltage of the transformer 6 is adjusted to a suitable value, and the brightness of the blue LED lamps 5 changes accordingly so that the plants are illuminated stably. The positions of the mirrors 7 and the camera are adjusted, the computer 4 is operated, and the exposure and gain of the grayscale camera 3 are set and kept constant so that high-quality pictures of the soybean seedlings directly below the grayscale camera 3 can be taken. Finally, the relative positions of the blue LED lamps, the camera and the mirrors are fixed; these positions were determined through a large number of experiments, and the identification accuracy reaches 97.5%.
When the identification device travels through the field (carried by a tractor or other machinery), the camera shoots in real time and captures the picture in which the soybean plant is at the centre of the mirrors (i.e. directly below the camera). The soybean seedling must be at the centre of the mirrors for its actual position to be solved online; otherwise errors are introduced.
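The patent does not detail the frame-selection algorithm of step 5. Purely as an illustrative assumption (not the patented method), one simple trigger is to track the centroid of the central top-view fluorescence blob across frames and keep the frame in which that centroid passes closest to the image centre; the function and parameter names below are hypothetical.

```python
import numpy as np

def pick_centered_frame(central_centroids, frame_ids, image_center):
    """Hypothetical frame-selection trigger: among the tracked centroids of the
    central (top-view) fluorescence blob, one per frame, return the id of the
    frame whose centroid lies closest to the image centre."""
    center = np.asarray(image_center, dtype=float)
    dists = [np.linalg.norm(np.asarray(c, dtype=float) - center)
             for c in central_centroids]
    return frame_ids[int(np.argmin(dists))]
```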
The soybean seedlings 8 and the weeds grow together in the field; the camera captures pictures of the seven viewing angles, and by combining them with the algorithm, lines are connected and intersection points are computed, so weed interference of any number and at any position can be handled.
Combining the six suitably inclined mirrors 7 with the central top view makes it easier and more accurate to detect the crop signal and predict the position of the soybean root. Each mirror is parallel to the corresponding mirror on the opposite side. The images in the mirrors show the cotyledons, leaves and stems, which solves the occlusion problem. Raising the grayscale camera 3 from 600 mm to 1460 mm ensures a larger field of view and allows the crop growth environment to be monitored more comprehensively. There are six mirrors 7, and the lower edges of four of them form an angle of 35° with the horizontal. There are six LED lamps 5, and they are blue. Before automatic identification of soybean weeds is carried out, the grayscale camera is connected to a computer outside the darkroom system, where the captured fluorescence images are processed.
The soybean seeds are soaked in the fluorescein solution before sowing. Under laboratory conditions, about two weeks elapse between sowing and the emergence and growth of the seedlings, after which the fluorescence pictures are of good quality. Numerous experiments showed that the fluorescein can persist in the soybean seedlings for several weeks thereafter.
The fluorescence intensity on a soybean seedling follows the order cotyledon > hypocotyl > epicotyl. Fluorescein is therefore present throughout the seedling, but the soybean position is determined only from the fluorescence picture of the cotyledons. The fluorescein fluoresces under the blue LED illumination, whereas weeds, which have not been soaked in fluorescein, do not. The fluorescence excited by the blue LED lamps is captured by the grayscale camera with its 520 nm filter lens, which makes it easy to distinguish the soybean seedlings from the inter-plant weeds.
In the scheme of the invention, the computer 4 performs median filtering and image sharpening based on MATLAB. To identify the target crop and the weeds correctly, the fluorescence signal must be extracted from the image: a higher grey-value threshold can be set to extract a binary image of the soybean cotyledons, and a lower threshold can be set to extract a binary image of both soybean seedlings and weeds. To obtain high-quality binary images, composite morphological operations are applied. The images are labelled to form a label matrix, and the two binary images are used together to extract the positions of the soybean seedlings.
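The patent describes this preprocessing as a MATLAB workflow; the following Python/OpenCV sketch is an illustration only and reproduces the same sequence (median filtering, sharpening, dual-threshold binarization, morphological opening and closing, connected-component labelling). The threshold values, kernel size and function names are assumptions, not values taken from the patent.

```python
import cv2
import numpy as np

def preprocess_fluorescence(gray, hi_thresh=180, lo_thresh=90, kernel_size=5):
    """Extract cotyledon and seedling+weed binary masks from a grayscale
    fluorescence frame. Thresholds and kernel size are illustrative placeholders."""
    # Median filtering to suppress noise, then unsharp masking to sharpen.
    denoised = cv2.medianBlur(gray, 5)
    blurred = cv2.GaussianBlur(denoised, (0, 0), sigmaX=3)
    sharpened = cv2.addWeighted(denoised, 1.5, blurred, -0.5, 0)

    # Higher threshold keeps only the strong cotyledon fluorescence;
    # lower threshold keeps the soybean seedling plus any residual signal.
    _, cotyledon_mask = cv2.threshold(sharpened, hi_thresh, 255, cv2.THRESH_BINARY)
    _, plant_mask = cv2.threshold(sharpened, lo_thresh, 255, cv2.THRESH_BINARY)

    # Opening removes isolated noise pixels, closing fills small holes.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (kernel_size, kernel_size))
    cotyledon_mask = cv2.morphologyEx(cotyledon_mask, cv2.MORPH_OPEN, kernel)
    cotyledon_mask = cv2.morphologyEx(cotyledon_mask, cv2.MORPH_CLOSE, kernel)

    # Connected-component labelling gives a label matrix plus per-region
    # areas and centroids, used later for the validity test and line drawing.
    n_labels, labels, stats, centroids = cv2.connectedComponentsWithStats(cotyledon_mask)
    return cotyledon_mask, plant_mask, labels, stats, centroids
```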
According to the plane-mirror imaging principle, an object and its image in the mirror are symmetric, and the object lies on the line through the image perpendicular to the intersection of the mirror plane and the ground. If the fluorescence image in one mirror is lost but a valid fluorescence image exists in the symmetric mirror, the position of the soybean plant can still be solved. In practice one region of a symmetric pair may be invalid because of occlusion, but wherever weeds grow around a soybean plant, the three regions of interest on the same side are, in theory, never all invalidated. In most cases three lines are available to locate the actual position of the cotyledons, which improves the accuracy of the result. The method is designed to process binary images containing fewer than six valid regions of interest: it can locate the actual cotyledon position regardless of where the valid regions lie and how many there are, so it adapts to high-density weed conditions.
By recognising the six mirror images, the position of the soybean root system in the soil can be obtained, which prevents the soybean seedlings from suffering mechanical damage. In the prior art a tubular object carrying a fluorescent marker is inserted at the edge of the soybean seedling, but as the crop grows more vigorously the marked tube in the captured picture can be shielded by the crop and the weeds, so the fluorescence signal in some mirrors becomes weak, too few connecting lines can be drawn between the mirror images, and no result can be output.
The improved identification method of the present invention can handle this extreme case. The flow chart is shown in FIG. 5, and the flow is as follows:
the image is binarized, the threshold value for binarization is set, only the fluorescence signal is retained, and the noise point is removed through the opening and closing operation (as shown in fig. 6). Generally, the areas of the fluorescence signals of the six areas are not greatly different, but when the leaves of weeds or soybean seedlings cover cotyledons, the fluorescence signal areas of some mirrors become very small, and the positions of the roots of the soybean seedlings cannot be determined according to mass points of a solved fluorescence image, so that the filtration is needed.
As shown in FIG. 7, the fluorescence signal in the middle-left ROI (region of interest) is blocked by weeds and should therefore be deleted. In practice a single weed is more likely to block only one side; for both ROIs of a pair to be invalidated, weeds at nearly centrally symmetric positions and of roughly the same height as the soybean seedling would have to grow around it. The prior art screens valid regions simply by area, but the posture of each seedling differs and the fluorescence areas in the six mirrors vary in size. An innovative way of identifying valid regions is therefore proposed here.
Regions with weak fluorescence signals are removed:
First the areas of all fluorescence regions are calculated. Each pair of centrally symmetric fluorescence regions is then traversed: if the ratio of the area of each fluorescence region in the pair to the maximum of the six region areas is less than 0.3, the pair of fluorescence regions is judged invalid. After the three centrally symmetric pairs have been traversed in this way, the remaining valid regions are traversed: if the ratio of the smaller to the larger area within a pair of valid regions is less than 0.5, the smaller region of the pair is invalid and the larger region is valid. The advantage of this design is that it accounts for differences in fluorescence area between different viewing angles as well as for differences between the two regions of the same pair.
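As an illustration only (not the patent's MATLAB implementation), the sketch below applies one reading of this two-stage area-ratio test to the six mirror regions; the `Region` data structure and the index pairing of opposite mirrors are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Region:
    area: float          # fluorescence area in pixels
    centroid: tuple      # (x, y) centroid of the region
    valid: bool = True

def filter_valid_regions(regions, pairs=((0, 3), (1, 4), (2, 5))):
    """Apply the two area-ratio rules to the six mirror regions.
    `pairs` indexes the centrally symmetric mirrors and is an assumed layout."""
    max_area = max(r.area for r in regions)

    # Rule 1: if both regions of a symmetric pair fall below 0.3 of the
    # global maximum area, the whole pair is judged invalid.
    for i, j in pairs:
        if regions[i].area < 0.3 * max_area and regions[j].area < 0.3 * max_area:
            regions[i].valid = regions[j].valid = False

    # Rule 2: among the remaining pairs, if the smaller region is less than
    # half the larger one, only the larger region stays valid.
    for i, j in pairs:
        if regions[i].valid and regions[j].valid:
            small, large = sorted((regions[i], regions[j]), key=lambda r: r.area)
            if large.area > 0 and small.area / large.area < 0.5:
                small.valid = False
    return regions
```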
The soybean seedling root is located from the valid fluorescence regions:
Under low-density weeds the cotyledons are rarely covered and the fluorescence signals in all six mirrors can be considered valid. In this case three straight lines can be obtained by connecting the cotyledon centroid coordinates of the symmetric pairs: upper left with lower right, middle left with middle right, and lower left with upper right. The average coordinate of all intersection points of the three lines is then taken as the actual position of the soybean seedling. When the cotyledons are covered by weeds, however, the number of valid ROIs may be less than six. If the valid region in one mirror of a pair is missing, the line cannot be drawn by connecting symmetric points and the image in the other mirror would be wasted. An identification solution based on the plane-mirror imaging principle is therefore proposed.
According to the plane-mirror imaging principle, the object and its image in the mirror are symmetric. Viewed from above, the object lies on the straight line through the image position perpendicular to the intersection of the mirror surface and the ground (as shown in FIG. 8). If the fluorescence image in one mirror is lost but a valid fluorescence image exists in the symmetric mirror, a straight line can still be obtained. If the fluorescence signals in a pair of centrally symmetric ROIs are both valid, the centroids of the two regions are connected. If one ROI of the pair is invalid because of weed occlusion, a perpendicular is drawn instead according to the plane-mirror imaging principle, and the real object lies on this line. The three pairs of ROIs are traversed in this way and the corresponding lines are drawn.
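A minimal geometric sketch of this line construction, again for illustration and reusing the assumed `Region` objects from the sketch above: each mirror is described by the direction of its intersection with the ground (an assumed, pre-calibrated input), and each pair of ROIs contributes either a connecting line or a perpendicular through the single valid centroid.

```python
import numpy as np

def lines_from_pairs(regions, mirror_ground_dirs, pairs=((0, 3), (1, 4), (2, 5))):
    """Return lines as (point, direction) tuples in image coordinates.
    `mirror_ground_dirs[i]` is the unit direction of mirror i's ground
    intersection line (an assumed, pre-calibrated input)."""
    lines = []
    for i, j in pairs:
        a, b = regions[i], regions[j]
        if a.valid and b.valid:
            # Both mirror images valid: connect the two centroids.
            p = np.asarray(a.centroid, dtype=float)
            q = np.asarray(b.centroid, dtype=float)
            lines.append((p, q - p))
        elif a.valid or b.valid:
            # Only one image valid: the object lies on the line through that
            # centroid perpendicular to the mirror/ground intersection.
            r, k = (a, i) if a.valid else (b, j)
            dx, dy = mirror_ground_dirs[k]
            normal = np.array([-dy, dx], dtype=float)   # perpendicular direction
            lines.append((np.asarray(r.centroid, dtype=float), normal))
        # If both regions are invalid, this pair contributes no line.
    return lines
```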
If three straight lines are finally obtained, the true root position is the centroid of the three pairwise intersection points of the lines (the horizontal and vertical coordinates are averaged separately). If two straight lines are obtained, the true root position is their intersection point. If only one straight line is obtained, a vertical centre line is drawn and the intersection of the line with the centre line is the true root position. This identification scheme can resolve most special cases (as shown in FIG. 9).
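For illustration, and continuing with the assumed `(point, direction)` line representation from the previous sketch, the following function resolves the root position from one, two or three lines as just described; treating the vertical centre line through the image centre as the fallback axis follows the text, while all names are hypothetical.

```python
import numpy as np
from itertools import combinations

def intersect(l1, l2):
    """Intersection of two (point, direction) lines; None if parallel."""
    (p1, d1), (p2, d2) = l1, l2
    A = np.column_stack((d1, -d2))
    if abs(np.linalg.det(A)) < 1e-9:
        return None
    t, _ = np.linalg.solve(A, p2 - p1)
    return p1 + t * d1

def locate_root(lines, image_center):
    """Estimate the seedling root position from one to three lines."""
    if len(lines) >= 2:
        # Average the pairwise intersection points (two lines give one point).
        pts = [intersect(a, b) for a, b in combinations(lines, 2)]
        pts = [p for p in pts if p is not None]
        return np.mean(pts, axis=0) if pts else None
    if len(lines) == 1:
        # Single line: intersect it with a vertical centre line through
        # the image centre (the fallback described in the text).
        center_line = (np.array(image_center, dtype=float), np.array([0.0, 1.0]))
        return intersect(lines[0], center_line)
    return None
```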
FIG. 10 shows representative fluorescence images of five soybean seedlings taken with the present invention; the fluorescence regions on the left and right correspond to the images in the six mirrors, and the central fluorescence region is the image of the seedling taken directly by the camera. Panels 1-5 show five possible weed-interference scenarios: all seven fluorescence regions usable, one fluorescence region lost, two symmetric regions lost, two asymmetric regions lost, and three regions lost. Each panel includes the original image, the binarized image, and the image after the algorithm has run (i.e. with the soybean position determined).
In the early growth stage the leaves of the soybean seedlings block the cotyledons and the weeds may be roughly the same height as the seedlings, so the point where the root enters the soil is difficult to see. The device detects the soybean seedlings automatically: when a seedling is identified the weeding knives avoid it, and otherwise the weeding knives are controlled to remove weeds. Thus, when the device travels through the field, the camera shoots in real time, the positions of the soybean seedlings and weeds are determined automatically, and automatic weeding is achieved in combination with a weeding implement.
To aid understanding of the technical solution, the working principle and mode of operation of the invention in practice are described in detail below.
The invention can be used in the field and also for laboratory research. For laboratory research, the soybean seeds are first washed with distilled water to remove impurities from the seed coat and eliminate their influence on soybean growth. After drying, the seeds are soaked in a 120 ppm fluorescein solution for 48 or 96 hours in a darkroom environment; the fluorescein serves as the signal marker. After treatment, the seeds are germinated indoors (about 20 °C, 50%-60% relative humidity) and covered with filter paper in the incubator. After about 3 days the germinated seeds are transplanted into nursery trays filled with organic soil. The identification device can be applied to weed identification and weeding, and can also support further research on weed identification and removal, providing a basis and new research ideas for deeper study.
The above embodiments are merely illustrative, and not restrictive, and those skilled in the relevant art can make various changes and modifications without departing from the spirit and scope of the invention, and therefore all equivalent technical solutions also belong to the scope of the invention.
Those not described in detail in this specification are within the skill of the art.

Claims (8)

1. A computer-vision-based device for automatically identifying weeds among soybean plants, characterized by comprising: a darkroom system, a grayscale camera, a computer, blue LED lamps, a transformer and mirrors;
the darkroom system is built from structural profiles and covered on the outside with a black light-absorbing curtain; the grayscale camera is fixed at the top of the darkroom interior and connected to the computer; six blue LED lamps are fixed in the middle of the darkroom interior and connected to the transformer; six mirrors with suitably adjusted inclination angles are fixed at the bottom of the darkroom interior; and each mirror is parallel to the corresponding mirror on the opposite side.
2. The apparatus of claim 1, wherein: the grayscale camera is mounted 600-1460 mm above the ground.
3. The apparatus of claim 1, wherein: the grayscale camera has a 520nm filter lens.
4. The apparatus of claim 1, wherein: the lower edges of four of the six mirrors form an angle of 35 degrees with the horizontal.
5. A method for automatically identifying weeds among soybean plants based on computer vision, which uses the device of any one of claims 1 to 4, and is characterized by comprising the following steps:
step 1, clean the soybean seeds with distilled water, dry them, soak them in a fluorescein solution at an effective dose, and sow the soaked seeds;
step 2, after the soybean seeds have developed cotyledons, use the identification device to identify weeds automatically; adjust the inclination angle of the blue LED lamps so that light falls evenly on the soybean seedlings without producing reflections off the mirrors;
step 3, adjust the voltage of the transformer to a suitable value; the brightness of the blue LED lamps changes with the transformer voltage, so that the light illuminating the plants is stable;
step 4, adjust the positions of the mirrors and the camera, operate the computer, set the exposure and gain of the grayscale camera and keep them unchanged, and finally fix the relative positions of the blue LED lamps, the camera and the mirrors;
step 5, when the device travels in the field, the camera shoots in real time and the picture in which the soybean seedling is at the centre of the mirrors is captured by a corresponding algorithm;
and step 6, using an algorithm, connect lines across the seven viewing angles captured by the camera and compute their intersection points, handle weed interference of any number and at any position, and accurately locate the soybean seedlings in the field.
6. The method of claim 5, wherein: in step 1, after the seeds are dried, they are soaked in a 120 ppm fluorescein solution for 48 or 96 hours in a darkroom environment.
7. The method of claim 5, wherein step 6 is as follows:
S61, binarize the images of the seven viewing angles captured by the camera;
S62, define regions of interest from the binary image;
S63, identify the valid regions of interest among them;
S64, compute the centroid (mass point) of each valid region of interest;
and S65, draw the corresponding straight lines from the centroids of the valid regions of interest and solve for the true position of the soybean seedling root.
8. The method of claim 7, wherein step 6 is specifically as follows:
using the computer, perform median filtering and image sharpening based on MATLAB, extract the fluorescence signal from the image, set a threshold according to the difference in grey values to extract a binary image of the soybean cotyledons, and remove noise by opening and closing operations;
remove regions with weak fluorescence signals:
first calculate the areas of all fluorescence regions; then traverse each pair of centrally symmetric fluorescence regions, and if the ratio of the area of each fluorescence region in the pair to the maximum of the six region areas is less than 0.3, judge the pair of fluorescence regions invalid; after traversing the three centrally symmetric pairs in this way, traverse the remaining valid regions, and if the ratio of the smaller to the larger area within a pair of valid regions is less than 0.5, the smaller region of the pair is invalid and the larger region is valid;
locate the soybean seedling root from the valid fluorescence regions:
if the fluorescence signals in a pair of centrally symmetric regions of interest are both valid, draw a line connecting the centroids of the pair; if one region of the centrally symmetric pair is invalid, a perpendicular can still be drawn according to the plane-mirror imaging principle, the position of the real object lying on the straight line through the image position perpendicular to the intersection of the mirror surface and the ground; traverse the three pairs of fluorescence regions of interest in this way and draw the corresponding straight lines;
if three straight lines are finally obtained, the true position of the soybean seedling root is the centroid of the three pairwise intersection points of the lines; if two straight lines are obtained, the true position of the soybean seedling root is their intersection point; and if only one straight line is obtained, a vertical centre line is drawn and the intersection of the line with the centre line is the true position of the soybean seedling root.
CN202111245172.1A 2021-10-26 2021-10-26 Automatic identifying device and method for soybean inter-plant weed based on computer vision Active CN113971759B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111245172.1A CN113971759B (en) 2021-10-26 2021-10-26 Automatic identifying device and method for soybean inter-plant weed based on computer vision


Publications (2)

Publication Number Publication Date
CN113971759A (en) 2022-01-25
CN113971759B (en) 2024-04-16

Family

ID=79588531

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111245172.1A Active CN113971759B (en) 2021-10-26 2021-10-26 Automatic identifying device and method for soybean inter-plant weed based on computer vision

Country Status (1)

Country Link
CN (1) CN113971759B (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10111929A (en) * 1996-10-04 1998-04-28 Iseki & Co Ltd Weed recognizer for lawn mower
CN106683069A (en) * 2015-11-04 2017-05-17 重庆泰升生态农业发展有限公司 Method for recognizing inline crops and weeds in seedling stage of farmland
CN109416735A (en) * 2016-05-12 2019-03-01 巴斯夫欧洲公司 The identification of weeds in natural environment
US20190220666A1 (en) * 2016-05-12 2019-07-18 Bayer Cropscience Aktiengesellschaft Recognition of weed in a natural environment
CN108596173A (en) * 2018-04-19 2018-09-28 长春理工大学 One camera full view wire size real-time distinguishing apparatus and its detection method
CN109410236A (en) * 2018-06-12 2019-03-01 佛山市顺德区中山大学研究院 The method and system that fluorescent staining image reflective spot is identified and redefined

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117315493A (en) * 2023-11-29 2023-12-29 浙江天演维真网络科技股份有限公司 Identification and resolution method, device, equipment and medium for field weeds
CN117315493B (en) * 2023-11-29 2024-02-20 浙江天演维真网络科技股份有限公司 Identification and resolution method, device, equipment and medium for field weeds

Also Published As

Publication number Publication date
CN113971759B (en) 2024-04-16

Similar Documents

Publication Publication Date Title
US10568316B2 (en) Apparatus and methods for in-field data collection and sampling
Tian et al. Machine vision identification of tomato seedlings for automated weed control
Slaughter et al. Autonomous robotic weed control systems: A review
EP2822380B1 (en) Method and apparatus for automated plant necrosis
US11816874B2 (en) Plant identification using heterogenous multi-spectral stereo imaging
CN105844632B (en) Rice strain identification based on machine vision and localization method
Slaughter et al. Vision guided precision cultivation
CN113971759B (en) Automatic identifying device and method for soybean inter-plant weed based on computer vision
Ma et al. Automatic detection of crop root rows in paddy fields based on straight-line clustering algorithm and supervised learning method
Li et al. Image detection and verification of visual navigation route during cotton field management period
CN114049295A (en) Method and system for intelligently detecting and processing downy mildew
CN112630184A (en) Method for identifying phenotype of cotton verticillium wilt disease
He et al. Visual detection of rice rows based on Bayesian decision theory and robust regression least squares method
WO2020011318A1 (en) A system for use when performing a weeding operation in an agricultural field
CN110064601B (en) Seedling detection and classification system and classification method for vegetable grafting
CN114419407B (en) Automatic identification method and device for weeds in rows in seedling stage of transplanted crops
CN106778447A (en) A kind of muskmelon grafting machine grafting seam visual identifying system
CN117036926A (en) Weed identification method integrating deep learning and image processing
Su et al. Computer vision technology for identification of snap bean crops using systemic Rhodamine B
JP2019193582A (en) Plant cultivation device
CN111886982B (en) Detection method of dry land planting operation quality real-time detection system
CN114757891A (en) Plant growth state identification method based on machine vision technology
AU2018250354B2 (en) Method and System for Extracting Shadows from an Image during Optical Based Selective Treatment of an Agricultural Field
Giles et al. Development of a machine vision system for weed control using precision chemical application
Jiang et al. Automatic Localization of Soybean Seedlings Based on Crop Signaling and Multi-View Imaging

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant