CN112507768A - Target detection method and device and image acquisition method and device - Google Patents

Target detection method and device and image acquisition method and device

Info

Publication number
CN112507768A
CN112507768A (application CN202010302566.5A)
Authority
CN
China
Prior art keywords
sampling
image
plants
target
target object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010302566.5A
Other languages
Chinese (zh)
Inventor
陈洪生
董雪松
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Eavision Robotic Technologies Co Ltd
Original Assignee
Suzhou Eavision Robotic Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Eavision Robotic Technologies Co Ltd filed Critical Suzhou Eavision Robotic Technologies Co Ltd
Priority to CN202010302566.5A priority Critical patent/CN112507768A/en
Priority to BR112022020889A priority patent/BR112022020889A2/en
Priority to PCT/CN2020/125251 priority patent/WO2021208407A1/en
Publication of CN112507768A publication Critical patent/CN112507768A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/20Scenes; Scene-specific elements in augmented reality scenes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06MCOUNTING MECHANISMS; COUNTING OF OBJECTS NOT OTHERWISE PROVIDED FOR
    • G06M11/00Counting of objects distributed at random, e.g. on a surface
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07Target detection

Abstract

The invention provides a target object detection method and device and an image acquisition method and device, and relates to the technical field of image detection. Sampling images of consistent height are obtained by dynamically adjusting the sampling posture of the image acquisition device so that the sampling images meet the requirements; the sampling images are labeled and used to train a deep learning model to accurately identify target objects; the successfully trained model then detects the sampling images to obtain the number and proportion of remaining target objects, from which the target object removal degree of the plant area is determined, thereby ensuring seed purity. Because a machine replaces manual labor for automatic sample collection, target object identification, and removal-degree statistics, the invention greatly improves target object removal detection efficiency, greatly shortens detection time, greatly reduces detection cost, and entirely avoids the risks incurred when removal detection personnel must go deep into the field.

Description

Target detection method and device and image acquisition method and device
Technical Field
The invention relates to the technical field of image detection application, in particular to a target object detection method and device and an image acquisition method and device.
Background
When a seed production company cultivates seeds, the seed purity is required to reach 99.7% or more. For corn crops, to obtain high-purity seeds, the seed company must thoroughly remove the tassels of the corn female parent. Mechanical or manual emasculation is currently adopted, and the emasculation effect must be checked promptly during the process to guarantee the emasculation rate; emasculation is repeated when the purity does not meet the requirement.
The emasculation detection method of a conventional seed production company generally comprises: walking manually to a number of sampling points and randomly inspecting several plants in each area to check whether tassel removal is complete. The detection effect is poor, and spot checks must be repeated over multiple days, otherwise the emasculation rate cannot be guaranteed. This method is both time-consuming and labor-intensive, and because large-scale sampling is too costly, emasculation purity is difficult to ensure.
Disclosure of Invention
In view of the above, the present invention provides a target object detection method and apparatus and an image acquisition method and apparatus, which acquire highly consistent sample images under machine control and perform target object identification and removal-degree statistics. This greatly improves the efficiency of target object removal detection, greatly shortens the detection time, greatly reduces the detection cost, and entirely avoids the risks incurred when removal detection personnel must work deep in the field.
In a first aspect, an embodiment provides a target detection method, including:
acquiring a sampling image of sampling points of a regularly planted plant area, wherein the plant area comprises male parent row plants and female parent row plants, and the sampling image is acquired from the position right above the plants and comprises all the female parent row plants except the male parent row plants;
and identifying the target object of the plant according to the sampling image so as to remove and detect the target object.
In an alternative embodiment, the horizontal axis of the sampled image is at a predetermined angle to the plant rows.
In an optional embodiment, the plant area includes a plurality of sampling points, and the sampling images corresponding to each sampling point do not overlap with each other.
In an alternative embodiment, the sample image includes location information.
In alternative embodiments, the target comprises either or both of tassels and buds.
In an optional embodiment, identifying the target object of the plant according to the sampled image for target object removal detection includes:
training a deep learning model according to the sampling image to obtain a target object model;
and identifying the target object in the sampling image according to the target object model.
In an alternative embodiment, the deep learning model includes any open source or self-developed neural network based on deep learning target detection.
In an alternative embodiment, the step of training the deep learning model according to the sampling image includes: marking the sampling images to form a training set with marked sampling images so as to train a deep learning model and further obtain a target object model; the step of identifying the target in the sample image according to the target model comprises: and identifying the target object in the unlabeled sampling image according to the target object model.
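The labeling-and-split step described above can be sketched as a simple data structure: each sampling image is paired with bounding-box labels for remaining tassels, labeled images form the training set, and unlabeled images are reserved for inference by the trained target object model. All names (`TasselAnnotation`, `build_training_set`) and the box format are illustrative assumptions, not terms from the patent.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class TasselAnnotation:
    """One sampling image with its (possibly empty) tassel labels."""
    image_path: str
    # Each box is (x_min, y_min, x_max, y_max) in pixel coordinates.
    boxes: List[Tuple[int, int, int, int]] = field(default_factory=list)

def build_training_set(annotations: List[TasselAnnotation]):
    """Split annotations: labeled images train the deep learning model,
    unlabeled images are later detected by the trained target model."""
    labeled = [a for a in annotations if a.boxes]
    unlabeled = [a for a in annotations if not a.boxes]
    return labeled, unlabeled
```

Any deep-learning detector (the patent allows any open-source or self-developed network) would consume the labeled set in this shape.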
In an optional embodiment, identifying a target object of the plant according to the sampling image to perform target object removal detection further includes:
determining the number of plants of the sampling image;
counting the number of the target objects according to the identified target objects;
and determining the removal degree of the target object according to the number of the plants and the number of the target object.
In an alternative embodiment, the step of determining the number of plants of the sampled image comprises:
counting the number of plants in the sampling image;
or,
identifying plants in the sampling image to obtain the number of the plants;
or,
the method comprises the steps of obtaining the number of plants in a preset number of sampling images, calculating the average number of the plants in the sampling images, and determining the number of the plants in the sampling images according to the average number of the plants and the total number of the sampling images.
In a second aspect, an embodiment provides an object detecting device, including:
the system comprises a sampling image acquisition module, a data acquisition module and a data processing module, wherein the sampling image acquisition module is used for acquiring a sampling image of sampling points of a regularly planted plant area, the plant area comprises male parent row plants and female parent row plants, the sampling image is acquired from the position right above the plants, and the sampling image comprises all the female parent row plants except the male parent row plants;
and the removal detection module is used for identifying the target object of the plant according to the sampling image so as to carry out target object removal detection.
In a third aspect, an embodiment provides an image capturing method, including:
acquiring image information of sampling points of regularly planted plant areas, wherein the plant areas comprise male parent row plants and female parent row plants;
determining a target sampling posture of the image acquisition equipment according to the image information, wherein the target sampling posture at least comprises one or more of the following: collecting height and collecting angle, wherein the collecting angle is the angle between the horizontal axis of the image collecting device and the planting row of the plants;
and acquiring a sampling image corresponding to the sampling point based on the target sampling attitude.
In an alternative embodiment, the sample image includes location information.
In an alternative embodiment, the step of determining the target sampling posture of the image acquisition device according to the image information comprises:
and adjusting the current height of the image acquisition equipment until the image information comprises all female parent row plants except the male parent row plants, and determining the sampling height in the target sampling attitude.
In an alternative embodiment, the step of determining the target sampling posture of the image acquisition device according to the image information comprises:
and adjusting the current acquisition direction of the image acquisition equipment until an image horizontal axis of the image acquisition equipment forms a preset angle with the plant planting line in the image information, and determining the acquisition angle in the target sampling posture.
In an optional embodiment, the target sampling posture further includes a collecting direction, and the collecting direction is a direction of the image collecting device facing the ground, wherein the collecting direction is vertically downward.
In an optional embodiment, the step of obtaining image information of sampling points of the regularly planted plant area comprises:
acquiring sampling points in a regularly planted plant area and position coordinates of the sampling points;
planning a flight path from the image acquisition equipment to the position coordinates according to the position coordinates of the sampling points;
obtaining the image information based on the flight path.
In an alternative embodiment, the method further comprises:
and controlling the exposure of the sampling image according to the brightness of the image information.
In a fourth aspect, an embodiment further provides an image capturing apparatus, including:
the system comprises an image acquisition module, a data processing module and a data processing module, wherein the image acquisition module is used for acquiring image information of sampling points of regularly planted plant areas, and the plant areas comprise male parent row plants and female parent row plants;
the posture determining module is used for determining a target sampling posture of the image acquisition equipment according to the image information, wherein the target sampling posture at least comprises one or two of the following: collecting height and collecting angle, wherein the collecting angle is the angle between the horizontal axis of the image collecting device and the planting row of the plants;
and the image determining module is used for acquiring the sampling image corresponding to the sampling point based on the target sampling attitude.
According to the target object detection method and device, sampling images of consistent height are obtained and detected by the successfully trained deep learning model to obtain the number and proportion of remaining target objects, from which the target object removal degree of the plant area is determined. A machine replaces manual labor for automatic sample collection, target object identification, and removal-degree statistics, which greatly improves removal detection efficiency, greatly shortens detection time, greatly reduces detection cost, and entirely avoids the risks incurred when removal detection personnel must work deep in the field.
Additional features and advantages of the disclosure will be set forth in the description which follows, or in part may be learned by practice of the disclosure.
In order to make the aforementioned objects, features and advantages of the present disclosure more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
FIG. 1 is a schematic diagram of a male parent and a female parent plant provided by an embodiment of the present invention;
FIG. 2 is a flowchart of an image acquisition method according to an embodiment of the present invention;
fig. 3 is a schematic diagram of image information of an image capturing device in a preparation position according to an embodiment of the present invention;
fig. 4 is a schematic diagram of image information of an image acquisition device at a sampling position according to an embodiment of the present invention;
FIG. 5 is a flowchart of a target detection method according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a user interface for emasculation detection according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of an application scenario of a castration detection method according to an embodiment of the present invention;
FIG. 8 is a functional block diagram of an image capturing device according to an embodiment of the present invention;
FIG. 9 is a functional block diagram of an apparatus for detecting a target object according to an embodiment of the present invention;
fig. 10 is a schematic diagram of a hardware architecture of an electronic device according to an embodiment of the present invention.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Emasculation detection is widely used in the field of agricultural breeding; corn emasculation is taken here as an example. When a seed production company produces seed, the planting area typically exceeds 1000 mu and a field boundary can exceed 1 kilometer, with plants planted in rows. During emasculation, tassels on the female parent rows are first removed by mechanical equipment, while tassels on the male parent rows are retained for pollination. However, mechanical emasculation of the dense female parent rows usually misses some tassels, or tassels grow out afterwards. To guarantee the emasculation rate of the female parent rows, the emasculation of the 1000-mu field must therefore be spot-checked by manual walking and inspection, repeated at intervals, which consumes enormous time as well as huge manpower and material resources.
In corn seed production, male parent rows and female parent rows are planted at intervals, as shown in FIG. 1. In the pollination period, the tassels of the female parent rows are removed and only the tassels of the male parent rows remain, so that during pollination the fruit borne on the female parent plants is formed by combining male parent pollen with the female parent ovum, realizing hybrid seed production. If emasculation of the female parent rows is poor, female parent pollen pollinates its own ear and forms selfed seed, which greatly reduces seed purity. Therefore, to ensure that the tassels on the female parent rows are completely removed, emasculation detection is required to achieve better seed purity. The planting pattern may be as shown in FIG. 1, for example: 4 female parent rows, 2 male parent rows, 4 female parent rows, and 2 male parent rows planted at intervals with a fixed row spacing; or 6 female parent rows and 2 male parent rows planted alternately. The numbers of female and male parent rows and the row spacing are not limited here.
For emasculation detection staff, walking manually to the sampling points wastes time and energy. Conditions in a planting area of crops such as corn are complicated; workers may encounter snakes and insects or suffer heatstroke. Checking emasculation manually is inefficient and costly, and because of the cost, plants cannot be sampled in large quantities, making seed purity difficult to guarantee.
Due to the limitations of manual emasculation detection, through the inventors' research and experiments, an automatic control device can control an image acquisition device, such as an aircraft taking aerial photographs, to acquire pictures of the emasculated planting field, and the emasculation rate can be read out by a deep learning method. However, this scheme faces the following technical difficulties, which have prevented it from being applied successfully in production practice:
1. It is difficult to automatically separate male and female parent plants with high precision through machine-learning image recognition.
2. Because the branches and leaves of corn in the flowering phase are interlaced, it is difficult to automatically and accurately count the number of corn plants by machine-learning image recognition, and hence to give the ratio of missed tassels.
3. Images with different ground clearance, plant number, and exposure are difficult to recognize accurately.
4. Owing to complex field conditions and huge differences among pictures, missed tassels of a large number of corn varieties are difficult to detect with high precision.
5. The detection process is complex and requires much manual participation, so emasculation detection efficiency and precision are hard to improve effectively.
Based on the above, the target detection method and device and the image acquisition method and device provided by the embodiments of the present invention acquire highly consistent sampling images in a machine control manner, so as to ensure the efficiency and accuracy of target removal detection, save time and labor, and reduce the detection cost.
To facilitate understanding, the image capturing method disclosed in the embodiment of the present invention is first described in detail. The method is mainly applied to a control device, such as an aircraft, and is adapted to regularly planted plant scenes, where regular planting includes but is not limited to planting in rows or in plots, such as regularly planted corn, rice, soybean, or rape. The embodiment of the present invention takes corn emasculation as an example.
Fig. 2 is a flowchart of an image acquisition method according to an embodiment of the present invention.
Referring to fig. 2, the image acquisition method mainly includes the following steps:
step S102, obtaining image information of sampling points of regularly planted plant areas, wherein the plant areas comprise male parent row plants and female parent row plants.
Step S104, determining a target sampling posture of the image acquisition equipment according to the image information, wherein the target sampling posture at least comprises one or more of the following: the height and the collection angle are collected, wherein the collection angle is the angle between the horizontal axis of the image collection device and the planting row of the plants.
And S106, acquiring a sampling image corresponding to the sampling point based on the target sampling attitude.
In a preferred embodiment of practical application, the acquisition height and acquisition angle of the image acquisition device are determined from the acquired image information of the sampling point of the plant area so as to reach the target sampling posture, and sampling images of consistent height are acquired at the sampling points according to that posture, so that target object removal detection, such as emasculation detection, can be performed on the plants in the sampled images.
The embodiment of the present invention is described by taking emasculation detection as an example, but the present invention is not limited to this, and is also applicable to other target removal detection scenarios.
Specifically, highly consistent sampling images are acquired in a standardized manner under machine control. For regularly planted plant scenes (highly consistent planting rules), the image acquisition method yields highly consistent sampling images (highly consistent acquisition conditions), ensuring that each sampling image covers the same actual area and that the number of sampled plants is consistent (generally, each sampling image contains the same number of plants), so that spot checks can be carried out reliably.
As an optional embodiment, for plants regularly planted in a planting area, the plane position and shooting height of the aircraft can be determined manually through a remote control device, ensuring consistency of the captured images, eliminating interference from other plant rows, and improving the consistency of the sampled images. The boundary between male and female parent rows is confirmed by eye, and the acquisition height and acquisition angle are determined while excluding male parent row interference during sampling; this ensures accuracy and avoids the difficulty that machine recognition can hardly distinguish the rows because male and female parent plants look extremely similar. After the sampling images are obtained, the number of plants displayed in one sampling image is obtained (counted manually or recognized from the image, without limitation), from which the total number of plants in all sampling images can be derived for subsequent statistics. Collecting tassel images of the female parent rows with the image acquisition device to check whether emasculation meets requirements improves detection efficiency and saves manpower and material resources. Spot-checking tassels from images collected by a remotely controlled aircraft avoids manual operation and the risks of entering the field, saving manpower and material resources while improving detection effect and efficiency.
As another alternative embodiment, the attitude of the aircraft, that is, the acquisition height, the acquisition angle, and the like, may be controlled by the upper computer, and then the aircraft carries an image acquisition device (camera) to perform image acquisition.
In an optional embodiment, step S104 further includes the following steps:
step 1.1), adjusting the current height of the image acquisition equipment until the image information comprises all the plants in the female parent row except the plants in the male parent row, and determining the sampling height in the target sampling posture.
As shown in fig. 4, when the male parent rows are just no longer visible in the image information and all female parent rows between the adjacent male parent rows are retained, the aerial photographing height at that moment is the sampling height in the target sampling posture. In the corn emasculation scene, the shot boundary is the male parent plants; in other application scenes with continuous plant rows, the boundary can be set as required.
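The height-adjustment rule above can be approximated geometrically: for a downward-facing camera with a known horizontal field of view, the aircraft descends until the ground footprint of the image just spans the female parent rows between the two male parent rows. This closed-form sketch is an assumption for illustration, not the patent's control procedure.

```python
import math

def sampling_height(female_span_m: float, horizontal_fov_deg: float) -> float:
    """Height at which the image's horizontal ground footprint equals the
    width of the female parent rows (male rows just out of frame)."""
    half_fov = math.radians(horizontal_fov_deg) / 2.0
    return female_span_m / (2.0 * math.tan(half_fov))
```

For example, with a 90° field of view, a 4 m span of female parent rows is just covered from a height of 2 m; a narrower lens requires a greater height for the same span.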
In an alternative embodiment, the target sampling poses comprise the same acquisition direction, wherein the acquisition direction is the direction of the image acquisition device towards the ground, such as vertically downwards.
As a preferred embodiment, the plants are shot vertically downward by the aerial lens to obtain the sampling image, so that the tops of the tassels are recognized, improving the tassel recognition rate. If the lens were inclined, only part of each tassel would be recognized, and overlapping tassels would reduce the recognition accuracy.
In an alternative embodiment, step S104 can be further implemented by the following steps:
Step 1.2), adjusting the current acquisition direction of the image acquisition device until the horizontal axis of the image acquisition device forms a preset angle with the plant rows in the image information, and determining the acquisition angle in the target sampling posture. Here, when the sampling image is acquired, the preset angle between the plant row direction in the image information and the horizontal axis of the image is required to be no more than ±15° to ensure the accuracy and robustness of image recognition. In a preferred embodiment, the current acquisition direction of the image acquisition device is adjusted until its horizontal axis is parallel or perpendicular to the plant rows in the image information, so that the acquisition angle in the target sampling posture is 0° or 90°. That is, the plant rows in the collected image are parallel or perpendicular to the image boundary, and an acquisition angle of 0° or 90° is preferred.
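The ±15° tolerance check described above can be sketched as follows, taking the plant row direction as a 2-D vector in image coordinates and folding the angle so that both the parallel (0°) and perpendicular (90°) preferred orientations pass. The helper names and vector representation are illustrative assumptions.

```python
import math

def row_axis_angle_deg(row_dx: float, row_dy: float) -> float:
    """Angle between the plant row direction and the image horizontal
    axis, folded into [0, 90] degrees."""
    angle = abs(math.degrees(math.atan2(row_dy, row_dx))) % 180.0
    if angle > 90.0:
        angle = 180.0 - angle
    return angle

def angle_within_tolerance(row_dx: float, row_dy: float,
                           tol_deg: float = 15.0) -> bool:
    """True if the rows are within tol_deg of parallel or perpendicular
    to the image horizontal axis (the patent's preset-angle requirement)."""
    a = row_axis_angle_deg(row_dx, row_dy)
    return a <= tol_deg or a >= 90.0 - tol_deg
```

A row vector of (1, 0.1) is about 5.7° off-horizontal and passes; a 45° diagonal row fails and would trigger a posture adjustment.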
In an optional implementation manner, the method provided in the embodiment of the present invention further includes:
step 1.3), controlling the exposure of the sampled image according to the brightness of the image information.
Here, the aerial photography brightness is controlled according to the requirement that the brightest part of the image information must not show obvious white spots, ensuring the quality of the sampled image and preventing situations where the picture cannot be recognized.
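The white-spot rule above can be approximated numerically: treat the image as overexposed when the fraction of near-saturated pixels exceeds a small threshold, and reduce exposure until it does not. The threshold values and function names here are illustrative assumptions, not values from the patent.

```python
def is_overexposed(gray_pixels, white_level=250, max_fraction=0.01):
    """gray_pixels: flat iterable of 8-bit luminance values.
    Returns True when obvious white spots are present."""
    pixels = list(gray_pixels)
    white = sum(1 for p in pixels if p >= white_level)
    return white / len(pixels) > max_fraction

def adjust_exposure(exposure, gray_pixels, step=0.8):
    """One step of the exposure control loop: lower exposure while the
    brightest part of the image still shows white spots."""
    return exposure * step if is_overexposed(gray_pixels) else exposure
```

In practice this check would run on the live preview of the image acquisition device before the sampling image is captured.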
In an optional embodiment, the plant area includes a plurality of sampling points, and the sampling images corresponding to each sampling point do not overlap with each other.
Specifically, to detect removal in the plant area relatively accurately, a plant area generally includes multiple sampling points, and the sampling images collected at each sampling point must not overlap one another, to avoid statistical errors in the tassel and plant counts and thus ensure the accuracy of the emasculation detection result.
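The non-overlap requirement above can be checked geometrically: modeling each sampling image as an axis-aligned rectangular ground footprint of identical size, two footprints are disjoint when their centers are separated by at least one footprint width or height. This aligned-rectangle model is a simplifying assumption.

```python
def footprints_overlap(p1, p2, width, height):
    """p1, p2: (x, y) ground coordinates of two sampling-image centers.
    Returns True if their equal-sized, axis-aligned footprints overlap,
    i.e. the same plants could be counted twice."""
    dx = abs(p1[0] - p2[0])
    dy = abs(p1[1] - p2[1])
    return dx < width and dy < height
```

Running this check over all pairs of planned sampling points would confirm that no tassel or plant is double-counted.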
As an alternative embodiment, the aerial aircraft is controlled (by aircraft remote control or by an upper computer) to fly above the target sampling point, hover, and adjust its flying height so that the lens acquisition angle and flight acquisition direction meet the above requirements and the target female parent rows with the male parent rows on both sides are all within the picture, as shown in fig. 3, the preparation position of the aircraft. The height of the aircraft is then slowly reduced and its position and posture adjusted until the male parent rows just move out of the picture, all female parent rows between them remain completely in the picture, and the horizontal axis of the sampling image is parallel to the direction of the plant rows, reaching the aircraft position shown in fig. 4. The exposure is adjusted until the brightest part of the picture has no large white spots, and the image is captured. Note that if the sampled image contains only part of the female parent rows, such as 3 rows or 2 rows of the female parent in the figure, sampling accuracy is affected.
It can be understood that the emasculation detection sampling images and the deep learning training set images in the embodiment of the invention must both be acquired by the same method, so as to ensure the accuracy of tassel pattern detection. In an alternative embodiment, step S102 further includes the following steps:
step 2.1), acquiring sampling points and position coordinates of the sampling points in the regularly planted plant area;
step 2.2), planning a flight path of the image acquisition equipment to the position coordinates according to the position coordinates of the sampling points;
and 2.3) obtaining image information based on the flight path.
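The patent does not specify how the flight path of step 2.2) is planned from the sampling-point position coordinates. A minimal sketch, assuming a simple nearest-neighbor ordering (the function name and the start point are illustrative, not from the source):

```python
import math

def plan_flight_path(start, sampling_points):
    """Order sampling points into a nearest-neighbor flight path.

    `start` and each sampling point are (x, y) position coordinates;
    the aircraft repeatedly flies to the closest not-yet-visited point.
    """
    remaining = list(sampling_points)
    path = []
    current = start
    while remaining:
        nearest = min(remaining, key=lambda p: math.dist(current, p))
        remaining.remove(nearest)
        path.append(nearest)
        current = nearest
    return path

points = [(3, 4), (0, 1), (6, 0)]
path = plan_flight_path((0, 0), points)  # visits (0, 1) first, then (3, 4), then (6, 0)
```

Nearest-neighbor ordering is only one possible heuristic; any route that visits every sampling point satisfies step 2.2).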
The execution subject of the sampling-image acquisition method is the flight controller (controlling an aircraft carrying the image acquisition device) or the aircraft carrying the image acquisition device itself; the method is realized by controlling the aircraft to reach the sampling point and controlling the image acquisition device to acquire the sampling image.
As an alternative embodiment, the aircraft receives the position coordinates of a sampling point in the planting area and arrives above the sampling point along the flight path; the sampling points may be random or preset. The aircraft can reach each sampling point under the remote control of an operator, or it can plan a flight path according to the preset or randomly set sampling-point positions of the planting area and automatically reach each sampling point along the planned path to acquire images.
In a practical image acquisition scene, the flying height of the aircraft is reduced to a preset value and it is judged (by machine or manually) whether male parent row plants are present in the image shot by the image acquisition device. At this moment, several unremoved tassels may still exist on the female parent row plants, while the tassels on the male parent row plants are never emasculated, so if male parent row plants are present in the picture they affect the recognition result. Meanwhile, if male parent rows appear in the image, the aircraft can be controlled to adjust the acquisition angle, including but not limited to adjusting the flight attitude of the aircraft or the attitude of the image acquisition device, so that the angle between the horizontal axis of the acquired image and the planting rows is 0 degrees, until the image acquisition device captures only the female parent row plants; that position is determined as the sampling position, and the target sampling attitude of the sampling position is determined.
It is to be understood that, based on the above manner of acquiring image information including the position coordinates of the sampling points, in an alternative embodiment the sampled image includes position information, such as the position information corresponding to the sampling points. Based on the position information of the sampling points, the exact position of the target object can be located.
On the basis of the above embodiment, the image acquisition device is controlled to acquire the image information at the sampling position, obtain the sampling image and store it. The steps are repeated until the number of obtained sampling images meets the requirement. To obtain the emasculation detection result more accurately, a large number of sampling images need to be acquired.
In some embodiments, after a predetermined time interval, the image sampling step is repeated in the planting area, and this is repeated several times, so as to ensure that emasculation detection continues during the growth period of the plants and prevent late-growing tassels from going undetected.
Further, before the sampling position is determined, it is also necessary to judge (by machine or manually) whether the photographed image includes the female parent plants of all the continuous planting rows. For example, if the number of continuously planted female parent rows is known to be 4, it is judged whether the photographed image includes all four female parent rows; if only three are included, one row is omitted from the sampling inspection, which affects the correctness of the sampling inspection result. It should be understood that the number of rows is known here and can be preset according to the planting pattern.
For the sample image shown in fig. 4, the sample image is a bottom view, i.e. an image with the lens viewing angle vertically downward, and the lens attitudes of the image capturing device are fixed (the same) when different images are captured. The lens axis of the image acquisition device can be fixed in the vertical direction and the attitude of the aircraft kept unchanged during image acquisition, ensuring that the attitude of the image acquisition device is the same for every image; alternatively, the lens of the image acquisition device can be controlled through a gimbal to compensate for changes in the aircraft attitude, ensuring that the lens attitude and the acquisition direction are the same whenever images are collected.
During actual operation, as shown in fig. 7, an operator can remotely control the aircraft to a random or preset sampling point of the planting area, adjust the acquisition direction, and remotely reduce the flying height. While the height is being reduced, the real-time image shot by the image acquisition device is used to confirm whether the image to be collected includes only the female parent rows and whether the corn rows are parallel or perpendicular to the horizontal axis of the picture. If these conditions are met and all continuously planted female parent rows are included, the image acquisition device is controlled to collect the image. If the image also contains male parent rows or only some of the continuous female parent rows, or the plant rows are neither parallel nor perpendicular to the horizontal axis of the picture, the position of the aircraft is adjusted while the height is reduced until the photographed image contains only the female parent rows, contains all of the continuous female parent rows, and the plant rows are parallel or perpendicular to the horizontal axis of the picture, at which point the image can be acquired. It should be noted that the flying height of the aircraft is not preset but is determined dynamically during shooting according to the plant row spacing, the number of female parent rows, the plant column spacing, the field of view of the aerial image acquisition device, and so on. During actual emasculation detection, the image acquisition device may obtain 80 pictures at a sampling point and each picture may contain 30 plants, so one sampling point yields the emasculation information of 2400 plants, which greatly improves the emasculation detection efficiency; the numbers of pictures and plants here are not limited and are only examples.
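The per-point coverage figures above (80 pictures per sampling point, 30 plants per picture, both stated as examples only) are a simple product, which can be sketched as:

```python
def plants_per_sampling_point(images_per_point, plants_per_image):
    """Number of plants whose emasculation state one sampling point covers."""
    return images_per_point * plants_per_image

# Using the illustrative figures from the text:
total = plants_per_sampling_point(80, 30)  # 2400 plants covered per sampling point
```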
After a sampling point has been collected, the operator remotely controls the aircraft to another random or preset sampling point and repeats the operation until the collection of a plurality of sampling points is completed. The number of sampling points may be 10, 20, 30, 40, 50, 60, 70, 80, 90, 100 and so on, without limitation.
As an optional embodiment, if the position information of the planting area and information such as the row spacing, column spacing, number of female parent rows and number of male parent rows are obtained, the flight path of the aircraft for image acquisition can be planned and automatic acquisition realized. The positions of the sampling points in the planting area are determined in a preset or random manner, a flight path is planned, and a sampling path (including a preset flight height, calculated so that only the female parent rows, and all continuous female parent rows, are visible) is planned according to the position information of each sampling point. After the aircraft obtains the flight path and the sampling paths, it flies to a sampling point along the flight path, descends along that point's sampling path to the preset height, acquires the sampling image, and after completing that point flies along the flight path to the next sampling point to collect its image, until all sampling points have been collected, whereupon it flies back to the end point.
In some embodiments, after the sampling image with higher consistency is obtained based on the image acquisition method of the foregoing embodiment, the target object detection method may be executed, so as to achieve the purpose of removing the target object, as shown in fig. 5, specifically including the following steps:
step S202, acquiring a sampling image of sampling points of a regularly planted plant area, wherein the plant area comprises male parent row plants and female parent row plants, and the sampling image is acquired from the position right above the plants and comprises all the female parent row plants except the male parent row plants;
and step S204, identifying the target object of the plant according to the sampling image to remove and detect the target object.
By the target object detection method, sampling images of consistent height are obtained by dynamically adjusting the sampling attitude of the image acquisition device; the sampling images are labeled and a deep learning model is trained to accurately identify the target object, and the successfully trained deep learning model detects the sampling images to obtain the number and proportion of residual target objects, so that the target-object removal degree of the plant area is determined and the seed purity is ensured. According to the invention, the machine replaces manpower to automatically collect samples and to carry out target object identification and removal-degree statistics, which greatly improves the target-object removal detection efficiency, greatly shortens the detection time, greatly reduces the detection cost, and thoroughly avoids all the risks caused by detection personnel having to go deep into the field.
In a preferred practical embodiment, target recognition is performed by collecting, from right above the plants, images that include all the female parent row plants and exclude the male parent row plants, so as to detect the removal condition of the target.
In an alternative embodiment, the horizontal axis of the sampled image is at a predetermined angle to the plant rows, so as to improve the detection accuracy of the target object and prevent missed detection.
In an optional embodiment, the plant area includes a plurality of sampling points, and the sampling images corresponding to the sampling points do not overlap one another, which further improves the target object identification and detection efficiency and accuracy.
In an alternative embodiment, the sampled image includes position information for subsequent positioning of the target object.
In alternative embodiments, the target includes either or both of tassels and buds. In corn emasculation detection, both the tassel and the bud can be detected; since bud detection is far more difficult than tassel detection, tassel detection can serve as the reference detection scheme.
Here, the sampling image satisfying the foregoing embodiment has higher consistency, and further, in the image recognition process, the target removal condition can be detected more accurately, and a more accurate detection result is obtained.
Generally, after a sampling image is acquired by the aircraft's image acquisition device, it is stored in a memory so that, after sampling is finished, it can be imported into a preset identification model for identification processing, which reduces the cost of the aircraft; of course, computing equipment can also be arranged on the aircraft and the sampling images processed in real time during flight, so that the identification result is obtained quickly.
The method provided by the embodiment of the invention acquires images of the target object at a specified height, angle and direction, thereby ensuring the consistency of the acquired images and improving the identification accuracy; meanwhile, controlling the preset flying height eliminates the interference of the male parent rows and guarantees that all continuously planted female parent rows are identified, avoiding omission of female parent rows and further improving the identification and detection accuracy.
In an alternative embodiment, step S204 can also be implemented by the following steps:
and 3.1) training a deep learning model according to the sampling image to obtain a target object model.
Specifically, a training set of labeled sampling images can be formed by labeling the sampling images to train a deep learning model, so that a target object model is obtained.
And 3.2) identifying the target object in the sampling image according to the target object model.
Furthermore, the target object in the unlabeled sampling image can be identified according to the target object model.
In an alternative embodiment, the deep learning model includes any open source or self-developed neural network based on deep learning target detection. And the accuracy of target object detection is further improved by training a deep learning model and identifying a detected target object based on the sampling images of the same standard.
In an optional embodiment, step S204 further includes:
step 4.1), determining the number of plants of the sampled image;
as an alternative, the number of plants in each sample image obtained is c1, c2, c3 …, cn, which is known and can be automatically identified or manually read, and the total number of plants in the sample points, c1+ c2 … + cn, if the sizes of the sample images are consistent, the number of plants is obtained, c0, and the number of sample images, N, is obtained, c 0.
As an optional embodiment, a sampling point is selected in the planting area and image acquisition is performed; after the acquisition is completed, the next sampling point is reached and image acquisition is performed again, and so on, so that target object removal over the whole planting area is spot-checked through the image information. The sampling points can be randomly selected or predetermined according to the planting area, without limitation. The number of sampling points is likewise not limited and can be increased appropriately to improve detection accuracy; it can be determined according to the resolution of the image acquisition device, the flight height, the planting row spacing, the planting column spacing and so on, to improve the accuracy of the spot check.
The sampled images can be stored in the memory of the aircraft or transmitted back in real time, without limitation. In particular, the sampling image includes only the female parent row plants and excludes the male parent row plants, which improves the identification accuracy and avoids interference from the male parent rows. At the same time, the sampling image includes all female parent row plants of the continuous planting rows, avoiding omissions and preventing missed detection. Moreover, the number of plants in the sampling image is greater than or equal to a preset value, since too few plants makes detection inefficient, and is less than a second preset value, which avoids inaccurate detection caused by too low a resolution, thereby guaranteeing the accuracy of the sampling statistics. The image may be provided by an image acquisition device such as a mapping device or a camera, and includes one or more of mapping image information and picture information, but is not limited thereto.
And 4.2) counting the number of the target objects according to the identified target objects.
Specifically, the tassel texture features in the sampled image can be identified according to the deep learning model, so as to obtain the number of tassels in the sampled image;
the method comprises the steps of training a preset identification model by utilizing a large amount of historical data (training images) to obtain a depth network model of the tassel, identifying the tassel in a sampling image by the depth learning model through the training images, wherein the acquisition mode of the training images is the same as the standard image acquisition mode in the embodiment, repeated description is omitted, the training images and the sampling images are kept consistent, and the detection precision is improved.
The sampling image is input into the preset identification model, which detects the sampling image to obtain the tassel information. Specifically, the image features of the tassels to be detected are extracted from the obtained sampling image and processed by the deep network model to obtain the tassel information. After the sampling images of a plurality of sampling points are obtained at one time, image processing can be performed on dedicated computing equipment; alternatively, each sampling image may be processed in real time as it is obtained, depending on whether the aircraft is equipped with a computing processor and on its computational capability.
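The patent does not tie the tassel counting to a specific detector output format. A minimal sketch, assuming a generic object detector that returns (label, score) pairs per image and a hypothetical "tassel" class label (both assumptions, not from the source):

```python
def count_tassels(detections, score_threshold=0.5):
    """Count tassel detections at or above a confidence threshold.

    `detections` is a list of (label, score) pairs as a generic object
    detector might return; the label name "tassel" is an assumption.
    """
    return sum(1 for label, score in detections
               if label == "tassel" and score >= score_threshold)

dets = [("tassel", 0.9), ("tassel", 0.4), ("bud", 0.8), ("tassel", 0.7)]
n = count_tassels(dets)  # 2 tassels meet the 0.5 threshold
```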
And 4.3) determining the removal degree of the target object according to the number of the plants and the number of the target object.
Specifically, the emasculation rate can be determined according to the total number of plants and the number of tassels; it is then judged whether the emasculation rate meets the emasculation rate threshold; if so, emasculation is qualified; if not, emasculation is unqualified.
As an alternative embodiment, the total number c of all plants at the sampling point is obtained by manual reading. When the total number of identified tassels is b = b1 + b2 + … + bn, the emasculation rate d is calculated as d = b / c, and whether emasculation is finished can be determined from the value of d. For example, the preset threshold is the tassel proportion that ensures the seeds meet the purity requirement; when d is less than or equal to the preset threshold, the emasculation requirement is met and no further emasculation is needed, otherwise emasculation must be performed again. After a preset time interval, the sampling-image acquisition step is repeated in the same planting area; the sampling image obtained at this point reflects the tassel situation after the plants have grown for a period of time. Acquiring sampling images again at the original sampling points or at other sampling points of the same planting area prevents late-growing tassels from going undetected, and repeating this several times ensures that all tassels growing during the growth period are detected and the emasculation requirement is satisfied.
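The emasculation-rate calculation d = b / c and the threshold comparison described above can be sketched as follows (the per-image tassel counts, plant total and threshold are illustrative only):

```python
def emasculation_rate(tassel_counts, total_plants):
    """d = b / c: ratio of residual tassels b = b1 + b2 + ... + bn
    to the total number of plants c at the sampling point."""
    return sum(tassel_counts) / total_plants

def emasculation_qualified(tassel_counts, total_plants, threshold):
    """Qualified when the residual-tassel ratio does not exceed the
    purity threshold, i.e. no further emasculation is needed."""
    return emasculation_rate(tassel_counts, total_plants) <= threshold

rate = emasculation_rate([3, 1, 2], 300)            # 6 tassels / 300 plants = 0.02
ok = emasculation_qualified([3, 1, 2], 300, 0.02)   # 0.02 <= 0.02, qualified
```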
The sampling images acquired by the invention have high resolution and good consistency. The high resolution preserves the texture features of the tassels, and the high consistency makes the picture features of the training set and the detection set closer, so that, on the basis of training on a large number of historical pictures, mature deep learning target detection technology can easily extract the picture features and identify the tassels, and the computer automatically obtains the number of tassels in each picture. Meanwhile, since only female parent rows exist in each picture, interference of the male parent with the tassel counting is completely avoided.
Furthermore, according to the number of tassels identified in the collected female parent row plant images, high-precision tassel identification and counting is carried out and the emasculation rate is calculated. The emasculation purity estimation method in the embodiment of the invention is based on the plant counting method and the high-precision corn tassel identification and counting method, and the corn emasculation purity estimate of the detected plant area can be obtained through the following formula:
the removal rate is the total number of targets in all the detected pictures/(total number of pictures average number of plants in each picture)
The removal rate can be calculated per plot and per day, so that its trend can be observed. The female parent row plant images are images of random areas (sampling points) in the planting area or images of preset areas (sampling points), and their number is not limited. Generally, the greater the number, the more accurate the obtained removal rate.
In an alternative embodiment, step 4.1), the following steps may be further included:
step 4.1.1), counting the number of plants in the sampling image;
alternatively,
step 4.1.2), identifying plants in the sampling image to obtain the number of the plants;
alternatively,
step 4.1.3), obtaining the number of plants in a preset number of sampling images, calculating the average number of plants per sampling image, and determining the number of plants in the sampling images according to the average number of plants and the total number of sampling images.
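Step 4.1.3) — scaling the average of a manually counted subset of images by the total number of sampled images — can be sketched as follows (the counts are illustrative only):

```python
def estimate_total_plants(sampled_counts, total_images):
    """Average plant count from a manually counted subset of sampling
    images, scaled by the total number of sampling images."""
    avg = sum(sampled_counts) / len(sampled_counts)
    return avg * total_images

# Three manually counted images average 30 plants; 80 images were sampled.
total_plants = estimate_total_plants([29, 31, 30], 80)  # 30 * 80 = 2400.0
```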
Here, the total number of plants in the female parent row plant images collected in the planting area can be obtained from the sampling images, which include only the female parent rows and include the female parent rows of all continuous planting rows.
According to the above embodiments, the sampled images have high consistency; the planting spacing of plants in the same field is basically fixed, and the number of plants in each image is basically consistent, so the average number of plants per image can be counted manually in advance from several images. The image-to-plant-number mapping typically varies with the field and plant type; for one field and plant type it generally only needs to be counted once, which avoids the high cost of manual per-image counting and the accuracy problems of machine counting. Alternatively, the number of plants can be obtained by automatic machine identification of the images, or each image can be counted manually and the counts summed to obtain the number of plants.
As an alternative example, several sampling images are randomly extracted for each plot, or for a set of plots of the same planting type and density; the number of plants contained in each image is counted manually, and the average number of plants per image C is calculated. The numbers of target objects contained in each image i, namely the number of tassels Ti and the number of buds Bi (i = 1, 2, …, N, where N is the total number of sampling images obtained for the plot), are then detected by the target object detection method.
The ratio of missed tassels in the plot is: Rt = Sum(Ti) / (N × C);
the ratio of missed buds is: Rb = Sum(Bi) / (N × C);
the target removal completeness is: D = 1 − Sum(Ti + Bi) / (N × C).
It should be noted that the detection of the target is not limited to one type, and a plurality of different targets can be detected simultaneously on the same land, so as to obtain a result of complete removal degree.
The detection method provided by the embodiment of the invention uses the preset deep learning model to process the sampling images and obtain the target object information of the sampling points, simplifying the target object statistics, thereby improving the statistical efficiency and avoiding the low efficiency and complicated process of manually counting the target object data of a planting area.
As another possible embodiment, as shown in fig. 6, an operator may implement emasculation detection through a user interface, where the user interface of the emasculation detection system includes a user login and management system, a picture uploading and management system, an AI tassel recognition system, a recognition result presentation and reporting system, a user feedback system, an area and billing system, an AI tassel recognition model, an emasculation detection database, and an emasculation detection picture file system, and may implement an operation of recognizing tassels and plants in a sample image, and further calculating a removal rate.
In some embodiments, after the removal rate is calculated, whether the target object exists in the plant growing area can be automatically detected in a sampling inspection mode, and whether the removal requirement is met is judged.
The embodiment of the invention realizes target object removal detection based on deep learning and aerial pictures, and aims to solve the following problems: (1) male parent identification, (2) identification precision, (3) target object proportion, (4) operating efficiency. Through the target object detection method, acquisition attitudes such as the aerial flying height of the aircraft, the shooting angle, the shooting direction and the light sensitivity are controlled so as to obtain standardized, highly consistent aerial sampling pictures, realize plant counting, reduce the identification difficulty of the sampling images, improve the detection precision and achieve an accurate and efficient target object removal detection effect.
As shown in fig. 8, an embodiment also provides an image capturing apparatus, including:
the image acquisition module 801 is used for acquiring image information of sampling points of regularly planted plant areas, wherein the plant areas comprise male parent row plants and female parent row plants;
an attitude determination module 802, configured to determine a target sampling attitude of the image acquisition device according to the image information, where the target sampling attitude includes at least one or more of the following: acquiring height and an acquisition angle, wherein the acquisition angle is the angle between a horizontal axis of an image of the image acquisition equipment and a plant planting row;
and the image determining module 803 is configured to obtain a sampling image corresponding to the sampling point based on the target sampling posture, where target removal detection is performed on a plant in the sampling image.
In an alternative embodiment, the sample image includes location information.
In an optional embodiment, the posture determining module is further configured to adjust the current height of the image capturing device until the image information includes all the female parent row plants and excludes the male parent row plants, and to determine the sampling height in the target sampling posture.
In an alternative embodiment, the target sampling poses include the same acquisition direction, which is the direction of the image acquisition device towards the ground, wherein the acquisition direction is vertically downward.
In an optional embodiment, the posture determining module is further configured to adjust a current collecting direction of the image collecting device until an image horizontal axis of the image collecting device forms a preset angle with the plant planting line in the image information, and determine a collecting angle in the target sampling posture.
In an optional embodiment, the image acquisition module is further configured to acquire sampling points and position coordinates of the sampling points in the regularly planted plant area; planning a flight path of the image acquisition equipment to the position coordinates according to the position coordinates of the sampling points; image information is obtained based on the flight path.
In an alternative embodiment, the image determination module is further configured to control the exposure level of the sampled image based on the brightness of the image information.
As shown in fig. 9, the embodiment further provides a target object detection apparatus 900, including:
a sampling image obtaining module 901, configured to obtain a sampling image of sampling points of a regularly planted plant area, where the plant area includes male parent row plants and female parent row plants, and the sampling image is collected from right above the plant and includes all the female parent row plants except the male parent row plants;
and a removal detection module 902, configured to identify a target object of the plant according to the sampling image to perform target object removal detection.
In an alternative embodiment, the horizontal axis of the sample image is at a predetermined angle to the plant rows.
In an optional embodiment, the plant area includes a plurality of sampling points, and the sampling images corresponding to each sampling point do not overlap with each other.
In an alternative embodiment, the sample image includes location information.
In alternative embodiments, the target comprises either or both of tassels and buds.
In an optional embodiment, the removal detection module is further configured to train a deep learning model according to the sampling image to obtain a target model; and identifying the target object in the sampling image according to the target object model.
In an alternative embodiment, the deep learning model includes any open source or self-developed neural network based on deep learning target detection.
In an optional implementation manner, the removal detection module is further configured to label the sampling image to form a training set labeled with the sampling image, so as to train a deep learning model, and further obtain a target object model; and identifying the target object in the unlabeled sampling image according to the target object model.
In an alternative embodiment, the removal detection module is further configured to determine the number of plants in the sampled image; counting the number of the target objects according to the identified target objects; and determining the removal degree of the target object according to the number of the plants and the number of the target object.
In an optional embodiment, the removal detection module is further configured to count the number of plants in the sampled image; or identifying plants in the sampling image to obtain the number of the plants; or acquiring the number of plants in the sampling images with the preset number, calculating the average number of plants in the sampling images, and determining the number of plants in the sampling images according to the average number of plants and the total number of the sampling images.
Fig. 10 is a schematic hardware architecture diagram of an electronic device 1000 according to an embodiment of the present invention. Referring to fig. 10, the electronic device includes: a machine-readable storage medium 1001 and a processor 1002, and may further include a nonvolatile storage medium 1003, a communication interface 1004, and a bus 1005; the machine-readable storage medium 1001, the processor 1002, the nonvolatile storage medium 1003, and the communication interface 1004 communicate with each other via the bus 1005. The processor 1002 may perform the method for detecting object removal described in the above embodiments by reading and executing the machine-executable instructions for object removal detection in the machine-readable storage medium 1001.
A machine-readable storage medium as referred to herein may be any electronic, magnetic, optical, or other physical storage device that can contain or store information such as executable instructions and data. For example, the machine-readable storage medium may be: RAM (Random Access Memory), volatile memory, non-volatile memory, flash memory, a storage drive (e.g., a hard drive), any type of storage disk (e.g., an optical disc or DVD), a similar storage medium, or a combination thereof.
The non-volatile medium may be non-volatile memory, flash memory, a storage drive (e.g., a hard drive), any type of storage disk (e.g., an optical disc or DVD), a similar non-volatile storage medium, or a combination thereof.
It can be understood that, for the specific operation of each functional module in this embodiment, reference may be made to the detailed description of the corresponding steps in the foregoing method embodiments, which is not repeated herein.
The computer-readable storage medium provided in the embodiments of the present invention stores a computer program; when executed, the computer program code may implement the target object removal detection method described in any of the above embodiments. For specific implementation, refer to the method embodiments, which are not described herein again.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In addition, in the description of the embodiments of the present invention, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "coupled" are to be construed broadly, e.g., as a fixed connection, a removable connection, or an integral connection; as a mechanical or electrical connection; as a direct connection or an indirect connection through an intermediate medium; or as an internal communication between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific circumstances.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplicity of description, but do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that the above-mentioned embodiments are only specific embodiments of the present invention, which are used to illustrate the technical solutions of the present invention rather than to limit them, and the protection scope of the present invention is not limited thereto. Although the present invention is described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person familiar with the art can still modify the technical solutions described in the foregoing embodiments, readily conceive of changes, or make equivalent substitutions for some of their technical features within the technical scope of the present disclosure. Such modifications, changes, or substitutions do not depart from the spirit and scope of the embodiments of the present invention, and shall be construed as falling within its protection scope.

Claims (19)

1. A method for detecting a target, comprising:
acquiring a sampling image of sampling points of a regularly planted plant area, wherein the plant area comprises male parent row plants and female parent row plants, and the sampling image is acquired from the position right above the plants and comprises all the female parent row plants except the male parent row plants;
and identifying the target object of the plant according to the sampling image so as to remove and detect the target object.
2. The method of claim 1, wherein a horizontal axis of the sample image is at a predetermined angle to the plant rows.
3. The method for detecting the target object according to claim 1, wherein the plant area comprises a plurality of sampling points, and the sampling images corresponding to each sampling point are not overlapped with each other.
4. The object detection method according to claim 1, wherein the sample image includes position information.
5. The method for detecting a target according to claim 1, wherein the target includes either or both of a tassel and a bud.
6. The method for detecting the target object according to claim 1, wherein identifying the target object of the plant according to the sampling image for target object removal detection comprises:
training a deep learning model according to the sampling image to obtain a target object model;
and identifying the target object in the sampling image according to the target object model.
7. The method of claim 6, wherein the deep learning model comprises any open-source or self-developed neural network for deep-learning-based target detection.
8. The method of claim 6, wherein the step of training a deep learning model based on the sampled image comprises: marking the sampling images to form a training set with marked sampling images so as to train a deep learning model and further obtain a target object model; the step of identifying the target in the sample image according to the target model comprises: and identifying the target object in the unlabeled sampling image according to the target object model.
9. The method of claim 6, wherein identifying the target object of the plant according to the sampling image for target object removal detection further comprises:
determining the number of plants of the sampling image;
counting the number of the target objects according to the identified target objects;
and determining the removal degree of the target object according to the number of the plants and the number of the target object.
10. The method of claim 9, wherein the step of determining the number of plants in the sampled image comprises:
counting the number of plants in the sampling image;
or,
identifying plants in the sampling image to obtain the number of the plants;
or,
the method comprises the steps of obtaining the number of plants in a preset number of sampling images, calculating the average number of the plants in the sampling images, and determining the number of the plants in the sampling images according to the average number of the plants and the total number of the sampling images.
11. An object detecting device, comprising:
the system comprises a sampling image acquisition module, a data acquisition module and a data processing module, wherein the sampling image acquisition module is used for acquiring a sampling image of sampling points of a regularly planted plant area, the plant area comprises male parent row plants and female parent row plants, the sampling image is acquired from the position right above the plants, and the sampling image comprises all the female parent row plants except the male parent row plants;
and the removal detection module is used for identifying the target object of the plant according to the sampling image so as to carry out target object removal detection.
12. An image acquisition method, comprising:
acquiring image information of sampling points of regularly planted plant areas, wherein the plant areas comprise male parent row plants and female parent row plants;
determining a target sampling posture of the image acquisition equipment according to the image information, wherein the target sampling posture at least comprises one or two of the following conditions: collecting height and collecting angle, wherein the collecting angle is the angle between the horizontal axis of the image collecting device and the planting row of the plants;
and acquiring a sampling image corresponding to the sampling point based on the target sampling posture.
13. The image acquisition method of claim 12, wherein the sample image includes location information.
14. The image capturing method according to claim 12, wherein the step of determining the target sampling posture of the image capturing device from the image information comprises:
adjusting the current height of the image acquisition equipment until the image information includes all the female parent row plants except the male parent row plants, and determining the sampling height in the target sampling posture.
15. The image capturing method according to claim 12, wherein the step of determining the target sampling posture of the image capturing device from the image information comprises:
and adjusting the current acquisition direction of the image acquisition equipment until an image horizontal axis of the image acquisition equipment forms a preset angle with the plant planting line in the image information, and determining the acquisition angle in the target sampling posture.
16. The image capturing method of claim 12, wherein the target sampling posture further comprises a capturing direction, the capturing direction being the direction of the image capturing device towards the ground, wherein the capturing direction is vertically downward.
17. The image capturing method as claimed in claim 12, wherein the step of obtaining image information of sampling points of the regularly planted plant area comprises:
acquiring sampling points in a regularly planted plant area and position coordinates of the sampling points;
planning a flight path from the image acquisition equipment to the position coordinates according to the position coordinates of the sampling points;
obtaining the image information based on the flight path.
18. The image acquisition method according to claim 12, characterized in that the method further comprises:
and controlling the exposure of the sampling image according to the brightness of the image information.
19. An image acquisition apparatus, comprising:
the system comprises an image acquisition module, a data processing module and a data processing module, wherein the image acquisition module is used for acquiring image information of sampling points of regularly planted plant areas, and the plant areas comprise male parent row plants and female parent row plants;
the posture determining module is used for determining a target sampling posture of the image acquisition equipment according to the image information, wherein the target sampling posture at least comprises one or two of the following: collecting height and collecting angle, wherein the collecting angle is the angle between the horizontal axis of the image collecting device and the planting row of the plants;
and the image determining module is used for acquiring the sampling image corresponding to the sampling point based on the target sampling posture.
CN202010302566.5A 2020-04-16 2020-04-16 Target detection method and device and image acquisition method and device Pending CN112507768A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202010302566.5A CN112507768A (en) 2020-04-16 2020-04-16 Target detection method and device and image acquisition method and device
BR112022020889A BR112022020889A2 (en) 2020-04-16 2020-10-30 METHODS OF TARGET OBJECT DETECTION AND IMAGE ACQUISITION, AND, ELECTRONIC DEVICE
PCT/CN2020/125251 WO2021208407A1 (en) 2020-04-16 2020-10-30 Target object detection method and apparatus, and image collection method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010302566.5A CN112507768A (en) 2020-04-16 2020-04-16 Target detection method and device and image acquisition method and device

Publications (1)

Publication Number Publication Date
CN112507768A true CN112507768A (en) 2021-03-16

Family

ID=74953230

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010302566.5A Pending CN112507768A (en) 2020-04-16 2020-04-16 Target detection method and device and image acquisition method and device

Country Status (3)

Country Link
CN (1) CN112507768A (en)
BR (1) BR112022020889A2 (en)
WO (1) WO2021208407A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113819921A (en) * 2021-10-27 2021-12-21 苏州极目机器人科技有限公司 Navigation method of execution terminal and electronic equipment
CN114402995A (en) * 2022-01-19 2022-04-29 北京市农林科学院智能装备技术研究中心 Air-ground cooperative corn emasculation method and system and air unmanned emasculation machine
CN114489113A (en) * 2021-12-15 2022-05-13 北京市农林科学院智能装备技术研究中心 Castration unmanned aerial vehicle control method and system
CN114766347A (en) * 2022-04-26 2022-07-22 新疆荣耀九天科技有限公司 Method for acquiring target three-dimensional position of corn tasseling operation
CN115511194A (en) * 2021-06-29 2022-12-23 布瑞克农业大数据科技集团有限公司 Agricultural data processing method, system, device and medium

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
CN114631478B (en) * 2022-03-25 2023-01-20 安徽科技学院 Corn breeding field selfing ear counting device and control method

Citations (5)

Publication number Priority date Publication date Assignee Title
CN107480706A (en) * 2017-07-24 2017-12-15 中国农业大学 A kind of seed production corn field remote sensing recognition method and device
CN109324051A (en) * 2018-11-08 2019-02-12 北方民族大学 A kind of plant moisture detection method and system
WO2019127395A1 (en) * 2017-12-29 2019-07-04 深圳市大疆创新科技有限公司 Image capturing and processing method and device for unmanned aerial vehicle
CN110852341A (en) * 2019-09-23 2020-02-28 平安科技(深圳)有限公司 Atractylodes macrocephala detection method based on deep learning and related equipment thereof
CN110992325A (en) * 2019-11-27 2020-04-10 同济大学 Target counting method, device and equipment based on deep learning

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
US10663445B2 (en) * 2018-05-06 2020-05-26 Beijing Normal University Method and system for identifying plant species based on hyperspectral data
CN209086157U (en) * 2018-11-08 2019-07-09 北方民族大学 A kind of plant moisture detection system
CN110414491B (en) * 2019-08-29 2024-06-14 新疆农业科学院经济作物研究所 Shooting method of cotton single plant morphology photo and auxiliary device thereof

Patent Citations (5)

Publication number Priority date Publication date Assignee Title
CN107480706A (en) * 2017-07-24 2017-12-15 中国农业大学 A kind of seed production corn field remote sensing recognition method and device
WO2019127395A1 (en) * 2017-12-29 2019-07-04 深圳市大疆创新科技有限公司 Image capturing and processing method and device for unmanned aerial vehicle
CN109324051A (en) * 2018-11-08 2019-02-12 北方民族大学 A kind of plant moisture detection method and system
CN110852341A (en) * 2019-09-23 2020-02-28 平安科技(深圳)有限公司 Atractylodes macrocephala detection method based on deep learning and related equipment thereof
CN110992325A (en) * 2019-11-27 2020-04-10 同济大学 Target counting method, device and equipment based on deep learning

Cited By (6)

Publication number Priority date Publication date Assignee Title
CN115511194A (en) * 2021-06-29 2022-12-23 布瑞克农业大数据科技集团有限公司 Agricultural data processing method, system, device and medium
CN113819921A (en) * 2021-10-27 2021-12-21 苏州极目机器人科技有限公司 Navigation method of execution terminal and electronic equipment
CN114489113A (en) * 2021-12-15 2022-05-13 北京市农林科学院智能装备技术研究中心 Castration unmanned aerial vehicle control method and system
CN114489113B (en) * 2021-12-15 2024-02-23 北京市农林科学院智能装备技术研究中心 Emasculation unmanned aerial vehicle control method and system
CN114402995A (en) * 2022-01-19 2022-04-29 北京市农林科学院智能装备技术研究中心 Air-ground cooperative corn emasculation method and system and air unmanned emasculation machine
CN114766347A (en) * 2022-04-26 2022-07-22 新疆荣耀九天科技有限公司 Method for acquiring target three-dimensional position of corn tasseling operation

Also Published As

Publication number Publication date
WO2021208407A1 (en) 2021-10-21
BR112022020889A2 (en) 2022-11-29

Similar Documents

Publication Publication Date Title
CN112507768A (en) Target detection method and device and image acquisition method and device
CN106971167B (en) Crop growth analysis method and system based on unmanned aerial vehicle platform
EP3467702A1 (en) Method and system for performing data analysis for plant phenotyping
KR101974638B1 (en) Apparatus for processing plant images and method thereof
CN110969654A (en) Corn high-throughput phenotype measurement method and device based on harvester and harvester
CN112580671A (en) Automatic detection method and system for multiple development stages of rice ears based on deep learning
JP2016154510A (en) Information processor, growth state determination method, and program
CN104732564A (en) Maize leaf area lossless dynamic monitoring device and method
CN111462058A (en) Method for quickly detecting effective ears of rice
CN109598215A (en) A kind of orchard Modeling Analysis System and method based on unmanned plane positioning shooting
CN115527130A (en) Grassland pest mouse density investigation method and intelligent evaluation system
CN110689022B (en) Method for extracting images of crops of each plant based on blade matching
Lyu et al. Development of phenotyping system using low altitude UAV imagery and deep learning
CN115061168A (en) Mobile inspection type crop growth monitoring system and method
Gao et al. Maize seedling information extraction from UAV images based on semi-automatic sample generation and Mask R-CNN model
CN117197595A (en) Fruit tree growth period identification method, device and management platform based on edge calculation
CN113724250A (en) Animal target counting method based on double-optical camera
CN112381028A (en) Target feature detection method and device
CN117115811A (en) High-precision determining method for potato crop ridge line independent of unmanned aerial vehicle
CN113807128A (en) Seedling shortage marking method and device, computer equipment and storage medium
CN115641500A (en) Intelligent detection and identification method for pitaya in target picking row of close-planting orchard
CN114663652A (en) Image processing method, image processing apparatus, management system, electronic device, and storage medium
US20220358641A1 (en) Information processing device and index value calculation method
CN114651283A (en) Seedling emergence by search function
CN116823918B (en) Crop seedling number measuring method, device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination