CN112931309B - Method and system for monitoring fish proliferation and releasing direction - Google Patents
- Publication number
- CN112931309B CN112931309B CN202110144245.1A CN202110144245A CN112931309B CN 112931309 B CN112931309 B CN 112931309B CN 202110144245 A CN202110144245 A CN 202110144245A CN 112931309 B CN112931309 B CN 112931309B
- Authority
- CN
- China
- Legal status: Active (the status is an assumption, not a legal conclusion; Google has not performed a legal analysis)
Classifications
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01K—ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
- A01K61/00—Culture of aquatic animals
- A01K61/10—Culture of aquatic animals of fish
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01K—ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
- A01K29/00—Other apparatus for animal husbandry
- A01K29/005—Monitoring or measuring activity, e.g. detecting heat or mating
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A40/00—Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
- Y02A40/80—Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in fisheries management
- Y02A40/81—Aquaculture, e.g. of fish
Abstract
The invention relates to a method and a system for monitoring the direction of fish movement after proliferation and release, wherein the method comprises the following steps: acquiring video images upstream and downstream of a proliferation and release point; enhancing the upstream and downstream video images with a multi-scale retinal image enhancement algorithm with color protection; and identifying the types and numbers of fish in the enhanced upstream and downstream video images with a deep learning algorithm, obtaining the types and numbers of upstream fish and of downstream fish at the proliferation and release point. The invention does not damage the fish body and reduces monitoring cost.
Description
Technical Field
The invention relates to the technical field of fish monitoring, and in particular to a method and a system for monitoring the direction of fish proliferation and release.
Background
Damming rivers for development affects the hydrological regime, hydraulic characteristics, water temperature, water quality, fish migration and the like. Under the environmental protection requirements for water conservancy and hydropower construction projects, measures must be established to mitigate the impact of damming, implement ecological compensation and restore the health of the river ecosystem. Fish are a relatively sensitive indicator species of a river ecosystem, and changes in fish populations directly reflect its condition.
At present, China applies many water, atmosphere, noise and ecological environment protection measures during the construction and operation of hydropower projects. Artificial fish proliferation and release is an important means of mitigating the impact of hydropower development on aquatic ecology in a river basin, protecting rare or endangered fish populations, and supplementing economic fish resources. Fish proliferation and release can be classified into three types. (1) Ecological restoration: its preconditions are that suitable habitat is available, environmental adaptability and carrying capacity are adequate, and natural reproduction is limited. Its main function is to repair and compensate resource losses caused by environmental changes due to human activity; indigenous species and closely related species are selected as release objects. (2) Resource enhancement: its preconditions are that natural reproduction is limited, habitat resources are below carrying capacity, and the coexistence of wild and hatchery fish causes no negative effects. Its main purpose is to offset resource losses caused by over-fishing; indigenous species are selected as release objects. (3) Ecological restructuring: its preconditions are that the species can adapt to the new environment and that existing resources are below the ecological capacity of the environment. Its main purpose is to maintain a high yield from the water area and make full use of its resources; non-native species are generally selected as release objects.
Proliferation and release now has more than 100 years of practical experience in over 100 countries, and release projects have brought people great benefits as well as new problems. Many projects fail to reach the targets set in their release plans; some remain ineffective even after decades of implementation, consuming manpower and funds without relieving the resource crisis, and in some cases producing negative effects such as a continued decline of valuable resources and biological invasion. The main reason for this situation is the lack of effective evaluation and monitoring of the release effect. Most proliferation and release work in China stops at the release stage, with no follow-up evaluation of effectiveness, so research on evaluation systems for release effects has received little attention and such a system is lacking. Available research data on release-effect evaluation systems are very limited; the few existing publications are general, lack pertinence, and offer little guidance for evaluating the effect of a specific release project.
Proliferation and release is divided into several categories according to its nature and purpose. Releases aimed at obtaining fishery products must inevitably be evaluated mainly on their economic benefits, with ecological benefits considered as well; releases targeting rare and endangered fish must focus on the survival and reproduction of the released species rather than on economic benefits; and for compensatory releases aimed at population recovery, ecological benefits should come first, with economic and social benefits taken into account at the same time. At present there is no release-effect evaluation system designed from the nature of the release itself as a starting point; the evaluation indexes of the systems that have been proposed come with no specific operational guidance and are difficult to apply to concrete release projects.
The relatively mature fish-release marking techniques in China can be divided into traditional and modern marking methods. The former mainly uses hanging tags; the latter mainly uses fluorescent marking, microsatellite marking and the like. Traditional hanging tags often damage the fish body and reduce post-release survival, while fluorescent marking avoids such damage and is suitable for marking small juvenile fish. Survey results show that although a few release programs are developing other marking methods, most operating fish proliferation stations still use traditional tag marking. The main reasons are that fishery marketization in many basins is low, most released fish reach the market in scattered fashion, the recapture rate of fluorescent markers is low, and other methods such as molecular marking require a great deal of testing work at high cost.
In general, traditional marking damages the fish body and reduces post-release survival, fluorescent marking has a low recapture rate, and molecular marking requires a large workload at high cost.
Disclosure of Invention
Based on this, the invention aims to provide a method and a system for monitoring the direction of fish proliferation and release that do not damage the fish body and reduce monitoring cost.
In order to achieve the purpose, the invention provides the following scheme:
a method of monitoring the direction of fish proliferation and release, the method comprising:
acquiring video images of the upstream and downstream of a proliferation release point;
enhancing the upstream and downstream video images by adopting a multi-scale retina image enhancement algorithm with color protection;
and identifying the types and the number of the fishes in the enhanced upstream and downstream video images by adopting a deep learning algorithm, and obtaining the types and the number of the upstream fishes and the types and the number of the downstream fishes at the proliferation and releasing points.
Optionally, the identifying, by using a deep learning algorithm, the types and numbers of fish in the enhanced upstream and downstream video images to obtain the types and numbers of upstream fish and of downstream fish at the proliferation and release point specifically includes:
acquiring a fish image dataset;
marking the types of fish in the fish image data set and counting the number of fish of each type, to obtain the fish types and corresponding numbers in the fish image data set;
taking the fish images in the fish image data set as input, and taking the types and the corresponding number of the fish as output to train a neural network model; the neural network model is internally provided with the deep learning algorithm;
and identifying the types and numbers of fish in the enhanced upstream and downstream video images with the trained neural network model, to obtain the types and numbers of upstream fish and of downstream fish at the proliferation and release point.
Optionally, the marking of fish types and counting of the number of fish of each type in the fish image data set specifically includes:
annotating the target fish in the fish image data set with labelimg software, to obtain the types and corresponding numbers of the target fish.
Optionally, the neural network model is the YOLOv4 model.
Optionally, the acquiring video images upstream and downstream of the proliferation and release point specifically includes:
and adopting an underwater camera to collect video images of the upstream and the downstream of the proliferation and release point in real time.
The invention also discloses a monitoring system for fish proliferation and releasing direction, which comprises:
the video image acquisition module is used for acquiring video images at the upstream and the downstream of the proliferation and release point;
the image enhancement module is used for enhancing the video images at the upstream and the downstream by adopting a multi-scale retina image enhancement algorithm with color protection;
and the identification module is used for identifying the types and numbers of fish in the enhanced upstream and downstream video images with a deep learning algorithm, to obtain the types and numbers of upstream fish and of downstream fish at the proliferation and release point.
Optionally, the identification module specifically includes:
a data set acquisition unit for acquiring a fish image data set;
a counting unit, configured to mark the fish types and count the number of fish of each type in the fish image data set, to obtain the fish types and corresponding numbers in the fish image data set;
the model training unit is used for training a neural network model by taking the fish images in the fish image data set as input and taking the types and the corresponding number of the fishes as output; the neural network model is internally provided with the deep learning algorithm;
and the identification unit is used for identifying the types and numbers of fish in the enhanced upstream and downstream video images with the trained neural network model, to obtain the types and numbers of upstream fish and of downstream fish at the proliferation and release point.
Optionally, the counting unit specifically includes:
a counting subunit, configured to annotate the target fish in the fish image data set with labelimg software, obtaining the types and corresponding numbers of the target fish.
Optionally, the neural network model is the YOLOv4 model.
Optionally, the video image obtaining module specifically includes:
and the video image acquisition unit is used for acquiring video images of the upstream and the downstream of the proliferation and release point in real time by adopting an underwater camera.
According to the specific embodiment provided by the invention, the invention discloses the following technical effects:
the invention discloses a method and a system for monitoring fish proliferation and release directions.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the embodiments are briefly described below. Obviously, the drawings described below are only some embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic flow chart of a method for monitoring fish proliferation and releasing direction according to the present invention;
FIG. 2 is a schematic structural diagram of the YOLOv4 model of the present invention;
FIG. 3 is a detailed structural diagram of the YOLOv4 model of the invention;
FIG. 4 is a schematic diagram of a monitoring system for fish proliferation and releasing direction.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention aims to provide a method and a system for monitoring fish proliferation and releasing directions, which do not damage fish bodies and reduce monitoring cost.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
Fig. 1 is a schematic flow chart of the method for monitoring the direction of fish proliferation and release according to the present invention. As shown in Fig. 1, the method comprises:
step 101: video images are acquired upstream and downstream of the proliferation release point.
The acquiring of the video images upstream and downstream of the proliferation and releasing point specifically comprises:
and a high-definition underwater camera is adopted to collect video images of the upstream and the downstream of the proliferation and releasing point in real time. The video images of the upstream and the downstream collected by the high-definition underwater camera in Real Time are transmitted in Real Time in a Real Time Streaming Protocol (RTSP) mode.
Step 102: and enhancing the video images at the upstream and the downstream by adopting a multi-scale retina image enhancement algorithm with color protection.
The fundamental theory of Retinex (from retina and cortex) is that the color of an object is determined by its reflectance for long-wave (red), medium-wave (green) and short-wave (blue) light rather than by the absolute intensity of the reflected light; object color is therefore unaffected by non-uniform illumination, a property known as color constancy, on which Retinex is based. Unlike traditional linear and nonlinear methods that can enhance only one characteristic of an image, Retinex balances dynamic-range compression, edge enhancement and color constancy, so it can adaptively enhance many different types of images. Over more than 40 years, imitating the human visual system, researchers have developed the Retinex family from the single-scale Retinex algorithm, through the multi-scale Retinex (MSR) and the weighted multi-scale MSR, to MSRCR (multi-scale Retinex with color restoration) and MSRCP (multi-scale Retinex with chromaticity preservation), the color-protection variant used in this method.
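As an illustration of the Retinex family described above, the following is a minimal numpy sketch of the single-scale and multi-scale Retinex steps; the scales, the edge padding and the final linear stretch are conventional illustrative choices, not values taken from the patent:

```python
import numpy as np

def gaussian_kernel(sigma):
    """1-D Gaussian kernel, normalised to sum to 1."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    return k / k.sum()

def gaussian_blur(img, sigma):
    """Separable Gaussian blur of a 2-D array, edge-padded to keep the shape."""
    k = gaussian_kernel(sigma)
    r = len(k) // 2
    padded = np.pad(img, r, mode="edge")
    rows = np.apply_along_axis(lambda m: np.convolve(m, k, mode="valid"), 1, padded)
    return np.apply_along_axis(lambda m: np.convolve(m, k, mode="valid"), 0, rows)

def single_scale_retinex(channel, sigma):
    """log(I) - log(G_sigma * I): removes slowly varying illumination."""
    ch = channel.astype(float) + 1.0  # offset avoids log(0)
    return np.log(ch) - np.log(gaussian_blur(ch, sigma))

def multi_scale_retinex(channel, sigmas=(15, 80, 250)):
    """Equal-weight average over several scales (the MSR step)."""
    return np.mean([single_scale_retinex(channel, s) for s in sigmas], axis=0)

def stretch_to_8bit(x):
    """Linear stretch of the Retinex output back to the displayable 0-255 range."""
    lo, hi = x.min(), x.max()
    return (x - lo) / (hi - lo) * 255.0
```

In MSRCP, MSR is applied to the intensity channel only and the R, G and B channels are then rescaled proportionally, which is what preserves ("protects") the original chromaticity.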
Step 103: identifying the types and numbers of fish in the enhanced upstream and downstream video images with a deep learning algorithm, obtaining the types and corresponding numbers of each kind of fish upstream and downstream of the proliferation and release point.
The direction of fish movement after proliferation and release is then monitored from the types and numbers of upstream fish and of downstream fish at the release point.
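The patent does not specify how the upstream and downstream counts are combined into a direction; a minimal sketch, assuming a simple per-species comparison of the totals from the two cameras (function and field names are illustrative):

```python
from collections import Counter

def release_direction_summary(upstream_counts, downstream_counts):
    """Compare per-species fish counts upstream vs downstream of the release point.

    Inputs map species name -> number of fish detected; the result gives, per
    species, both counts and the inferred dominant movement direction.
    """
    up, down = Counter(upstream_counts), Counter(downstream_counts)
    summary = {}
    for species in sorted(set(up) | set(down)):
        u, d = up[species], down[species]  # Counter returns 0 for missing species
        if u > d:
            direction = "upstream"
        elif d > u:
            direction = "downstream"
        else:
            direction = "balanced"
        summary[species] = {"upstream": u, "downstream": d, "direction": direction}
    return summary
```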
Identifying the types and numbers of fish in the enhanced upstream and downstream video images with a deep learning algorithm to obtain the types and numbers of upstream fish and of downstream fish at the proliferation and release point specifically comprises the following steps:
acquiring a fish image dataset;
marking the fish types and counting the number of each type, to obtain the fish types and corresponding numbers in the fish image data set;
taking the fish images in the fish image data set as input, and taking the types and the corresponding number of the fish as output to train a neural network model; the neural network model is internally provided with the deep learning algorithm;
and identifying the types and numbers of fish in the enhanced upstream and downstream video images with the trained neural network model, to obtain the types and numbers of upstream fish and of downstream fish at the proliferation and release point.
The marking of fish types and counting of the number of fish of each type in the fish image data set specifically include:
annotating the target fish in the fish image data set with labelimg software, to obtain the types and corresponding numbers of the target fish.
The neural network model is a Yolov4 model.
The following describes the method for monitoring the direction of fish proliferation and release according to the present invention with reference to a specific example.
First, a monitoring system is installed upstream and downstream of the proliferation and release point for real-time observation. Video acquisition is performed by a high-definition underwater camera. Because the river water in the collection area is turbid, the camera transmits the acquired video to a real-time image-enhancement terminal as an RTSP video stream, and an image enhancement algorithm built into the terminal enhances the incoming images so that clear underwater fish video can be obtained.
Then the enhanced video image is input into an intelligent fish identification terminal with a built-in deep-learning fish image recognition algorithm, which identifies the fish types in the input image.
The method for monitoring the direction of fish proliferation and release according to the invention mainly comprises the following steps:
step 1: and installing high-definition underwater cameras at the upstream and downstream of the proliferation releasing point to take pictures.
Step 2: and transmitting the video image acquired by the high-definition underwater camera to an image enhancement terminal in real time in an RTSP video streaming mode.
Gray-level adjustment can highlight targets or gray-level intervals of interest in the image and suppress uninteresting intervals without changing the spatial relationships within the image; adjusting gray values better separates fish from turbid water, so gray-level transformation is used for image enhancement.
Step 3: and (4) inputting the high-definition fish image obtained in Step2 into a fish intelligent identification terminal, identifying the fish species through a trained built-in deep learning algorithm, and counting.
YOLOv4 is an efficient and powerful object detection model that adopts many of the strongest optimization strategies from recent CNN (convolutional neural network) research, with improvements in data processing, backbone network, network training, activation functions and loss functions. The structure of the YOLOv4 model is shown in Figs. 2-3, and the model is trained mainly as follows:
the first step is as follows: and labeling the proliferation and releasing point target fish image set collected in advance through labelimg software to form a labeled data set.
Second step: modify the configuration file, which includes the training path and the training classes; the main purpose of modifying it is to initialize the basic model and set up the environment.
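In Darknet-based YOLOv4 builds, this configuration typically includes an `obj.names`/`obj.data` pair; a sketch generating them (the file layout follows common YOLOv4 releases, and all paths and names are illustrative):

```python
from pathlib import Path

def write_darknet_data_files(out_dir, class_names, train_list="train.txt",
                             valid_list="valid.txt", backup="backup/"):
    """Generate the obj.names / obj.data files a Darknet-based YOLOv4 trainer reads.

    obj.names lists one class per line; obj.data points the trainer at the
    class count, the image-list files and the weight-backup directory.
    """
    out = Path(out_dir)
    (out / "obj.names").write_text("\n".join(class_names) + "\n")
    data_file = out / "obj.data"
    data_file.write_text(
        f"classes = {len(class_names)}\n"
        f"train = {train_list}\n"
        f"valid = {valid_list}\n"
        f"names = {out / 'obj.names'}\n"
        f"backup = {backup}\n"
    )
    return data_file
```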
The third step: inputting the labeled data set obtained in the first step into a Yolov4 model of the modified configuration file in the second step for training to obtain a trained neural network model, wherein the overall training process of Yolov4 is approximately as follows: inputting a training set, performing feature extraction through a backbone extraction network CSPDarknet53, an enhanced feature extraction network SPP (spatial Pyramid) and PANET (path Aggregation network) structure to obtain three feature layers with fixed sizes of 13 x 1024, 26 x 512 and 52 x 256, finally comparing the obtained feature layers with real labeled data through a YOLO head, and finally converting the feature layers into a prediction result, wherein the training is terminated when the average loss fluctuates in a fixed amplitude or is not reduced any more after multiple iterations.
According to the method of the invention, the direction of fish movement after proliferation and release is monitored through video acquisition and a deep-learning fish identification algorithm. The method requires no tagging of the fish, so fish bodies are not damaged; at the same time the whole system can run automatically, saving a large amount of labor cost. In addition, by acquiring video images upstream and downstream of the release point in real time, enhancing them in real time, and obtaining the types and numbers of upstream fish and of downstream fish in real time through the trained neural network model, the timeliness of monitoring is improved.
Fig. 4 is a schematic structural diagram of the monitoring system for the direction of fish proliferation and release according to the present invention. As shown in Fig. 4, the system includes:
a video image obtaining module 201, configured to obtain video images upstream and downstream of the proliferation release point.
The video image obtaining module 201 specifically includes:
and the video image acquisition unit is used for acquiring video images of the upstream and the downstream of the proliferation and release point in real time by adopting an underwater camera.
An image enhancement module 202 for enhancing the video images upstream and downstream using a multi-scale retinal image enhancement algorithm with color protection.
The video images of the upstream and downstream acquired by the high-definition underwater camera in Real Time are transmitted to the image enhancement module 202 in Real Time in a RTSP (Real Time Streaming Protocol) video stream manner.
And the identification module 203 is used for identifying the types and numbers of fish in the enhanced upstream and downstream video images with a deep learning algorithm, to obtain the types and numbers of upstream fish and of downstream fish at the proliferation and release point.
The identification module 203 specifically includes:
a data set acquisition unit for acquiring a fish image data set;
a counting unit, configured to mark fish types and count the number of fish of each type for the fish image data set, to obtain the fish types and the corresponding number of fish in the fish image data set;
the model training unit is used for training a neural network model by taking the fish images in the fish image data set as input and taking the types and the corresponding number of the fishes as output; the neural network model is internally provided with the deep learning algorithm;
and the identification unit is used for identifying the types and numbers of fish in the enhanced upstream and downstream video images with the trained neural network model, to obtain the types and numbers of upstream fish and of downstream fish at the proliferation and release point.
The counting unit specifically comprises:
and the counting subunit is used for marking the target fish in the fish image data set through labelImg software to obtain the types and the corresponding numbers of the target fish.
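labelImg saves one Pascal VOC XML file per image, with each bounding box recorded as an `<object>` element holding a `<name>` label, so the per-species annotation counts can be recovered by walking those files. A minimal sketch using only the standard library (the species names are illustrative):

```python
import xml.etree.ElementTree as ET
from collections import Counter

def count_labels(voc_xml_strings):
    """Count object labels across labelImg Pascal VOC annotation documents.
    `voc_xml_strings` is an iterable of XML texts, one per annotated image."""
    counts = Counter()
    for xml_text in voc_xml_strings:
        root = ET.fromstring(xml_text)
        for obj in root.iter("object"):       # one <object> per drawn box
            counts[obj.findtext("name")] += 1
    return dict(counts)
```

For files on disk, the same function applies after reading each `.xml` produced by labelImg alongside its image.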
The neural network model is a YOLOv4 model.
The embodiments in this description are described in a progressive manner; each embodiment focuses on its differences from the others, and the same or similar parts among the embodiments can be referred to one another. Since the system disclosed in the embodiments corresponds to the disclosed method, its description is relatively brief, and the relevant points can be found in the description of the method.
The principles and embodiments of the present invention have been described herein using specific examples, which are provided only to help in understanding the method and the core concept of the invention. Meanwhile, a person skilled in the art may, following the idea of the present invention, make changes to the specific embodiments and the application range. In view of the above, the content of this description should not be construed as limiting the invention.
Claims (8)
1. A method of monitoring fish proliferation and release directions, the method comprising:
acquiring video images of the upstream and downstream of a proliferation release point;
enhancing the upstream and downstream video images by adopting a multi-scale retinex image enhancement algorithm with color restoration;
identifying the types and the number of fish in the enhanced upstream and downstream video images by adopting a deep learning algorithm to obtain the types and the number of upstream fish and the types and the number of downstream fish at the proliferation and release point;
the acquiring of the video images upstream and downstream of the proliferation and releasing point specifically comprises:
adopting an underwater camera to acquire video images upstream and downstream of the proliferation and release point in real time;
and transmitting the video images acquired by the underwater camera to an image enhancement terminal in real time by means of an RTSP video stream.
2. The method for monitoring fish proliferation and releasing direction according to claim 1, wherein the identifying the types and the number of fish in the video images of the upstream and the downstream after the enhancing by using the deep learning algorithm to obtain the types and the number of fish upstream and the types and the number of fish downstream of the proliferation and releasing point comprises:
acquiring a fish image dataset;
marking fish types in the fish image data set and counting the number of fish of each type to obtain the fish types and the corresponding numbers in the fish image data set;
taking the fish images in the fish image data set as input and the fish types and the corresponding numbers as output to train a neural network model, wherein the deep learning algorithm is built into the neural network model;
and identifying the types and the number of fish in the enhanced upstream and downstream video images by adopting the trained neural network model to obtain the types and the number of upstream fish and the types and the number of downstream fish at the proliferation and release point.
3. The method for monitoring fish proliferation and releasing direction according to claim 2, wherein the marking of fish types and the counting of the number of fish of each type for the fish image data set specifically comprise:
and marking the target fish in the fish image data set through labelImg software to obtain the types and the corresponding numbers of the target fish.
4. The method of claim 2, wherein the neural network model is a YOLOv4 model.
5. A system for monitoring fish proliferation and releasing direction, the system comprising:
the video image acquisition module is used for acquiring video images at the upstream and the downstream of the proliferation and release point;
the image enhancement module is used for enhancing the upstream and downstream video images by adopting a multi-scale retinex image enhancement algorithm with color restoration;
and the identification module is used for identifying the types and the number of fish in the enhanced upstream and downstream video images by adopting a deep learning algorithm to obtain the types and the number of upstream fish and the types and the number of downstream fish at the proliferation and release point.
6. The system for monitoring fish proliferation and releasing direction according to claim 5, wherein the identification module specifically comprises:
a data set acquisition unit for acquiring a fish image data set;
a counting unit, configured to mark fish types and count the number of fish of each type for the fish image data set, to obtain the fish types and the corresponding number of fish in the fish image data set;
the model training unit is used for training a neural network model by taking the fish images in the fish image data set as input and the fish types and the corresponding numbers as output, wherein the deep learning algorithm is built into the neural network model;
the identification unit is used for identifying the types and the number of fish in the enhanced upstream and downstream video images by adopting the trained neural network model to obtain the types and the number of upstream fish and the types and the number of downstream fish at the proliferation and release point;
the video image acquisition module specifically comprises:
and the video image acquisition unit is used for acquiring video images upstream and downstream of the proliferation and release point in real time by adopting an underwater camera, and transmitting the video images acquired by the underwater camera to the image enhancement terminal in real time by means of an RTSP video stream.
7. The system for monitoring fish proliferation and releasing direction according to claim 6, wherein the counting unit comprises:
and the counting subunit is used for marking the target fish in the fish image data set through labelImg software to obtain the types and the corresponding numbers of the target fish.
8. The system of claim 6, wherein the neural network model is a YOLOv4 model.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110144245.1A CN112931309B (en) | 2021-02-02 | 2021-02-02 | Method and system for monitoring fish proliferation and releasing direction |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112931309A CN112931309A (en) | 2021-06-11 |
CN112931309B true CN112931309B (en) | 2021-11-09 |
Family
ID=76241724
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110144245.1A Active CN112931309B (en) | 2021-02-02 | 2021-02-02 | Method and system for monitoring fish proliferation and releasing direction |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112931309B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114615252B (en) * | 2022-03-25 | 2023-05-16 | 广东海洋大学 | Online monitoring system for fish proliferation and releasing |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102006462A (en) * | 2010-11-27 | 2011-04-06 | 南京理工大学 | Rapid monitoring video enhancement method by using motion information and implementation device thereof |
CN108614896A (en) * | 2018-05-10 | 2018-10-02 | 济南浪潮高新科技投资发展有限公司 | Bank hall customer movement trajectory description system and method based on deep learning
CN109034090A (en) * | 2018-08-07 | 2018-12-18 | 南通大学 | A kind of emotion recognition system and method based on limb action |
CN109856138A (en) * | 2018-12-18 | 2019-06-07 | 杭州电子科技大学 | Deep sea net cage healthy fish identifying system and method based on deep learning |
CN111406693A (en) * | 2020-04-23 | 2020-07-14 | 上海海洋大学 | Marine ranch fishery resource maintenance effect evaluation method based on bionic sea eels |
CN112070799A (en) * | 2020-05-29 | 2020-12-11 | 清华大学 | Fish trajectory tracking method and system based on artificial neural network |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109829443A (en) | Video behavior recognition method based on image enhancement and 3D convolutional neural networks | |
CN113486865B (en) | Power transmission line suspended foreign object target detection method based on deep learning | |
CN106991666B (en) | Disease image recognition method suitable for multi-size image information | |
CN114004866B (en) | Mosquito recognition system and method based on image similarity difference | |
CN107038416A (en) | Pedestrian detection method based on binary-image modified HOG features | |
CN111127360B (en) | Gray image transfer learning method based on automatic encoder | |
CN104700405B (en) | A kind of foreground detection method and system | |
CN110287902B (en) | Livestock and poultry survival detection method, device, equipment and computer program product | |
Gehlot et al. | Analysis of different CNN architectures for tomato leaf disease classification | |
CN109408985A (en) | The accurate recognition methods in bridge steel structure crack based on computer vision | |
CN112931309B (en) | Method and system for monitoring fish proliferation and releasing direction | |
CN110020658A (en) | Salient target detection method based on multi-task deep learning | |
CN112183448B (en) | Method for dividing pod-removed soybean image based on three-level classification and multi-scale FCN | |
CN107464260A (en) | Rice canopy image processing method using an unmanned aerial vehicle | |
CN111950812A (en) | Method and device for automatically identifying and predicting rainfall | |
CN111767826A (en) | Timing fixed-point scene abnormity detection method | |
CN112906510A (en) | Fishery resource statistical method and system | |
CN114898405A (en) | Portable broiler chicken abnormity monitoring system based on edge calculation | |
Wang et al. | SLMS-SSD: Improving the balance of semantic and spatial information in object detection | |
CN103700118A (en) | Moving target detection method based on pulse coupled neural network | |
CN108764026A (en) | Video behavior detection method based on time-series rule unit pre-screening | |
CN108197655A (en) | Road surface damage image classification method based on principal component analysis and neural network | |
CN117115688A (en) | Dead fish identification and counting system and method based on deep learning under low-brightness environment | |
CN112200008A (en) | Face attribute recognition method in community monitoring scene | |
CN115152671B (en) | Hydraulic engineering regulation and control system and method for improving habitat of rare fish population |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||