CN117671242A - Dense target detection method, device, equipment and medium of self-adaptive density - Google Patents

Dense target detection method, device, equipment and medium of self-adaptive density

Info

Publication number
CN117671242A
CN117671242A (application number CN202311666779.6A)
Authority
CN
China
Prior art keywords
target
dense
detection
preset
overlapping
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311666779.6A
Other languages
Chinese (zh)
Inventor
王洋洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Agricultural Bank of China
Original Assignee
Agricultural Bank of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Agricultural Bank of China filed Critical Agricultural Bank of China
Priority to CN202311666779.6A priority Critical patent/CN117671242A/en
Publication of CN117671242A publication Critical patent/CN117671242A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V10/806 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of extracted features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The embodiment of the invention discloses a dense target detection method, device, equipment and medium of self-adaptive density. The method comprises the following steps: performing target detection on an input image by adopting a preset target detection model, and determining whether a dense target exists among the detection targets according to a preset target density formula and a preset density degree threshold; performing a self-adaptive overlap blocking operation on an input image in which dense targets exist to obtain at least two overlapping blocks; and detecting the at least two overlapping blocks to determine two overlapping blocks belonging to the same dense target, and fusing the two overlapping blocks belonging to the same dense target to determine the detection result of the dense target. With this scheme, dense targets are determined according to a preset target density formula and a preset density degree threshold; the image containing dense targets is divided into overlapping blocks, and the overlapping blocks are fused to obtain the dense target detection result; the detection accuracy for dense targets is improved, and the missed-detection rate and false-detection rate are reduced.

Description

Dense target detection method, device, equipment and medium of self-adaptive density
Technical Field
The embodiment of the invention relates to the technical field of target detection, in particular to a dense target detection method, device, equipment and medium with self-adaptive density.
Background
In recent years, deep learning has become the mainstream approach in the field of target detection. Target detection methods based on deep learning use convolutional neural networks to learn features automatically, which effectively improves both detection precision and speed. Such methods can be roughly divided into two types: one is based on region proposals (Region Proposal); the other is the end-to-end (End-to-End) method.
Detection methods based on region proposals have high detection precision, whereas the end-to-end method trades some detection precision for speed and can meet the requirement of real-time detection. Although great progress has been made in target detection, most detection methods are aimed at sparse targets, and detection methods aimed at dense targets are fewer. When an existing general object detection model is applied directly to images containing dense targets, the detection effect is poor, so detection of dense targets remains a challenging task.
Disclosure of Invention
The embodiment of the invention provides a dense target detection method, device, electronic equipment and storage medium with self-adaptive density, which are used for improving the accuracy of dense target detection and reducing the false detection rate and omission rate of the dense target.
In a first aspect, an embodiment of the present invention provides a method for detecting a dense target with adaptive density, including:
performing target detection on an input image by adopting a preset target detection model, and determining whether a dense target exists in the detection target according to a preset target dense formula and a preset density degree threshold;
performing self-adaptive overlapping blocking operation on an input image with a dense target to obtain at least two overlapping blocks;
and detecting the at least two overlapped blocks to determine two overlapped blocks belonging to the same dense target, and fusing the two overlapped blocks belonging to the same dense target to determine the detection result of the dense target.
In a second aspect, an embodiment of the present invention further provides a dense target detection apparatus with adaptive density, including:
the dense target determining module is used for carrying out target detection on the input image by adopting a preset target detection model, and determining whether dense targets exist in the detection targets according to a preset target dense formula and a preset density degree threshold value;
the self-adaptive overlapped block dividing module is used for carrying out self-adaptive overlapped block dividing operation on the input image with the dense target to obtain at least two overlapped blocks;
and the dense target fusion module is used for detecting the at least two overlapped blocks to determine two overlapped blocks belonging to the same dense target, and fusing the two overlapped blocks belonging to the same dense target to determine the detection result of the dense target.
In a third aspect, an embodiment of the present invention further provides an electronic device, including:
one or more processors;
a storage means for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the adaptive density dense object detection method of any embodiment of the present invention.
In a fourth aspect, embodiments of the present invention further provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the adaptive density dense target detection method according to any of the embodiments of the present invention.
The embodiment of the invention provides a dense target detection method, a dense target detection device, electronic equipment and a storage medium of self-adaptive density, which are characterized in that a preset target detection model is adopted to carry out target detection on an input image, and whether dense targets exist in detection targets is determined according to a preset target dense formula and a preset dense degree threshold; performing self-adaptive overlapping blocking operation on an input image with a dense target to obtain at least two overlapping blocks; and detecting the at least two overlapped blocks to determine two overlapped blocks belonging to the same dense target, and fusing the two overlapped blocks belonging to the same dense target to determine the detection result of the dense target. By adopting the technical scheme of the embodiment of the invention, the density degree of the detection target is estimated by calculating the ratio of the Euclidean distance of the nearest detection target to the diagonal line of the image; after a preset density degree threshold value is set, comparing the density degree of the detection target with the preset density degree threshold value, and adaptively judging whether the image contains the density target or not. If the dense targets exist in the image, performing self-adaptive overlapping blocking by using the blocking strategy designed by the invention; after the overlapped block images are detected by using the detection model, if two overlapped blocks belonging to the same dense target exist, the detection target is fused by using the fusion method designed by the invention; the detection accuracy of the dense targets is improved, the omission rate and the false detection rate are reduced, and the dense targets are positioned more accurately.
Drawings
Other features, objects and advantages of the present invention will become more apparent upon reading of the detailed description of non-limiting embodiments made with reference to the following drawings. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to designate like parts throughout the figures. In the drawings:
FIG. 1 is a flow chart of a dense target detection method of adaptive density provided in an embodiment of the present invention;
FIG. 2 is a flow chart of another dense object detection method of adaptive density provided in an embodiment of the present invention;
FIG. 3 is a schematic diagram of a first fusion overlay segment provided in an embodiment of the present invention;
FIG. 4 is a schematic diagram of a second fusion overlap partition provided in an embodiment of the present invention;
FIG. 5 is a schematic diagram of a third fusion overlap partition provided in an embodiment of the present invention;
FIG. 6 is a schematic diagram of a fourth fusion overlap partition provided in an embodiment of the present invention;
FIG. 7 is a schematic diagram of a dense object detection device with adaptive density according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting thereof. It should be further noted that, for convenience of description, only some, but not all of the structures related to the present invention are shown in the drawings.
Before discussing the exemplary embodiments in more detail, it should be mentioned that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart depicts operations (or steps) as a sequential process, many of the operations (or steps) can be performed in parallel, concurrently, or at the same time. Furthermore, the order of the operations may be rearranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figures. The processes may correspond to methods, functions, procedures, subroutines, and the like.
The technical scheme of the application is that the acquisition, storage, use, processing and the like of the data meet the relevant regulations of national laws and regulations.
Fig. 1 is a flowchart of a dense target detection method with adaptive density according to an embodiment of the present invention. The embodiment is applicable to dense target detection with adaptive density, and the method of the present embodiment may be performed by a dense target detection device with adaptive density, where the device may be implemented in hardware and/or software. The device may be configured in a server for dense target detection with adaptive density. The method specifically comprises the following steps:
s110, performing target detection on the input image by adopting a preset target detection model, and determining whether dense targets exist in the detection targets according to a preset target dense formula and a preset density degree threshold value.
Object detection is a popular research direction in the field of computer vision; its main task is to identify the category of a target in an image and to locate its position, for example recognizing a face in an image and locating it.
The preset target detection model includes, but is not limited to, YOLOv2; in the embodiment of the present invention, YOLOv2 is taken as an example of the preset target detection model used to perform target detection on the input image. YOLOv2 is an end-to-end object detection network model. YOLOv2 makes many improvements, including increasing the input size of the pre-trained model, removing the fully connected layers, introducing the anchor-box mechanism, adding a passthrough layer to connect high-resolution features with low-resolution features, and multi-scale training, so that detection performance is greatly improved.
As an optional but non-limiting implementation manner, the target detection is performed on the input image by using a preset target detection model, and whether a dense target exists in the detected target is determined according to a preset target dense formula and a preset dense degree threshold, including but not limited to steps A1-A3:
step A1: and carrying out target detection on the input image by adopting a preset target detection model, and determining the number of detection targets and the coordinate information of the detection targets.
Step A2: and determining the density degree of each detection target according to a preset target density formula.
Step A3: and determining whether dense targets exist in the detection targets according to the density degree of each detection target and a preset density degree threshold value.
Specifically, the trained preset target detection model is used to detect the input image, and the number n of detection targets and the coordinate information of each detection target are output, wherein the coordinate information comprises the upper-left corner and lower-right corner coordinate values. The density degree of each detection target is then determined by the preset target density formula, and detection targets whose density degree exceeds the preset density degree threshold are taken as dense targets.
As an alternative, but not limiting implementation manner, the determining the density degree of each detection target according to the preset target density formula includes, but is not limited to, steps B1-B3:
step B1: and determining Euclidean distances of coordinates of the center points of all detection targets, comparing the Euclidean distances, and determining the target Euclidean distance with the minimum Euclidean distance.
Step B2: and normalizing the target Euclidean distance according to the diagonal length of the input image to obtain the preset target dense formula.
Step B3: and determining the density degree of each detection target by adopting the preset target density formula.
Wherein, the preset target density formula is expressed as:
dense = min(D_f) / √(w² + h²)

wherein min(D_f) represents the target Euclidean distance, i.e. the minimum of the Euclidean distances D_f = √((x_ci - x_cj)² + (y_ci - y_cj)²) between the center point coordinates (x_ci, y_ci) of the detection targets, i, j ∈ [1, …, n]; dense represents the density degree of the detection targets; and √(w² + h²) represents the diagonal length of the input image, w and h being the image width and height.
In the embodiment of the invention, the Euclidean distances D_f between the center point coordinates of the detection targets are compared to obtain the nearest-neighbor distance min(D_f), and this nearest distance is normalized by the diagonal length of the input image. The normalized nearest-neighbor target distance is defined as the preset target density formula and serves as the basis for judging the density degree of the detection targets.
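By way of illustration, a minimal Python sketch of this density estimate is given below; the function and variable names are illustrative and not taken from the patent, and the direction of the comparison against the threshold σ = 0.1 (a smaller normalized nearest-neighbor distance meaning denser targets) is an assumption consistent with the formula above.

```python
import numpy as np

# Preset density degree threshold sigma (0.1 per the embodiment described below)
SIGMA = 0.1

def density_degree(centers, image_w, image_h):
    """Density degree: smallest pairwise Euclidean distance between target
    center points, normalized by the diagonal length of the input image."""
    centers = np.asarray(centers, dtype=float)
    if len(centers) < 2:
        return float("inf")  # fewer than two targets cannot be mutually dense
    diff = centers[:, None, :] - centers[None, :, :]
    dists = np.sqrt((diff ** 2).sum(axis=-1))   # all pairwise distances D_f
    np.fill_diagonal(dists, np.inf)             # ignore self-distances
    nearest = dists.min()                       # min(D_f)
    diagonal = np.sqrt(image_w ** 2 + image_h ** 2)
    return nearest / diagonal                   # dense

def contains_dense_targets(centers, image_w, image_h, sigma=SIGMA):
    # Assumed comparison direction: targets packed tightly (small normalized
    # nearest-neighbor distance) are treated as dense.
    return density_degree(centers, image_w, image_h) < sigma
```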
The embodiment of the invention also provides another target density formula, in which the ratio of the product of the average target size of the detection targets and the number of detection targets to the area of the region of the input image containing all detection targets is used as a first target density formula:

dens1 = (S_af × n) / ((r - l) × (d - u))

wherein S_af represents the average target size of the detection targets and n represents the number of detection targets; r and d represent the abscissa and ordinate of the lower-right-most detection target in the input image, and l and u represent the abscissa and ordinate of the upper-left-most detection target; (r - l) × (d - u) thus represents the area of the input image region containing all detection targets, and dens1 represents the density degree of the detection targets determined using the first target density formula.
The embodiment of the invention also provides a further target density formula. A matrix A of the same size as the input image is constructed, and the center point coordinates of all detection targets are obtained; the element of A at each detection target center point coordinate is set to 1 and all other elements are set to 0. A sliding window of size M × N is convolved with the matrix A, and the ratio of the maximum value of the convolution result to the sliding window size is used as a second target density formula:

dens2 = max(C) / (M × N)

wherein M × N represents the convolution kernel (sliding window) size, max(C) represents the maximum value of the convolution result C, and dens2 represents the density degree of the detection targets determined using the second target density formula.
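The two alternative formulas can likewise be sketched as follows; this is an illustrative reconstruction only, and the sliding-window size M × N is a free parameter that the text does not fix.

```python
import numpy as np

def dens1(boxes):
    """First alternative formula: (average target area * target count) divided
    by the area of the image region that contains all detections.
    boxes: (n, 4) array of (x1, y1, x2, y2) detection boxes."""
    boxes = np.asarray(boxes, dtype=float)
    n = len(boxes)
    areas = (boxes[:, 2] - boxes[:, 0]) * (boxes[:, 3] - boxes[:, 1])
    s_af = areas.mean()                          # average target size S_af
    l, u = boxes[:, 0].min(), boxes[:, 1].min()  # upper-left-most corner
    r, d = boxes[:, 2].max(), boxes[:, 3].max()  # lower-right-most corner
    return (s_af * n) / ((r - l) * (d - u))

def dens2(centers, image_h, image_w, m, n):
    """Second alternative formula: slide an M x N window over a binary map A
    (1 at every target center point, 0 elsewhere) and take max(C) / (M * N)."""
    a = np.zeros((image_h, image_w), dtype=float)
    for x, y in centers:
        a[int(y), int(x)] = 1.0
    best = 0.0
    # Direct (unoptimized) sliding-window sum, equivalent to convolving A
    # with an all-ones M x N kernel and taking the maximum response.
    for i in range(image_h - m + 1):
        for j in range(image_w - n + 1):
            best = max(best, a[i:i + m, j:j + n].sum())
    return best / (m * n)
```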
To verify the validity of the three target density formulas, the density degrees of the detection targets are calculated separately on a dense face data set and on the FDDB face detection data set, and the characteristics of the detection targets are analyzed; the normalized nearest-neighbor distance formula dense = min(D_f) / √(w² + h²) is finally adopted to determine the density degree of the detection targets, and the preset density degree threshold σ is determined to be 0.1.
S120, performing self-adaptive overlapping blocking operation on the input image with the dense target to obtain at least two overlapping blocks.
The input image containing dense targets is subjected to an adaptive overlap blocking operation; overlapping the blocks prevents missed detections that would be caused by a detection target being split across block boundaries.
As an alternative but non-limiting implementation manner, the adaptive overlap blocking operation is performed on the input image with dense targets, so as to obtain at least two overlap blocks, including but not limited to steps C1-C3:
step C1: and determining the average value of the length and width values of the detection targets according to the coordinate information of each detection target, and determining the length and width value of the overlapping region according to the average value.
Step C2: and determining the length and width values of the input image and presetting the length and width values of the overlapped blocks.
Step C3: and performing self-adaptive overlapping blocking operation on the input image according to the length and width values of the overlapping region, the length and width values of the input image and the length and width values of the preset overlapping blocks to obtain at least two overlapping blocks.
Specifically, the size of the overlap region is determined first: one half of the average detection target size is taken as the length and width of the overlap region. The length and width of the input image are then determined; when the length and width of the input image are not smaller than the preset value, the preset value is taken as the length and width of the preset overlapping block, the preset value of 416 being obtained from experimental experience. Denoting the overlap region size by (w_ov × h_ov), the input image size by (w × h), the preset overlap block size by (416 × 416), the number of horizontal blocks by w_n and the number of vertical blocks by h_n, the following relationships hold:

w_n × (416 - w_ov) = w - w_ov
h_n × (416 - h_ov) = h - h_ov

Since the overlap region size, the input image size and the preset overlap block size are known, the block counts w_n and h_n can be obtained as:

w_n = ⌈(w - w_ov) / (416 - w_ov)⌉
h_n = ⌈(h - h_ov) / (416 - h_ov)⌉

wherein ⌈·⌉ represents rounding up.
As an alternative but non-limiting implementation, after obtaining at least two overlapping partitions, the method further comprises:
and determining the number of the overlapped blocks, and determining the actual length and width values of the overlapped blocks according to the number of the overlapped blocks, the length and width values of the overlapped region and the length and width values of the input image.
After the block counts w_n and h_n and the overlap region size are determined, the actual length and width values of the overlapping blocks can be determined as:

w_b = ⌊(w - w_ov) / w_n + w_ov⌋
h_b = ⌊(h - h_ov) / h_n + h_ov⌋

wherein w_b and h_b represent the actual length and width values of the overlapping blocks, and ⌊·⌋ represents rounding down.
At this point, the blocking parameters, namely the overlap region size, the actual overlapping block size, the number of horizontal blocks and the number of vertical blocks, are all determined, and the adaptive overlap blocking operation is performed on the input image according to these parameters.
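An illustrative sketch of the blocking parameter computation is given below; the exact form of the actual block size derived from the rounded-up block counts, and the block_origins helper used to lay out the tiles, are assumptions based on the relationships above rather than text taken from the patent.

```python
import math

def overlap_block_params(image_w, image_h, w_ov, h_ov, preset=416):
    """Derive the adaptive overlap-blocking parameters: block counts (rounded
    up) and actual block side lengths (rounded down), per the relations above.
    w_ov, h_ov: overlap region width/height (half the average target size)."""
    w_n = math.ceil((image_w - w_ov) / (preset - w_ov))  # horizontal blocks
    h_n = math.ceil((image_h - h_ov) / (preset - h_ov))  # vertical blocks
    # Recompute the actual block size from the rounded-up counts
    w_b = math.floor((image_w - w_ov) / w_n + w_ov)
    h_b = math.floor((image_h - h_ov) / h_n + h_ov)
    return w_n, h_n, w_b, h_b

def block_origins(image_w, image_h, w_n, h_n, w_b, h_b, w_ov, h_ov):
    """Top-left corners of the overlapping blocks; stride = block size - overlap,
    with the last block clamped to the image border."""
    xs = [min(i * (w_b - w_ov), image_w - w_b) for i in range(w_n)]
    ys = [min(j * (h_b - h_ov), image_h - h_b) for j in range(h_n)]
    return [(x, y) for y in ys for x in xs]
```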
S130, detecting the at least two overlapped blocks to determine two overlapped blocks belonging to the same dense target, and fusing the two overlapped blocks belonging to the same dense target to determine a detection result of the dense target.
Specifically, after the input image is divided into overlapping blocks, a dense target may fall into several overlapping blocks at the same time. After the overlapping blocks are detected and the detections are mapped back to absolute coordinates, one dense target may therefore have several absolute-coordinate boxes at the same time, i.e. repeated frames; a repeated frame may refer to two overlapping blocks belonging to the same dense target. If repeated frames exist, they are fused to obtain the detection result of the dense target.
As an alternative but non-limiting implementation, detecting the at least two overlapping partitions determines two overlapping partitions belonging to the same dense target, including but not limited to steps D1-D2:
step D1: and carrying out pairwise difference value operation on the central points of the at least two overlapped blocks, and taking the two overlapped blocks with the central point abscissa and ordinate distances smaller than the preset distance threshold value as candidate fusion overlapped blocks if the central point abscissa and ordinate distances of the two overlapped blocks are smaller than the preset distance threshold value.
Step D2: and if the intersection ratio of the candidate fusion overlapping blocks is larger than zero and the duty ratio of the overlapping part is larger than a preset overlapping threshold value, determining that the candidate fusion overlapping blocks are two overlapping blocks belonging to the same dense target.
A pairwise difference operation is performed on the center points of the detections; if the horizontal and vertical distances between the center points of two targets are both smaller than the preset distance threshold, the two detections may be repeated frames and are taken as candidate fusion overlapping blocks. The positioning coordinates of an overlapping block are denoted [(x_ai, y_ai), (x_bi, y_bi)], wherein (x_ai, y_ai) are the upper-left corner coordinates and (x_bi, y_bi) are the lower-right corner coordinates of the overlapping block, i, j ∈ [1, …, N], and N is the number of targets after block detection. If the intersection ratio μ of two candidate fusion overlapping blocks is larger than 0, the two blocks have an overlapping part; if, in addition, the duty ratio λ_1 or λ_2 of the overlapping part is larger than 0.5, the two candidate fusion overlapping blocks are judged to belong to the same dense target.
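An illustrative Python sketch of this repeated-frame judgment follows; the preset distance threshold is left as a parameter since its value is not specified in the text, and the overlap-part threshold of 0.5 follows the description above.

```python
def is_same_dense_target(box_i, box_j, dist_thresh, overlap_thresh=0.5):
    """Judge whether two block-level detections are repeated frames of the same
    dense target. Boxes are (x_a, y_a, x_b, y_b) in absolute image coordinates."""
    xi1, yi1, xi2, yi2 = box_i
    xj1, yj1, xj2, yj2 = box_j
    # Candidate check: horizontal and vertical center distances below threshold
    cxi, cyi = (xi1 + xi2) / 2.0, (yi1 + yi2) / 2.0
    cxj, cyj = (xj1 + xj2) / 2.0, (yj1 + yj2) / 2.0
    if abs(cxi - cxj) >= dist_thresh or abs(cyi - cyj) >= dist_thresh:
        return False
    # Overlap of the two candidate boxes (mu > 0 means an overlapping part exists)
    iw = min(xi2, xj2) - max(xi1, xj1)
    ih = min(yi2, yj2) - max(yi1, yj1)
    inter = max(iw, 0.0) * max(ih, 0.0)
    if inter <= 0:
        return False
    # Duty ratios lambda_1 / lambda_2 of the overlapping part within each box
    lam1 = inter / ((xi2 - xi1) * (yi2 - yi1))
    lam2 = inter / ((xj2 - xj1) * (yj2 - yj1))
    return lam1 > overlap_thresh or lam2 > overlap_thresh
```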
As an alternative but non-limiting implementation manner, two overlapping blocks belonging to the same dense target are fused to determine the detection result of the dense target, including but not limited to steps E1-E3:
step E1: and fusing two overlapped blocks belonging to the same dense target, and determining a minimum rectangular frame comprising the two overlapped blocks.
Step E2: and determining the coordinate information of two overlapped blocks belonging to the same dense target, and determining the coordinate information of the minimum rectangular frame according to the coordinate information.
Step E3: and determining the detection result of the dense target according to the coordinate information of the minimum rectangular frame.
Specifically, the process of fusing overlapping blocks is shown in FIGS. 3-6. The upper-left and lower-right corner coordinates of the two overlapping blocks are [(x_ai, y_ai), (x_bi, y_bi)] and [(x_aj, y_aj), (x_bj, y_bj)] respectively. If the two overlapping blocks have a longitudinally overlapping part, they are fused in the manner shown in FIG. 3; if they have a laterally overlapping part, they are fused in the manner shown in FIG. 5. The upper-left corner coordinates of the fused target are (x_ai, y_ai), i.e. the abscissa and ordinate are taken from whichever of the two repeated targets is closest to the upper left; the lower-right corner coordinates of the fused target are (x_bj, y_bj), i.e. the abscissa and ordinate are taken from whichever of the two repeated targets is closest to the lower right.
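A minimal sketch of the fusion step, which takes the minimum rectangle enclosing the two repeated frames as described above; the function name is illustrative.

```python
def fuse_repeated_frames(box_i, box_j):
    """Fuse two repeated frames of the same dense target into the minimum
    rectangle enclosing both detections."""
    xi1, yi1, xi2, yi2 = box_i
    xj1, yj1, xj2, yj2 = box_j
    x_a, y_a = min(xi1, xj1), min(yi1, yj1)   # corner closest to the upper left
    x_b, y_b = max(xi2, xj2), max(yi2, yj2)   # corner closest to the lower right
    return (x_a, y_a, x_b, y_b)

# Example: a face split across two vertically adjacent blocks
# fuse_repeated_frames((100, 80, 160, 150), (100, 140, 160, 210))
# -> (100, 80, 160, 210)
```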
The embodiment of the invention provides a dense target detection method of self-adaptive density, which is characterized in that a preset target detection model is adopted to carry out target detection on an input image, and whether dense targets exist in detection targets is determined according to a preset target dense formula and a preset density degree threshold; performing self-adaptive overlapping blocking operation on an input image with a dense target to obtain at least two overlapping blocks; and detecting the at least two overlapped blocks to determine two overlapped blocks belonging to the same dense target, and fusing the two overlapped blocks belonging to the same dense target to determine the detection result of the dense target. By adopting the technical scheme of the embodiment of the invention, the density degree of the detection target is estimated by calculating the ratio of the Euclidean distance of the nearest detection target to the diagonal line of the image; after a preset density degree threshold value is set, comparing the density degree of the detection target with the preset density degree threshold value, and adaptively judging whether the image contains the density target or not. If the dense targets exist in the image, performing self-adaptive overlapping blocking by using the blocking strategy designed by the invention; after the overlapped block images are detected by using the detection model, if two overlapped blocks belonging to the same dense target exist, the detection target is fused by using the fusion method designed by the invention; the detection accuracy of the dense targets is improved, the omission rate and the false detection rate are reduced, and the dense targets are positioned more accurately.
Fig. 7 is a schematic structural diagram of an adaptive density dense target detection device according to an embodiment of the present invention, where the technical solution of the present embodiment is applicable to an adaptive density dense target detection case, and the device may be implemented by software and/or hardware and is generally integrated on any electronic device having a network communication function, where the electronic device includes but is not limited to: server, computer, personal digital assistant, etc. As shown in fig. 7, the dense target detection apparatus of adaptive density provided in the present embodiment may include: a dense target determination module 710, an adaptive overlap blocking module 720, and a dense target fusion module 730; wherein,
the dense target determining module 710 is configured to perform target detection on the input image by using a preset target detection model, and determine whether a dense target exists in the detected target according to a preset target dense formula and a preset density threshold;
the adaptive overlap blocking module 720 is configured to perform adaptive overlap blocking operation on an input image with a dense target, so as to obtain at least two overlap blocks;
and the dense target fusion module 730 is configured to detect the at least two overlapping blocks to determine two overlapping blocks belonging to the same dense target, and fuse the two overlapping blocks belonging to the same dense target to determine a detection result of the dense target.
On the basis of the above embodiment, optionally, the dense target determining module is specifically configured to:
performing target detection on the input image by adopting a preset target detection model, and determining the number of detection targets and coordinate information of the detection targets;
determining the density degree of each detection target according to a preset target density formula;
determining whether dense targets exist in the detection targets according to the density degree of each detection target and a preset density degree threshold;
wherein, the detection target with the density degree larger than the preset density degree threshold value is a dense target.
On the basis of the above embodiment, optionally, the dense target determining module is further specifically configured to:
determining Euclidean distances of coordinates of center points of all detection targets, comparing the Euclidean distances, and determining the target Euclidean distance with the minimum Euclidean distance;
normalizing the target Euclidean distance according to the diagonal length of the input image to obtain the preset target dense formula;
determining the density degree of each detection target by adopting the preset target density formula;
wherein, the preset target density formula is expressed as:
dense = min(D_f) / √(w² + h²)

wherein min(D_f) represents the target Euclidean distance, i.e. the minimum of the Euclidean distances D_f = √((x_ci - x_cj)² + (y_ci - y_cj)²) between the center point coordinates (x_ci, y_ci) of the detection targets, i, j ∈ [1, …, n]; dense represents the density degree of the detection targets; and √(w² + h²) represents the diagonal length of the input image, w and h being the image width and height.
On the basis of the foregoing embodiment, optionally, the adaptive overlapping blocking module is specifically configured to:
determining an average value of the length and width values of the detection targets according to the coordinate information of each detection target, and determining the length and width value of the overlapping region according to the average value;
determining a length and width value of an input image and presetting a length and width value of an overlapped block;
and performing self-adaptive overlapping blocking operation on the input image according to the length and width values of the overlapping region, the length and width values of the input image and the length and width values of the preset overlapping blocks to obtain at least two overlapping blocks.
On the basis of the foregoing embodiment, optionally, the adaptive overlapping blocking module is further specifically configured to:
and determining the number of the overlapped blocks, and determining the actual length and width values of the overlapped blocks according to the number of the overlapped blocks, the length and width values of the overlapped region and the length and width values of the input image.
On the basis of the above embodiment, optionally, the dense target fusion module is specifically configured to:
performing pairwise difference value operation on the central points of the at least two overlapped blocks, and taking the two overlapped blocks with the central point having the horizontal and vertical coordinate distances smaller than the preset distance threshold value as candidate fusion overlapped blocks if the horizontal and vertical coordinate distances of the central points of the two overlapped blocks are smaller than the preset distance threshold value;
and if the intersection ratio of the candidate fusion overlapping blocks is larger than zero and the duty ratio of the overlapping part is larger than a preset overlapping threshold value, determining that the candidate fusion overlapping blocks are two overlapping blocks belonging to the same dense target.
On the basis of the above embodiment, optionally, the dense target fusion module is further specifically configured to:
fusing two overlapped blocks belonging to the same dense target, and determining a minimum rectangular frame comprising the two overlapped blocks;
determining coordinate information of two overlapped blocks belonging to the same dense target, and determining coordinate information of a minimum rectangular frame according to the coordinate information;
and determining the detection result of the dense target according to the coordinate information of the minimum rectangular frame.
The dense target detection device with adaptive density provided in the embodiment of the invention can execute the dense target detection method with adaptive density provided in any embodiment of the invention, has the corresponding functions and beneficial effects of executing the dense target detection method with adaptive density, and the detailed process refers to the related operation of the dense target detection method with adaptive density in the embodiment.
Fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present invention. The electronic device 10 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic equipment may also represent various forms of mobile devices, such as personal digital processing, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed herein.
As shown in fig. 8, the electronic device 10 includes at least one processor 11, and a memory, such as a Read Only Memory (ROM) 12, a Random Access Memory (RAM) 13, etc., communicatively connected to the at least one processor 11, in which the memory stores a computer program executable by the at least one processor, and the processor 11 may perform various appropriate actions and processes according to the computer program stored in the Read Only Memory (ROM) 12 or the computer program loaded from the storage unit 18 into the Random Access Memory (RAM) 13. In the RAM 13, various programs and data required for the operation of the electronic device 10 may also be stored. The processor 11, the ROM 12 and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to bus 14.
Various components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, etc.; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, digital Signal Processors (DSPs), and any suitable processor, controller, microcontroller, etc. The processor 11 performs the various methods and processes described above, such as the adaptive density dense object detection method.
In some embodiments, the adaptive density dense target detection method may be implemented as a computer program tangibly embodied on a computer readable storage medium, such as storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into RAM 13 and executed by processor 11, one or more steps of the adaptive density dense object detection method described above may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the dense target detection method of adaptive density in any other suitable manner (e.g., by means of firmware).
Various implementations of the systems and techniques described here above may be implemented in digital electronic circuitry, integrated circuit systems, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs, where the one or more computer programs may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special purpose or general-purpose programmable processor, that may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or a cloud host, which is a host product in a cloud computing service system and overcomes the defects of high management difficulty and weak service expansibility existing in traditional physical hosts and VPS (Virtual Private Server) services.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention are achieved, and the present invention is not limited herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. A dense target detection method of adaptive density, the method comprising:
performing target detection on an input image by adopting a preset target detection model, and determining whether a dense target exists in the detection target according to a preset target dense formula and a preset density degree threshold;
performing self-adaptive overlapping blocking operation on an input image with a dense target to obtain at least two overlapping blocks;
and detecting the at least two overlapped blocks to determine two overlapped blocks belonging to the same dense target, and fusing the two overlapped blocks belonging to the same dense target to determine the detection result of the dense target.
2. The method of claim 1, wherein the performing object detection on the input image using the preset object detection model, and determining whether a dense object exists in the detected object according to a preset object dense formula and a preset dense degree threshold value, comprises:
performing target detection on the input image by adopting a preset target detection model, and determining the number of detection targets and coordinate information of the detection targets;
determining the density degree of each detection target according to a preset target density formula;
determining whether dense targets exist in the detection targets according to the density degree of each detection target and a preset density degree threshold;
wherein, the detection target with the density degree larger than the preset density degree threshold value is a dense target.
3. The method of claim 2, wherein determining the density level of each detection target according to a predetermined target density formula comprises:
determining Euclidean distances of coordinates of center points of all detection targets, comparing the Euclidean distances, and determining the target Euclidean distance with the minimum Euclidean distance;
normalizing the target Euclidean distance according to the diagonal length of the input image to obtain the preset target dense formula;
determining the density degree of each detection target by adopting the preset target density formula;
wherein, the preset target density formula is expressed as:
dense = min(D_f) / √(w² + h²)

wherein min(D_f) represents the target Euclidean distance, i.e. the minimum of the Euclidean distances D_f = √((x_ci - x_cj)² + (y_ci - y_cj)²) between the center point coordinates (x_ci, y_ci) of the detection targets, i, j ∈ [1, …, n]; dense represents the density degree of the detection targets; and √(w² + h²) represents the diagonal length of the input image, w and h being the image width and height.
4. The method according to claim 2, wherein said adaptively overlapping tiles the input image in which the dense object exists, to obtain at least two overlapping tiles, comprises:
determining an average value of the length and width values of the detection targets according to the coordinate information of each detection target, and determining the length and width value of the overlapping region according to the average value;
determining a length and width value of an input image and presetting a length and width value of an overlapped block;
and performing self-adaptive overlapping blocking operation on the input image according to the length and width values of the overlapping region, the length and width values of the input image and the length and width values of the preset overlapping blocks to obtain at least two overlapping blocks.
5. The method of claim 4, wherein after obtaining at least two overlapping partitions, the method further comprises:
and determining the number of the overlapped blocks, and determining the actual length and width values of the overlapped blocks according to the number of the overlapped blocks, the length and width values of the overlapped region and the length and width values of the input image.
6. The method of claim 1, wherein detecting the at least two overlapping partitions to determine two overlapping partitions belonging to the same dense target comprises:
performing pairwise difference value operation on the central points of the at least two overlapped blocks, and taking the two overlapped blocks with the central point having the horizontal and vertical coordinate distances smaller than the preset distance threshold value as candidate fusion overlapped blocks if the horizontal and vertical coordinate distances of the central points of the two overlapped blocks are smaller than the preset distance threshold value;
and if the intersection ratio of the candidate fusion overlapping blocks is larger than zero and the duty ratio of the overlapping part is larger than a preset overlapping threshold value, determining that the candidate fusion overlapping blocks are two overlapping blocks belonging to the same dense target.
7. The method of claim 1, wherein fusing two overlapping partitions belonging to the same dense target to determine a detection result of the dense target comprises:
fusing two overlapped blocks belonging to the same dense target, and determining a minimum rectangular frame comprising the two overlapped blocks;
determining coordinate information of two overlapped blocks belonging to the same dense target, and determining coordinate information of a minimum rectangular frame according to the coordinate information;
and determining the detection result of the dense target according to the coordinate information of the minimum rectangular frame.
8. An adaptive density dense target detection apparatus, the apparatus comprising:
the dense target determining module is used for carrying out target detection on the input image by adopting a preset target detection model, and determining whether dense targets exist in the detection targets according to a preset target dense formula and a preset density degree threshold value;
the self-adaptive overlapped block dividing module is used for carrying out self-adaptive overlapped block dividing operation on the input image with the dense target to obtain at least two overlapped blocks;
and the dense target fusion module is used for detecting the at least two overlapped blocks to determine two overlapped blocks belonging to the same dense target, and fusing the two overlapped blocks belonging to the same dense target to determine the detection result of the dense target.
9. An electronic device, comprising:
one or more processors;
a storage means for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the adaptive density dense target detection method of any of claims 1-7.
10. A storage medium containing computer executable instructions which, when executed by a computer processor, are for performing the adaptive density dense object detection method of any of claims 1-7.
CN202311666779.6A 2023-12-06 2023-12-06 Dense target detection method, device, equipment and medium of self-adaptive density Pending CN117671242A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311666779.6A CN117671242A (en) 2023-12-06 2023-12-06 Dense target detection method, device, equipment and medium of self-adaptive density

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311666779.6A CN117671242A (en) 2023-12-06 2023-12-06 Dense target detection method, device, equipment and medium of self-adaptive density

Publications (1)

Publication Number Publication Date
CN117671242A true CN117671242A (en) 2024-03-08

Family

ID=90076585

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311666779.6A Pending CN117671242A (en) 2023-12-06 2023-12-06 Dense target detection method, device, equipment and medium of self-adaptive density

Country Status (1)

Country Link
CN (1) CN117671242A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination