CN111553904B - Unmanned aerial vehicle-based regional people counting method and system

Info

Publication number: CN111553904B
Application number: CN202010356436.XA
Authority: CN (China)
Prior art keywords: unmanned aerial vehicle, area, people, information
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN111553904A (en)
Inventors: 高志斌, 郭洋洋, 黄联芬, 林和志, 王明康, 陈发明, 陈舒玲, 黄长龙
Original assignee: Xiamen Huafang Software Technology Co., Ltd.; Xiamen University
Current assignee: Xiamen Huafang Software Technology Co., Ltd.; Xiamen University
Application filed by Xiamen Huafang Software Technology Co., Ltd. and Xiamen University
Priority: CN202010356436.XA
Publications: CN111553904A (application), CN111553904B (grant)

Classifications

    • G PHYSICS › G06 COMPUTING; CALCULATING OR COUNTING › G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL › G06T 7/00 Image analysis › G06T 7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS › G06 COMPUTING; CALCULATING OR COUNTING › G06F ELECTRIC DIGITAL DATA PROCESSING › G06F 18/00 Pattern recognition › G06F 18/20 Analysing › G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation › G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS › G06 COMPUTING; CALCULATING OR COUNTING › G06F ELECTRIC DIGITAL DATA PROCESSING › G06F 18/00 Pattern recognition › G06F 18/20 Analysing › G06F 18/24 Classification techniques
    • G PHYSICS › G06 COMPUTING; CALCULATING OR COUNTING › G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS › G06N 3/00 Computing arrangements based on biological models › G06N 3/02 Neural networks › G06N 3/04 Architecture, e.g. interconnection topology › G06N 3/045 Combinations of networks
    • G PHYSICS › G06 COMPUTING; CALCULATING OR COUNTING › G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS › G06N 3/00 Computing arrangements based on biological models › G06N 3/02 Neural networks › G06N 3/08 Learning methods
    • G PHYSICS › G06 COMPUTING; CALCULATING OR COUNTING › G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING › G06V 20/00 Scenes; Scene-specific elements › G06V 20/10 Terrestrial scenes
    • G PHYSICS › G06 COMPUTING; CALCULATING OR COUNTING › G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING › G06V 20/00 Scenes; Scene-specific elements › G06V 20/50 Context or environment of the image › G06V 20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects › G06V 20/53 Recognition of crowd images, e.g. recognition of crowd congestion
    • G PHYSICS › G06 COMPUTING; CALCULATING OR COUNTING › G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL › G06T 2207/00 Indexing scheme for image analysis or image enhancement › G06T 2207/20 Special algorithmic details › G06T 2207/20081 Training; Learning
    • G PHYSICS › G06 COMPUTING; CALCULATING OR COUNTING › G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL › G06T 2207/00 Indexing scheme for image analysis or image enhancement › G06T 2207/20 Special algorithmic details › G06T 2207/20084 Artificial neural networks [ANN]
    • G PHYSICS › G06 COMPUTING; CALCULATING OR COUNTING › G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL › G06T 2207/00 Indexing scheme for image analysis or image enhancement › G06T 2207/30 Subject of image; Context of image processing › G06T 2207/30196 Human being; Person
    • G PHYSICS › G06 COMPUTING; CALCULATING OR COUNTING › G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL › G06T 2207/00 Indexing scheme for image analysis or image enhancement › G06T 2207/30 Subject of image; Context of image processing › G06T 2207/30242 Counting objects in image

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Health & Medical Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Biophysics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Quality & Reliability (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an unmanned aerial vehicle-based regional people counting method, medium, equipment and system. The method comprises the following steps: calculating the maximum hovering height of the unmanned aerial vehicle according to the unmanned aerial vehicle parameters and the personnel shooting parameters; calculating the minimum number of unmanned aerial vehicles according to the information of the area to be counted and the information of the shooting overlapping areas, so as to determine the adjustable range of the control parameters; acquiring the control parameters, and calculating the positioning information corresponding to each unmanned aerial vehicle according to the control parameters and the information of the area to be counted, so as to control each unmanned aerial vehicle to fly to a specified position; and photographing the visual field area to obtain a region picture, and performing personnel detection on the region picture through a lightweight neural network classifier to generate the people number information corresponding to the unmanned aerial vehicle, so that the number of people in the area to be counted is counted according to the people number information. The number of people in an open place can thus be counted automatically and effectively, so that the crowd in the open place is controlled in real time, trampling events are prevented, and the safety of people in the open place is guaranteed.

Description

Unmanned aerial vehicle-based regional people counting method and system
Technical Field
The invention relates to the technical field of unmanned aerial vehicle control, in particular to an unmanned aerial vehicle-based regional people counting method, a computer-readable storage medium, computer equipment and an unmanned aerial vehicle-based regional people counting system.
Background
People counting data are indispensable for management and decision making in public places such as supermarkets, shopping centers, airports and stations. With people counting information, managers can schedule manpower and material resources reasonably, take precautions against changes in crowd size, deal with over-dense crowds in time, prevent trampling events, and thereby ensure the safety of people and an orderly, stable social life.
In the related art, people counting is mostly performed with a fixed image acquisition device, or a camera mounted on a sliding device, that captures images at a fixed entrance, and the number of people entering through that entrance is then counted. However, an open place such as a square or a playground has no fixed entrance, so it is difficult for the conventional method to count the number of people in the open place effectively.
Disclosure of Invention
The present invention is directed to solving, at least to some extent, one of the technical problems in the related art. Therefore, one object of the present invention is to provide an unmanned aerial vehicle-based regional people counting method, which can automatically and effectively count the number of people in an open place, and further control the crowd in the open place in real time, thereby preventing the occurrence of trampling events and ensuring the safety of the people in the open place.
A second object of the invention is to propose a computer-readable storage medium.
A third object of the invention is to propose a computer device.
The fourth purpose of the invention is to provide an unmanned aerial vehicle-based regional people counting system.
In order to achieve the above object, an embodiment of a first aspect of the present invention provides an area people counting method based on an unmanned aerial vehicle, including the following steps: acquiring unmanned aerial vehicle parameters and personnel shooting parameters, and calculating the maximum hovering height of the unmanned aerial vehicle according to the unmanned aerial vehicle parameters and the personnel shooting parameters; acquiring information of an area to be counted and information of shooting overlapping areas between every two unmanned aerial vehicles, and calculating the minimum number of unmanned aerial vehicles according to the information of the area to be counted and the information of the shooting overlapping areas between every two unmanned aerial vehicles so as to determine the adjustable range of control parameters according to the maximum hovering height and the minimum number of unmanned aerial vehicles; acquiring control parameters according to the adjustable range of the control parameters, calculating positioning information corresponding to each unmanned aerial vehicle according to the control parameters and the information of the area to be counted, and controlling each unmanned aerial vehicle to fly to a specified position according to the positioning information; controlling each unmanned aerial vehicle to photograph a visual field region to obtain a region picture, inputting the region picture into a lightweight neural network classifier, detecting people in the region picture through the lightweight neural network classifier, and generating people number information corresponding to the unmanned aerial vehicle according to a detection result so as to count people in a region to be counted according to the people number information.
According to the area people counting method based on the unmanned aerial vehicle, firstly, parameters of the unmanned aerial vehicle and personnel shooting parameters are obtained, and the maximum hovering height of the unmanned aerial vehicle is calculated according to the parameters of the unmanned aerial vehicle and the personnel shooting parameters; then, acquiring information of an area to be counted and information of shooting overlapping areas between every two unmanned aerial vehicles, and calculating the minimum number of unmanned aerial vehicles according to the information of the area to be counted and the information of the shooting overlapping areas between every two unmanned aerial vehicles so as to determine the adjustable range of the control parameters according to the maximum hovering height and the minimum number of the unmanned aerial vehicles; then, acquiring control parameters according to the adjustable range of the control parameters, calculating positioning information corresponding to each unmanned aerial vehicle according to the control parameters and the information of the area to be counted, and controlling each unmanned aerial vehicle to fly to a specified position according to the positioning information; then, controlling each unmanned aerial vehicle to photograph a visual field region to obtain a region picture, inputting the region picture into a lightweight neural network classifier to perform personnel detection on the region picture through the lightweight neural network classifier, and generating the number information corresponding to the unmanned aerial vehicle according to a detection result so as to perform people counting in a region to be counted according to the number information; therefore, the number of people in the open place can be automatically and effectively counted, the number of people in the open place can be controlled in real time, the trampling event is prevented, and the safety of the people in the open place is guaranteed.
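Purely as an illustration of this flow (not part of the claimed method), the per-drone dispatch-and-count loop can be sketched in Python; the drone interface (fly_to, photograph) and the lightweight_count callable are hypothetical names introduced here for readability.

```python
def count_region(drones, positions, lightweight_count):
    """High-level sketch of the dispatch and counting loop once positions are planned.

    drones: objects exposing fly_to(position) and photograph() (hypothetical interface);
    lightweight_count: callable mapping a region picture to a people count.
    """
    per_drone_counts = []
    for drone, position in zip(drones, positions):
        drone.fly_to(position)            # fly to the specified position
        picture = drone.photograph()      # photograph the visual field region
        per_drone_counts.append(lightweight_count(picture))
    return per_drone_counts               # aggregation over overlap regions is described later
```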
In addition, the unmanned aerial vehicle-based regional people counting method according to the above embodiment of the present invention may further have the following additional technical features:
optionally, the method further comprises: and generating the panoramic picture of the area to be counted according to the area pictures corresponding to all the unmanned aerial vehicles, and inputting the panoramic picture into a heavyweight end-to-end convolution network, so that the heavyweight end-to-end convolution network can accurately count the number of people in the area to be counted according to the panoramic picture.
Optionally, before inputting the region picture into the lightweight neural network classifier, the method further comprises: acquiring the region pictures corresponding to the unmanned aerial vehicles adjacent to the unmanned aerial vehicle, and labeling the region picture corresponding to the unmanned aerial vehicle according to the region pictures corresponding to the adjacent unmanned aerial vehicles, wherein the labeling comprises labeling of the visual field overlapping area and labeling of the visual field non-overlapping area, so that the lightweight neural network classifier generates, according to the labeled region picture corresponding to the unmanned aerial vehicle, the number of people in the visual field area, the number of people in the visual field overlapping area and the number of people in the visual field non-overlapping area.
Optionally, the method further comprises: judging whether the number of people in the visual field area is greater than a preset people number threshold; and if the number of people in the visual field area is greater than the preset people number threshold, generating alarm information and pushing the alarm information to the relevant personnel so that the relevant personnel can handle the alarm.
Optionally, the maximum hovering height is calculated according to the following formula:

[formula published as an image; not reproduced here]

wherein h_max represents the maximum hovering height, a*b represents the resolution of the region picture, e_1*e_2 represents the resolution corresponding to a person, S_man represents the floor area occupied by a person, λ represents the imaging ratio, and β represents the camera view angle.
Optionally, the minimum number of unmanned aerial vehicles is calculated according to the following formula:

[formula published as an image; not reproduced here]

wherein Num_min represents the minimum number of unmanned aerial vehicles, M and N represent the length and width of the area to be counted respectively, d_1 represents the long-side value of the visual field overlapping area between two adjacent unmanned aerial vehicles, d_2 represents the wide-side value of the visual field overlapping area between two adjacent unmanned aerial vehicles, λ represents the imaging ratio, and k represents the ratio of the actual ground side length to the corresponding region picture side length.
In order to achieve the above object, a second embodiment of the present invention provides a computer-readable storage medium, on which an unmanned aerial vehicle-based regional population counting program is stored, wherein the unmanned aerial vehicle-based regional population counting program, when executed by a processor, implements the unmanned aerial vehicle-based regional population counting method.
According to the computer-readable storage medium of the embodiment of the invention, the area people counting program based on the unmanned aerial vehicle is stored, so that when the processor executes the area people counting program based on the unmanned aerial vehicle, the above area people counting method based on the unmanned aerial vehicle is realized, the effective counting of the number of people in an open place is realized automatically, the number of people in the open place is controlled in real time, the occurrence of trampling events is prevented, and the safety of people in the open place is ensured.
In order to achieve the above object, a third embodiment of the present invention provides a computer device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the method for counting the number of people in the unmanned aerial vehicle-based region is implemented.
According to the computer equipment provided by the embodiment of the invention, the memory is used for storing the area people counting program based on the unmanned aerial vehicle, so that the processor can realize the area people counting method based on the unmanned aerial vehicle when executing the area people counting program based on the unmanned aerial vehicle, thereby automatically and effectively counting the number of people in an open place, further controlling the number of people in the open place in real time, preventing the occurrence of trampling events and ensuring the safety of the people in the open place.
In order to achieve the above object, a fourth aspect of the present invention provides an unmanned aerial vehicle-based regional people counting system, including a plurality of unmanned aerial vehicles and a monitoring center, wherein: the monitoring center is used for acquiring unmanned aerial vehicle parameters and personnel shooting parameters, calculating the maximum hovering height of the unmanned aerial vehicle according to the unmanned aerial vehicle parameters and the personnel shooting parameters, acquiring information of an area to be counted and information of shooting overlapping areas between every two unmanned aerial vehicles, and calculating the minimum number of unmanned aerial vehicles according to the information of the area to be counted and the information of the shooting overlapping areas between every two unmanned aerial vehicles, so as to determine the adjustable range of the control parameters according to the maximum hovering height and the minimum number of unmanned aerial vehicles; the monitoring center is also used for acquiring control parameters according to the adjustable range of the control parameters, calculating the positioning information corresponding to each unmanned aerial vehicle according to the control parameters and the information of the area to be counted, and controlling each unmanned aerial vehicle to fly to a specified position according to the positioning information; the unmanned aerial vehicle is used for photographing a visual field region to obtain a region picture, inputting the region picture into the lightweight neural network classifier, detecting people in the region picture through the lightweight neural network classifier, and generating people number information corresponding to the unmanned aerial vehicle according to a detection result, so that the number of people in the area to be counted is counted according to the people number information.
According to the unmanned aerial vehicle-based regional people counting system of the embodiment of the invention, the monitoring center is arranged to acquire the unmanned aerial vehicle parameters and the personnel shooting parameters, calculate the maximum hovering height of the unmanned aerial vehicle according to the unmanned aerial vehicle parameters and the personnel shooting parameters, acquire the information of the area to be counted and the information of the shooting overlapping areas between every two unmanned aerial vehicles, and calculate the minimum number of unmanned aerial vehicles according to the information of the area to be counted and the information of the shooting overlapping areas between every two unmanned aerial vehicles, so as to determine the adjustable range of the control parameters according to the maximum hovering height and the minimum number of unmanned aerial vehicles; the monitoring center is also used for acquiring control parameters according to the adjustable range of the control parameters, calculating the positioning information corresponding to each unmanned aerial vehicle according to the control parameters and the information of the area to be counted, and controlling each unmanned aerial vehicle to fly to a specified position according to the positioning information; the unmanned aerial vehicle is used for photographing the visual field region to obtain a region picture, inputting the region picture into the lightweight neural network classifier to detect people in the region picture through the lightweight neural network classifier, and generating people number information corresponding to the unmanned aerial vehicle according to the detection result, so as to count the number of people in the area to be counted according to the people number information; therefore, the number of people in the open place can be automatically and effectively counted, the crowd in the open place can be controlled in real time, trampling events are prevented, and the safety of the people in the open place is guaranteed.
In addition, the unmanned aerial vehicle-based regional people counting system provided by the embodiment of the invention can also have the following additional technical characteristics:
optionally, the monitoring center is further configured to generate a panoramic picture of the area to be counted according to the area pictures corresponding to all the unmanned aerial vehicles, and input the panoramic picture into a heavy end-to-end convolutional network, so that the heavy end-to-end convolutional network accurately counts the number of people in the area to be counted according to the panoramic picture.
Drawings
Fig. 1 is a schematic flow chart of a method for counting the number of people in an area based on an unmanned aerial vehicle according to an embodiment of the invention;
fig. 2 is a schematic view of an imaging mode of an unmanned aerial vehicle according to an embodiment of the invention;
fig. 3 is a schematic diagram illustrating an arrangement of overall area pictures according to an embodiment of the present invention;
fig. 4 is a schematic flow chart of a method for counting the number of people in an area based on a drone according to another embodiment of the present invention;
fig. 5 is a schematic structural diagram of a region people counting system based on an unmanned aerial vehicle according to an embodiment of the invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative and intended to be illustrative of the invention and are not to be construed as limiting the invention.
In the related art, because an open place is not provided with a fixed entrance, it is difficult to count the number of people in the open place effectively. According to the unmanned aerial vehicle-based regional people counting method of the embodiments of the present invention, firstly, the parameters of the unmanned aerial vehicle and the personnel shooting parameters are obtained, and the maximum hovering height of the unmanned aerial vehicle is calculated according to the parameters of the unmanned aerial vehicle and the personnel shooting parameters; then, the information of the area to be counted and the information of the shooting overlapping areas between every two unmanned aerial vehicles are acquired, and the minimum number of unmanned aerial vehicles is calculated according to the information of the area to be counted and the information of the shooting overlapping areas between every two unmanned aerial vehicles, so as to determine the adjustable range of the control parameters according to the maximum hovering height and the minimum number of unmanned aerial vehicles; then, the control parameters are acquired according to the adjustable range of the control parameters, the positioning information corresponding to each unmanned aerial vehicle is calculated according to the control parameters and the information of the area to be counted, and each unmanned aerial vehicle is controlled to fly to a specified position according to the positioning information; then, each unmanned aerial vehicle is controlled to photograph its visual field region to obtain a region picture, the region picture is input into a lightweight neural network classifier so as to perform personnel detection on the region picture through the lightweight neural network classifier, and the people number information corresponding to the unmanned aerial vehicle is generated according to the detection result, so that the number of people in the area to be counted is counted according to the people number information; therefore, the number of people in the open place can be automatically and effectively counted, the crowd in the open place can be controlled in real time, trampling events are prevented, and the safety of the people in the open place is guaranteed.
In order to better understand the above technical solutions, exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the invention are shown in the drawings, it should be understood that the invention can be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
In order to better understand the technical solution, the technical solution will be described in detail with reference to the drawings and the specific embodiments.
Fig. 1 is a schematic flow chart of a method for counting the number of people in an area based on an unmanned aerial vehicle according to an embodiment of the present invention, as shown in fig. 1, the method for counting the number of people in the area based on the unmanned aerial vehicle includes the following steps:
s101, acquiring parameters of the unmanned aerial vehicle and personnel shooting parameters, and calculating the maximum hovering height of the unmanned aerial vehicle according to the parameters of the unmanned aerial vehicle and the personnel shooting parameters.
That is to say, the parameters of the drone and the shooting parameters of the person (for example, the resolution corresponding to each person, etc.) are obtained, so that the maximum hovering height of the drone can be calculated according to the parameters of the drone and the shooting parameters of the person.
The drone parameters may include a plurality of items, for example the drone model, and so on.
As an example, the drone parameters include: the camera view angle β; the resolution a*b of the region picture obtained by the unmanned aerial vehicle; the imaging ratio λ:1; the resolution e_1*e_2 occupied by a person in the image; the average floor area S_man of a standing person; the ratio k between the actual ground side length and the corresponding region picture side length; the long-side value d_1 of the visual field overlapping area between two adjacent unmanned aerial vehicles; and the wide-side value d_2 of the visual field overlapping area between two adjacent unmanned aerial vehicles.
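For readability only, these parameters can be grouped into a single structure; the field names below are introduced here and are not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class DroneParams:
    beta: float    # camera view angle, radians
    a: int         # region picture resolution, long side (pixels)
    b: int         # region picture resolution, wide side (pixels)
    lam: float     # imaging ratio lambda (lambda : 1)
    e1: int        # per-person resolution, long side (pixels)
    e2: int        # per-person resolution, wide side (pixels)
    s_man: float   # average floor area of a standing person (square meters)
    k: float       # actual ground side length / region picture side length
    d1: float      # overlap value along the picture's long side
    d2: float      # overlap value along the picture's wide side
```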
The maximum hovering height may be calculated in various ways.
In some embodiments, the maximum hover height is calculated according to the following formula:
[formula published as an image; not reproduced here]

wherein h_max represents the maximum hovering height, a*b represents the resolution of the region picture, e_1*e_2 represents the resolution corresponding to a person, S_man represents the floor area occupied by a person, λ represents the imaging ratio, and β represents the camera view angle.
It can be understood that, as shown in fig. 2, with the unmanned aerial vehicle taken as point O, h being the height of the unmanned aerial vehicle above the ground, and L_AC being the actual ground length corresponding to the diagonal of the imaged region picture, the following relation can be obtained:

[relation published as an image; not reproduced here]

Further, the relationship between the hovering height of the drone and the field of view area can be expressed as:

[relation published as an image; not reproduced here]

In conclusion, the calculation formula of the maximum hovering height of the unmanned aerial vehicle can be obtained.
In addition, e_1*e_2 and S_man can be determined in many ways. In the specific embodiment of the present invention, in order to ensure the definition of persons within the field of view of the drone, e_1*e_2 = 30*30 is used; meanwhile, the average shoulder width of young men aged 18 to 25 is 38.6 cm and that of young women of the same age is 35 cm, and 35 cm is selected for the purpose of clearer shooting; the shoe size is taken as size 35, i.e. 22.5 cm, so the average floor area is determined to be 0.35 * 0.225 = 0.07875 square meters.
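For illustration, plugging these values into the reconstructed bound above gives a concrete height; the camera resolution and view angle used below are assumed values, not taken from the patent.

```python
import math

# Worked example of the reconstructed hover-height bound.
# Assumed values (not from the patent): 4000x3000 picture, 84-degree view angle.
a, b = 4000, 3000            # region picture resolution (assumed)
beta = math.radians(84.0)    # camera view angle (assumed)
e1, e2 = 30, 30              # minimum pixels per person (from the description)
s_man = 0.35 * 0.225         # average floor area of a standing person, square meters

h_max = math.sqrt(s_man * (a**2 + b**2) / (e1 * e2)) / (2 * math.tan(beta / 2))
print(f"maximum hovering height is roughly {h_max:.1f} m")  # about 26 m for these inputs
```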
S102, acquiring information of an area to be counted and information of a shooting overlapping area between every two unmanned aerial vehicles, and calculating the minimum number of unmanned aerial vehicles according to the information of the area to be counted and the information of the shooting overlapping area between every two unmanned aerial vehicles so as to determine the adjustable range of the control parameters according to the maximum hovering height and the minimum number of unmanned aerial vehicles.
That is, the information of the area to be counted (i.e., the information of the open area whose number of people is to be counted, for example its GPS positioning information, area, length and width) and the information of the shooting overlapping area between every two drones (i.e., the information of the overlapping portion of the pictures shot by the two drones) are obtained; then, the minimum number of unmanned aerial vehicles capable of completing the people counting of the area to be counted is calculated according to the information of the area to be counted and the information of the shooting overlapping area between every two drones, so that the adjustable range of the control parameters is determined according to the maximum hovering height and the minimum number of unmanned aerial vehicles.
In some embodiments, the minimum number of unmanned aerial vehicles is calculated according to the following formula:

[formula published as an image; not reproduced here]

wherein Num_min represents the minimum number of unmanned aerial vehicles, M and N represent the length and width of the area to be counted respectively, d_1 represents the long-side value of the visual field overlapping area between two adjacent unmanned aerial vehicles, d_2 represents the wide-side value of the visual field overlapping area between two adjacent unmanned aerial vehicles, λ represents the imaging ratio, and k represents the ratio of the actual ground side length to the corresponding region picture side length.
It will be appreciated that, as shown in fig. 3, the shaded portions 5 and 6 in the figure are both visual field overlapping areas between two drones. Taking region picture 4 as an example, it is adjacent to two other unmanned aerial vehicles, so it has two visual field overlapping areas. One of them is the overlap of the two region pictures in the wide-side direction: the long side of this overlapping area is the long side of the region picture and its short side is d_2. Correspondingly, the other is the overlap of the two region pictures in the long-side direction: the long side of this overlapping area is the wide side of the region picture and its short side is d_1. Thus, for each drone, only its own two visual field overlapping areas need to be calculated, and statistics can then be made over all visual field overlapping areas.
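The published expression for Num_min is likewise only available as an image. A rough reading, stated here purely as an assumption, is a tiling bound: each picture covers a ground rectangle of about k*a by k*b, adjacent footprints share strips of extent d_1 and d_2, and the M by N area must be tiled.

```latex
% Assumed tiling sketch, not the published equation (which also involves the
% imaging ratio \lambda); k a and k b are the ground sides covered by one picture.
\[
  \mathrm{Num}_{\min}
  \;\approx\;
  \left\lceil \frac{M}{k\,a - d_{1}} \right\rceil
  \cdot
  \left\lceil \frac{N}{k\,b - d_{2}} \right\rceil
\]
```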
Thus, after the maximum hovering height and the minimum number of unmanned aerial vehicles are determined, the adjustable range of the control parameters can be set accordingly. For example, when enough unmanned aerial vehicles are available, the hovering height of the unmanned aerial vehicles can be reduced so that the region pictures they shoot are clearer and the people counting is more accurate; or the camera parameters of the unmanned aerial vehicles can be adjusted, which changes the hovering height, the ratio of the actual ground side length to the image side length, and the number of unmanned aerial vehicles required; or dd_1 and dd_2 can be adjusted, where dd_1 is the ratio of the long-side value of the visual field overlapping area between two adjacent unmanned aerial vehicles to the long side of the region picture shot by a single unmanned aerial vehicle, and dd_2 is the ratio of the wide-side value of the visual field overlapping area between two adjacent unmanned aerial vehicles to the wide side of the region picture shot by a single unmanned aerial vehicle.
S103, acquiring control parameters according to the adjustable range of the control parameters, calculating positioning information corresponding to each unmanned aerial vehicle according to the control parameters and the information of the area to be counted, and controlling each unmanned aerial vehicle to fly to a specified position according to the positioning information.
That is, the control parameters input by the user according to the adjustable range of the control parameters are acquired, the positioning information corresponding to each unmanned aerial vehicle (i.e., the specific hovering position of the unmanned aerial vehicle during task execution) is calculated according to the input control parameters, the positioning information of the area to be counted and its length and width information, and each unmanned aerial vehicle is then controlled to fly to its corresponding execution position according to the positioning information obtained through calculation.
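The patent does not spell out how the hover points are laid out; the grid arithmetic below is an assumption consistent with the overlapping footprints of fig. 3, and converting the offsets into GPS coordinates is left out of the sketch.

```python
import math

def hover_positions(M, N, cover_long, cover_wide, d1, d2):
    """Hover points (x, y), in meters from one corner of the area to be counted.

    M, N: length and width of the area; cover_long, cover_wide: ground footprint of one
    region picture; d1, d2: required overlap between adjacent footprints.
    Assumes d1 < cover_long and d2 < cover_wide.
    """
    step_x, step_y = cover_long - d1, cover_wide - d2
    cols, rows = math.ceil(M / step_x), math.ceil(N / step_y)
    positions = []
    for r in range(rows):
        for c in range(cols):
            # Centre of each footprint, clamped so the last row/column stays inside.
            x = min(c * step_x + cover_long / 2, M - cover_long / 2)
            y = min(r * step_y + cover_wide / 2, N - cover_wide / 2)
            positions.append((x, y))
    return positions
```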
S104, controlling each unmanned aerial vehicle to photograph the visual field region to obtain a region picture, inputting the region picture into the light weight neural network classifier, carrying out personnel detection on the region picture through the light weight neural network classifier, and generating the number information corresponding to the unmanned aerial vehicle according to the detection result so as to carry out the number statistics of the region to be counted according to the number information.
That is to say, after all unmanned aerial vehicles are in position, each unmanned aerial vehicle is controlled to photograph its own visual field region to obtain the region picture corresponding to that visual field region; the obtained region picture is then input into a lightweight neural network classifier trained in advance, so that personnel detection is performed on the region picture by the lightweight neural network classifier; the people number information corresponding to this unmanned aerial vehicle can thereby be obtained, and the number of people in the area to be counted is further counted according to the people number information.
Personnel detection can be carried out in various ways. For example, a camera may be arranged at the bottom of the unmanned aerial vehicle at a certain angle, so that images of persons are obtained through the camera and the body characteristic information of the persons is extracted from the images in order to identify the number of persons in the images; alternatively, the camera may be installed at the bottom of the unmanned aerial vehicle to acquire head images of persons, and the number of persons is obtained by recognising the head characteristic information in the images.
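The patent does not fix the classifier architecture or how detections are turned into a count; the sketch below assumes a hypothetical patch classifier is_person and a simple non-overlapping 30*30 sliding window matching the per-person resolution e_1*e_2 given above.

```python
import numpy as np

def count_people(region_picture: np.ndarray, is_person, patch: int = 30) -> int:
    """Count people in one region picture with a lightweight patch classifier.

    is_person is a hypothetical callable returning True when a patch contains a
    person; adjacent detections are not merged in this simple sketch.
    """
    h, w = region_picture.shape[:2]
    count = 0
    for y in range(0, h - patch + 1, patch):
        for x in range(0, w - patch + 1, patch):
            if is_person(region_picture[y:y + patch, x:x + patch]):
                count += 1
    return count
```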
In some embodiments, in order to further improve the accuracy of the unmanned aerial vehicle-based regional people counting method, the method further includes:
and generating panoramic pictures of the area to be counted according to the area pictures corresponding to all the unmanned aerial vehicles, and inputting the panoramic pictures into a heavyweight end-to-end convolution network so that the heavyweight end-to-end convolution network can accurately count the number of people in the area to be counted according to the panoramic pictures.
It can be understood that, because the computing capability of the unmanned aerial vehicle is relatively limited and its memory is small, the lightweight neural network classifier is loaded on the unmanned aerial vehicle to count the number of people in the corresponding visual field area preliminarily. Further, after the region pictures corresponding to all the unmanned aerial vehicles are obtained, all the pictures can be arranged into a picture matrix according to the positioning information of the unmanned aerial vehicles, and the picture matrix is then stitched by feature matching to obtain the panoramic picture corresponding to the area to be counted, so that the heavyweight end-to-end convolutional network can accurately count the number of people in the area to be counted according to the panoramic picture.
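As an illustration of this two-stage idea, the stitching step can be done with OpenCV's feature-matching stitcher, and the heavyweight network is represented here by a placeholder density_net that returns a crowd-density map; the patent does not name a specific stitching library or network, so both choices are assumptions.

```python
import cv2
import numpy as np

def build_panorama(region_pictures):
    """Stitch the per-drone region pictures into one panorama by feature matching."""
    stitcher = cv2.Stitcher_create()          # default mode; tune for nadir imagery if needed
    status, panorama = stitcher.stitch(region_pictures)
    if status != 0:                           # 0 means stitching succeeded
        raise RuntimeError(f"stitching failed with status {status}")
    return panorama

def count_from_panorama(panorama: np.ndarray, density_net) -> float:
    """density_net stands in for the heavyweight end-to-end convolutional network:
    a crowd-counting model whose density-map output sums to the people count."""
    density_map = density_net(panorama)
    return float(np.asarray(density_map).sum())
```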
In some embodiments, in order to further improve the applicability of the unmanned aerial vehicle-based regional people counting method, before inputting the region picture into the lightweight neural network classifier, the method further includes:
obtain the regional picture that this unmanned aerial vehicle's adjacent unmanned aerial vehicle corresponds to the regional picture that corresponds this unmanned aerial vehicle is marked according to the regional picture that adjacent unmanned aerial vehicle corresponds, wherein, the mark includes the overlapping region mark in field of vision, the non-overlapping region mark in field of vision, so that light-weighted neural network classifier generates the regional people numerical value in field of vision according to the regional picture that this unmanned aerial vehicle after the mark corresponds, overlapping region people numerical value in field of vision and the non-overlapping region people numerical value in field of vision.
That is, while the unmanned aerial vehicle is in wireless communication with the monitoring center, the unmanned aerial vehicle is also in D2D communication with an adjacent unmanned aerial vehicle to acquire an area picture of the adjacent unmanned aerial vehicle, so that the area picture of the unmanned aerial vehicle can be labeled according to the area picture of the adjacent unmanned aerial vehicle to label a visual field overlapping area and a visual field non-overlapping area, and further, the unmanned aerial vehicle can identify the number of people corresponding to the area picture, the number of people in the visual field overlapping area and the number of people in the visual field non-overlapping area according to the labeled area picture; it can be understood that after the people number value corresponding to the image in the area, the people number value in the area with overlapped view fields and the people number value in the area without overlapped view fields are obtained, the people number values corresponding to different area ranges can be calculated; that is, after the statistical area selected by the user is obtained, the people number value of the corresponding statistical area is calculated according to the people number value information.
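The description does not give the exact rule for combining these values across drones; one simple possibility, stated here as an assumption, is to halve the overlap counts on the grounds that each overlapping strip is seen by exactly two adjacent drones.

```python
def total_people(per_drone_counts):
    """Aggregate per-drone counts into a total for the selected statistical area.

    per_drone_counts: list of dicts with keys 'non_overlap' and 'overlap', the people
    numbers produced by the lightweight classifier for the labeled visual field
    non-overlapping and overlapping areas of each drone. Halving the overlap sum is an
    assumption to avoid double counting people who stand in a strip photographed by
    two adjacent drones.
    """
    non_overlap = sum(c["non_overlap"] for c in per_drone_counts)
    overlap = sum(c["overlap"] for c in per_drone_counts)
    return non_overlap + overlap / 2
```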
In some embodiments, the unmanned aerial vehicle-based regional people counting method further includes: judging whether the number of people in the visual field area is greater than a preset people number threshold;
and if the number of people in the visual field area is greater than the preset people number threshold, generating alarm information and pushing the alarm information to the relevant personnel so that the relevant personnel can handle the alarm.
Specifically, it is judged whether the number of people in the visual field area exceeds the preset threshold; if so, the crowd in that visual field area exceeds the allowed level, and alarm information is generated so that the relevant personnel can divert the crowd in the area and trampling events are prevented.
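A minimal sketch of this check follows; the threshold value and the push_alert callback (SMS, app notification, and so on) are assumptions introduced here.

```python
PEOPLE_THRESHOLD = 200  # preset people number threshold (assumed value)

def check_and_alert(view_count: int, push_alert) -> bool:
    """Generate and push alarm information when a visual field count is too high."""
    if view_count > PEOPLE_THRESHOLD:
        push_alert(f"Crowd alert: {view_count} people detected in one visual field area")
        return True
    return False
```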
In an embodiment of the present invention, as shown in fig. 4, the unmanned aerial vehicle-based regional people counting method includes the following steps:
s201, acquiring parameters of the unmanned aerial vehicle and personnel shooting parameters, and calculating the maximum hovering height of the unmanned aerial vehicle according to the parameters of the unmanned aerial vehicle and the personnel shooting parameters.
S202, acquiring information of an area to be counted and information of a shooting overlapping area between every two unmanned aerial vehicles, and calculating the minimum number of unmanned aerial vehicles according to the information of the area to be counted and the information of the shooting overlapping area between every two unmanned aerial vehicles.
S203, determining the adjustable range of the control parameters according to the maximum hovering height and the minimum number of unmanned aerial vehicles.
And S204, acquiring control parameters, and calculating the positioning information corresponding to each unmanned aerial vehicle according to the control parameters and the information of the area to be counted.
S205, controlling each unmanned aerial vehicle to fly to a designated position according to the positioning information, and controlling each unmanned aerial vehicle to photograph the visual field area so as to obtain the area picture.
S206, obtaining the area picture corresponding to the adjacent unmanned aerial vehicle of the unmanned aerial vehicle, and marking the area picture corresponding to the unmanned aerial vehicle according to the area picture corresponding to the adjacent unmanned aerial vehicle.
And S207, inputting the marked region picture into a lightweight neural network classifier, and performing personnel detection on the region picture through the lightweight neural network classifier so as to generate the information of the number of the people corresponding to the unmanned aerial vehicle according to a detection result.
S208, judging whether the number of people in the visual field area is larger than a preset number of people threshold value; if yes, go to step S209; if not, step S210 is performed.
And S209, generating alarm information and pushing the alarm information to related personnel.
S210, each unmanned aerial vehicle sends the corresponding number information and the area pictures to the monitoring center.
S211, the monitoring center generates panoramic pictures of the area to be counted according to the area pictures corresponding to all the unmanned aerial vehicles, and inputs the panoramic pictures into a heavyweight end-to-end convolution network so as to accurately count the number of people in the area to be counted.
In summary, according to the area people counting method based on the unmanned aerial vehicle of the embodiment of the invention, firstly, parameters of the unmanned aerial vehicle and personnel shooting parameters are obtained, and the maximum hovering height of the unmanned aerial vehicle is calculated according to the parameters of the unmanned aerial vehicle and the personnel shooting parameters; then, acquiring information of an area to be counted and information of shooting overlapping areas between every two unmanned aerial vehicles, and calculating the minimum number of unmanned aerial vehicles according to the information of the area to be counted and the information of the shooting overlapping areas between every two unmanned aerial vehicles so as to determine the adjustable range of the control parameters according to the maximum hovering height and the minimum number of unmanned aerial vehicles; then, acquiring control parameters according to the adjustable range of the control parameters, calculating positioning information corresponding to each unmanned aerial vehicle according to the control parameters and the information of the area to be counted, and controlling each unmanned aerial vehicle to fly to a specified position according to the positioning information; then, controlling each unmanned aerial vehicle to photograph the visual field region to obtain a region picture, inputting the region picture into a lightweight neural network classifier to perform personnel detection on the region picture through the lightweight neural network classifier, and generating the number information corresponding to the unmanned aerial vehicle according to the detection result so as to perform people counting of the region to be counted according to the number information; therefore, the number of people in the open place can be automatically and effectively counted, the number of people in the open place can be controlled in real time, the trampling event is prevented, and the safety of the people in the open place is guaranteed.
In order to implement the above embodiment, an embodiment of the present invention provides a computer-readable storage medium, on which an unmanned aerial vehicle-based region population counting program is stored, where the unmanned aerial vehicle-based region population counting program, when executed by a processor, implements the unmanned aerial vehicle-based region population counting method as described above.
According to the computer-readable storage medium of the embodiment of the invention, the area people counting program based on the unmanned aerial vehicle is stored, so that when the processor executes the area people counting program based on the unmanned aerial vehicle, the above area people counting method based on the unmanned aerial vehicle is realized, the effective counting of the number of people in an open place is realized automatically, the number of people in the open place is controlled in real time, the occurrence of trampling events is prevented, and the safety of people in the open place is ensured.
In order to implement the foregoing embodiments, an embodiment of the present invention provides a computer device, which includes a memory, a processor, and a computer program stored on the memory and running on the processor, where when the processor executes the computer program, the method for counting the number of people in the area based on the unmanned aerial vehicle is implemented.
According to the computer equipment provided by the embodiment of the invention, the memory is used for storing the area people counting program based on the unmanned aerial vehicle, so that the processor can realize the area people counting method based on the unmanned aerial vehicle when executing the area people counting program based on the unmanned aerial vehicle, thereby automatically and effectively counting the number of people in an open place, further controlling the number of people in the open place in real time, preventing the occurrence of trampling events and ensuring the safety of the people in the open place.
In order to implement the above embodiments, an embodiment of the present invention provides an area people counting system based on an unmanned aerial vehicle, and as shown in fig. 5, the area people counting system based on the unmanned aerial vehicle includes: a plurality of drones 10 and a monitoring center 20.
The monitoring center 20 is configured to acquire the unmanned aerial vehicle parameters and the personnel shooting parameters, calculate the maximum hovering height of the unmanned aerial vehicles 10 according to the unmanned aerial vehicle parameters and the personnel shooting parameters, acquire the information of the area to be counted and the information of the shooting overlapping area between every two unmanned aerial vehicles 10, and calculate the minimum number of unmanned aerial vehicles according to the information of the area to be counted and the information of the shooting overlapping area between every two unmanned aerial vehicles 10, so as to determine the adjustable range of the control parameters according to the maximum hovering height and the minimum number of unmanned aerial vehicles;
the monitoring center 20 is further configured to obtain a control parameter according to the adjustable range of the control parameter, calculate positioning information corresponding to each unmanned aerial vehicle 10 according to the control parameter and the information of the area to be counted, and control each unmanned aerial vehicle 10 to fly to a specified position according to the positioning information;
the unmanned aerial vehicle 10 is used for taking a picture of a visual field region to obtain a region picture, inputting the region picture into the light weight neural network classifier, carrying out personnel detection on the region picture through the light weight neural network classifier, and generating the number information corresponding to the unmanned aerial vehicle 10 according to a detection result, so that the number statistics of a region to be counted is carried out according to the number information.
In some embodiments, the monitoring center 20 is further configured to generate a panoramic image of the area to be counted according to the area images corresponding to all the unmanned aerial vehicles 10, and input the panoramic image into the heavyweight end-to-end convolutional network, so that the heavyweight end-to-end convolutional network accurately counts the number of people in the area to be counted according to the panoramic image.
It should be noted that the above description about the unmanned aerial vehicle-based regional population counting method in fig. 1 is also applicable to the unmanned aerial vehicle-based regional population counting system, and is not repeated herein.
In summary, according to the area people counting system based on the unmanned aerial vehicle of the embodiment of the invention, the monitoring center is arranged to obtain the parameters of the unmanned aerial vehicle and the personnel shooting parameters, calculate the maximum hovering height of the unmanned aerial vehicle according to the parameters of the unmanned aerial vehicle and the personnel shooting parameters, obtain the information of the area to be counted and the information of the shooting overlapping area between every two unmanned aerial vehicles, and calculate the minimum number of unmanned aerial vehicles according to the information of the area to be counted and the information of the shooting overlapping area between every two unmanned aerial vehicles, so as to determine the adjustable range of the control parameters according to the maximum hovering height and the minimum number of unmanned aerial vehicles; the monitoring center is also used for acquiring control parameters according to the adjustable range of the control parameters, calculating positioning information corresponding to each unmanned aerial vehicle according to the control parameters and the information of the area to be counted, and controlling each unmanned aerial vehicle to fly to a specified position according to the positioning information; the unmanned aerial vehicle is used for photographing a visual field region to obtain a region picture, inputting the region picture into the light weight neural network classifier to detect people in the region picture through the light weight neural network classifier, and generating people number information corresponding to the unmanned aerial vehicle according to a detection result so as to count people number in a region to be counted according to the people number information; therefore, the number of people in the open place can be automatically and effectively counted, the number of people in the open place can be controlled in real time, the trampling event is prevented, and the safety of the people in the open place is guaranteed.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It should be noted that in the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention can be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The usage of the words first, second and third, etcetera do not indicate any ordering. These words may be interpreted as names.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including the preferred embodiment and all changes and modifications that fall within the scope of the invention.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.
In the description of the present invention, it is to be understood that the terms "first", "second" and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features referred to. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
In the present invention, unless otherwise explicitly stated or limited, the terms "mounted," "connected," "fixed," and the like are to be construed broadly: the connection may be fixed, detachable, or integral; it may be mechanical or electrical; and it may be direct, indirect through an intermediate medium, or an internal communication between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific situation.
In the present invention, unless otherwise expressly stated or limited, a first feature being "on" or "under" a second feature may mean that the first and second features are in direct contact, or that they are in indirect contact through an intermediate medium. Moreover, a first feature being "on," "above," or "over" a second feature may mean that the first feature is directly above or obliquely above the second feature, or simply that the first feature is at a higher level than the second feature. A first feature being "under," "below," or "beneath" a second feature may mean that the first feature is directly below or obliquely below the second feature, or simply that the first feature is at a lower level than the second feature.
In the description herein, references to the terms "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, such terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, those skilled in the art may combine different embodiments or examples, and features of different embodiments or examples, described in this specification, provided that they do not contradict each other.
Although embodiments of the present invention have been shown and described above, it will be understood that the above embodiments are exemplary and not to be construed as limiting the present invention, and that changes, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (8)

1. An unmanned aerial vehicle-based regional people counting method is characterized by comprising the following steps:
acquiring unmanned aerial vehicle parameters and personnel shooting parameters, and calculating the maximum hovering height of the unmanned aerial vehicle according to the unmanned aerial vehicle parameters and the personnel shooting parameters;
acquiring information of an area to be counted and information of shooting overlapping areas between every two unmanned aerial vehicles, and calculating the minimum number of unmanned aerial vehicles according to the information of the area to be counted and the information of the shooting overlapping areas between every two unmanned aerial vehicles so as to determine the adjustable range of control parameters according to the maximum hovering height and the minimum number of unmanned aerial vehicles;
acquiring control parameters according to the adjustable range of the control parameters, calculating positioning information corresponding to each unmanned aerial vehicle according to the control parameters and the information of the area to be counted, and controlling each unmanned aerial vehicle to fly to a specified position according to the positioning information;
controlling each unmanned aerial vehicle to photograph its field-of-view area to obtain an area picture, inputting the area picture into a lightweight neural network classifier so that the lightweight neural network classifier performs people detection on the area picture, and generating people number information corresponding to the unmanned aerial vehicle according to the detection result, so as to count the number of people in the area to be counted according to the people number information;
wherein the maximum hover height is calculated according to the formula:
(formula image FDA0003787236080000011)
wherein h_max denotes the maximum hovering height, a and b denote the resolution of the area picture, e_1 and e_2 denote the resolution corresponding to a person, S_man denotes the floor area corresponding to a person, λ denotes the imaging proportion, and β denotes the camera view angle;
the minimum number of unmanned aerial vehicles is calculated according to the following formula:
(formula image FDA0003787236080000012)
wherein num_min denotes the minimum number of unmanned aerial vehicles, M and N denote the length and width of the area to be counted respectively, d_1 denotes the long-edge value of the field-of-view overlapping area between two adjacent unmanned aerial vehicles, d_2 denotes the wide-edge value of the field-of-view overlapping area between two adjacent unmanned aerial vehicles, λ denotes the imaging proportion, and k denotes the ratio of the actual ground edge length to the corresponding area picture edge length.
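For illustration only (not part of the claims): the exact expression for num_min appears only as the patent image FDA0003787236080000012 and is not reproduced here. The sketch below shows one plausible coverage-tiling reading of the symbols defined above, assuming each a × b-pixel picture maps to a ground footprint of (λ·k·a) × (λ·k·b) metres and that adjacent footprints must overlap by d_1 and d_2 metres; both this interpretation and the function min_drone_count are assumptions for illustration, not the claimed formula.

```python
import math

def min_drone_count(M: float, N: float, a: int, b: int,
                    d1: float, d2: float, lam: float, k: float) -> int:
    """Illustrative coverage bound (an assumption, not the patented formula):
    each picture covers a ground rectangle of (lam * k * a) x (lam * k * b) metres,
    and neighbouring footprints overlap by d1 (long edge) and d2 (wide edge)."""
    footprint_long = lam * k * a          # ground length covered by one picture
    footprint_wide = lam * k * b          # ground width covered by one picture
    stride_long = footprint_long - d1     # effective advance per drone along the long edge
    stride_wide = footprint_wide - d2     # effective advance per drone along the wide edge
    if stride_long <= 0 or stride_wide <= 0:
        raise ValueError("required overlap must be smaller than the footprint")
    return math.ceil(M / stride_long) * math.ceil(N / stride_wide)

# Example: a 200 m x 100 m area, 4000 x 3000-pixel pictures, lam * k = 0.01 m per pixel,
# and 5 m of overlap on both edges -> 6 x 4 = 24 drones.
print(min_drone_count(M=200, N=100, a=4000, b=3000, d1=5, d2=5, lam=0.01, k=1.0))
```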
2. The unmanned aerial vehicle-based regional people counting method of claim 1, further comprising:
generating a panoramic picture of the area to be counted according to the area pictures corresponding to all the unmanned aerial vehicles, and inputting the panoramic picture into a heavyweight end-to-end convolutional network, so that the heavyweight end-to-end convolutional network accurately counts the number of people in the area to be counted according to the panoramic picture.
3. The unmanned aerial vehicle-based regional people counting method of claim 1, further comprising, before inputting the area picture into the lightweight neural network classifier:
the method comprises the steps of obtaining the area pictures corresponding to the adjacent unmanned aerial vehicles of the unmanned aerial vehicles, and marking the area pictures corresponding to the unmanned aerial vehicles according to the area pictures corresponding to the adjacent unmanned aerial vehicles, wherein the marking comprises visual field overlapping area marking and visual field non-overlapping area marking, so that the light weight neural network classifier generates visual field area human numerical values, visual field overlapping area human numerical values and visual field non-overlapping area human numerical values according to the marked area pictures corresponding to the unmanned aerial vehicles.
4. The unmanned aerial vehicle-based regional people counting method of claim 3, further comprising:
judging whether the field-of-view area people count is greater than a preset people-count threshold;
and if the field-of-view area people count is greater than the preset people-count threshold, generating alarm information and pushing the alarm information to the relevant personnel so that the relevant personnel can handle the alarm.
5. A computer-readable storage medium, having stored thereon an unmanned aerial vehicle-based regional people counting program which, when executed by a processor, implements the unmanned aerial vehicle-based regional people counting method of any one of claims 1-4.
6. A computer device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the program, implements the unmanned aerial vehicle-based regional people counting method of any one of claims 1-4.
7. An unmanned aerial vehicle-based regional people counting system, characterized by comprising: a plurality of unmanned aerial vehicles and a monitoring center, wherein:
the monitoring center is used for acquiring unmanned aerial vehicle parameters and personnel shooting parameters, calculating the maximum hovering height of the unmanned aerial vehicle according to the unmanned aerial vehicle parameters and the personnel shooting parameters, acquiring information of an area to be counted and information of shooting overlapping areas between every two unmanned aerial vehicles, and calculating the minimum number of unmanned racks according to the information of the area to be counted and the information of the shooting overlapping areas between every two unmanned aerial vehicles so as to determine the adjustable range of the control parameters according to the maximum hovering height and the minimum number of unmanned racks;
the monitoring center is also used for acquiring control parameters according to the adjustable range of the control parameters, calculating the positioning information corresponding to each unmanned aerial vehicle according to the control parameters and the information of the area to be counted, and controlling each unmanned aerial vehicle to fly to a specified position according to the positioning information;
the unmanned aerial vehicle is used for photographing a visual field region to obtain a region picture, inputting the region picture into the light weight neural network classifier, detecting personnel in the region picture through the light weight neural network classifier, and generating the number information corresponding to the unmanned aerial vehicle according to a detection result so as to count the number of people in a region to be counted according to the number information;
wherein the maximum hover height is calculated according to the formula:
(formula image FDA0003787236080000021)
wherein h_max denotes the maximum hovering height, a and b denote the resolution of the area picture, e_1 and e_2 denote the resolution corresponding to a person, S_man denotes the floor area corresponding to a person, λ denotes the imaging proportion, and β denotes the camera view angle;
the minimum number of unmanned aerial vehicles is calculated according to the following formula:
(formula image FDA0003787236080000031)
wherein num_min denotes the minimum number of unmanned aerial vehicles, M and N denote the length and width of the area to be counted respectively, d_1 denotes the long-edge value of the field-of-view overlapping area between two adjacent unmanned aerial vehicles, d_2 denotes the wide-edge value of the field-of-view overlapping area between two adjacent unmanned aerial vehicles, λ denotes the imaging proportion, and k denotes the ratio of the actual ground edge length to the corresponding area picture edge length.
8. The unmanned aerial vehicle-based regional people counting system of claim 7, wherein:
the monitoring center is further used for generating the panoramic picture of the area to be counted according to the area pictures corresponding to all the unmanned aerial vehicles, and inputting the panoramic picture into the heavy end-to-end convolution network, so that the heavy end-to-end convolution network can accurately count the number of people in the area to be counted according to the panoramic picture.
CN202010356436.XA 2020-04-29 2020-04-29 Unmanned aerial vehicle-based regional people counting method and system Active CN111553904B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010356436.XA CN111553904B (en) 2020-04-29 2020-04-29 Unmanned aerial vehicle-based regional people counting method and system

Publications (2)

Publication Number Publication Date
CN111553904A CN111553904A (en) 2020-08-18
CN111553904B true CN111553904B (en) 2022-11-22

Family

ID=72004162

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010356436.XA Active CN111553904B (en) 2020-04-29 2020-04-29 Unmanned aerial vehicle-based regional people counting method and system

Country Status (1)

Country Link
CN (1) CN111553904B (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106767706B (en) * 2016-12-09 2019-05-14 中山大学 A kind of unmanned plane reconnoitres the Aerial Images acquisition method and system of the scene of a traffic accident

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105898216A (en) * 2016-04-14 2016-08-24 武汉科技大学 Method of counting number of people by using unmanned plane
CN108983806A (en) * 2017-06-01 2018-12-11 菜鸟智能物流控股有限公司 Method and system for generating area detection and air route planning data and aircraft
KR20190130278A (en) * 2018-05-14 2019-11-22 삼성중공업 주식회사 Apparatus and method for inspecting structure on water
CN108955645A (en) * 2018-07-16 2018-12-07 福州日兆信息科技有限公司 Three-dimensional modeling method and device applied to communication iron tower intelligent patrol detection
CN109117811A (en) * 2018-08-24 2019-01-01 颜俊君 A kind of system and method based on low-altitude remote sensing measuring technique estimation urban vegetation coverage rate

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Geometric and Physical Constraints for Drone-Based Head Plane Crowd Density Estimation; Weizhe Liu et al.; 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS); 2020-01-27; full text *
Regional people counting based on a quadrotor aircraft; Hua Lei et al.; Digital Technology and Application; 2019-03-31; Vol. 37, No. 03; full text *

Also Published As

Publication number Publication date
CN111553904A (en) 2020-08-18

Similar Documents

Publication Publication Date Title
CN111126399B (en) Image detection method, device and equipment and readable storage medium
CN103870798B (en) Target detecting method, target detecting equipment and image pickup equipment
JP2022103234A (en) Information processing device, information processing method, and program
CN110142785A (en) A kind of crusing robot visual servo method based on target detection
CN107591207A (en) A kind of epidemic situation investigation method, apparatus, system and equipment
KR20170035922A (en) Techniques for automatic real-time calculation of user wait times
US9177214B1 (en) Method and apparatus for an adaptive threshold based object detection
CN113240249B (en) Urban engineering quality intelligent evaluation method and system based on unmanned aerial vehicle augmented reality
CN115797873B (en) Crowd density detection method, system, equipment, storage medium and robot
CN111666920B (en) Target article wearing detection method and device, storage medium and electronic device
CN107730880A (en) A kind of congestion monitoring method and unmanned vehicle based on unmanned vehicle
CN108932273A (en) Picture screening technique and device
CN110414400A (en) A kind of construction site safety cap wearing automatic testing method and system
US11763662B2 (en) Systems and methods of enforcing dynamic thresholds of social distancing rules
CN113011371A (en) Target detection method, device, equipment and storage medium
CN112487894A (en) Automatic inspection method and device for rail transit protection area based on artificial intelligence
CN110795998B (en) People flow detection method and device, electronic equipment and readable storage medium
JP2013137604A (en) Image collation processing device, image collation processing method and image collation processing program
CN111553904B (en) Unmanned aerial vehicle-based regional people counting method and system
WO2022198508A1 (en) Lens abnormality prompt method and apparatus, movable platform, and readable storage medium
JP7199645B2 (en) Object recognition system and object recognition method
CN102867214B (en) Counting management method for people within area range
JP5411641B2 (en) Image classification apparatus, image classification system, image classification method, program, and recording medium
CN112418096A (en) Method and device for detecting falling and robot
KR102099816B1 (en) Method and apparatus for collecting floating population data on realtime road image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant