WO2021063046A1 - Distributed target monitoring system and method - Google Patents


Info

Publication number
WO2021063046A1
Authority
WO
WIPO (PCT)
Prior art keywords
target
image
unit
probability
target object
Prior art date
Application number
PCT/CN2020/098588
Other languages
French (fr)
Chinese (zh)
Inventor
周飞
刘倞
Original Assignee
熵康(深圳)科技有限公司
Priority date
Filing date
Publication date
Application filed by 熵康(深圳)科技有限公司
Publication of WO2021063046A1

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00: Burglar, theft or intruder alarms
    • G08B13/18: Actuation by interference with heat, light, or radiation of shorter wavelength; actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189: … using passive radiation detection systems
    • G08B13/19: … using infrared-radiation detection systems
    • G08B13/191: … using pyroelectric sensor means
    • G08B13/194: … using image scanning and comparing systems
    • G08B13/196: … using television cameras
    • G08B13/19602: Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B13/19608: Tracking movement of a target, e.g. by detecting an object predefined as a target, using target direction and/or velocity to predict its new position
    • G08B13/19678: User interface
    • G08B13/19684: Portable terminal, e.g. mobile phone, used for viewing video remotely
    • G08B15/00: Identifying, scaring or incapacitating burglars, thieves or intruders, e.g. by explosives
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/045: Combinations of networks
    • G06N3/08: Learning methods
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/70: Denoising; Smoothing
    • G06T7/00: Image analysis
    • G06T7/13: Edge detection
    • G06T7/136: Segmentation involving thresholding
    • G06T7/187: Segmentation involving region growing, region merging or connected component labelling
    • G06T7/70: Determining position or orientation of objects or cameras
    • G06V20/10: Terrestrial scenes
    • H04L67/025: Protocols based on web technology, e.g. hypertext transfer protocol [HTTP], for remote control or remote monitoring of applications
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • G06T2207/10004: Still image; Photographic image
    • G06T2207/20024: Filtering details
    • G06T2207/20036: Morphological image processing
    • G06T2207/20081: Training; Learning
    • G06T2207/20084: Artificial neural networks [ANN]

Definitions

  • This application relates to the field of target recognition and detection, and in particular to a distributed target monitoring system and method.
  • the embodiments of the present application provide a distributed target monitoring system.
  • the system includes at least one detection and alarm device, an image detection device, and a remote monitoring terminal.
  • the image detection device is respectively connected to the detection and alarm device and the remote monitoring terminal.
  • the detection and alarm equipment includes an infrared detection unit, a central processing unit, and an alarm unit, and the central processing unit is electrically connected to the infrared detection unit and the alarm unit, respectively;
  • the infrared detection unit is used to detect thermal infrared signals within a sensing range
  • the central processing unit is used to receive the thermal infrared signals, and control the alarm unit to issue an alarm signal according to the thermal infrared signals
  • the image detection device includes an image acquisition unit and a control unit that are electrically connected. The image acquisition unit is used to acquire a target area image; the control unit is used to receive and process the target area image, and to control the central processing unit according to the target area image, so that the central processing unit controls the alarm unit to issue an alarm signal;
  • the remote monitoring terminal includes an image display unit and a remote control unit that are electrically connected. The remote control unit is used to obtain the target area image collected by the image detection device, display it on the image display unit, and control the central processing unit, so that the central processing unit controls the alarm unit to issue an alarm signal.
  • an embodiment of the present application also provides a distributed target monitoring method, which is applied to an image detection device, and the method includes:
  • the embodiments of the present application also provide a distributed target monitoring method, which is applied to detecting alarm equipment, and the method includes:
  • the beneficial effect of the present application is as follows. Different from the prior art, the distributed target monitoring system and method of the embodiments of the present application, on one hand, detect thermal infrared signals within the sensing range through the infrared detection unit in the detection and alarm device, and the central processing unit controls the alarm unit to issue an alarm signal according to the detected thermal infrared information; on the other hand, they collect the target area image through the image acquisition unit in the image detection device and control the central processing unit according to that image, so that the central processing unit controls the alarm unit to issue an alarm signal. At the same time, the remote control unit in the remote monitoring terminal acquires the target area image and displays it on the image display unit.
  • the remote control unit can also control the central processing unit so that the central processing unit controls the alarm unit to send out an alarm signal.
  • the detection and alarm devices and the image detection devices are deployed in a distributed manner and cooperate with each other, so that key targets can be monitored in real time and alarms raised promptly.
  • Figure 1a is a schematic diagram of the connection of an image detection device, a remote monitoring terminal, and a detection alarm device according to an embodiment of the present application;
  • FIG. 1b is a schematic diagram of the connection between the detection alarm device, the network server, and the remote monitoring terminal in an embodiment of the present application;
  • FIG. 2 is a schematic diagram of an image detection device wirelessly connected to a remote monitoring terminal and detection alarm devices in another embodiment of the present application;
  • Fig. 3a is a schematic diagram of the hardware structure of a detection alarm device according to an embodiment of the present application.
  • Figure 3b is a schematic diagram of the hardware structure of a detection alarm device according to another embodiment of the present application.
  • Figure 4 is a schematic diagram of an alarm unit according to an embodiment of the present application.
  • FIG. 5 is a schematic diagram of the hardware structure of a detection alarm device according to another embodiment of the present application.
  • FIG. 6 is a schematic diagram of the hardware structure of an image detection device according to an embodiment of the present application.
  • FIG. 7 is a schematic diagram of the hardware structure of a remote monitoring terminal according to an embodiment of the present application.
  • FIG. 8 is a schematic diagram of the connection of a cloud server, a remote monitoring terminal, an image detection device, and a detection alarm device according to an embodiment of the present application;
  • FIG. 9 is a schematic diagram of a cloud server, a remote monitoring terminal, image detection devices, and detection alarm devices connected wirelessly in another embodiment of the present application;
  • FIG. 10 is a flowchart of an embodiment of the distributed target monitoring method of the present application.
  • FIG. 11 is a schematic diagram of the total number of harmful organisms over time according to an embodiment of the present application.
  • Figure 12a is a statistical diagram of pest density distribution in an embodiment of the present application.
  • Figure 12b is a schematic diagram of pest density distribution in an embodiment of the present application.
  • FIG. 13 is a flowchart of deep neural network model training in an embodiment of the present application.
  • FIG. 14 is a flowchart of data processing using a deep learning network model in an embodiment of the present application.
  • FIG. 15 is a flowchart of obtaining a second recognition result in an embodiment of the present application.
  • FIG. 16 is a specific flow chart of obtaining a second recognition result in an embodiment of the present application.
  • FIG. 17 is a flowchart of determining a target object and a target position in an embodiment of the present application.
  • FIG. 18 is a flowchart of another embodiment of the distributed target monitoring method of the present application.
  • Figure 19 is a schematic diagram of the distribution of detection and alarm devices in an embodiment of the present application.
  • FIG. 20 is a schematic structural diagram of an embodiment of a distributed target monitoring device of the present application.
  • FIG. 21 is a schematic diagram of the hardware structure of a control unit provided by an embodiment of the present application.
  • An embodiment of the present application provides a distributed target monitoring system, which includes at least one detection alarm device, a remote monitoring terminal 20, and an image detection device 30.
  • the detection alarm device is connected to the remote monitoring terminal 20.
  • Figure 1a exemplarily shows that the image detection device 30 is connected to the detection alarm device 1, the detection alarm device 2, the detection alarm device 3, and the detection alarm device N through a bus (such as an RS-485 bus).
  • the image detection device 30 and the remote monitoring terminal 20 are connected wirelessly (for example, in the 433 MHz wireless frequency band).
  • the bus in this embodiment does not connect every detection alarm device directly to the image detection device; instead, the devices are cascaded one after another, which makes actual deployment more convenient and flexible.
  • the devices connected wirelessly are each equipped with a wireless communication unit.
  • the wireless communication unit can use an Internet of Things system, such as a Wi-Fi, Zigbee, Bluetooth, or NB-IoT network, or a mobile communication system, such as 2G, 3G, 4G, or 5G.
  • the wireless communication unit can also use open frequency bands such as 433 MHz.
  • an embodiment of the present application also provides a distributed target monitoring system, including at least one detection alarm device, a network server 50, and a remote monitoring terminal 20 .
  • Fig. 1b exemplarily shows that the network server 50 is connected to the detection alarm device 1, the detection alarm device 2, the detection alarm device 3, and the detection alarm device N through a bus.
  • the network server 50 is wirelessly connected to the remote monitoring terminal 20. Multiple detection and alarm devices are combined to construct an Internet of Things system, and the detected thermal infrared signals and alarm signals are sent to the remote monitoring terminal 20 through the network server 50.
  • the bus in this embodiment does not connect every detection and alarm device directly to the network server; instead, the network server is connected to one of the detection and alarm devices, and the remaining devices are cascaded from it, which makes actual operation more convenient and flexible.
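The cascaded wiring described above can be pictured as a simple daisy chain: the server reaches the first device, and a message hops device to device until it arrives. The sketch below is an illustrative model only; the device names and the `deliver` helper are hypothetical, not part of the application.

```python
# Hypothetical model of the cascaded (daisy-chained) bus: the network
# server talks only to the first detection alarm device, and a message
# is relayed hop by hop down the chain until the target device is reached.

def deliver(message: str, target_id: str, cascade: list) -> int:
    """Forward a message along the daisy chain; return the hop count."""
    hops = 0
    for device_id in cascade:
        hops += 1
        if device_id == target_id:
            return hops  # message consumed by the target device
    raise LookupError("device not on the cascade")

cascade = ["dev1", "dev2", "dev3", "devN"]
assert deliver("ALARM_ON", "dev3", cascade) == 3  # two relays, then target
```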
  • the detection alarm devices, the image detection device 30, and the remote monitoring terminal 20 are all provided with a wireless communication unit. FIG. 2 exemplarily shows the image detection device 30 wirelessly connected (for example, in the 433 MHz frequency band) to the remote monitoring terminal 20, the detection alarm device 1, the detection alarm device 2, the detection alarm device 3, and the detection alarm device N; an actual environment may include more detection alarm devices.
  • the detection alarm device 10 includes an infrared detection unit 110, a central processing unit 120, and an alarm unit 130, and the central processing unit 120 is electrically connected to the infrared detection unit 110 and the alarm unit 130, respectively.
  • the infrared detection unit 110 may be an infrared sensor, a pyroelectric sensor, or another infrared thermal sensor, for detecting thermal infrared signals within a sensing range.
  • the infrared detection unit 110 is fixed at a high place, and the infrared sensing area can cover the target active area.
  • the infrared detection unit 110 can set the infrared system according to the infrared radiation characteristics of the target to be measured.
  • the infrared sensor detects the change in the infrared spectrum of the target and sends it to the central processing unit 120.
  • the central processing unit 120 may use an STM series chip.
  • the infrared detection unit 110 may also be equipped with a Fresnel lens. It should be noted that the target in this application can be pests, animals or humans, etc.
  • the infrared sensor includes an infrared light emitting diode and an infrared light sensitive diode, and the infrared light emitting diode and the infrared light sensitive diode are encapsulated in a plastic casing.
  • the infrared light-emitting diode lights up and emits infrared light that is invisible to the human eye. If there is no target in front of the infrared sensor, this infrared light simply dissipates into space.
  • if there is a target, the infrared light is reflected back and shines on the adjacent infrared photodiode.
  • when the infrared photodiode receives infrared light, the resistance value at its output pin changes. By judging the change in the resistance of the infrared photodiode, the target ahead can be sensed.
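The reflective sensing principle above reduces to a threshold test on the photodiode reading. The following is a minimal sketch; the ADC scale, baseline, and threshold values are illustrative assumptions, not figures from the application.

```python
# Hypothetical sketch of the reflective infrared presence check: the
# emitter LED is always on, and a target in front of the sensor reflects
# light back onto the photodiode, which lowers its ADC reading.

BASELINE = 900      # assumed photodiode ADC reading with no reflection
THRESHOLD = 150     # assumed minimum drop that counts as a reflection

def target_present(adc_reading: int) -> bool:
    """Return True when the reading drops far enough below the
    no-reflection baseline, i.e. light is being reflected by a target."""
    return (BASELINE - adc_reading) >= THRESHOLD

# No target: the emitted light is lost, the reading stays near baseline.
assert not target_present(880)
# Target in range: reflected light changes the photodiode resistance
# and the ADC reading drops sharply.
assert target_present(520)
```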
  • the central processing unit 120 may be a central processing unit (CPU) or a graphics processing unit (GPU), etc., and is used for receiving thermal infrared signals and controlling the alarm unit 130 to issue an alarm signal according to the thermal infrared signal.
  • the central processing unit 120 determines, according to the thermal infrared signal, whether there is an active target in the sensing range; once signs of an active target are found, the alarm unit 130 is controlled to issue an alarm signal.
  • the detection alarm device 10 only includes an infrared detection unit 110 and a central processing unit 120 that are electrically connected.
  • after the infrared detection unit 110 detects a thermal infrared signal within the sensing range, it sends the signal to the central processing unit 120 for processing, and the central processing unit 120 determines from the thermal infrared signal whether there is an active target in the sensing range.
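As an illustration of the decision step above, the sketch below treats the pyroelectric output as a train of 0/1 pulses and declares an active target only when several pulses fall inside a short rolling window, filtering out one-off noise. All names, window sizes, and thresholds are assumptions for illustration, not from the application.

```python
# Illustrative activity decision: a pyroelectric sensor emits pulses when
# a warm body moves; requiring several pulses within a rolling window
# rejects isolated noise spikes before raising an alarm.

from collections import deque

class ActivityDetector:
    def __init__(self, window: int = 10, min_pulses: int = 3):
        self.samples = deque(maxlen=window)  # rolling window of 0/1 samples
        self.min_pulses = min_pulses

    def update(self, pulse: int) -> bool:
        """Feed one thermal-infrared sample; return True once enough
        pulses have accumulated to treat the target as active."""
        self.samples.append(pulse)
        return sum(self.samples) >= self.min_pulses

det = ActivityDetector()
readings = [0, 1, 0, 0, 1, 0, 1]   # sporadic pulses from a moving target
alarm = [det.update(r) for r in readings]
assert alarm[1] is False           # a single pulse alone does not trigger
assert alarm[-1] is True           # third pulse within the window triggers
```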
  • the alarm unit 130 includes a light generator 131 and a sound generator 132; both the sound generator 132 and the light generator 131 are electrically connected to the central processing unit 120.
  • the light generator 131 may be a strong light generator, such as high-power surface-mounted LED beads, which produces a high-intensity light source.
  • the sound generator 132 may be a buzzer or an ultrasonic generator, for example an ultrasonic horn of model 3840, which produces a high-decibel sound.
  • the central processing unit 120 controls the light generator 131 to generate a high-intensity light source and controls the sound generator 132 to generate a high-decibel sound, so as to drive away the moving target.
  • multiple light generators 131 and sound generators 132 may be distributed at each corner of a closed space or an open space.
  • the central processing unit 120 controls the light generator 131 in each corner to generate a high-intensity light source and controls the sound generator 132 in each corner to generate a high-decibel sound, so as to drive away the moving target.
  • the light generator 131 can be replaced with a light strip, which is fixed on the bottom of the wall with 3M glue, similar to a skirting line.
  • the central processing unit 120 controls the light strip to flash continuously to drive away the moving target.
  • the detection and alarm device 10 further includes a photosensitive unit 140, which may be a photosensitive diode or similar components, such as an HPI-6FER2 photosensitive diode.
  • the photosensitive unit 140 can determine whether the current moment is day or night by sensing illumination information.
  • the photosensitive unit 140 is electrically connected to the central processing unit 120 for transmitting the illumination information to the central processing unit 120.
  • the central processing unit 120 controls, according to the illumination information, the infrared detection unit 110 and the alarm unit 130 to work only at night, thereby reducing power consumption.
  • the model of the above-mentioned photosensitive unit can be selected according to requirements and is not restricted to this embodiment.
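The day/night gating above amounts to comparing the illumination reading against a darkness threshold. A minimal sketch follows; the lux threshold is an assumption for illustration, not a value from the application.

```python
# Hedged sketch of the power-saving day/night gating: the photosensitive
# unit reports an illumination level, and the infrared detection and alarm
# units are powered only when it is dark.

NIGHT_LUX_THRESHOLD = 10.0  # assumed: below this illumination, treat as night

def detection_enabled(lux: float) -> bool:
    """Enable the infrared detection and alarm units only at night."""
    return lux < NIGHT_LUX_THRESHOLD

assert detection_enabled(0.5)        # night: units powered on
assert not detection_enabled(350.0)  # daytime: units off to save power
```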
  • the image detection device 30 includes an image acquisition unit 310 and a control unit 320.
  • the control unit 320 is electrically connected to the image acquisition unit 310, and the image detection device 30 may be fixed at a high place.
  • the image acquisition unit 310 is used to acquire images of the target area.
  • the image acquisition unit 310 may be, for example, an infrared light and/or a visible light camera.
  • the control unit 320 may use chips such as an Intel Movidius or a Huawei HiSilicon Hi3519.
  • the infrared and/or visible light camera takes advantage of its elevated vantage point and uploads the collected target image to the control unit 320.
  • the control unit 320 analyzes the target image through machine vision and neural networks.
  • if there is an active target in the target image, the control unit 320 sends a control signal and the image information of the target to the central processing unit 120, so that the central processing unit 120, according to the received control signal and image information, controls the alarm unit 130 to issue an alarm signal to drive away the active target.
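The control flow above can be sketched with the neural-network detector replaced by a stub: when the analysed frame contains an active target, the control unit sends a control signal plus the target's image information to the central processing unit, which drives the alarm unit. All class and function names here are illustrative assumptions, not from the application.

```python
# Sketch of the image-analysis-to-alarm control flow. `detect_targets`
# stands in for the machine-vision / neural-network analysis.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class CentralProcessingUnit:
    alarms: List[str] = field(default_factory=list)

    def handle(self, control_signal: str, target_info: Tuple[int, int]):
        # On receipt of a control signal, drive the alarm unit.
        self.alarms.append(f"alarm at {target_info}")

def detect_targets(frame) -> List[Tuple[int, int]]:
    """Stand-in detector: return centre coordinates of active targets.
    Each frame entry is (x, y, moving) for this toy representation."""
    return [(x, y) for x, y, moving in frame if moving]

def control_unit_step(frame, cpu: CentralProcessingUnit):
    for target in detect_targets(frame):
        cpu.handle("DRIVE_AWAY", target)

cpu = CentralProcessingUnit()
frame = [(12, 40, True), (80, 15, False)]  # one moving target, one static
control_unit_step(frame, cpu)
assert cpu.alarms == ["alarm at (12, 40)"]
```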
  • the image detection device 30 only includes the control unit 320.
  • the image detection device 30 and the detection alarm device 10 are connected via a bus or wirelessly.
  • the control unit 320 in the image detection device 30 is used to collect and analyze data, such as target information, from multiple detection and alarm devices 10, and then upload it to a cloud server.
  • the user can check the target information in real time using a terminal device such as a mobile phone or a computer.
  • the remote monitoring terminal 20 includes an image display unit 210 and a remote control unit 220, and the remote control unit 220 and the image display unit 210 are electrically connected.
  • the image display unit 210 may be a display system such as a computer, a mobile phone, or a tablet
  • the remote control unit 220 may be, for example, a host computer software.
  • the remote control unit 220 is used to obtain the target area image sent by the control unit 320, and the state information of the image detection device 30 and the detection alarm device 10, and send it to the image display unit 210 for display.
  • the remote control unit 220 is also used to control the control unit 320 in the image detection device 30 according to the target area image, so that the control unit 320 controls the central processing unit 120 in the detection alarm device, so that the central The processing unit 120 controls the alarm unit 130 to issue an alarm signal to drive the target away.
  • the remote monitoring terminal 20 is wirelessly connected to the image detection device 30 and the detection alarm device, respectively.
  • after the remote control unit 220 in the remote monitoring terminal 20 obtains the target area image sent by the image detection device 30, the remote control unit 220 can directly control the detection alarm device to drive away the target.
  • the remote control unit 220 controls the central processing unit 120 so that the central processing unit 120 controls the alarm unit 130 to issue an alarm signal, thereby driving the target away.
  • the distributed target monitoring system further includes a cloud server 40.
  • the cloud server 40 is wirelessly connected to the image detection device 30, and the cloud server 40 is used to receive the target area image, target information, state information of the image detection device 30, and state information of the detection alarm device sent by the image detection device 30.
  • the cloud server 40 may be a server, such as a rack server, a blade server, a tower server, or a cabinet server, etc., or a server cluster composed of several servers, or a cloud computing service center.
  • FIG. 8 exemplarily shows a cloud server 40, a remote monitoring terminal 20, an image detection device 30, and the detection alarm devices 1, 2, 3, and N.
  • the cloud server 40 and the remote monitoring terminal 20 are both wirelessly connected to the image detection device 30, and the image detection device 30 is connected via a bus to the detection alarm device 1, the detection alarm device 2, the detection alarm device 3, and the detection alarm device N, respectively.
  • the image detection device 30 receives, through the bus, the thermal infrared signals and alarm signals sent by the detection alarm devices, and sends these signals, together with the target area image it collects itself, the status information of the detection alarm devices, and its own status information, to the remote monitoring terminal 20 and the cloud server 40 through the wireless communication unit.
  • the user can view the above-mentioned various information directly on the remote monitoring terminal 20, or wirelessly connect to the cloud server 40 through a terminal device such as a mobile phone or a computer to view the above-mentioned various information in real time. It should be noted that all devices connected wirelessly are equipped with a wireless communication unit.
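The aggregation-and-forwarding step described above can be sketched as building one combined report and sending it to both sinks. The message shapes and helper names below are assumptions for illustration, not from the application.

```python
# Minimal sketch of the relaying: the image detection device gathers the
# thermal infrared and alarm signals from the bus-connected detection
# alarm devices, adds its own target-area image and status, and forwards
# one combined report to the remote monitoring terminal and cloud server.

def build_report(device_signals: dict, own_image_id: str, own_status: str) -> dict:
    return {
        "detector_signals": device_signals,   # per-device thermal/alarm data
        "target_area_image": own_image_id,
        "image_device_status": own_status,
    }

def broadcast(report: dict, sinks: list):
    for sink in sinks:
        sink.append(report)                   # stand-in for a wireless send

terminal, cloud = [], []
report = build_report({"dev1": {"ir": 1, "alarm": True}}, "frame_0007", "ok")
broadcast(report, [terminal, cloud])
assert terminal == cloud
assert terminal[0]["target_area_image"] == "frame_0007"
```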
  • FIG. 9 exemplarily shows the cloud server 40, the remote monitoring terminals A and B, the image detection devices 1 and 2, and the detection alarm devices 1, 2, 3, 4, 5, and N.
  • the cloud server 40 is wirelessly connected to the image detection device 1 and the image detection device 2
  • the remote monitoring terminal A is wirelessly connected to the image detection device 1.
  • the image detection device 1 is wirelessly connected to the detection alarm device 1, the detection alarm device 2, and the detection alarm device 3; the remote monitoring terminal B is wirelessly connected to the image detection device 2, and the image detection device 2 is wirelessly connected to the detection alarm device 4, the detection alarm device 5, and the detection alarm device N, respectively.
  • the image detection device receives, through a wireless communication unit, the thermal infrared signals, alarm signals, and other information sent wirelessly by the detection alarm devices, and sends these signals, together with the target area image it obtains itself, the status information of the detection alarm devices, and its own status information, to the remote monitoring terminal and the cloud server 40 through the wireless communication unit.
  • the user can view the foregoing various information directly on the remote monitoring terminal, or wirelessly connect to the cloud server 40 through a terminal device such as a mobile phone or a computer to view the foregoing various information in real time, so that the target can be monitored in real time.
  • all devices connected wirelessly are equipped with a wireless communication unit.
  • an embodiment of the present application also provides a distributed target monitoring method, which is applied to an image detection device, and the method is executed by a control unit in the image detection device, and includes:
  • Step 1010 Obtain an image of the target area.
  • the target area is a biological activity area or a human activity area, preferably an area where harmful organisms are active.
  • the image includes video images and picture images. The image of the target area captured by the camera is obtained, and the target area image contains harmful organisms or human bodies.
  • Step 1020 Input the target area image into a preset deep neural network model to obtain a first recognition result.
  • the first recognition result is obtained by inputting the target area image captured by the camera into a preset deep neural network model, and the preset deep neural network model is obtained by learning and training a large number of images carrying the target area.
  • Step 1030 Obtain a second recognition result based on the target area image and a preset reference image.
  • the preset reference image is a pre-photographed image that does not contain the target. This image is used as a background image, and the second recognition result is obtained by comparing the target area image taken by the infrared device with the background image.
  • Step 1040 Obtain the target object and the target position in the target area image according to the first recognition result and the second recognition result.
  • the target object is a pest or human body
  • the target position is the position of the pest or human body in the image of the target area.
  • the target position may be represented by the smallest bounding rectangle that encloses the target object, and the position of this rectangle is taken as the position of the target object. The target object and the target position in the target area image are obtained through the first recognition result and the second recognition result.
  • Step 1050 Obtain the relationship between the target location, time, and frequency.
  • the target location may be marked by coordinate positions in the electronic map corresponding to the target area, and the coordinate positions may be two-dimensional plane coordinates or three-dimensional space coordinates. Since the target object may be located at different locations at different time points, and the number of times it is active also differs between time points, the device continuously collects the location of the target object and the number of its occurrences at different time points, so as to determine the relationship between target location, time, and frequency.
  • Step 1060 Determine the activity trajectory of the target object and/or the activity density information of the target object according to the relationship between the target location, time and frequency.
  • the camera continuously collects images of the target area
  • the control unit analyzes the collected images and connects the target positions of the target object in time order into a line, forming the activity trajectory of the target object.
  • statistics on target location, time, and frequency determine the activity density of the target object. From the activity trajectory and activity density, the activity area and living habits of the target object can be learned; for example, within a specific time range, in which period the target object is most active and in which areas it prefers to move, so that subsequent measures can be taken against the target object.
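The trajectory and density bookkeeping described above can be sketched in a few lines. This is a minimal illustration, not the embodiment's implementation: the record layout (timestamp, x, y) and the grid cell size are assumptions introduced here.

```python
from collections import defaultdict

def build_trajectory(detections):
    """Connect target positions in time order into an activity trajectory.

    `detections` is a list of (timestamp, x, y) tuples; this record layout
    is an assumption for the sketch, not part of the embodiment.
    """
    ordered = sorted(detections)            # sort by timestamp
    return [(x, y) for _, x, y in ordered]  # polyline of positions

def activity_density(detections, cell=1.0):
    """Count occurrences per grid cell to approximate activity density."""
    counts = defaultdict(int)
    for _, x, y in detections:
        counts[(int(x // cell), int(y // cell))] += 1
    return dict(counts)

# invented sample detections, deliberately out of time order
dets = [(3, 2.4, 0.5), (1, 0.2, 0.3), (2, 1.1, 0.4)]
traj = build_trajectory(dets)
grid = activity_density(dets)
```

Connecting the sorted positions yields the activity trajectory, while the per-cell counts are one simple way to derive a density distribution of the kind described above.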
  • the target object is a pest, and the activity density information of the target object includes pest density, peak pest density, average pest density, peak number of pests, continuous activity time of pests, peak continuous activity time of pests, total pest number-time, a pest density distribution map, and the like.
  • the pest density is the number of pest occurrences per unit area and unit time.
  • ρ_i = N_i / (Δs·Δt), where N_i is the number of pests in the i-th unit time, in counts; Δt is the unit time, measured in minutes, hours, or another time unit; Δs is the unit area, measured in square meters, square centimeters, or another area unit; ρ_i is the pest density, with a typical unit of counts/(m²·h). This value changes over time.
  • the peak pest density is the maximum density of pests in a unit area and unit time.
  • ⁇ max MAX ⁇ i ⁇ , ⁇ i is the pest density monitored in the i-th unit time unit area;
  • ⁇ max is the peak pest density within a certain period of time or within a certain area.
  • the average pest density is the average of the pest densities per unit area and unit time: ρ̄ = (1/n)·Σ_{i=1}^{n} ρ_i, where ρ_i is the pest density monitored in the i-th unit time per unit area; n is the number of unit time periods within the period; ρ̄ is the average pest density within a certain period of time or within a certain area.
  • the peak number of pests is the maximum number of pests in a unit area and unit time.
  • N_max = MAX{N_i}, where N_i is the number of pests at a certain moment in the unit area, in counts; N_max is the peak number of pests in the unit area over a certain period, in counts.
  • the continuous activity time of pests is the sum of the activity time of pests per unit time within the field of view.
  • T_total = Σ_i T_i, where T_i is the i-th continuous activity time, measured in seconds, minutes, or another time unit; T_total is the total activity time. The unit time is generally 24 hours; whenever harmful organisms appear in the field of view, their activity time is accumulated into the total.
  • the peak continuous activity time of pests is the longest continuous activity time of pests per unit time within the field of view.
  • T_max = MAX{T_i}, where T_i is the i-th continuous activity time, measured in minutes, seconds, or another time unit; T_max is the peak continuous activity time of pests within the unit time. The unit time is generally 24 hours.
  • the total pest number-time is the sum, over the field of view, of the number of pests per unit time multiplied by the activity time: NT_total = Σ_{i=1}^{n} N_i·Δt, with n = T/Δt, where Δt is the unit time, measured in minutes, hours, or another time unit; N_i is the number of pests at a certain time, in counts; T is the unit time, usually 24 hours; n is the number of unit time periods within the period; NT_total is the total pest number-time per unit time, in counts·h (count·hours). Please refer to FIG. 11.
  • the pest density distribution map represents, as a chart, the pest density per unit time within the field of view. Each pixel value of the density distribution map reflects the number of pest occurrences per unit time at the corresponding location; please refer to FIG. 12a and FIG. 12b.
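The density metrics above follow directly from per-slot counts. The sketch below mirrors the formulas ρ_i = N_i/(Δs·Δt), ρ_max = MAX{ρ_i}, the average density, N_max, T_total, T_max, and NT_total; all sample numbers are invented for illustration.

```python
def pest_density(counts, dt, ds):
    """rho_i = N_i / (ds * dt): occurrences per unit area per unit time."""
    return [n / (ds * dt) for n in counts]

counts = [4, 10, 6]           # pests seen in each unit time slot (invented)
durations = [5.0, 12.0, 7.0]  # continuous activity times T_i, minutes (invented)
dt, ds = 1.0, 2.0             # unit time (h) and unit area (m^2)

rho = pest_density(counts, dt, ds)      # per-slot pest density rho_i
rho_max = max(rho)                      # peak pest density
rho_avg = sum(rho) / len(rho)           # average pest density
n_max = max(counts)                     # peak number of pests
t_total = sum(durations)                # continuous activity time T_total
t_max = max(durations)                  # peak continuous activity time T_max
nt_total = sum(n * dt for n in counts)  # total pest number-time, counts*h
```

Each line corresponds to one of the metrics defined above, so the same loop over detection slots yields the whole density report at once.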
  • in this way, the target area image is obtained through the camera and input into the preset deep neural network model to obtain the first recognition result; the second recognition result is obtained based on the target area image and the preset reference image; the target object and target position are determined according to the first and second recognition results; the relationship between target location, time, and frequency is then obtained; and the activity trajectory and/or activity density of the target object is determined according to that relationship. As a result, the density of harmful organisms can be accurately monitored.
  • the method further includes:
  • Step 1310 Obtain a sample image of the target area and the target object, label the target object in the sample image, and generate label information.
  • the target object is marked by a target selection box in the sample image of the target area, where the target selection box includes information such as the coordinate position and the center position of the target.
  • Step 1320 Input the marked image into the deep neural network model for training, and obtain the preset deep neural network model.
  • the model is trained using the marked sample images.
  • the purpose is to improve the accuracy of model training, and thus the accuracy of density monitoring. The more sample images, the more situations are covered, and the higher the recognition ability of the deep neural network model.
  • the preset deep neural network model contains multiple convolutional layers and pooling layers.
  • the input target area image passes through convolutional pooling layer 1 to obtain intermediate result 1, which then passes through convolutional pooling layer 2 to obtain intermediate result 2; intermediate result 2 passes through convolutional pooling layer 4 to obtain intermediate result 4; intermediate result 1 is fused with intermediate result 4 to obtain a fusion result; the fusion result passes through convolutional pooling layer 5 to obtain intermediate result 5; intermediate result 2 passes through convolutional pooling layer 3 to obtain intermediate result 3; intermediate result 3 and intermediate result 5 are fused to obtain the final result, which is the first recognition result.
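The branch-and-fuse data flow described above can be traced with stand-in operations. This sketch only demonstrates the topology (which results feed which layers, and where fusion happens); the `conv_pool` stage is a fixed random 3x3 filter with 2x2 subsampling, not the patent's trained convolutional pooling layers, and the nearest-neighbour upsampling before fusion is an assumption made so differently sized branches can be added.

```python
import numpy as np

def conv_pool(x, seed):
    """Stand-in for one convolution + pooling stage (NOT a trained layer)."""
    rng = np.random.default_rng(seed)
    k = rng.standard_normal((3, 3))          # fixed random 3x3 kernel
    h, w = x.shape
    out = np.zeros_like(x)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            out[i, j] = np.sum(x[i - 1:i + 2, j - 1:j + 2] * k)
    return out[::2, ::2]                     # 2x2 subsampling as "pooling"

def upsample_to(x, shape):
    """Nearest-neighbour resize so two branches can be fused by addition."""
    ri = (np.arange(shape[0]) * x.shape[0]) // shape[0]
    rj = (np.arange(shape[1]) * x.shape[1]) // shape[1]
    return x[np.ix_(ri, rj)]

img = np.random.default_rng(0).random((32, 32))
r1 = conv_pool(img, 1)                  # intermediate result 1
r2 = conv_pool(r1, 2)                   # intermediate result 2
r4 = conv_pool(r2, 4)                   # intermediate result 4
fused = r1 + upsample_to(r4, r1.shape)  # fuse result 1 with result 4
r5 = conv_pool(fused, 5)                # intermediate result 5
r3 = conv_pool(r2, 3)                   # intermediate result 3
final = upsample_to(r3, r5.shape) + r5  # fused final: first recognition result
```

The point of the skip-style fusion is that a shallow, higher-resolution branch (result 1) and a deeper, coarser branch (result 4) both contribute to the final recognition, which helps with small targets such as pests.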
  • the obtained target area image samples can be integrated into an image sample set, which is then divided into a training sample set and a test sample set, where the training sample set is used to train the deep neural network model and the test sample set is used to test the trained deep neural network model.
  • each picture in the training sample set is input into the deep neural network model, and the pictures in the training sample set are automatically trained through the deep neural network model to obtain the trained deep neural network model.
  • input each picture of the test sample set into the trained deep neural network model, recognize each input picture through the model to obtain the corresponding recognition result, and integrate the recognition results for all images into a recognition result set.
  • the test recognition rate can be determined according to the number of target objects in the recognition result set and the number of target objects in the test sample set.
  • the test recognition rate is used to measure the recognition ability of the trained deep neural network model. If the test recognition rate reaches the preset threshold, the recognition ability of the trained model meets expectations, and the model can be used directly for image recognition. Otherwise, the parameters of the deep neural network model are adjusted and the model is trained again, until the recognition rate of the model reaches the preset threshold.
  • the obtaining of the second recognition result based on the target area image and the preset reference image includes:
  • Step 1510 Obtain a changed part image of the target area image relative to the preset reference image, and convert the changed part image into a grayscale image.
  • a grayscale image can be obtained by averaging the RGB values of the three channels at the same pixel position, or by averaging the maximum and minimum brightness among the RGB channels at the same pixel position. The methods for converting the changed-part image into a grayscale image are not limited to these two.
  • Step 1520 Perform noise filtering and threshold segmentation on the grayscale image to obtain a binarized image, and obtain connected regions in the binarized image through a connected domain algorithm.
  • linear filtering, threshold averaging, weighted averaging, or template smoothing can be used to filter the noise and obtain a filtered binarized image. The pixels of the filtered binarized image are divided into several classes in order to find the target points of interest; foreground pixels among the target points of interest that have the same pixel value and are adjacent to each other form a connected region.
  • Step 1530 Obtain a potential contour of the target object according to the connected region.
  • a connected area is formed by foreground pixels with the same pixel value and adjacent positions in the target points of interest, and the potential contour of each target can be obtained through the connected area.
  • Step 1540 Perform a morphological operation on the potential contour of the target object to obtain a second recognition result.
  • the second recognition result includes the second target object in the target area image, the second probability corresponding to the second target object, and the second target position of the second target object in the target area image.
  • the morphological operations include dilation and filling of closed regions: specifically, pixels can be added to the boundary of the target object in the image, and holes in the feature contour map of the target object can be filled, so as to obtain the second target object, the probability corresponding to the second target object, and the second target position of the second target object in the target area image.
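Steps 1510 through 1530 can be sketched end to end with plain NumPy. This is a simplified stand-in, not the embodiment's exact pipeline: the noise filtering stage and the morphological dilation/hole-filling of step 1540 are omitted, the threshold value is invented, and the flood fill below is just one basic 4-connected labelling scheme.

```python
import numpy as np

def second_recognition(frame, background, thresh=40):
    """Difference vs. reference image -> grayscale -> threshold -> connected
    regions. Returns a label image and the number of potential contours."""
    diff = np.abs(frame.astype(int) - background.astype(int))
    gray = diff.mean(axis=2)                   # average the RGB channels
    binary = (gray > thresh).astype(np.uint8)  # threshold segmentation
    labels = np.zeros(binary.shape, dtype=int)
    region = 0
    for seed in zip(*np.nonzero(binary)):      # flood-fill each foreground blob
        if labels[seed]:
            continue
        region += 1
        stack = [seed]
        while stack:
            i, j = stack.pop()
            if labels[i, j] or not binary[i, j]:
                continue
            labels[i, j] = region
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < binary.shape[0] and 0 <= nj < binary.shape[1]:
                    stack.append((ni, nj))
    return labels, region                      # regions = potential contours

# tiny synthetic example: a 2x2 blob and a single changed pixel
bg = np.zeros((6, 6, 3), dtype=np.uint8)
fr = bg.copy()
fr[1:3, 1:3] = 200
fr[4, 4] = 200
labels, n_regions = second_recognition(fr, bg)
```

Each labelled region then yields a potential contour of a target object, on which the dilation and hole-filling of step 1540 would be applied.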
  • the obtaining the target object and the target position in the target area image according to the first recognition result and the second recognition result includes:
  • Step 1710 Compare the first probability and the second probability.
  • the first probability refers to the probability of the target object obtained through deep neural network model recognition, and the second probability refers to the probability of the target object obtained through image processing.
  • Step 1720 If the first probability is greater than the second probability, and the first probability is greater than or equal to a preset probability threshold, the first target object is taken as the target object, and the first target position is taken as the target position.
  • the preset probability threshold can be used as the criterion for judging the target object and can be set in advance. If the target probability identified by the deep neural network model, that is, the first probability, is greater than the target probability acquired by image processing, that is, the second probability, and the first probability is greater than or equal to the preset probability threshold, the first target object identified by the deep learning network model is taken as the target object, and the position of the first target object is taken as the target position.
  • for example, if the preset probability threshold is 60%, the first probability is 70%, and the second probability is 40%, then the first probability (70%) is greater than the second probability (40%) and greater than the preset probability threshold (60%), so the first target object is taken as the target object and the first target position as the target position.
  • Step 1730 If the first probability is less than the second probability, and the second probability is greater than or equal to the preset probability threshold, the second target object is used as the target object, and the second target position is used as the target position .
  • if the target probability identified by the deep neural network model, that is, the first probability, is less than the target probability obtained by image processing, that is, the second probability, and the second probability is greater than or equal to the preset probability threshold, the second target object obtained by image processing is used as the target object, and the position of the second target object is used as the target position.
  • for example, if the preset probability threshold is 60%, the first probability is 20%, and the second probability is 80%, then the first probability (20%) is less than the second probability (80%) and the second probability (80%) is greater than the preset probability threshold (60%), so the second target object is taken as the target object and the second target position as the target position.
  • Step 1740 If the first probability and the second probability are both less than the preset probability threshold, but the sum of the first probability and the second probability is greater than the preset second probability threshold, then the target is regarded as a suspected target.
  • for example, if the preset probability threshold is 60%, the preset second probability threshold is 55%, the first probability is 40%, and the second probability is 18%, then the first probability (40%) and the second probability (18%) are both less than the preset probability threshold (60%), but their sum (58%) is greater than the preset second probability threshold (55%), so the target is regarded as a suspected target.
  • Step 1750 If the first probability and the second probability are both less than the preset probability threshold, and the sum of the first probability and the second probability is less than the preset second probability threshold, discard the first recognition result and the second recognition result.
  • if the target probability recognized by the deep neural network model, that is, the first probability, and the target probability obtained by image processing, that is, the second probability, are both less than the preset probability threshold, and their sum is less than the preset second probability threshold, the recognition results are considered inaccurate. In this case, the first recognition result obtained by the deep neural network model (the first target object, its position, and the first probability) is discarded, and the second recognition result obtained by image processing (the second target object, its position, and the second probability) is discarded as well.
  • for example, if the preset probability threshold is 60%, the preset second probability threshold is 55%, the probability of the first target object recognized by the deep neural network model is 40%, and the probability of the second target object obtained by image processing is 10%, then the first probability (40%) and the second probability (10%) are both less than 60%, and their sum (50%) is less than the preset second probability threshold (55%), so both recognition results are discarded.
  • the above-mentioned preset probability threshold can be set according to actual needs, and does not need to be restricted to the limitation in this embodiment.
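The decision rules of steps 1710 through 1750 reduce to a small function. The thresholds below match the worked examples above (60% and 55%); the fallback `"undecided"` branch is an assumption, since the text does not specify the boundary cases (e.g. exactly equal probabilities).

```python
def fuse(p1, p2, threshold=0.60, second_threshold=0.55):
    """Combine the deep-network probability p1 and the image-processing
    probability p2 according to steps 1710-1750."""
    if p1 > p2 and p1 >= threshold:
        return "first"        # use first target object and first target position
    if p1 < p2 and p2 >= threshold:
        return "second"       # use second target object and second target position
    if p1 < threshold and p2 < threshold:
        if p1 + p2 > second_threshold:
            return "suspected"   # suspected target
        if p1 + p2 < second_threshold:
            return "discard"     # discard both recognition results
    return "undecided"        # boundary cases not specified by the text

# the four worked examples from the text
assert fuse(0.70, 0.40) == "first"
assert fuse(0.20, 0.80) == "second"
assert fuse(0.40, 0.18) == "suspected"   # 0.58 > 0.55
assert fuse(0.40, 0.10) == "discard"     # 0.50 < 0.55
```

Encoding the rules this way makes it easy to retune both thresholds to actual needs, as the text notes.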
  • an embodiment of the present application also provides a distributed target monitoring method, as shown in FIG. 18 to FIG. 19, which is applied to the detection alarm device; the method is executed by the central processing unit in the detection alarm device and includes:
  • Step 1810 Acquire thermal infrared signals within the sensing range.
  • the detection alarm devices are placed in the corners of each room, and the central processing unit in a detection alarm device obtains the thermal infrared signal sent by the infrared detection unit. The infrared detection unit may be one or more infrared sensors or the like; please refer to the system embodiment.
  • Step 1820 Determine the relationship between the target position, time and frequency according to the thermal infrared signal.
  • the thermal infrared signal is a signal characteristic of the target object within the sensing range. Since the target object may be located at different positions at different time points, and the number of times it is active also differs between time points, the infrared detection unit continuously collects, over 24 hours, the position of the target object and the number of its occurrences at different time points, so as to determine the relationship between target position, time, and frequency.
  • Step 1830 Determine the activity trajectory of the target object and/or the activity density information of the target object according to the relationship between the target location, time and frequency.
  • the infrared detection unit continuously performs infrared sensing on the target area and sends the sensed thermal infrared signal to the central processing unit, which connects the target positions of the target object in time order into a line, forming the target object's activity trajectory.
  • statistics on target position, time, and frequency determine the target object's activity density. From the activity trajectory and activity density, the activity area and living habits of the target object can be learned; for example, within a certain time range, in which period the target object is most active and in which areas it prefers to move, so that subsequent measures can be taken against the target object.
  • the target object is a pest
  • the activity density information of the target object includes: the pest density at a detection alarm device, the pest density in the detection alarm device area, the average pest density in the detection alarm device area, the pest density distribution map of the detection alarm device area, and the like.
  • the pest density at a detection alarm device is the number of times that device detects pests per unit time.
  • ρ = N/T, where N is the total number of pests found per unit time, in counts; T is the unit time, measured in hours, days, or another time unit; ρ is the pest density, with a typical unit of counts/day.
  • the pest density in the detection alarm device area is the number of pest detections per unit area and unit time: ρ_i = N_i/T and ρ_total = Σ_{i=1}^{n} ρ_i, where N_i is the total number of pests found by the i-th device per unit time, in counts; T is the unit time, measured in hours, days, or another time unit; ρ_i is the pest density monitored by the i-th device, with a typical unit of counts/day; n is the number of detection alarm devices in the unit area; ρ_total is the pest density in a certain area within a certain period, with a typical unit of counts/day.
  • the average density of pests in the detection and alarm equipment area is the average of the density of pest occurrences per unit area and unit time.
  • ⁇ i N i / T, where, N i is the total number of pests found in the i-th station apparatus per unit time, unit only; T is a unit time, in hours or days, etc. time measurement unit; [rho] i for the first
  • the density of pests detected by i equipment is typically unit per day; n is the number of detection alarm devices in the unit area; It is the average pest density within a certain period of time in a certain area, and the typical unit is per day.
  • the pest density distribution map of the detection alarm device area represents, as a chart, the pest density per unit area and unit time. The value at each sensing position in the density distribution map reflects the number of pest occurrences per unit time at the corresponding location.
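The per-device metrics above can be computed in a few lines. The sample counts are invented, and reading ρ_total as the sum of the per-device densities is a reconstruction of the garbled formula, stated here as an assumption.

```python
def device_density(n_found, t_hours):
    """rho = N / T: pest detections per unit time at one detection alarm device."""
    return n_found / t_hours

per_device_counts = [12, 30, 6]  # pests found by each device in T (invented)
T = 24.0                         # unit time: 24 hours (one day)

rho = [device_density(n, T) for n in per_device_counts]  # rho_i per device
rho_total = sum(rho)             # area density: sum over the n devices (assumed)
rho_avg = rho_total / len(rho)   # average pest density in the area
density_map = {f"device_{i}": r for i, r in enumerate(rho)}  # distribution map
```

The `density_map` dictionary plays the role of the density distribution map, one value per sensing position.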
  • in this way, the detection alarm device acquires the thermal infrared signal within its sensing range, determines the relationship between target position, time, and frequency according to the thermal infrared signal, and determines the activity trajectory of the target object and/or the activity density information of the target object according to that relationship, so that the density of harmful organisms can be accurately monitored.
  • an embodiment of the present application also provides a distributed target monitoring device 2000, and the device 2000 includes:
  • the first acquisition module 2010 is used to acquire an image of a target area
  • the input module 2020 is configured to input the target area image into a preset deep neural network model to obtain a first recognition result
  • the second obtaining module 2030 is configured to obtain a second recognition result based on the target area image and the preset reference image;
  • the third obtaining module 2040 is configured to obtain the target object and the target position in the target area image according to the first recognition result and the second recognition result;
  • the fourth obtaining module 2050 is configured to obtain the relationship between the target position, time and frequency;
  • the determining module 2060 is configured to determine the activity trajectory of the target object and/or the activity density information of the target object according to the relationship between the target location, time and frequency.
  • the target area image is acquired through the first acquisition module; the acquired image is input into the preset deep neural network model through the input module to obtain the first recognition result; the second acquisition module obtains the second recognition result based on the target area image and the preset reference image; the third acquisition module obtains the target object and the target position in the target area image according to the first and second recognition results; the fourth acquisition module acquires the relationship between target location, time, and frequency; and the determination module determines the target object's activity trajectory and/or activity density according to that relationship, so that the pest density can be accurately monitored.
  • the apparatus 2000 further includes:
  • the labeling module 2070 obtains a sample image of a target area and a target object, labels the target object in the sample image, and generates labeling information.
  • the training module 2080 inputs the marked image into the deep neural network model for training, and obtains the preset deep neural network model.
  • the input module 2020 is specifically configured to:
  • the preset deep neural network model includes several convolutional layers and pooling layers, and the first recognition result includes a first target object in the target area image, a first probability corresponding to the first target object, and The first target position of the first target object in the target area image.
  • the second acquiring module 2030 is specifically configured to:
  • noise filtering and threshold segmentation are performed on the grayscale image to obtain a binarized image, and connected regions in the binarized image are obtained through a connected domain algorithm;
  • the third obtaining module 2040 is specifically configured to:
  • if the first probability is greater than the second probability, and the first probability is greater than or equal to a preset probability threshold, use the first target object as the target object and the first target position as the target position;
  • if the first probability is less than the second probability, and the second probability is greater than or equal to the preset probability threshold, use the second target object as the target object and the second target position as the target position;
  • if the first probability and the second probability are both less than the preset probability threshold, but their sum is greater than the preset second probability threshold, regard the target as a suspected target;
  • if the first probability and the second probability are both less than the preset probability threshold, and their sum is less than the preset second probability threshold, discard the first recognition result and the second recognition result.
  • the above-mentioned distributed target monitoring device can execute the distributed target monitoring method provided in the embodiments of the present application, and has the corresponding functional modules and beneficial effects of the executed method. For technical details not described in detail in this embodiment, refer to the distributed target monitoring method provided in the embodiments of the present application.
  • FIG. 21 is a schematic diagram of the hardware structure of the control unit in the image detection device provided by an embodiment of the present application. As shown in FIG. 21, the control unit 2100 includes:
  • the control unit includes one or more processors 2110 and a memory 2120; in FIG. 21, one processor 2110 is taken as an example. The processor 2110 and the memory 2120 may be connected through a bus or in other ways; in FIG. 21, connection through a bus is taken as an example.
  • the memory 2120 can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/modules corresponding to the distributed target monitoring method in the embodiment of the present application (for example, the first acquisition module 2010, the input module 2020, the second acquisition module 2030, the third acquisition module 2040, the fourth acquisition module 2050, and the determination module 2060 shown in FIG. 20).
  • the processor 2110 executes various functional applications and data processing of the image detection device by running non-volatile software programs, instructions, and modules stored in the memory 2120, that is, realizes the distributed target monitoring method of the foregoing method embodiment.
  • the memory 2120 may include a storage program area and a storage data area.
  • the storage program area may store an operating system and an application program required by at least one function; the storage data area may store data created according to the use of the distributed target monitoring device.
  • the memory 2120 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage devices.
  • the memory 2120 may optionally include memories remotely provided with respect to the processor 2110, and these remote memories may be connected to the distributed target monitoring device through a network. Examples of the aforementioned networks include, but are not limited to, the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof.
  • The one or more modules are stored in the memory 2120, and when executed by the one or more control units 2100, the distributed target monitoring method in any of the foregoing method embodiments is executed; for example, the method steps of the foregoing embodiments are executed.
  • the device embodiments described above are merely illustrative.
  • The units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solutions of the embodiments.
  • each implementation manner can be implemented by means of software plus a general hardware platform, and of course, it can also be implemented by hardware.
  • a person of ordinary skill in the art can understand that all or part of the processes in the methods of the foregoing embodiments can be implemented by a computer program instructing relevant hardware.
  • The program can be stored in a computer-readable storage medium, and when the program is executed, it may include the procedures of the above-mentioned method embodiments.
  • the storage medium may be a magnetic disk, an optical disc, a read-only memory (Read-Only Memory, ROM), or a random access memory (Random Access Memory, RAM), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Biomedical Technology (AREA)
  • Data Mining & Analysis (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Mathematical Physics (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Signal Processing (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Alarm Systems (AREA)
  • Geophysics And Detection Of Objects (AREA)

Abstract

A distributed target monitoring system and method, relating to the field of target recognition and detection. The system comprises at least one detection alarm device (10), an image detection device (30), and a remote monitoring terminal (20), the image detection device (30) respectively being connected to the detection alarm device (10) and the remote monitoring terminal (20); an infrared detection unit (110) in the detection alarm device (10) detects thermal infrared information in a sensing range, and a central processing unit (120) controls an alarm unit (130) to emit an alarm signal on the basis of the thermal infrared information detected by the infrared detection unit (110); an image acquisition unit (310) in the image detection device (30) acquires a target area image and, on the basis of the target area image, controls the central processing unit (120) so that the central processing unit (120) controls the alarm unit (130) to emit an alarm signal; and a remote control unit (220) in the remote monitoring terminal (20) acquires the target area image and displays the target area image on an image display unit (210), and the remote control unit (220) controls the central processing unit (120) so that the central processing unit (120) controls the alarm unit (130) to emit an alarm signal.

Description

Distributed target monitoring system and method
Cross-reference to related applications
This application claims priority to Chinese patent application No. 201910942708.1, filed on September 30, 2019 and entitled "A distributed target monitoring system and method", the entire content of which is incorporated herein by reference.
Technical field
This application relates to the field of target recognition and detection, and in particular to a distributed target monitoring system and method.
Background
Birds at airports and in orchards, and rodents in fields, restaurants, and granaries, are persistent problems for humans. Take rodent infestation as an example: according to a survey by the Food and Agriculture Organization, about 33 million tons of stored grain are lost to rats worldwide every year, equivalent to a year's rations for 300 million people. In the catering industry, rats are a frequent sight; in recent years, major catering brands have repeatedly been reported for rats scurrying through their kitchens and have been ordered by the relevant authorities to suspend business for rectification. Rats are even more common in dirty restaurants with poor environmental conditions. Common pest rats can transmit diseases, including leptospirosis, epidemic hemorrhagic fever, plague, typhus, rat-bite fever, salmonellosis, anthrax, rabies, forest encephalitis, and scrub typhus (tsutsugamushi disease); these diseases pose a major threat to human health.
Traditional physical rodent-control products include glue boards and mouse cages, while repellent products are mainly ultrasonic rodent repellers. These products rely on a single principle: glue boards and cages work well only the first time, after which rats are not fooled again; the effect of sound waves is limited, and rats adapt to them with long-term exposure, so they cannot be driven away effectively. Chemical extermination requires a professional company to carry out regular on-site treatments; this approach is passive and cannot track where the rats enter, exit, and range. In addition, heavily used rodenticide eventually flows into sewers and surrounding rivers, placing a heavy burden on the environment and negatively affecting the sustainable development of the entire city.
Summary of the invention
In view of this, it is necessary to provide, for the above technical problems, a distributed target monitoring system and method that can monitor key targets in real time and raise alarms in a timely manner.
In a first aspect, an embodiment of the present application provides a distributed target monitoring system. The system includes at least one detection alarm device, an image detection device, and a remote monitoring terminal, the image detection device being connected to the detection alarm device and to the remote monitoring terminal, respectively.
The detection alarm device includes an infrared detection unit, a central processing unit, and an alarm unit, the central processing unit being electrically connected to the infrared detection unit and to the alarm unit, respectively.
The infrared detection unit is configured to detect thermal infrared signals within its sensing range; the central processing unit is configured to receive the thermal infrared signals and, according to the thermal infrared signals, control the alarm unit to emit an alarm signal.
The image detection device includes an image acquisition unit and a control unit that are electrically connected. The image acquisition unit is configured to acquire a target area image; the control unit is configured to receive and process the target area image and, according to the target area image, control the central processing unit so that the central processing unit controls the alarm unit to emit an alarm signal.
The remote monitoring terminal includes an image display unit and a remote control unit that are electrically connected. The remote control unit is configured to obtain the target area image acquired by the image detection device, display the target area image on the image display unit, and control the central processing unit so that the central processing unit controls the alarm unit to emit an alarm signal.
In a second aspect, an embodiment of the present application further provides a distributed target monitoring method, applied to an image detection device, the method including:
acquiring a target area image;
inputting the target area image into a preset deep neural network model to obtain a first recognition result;
obtaining a second recognition result based on the target area image and a preset reference image;
fusing the first recognition result and the second recognition result to obtain a target object and a target position in the target area image;
acquiring the relationship between the target position, time, and frequency; and
determining the activity trajectory of the target object and/or activity density information of the target object according to the relationship between the target position, time, and frequency.
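The steps above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the helper names (`monitor_step`, `frame_difference`, `activity_density`), the pixel-difference threshold, and the fusion rule (keep only detections confirmed by both recognizers) are all assumptions, and the deep neural network is stood in for by a caller-supplied function.

```python
from collections import Counter

def frame_difference(image, reference, threshold=30):
    """Second recognition result: positions whose pixel value differs
    from the preset reference image by more than `threshold`
    (a hypothetical reference-image comparison)."""
    return {
        (r, c)
        for r, row in enumerate(image)
        for c, px in enumerate(row)
        if abs(px - reference[r][c]) > threshold
    }

def monitor_step(image, dnn_model, reference_image, log):
    """One iteration of the method: DNN result, reference-image result,
    fusion, and accumulation of position/frequency data."""
    first = dnn_model(image)  # first recognition result: {(row, col): probability}
    second = frame_difference(image, reference_image)
    # Fuse the two results: keep detections confirmed by both recognizers
    # (one plausible fusion rule; the text does not fix a specific one here).
    targets = {pos: prob for pos, prob in first.items() if pos in second}
    log.extend(targets)  # record positions for trajectory/density statistics
    return targets

def activity_density(log):
    """Activity density information: how often each position was visited."""
    return Counter(log)
```

Here `log` plays the role of the accumulated position/time/frequency record from which the activity trajectory and density are later derived.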
In a third aspect, an embodiment of the present application further provides a distributed target monitoring method, applied to a detection alarm device, the method including:
acquiring thermal infrared signals within a sensing range;
determining the relationship between target position, time, and frequency according to the thermal infrared signals; and
determining the activity trajectory of the target object and/or activity density information of the target object according to the relationship between the target position, time, and frequency.
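A minimal sketch of the last two steps, under the assumption that each thermal-infrared detection is reported as a (timestamp, device position) pair; the function name and data format are hypothetical:

```python
from collections import Counter

def trajectory_and_density(triggers):
    """triggers: (timestamp, position) pairs, one per thermal-infrared
    detection reported by a detection alarm device (hypothetical format).
    Returns the time-ordered activity trajectory and the per-position
    visit frequency, i.e. activity density information."""
    ordered = sorted(triggers)                     # order detections by time
    trajectory = [pos for _, pos in ordered]       # sequence of visited positions
    density = Counter(pos for _, pos in triggers)  # frequency per position
    return trajectory, density
```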
Compared with the prior art, the beneficial effects of the present application are as follows. In the distributed target monitoring system and method of the embodiments of the present application, the infrared detection unit in the detection alarm device detects thermal infrared information within its sensing range, and the central processing unit controls the alarm unit to emit an alarm signal according to the thermal infrared information detected by the infrared detection unit. Meanwhile, the image acquisition unit in the image detection device acquires a target area image, and the central processing unit is controlled according to the target area image so that the central processing unit controls the alarm unit to emit an alarm signal. In addition, the remote control unit in the remote monitoring terminal acquires the target area image and displays it on the image display unit, and the remote control unit controls the central processing unit so that the central processing unit controls the alarm unit to emit an alarm signal. The detection alarm devices and the image detection device are deployed in a distributed manner and cooperate with each other, so that key targets can be monitored in real time and alarms can be raised in a timely manner.
Description of the drawings
One or more embodiments are exemplarily described with reference to the corresponding accompanying drawings, and these exemplary descriptions do not constitute a limitation on the embodiments. Elements with the same reference numerals in the drawings denote similar elements, and unless otherwise stated, the figures in the drawings are not drawn to scale.
FIG. 1a is a schematic diagram of the connections among the image detection device, the remote monitoring terminal, and the detection alarm devices according to an embodiment of the present application;
FIG. 1b is a schematic diagram of the connections among the detection alarm devices, the network server, and the remote monitoring terminal according to an embodiment of the present application;
FIG. 2 is a schematic diagram of the image detection device wirelessly connected to the remote monitoring terminal and the detection alarm devices according to another embodiment of the present application;
FIG. 3a is a schematic diagram of the hardware structure of a detection alarm device according to an embodiment of the present application;
FIG. 3b is a schematic diagram of the hardware structure of a detection alarm device according to another embodiment of the present application;
FIG. 4 is a schematic diagram of an alarm unit according to an embodiment of the present application;
FIG. 5 is a schematic diagram of the hardware structure of a detection alarm device according to another embodiment of the present application;
FIG. 6 is a schematic diagram of the hardware structure of an image detection device according to an embodiment of the present application;
FIG. 7 is a schematic diagram of the hardware structure of a remote monitoring terminal according to an embodiment of the present application;
FIG. 8 is a schematic diagram of the connections among a cloud server, the remote monitoring terminal, the image detection device, and the detection alarm devices according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a cloud server, the remote monitoring terminal, the image detection device, and the detection alarm devices connected wirelessly according to another embodiment of the present application;
FIG. 10 is a flowchart of an embodiment of the distributed target monitoring method of the present application;
FIG. 11 is a schematic diagram of the total number of pests over time in an embodiment of the present application;
FIG. 12a is a statistical chart of pest density distribution in an embodiment of the present application;
FIG. 12b is a schematic diagram of pest density distribution in an embodiment of the present application;
FIG. 13 is a flowchart of deep neural network model training in an embodiment of the present application;
FIG. 14 is a flowchart of data processing using the deep learning network model in an embodiment of the present application;
FIG. 15 is a flowchart of obtaining the second recognition result in an embodiment of the present application;
FIG. 16 is a detailed flowchart of obtaining the second recognition result in an embodiment of the present application;
FIG. 17 is a flowchart of determining the target object and the target position in an embodiment of the present application;
FIG. 18 is a flowchart of another embodiment of the distributed target monitoring method of the present application;
FIG. 19 is a schematic diagram of the distribution of detection alarm devices in an embodiment of the present application;
FIG. 20 is a schematic structural diagram of an embodiment of the distributed target monitoring device of the present application;
FIG. 21 is a schematic diagram of the hardware structure of a control unit provided by an embodiment of the present application.
Detailed description of the embodiments
To make the objectives, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the accompanying drawings in the embodiments of the present application. Obviously, the described embodiments are only some rather than all of the embodiments of the present application. Based on the embodiments in the present application, all other embodiments obtained by a person of ordinary skill in the art without creative work shall fall within the protection scope of the present application.
It should be noted that, provided there is no conflict, the features in the embodiments of the present application may be combined with each other, and all such combinations fall within the protection scope of the present application. In addition, although functional modules are divided in the device schematic diagrams and a logical order is shown in the flowcharts, in some cases the steps shown or described may be executed with a module division different from that in the device, or in an order different from that in the flowchart. Furthermore, the words "first", "second", and "third" used in this application do not limit the data or the execution order, but merely distinguish identical or similar items having substantially the same function and effect.
Referring to FIG. 1 to FIG. 9 together, an embodiment of the present application provides a distributed target monitoring system, including at least one detection alarm device, a remote monitoring terminal 20, and an image detection device 30, the image detection device 30 being connected to the detection alarm device and to the remote monitoring terminal 20, respectively.
Specifically, FIG. 1a exemplarily shows that the image detection device 30 is connected to detection alarm device 1, detection alarm device 2, detection alarm device 3, and detection alarm device N through a bus (such as a 485 bus), and that the image detection device 30 is connected to the remote monitoring terminal 20 wirelessly (for example, in the 433 MHz band). It should be noted that the bus connections in this embodiment do not all originate from the image detection device; instead, the devices are cascaded with one another, which makes actual deployment more convenient and flexible. Each wirelessly connected device is provided with a wireless communication unit, which may use an Internet of Things system, for example a WIFI network, a Zigbee network, Bluetooth, or an NB-IoT network, or a mobile communication system, for example 2G, 3G, 4G, or 5G; the wireless communication unit may also use open frequency bands such as 433 MHz.
It is understandable that in some other embodiments of the present application, as shown in FIG. 1b, an embodiment of the present application further provides a distributed target monitoring system, including at least one detection alarm device, a network server 50, and a remote monitoring terminal 20.
Specifically, FIG. 1b exemplarily shows that the network server 50 is connected to detection alarm device 1, detection alarm device 2, detection alarm device 3, and detection alarm device N through a bus, and that the network server 50 is connected to the remote monitoring terminal 20 wirelessly. Multiple detection alarm devices are combined to form an Internet of Things system, and the detected thermal infrared signals and alarm signals are sent to the remote monitoring terminal 20 through the network server 50. It should be noted that the bus connections in this embodiment do not all originate from the network server; instead, the network server is connected to one of the detection alarm devices, and the detection alarm devices are then cascaded with one another, which makes actual deployment more convenient and flexible.
It is understandable that in some other embodiments of the present application, as shown in FIG. 2, the detection alarm devices, the image detection device 30, and the remote monitoring terminal 20 are all provided with wireless communication units. FIG. 2 exemplarily shows that the image detection device 30 is wirelessly connected (for example, in the 433 MHz band) to the remote monitoring terminal 20 and to detection alarm device 1, detection alarm device 2, detection alarm device 3, and detection alarm device N; an actual environment may include more detection alarm devices.
As shown in FIG. 3a, the detection alarm device 10 includes an infrared detection unit 110, a central processing unit 120, and an alarm unit 130, the central processing unit 120 being electrically connected to the infrared detection unit 110 and to the alarm unit 130, respectively.
In the embodiments of the present application, the infrared detection unit 110 may be an infrared sensor, a pyroelectric sensor, or another thermal infrared sensor, and is used to detect thermal infrared signals within its sensing range. Specifically, the infrared detection unit 110 is fixed at a high position so that its infrared sensing area covers the target activity area, and the infrared detection unit 110 can be configured according to the infrared radiation characteristics of the target to be detected. When a target enters the sensing range, the infrared sensor detects the change in the target's infrared spectrum and sends it to the central processing unit 120, which may use an STM-series chip. To improve the sensing effect, the infrared detection unit 110 may also be equipped with a Fresnel lens. It should be noted that the target in this application may be a pest, an animal, a human body, or the like.
It is understandable that in some other embodiments of the present application, the infrared sensor includes an infrared light-emitting diode and an infrared photodiode encapsulated in one plastic housing. In actual use, the infrared light-emitting diode lights up and emits infrared light invisible to the human eye. If there is no target in front of the infrared sensor, the infrared light dissipates into space; if a harmful object is present in front of the sensor, the infrared light is reflected back onto the adjacent infrared photodiode. When the infrared photodiode receives the infrared light, the resistance at its output pin changes, and by judging the change in the photodiode's resistance, the target ahead can be sensed.
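The reflective sensing principle just described can be sketched as a single comparison; the function name, the use of an ADC reading as a proxy for the photodiode's resistance, and the numeric margin are illustrative assumptions, not values from the patent:

```python
def reflection_detected(adc_value, baseline, margin=100):
    """Hypothetical check for the reflective IR sensor described above:
    the infrared photodiode's output resistance shifts when reflected IR
    light reaches it, so a large deviation of the ADC reading from the
    no-target baseline indicates an object in front of the sensor."""
    return abs(adc_value - baseline) > margin
```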
In the embodiments of the present application, the central processing unit 120 may be a central processing unit (CPU), a graphics processing unit (GPU), or the like, and is used to receive the thermal infrared signals and control the alarm unit 130 to emit an alarm signal according to the thermal infrared signals. Specifically, after the infrared detection unit 110 detects a thermal infrared signal within the sensing range, it sends the signal to the central processing unit 120; the central processing unit 120 judges from the thermal infrared signal whether there is an active target within the sensing range, and upon finding signs of an active target, controls the alarm unit 130 to emit an alarm signal.
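One plausible decision rule for this judgment is to require a sustained signal before alarming; the function name, threshold, and hit count below are assumptions for illustration, since the patent does not specify the rule:

```python
def check_and_alarm(pir_samples, trigger_alarm, threshold=0.5, min_hits=3):
    """Hypothetical decision rule for the central processing unit: declare
    an active target when `min_hits` consecutive thermal-infrared samples
    exceed `threshold`, then fire the alarm via the supplied callback."""
    hits = 0
    for sample in pir_samples:
        hits = hits + 1 if sample > threshold else 0
        if hits >= min_hits:  # sustained signal -> active target
            trigger_alarm()
            return True
    return False
```

Requiring several consecutive over-threshold samples is a common way to suppress spurious triggers from transient heat sources.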
It is understandable that in some other embodiments of the present application, as shown in FIG. 3b, the detection alarm device 10 includes only an electrically connected infrared detection unit 110 and central processing unit 120. After the infrared detection unit 110 detects a thermal infrared signal within the sensing range, it sends the signal to the central processing unit 120 for processing, and the central processing unit 120 judges from the thermal infrared signal whether there is an active target within the sensing range.
In the embodiments of the present application, as shown in FIG. 4, the alarm unit 130 includes a light generator 131 and a sound generator 132, both electrically connected to the central processing unit 120. The light generator 131 is a strong-light generator, which may be a high-power LED lamp-bead patch producing a high-intensity light source; the sound generator 132 may be a buzzer or an ultrasonic generator, for example an ultrasonic horn of model 3840, producing a high-decibel sound. Specifically, when the infrared detection unit 110 finds an active target, the central processing unit 120 controls the light generator 131 to produce a high-intensity light source and at the same time controls the sound generator 132 to produce a high-decibel sound to drive the active target away.
It is understandable that in some other embodiments of the present application, multiple light generators 131 and sound generators 132 may be distributed at each corner of a closed or open space. When the infrared detection unit 110 finds an active target, the central processing unit 120 controls the light generators 131 in each corner to produce high-intensity light and at the same time controls the sound generators 132 in each corner to produce high-decibel sound to drive the active target away.
In some other embodiments of the present application, the light generator 131 may be replaced with a light strip fixed with 3M adhesive around the bottom of the walls, similar to a skirting line. When the infrared detection unit 110 finds an active target, the central processing unit 120 controls the light strip to flash continuously to drive the active target away.
In some embodiments of the present application, as shown in FIG. 5, the detection alarm device 10 further includes a photosensitive unit 140, which may be a photodiode or a similar component, for example an HPI-6FER2 photodiode. The photosensitive unit 140 can determine whether it is currently day or night by sensing illumination information. The photosensitive unit 140 is electrically connected to the central processing unit 120 and transmits the illumination information to it; the central processing unit 120 controls the infrared detection unit 110 and the alarm unit 130 to work at night according to the illumination information, thereby reducing power consumption. It should be noted that the model of the photosensitive unit can be selected as required and is not limited to that of this embodiment.
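The night-only gating can be sketched as follows; the lux threshold and function names are illustrative assumptions, not values from the patent:

```python
def should_detect(lux, dark_threshold=50):
    """Hypothetical gating rule: run detection and alarming only when the
    ambient light level reported by the photosensitive unit indicates
    night-time, so the device saves power during the day."""
    return lux < dark_threshold

def duty_cycle(lux_readings, dark_threshold=50):
    """Fraction of sample periods during which the detector is powered,
    i.e. a rough measure of the power saved by the gating."""
    active = sum(1 for lux in lux_readings if should_detect(lux, dark_threshold))
    return active / len(lux_readings)
```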
在本申请实施例中,如图6所示,图像检测设备30包括图像采集单元310和控制单元320,所述控制单元320和所述图像采集单元310电性连接,所述图像检测设备30可固定在高处。所述图像采集单元310,用于采集目标区域图像,所述图像采集单元310可以例如是红外光和/或可见光摄像头等,控制单元320,可以采用intel movidius或华为海思3519等芯片,红外光摄像头和/或可见光摄像头利用高空优势,将采集的目标图像上传至控制单元320,控制单元320通过机器视觉和神经网络对目标图像进行分析,若发现目标图像中存在活动的目标,控制单元320则发送控制信号和目标的图像信息给中央处理单元120,以使所述中央处理单元120根据接收到的控制信号和目标的图像信息控制所述报警单元130发出报警信号对活动的目标进行驱赶。In the embodiment of the present application, as shown in FIG. 6, the image detection device 30 includes an image acquisition unit 310 and a control unit 320. The control unit 320 is electrically connected to the image acquisition unit 310, and the image detection device 30 may Fixed in a high place. The image acquisition unit 310 is used to acquire images of the target area. The image acquisition unit 310 may be, for example, an infrared light and/or a visible light camera. The control unit 320 may use chips such as Intel Movidius or Huawei HiSilicon 3519, and infrared light. The camera and/or visible light camera takes advantage of the high altitude to upload the collected target image to the control unit 320. The control unit 320 analyzes the target image through machine vision and neural network. If there is an active target in the target image, the control unit 320 The control signal and the image information of the target are sent to the central processing unit 120, so that the central processing unit 120 controls the alarm unit 130 to issue an alarm signal to drive away the active target according to the received control signal and the image information of the target.
It is understood that in some other embodiments of the present application, the image detection device 30 includes only the control unit 320. The image detection device 30 and the detection and alarm devices 10 are connected via a bus or wirelessly. The control unit 320 in the image detection device 30 collects and analyzes the data of multiple detection and alarm devices 10, where the data may be target information, and uploads it to a cloud server, so that the user can view the target information in real time through a terminal device such as a mobile phone or a computer.
In the embodiment of the present application, as shown in FIG. 7, the remote monitoring terminal 20 includes an image display unit 210 and a remote control unit 220, the remote control unit 220 being electrically connected to the image display unit 210. The image display unit 210 may be a display system such as a computer, a mobile phone, or a tablet, and the remote control unit 220 may be, for example, host-computer software. Specifically, the remote control unit 220 obtains the target area image sent by the control unit 320, as well as status information of the image detection device 30 and of the detection and alarm device 10, and sends them to the image display unit 210 for display. The remote control unit 220 is further configured to control, according to the target area image, the control unit 320 in the image detection device 30, so that the control unit 320 controls the central processing unit 120 in the detection and alarm device, which in turn controls the alarm unit 130 to issue an alarm signal to drive the target away.
It is understood that in some other embodiments of the present application, the remote monitoring terminal 20 is wirelessly connected to the image detection device 30 and to the detection and alarm device, respectively. After the remote control unit 220 in the remote monitoring terminal 20 obtains the target area image sent by the image detection device 30, the remote control unit 220 may directly control the detection and alarm device to drive the target away; specifically, the remote control unit 220 controls the central processing unit 120, which controls the alarm unit 130 to issue an alarm signal and thereby drive the target away.
In some other embodiments of the present application, the distributed target monitoring system further includes a cloud server 40. The cloud server 40 is wirelessly connected to the image detection device 30 and receives the target area image, the target information, the status information of the image detection device 30, and the status information of the detection and alarm device sent by the image detection device 30. The cloud server 40 may be a single server, for example a rack server, a blade server, a tower server, or a cabinet server; it may also be a server cluster composed of several servers, or a cloud computing service center.
In the embodiment of the present application, FIG. 8 exemplarily shows a cloud server 40, a remote monitoring terminal 20, an image detection device 30, and detection and alarm devices 1, 2, 3, ..., N. The cloud server 40 and the remote monitoring terminal 20 are both wirelessly connected to the image detection device 30, and the image detection device 30 is connected via a bus to each of the detection and alarm devices 1, 2, 3, ..., N.
Specifically, the image detection device 30 receives, via the bus, information such as the thermal infrared signals and alarm signals sent by the detection and alarm devices, and sends these signals, together with the target area image it has acquired itself, the status information of the detection and alarm devices, and its own status information, to the remote monitoring terminal 20 and the cloud server 40 through a wireless communication unit. The user can view the above information directly on the remote monitoring terminal 20, or connect wirelessly to the cloud server 40 through a terminal device such as a mobile phone or a computer to view it in real time. It should be noted that every wirelessly connected device is provided with a wireless communication unit.
In the embodiment of the present application, FIG. 9 exemplarily shows the cloud server 40, remote monitoring terminals A and B, image detection devices 1 and 2, and detection and alarm devices 1, 2, 3, 4, 5, ..., N. The cloud server 40 is wirelessly connected to image detection device 1 and image detection device 2. Remote monitoring terminal A is wirelessly connected to image detection device 1, which in turn is wirelessly connected to detection and alarm devices 1, 2, and 3; remote monitoring terminal B is wirelessly connected to image detection device 2, which in turn is wirelessly connected to detection and alarm devices 4, 5, ..., N.
Specifically, the image detection device receives, through a wireless communication unit, information such as the thermal infrared signals and alarm signals sent wirelessly by the detection and alarm devices, and sends these signals, together with the target area image it has acquired itself, the status information of the detection and alarm devices, and its own status information, to the remote monitoring terminal and the cloud server 40 through the wireless communication unit. The user can view the above information directly on the remote monitoring terminal, or connect wirelessly to the cloud server 40 through a terminal device such as a mobile phone or a computer to view it in real time, so that the target can be monitored in real time. It should be noted that every wirelessly connected device is provided with a wireless communication unit.
Correspondingly, as shown in FIG. 10, an embodiment of the present application further provides a distributed target monitoring method, applied to an image detection device and executed by the control unit in the image detection device, including:

Step 1010: Obtain an image of the target area.
In the embodiment of the present application, the target area is a biological activity area or a human activity area, preferably a pest activity area; the image includes video images and still pictures. The image of the target area captured by the camera is obtained, the target area image containing a pest or a human body.
Step 1020: Input the target area image into a preset deep neural network model to obtain a first recognition result.

The first recognition result is obtained by inputting the target area image captured by the camera into a preset deep neural network model, which has been obtained by learning and training on a large number of images of the target area.
Step 1030: Obtain a second recognition result based on the target area image and a preset reference image.

The preset reference image is a pre-captured image that does not contain the target; this image serves as a background image, and the second recognition result is obtained by comparing the target area image captured by the infrared device with the background image.
Step 1040: Obtain the target object and the target position in the target area image according to the first recognition result and the second recognition result.

In the embodiment of the present application, the target object is a pest or a human body, and the target position is the position of the pest or human body in the target area image, which may be, for example, the minimum bounding rectangle enclosing the target object; the position of this rectangle is the position of the target object. The target object and the target position in the target area image are obtained from the first recognition result and the second recognition result.
Step 1050: Obtain the relationship between target position, time, and frequency.

In the embodiment of the present application, the target position may be marked with a coordinate position in an electronic map corresponding to the target area; the coordinates may be two-dimensional plane coordinates or three-dimensional space coordinates. Since the target object may be located at different positions at different points in time, and the number of times it is active also varies with time, the infrared device continuously collects, around the clock, the positions at which the target object appears at different points in time and the number of appearances at each point in time, thereby determining the relationship between target position, time, and frequency.
Step 1060: Determine the activity trajectory of the target object and/or the activity density information of the target object according to the relationship between target position, time, and frequency.

In the embodiment of the present application, the camera continuously captures images of the target area, and the control unit analyzes the collected image information and connects the target positions and frequencies of the target object in time sequence into a line, thereby forming the activity trajectory of the target object. By compiling statistics on target position, time, and frequency, the activity density of the target object can be determined. From the activity trajectory and the activity density of the target object, its activity area and living habits become clear: for example, within a given time range, during which period the target object is most active and in which areas it prefers to move, which facilitates subsequent countermeasures against the target object.
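The accumulation of target position, time, and frequency described in steps 1050 and 1060 can be sketched as follows. This is an illustrative sketch only; the detection records and grid coordinates are hypothetical and not part of the disclosed system.

```python
from collections import defaultdict

# Sketch of steps 1050/1060: accumulate detections (grid position, timestamp)
# into an activity trajectory (positions in time order) and a per-cell
# appearance-frequency map. All data below are hypothetical.
detections = [((2, 3), 0), ((2, 4), 1), ((3, 4), 2), ((2, 3), 3)]

# Activity trajectory: positions connected in time order.
trajectory = [pos for pos, _ in sorted(detections, key=lambda d: d[1])]

# Frequency map: how often the target appeared at each cell.
frequency = defaultdict(int)
for pos, _ in detections:
    frequency[pos] += 1

print(trajectory)         # [(2, 3), (2, 4), (3, 4), (2, 3)]
print(frequency[(2, 3)])  # 2: this cell sees the most activity
```

Connecting the positions in `trajectory` in order yields the activity trajectory; normalizing the `frequency` map by observation time and cell area yields activity density information.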
It is understood that in some other embodiments of the present application, the target object is a pest, and the activity density information of the target object includes the pest density, the peak pest density, the average pest density, the peak pest count, the continuous pest activity time, the peak continuous pest activity time, the total pest count-time, the pest density distribution map, and the like.
The pest density is the density of pest appearances per unit area per unit time:

ρ_i = N_i / (Δt · Δs)

where N_i is the number of pests in the unit time, in individuals; Δt is the unit time, in a time unit such as minutes or hours; Δs is the unit area, in an area unit such as square meters or square centimeters; and ρ_i is the pest density, with a typical unit of individuals/(m²·h). This value keeps changing over time.
The peak pest density is the maximum of the pest appearance density per unit area per unit time: ρ_max = MAX{ρ_i}, where ρ_i is the pest density monitored in the i-th unit time over the unit area, and ρ_max is the peak pest density within a certain time period or a certain area.
The average pest density is the average of the pest appearance density per unit area per unit time:

ρ_avg = (1/n) · Σ_{i=1}^{n} ρ_i

where ρ_i is the pest density monitored in the i-th unit time over the unit area, n is the number of unit time periods within the overall period, and ρ_avg is the average pest density within a certain time period or a certain area.
The peak pest count is the maximum number of pest appearances per unit area per unit time: N_max = MAX{N_i}, where N_i is the number of pests in the unit area at a given moment, in individuals, and N_max is the peak pest count in the unit area over a certain time period, in individuals.
The continuous pest activity time is the sum of the pest activity times within the field of view per unit time:

T_total = Σ_{i=1}^{n} T_i

where T_i is the i-th continuous activity time, in a time unit such as minutes or seconds, and T_total is the total activity time. The unit time is generally 24 hours; whenever pests are seen to be active within the field of view, the activity time is accumulated to obtain the total time.
The peak continuous pest activity time is the longest continuous pest activity time within the field of view per unit time: T_max = MAX{T_i}, where T_i is the i-th continuous activity time, in a time unit such as minutes or seconds, and T_max is the peak continuous pest activity time within the unit time. The unit time is generally 24 hours.
The total pest count-time is the sum, within the field of view per unit time, of the number of pests multiplied by the activity time:

NT_total = Σ_{i=1}^{n} N_i · Δt,  with n = T/Δt

where Δt is the unit time, in a time unit such as minutes or hours; N_i is the number of pests at a given moment, in individuals; T is the unit time, generally 24 hours; n is the number of unit time periods within the overall period; and NT_total is the total pest count-time within the unit time, in individual-hours. See FIG. 11.
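The activity-density metrics defined above can be computed directly from their formulas. The following is a minimal sketch; the function names and the sample observations are illustrative assumptions, not part of the original disclosure.

```python
# Illustrative implementations of the pest-activity metrics defined above.
# Sample data: pest counts in four 1-hour intervals over a 10 m^2 area.

def pest_density(n_i, dt_hours, ds_m2):
    """rho_i = N_i / (dt * ds), in individuals/(m^2 * h)."""
    return n_i / (dt_hours * ds_m2)

def peak_density(densities):
    """rho_max = MAX{rho_i}."""
    return max(densities)

def average_density(densities):
    """rho_avg = (1/n) * sum(rho_i)."""
    return sum(densities) / len(densities)

def peak_count(counts):
    """N_max = MAX{N_i}."""
    return max(counts)

def total_activity_time(intervals):
    """T_total = sum(T_i)."""
    return sum(intervals)

def peak_activity_time(intervals):
    """T_max = MAX{T_i}."""
    return max(intervals)

def total_count_time(counts, dt_hours):
    """NT_total = sum(N_i * dt), in individual-hours."""
    return sum(n * dt_hours for n in counts)

counts = [3, 7, 2, 0]  # hypothetical N_i per interval
densities = [pest_density(n, 1.0, 10.0) for n in counts]
print(peak_density(densities))        # 0.7 individuals/(m^2*h)
print(total_count_time(counts, 1.0))  # 12.0 individual-hours
```

In practice these quantities would be accumulated from the detection log over a 24-hour unit time, as described above.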
The pest density distribution map represents, in graphical form, the density of pest appearances within the field of view per unit time. Each pixel value of the density distribution map reflects the number of pest appearances per unit time at the corresponding location in the area; see FIG. 12a and FIG. 12b.
In the embodiment of the present application, the target area image is obtained through the camera and input into the preset deep neural network model to obtain the first recognition result; the second recognition result is then obtained based on the target area image and the preset reference image; the target object and the target position are determined according to the first and second recognition results; the relationship between target position, time, and frequency is then obtained, and the activity trajectory and/or activity density of the target object is determined from this relationship. In this way, the pest density can be monitored accurately.
In some embodiments of the present application, as shown in FIG. 13, the method further includes:

Step 1310: Obtain sample images of the target area and the target objects, label the target objects in the sample images, and generate labeling information.
In the embodiment of the present application, the target object is a target bounding box in the sample image of the target area, the target bounding box including information such as the coordinate position and the center position of the target. After a large number of sample images of the target area have been obtained, each sample image needs to be labeled. Specifically, when a candidate target is confirmed to be a real target, the system completes the labeling automatically; when a candidate target is a suspected target, it is judged whether it is a target: if so, the labeling is completed, and if not, the data is discarded.
Step 1320: Input the labeled images into a deep neural network model for training to obtain the preset deep neural network model.

When all sample images have been labeled, the model is trained with the labeled sample images, so as to improve the accuracy of the model training and hence the accuracy of the density monitoring. The more sample images there are, the more situations are covered, and the stronger the recognition capability of the deep neural network model.
In some other embodiments of the present application, as shown in FIG. 14, the preset deep neural network model contains multiple convolutional and pooling layers. The obtained target area image is input into the preset deep neural network model: the input target area image passes through convolution-pooling layer 1 to obtain intermediate result 1, and then through convolution-pooling layer 2 to obtain intermediate result 2; intermediate result 2 passes through convolution-pooling layer 4 to obtain intermediate result 4; intermediate result 1 and intermediate result 4 are fused to obtain a fusion result; the fusion result passes through convolution-pooling layer 5 to obtain intermediate result 5; intermediate result 2 passes through convolution-pooling layer 3 to obtain intermediate result 3; and intermediate result 3 and intermediate result 5 are fused to obtain the final result, namely the first recognition result.
It is understood that in some other embodiments of the present application, the obtained target area image samples may be integrated into an image sample set, which is then divided into a training sample set and a test sample set; the training sample set is used to train the deep neural network model, and the test sample set is used to test the trained deep neural network model.

Specifically, each picture in the training sample set is input into the deep neural network model, which is trained automatically on the pictures in the training sample set to obtain the trained deep neural network model.
Each picture in the test sample set is input into the trained deep neural network model, which recognizes each input picture to obtain the corresponding recognition result; the recognition results of all images are integrated into a recognition result set. The test recognition rate can be determined from the number of target objects in the recognition result set and the number of target objects in the test sample set, and it measures the recognition capability of the trained deep neural network model. If the test recognition rate reaches a preset threshold, the recognition capability of the trained model meets expectations, and the trained deep neural network model can be used directly for image recognition. Otherwise, the parameters of the deep neural network model are adjusted continuously and the model is trained again until its recognition rate reaches the preset threshold.
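The acceptance criterion just described, i.e. comparing the test recognition rate against a preset threshold, can be sketched as follows; the function names, the 95% threshold, and the sample counts are hypothetical.

```python
# Sketch of the model-acceptance criterion: the test recognition rate is the
# ratio of correctly recognized targets to targets in the test sample set;
# the model is accepted once this rate reaches a preset threshold.
# All names and values are hypothetical.

def recognition_rate(recognized, ground_truth):
    return recognized / ground_truth if ground_truth else 0.0

def model_accepted(recognized, ground_truth, threshold=0.95):
    return recognition_rate(recognized, ground_truth) >= threshold

print(model_accepted(96, 100))  # True: 0.96 >= 0.95, use the model
print(model_accepted(90, 100))  # False: adjust parameters and retrain
```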
In some embodiments of the present application, referring to FIG. 15 and FIG. 16 together, obtaining the second recognition result based on the target area image and the preset reference image includes:

Step 1510: Obtain the image of the portion of the target area image that has changed relative to the preset reference image, and convert the changed-portion image into a grayscale image.
First, the obtained target area image is compared with the preset reference image to obtain the changed portion of the target area image, and the changed-portion image is converted into a grayscale image; the grayscale conversion removes the color information from the image and reduces the amount of subsequent computation. For example, a grayscale image can be obtained by averaging the values of the three RGB channels at each pixel position, or by averaging the maximum and minimum brightness among the RGB channels at each pixel position; the method of converting the changed-portion image into a grayscale image is not limited to these two.
Step 1520: Perform noise filtering and threshold segmentation on the grayscale image to obtain a binarized image, and obtain the connected regions in the binarized image through a connected-domain algorithm.

During generation and transmission, an image is disturbed and affected by various kinds of noise for various reasons, which degrades the image quality and affects subsequent image processing and analysis. The grayscale image therefore needs noise filtering; the noise can be filtered in various ways, such as linear filtering, threshold averaging, weighted averaging, and template smoothing, to obtain the filtered binarized image. The pixels of the filtered binarized image are divided into several classes, the target points of interest are found among them, and the foreground pixels among the points of interest that have the same pixel value and adjacent positions form the connected regions.
Step 1530: Obtain the potential contours of the target objects according to the connected regions.

By forming connected regions from the foreground pixels among the target points of interest that have the same pixel value and adjacent positions, the potential contour of each target can be obtained from the connected regions.
Step 1540: Perform morphological operations on the potential contours of the target objects to obtain the second recognition result, the second recognition result including the second target object in the target area image, the second probability corresponding to the second target object, and the second target position of the second target object in the target area image.

In the embodiment of the present application, in order to identify the potential contour of the target object, morphological operations are performed on it; these include dilation and filling of closed regions. Specifically, pixels may be added to the boundary of the target object in the image, and the holes in the feature contour map of the target object are filled, so as to obtain the second target object, the second probability corresponding to the second target object, and the second target position of the second target object in the target area image.
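A minimal, self-contained sketch of the change-detection pipeline of steps 1510 to 1540 is given below. A real system would use an image-processing library; here, tiny hypothetical 4x4 "images" and a pure-Python 4-connected flood fill stand in for the connected-domain algorithm, and the morphological refinement is omitted.

```python
# Sketch of steps 1510-1520: difference against a reference image, channel-
# average grayscale conversion, thresholding, and connected components.
# All data are hypothetical toy arrays, not real camera frames.

def to_gray(rgb_img):
    # Channel-average grayscale, one of the methods mentioned above.
    return [[sum(px) / 3 for px in row] for row in rgb_img]

def diff_and_threshold(gray, ref, th=30):
    # Binarize: 1 where the scene changed relative to the reference image.
    return [[1 if abs(g - r) > th else 0 for g, r in zip(gr, rr)]
            for gr, rr in zip(gray, ref)]

def connected_components(binary):
    # 4-connected flood fill; returns one pixel set per connected region.
    h, w = len(binary), len(binary[0])
    seen, blobs = set(), []
    for i in range(h):
        for j in range(w):
            if binary[i][j] and (i, j) not in seen:
                stack, blob = [(i, j)], set()
                while stack:
                    y, x = stack.pop()
                    if (y, x) in seen or not (0 <= y < h and 0 <= x < w) \
                            or not binary[y][x]:
                        continue
                    seen.add((y, x))
                    blob.add((y, x))
                    stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
                blobs.append(blob)
    return blobs

ref_gray = [[10.0] * 4 for _ in range(4)]       # reference background
frame = [[(10, 10, 10)] * 4 for _ in range(4)]
frame[1][1] = frame[1][2] = (200, 200, 200)     # a moving target appears
binary = diff_and_threshold(to_gray(frame), ref_gray)
print(len(connected_components(binary)))        # 1 connected region found
```

Each connected region found this way yields a potential contour, on which the dilation and hole-filling described above would then be applied.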
In some embodiments of the present application, as shown in FIG. 17, obtaining the target object and the target position in the target area image according to the first recognition result and the second recognition result includes:

Step 1710: Compare the first probability with the second probability.

In the embodiment of the present application, the first probability is the probability with which the target object is recognized by the deep neural network model, and the second probability is the probability with which the target object is obtained by image processing. By comparing the first probability with the second probability, the target object can be determined.
Step 1720: If the first probability is greater than the second probability, and the first probability is greater than or equal to a preset probability threshold, take the first target object as the target object and the first target position as the target position.

The preset probability threshold serves as the criterion for judging the target object and may be set in advance. If the target probability recognized by the deep neural network model, i.e. the first probability, is greater than the target probability obtained by image processing, i.e. the second probability, and the first probability is greater than or equal to the preset probability threshold, then the first target object recognized by the deep learning network model is taken as the target object, and its position as the target position. For example, if the preset probability threshold is 60%, the first probability is 70%, and the second probability is 40%, then the first probability of 70% is greater than the second probability of 40% and greater than the preset probability threshold of 60%, so the first target object is taken as the target object and the first target position as the target position.
Step 1730: If the first probability is less than the second probability, and the second probability is greater than or equal to the preset probability threshold, take the second target object as the target object and the second target position as the target position.

Specifically, if the target probability recognized by the deep neural network model, i.e. the first probability, is less than the target probability obtained by image processing, i.e. the second probability, and the second probability is greater than or equal to the preset probability threshold, then the second target object obtained by image processing is taken as the target object, and its position as the target position. For example, if the preset probability threshold is 60%, the first probability is 20%, and the second probability is 80%, then the first probability of 20% is less than the second probability of 80%, and the second probability of 80% is greater than the preset probability threshold of 60%, so the second target object is taken as the target object and the second target position as the target position.
步骤1740,如果所述第一概率和所述第二概率均小于预设概率阈值,但所述第一概率和所述第二概率之和大于预设第二概率阈值,则作为疑似目标。Step 1740: If the first probability and the second probability are both less than the preset probability threshold, but the sum of the first probability and the second probability is greater than the preset second probability threshold, then the target is regarded as a suspected target.
具体地,预设概率为60%,预设第二概率为55%,第一概率为40%,第二概率为18%。可知,第一概括40%和第二概率18%均小于预设概率阈值60%,但第一概率40%和第二概率18%之和58%,大于预设第二阈值55%,则将所述目标作为疑似目标。Specifically, the preset probability is 60%, the preset second probability is 55%, the first probability is 40%, and the second probability is 18%. It can be seen that the first summary 40% and the second probability 18% are both less than the preset probability threshold 60%, but the sum of the first probability 40% and the second probability 18% is 58%, which is greater than the preset second threshold 55%. The target is regarded as a suspected target.
步骤1750,如果所述第一概率和所述第二概率均小于预设概率阈值,且所述第一概率和所述第二概率之和小于预设第二概率阈值,则放弃所述第一识别结果和所述第二识别结果。 Step 1750, if the first probability and the second probability are both less than the preset probability threshold, and the sum of the first probability and the second probability is less than the preset second probability threshold, then discard the first probability The recognition result and the second recognition result.
通过深度神经网络模型识别得到的目标概率即第一概率和通过图像识别得到的目标概率即第二概率均小于预设概率阈值,且所述第一概率和所述第二概率之和均小于预设第二概率阈值,则说明识别结果不准确,放弃通过深度神经网络模型得到的第一识别结果即第一目标对象、第一目标对象的位置以及此处识别的第一概率,同时放弃通过图像处理得到的第二识别结果即第二目标对象、第二目标对象的位置以及此次识别的第二概率。例如,预设概率阈值为60%,预设第二概率阈值为55%,通过深度神经网络模型识别得到的第一目标对象的概率为40%,通过图像处理得到的第二目标对象的概率为10%,第一概率40%和第二概率10%均小于60%,且第一概率和第二概率之和50%小于预设第二概率阈值55%,则放弃第一识别结果和第二识别结果。需要说明的是,上述预设的概率 阈值,可以根据实际需要自行设置,无需拘泥于本实施例中的限定。The target probability recognized by the deep neural network model, that is, the first probability, and the target probability obtained by image recognition, that is, the second probability are both less than the preset probability threshold, and the sum of the first probability and the second probability is less than the preset probability. If the second probability threshold is set, the recognition result is inaccurate. The first recognition result obtained by the deep neural network model is the first target object, the position of the first target object, and the first probability recognized here. At the same time, it gives up passing the image The second recognition result obtained by the processing is the second target object, the position of the second target object, and the second probability of this recognition. For example, the preset probability threshold is 60%, the preset second probability threshold is 55%, the probability of the first target object recognized by the deep neural network model is 40%, and the probability of the second target object obtained by image processing is 10%, the first probability 40% and the second probability 10% are both less than 60%, and the sum of the first probability and the second probability 50% is less than the preset second probability threshold 55%, then the first recognition result and the second probability are discarded Recognition results. It should be noted that the above-mentioned preset probability threshold can be set according to actual needs, and does not need to be restricted to the limitation in this embodiment.
Correspondingly, an embodiment of the present application also provides a distributed target monitoring method, as shown in FIG. 18 and FIG. 19, which is applied to the detection alarm device and executed by the central processing unit in the detection alarm device. The method includes:

Step 1810: Acquire thermal infrared signals within the sensing range.

In this embodiment of the present application, the detection alarm devices are placed at various corners of a room. The central processing unit in the detection alarm device acquires the thermal infrared signal sent by the infrared detection unit. The infrared detection unit may be, for example, an infrared sensor, and there may be one or more of them; refer to the system embodiment for details.

Step 1820: Determine the relationship between target position, time, and frequency according to the thermal infrared signal.

In this embodiment, the thermal infrared signal is a characteristic signal produced by the target object within the sensing range. Because the target object may be located at different positions at different times, and may be active a different number of times at different times, the infrared detection unit continuously collects, around the clock, the positions where the target object appears at each time point and the number of appearances at each time point, thereby determining the relationship between target position, time, and frequency.

Step 1830: Determine the activity trajectory of the target object and/or the activity density information of the target object according to the relationship between target position, time, and frequency.

In this embodiment, the infrared detection unit continuously performs infrared sensing on the target area and sends the sensed thermal infrared signals to the central processing unit. The central processing unit connects the target positions and frequencies in time sequence into a line, forming the activity trajectory of the target object; by aggregating the target positions, times, and frequencies, the activity density of the target object can be determined. From the activity trajectory and activity density, the activity areas and living habits of the target object can be clearly known. For example, within a given time range one can determine the period during which the target object is most active and the areas in which it prefers to move, which facilitates subsequent countermeasures against the target object.
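Step 1830 amounts to sorting timestamped detections into a trajectory and counting visits per position. A minimal sketch, assuming detections arrive as (timestamp, position) tuples (a data layout chosen here for illustration):

```python
from collections import Counter

def build_trajectory(detections):
    """Order detections by time; the ordered positions form the trajectory."""
    ordered = sorted(detections, key=lambda d: d[0])
    return [pos for _, pos in ordered]

def activity_counts(detections):
    """Count how often each position was visited (a simple density measure)."""
    return Counter(pos for _, pos in detections)

events = [(3, "corner_B"), (1, "corner_A"), (2, "corner_B")]
print(build_trajectory(events))            # ['corner_A', 'corner_B', 'corner_B']
print(activity_counts(events)["corner_B"])  # 2
```

The most-visited positions in the counter correspond to the areas where the target object is most active.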
It can be understood that when the target object is a pest, the activity density information of the target object includes: the pest density at a detection alarm device, the pest density of a detection alarm device region, the average pest density of a detection alarm device region, the pest density distribution map of a detection alarm device region, and so on.

The pest density at a detection alarm device is the number of pest detections per unit time at that device: ρ = N/T, where N is the total number of pests detected during the unit time (in individuals), T is the unit time (in hours, days, or another time unit), and ρ is the pest density, typically in individuals per day.
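The per-device formula ρ = N/T is a one-line computation; the helper below simply mirrors it with the units used in the text (N in individuals, T in days, ρ in individuals per day):

```python
def device_density(n_detected, unit_time):
    """Pest density at one device: detections per unit time (rho = N / T)."""
    return n_detected / unit_time

print(device_density(12, 3))  # 4.0 individuals per day
```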
The regional pest density of the detection alarm devices is the number of pest detections per unit area per unit time:

ρ_total = Σ_{i=1}^{n} ρ_i,  with ρ_i = N_i / T,

where N_i is the total number of pests detected by the i-th device during the unit time (in individuals); T is the unit time (in hours, days, or another time unit); ρ_i is the pest density monitored by the i-th device, typically in individuals per day; n is the number of detection alarm devices in the unit area; and ρ_total is the pest density of the region over the given period, typically in individuals per day.
The regional average pest density of the detection alarm devices is the average, over the devices in a unit area, of the pest detection density per unit time:

ρ̄ = (1/n) Σ_{i=1}^{n} ρ_i,  with ρ_i = N_i / T,

where N_i is the total number of pests detected by the i-th device during the unit time (in individuals); T is the unit time (in hours, days, or another time unit); ρ_i is the pest density monitored by the i-th device, typically in individuals per day; n is the number of detection alarm devices in the unit area; and ρ̄ is the average pest density of the region over the given period, typically in individuals per day.
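The regional total and average densities defined above can be computed together from the per-device counts. The list-of-counts input layout is an assumption for illustration:

```python
def regional_densities(counts, unit_time):
    """Per-device densities rho_i = N_i/T, their sum (regional density),
    and their mean (regional average density).

    counts: list of per-device detection totals N_i over unit_time.
    """
    per_device = [n / unit_time for n in counts]
    total = sum(per_device)            # rho_total = sum of rho_i
    average = total / len(per_device)  # rho_bar = (1/n) * sum of rho_i
    return per_device, total, average

per_dev, total, avg = regional_densities([10, 20, 30], unit_time=10)
print(per_dev)  # [1.0, 2.0, 3.0]
print(total)    # 6.0
print(avg)      # 2.0
```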
The regional pest density distribution map of the detection alarm devices presents the density of pest occurrences per unit area per unit time in graphical form. The value at each sensing position in the density distribution map reflects the number of pest occurrences per unit time at that position.
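The density distribution map described above is, in essence, a grid of per-position detection counts divided by the observation time. A minimal sketch; the grid dimensions and the (x, y) event layout are illustrative assumptions:

```python
def density_map(events, width, height, unit_time):
    """Occurrences per unit time at each cell of a width x height grid.

    events: list of (x, y) detection coordinates on the grid.
    """
    grid = [[0.0] * width for _ in range(height)]
    for x, y in events:
        grid[y][x] += 1.0
    return [[cell / unit_time for cell in row] for row in grid]

grid = density_map([(0, 0), (0, 0), (1, 1)], width=2, height=2, unit_time=2)
print(grid)  # [[1.0, 0.0], [0.0, 0.5]]
```

Each cell value is exactly the quantity the text describes: the number of pest occurrences per unit time at that sensing position.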
In this embodiment of the present application, the detection alarm device acquires thermal infrared signals within its sensing range, determines the relationship between target position, time, and frequency from those signals, and determines the activity trajectory of the target object and/or the activity density information of the target object from that relationship, so that the density of pests can be monitored accurately.
Correspondingly, as shown in FIG. 20, an embodiment of the present application also provides a distributed target monitoring device 2000. The device 2000 includes:

a first acquisition module 2010, configured to acquire a target area image;

an input module 2020, configured to input the target area image into a preset deep neural network model to obtain a first recognition result;

a second acquisition module 2030, configured to obtain a second recognition result based on the target area image and a preset reference image;

a third acquisition module 2040, configured to obtain the target object and the target position in the target area image according to the first recognition result and the second recognition result;

a fourth acquisition module 2050, configured to acquire the relationship between the target position, time, and frequency; and

a determining module 2060, configured to determine the activity trajectory of the target object and/or the activity density information of the target object according to the relationship between the target position, time, and frequency.
In the distributed target monitoring device provided by this embodiment of the application, the first acquisition module acquires a target area image; the input module inputs the acquired image into the preset deep neural network model to obtain the first recognition result; the second acquisition module obtains the second recognition result based on the target area image and the preset reference image; the third acquisition module obtains the target object and the target position in the target area image from the first and second recognition results; the fourth acquisition module acquires the relationship between the target position, time, and frequency; and finally, the determining module determines the activity trajectory and/or activity density of the target from that relationship, so that the pest density can be monitored accurately.
Optionally, in other embodiments of the device, as shown in FIG. 20, the device 2000 further includes:

a labeling module 2070, configured to acquire sample images of the target area and target objects, label the target objects in the sample images, and generate labeling information; and

a training module 2080, configured to input the labeled images into a deep neural network model for training, yielding the preset deep neural network model.
Optionally, in other embodiments of the device, the input module 2020 is specifically configured such that the preset deep neural network model includes several convolutional layers and pooling layers, and the first recognition result includes the first target object in the target area image, the first probability corresponding to the first target object, and the first target position of the first target object in the target area image.
Optionally, in other embodiments of the device, the second acquisition module 2030 is specifically configured to:

acquire the image of the changed part of the target area image relative to the preset reference image, and convert the changed part image into a grayscale image;

perform noise filtering and threshold segmentation on the grayscale image to obtain a binarized image, and obtain the connected regions in the binarized image through a connected-component algorithm;

acquire potential contours of the target object according to the connected regions; and

perform morphological operations on the potential contours of the target object to obtain a second recognition result, the second recognition result including the second target object in the target area image, the second probability corresponding to the second target object, and the second target position of the second target object in the target area image.
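The core of this second-recognition pipeline, differencing against the reference image, thresholding to a binary mask, and extracting connected regions, can be sketched without any image library. A real implementation would typically use one (e.g. OpenCV); the 2-D list representation, the threshold value, and the 4-connectivity below are illustrative assumptions:

```python
from collections import deque

def binarize_diff(image, reference, threshold=30):
    """Absolute per-pixel difference against the reference, thresholded to 0/1."""
    return [[1 if abs(p - r) > threshold else 0
             for p, r in zip(img_row, ref_row)]
            for img_row, ref_row in zip(image, reference)]

def connected_components(mask):
    """4-connected components of a binary mask, each returned as a pixel list."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    components = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                comp, queue = [], deque([(y, x)])
                seen[y][x] = True
                while queue:  # breadth-first flood fill of one region
                    cy, cx = queue.popleft()
                    comp.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                components.append(comp)
    return components
```

Each component's pixels give a potential contour of a target object, on which the morphological operations described above would then be applied.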
Optionally, in other embodiments of the device, the third acquisition module 2040 is specifically configured to:

compare the first probability and the second probability;

if the first probability is greater than the second probability, and the first probability is greater than or equal to a preset probability threshold, take the first target object as the target object and the first target position as the target position;

if the first probability is less than the second probability, and the second probability is greater than or equal to the preset probability threshold, take the second target object as the target object and the second target position as the target position;

if the first probability and the second probability are both less than the preset probability threshold, but the sum of the first probability and the second probability is greater than a preset second probability threshold, treat the object as a suspected target; and

if the first probability and the second probability are both less than the preset probability threshold, and the sum of the first probability and the second probability is less than the preset second probability threshold, discard the first recognition result and the second recognition result.
It should be noted that the above distributed target monitoring device can execute the distributed target monitoring method provided in the embodiments of the present application, and has the functional modules and beneficial effects corresponding to the executed method. For technical details not described in detail in the device embodiment, refer to the distributed target monitoring method provided in the embodiments of the present application.
FIG. 21 is a schematic diagram of the hardware structure of the control unit in the image detection device provided by an embodiment of the present application. As shown in FIG. 21, the control unit 2100 includes:

one or more processors 2110 and a memory 2120; in FIG. 21, one processor 2110 is taken as an example.

The processor 2110 and the memory 2120 may be connected by a bus or in other ways; in FIG. 21, a bus connection is taken as an example.

As a non-volatile computer-readable storage medium, the memory 2120 can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/modules corresponding to the distributed target monitoring method in the embodiments of the present application (for example, the first acquisition module 2010, the input module 2020, the second acquisition module 2030, the third acquisition module 2040, the fourth acquisition module 2050, and the determining module 2060 shown in FIG. 20). By running the non-volatile software programs, instructions, and modules stored in the memory 2120, the processor 2110 executes the various functional applications and data processing of the image detection device, that is, implements the distributed target monitoring method of the foregoing method embodiments.

The memory 2120 may include a program storage area and a data storage area, where the program storage area may store an operating system and the application programs required by at least one function, and the data storage area may store data created according to the use of the distributed target monitoring device, and so on. In addition, the memory 2120 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some embodiments, the memory 2120 may optionally include memories arranged remotely from the processor 2110, and these remote memories may be connected to the distributed target monitoring device through a network. Examples of such networks include, but are not limited to, the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof.

The one or more modules are stored in the memory 2120, and when executed by the one or more control units 2100, they execute the distributed target monitoring method in any of the foregoing method embodiments, for example, executing the above-described method steps 1010 to 1060 in FIG. 10, method steps 1310 to 1320 in FIG. 13, method steps 1510 to 1540 in FIG. 15, and method steps 1710 to 1750 in FIG. 17, and implementing the functions of the modules 2010 to 2080 in FIG. 20.
The device embodiments described above are merely illustrative. The units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; that is, they may be located in one place, or they may be distributed across multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.

From the description of the above implementations, a person of ordinary skill in the art can clearly understand that each implementation can be realized by means of software plus a general-purpose hardware platform, and of course also by hardware. A person of ordinary skill in the art can understand that all or part of the processes in the methods of the foregoing embodiments can be completed by a computer program instructing the relevant hardware. The program may be stored in a computer-readable storage medium, and when executed, may include the processes of the foregoing method embodiments. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.

Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present application, not to limit them. Within the spirit of the present application, the technical features in the above embodiments or in different embodiments may also be combined, the steps may be implemented in any order, and many other variations of the different aspects of the present application as described above exist; for brevity, they are not provided in detail. Although the present application has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that they may still modify the technical solutions recorded in the foregoing embodiments, or equivalently replace some of the technical features therein, and that these modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.

Claims (12)

  1. A distributed target monitoring system, characterized in that the system comprises at least one detection alarm device, an image detection device, and a remote monitoring terminal, the image detection device being connected to the detection alarm device and the remote monitoring terminal respectively;

    the detection alarm device comprises an infrared detection unit, a central processing unit, and an alarm unit, the central processing unit being electrically connected to the infrared detection unit and the alarm unit respectively;

    the infrared detection unit is configured to detect thermal infrared signals within a sensing range, and the central processing unit is configured to receive the thermal infrared signals and control the alarm unit to issue an alarm signal according to the thermal infrared signals;

    the image detection device comprises an image acquisition unit and a control unit that are electrically connected, the image acquisition unit being configured to acquire a target area image, and the control unit being configured to receive and process the target area image, and to control the central processing unit according to the target area image, so that the central processing unit controls the alarm unit to issue an alarm signal;

    the remote monitoring terminal comprises an image display unit and a remote control unit that are electrically connected, the remote control unit being configured to obtain the target area image acquired by the image detection device, display the target area image on the image display unit, and control the central processing unit, so that the central processing unit controls the alarm unit to issue an alarm signal.
  2. The distributed target monitoring system according to claim 1, characterized in that the infrared detection unit is any one of an infrared sensor, a pyroelectric sensor, and an infrared thermal sensor.

  3. The distributed target monitoring system according to claim 1 or 2, characterized in that the alarm unit comprises a light generator for generating a light source and a sound generator for generating sound, the sound generator and the light generator both being electrically connected to the central processing unit.

  4. The distributed target monitoring system according to claim 3, characterized in that the detection alarm device further comprises a photosensitive unit;

    the photosensitive unit is electrically connected to the central processing unit and is configured to transmit illumination information to the central processing unit, so that the central processing unit controls the operation of the infrared detection unit and the alarm unit according to the illumination information.

  5. The distributed target monitoring system according to claim 4, characterized in that the detection alarm device, the image detection device, and the remote monitoring terminal are each provided with a wireless communication unit.

  6. The distributed target monitoring system according to claim 5, characterized in that the system further comprises a cloud server;

    the cloud server is connected to the image detection device and is configured to receive the target area image, the state information of the image detection device itself, and the state information of the detection alarm device sent by the image detection device.
  7. A distributed target monitoring method applied to an image detection device, characterized in that the method comprises:

    acquiring a target area image;

    inputting the target area image into a preset deep neural network model to obtain a first recognition result;

    obtaining a second recognition result based on the target area image and a preset reference image;

    fusing the first recognition result and the second recognition result to obtain the target object and the target position in the target area image;

    acquiring the relationship between the target position, time, and frequency; and

    determining the activity trajectory of the target object and/or the activity density information of the target object according to the relationship between the target position, time, and frequency.

  8. The method according to claim 7, characterized in that the method further comprises:

    acquiring sample images of the target area and target objects, labeling the target objects in the sample images, and generating labeling information; and

    inputting the labeled images into a deep neural network model for training to obtain the preset deep neural network model.
  9. The method according to claim 7 or 8, characterized in that the preset deep neural network model comprises several convolutional layers and pooling layers, and the first recognition result comprises the first target object in the target area image, the first probability corresponding to the first target object, and the first target position of the first target object in the target area image.

  10. The method according to claim 9, characterized in that obtaining the second recognition result based on the target area image and the preset reference image comprises:

    acquiring the image of the changed part of the target area image relative to the preset reference image, and converting the changed part image into a grayscale image;

    performing noise filtering and threshold segmentation on the grayscale image to obtain a binarized image, and obtaining the connected regions in the binarized image through a connected-component algorithm;

    acquiring potential contours of the target object according to the connected regions; and

    performing morphological operations on the potential contours of the target object to obtain the second recognition result, the second recognition result comprising the second target object in the target area image, the second probability corresponding to the second target object, and the second target position of the second target object in the target area image.
  11. The method according to claim 10, wherein obtaining the target object and the target position in the target area image according to the first recognition result and the second recognition result comprises:
    comparing the first probability with the second probability;
    if the first probability is greater than the second probability, and the first probability is greater than or equal to a preset probability threshold, taking the first target object as the target object and the first target position as the target position;
    if the first probability is less than the second probability, and the second probability is greater than or equal to the preset probability threshold, taking the second target object as the target object and the second target position as the target position;
    if both the first probability and the second probability are less than the preset probability threshold, but their sum is greater than a preset second probability threshold, treating the result as a suspected target;
    if both the first probability and the second probability are less than the preset probability threshold, and their sum is less than the preset second probability threshold, discarding the first recognition result and the second recognition result.
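The decision rule of claim 11 amounts to a small fusion function over the two recognizers' outputs. The sketch below assumes each result is an `(object, probability, position)` tuple and uses illustrative threshold values 0.6 and 0.8; the patent does not fix either threshold, nor the behavior when the two probabilities tie above the threshold.

```python
def fuse_results(first, second, p_thresh=0.6, sum_thresh=0.8):
    """first/second: (object, probability, position) tuples from the
    neural-network and change-detection recognizers respectively."""
    obj1, p1, pos1 = first
    obj2, p2, pos2 = second
    if p1 > p2 and p1 >= p_thresh:
        return ("target", obj1, pos1)        # trust the first (DNN) result
    if p1 < p2 and p2 >= p_thresh:
        return ("target", obj2, pos2)        # trust the second (diff) result
    if p1 < p_thresh and p2 < p_thresh:
        if p1 + p2 > sum_thresh:
            # Neither recognizer is confident alone, but together they
            # suggest something is there: flag as a suspected target
            return ("suspected", obj1, pos1)
        return ("discarded", None, None)     # both results are abandoned
    return ("discarded", None, None)         # tie case, not covered by the claim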
  12. A distributed target monitoring method applied to a detection and alarm device, wherein the method comprises:
    acquiring thermal infrared signals within a sensing range;
    determining the relationship among target position, time, and frequency according to the thermal infrared signals;
    determining the activity trajectory of the target object and/or activity density information of the target object according to the relationship among target position, time, and frequency.
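The aggregation in claim 12 can be illustrated with a short sketch. The event format `(timestamp_seconds, position)` and the hourly binning are assumptions for illustration; the patent only requires that position, time, and frequency be related and then summarized into a trajectory and density information.

```python
from collections import Counter, defaultdict

def summarize_pir_events(events, bin_seconds=3600):
    """events: list of (timestamp_seconds, position) pairs from the
    thermal-infrared (PIR) detections of one alarm device."""
    # Activity trajectory: positions visited, in time order
    trajectory = [pos for _, pos in sorted(events)]
    # Position/time/frequency relation: detections per position per time bin
    freq = defaultdict(Counter)
    for t, pos in events:
        freq[int(t // bin_seconds)][pos] += 1
    # Activity density: total detections per position over the whole window
    density = Counter(pos for _, pos in events)
    return trajectory, dict(freq), density
```

In a distributed deployment, each detection and alarm device could compute these summaries locally and report only the compact trajectory and density data upstream rather than raw sensor readings.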
PCT/CN2020/098588 2019-09-30 2020-06-28 Distributed target monitoring system and method WO2021063046A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910942708.1A CN110728810B (en) 2019-09-30 2019-09-30 Distributed target monitoring system and method
CN201910942708.1 2019-09-30

Publications (1)

Publication Number Publication Date
WO2021063046A1 true WO2021063046A1 (en) 2021-04-08

Family

ID=69218728

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/098588 WO2021063046A1 (en) 2019-09-30 2020-06-28 Distributed target monitoring system and method

Country Status (2)

Country Link
CN (1) CN110728810B (en)
WO (1) WO2021063046A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113496444A (en) * 2020-03-19 2021-10-12 杭州海康威视***技术有限公司 Method, device and system for identifying foothold

Citations (6)

Publication number Priority date Publication date Assignee Title
JP2015139204A (en) * 2014-01-24 2015-07-30 国立大学法人岐阜大学 Method for evaluating rat detection system
CN108540773A (en) * 2018-04-12 2018-09-14 云丁网络技术(北京)有限公司 A kind of monitoring method, device, system and Cloud Server
CN109299703A (en) * 2018-10-17 2019-02-01 思百达物联网科技(北京)有限公司 The method, apparatus and image capture device counted to mouse feelings
CN109831634A (en) * 2019-02-28 2019-05-31 北京明略软件***有限公司 The density information of target object determines method and device
CN109922310A (en) * 2019-01-24 2019-06-21 北京明略软件***有限公司 The monitoring method of target object, apparatus and system
CN110235890A (en) * 2019-05-14 2019-09-17 熵康(深圳)科技有限公司 A kind of detection of harmful organism and drive method, apparatus, equipment and system

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
CN102930249A (en) * 2012-10-23 2013-02-13 四川农业大学 Method for identifying and counting farmland pests based on colors and models
CN103793923A (en) * 2014-01-24 2014-05-14 华为技术有限公司 Method and device for acquiring moving object in image
WO2016040960A1 (en) * 2014-09-12 2016-03-17 Appareo System, Llc Non-image-based grain quality sensor
CN204695482U (en) * 2015-06-15 2015-10-07 深圳市尼得科技有限公司 A kind of camera supervised automatic alarm system
CN107103717A (en) * 2017-06-28 2017-08-29 四川亚润科技有限公司 A kind of system for remote monitoring and prewarning
CN107665355B (en) * 2017-09-27 2020-09-29 重庆邮电大学 Agricultural pest detection method based on regional convolutional neural network


Cited By (8)

Publication number Priority date Publication date Assignee Title
CN113436295A (en) * 2021-06-25 2021-09-24 平安科技(深圳)有限公司 Living body breeding monitoring track drawing method, device, equipment and storage medium
CN113436295B (en) * 2021-06-25 2023-09-15 平安科技(深圳)有限公司 Living body culture monitoring track drawing method, device, equipment and storage medium
CN113724240A (en) * 2021-09-09 2021-11-30 中国海洋大学 Refrigerator caster detection method, system and device based on visual identification
CN113724240B (en) * 2021-09-09 2023-10-17 中国海洋大学 Refrigerator caster detection method, system and device based on visual identification
CN115063940A (en) * 2022-06-06 2022-09-16 中国银行股份有限公司 Risk monitoring method and device, storage medium and electronic equipment
CN115063940B (en) * 2022-06-06 2024-02-09 中国银行股份有限公司 Risk monitoring method and device, storage medium and electronic equipment
CN117706045A (en) * 2024-02-06 2024-03-15 四川省德阳生态环境监测中心站 Combined control method and system for realizing atmospheric ozone monitoring equipment based on Internet of things
CN117706045B (en) * 2024-02-06 2024-05-10 四川省德阳生态环境监测中心站 Combined control method and system for realizing atmospheric ozone monitoring equipment based on Internet of things

Also Published As

Publication number Publication date
CN110728810B (en) 2021-08-17
CN110728810A (en) 2020-01-24

Similar Documents

Publication Publication Date Title
WO2021063046A1 (en) Distributed target monitoring system and method
JP2021514548A (en) Target object monitoring methods, devices and systems
EP3107382B1 (en) Object detection systems
CN109886999B (en) Position determination method, device, storage medium and processor
JP2020038845A (en) Detection lighting system and method for characterizing lighting space
US10195008B2 (en) System, device and method for observing piglet birth
US9740921B2 (en) Image processing sensor systems
CN103489006A (en) Computer vision-based rice disease, pest and weed diagnostic method
CN109299703A (en) The method, apparatus and image capture device counted to mouse feelings
CN109886555A (en) The monitoring method and device of food safety
US20220270238A1 (en) System, device, process and method of measuring food, food consumption and food waste
CA3196344A1 (en) Rail feature identification system
KR101944374B1 (en) Apparatus and method for detecting abnormal object and imaging device comprising the same
CN108829762A (en) The Small object recognition methods of view-based access control model and device
US11532153B2 (en) Splash detection for surface splash scoring
CN109831634A (en) The density information of target object determines method and device
Vasconcelos et al. Counting mosquitoes in the wild: An internet of things approach
US20220067377A1 (en) Device for managing environment of breeding farm
Bhattacharya et al. Arrays of single pixel time-of-flight sensors for privacy preserving tracking and coarse pose estimation
KR20190143518A (en) Apparatus and method for determining abnormal object
KR20190103510A (en) Imaging device, apparatus and method for managing of fowl comprising the same
KR20200009530A (en) System and method for detecting abnormal object
KR102505691B1 (en) Apparatus and method for detecting abnormal object and imaging device comprising the same
KR102492066B1 (en) Mobile preventive warning system
US20230172489A1 (en) Method And A System For Monitoring A Subject

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20870577

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20870577

Country of ref document: EP

Kind code of ref document: A1
