CN117706544A - Intelligent environment-friendly remote monitoring system - Google Patents


Info

    • Publication number: CN117706544A (application CN202410159732.9A); granted as CN117706544B
    • Authority: CN (China)
    • Original language: Chinese (zh)
    • Prior art keywords: information, data, image acquisition, acquisition module, pose
    • Legal status: Active; granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to its accuracy)
    • Inventors: 王艳, 焦洁, 牛国友, 张辉, 刘根立, 刘负贞, 陈灵
    • Original and current assignee: New Titan Air Purification Technology Beijing Co., Ltd. (the listed assignees may be inaccurate; Google has not performed a legal analysis)
    • Related priority application: CN202410346311.7A, published as CN118258443A

Classifications

    • G06V 10/10 — Arrangements for image or video recognition or understanding: image acquisition
    • G01C 21/00 — Navigation; navigational instruments not provided for in groups G01C 1/00–G01C 19/00
    • G01C 21/165 — Dead reckoning by integrating acceleration or speed (inertial navigation), combined with non-inertial navigation instruments
    • G01C 21/1652 — Inertial navigation combined with non-inertial navigation instruments, with ranging devices, e.g. LIDAR or RADAR
    • G01S 13/08 — Radio-reflection systems (radar) for measuring distance only
    • G01S 13/86 — Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G06T 7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06V 10/22 — Image preprocessing by selection of a specific region containing or referencing a pattern; locating or processing of specific regions to guide the detection or recognition
    • G06V 20/52 — Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • H04J 3/0635 — Clock or time synchronisation in a network
    • H04L 67/12 — Protocols specially adapted for proprietary or special-purpose networking environments, e.g. sensor networks
    • G06T 2207/30244 — Indexing scheme for image analysis: camera pose

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Automation & Control Theory (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Medical Informatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The application relates to the technical field of remote monitoring and provides an intelligent environment-friendly remote monitoring system comprising: an image acquisition module for acquiring remote environment image information of a preset area; a spatial ranging module for acquiring remote environment spatial information of the preset area; a pose sensor, rigidly connected to the spatial ranging module, for acquiring pose information of the spatial ranging module; a clock synchronization device for performing clock-consistency processing on the image acquisition module, the spatial ranging module and the pose sensor; a data exchange device for connecting the image acquisition module, the spatial ranging module and the pose sensor to the clock synchronization device and to an edge computing unit; and the edge computing unit, which fuses the remote environment image information, the remote environment spatial information and the pose information to obtain multi-modal fusion data of the preset-area environment. The multi-modal fusion data of the preset-area environment are thus acquired conveniently and in real time, realizing intelligent environment-friendly remote monitoring.

Description

Intelligent environment-friendly remote monitoring system
Technical Field
The application relates to the technical field of remote monitoring, in particular to an intelligent environment-friendly remote monitoring system.
Background
With the progress of technology, digital techniques are gradually being applied in the environment-friendly monitoring industry. As the related art shows, efficiently managing the environment of a preset area with digital techniques requires environmental scene perception information covering the whole environment of that area.
Finding an intelligent environment-friendly remote monitoring system capable of acquiring such whole-environment scene perception information has therefore become a current research focus.
Disclosure of Invention
The application provides an intelligent environment-friendly remote monitoring system which can acquire multi-modal fusion data of a preset-area environment conveniently and in real time, thereby providing a basis for applying digital technology to the whole environment scene of the preset area and realizing intelligent environment-friendly remote monitoring.
The application provides an intelligent environment-friendly remote monitoring system comprising an image acquisition module, a spatial ranging module, a pose sensor, a data exchange device, a clock synchronization device and an edge computing unit. The image acquisition module acquires remote environment image information of a preset area and is provided with a 360-degree field of view; the spatial ranging module collects remote environment spatial information of the preset area and is likewise provided with a 360-degree field of view; the pose sensor is rigidly connected to the spatial ranging module and acquires pose information of the spatial ranging module; the clock synchronization device performs clock-consistency processing on the image acquisition module, the spatial ranging module and the pose sensor; the data exchange device connects the image acquisition module, the spatial ranging module and the pose sensor to the clock synchronization device and to the edge computing unit; the edge computing unit fuses the remote environment image information, the remote environment spatial information and the pose information to obtain multi-modal fusion data of the preset-area environment, the multi-modal fusion data characterizing the environment-friendly remote monitoring result for that environment.
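As a purely illustrative sketch (the structure and field names below are ours, not the patent's), the fused sample that the edge computing unit produces from the three clock-synchronized modalities could be modelled as a timestamped record:

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class FusedSample:
    """One multi-modal sample of the preset-area environment
    (hypothetical structure; the patent does not define a data layout)."""
    timestamp: float   # shared clock after clock-consistency processing
    image: Any         # remote environment image information (360-degree view)
    points: Any        # remote environment spatial information (ranging data)
    pose: Any          # pose of the spatial ranging module

def fuse(timestamp, image, points, pose):
    """Pair the three synchronized modalities under one timestamp."""
    return FusedSample(timestamp=timestamp, image=image,
                       points=points, pose=pose)
```

A sequence of such samples, ordered by timestamp, would form the multi-modal fusion data stream transmitted to external equipment.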
According to the intelligent environment-friendly remote monitoring system provided by the application, the remote environment image information includes registered remote environment image information, the remote environment spatial information includes registered remote environment spatial information, and the pose information includes registered pose information. The edge computing unit fuses the three kinds of information into multi-modal fusion data of the preset-area environment as follows: control-registration processing is applied to the image acquisition module, the spatial ranging module and the pose sensor respectively, so that after this processing the three devices acquire their corresponding information simultaneously, namely the registered remote environment image information acquired by the image acquisition module, the registered remote environment spatial information acquired by the spatial ranging module and the registered pose information acquired by the pose sensor; the registered image, spatial and pose information are then fused to obtain the multi-modal fusion data of the preset-area environment.
According to the intelligent environment-friendly remote monitoring system provided by the application, the edge computing unit fuses the registered remote environment image information, the registered remote environment spatial information and the registered pose information as follows: filtering and interpolation are performed on the registered pose information to obtain processed registered pose information; data compensation is applied to the registered remote environment spatial information based on the processed registered pose information, yielding processed registered remote environment spatial information from which motion noise has been eliminated; extrinsic (external-parameter) calibration data are obtained, and the processed registered remote environment spatial information is projected by matrix transformation based on the extrinsic calibration data, producing spatial image data carrying depth information; finally, the registered remote environment image information is fused with the spatial image data carrying the depth information.
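The projection step, mapping motion-compensated ranging points into the image plane through the extrinsic calibration, can be sketched as follows. This is a minimal pinhole-camera model of our own; the patent names the matrix transformation but specifies neither the camera model nor the intrinsics assumed here:

```python
import numpy as np

def project_to_image(points, R, t, K):
    """Project N x 3 ranging points (given in the ranging-module frame)
    into pixel coordinates using extrinsics [R | t] and camera
    intrinsics K.  Returns (N x 2 pixel coordinates, N depths);
    callers should discard points with non-positive depth."""
    pts_cam = points @ R.T + t           # ranging frame -> camera frame
    depth = pts_cam[:, 2]
    uv_h = pts_cam @ K.T                 # homogeneous image coordinates
    uv = uv_h[:, :2] / depth[:, None]    # perspective divide
    return uv, depth
```

Attaching each projected point's depth to its pixel position yields spatial image data carrying depth information, ready to be fused with the registered image.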
According to the intelligent environment-friendly remote monitoring system provided by the application, the edge computing unit performs control registration of the image acquisition module, the spatial ranging module and the pose sensor as follows: after clock-consistency time service has been performed on the three devices by the clock synchronization device, the edge computing unit performs control registration of the image acquisition module and the pose sensor, taking the data distribution period of the spatial ranging module as the reference.
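Registration against the ranging module's distribution period can be illustrated with a nearest-timestamp matcher (a simplified sketch of our own; the patent does not give an algorithm or a skew bound):

```python
import bisect

def nearest_samples(ref_times, sensor_times, max_skew):
    """For each reference timestamp (one per data distribution period of
    the ranging module), pick the index of the closest sample from
    another sensor, or None if the residual skew exceeds max_skew.
    sensor_times must be sorted ascending."""
    out = []
    for t in ref_times:
        i = bisect.bisect_left(sensor_times, t)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(sensor_times)]
        j = min(candidates, key=lambda j: abs(sensor_times[j] - t))
        out.append(j if abs(sensor_times[j] - t) <= max_skew else None)
    return out
```

Hardware-triggered registration, as the patent describes, would make the residual skew small enough that every reference timestamp finds a match.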
According to the intelligent environment-friendly remote monitoring system provided by the application, the edge computing unit performs this control registration, with the data distribution period of the spatial ranging module as reference, as follows: the image acquisition module is registered by combining its trigger delay, its transmission delay and its acquisition delay; the pose sensor is then registered based on the first data-acquisition time of the already registered image acquisition module and the second data-acquisition time of the pose sensor, the data acquired by the pose sensor having been filtered and interpolated in advance.
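One way to realize the delay compensation described above is to schedule the camera trigger early enough that the exposure lands on the ranging timestamp, and to back transmission delay out of arrival timestamps. The patent names the three delays but not the formula; mid-exposure alignment below is our assumption:

```python
def camera_trigger_time(t_target, trigger_delay, acquisition_delay):
    """Issue the trigger early enough that the middle of the exposure
    window lands on the ranging module's timestamp t_target.
    (Mid-exposure alignment is an assumption, not stated in the patent.)"""
    return t_target - trigger_delay - acquisition_delay / 2.0

def corrected_capture_time(t_arrival, transmission_delay):
    """Recover the capture time by removing the transmission delay
    from the timestamp at which the data arrived."""
    return t_arrival - transmission_delay
```

With a 2 ms trigger delay and a 10 ms exposure, a frame meant for t = 10 s would be triggered at 9.993 s.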
According to the intelligent environment-friendly remote monitoring system provided by the application, the spatial ranging module, the image acquisition module and the pose sensor are each used after joint extrinsic (external-parameter) calibration, the joint calibration being performed across all three devices together.
According to the intelligent environment-friendly remote monitoring system provided by the application, the edge computing unit is further configured to perform data-consistency detection on the multi-modal fusion data at a preset period; when the multi-modal fusion data do not meet the data-consistency requirement, the joint extrinsic calibration of the spatial ranging module, the image acquisition module and the pose sensor is performed again, until the multi-modal fusion data obtained from the information acquired by the recalibrated devices meet the data-consistency requirement.
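The periodic check-then-recalibrate behaviour can be sketched as a simple maintenance loop (our skeleton; the patent defines neither the consistency metric nor a bound on recalibration attempts, so `max_rounds` is an assumption):

```python
def maintain_consistency(check, recalibrate, max_rounds=5):
    """While the fused data fail the consistency check, redo the joint
    extrinsic calibration.  check() -> bool, recalibrate() -> None.
    Returns True once the check passes, False if max_rounds
    recalibrations did not restore consistency."""
    for _ in range(max_rounds):
        if check():
            return True
        recalibrate()
    return check()
```

In practice `check` might compare reprojected ranging points against image features, and `recalibrate` would rerun the joint extrinsic calibration of the three devices.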
According to the intelligent environment-friendly remote monitoring system provided by the application, the edge computing unit is further configured to order the multi-modal fusion data within a preset time period into a time sequence, obtaining a multi-modal fusion data stream; the data exchange device is further configured to establish a communication connection with external equipment and transmit the multi-modal fusion data stream to it.
According to the intelligent environment-friendly remote monitoring system provided by the application, the overlap ratio of the fields of view of the image acquisition module and the spatial ranging module is greater than or equal to an overlap-ratio threshold.
According to the intelligent environment-friendly remote monitoring system provided by the application, the system further comprises an electromagnetic shielding device for shielding the edge computing unit from the electromagnetic interference generated by the image acquisition module, the spatial ranging module and the pose sensor during operation.
The intelligent environment-friendly remote monitoring system of the application thus comprises an image acquisition module acquiring remote environment image information of a preset area; a spatial ranging module acquiring remote environment spatial information of the preset area; a pose sensor rigidly connected to the spatial ranging module and acquiring its pose information; a clock synchronization device performing clock-consistency processing on the three acquisition devices; a data exchange device connecting them to the clock synchronization device and the edge computing unit; and the edge computing unit fusing the image, spatial and pose information into multi-modal fusion data of the preset-area environment. The multi-modal fusion data can be obtained conveniently and in real time, providing a basis for applying digital technology to the whole environment scene of the preset area and realizing intelligent environment-friendly remote monitoring.
Drawings
For a clearer description of the present application and the prior art, the drawings used in the description of the embodiments are briefly introduced below. The drawings described below are some embodiments of the present application; other drawings may be obtained from them by a person skilled in the art without inventive effort.
Fig. 1 is a schematic structural diagram of the intelligent environment-friendly remote monitoring system provided by the present application;
Fig. 2 is a schematic diagram of the system framework of the intelligent environment-friendly remote monitoring system provided by the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application clearer, the technical solutions of the present application are described below clearly and completely with reference to the drawings. The described embodiments are evidently some, but not all, of the embodiments of the present application; all other embodiments obtained by a person of ordinary skill in the art from these embodiments without inventive effort fall within the scope of the present application.
Fig. 1 is a schematic structural diagram of the intelligent environment-friendly remote monitoring system provided by the present application.
As can be seen in fig. 1, the intelligent environment-friendly remote monitoring system may include an image acquisition module 110, a spatial ranging module 120, a pose sensor 130, a clock synchronization device 140, a data exchange device 150 and an edge computing unit 160; each module is described below.
In one embodiment, the image acquisition module 110 may be configured to acquire remote environment image information of a preset area, the module having a 360° field of view. In yet another example, the image acquisition module 110 may also be an array of image acquisition modules whose combined field of view is 360°, laying the foundation for acquiring image data of the full environment scene. In yet another embodiment, the image acquisition module 110 may be an image sensor such as a network camera, an embedded camera, an infrared camera, an ultraviolet camera or a high-speed camera.
In yet another embodiment, the spatial ranging module 120 may be configured to collect remote environment spatial information of the preset area, the module having a 360° field of view. In yet another example, the spatial ranging module 120 may also be an array of spatial ranging modules whose combined field of view is 360°, laying the foundation for acquiring spatial ranging data (i.e., remote environment spatial information) of the full environment scene. In yet another embodiment, the spatial ranging module 120 may be a spatial ranging radar.
It should be noted that the overall fields of view of the spatial ranging radar and the image sensor need to substantially coincide so as to provide a sufficiently large common viewing angle; ideally, the longitudinal visible range of the panoramic image covers the sensing area of the spatial ranging radar.
In yet another embodiment, the overlap ratio of the fields of view of the image acquisition module 110 and the spatial ranging module 120 is greater than or equal to an overlap-ratio threshold. The threshold may be adjusted to the actual situation and is not specifically limited in this embodiment.
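One simple way to quantify the overlap-ratio requirement (our own formulation; the patent does not define the metric) is the intersection of the two angular intervals divided by the narrower field of view:

```python
def fov_overlap_ratio(a, b):
    """Overlap ratio of two angular intervals a = (lo, hi) and
    b = (lo, hi), in degrees, measured against the narrower field of
    view so that full containment scores 1.0."""
    overlap = max(0.0, min(a[1], b[1]) - max(a[0], b[0]))
    narrower = min(a[1] - a[0], b[1] - b[0])
    return overlap / narrower if narrower > 0 else 0.0
```

For example, a camera covering -30° to +30° vertically against a radar sensing -20° to +40° overlaps by 50° of a 60° span, a ratio of about 0.83, which would then be compared against the configured threshold.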
In yet another embodiment, the pose sensor 130 may be rigidly connected to the spatial ranging module 120 and used to acquire pose information of the spatial ranging module 120. The pose sensor 130 may detect rotation, inclination, angular acceleration, linear acceleration and other poses and states of the spatial ranging module 120. The pose sensor 130 may include, but is not limited to, gyroscopes, accelerometers, magnetometers, rotation-vector sensors, inertial measurement units and tilt sensors.
In yet another embodiment, the clock synchronization device 140 may be used to perform clock-consistency processing on the image acquisition module 110, the spatial ranging module 120 and the pose sensor 130. Clock synchronization devices include, but are not limited to, Network Time Protocol (NTP) servers, IEEE 1588 Precision Time Protocol (PTP) devices, clock distribution systems and precision clock sources.
In yet another embodiment, the data exchange device 150 may be used to connect the image acquisition module 110, the spatial ranging module 120 and the pose sensor 130 with the clock synchronization device 140 and the edge computing unit 160, respectively, facilitating data transmission. A data exchange device is a device for the transmission and exchange of computer data, including but not limited to single or combined data-transmission architectures of switches, bridges, hubs, load balancers and the like.
The edge computing unit 160 may be configured to fuse the remote environment image information, the remote environment spatial information and the pose information to obtain multi-modal fusion data of the preset-area environment, where the multi-modal fusion data characterize the environment-friendly remote monitoring result for that environment. In an embodiment, the multi-modal fusion data may form a map of the preset-area environment; in other words, they may be used to build a panoramic map of it. The multi-modal fusion data obtained in this embodiment thus provide a basis for applying digital technology to the full environment scene of the preset area.
The system formed by these modules thus obtains the multi-modal fusion data of the preset-area environment conveniently and in real time, realizing intelligent environment-friendly remote monitoring as summarized above.
Fig. 2 is a schematic diagram of the system framework of the intelligent environment-friendly remote monitoring system provided by the present application.
In yet another embodiment, as can be seen in fig. 2, the intelligent environment-friendly remote monitoring system mainly comprises an environment-perception sensor array, such as the image acquisition module and the spatial ranging module; an attitude-measurement sensor array, such as the pose sensor; an edge computing unit; a clock synchronization device; and a data exchange device.
The environment-perception sensor array mainly comprises various image sensors and various spatial ranging radar sensors. An image sensor is a device that converts optical signals and images into electronic signals, including network cameras, infrared cameras, ultraviolet cameras and the like.
A spatial ranging radar sensor is a device that calculates target distance by measuring the return signal of radio waves or pulses, including but not limited to lidar, millimeter-wave radar, microwave radar and ultrasonic radar.
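The ranging principle behind these sensors reduces to round-trip timing: the echo travels out and back, so the target distance is half the propagation speed times the measured delay. As an illustration (our own; the constant below applies to radio and optical pulses, while ultrasonic radar would use the speed of sound instead):

```python
C = 299_792_458.0  # propagation speed of radio/optical pulses, m/s

def range_from_echo(round_trip_seconds):
    """Pulse ranging: distance = c * t / 2, since the pulse covers the
    sensor-to-target path twice."""
    return C * round_trip_seconds / 2.0
```

A 1 µs round trip therefore corresponds to a target roughly 150 m away.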
The attitude-measurement sensor array may mainly comprise sensor units capable of detecting rotation, inclination, angular acceleration, linear acceleration and other poses and states of a measured object, including but not limited to gyroscopes, accelerometers, magnetometers, rotation-vector sensors, inertial measurement units and tilt sensors.
A clock synchronization device keeps the clocks of multiple devices or systems consistent; such devices include, but are not limited to, Network Time Protocol (NTP) servers, IEEE 1588 Precision Time Protocol (PTP) devices, clock distribution systems and precision clock sources.
The edge computing unit includes, but is not limited to, various high-performance edge intelligent computing processors, field-programmable gate arrays (FPGAs) and edge-computing system-on-chip (SoC) devices; it provides hardware-trigger registration capability while meeting the computing power required for the data fusion processing.
A data exchange device is a device for the transmission and exchange of computer data, including but not limited to single or combined data-transmission architectures of switches, bridges, hubs, load balancers and the like. Through this intelligent environment-friendly remote monitoring system, the multi-modal fusion data of the preset-area environment can be obtained conveniently and in real time, providing a basis for applying digital technology to the full environment scene of the preset area.
In yet another embodiment, as can be seen in conjunction with fig. 2, after the hardware system is started by external power supply, it performs time service on all other sensing units and computing units through the clock synchronization device, so as to ensure clock rate consistency. The specific mode and protocol adopted for time service are adapted to what each sensing unit supports: for example, network-type image sensors and spatial ranging radar sensors can normally be timed via PTP or NTP messages transmitted through the data exchange device, while attitude measurement sensors such as inertial measurement units can be timed by clock pulse distribution.
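As an illustrative aside (not code from this application), the NTP time service mentioned above estimates the client clock offset from four message timestamps. The sketch below, with hypothetical variable names, shows the standard NTP offset and round-trip delay computation.

```python
# Standard NTP clock offset/delay from the four classic timestamps:
#   offset = ((t2 - t1) + (t3 - t4)) / 2
#   delay  = (t4 - t1) - (t3 - t2)
def ntp_offset_and_delay(t1, t2, t3, t4):
    """t1: client send, t2: server receive, t3: server send, t4: client receive."""
    offset = ((t2 - t1) + (t3 - t4)) / 2.0
    delay = (t4 - t1) - (t3 - t2)
    return offset, delay

# Example: server clock runs 0.5 s ahead of the client, symmetric 0.1 s path.
off, dly = ntp_offset_and_delay(10.0, 10.6, 10.7, 10.3)
print(off, dly)  # offset ≈ 0.5 s, delay ≈ 0.2 s
```

The client would then correct its clock by `offset`; PTP refines the same idea with hardware timestamping for sub-microsecond accuracy.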
In an exemplary embodiment of the present application, the remote environment image information may include registering the remote environment image information; the remote environmental spatial information may include registering remote environmental spatial information; the pose information may include registration pose information.
In one embodiment, the edge computing unit may perform fusion processing on the remote environment image information, the remote environment space information, and the pose information in the following manner to obtain multi-mode fusion data of the preset area environment:
the method comprises the steps of respectively carrying out control registration processing on an image acquisition module, a space ranging module and a pose sensor, so that the image acquisition module after the control registration processing, the space ranging module after the control registration processing and the pose sensor after the control registration processing simultaneously acquire corresponding acquisition information, wherein the corresponding acquisition information comprises registration remote environment image information acquired by the image acquisition module after the control registration processing, registration remote environment space information acquired by the space ranging module after the control registration processing and registration pose information acquired by the pose sensor after the control registration processing;
and carrying out fusion processing on the registration remote environment image information, the registration remote environment space information and the registration pose information to obtain multi-mode fusion data of the preset area environment.
In one embodiment, the image acquisition module, the spatial ranging module and the pose sensor can each be subjected to control registration processing by the edge computing unit, yielding the image acquisition module after the control registration processing, the spatial ranging module after the control registration processing and the pose sensor after the control registration processing. These can then acquire their corresponding acquisition information at the same time. In other words, at the same moment, the image acquisition module after the control registration processing can acquire registration remote environment image information, the spatial ranging module after the control registration processing can acquire registration remote environment space information, and the pose sensor after the control registration processing can acquire registration pose information.
Furthermore, fusion processing is carried out on the registration remote environment image information, the registration remote environment space information and the registration pose information, so that multi-mode fusion data of a high-precision preset area environment can be obtained. In the embodiment, the multi-mode data registration is performed by combining clock synchronization, registration control processing and filtering optimization interpolation, so that the accuracy and stability of the obtained multi-mode fusion data are higher, and intelligent environment-friendly remote monitoring is realized.
In still another exemplary embodiment of the present application, the edge calculation unit may perform fusion processing on the registration remote environment image information, the registration remote environment space information, and the registration pose information in the following manner:
performing filtering interpolation processing based on the registration pose information to obtain processed registration pose information;
performing data compensation processing on the registration remote environment space information based on the processed registration pose information to obtain processed registration remote environment space information, wherein the processed registration remote environment space information is used for representing the registration remote environment space information for eliminating motion noise;
acquiring external parameter calibration data, and performing projection processing on the processed registration remote environment space information through matrix change based on the external parameter calibration data to obtain space image data with depth information;
and carrying out fusion processing on the registered remote environment image information and the spatial image data with the depth information.
In one embodiment, after time service, trigger control, registration and calibration of all sensors are completed, the edge computing unit may process the various sensor data and perform data preprocessing through the following procedures:
Since a single transmission data packet of a spatial ranging radar sensor typically contains a large amount of ranging data collected at different time points, the data will contain motion noise if the hardware device moves during this period. Thus, in one example, filtering down-sampling, filtering interpolation and ranging data compensation are performed based on the related data (corresponding to the registration pose information) transmitted by the pose sensor array (corresponding to the pose sensor), such as linear acceleration, angular acceleration, rotation and speed, so as to obtain the processed registration remote environment space information, eliminating the motion noise in the ranging data (corresponding to the registration remote environment space information).
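The ranging data compensation described above can be pictured as a de-skew step: each ranging point carries its own timestamp, and the sensor motion between that instant and the packet-end instant is removed. The sketch below (function and variable names are assumptions, and rotation compensation is omitted for brevity) interpolates the sensor translation from pose samples and re-expresses each point in the packet-end frame.

```python
import numpy as np

# Minimal translational de-skew sketch: points measured at different times
# during one radar packet are shifted into the frame of the packet-end pose,
# so that motion during the sweep no longer distorts the point set.
def deskew(points, point_times, pose_times, pose_positions, t_end):
    pose_times = np.asarray(pose_times, float)
    pose_positions = np.asarray(pose_positions, float)
    # sensor position at the packet-end instant
    p_end = np.array([np.interp(t_end, pose_times, pose_positions[:, k])
                      for k in range(3)])
    out = []
    for p, t in zip(points, point_times):
        # sensor position at this point's own acquisition instant
        p_t = np.array([np.interp(t, pose_times, pose_positions[:, k])
                        for k in range(3)])
        # undo the motion between t and t_end
        out.append(np.asarray(p, float) + (p_t - p_end))
    return np.array(out)
```

For instance, with the sensor moving at 1 m/s along x, a point measured at the start of a one-second sweep ends up one metre closer along x once expressed in the packet-end frame.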
In yet another embodiment, external parameter calibration data may be further acquired, and projection processing is performed on the processed registered remote environment spatial information through matrix change based on the external parameter calibration data, so as to obtain spatial image data with depth information. The external parameter calibration data are external parameter calibration data obtained based on external parameter calibration carried out by combining the spatial ranging module, the image acquisition module and the pose sensor.
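The projection through matrix transformation based on external parameter calibration data can be illustrated with a simple pinhole model. In the sketch below (names and the choice of a pinhole intrinsics matrix K are assumptions, not taken from this application), points in the ranging-sensor frame are transformed by the extrinsic rotation R and translation t into the camera frame and projected to pixel coordinates with an associated depth.

```python
import numpy as np

# Project ranging points into the image plane to obtain pixel coordinates
# plus depth (the "spatial image data with depth information").
def project_points(points_lidar, R, t, K):
    pts_cam = (R @ np.asarray(points_lidar, float).T).T + t  # extrinsic transform
    z = pts_cam[:, 2]                                        # depth along optical axis
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / z[:, None]                              # perspective divide
    return uv, z
```

Each projected point can then be fused with the registered image pixel it lands on, which is the basis of the fusion step that follows.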
Furthermore, fusion processing can be carried out on the registered remote environment image information and the spatial image data with depth information so as to obtain multi-mode fusion data of a preset area environment, and intelligent environment-friendly remote monitoring is realized.
In yet another exemplary embodiment of the present application, the edge calculation unit may perform the control registration processing on the image acquisition module, the spatial ranging module, and the pose sensor in the following manner:
after clock consistency time service processing is carried out on the image acquisition module, the space ranging module and the pose sensor based on the clock synchronization equipment, the edge calculation unit carries out control registration processing on the image acquisition module and the pose sensor by taking a data distribution period of the space ranging module as a reference.
In one embodiment, after time service of all the sensors is completed, the edge computing unit may perform trigger control and registration of the other sensors (including the image acquisition module and the pose sensor) with reference to the data distribution period of the spatial ranging radar sensor, that is, perform the control registration processing.
In still another exemplary embodiment of the present application, the edge calculating unit may perform the control registration processing on the image acquisition module and the pose sensor with reference to the data distribution period of the spatial ranging module in the following manner:
the edge computing unit is used for controlling and registering the image acquisition module by taking the data distribution period of the space ranging module as a reference and combining the triggering delay of the image acquisition module, the transmission delay of the image acquisition module and the acquisition delay of the image acquisition module;
the edge computing unit performs control registration processing on the pose sensor based on the first data information acquisition time of the image acquisition module and the second data information acquisition time of the pose sensor after the control registration processing, wherein the data information acquired by the pose sensor is subjected to filtering interpolation processing in advance.
In one embodiment, the transmission time point of the data acquired by the spatial ranging radar sensor (corresponding to the spatial ranging module) is t1, and the time point at which the data reaches the edge computing unit through the data exchange device is t2. Taking the radar data acquisition period T as a reference, the edge computing unit combines the trigger delay Δt_trig of the image sensor (corresponding to the image acquisition module), the transmission delay Δt_trans of the image acquisition module and the acquisition delay Δt_acq of the image acquisition module to determine a trigger time point t of the image acquisition module, thereby realizing the control registration processing of the image acquisition module. In other words, the image acquisition module may be controlled to acquire data at the trigger time point t = t2 + T − (Δt_trig + Δt_trans + Δt_acq), where T represents the radar data acquisition period. The specific values of Δt_trig, Δt_trans and Δt_acq may be obtained from the image sensor data manual and by trigger experiment tests.
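The trigger-time relation above is a one-line computation; the sketch below (function and parameter names are hypothetical) aligns the camera exposure with the next radar packet by backing out the three delays.

```python
# Camera trigger time aligned to the next radar data distribution:
#   t = t2 + T - (trigger delay + transmission delay + acquisition delay)
def camera_trigger_time(t2, T, d_trig, d_trans, d_acq):
    """t2: arrival time of the radar packet at the edge computing unit,
    T: radar data acquisition period, d_*: image-path delays (seconds)."""
    return t2 + T - (d_trig + d_trans + d_acq)

print(camera_trigger_time(100.0, 0.1, 0.01, 0.02, 0.03))  # ≈ 100.04
```

Triggering at this instant makes the image frame land on the same timestamp as the next radar publication, which is the matched state described below.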
In the application process, the spatial range radar sensor and the image sensor can synchronously acquire data through control registration processing, so that a data matching state is achieved.
In yet another embodiment, considering that the pose sensor has a small single-acquisition data size and a high sampling frequency, its data registration is performed by filtering first and then interpolating. Filtering methods include, but are not limited to, Kalman filtering, median filtering and mean filtering; interpolation methods include, but are not limited to, linear interpolation, Lagrange interpolation and nearest-neighbor interpolation.
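As one concrete instance of the filters named above (chosen here only for illustration), a sliding-window median filter suppresses isolated spikes in a pose-sensor channel before interpolation. The window size and names below are assumptions.

```python
# Sliding-window median filter for a 1-D pose-sensor channel.
# Windows are truncated at the sequence boundaries.
def median_filter(samples, window=3):
    half = window // 2
    out = []
    for i in range(len(samples)):
        lo, hi = max(0, i - half), min(len(samples), i + half + 1)
        out.append(sorted(samples[lo:hi])[(hi - lo) // 2])
    return out

print(median_filter([1.0, 1.1, 9.0, 1.2, 1.3]))  # spike at index 2 suppressed
```

After filtering, the cleaned samples are handed to the interpolation step described next.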
In an example, the spatial ranging radar sensor and the image sensor have been synchronized by the control registration processing to acquire data at the time point t (corresponding to the first data information acquisition time). After filtering, the time points of the pose-sensor acquisitions adjacent to t before and after (corresponding to the second data information acquisition time) are t1 and t2, with t1 ≤ t ≤ t2. Data matching by linear interpolation then gives the data at time t as f(t) = f(t1) + ((t − t1) / (t2 − t1)) × (f(t2) − f(t1)), where f(t) represents the specific measurement data of the pose sensor at time t, thereby realizing the control registration processing of the pose sensor.
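The linear-interpolation step above reduces to a single expression; a minimal sketch (names assumed):

```python
# Linear interpolation of a pose measurement at time t between the two
# adjacent filtered samples (t1, f1) and (t2, f2):
def lerp(t, t1, f1, t2, f2):
    return f1 + (t - t1) / (t2 - t1) * (f2 - f1)

print(lerp(1.5, 1.0, 2.0, 2.0, 4.0))  # 3.0
```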
In still another exemplary embodiment of the present application, the spatial ranging module may be a spatial ranging module after performing joint external parameter calibration, the image acquisition module may be an image acquisition module after performing joint external parameter calibration, the pose sensor may be a pose sensor after performing joint external parameter calibration, where the joint external parameter calibration may be external parameter calibration based on the joint of the spatial ranging module, the image acquisition module and the pose sensor.
In one embodiment, after time service, trigger control and registration of all sensors are completed, the hardware system performs joint external parameter calibration of the various sensors. Calibration methods include, but are not limited to, geometric feature methods, deep learning methods and state registration methods. For example, the image sensor, the spatial ranging radar sensor and the attitude sensor array can each independently perform state estimation of the hardware device over a period of time, perform joint calibration of external parameter data through state alignment based on an initial external parameter matrix, and perform joint optimization in combination with the chain rule.
In a further exemplary embodiment of the present application, the edge calculation unit may be further configured to:
and carrying out data consistency detection on the multi-mode fusion data in a preset period, and carrying out joint external parameter calibration again based on the spatial ranging module, the image acquisition module and the pose sensor under the condition that the multi-mode fusion data does not meet the data consistency requirement until the multi-mode fusion data obtained based on the corresponding acquisition information acquired by the spatial ranging module, the image acquisition module and the pose sensor after the joint external parameter calibration meets the data consistency requirement.
In one embodiment, the edge computing unit may further perform data consistency detection on the multi-mode fusion data information in a specific period based on topological relations, scene plane information and multi-frame data association information; if data consistency is not satisfied, joint external parameter calibration of the sensors is performed again until the multi-mode fusion data obtained based on the corresponding acquisition information acquired by the spatial ranging module, the image acquisition module and the pose sensor after joint external parameter calibration meets the data consistency requirement. In this embodiment, the pre-calibration result is taken as the initial value, and the calibration data can be maintained and updated in real time during operation of the device through a dynamic self-calibration algorithm module deployed on the intelligent computing terminal in the device.
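The consistency criterion is not specified; one simple possibility (entirely an assumption for illustration) is to compare the depth channel of the fused data against a reference and flag recalibration when the mean residual exceeds a threshold.

```python
import numpy as np

# Hypothetical consistency check: mean absolute depth residual between the
# projected ranging data and a reference depth channel, against a threshold.
def needs_recalibration(projected_depth, reference_depth, threshold=0.05):
    residual = np.mean(np.abs(np.asarray(projected_depth, float)
                              - np.asarray(reference_depth, float)))
    return bool(residual > threshold)

print(needs_recalibration([2.0, 3.0], [2.01, 3.01]))  # False
```

A check of this kind would run periodically; a True result would restart the joint external parameter calibration described above.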
In still another exemplary embodiment of the present application, the edge calculation unit may be further configured to process, according to a time sequence, the plurality of multi-mode fusion data in a preset time period, so as to obtain a multi-mode fusion data stream;
the data exchange device may also be configured for communication connection with an external device for transmitting the multimodal fusion data stream to the external device.
In one embodiment, a multi-modal data stream may be constructed based on a time series, and data compression and combination may be performed for the various sensors used so as to remove redundant data information; for example, since the various data have been synchronously matched, only one timestamp item needs to be saved per frame. Further, external communication is performed based on the data exchange device, and the multi-mode data stream is transmitted.
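The frame-packing idea above can be sketched as follows (field names and the JSON serialization are assumptions): synchronized samples share one timestamp, so each frame carries a single `t` plus the three payloads, and frames are emitted in time order.

```python
import json

# One fused frame: a single shared timestamp with the three synchronized
# payloads (redundant per-sensor timestamps are dropped).
def pack_frame(t, image, ranging, pose):
    return {"t": t, "image": image, "ranging": ranging, "pose": pose}

# Time-ordered multi-modal stream, serialized for the data exchange device.
def pack_stream(frames):
    return json.dumps(sorted(frames, key=lambda f: f["t"]))

stream = pack_stream([pack_frame(2.0, "img2", "rng2", "pose2"),
                      pack_frame(1.0, "img1", "rng1", "pose1")])
```

In practice the payloads would be compressed binary blobs rather than strings, but the single-timestamp-per-frame layout is the point being illustrated.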
In an exemplary embodiment of the present application, the monitoring system may further include: an electromagnetic shielding device. The electromagnetic shielding device can be used to shield the edge computing unit from electromagnetic interference generated by the image acquisition module, the spatial ranging module and the pose sensor during operation.
Further, it is to be understood that although operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present application, and are not limiting thereof; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the corresponding technical solutions.

Claims (10)

1. An intelligent environment-friendly remote monitoring system is characterized by comprising an image acquisition module, a space ranging module, a pose sensor, a data exchange device, a clock synchronization device and an edge calculation unit, wherein,
the image acquisition module is used for acquiring remote environment image information of a preset area, wherein the image acquisition module is provided with a 360-degree field angle;
the space ranging module is used for collecting remote environment space information of a preset area, wherein the space ranging module is provided with a 360-degree field angle;
the pose sensor is rigidly connected with the space ranging module and is used for acquiring pose information of the space ranging module;
the clock synchronization device is used for performing clock consistency processing on the image acquisition module, the spatial ranging module and the pose sensor;
the data exchange equipment is used for respectively establishing connection between the image acquisition module, the spatial ranging module and the pose sensor and the clock synchronization equipment and the edge computing unit;
the edge computing unit is used for carrying out fusion processing on the remote environment image information, the remote environment space information and the pose information to obtain multi-mode fusion data of a preset area environment, wherein the multi-mode fusion data is used for representing an environment-friendly remote monitoring result of the preset area environment.
2. The intelligent environmental protection remote monitoring system of claim 1, wherein the remote environment image information comprises registered remote environment image information, the remote environment space information comprises registered remote environment space information, and the pose information comprises registered pose information;
the edge computing unit performs fusion processing on the remote environment image information, the remote environment space information and the pose information in the following manner to obtain multi-mode fusion data of a preset area environment:
the image acquisition module, the space ranging module and the pose sensor are subjected to control registration processing respectively, so that the image acquisition module after the control registration processing, the space ranging module after the control registration processing and the pose sensor after the control registration processing simultaneously acquire corresponding acquisition information, wherein the corresponding acquisition information comprises registration remote environment image information acquired by the image acquisition module after the control registration processing, registration remote environment space information acquired by the space ranging module after the control registration processing and registration pose information acquired by the pose sensor after the control registration processing;
and carrying out fusion processing on the registration remote environment image information, the registration remote environment space information and the registration pose information to obtain multi-mode fusion data of a preset area environment.
3. The intelligent environmental protection remote monitoring system according to claim 2, wherein the edge computing unit performs fusion processing on the registration remote environment image information, the registration remote environment space information and the registration pose information in the following manner:
performing filtering interpolation processing based on the registration pose information to obtain processed registration pose information;
performing data compensation processing on the registration remote environment space information based on the processed registration pose information to obtain processed registration remote environment space information, wherein the processed registration remote environment space information is used for representing the registration remote environment space information for eliminating motion noise;
acquiring external parameter calibration data, and performing projection processing on the processed registration remote environment space information through matrix change based on the external parameter calibration data to obtain space image data with depth information;
and carrying out fusion processing on the registration remote environment image information and the spatial image data with the depth information.
4. The intelligent environmental protection remote monitoring system according to claim 2, wherein the edge computing unit performs control registration processing on the image acquisition module, the spatial ranging module and the pose sensor in the following manner:
after clock consistency time service processing is performed on the image acquisition module, the space ranging module and the pose sensor based on the clock synchronization equipment, the edge calculation unit performs control registration processing on the image acquisition module and the pose sensor by taking a data distribution period of the space ranging module as a reference.
5. The intelligent environmental protection remote monitoring system according to claim 4, wherein the edge calculating unit performs control registration processing on the image acquisition module and the pose sensor with reference to a data distribution period of the spatial ranging module by:
the edge calculation unit is used for controlling and registering the image acquisition module by taking the data distribution period of the spatial ranging module as a reference and combining the triggering delay of the image acquisition module, the transmission delay of the image acquisition module and the acquisition delay of the image acquisition module;
the edge computing unit performs control registration processing on the pose sensor based on a first data information acquisition time of the image acquisition module and a second data information acquisition time of the pose sensor after control registration processing, wherein the data information acquired by the pose sensor is subjected to filtering interpolation processing in advance.
6. The intelligent environmental protection remote monitoring system according to claim 5, wherein the spatial ranging module is a spatial ranging module after performing joint external parameter calibration, the image acquisition module is an image acquisition module after performing joint external parameter calibration, and the pose sensor is a pose sensor after performing joint external parameter calibration, wherein the joint external parameter calibration is based on the joint external parameter calibration performed by the spatial ranging module, the image acquisition module, and the pose sensor.
7. The intelligent environmental protection remote monitoring system of claim 6, wherein the edge computing unit is further configured to:
and carrying out data consistency detection on the multi-mode fusion data in a preset period, and carrying out joint external parameter calibration again based on the spatial ranging module, the image acquisition module and the pose sensor under the condition that the multi-mode fusion data does not meet the data consistency requirement until the multi-mode fusion data obtained based on the corresponding acquisition information acquired by the spatial ranging module, the image acquisition module and the pose sensor after the joint external parameter calibration meets the data consistency requirement.
8. The intelligent environmental protection remote monitoring system according to any one of claims 1-7, wherein the edge computing unit is further configured to process the plurality of multi-modal fusion data in a preset time period according to a time sequence, so as to obtain a multi-modal fusion data stream;
the data exchange device is also used for being in communication connection with external equipment and transmitting the multi-mode fusion data stream to the external equipment.
9. The intelligent environmental protection remote monitoring system of claim 1, wherein the overlap ratio of the angles of view of the image acquisition module and the spatial ranging module is greater than or equal to an overlap ratio threshold.
10. The intelligent environmental protection remote monitoring system of claim 1, further comprising:
and the electromagnetic shielding device is used for shielding the edge computing unit from electromagnetic interference generated by the image acquisition module, the space ranging module and the pose sensor during operation.
CN202410159732.9A 2024-02-04 2024-02-04 Intelligent environment-friendly remote monitoring system Active CN117706544B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202410346311.7A CN118258443A (en) 2024-02-04 2024-02-04 Intelligent environment-friendly remote monitoring system and monitoring method
CN202410159732.9A CN117706544B (en) 2024-02-04 2024-02-04 Intelligent environment-friendly remote monitoring system


Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202410346311.7A Division CN118258443A (en) 2024-02-04 2024-02-04 Intelligent environment-friendly remote monitoring system and monitoring method

Publications (2)

Publication Number Publication Date
CN117706544A true CN117706544A (en) 2024-03-15
CN117706544B CN117706544B (en) 2024-04-09

Family

ID=90148248

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202410159732.9A Active CN117706544B (en) 2024-02-04 2024-02-04 Intelligent environment-friendly remote monitoring system
CN202410346311.7A Pending CN118258443A (en) 2024-02-04 2024-02-04 Intelligent environment-friendly remote monitoring system and monitoring method


Country Status (1)

Country Link
CN (2) CN117706544B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120099952A (en) * 2011-03-02 2012-09-12 한국과학기술원 Sensor system, and system and method for preparing environment map using the same
CN103729883A (en) * 2013-12-30 2014-04-16 浙江大学 Three-dimensional environmental information collection and reconstitution system and method
CN108917837A (en) * 2018-07-02 2018-11-30 吉林农业科技学院 A kind of indoor environment monitoring system of combination architectural environment simulation
CN114453709A (en) * 2022-02-22 2022-05-10 中国计量大学 Robot welding site intelligent monitoring system based on edge calculation
CN115200588A (en) * 2022-09-14 2022-10-18 煤炭科学研究总院有限公司 SLAM autonomous navigation method and device for mobile robot
CN116907469A (en) * 2023-03-24 2023-10-20 山东浪潮科学研究院有限公司 Synchronous positioning and mapping method and system for multi-mode data combined optimization


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
XU Bin, "Mobile Robot Visual SLAM Method Based on Edge Computing" (基于边缘计算的移动机器人视觉SLAM方法), High Technology Letters (高技术通讯), vol. 33, no. 9, 25 September 2023 (2023-09-25), pages 1000-1008 *

Also Published As

Publication number Publication date
CN118258443A (en) 2024-06-28
CN117706544B (en) 2024-04-09


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant