CN109856643B - Movable type non-inductive panoramic sensing method based on 3D laser - Google Patents

Movable type non-inductive panoramic sensing method based on 3D laser

Info

Publication number
CN109856643B
Authority
CN
China
Prior art keywords
point cloud
moving
laser
moving object
personnel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811536086.4A
Other languages
Chinese (zh)
Other versions
CN109856643A (en)
Inventor
林国贤
黄金魁
林力辉
林峰
黄春红
陈月卿
刘旭
于晓翔
赖德炎
刘斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Super High Voltage Branch Of State Grid Fujian Electric Power Co ltd
State Grid Fujian Electric Power Co Ltd
Original Assignee
Shanghai Vkingtele Communication Science And Technology Co ltd
State Grid Fujian Electric Power Co Ltd
Maintenance Branch of State Grid Fujian Electric Power Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Vkingtele Communication Science And Technology Co ltd, State Grid Fujian Electric Power Co Ltd, Maintenance Branch of State Grid Fujian Electric Power Co Ltd filed Critical Shanghai Vkingtele Communication Science And Technology Co ltd
Priority to CN201811536086.4A priority Critical patent/CN109856643B/en
Publication of CN109856643A publication Critical patent/CN109856643A/en
Application granted granted Critical
Publication of CN109856643B publication Critical patent/CN109856643B/en

Landscapes

  • Optical Radar Systems And Details Thereof (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a movable type non-inductive panoramic sensing method based on 3D laser. A collision detection algorithm between the laser beams emitted by the laser in real time and a moving object identifies the category of the moving object from the size and contour features of the reflected laser beam region; moving objects classified as persons are then independently identified by their contour features, so that the moving person is accurately located in the three-dimensional map scene. On this basis, the risk of mistakenly entering or leaving a restricted area is judged, and safety alarms and prompts are provided for substation monitors and field operators. The method enables remote centralized monitoring and management, effectively reduces the workload of monitoring personnel, improves working efficiency, and greatly improves field operation control efficiency and field operation safety.

Description

Movable type non-inductive panoramic sensing method based on 3D laser
Technical Field
The invention belongs to the technical field of data processing, and particularly relates to a movable type non-inductive panoramic sensing method based on 3D laser.
Background
With the rapid expansion of the power grid and the continuous improvement of power system technology, the number of unattended substations keeps growing. In particular, as the construction of vertically integrated power management systems is deepened and optimized, the scope that power production and operation control personnel must administer grows ever larger, posing unprecedented challenges to them.
To further improve substation management and control, multiple high-precision positioning and perception detection technologies have been applied, application extensions of a centralized grid equipment control platform based on three-dimensional real-scene technology have been developed, and an intelligent ecosystem of grid perception and fusion has been created. Combined with the existing OMS and SCADA systems, this builds a three-dimensional real-scene intelligent control platform for the power grid that is highly interactive, visually compelling, immersive, realistic, and accurate.
Live equipment is erected throughout the substation site. To make up for the traditional monitoring means' lack of spatial detection and perception capability on the substation site, lidar detection technology is applied: devices and technical means providing spatial detection of the work area are developed and combined with a three-dimensional real-scene model with spatial coordinates that accurately clones and restores the substation. Functions such as dividing spatially hazardous areas and issuing real-time entry/exit prohibition warnings are realized; combined with a high-speed tracking dome camera, field operation video is captured in real time through linkage and warning-triggered photos are snapped, realizing three-dimensional control of the work site.
Disclosure of Invention
The invention aims to provide a movable type non-inductive panoramic sensing method based on 3D laser that enables remote centralized monitoring and management, effectively reduces the workload of monitoring personnel, improves working efficiency, and greatly improves field operation control efficiency and field operation safety.
To achieve this purpose, the technical solution of the invention is as follows: a movable type non-inductive panoramic sensing method based on 3D laser, in which a collision detection algorithm between the laser beams emitted by the laser in real time and a moving object identifies the category of the moving object from the size and contour features of the reflected laser beam region; moving objects classified as persons are then independently identified by their contour features, so that the moving person is accurately located in the three-dimensional map scene. On this basis, the risk of mistakenly entering or leaving a restricted area is judged, and safety alarms and prompts are provided for substation monitors and field operators.
In an embodiment of the present invention, the collision detection algorithm between the laser beams emitted by the laser in real time and the moving object identifies the category of the moving object from the size and contour features of the reflected laser beam region; the specific implementation steps are as follows:
Step S21: output the position information of the points scanned in the FOV through the 3D lidar, and convert the output data into a standard point cloud data format (an illustrative conversion sketch follows this list);
Step S22: record several frames of raw point cloud data and accumulate them to obtain the background point cloud data;
Step S23: acquire real-time point cloud data from the raw point cloud data;
Step S24: compare the real-time point cloud data with the background point cloud data to obtain the difference point cloud data;
Step S25: cluster the difference point cloud data by the distance between points to obtain several cluster point sets, each cluster representing one moving object;
Step S26: identify the category of the moving object according to its contour features.
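The patent does not fix a particular point cloud format for step S21. As a minimal sketch, assuming the lidar driver reports each return as range, azimuth, and elevation, one possible conversion into an N×3 Cartesian array (the layout used by common point cloud tools) is shown below; the function name, field layout, and frame convention are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def returns_to_point_cloud(ranges_m, azimuth_rad, elevation_rad):
    """Convert polar lidar returns (range, azimuth, elevation) into an N x 3
    Cartesian point cloud in the sensor frame."""
    r = np.asarray(ranges_m, dtype=float)
    az = np.asarray(azimuth_rad, dtype=float)
    el = np.asarray(elevation_rad, dtype=float)
    x = r * np.cos(el) * np.cos(az)
    y = r * np.cos(el) * np.sin(az)
    z = r * np.sin(el)
    return np.column_stack((x, y, z))

# Example: three returns from one sweep
cloud = returns_to_point_cloud([5.0, 5.2, 30.0],
                               np.radians([0.0, 0.2, 10.0]),
                               np.radians([-1.0, -1.0, 2.0]))
print(cloud.shape)  # (3, 3)
```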
In an embodiment of the present invention, moving objects whose category is the moving person are independently identified by their contour features, so that the moving person is accurately located in the three-dimensional map scene; the specific implementation steps are as follows:
Step S31: match the point cloud scanned by the 3D lidar against the three-dimensional map to determine the position and attitude of the lidar in the scene (an illustrative registration sketch follows this list);
Step S32: identify, by contour features, the point sets whose moving-object category is the moving person; take the latest frame of point cloud data from each such set, obtain the person's current-position point cloud, and compute its centroid;
Step S33: determine the distance of the moving person relative to the 3D lidar with a TOF method, taking half of the product of the speed of light and the transmit/receive time difference;
Step S34: convert the position of the moving person from the lidar coordinate system into the scene coordinate system, achieving accurate spatial positioning of the moving person in the three-dimensional map scene.
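Step S31 does not name a specific scan-to-map matching method. A common choice is iterative closest point (ICP) registration; the sketch below is a simplified point-to-point ICP built on SciPy's KD-tree and is offered only as an assumption about how the lidar pose in the scene might be estimated, not as the patented procedure.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_pose(scan, scene_map, iters=30, init=np.eye(4)):
    """Estimate a 4x4 pose that maps sensor-frame points into the map frame
    using basic point-to-point ICP. 'scan' and 'scene_map' are N x 3 arrays.
    Convergence checks and outlier rejection are omitted for brevity."""
    T = init.copy()
    src = scan @ T[:3, :3].T + T[:3, 3]
    tree = cKDTree(scene_map)
    for _ in range(iters):
        _, idx = tree.query(src)              # nearest map point per scan point
        tgt = scene_map[idx]
        mu_s, mu_t = src.mean(axis=0), tgt.mean(axis=0)
        H = (src - mu_s).T @ (tgt - mu_t)     # cross-covariance
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:              # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_t - R @ mu_s
        src = src @ R.T + t
        step = np.eye(4)
        step[:3, :3], step[:3, 3] = R, t
        T = step @ T                          # accumulate the incremental pose
    return T
```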
In an embodiment of the present invention, the moving object category includes moving people, vehicles, and birds.
Compared with the prior art, the invention has the following beneficial effects:
(1) The method of the invention helps monitoring personnel become increasingly familiar with the field situation, enables remote centralized monitoring and management, effectively reduces the workload of monitoring personnel, and improves working efficiency;
(2) The method uses lidar detection technology to accurately perceive the spatial position of moving objects such as substation operators and vehicles, greatly improving field operation control efficiency and field operation safety;
(3) The method applies high-precision mobile positioning equipment to locate field operators, control their movement, and record their tracks, enriching and improving substation operation control technology and raising control efficiency;
(4) The method enables fusion of multiple platforms and big data, greatly improves the analysis, early-warning, and decision-support capability of the three-dimensional real-scene platform, ultimately raises the safe operation level of the power grid, and achieves deep fusion and fine-grained management of the smart grid's power flow, information flow, and service flow.
Drawings
Fig. 1 is a flowchart of the method for classifying moving objects using the collision detection algorithm.
FIG. 2 is a flow chart of a method for locating a precise spatial position of a moving person.
Fig. 3 is a schematic diagram of moving-person target recognition by the method of the present invention.
Fig. 4 is a schematic diagram of the positioning of a mobile person in a three-dimensional map scene.
Detailed Description
The technical solution of the invention is explained in detail below with reference to the accompanying drawings.
The invention provides a movable type non-inductive panoramic sensing method based on 3D laser. A collision detection algorithm between the laser beams emitted by the laser in real time and a moving object (including moving persons, vehicles, birds, and the like) identifies the category of the moving object from the size and contour features of the reflected laser beam region; moving objects classified as persons are then independently identified by their contour features, so that the moving person is accurately located in the three-dimensional map scene. On this basis, the risk of mistakenly entering or leaving a restricted area is judged, and safety warnings and prompts are provided for substation monitors and field operators.
As shown in Fig. 1, the category of the moving object is identified from the size and contour features of the reflected laser beam region by the collision detection algorithm between the laser beams emitted in real time and the moving object; the specific implementation steps are as follows:
Step S21: output the position information of the points scanned in the FOV through the 3D lidar, and convert the output data into a standard point cloud data format;
Step S22: record several frames of raw point cloud data and accumulate them to obtain the background point cloud data;
Step S23: acquire real-time point cloud data from the raw point cloud data;
Step S24: compare the real-time point cloud data with the background point cloud data to obtain the difference point cloud data;
Step S25: cluster the difference point cloud data by the distance between points to obtain several cluster point sets, each cluster representing one moving object;
Step S26: identify the category of the moving object according to its contour features.
As shown in Fig. 2, moving objects whose category is the moving person are independently identified by their contour features, so that the moving person is accurately located in the three-dimensional map scene; the specific implementation steps are as follows:
Step S31: match the point cloud scanned by the 3D lidar against the three-dimensional map to determine the position and attitude of the lidar in the scene;
Step S32: identify, by contour features, the point sets whose moving-object category is the moving person; take the latest frame of point cloud data from each such set, obtain the person's current-position point cloud, and compute its centroid;
Step S33: determine the distance of the moving person relative to the 3D lidar with a TOF method, taking half of the product of the speed of light and the transmit/receive time difference;
Step S34: convert the position of the moving person from the lidar coordinate system into the scene coordinate system, achieving accurate spatial positioning of the moving person in the three-dimensional map scene.
The following is a specific implementation process of the present invention.
The invention relates to a movable type non-inductive panoramic sensing method based on 3D laser. Its core is a collision detection algorithm between the laser beams emitted by the laser in real time and moving persons: moving persons are recognized and identified from the size and contour features of the reflected laser beam region, and then accurately located in the three-dimensional map scene. Using the data returned by the laser sensor, combined with volume and moving speed, the method can detect moving persons within a cylindrical area with a radius of 30 meters. The concrete implementation is as follows:
1. As shown in Fig. 1, capturing a moving person in the laser point cloud scene:
This step mainly studies an algorithm that recognizes the contour features of a moving person from the laser beams emitted by the laser in real time and then independently identifies the feature region:
(1) Output the position information of the points scanned in the FOV through the three-dimensional laser sensor, and convert all output data into a standard point cloud data format;
(2) Record several frames of raw point cloud data and accumulate them to obtain the background point cloud data;
(3) Acquire real-time point cloud data from the raw point cloud data;
(4) Compare the real-time point cloud data with the background point cloud data to obtain the difference point cloud data;
(5) Cluster the difference point cloud data by the distance between points to obtain several cluster point sets, each cluster representing one moving object;
(6) Combining volume and moving speed, and linking the visible-light camera picture, capture the moving person (an illustrative sketch of the differencing, clustering, and size-gating steps follows this list).
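A minimal sketch of steps (4)–(6), assuming the clouds are N×3 NumPy arrays: difference points are those far from every background point, clusters are formed by single-linkage Euclidean grouping, and a coarse bounding-box gate marks clusters that could be a person. All thresholds (0.2 m difference radius, 0.5 m cluster linkage, the person size limits) are illustrative assumptions rather than values from the patent.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.cluster.hierarchy import fcluster, linkage

def difference_cloud(frame, background, radius=0.2):
    """Step (4): keep frame points farther than `radius` from all background points."""
    d, _ = cKDTree(background).query(frame)
    return frame[d > radius]

def cluster_points(points, link_dist=0.5):
    """Step (5): group difference points into moving-object candidates by distance."""
    if len(points) < 2:
        return [points] if len(points) else []
    labels = fcluster(linkage(points, method="single"),
                      link_dist, criterion="distance")
    return [points[labels == k] for k in np.unique(labels)]

def looks_like_person(cluster):
    """Step (6), size gate only: roughly 0.3-1.2 m footprint and 1.2-2.2 m height."""
    ext = cluster.max(axis=0) - cluster.min(axis=0)
    return 0.3 < max(ext[0], ext[1]) < 1.2 and 1.2 < ext[2] < 2.2
```

The moving-speed check and the visible-light camera linkage mentioned in step (6) would be layered on top of this gating in the same way.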
Through the above process, monitoring personnel can see the moving-person target in the point cloud scene, as shown in Fig. 3.
2. Positioning the moving person in the laser point cloud scene:
This step mainly studies an algorithm for the position information of the moving person in the point cloud scene: the distance between the moving person and the lidar is analyzed from the discrete laser coordinate points and the reflection intensity information, so that the person's coordinates in the point cloud can be calculated:
(1) The lidar determines its position and attitude in the scene by matching the scanned point cloud against the three-dimensional map.
(2) Take the latest frame of point cloud data from each point cloud set representing a moving object, obtain the person's current-position point cloud, and compute its centroid.
(3) With the 16-line three-dimensional laser, ranging uses a TOF method: the distance of the moving person relative to the laser is determined as half the product of the speed of light and the transmit/receive time difference.
(4) Convert the position of the moving person from the laser coordinate system into the scene coordinate system (a minimal sketch of steps (3) and (4) follows this list).
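The sketch below illustrates steps (3) and (4) under stated assumptions: the range is half the product of the speed of light and the round-trip time, and the person's centroid in the lidar frame is mapped into the scene frame with a 4×4 homogeneous pose such as the one produced by the registration sketch above. The matrix convention and function names are assumptions, not taken from the patent.

```python
import numpy as np

C_LIGHT = 299_792_458.0  # speed of light, m/s

def tof_range(round_trip_s):
    """Step (3): range = (speed of light * transmit/receive time difference) / 2."""
    return 0.5 * C_LIGHT * round_trip_s

def to_scene_frame(point_lidar, T_scene_from_lidar):
    """Step (4): map a point from the lidar coordinate system into the scene frame."""
    p = np.append(np.asarray(point_lidar, dtype=float), 1.0)  # homogeneous coordinates
    return (T_scene_from_lidar @ p)[:3]

# Example: a 100 ns round trip corresponds to roughly 15 m
print(tof_range(100e-9))  # ~14.99 m
```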
Through the above steps, the detected position of the moving person can be accurately marked in the three-dimensional scene; monitoring personnel can see the position of the moving person in the point cloud scene, as shown by the box in Fig. 4, which indicates the person's position information.
The above are preferred embodiments of the present invention; all changes made according to the technical solution of the present invention that produce equivalent functional effects, without exceeding the scope of the technical solution, belong to the protection scope of the present invention.

Claims (2)

1. A movable type non-inductive panoramic sensing method based on 3D laser, characterized in that a collision detection algorithm between the laser beams emitted by the laser in real time and a moving object identifies the category of the moving object from the size and contour features of the reflected laser beam region; moving objects classified as persons are then independently identified by their contour features, so that the moving person is accurately located in the three-dimensional map scene; on this basis, the risk of mistakenly entering or leaving a restricted area is judged, and safety warnings and prompts are provided for substation monitors and field operators;
the method specifically realizes the identification of the category of the moving object by the size and the contour characteristics of the area of the reflected laser beam through a collision detection algorithm of the laser beam emitted by the laser in real time and the moving object, and comprises the following steps:
Step S21: output the position information of the points scanned in the FOV through the 3D lidar, and convert the output data into a standard point cloud data format;
Step S22: record several frames of raw point cloud data and accumulate them to obtain the background point cloud data;
Step S23: acquire real-time point cloud data from the raw point cloud data;
Step S24: compare the real-time point cloud data with the background point cloud data to obtain the difference point cloud data;
Step S25: cluster the difference point cloud data by the distance between points to obtain several cluster point sets, each cluster representing one moving object;
Step S26: identify the category of the moving object according to its contour features;
the method comprises the following specific implementation steps of independently identifying the moving object type as the outline characteristics of the moving person, so as to realize the accurate space position positioning of the moving person in the three-dimensional map scene:
Step S31: match the point cloud scanned by the 3D lidar against the three-dimensional map to determine the position and attitude of the lidar in the scene;
Step S32: identify, by contour features, the point sets whose moving-object category is the moving person; take the latest frame of point cloud data from each such set, obtain the person's current-position point cloud, and compute its centroid;
Step S33: determine the distance of the moving person relative to the 3D lidar with a TOF method, taking half of the product of the speed of light and the transmit/receive time difference;
Step S34: convert the position of the moving person from the lidar coordinate system into the scene coordinate system, achieving accurate spatial positioning of the moving person in the three-dimensional map scene.
2. The movable type non-inductive panoramic sensing method based on 3D laser according to claim 1, characterized in that the moving object categories include moving persons, vehicles, and birds.
CN201811536086.4A 2018-12-15 2018-12-15 Movable type non-inductive panoramic sensing method based on 3D laser Active CN109856643B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811536086.4A CN109856643B (en) 2018-12-15 2018-12-15 Movable type non-inductive panoramic sensing method based on 3D laser

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811536086.4A CN109856643B (en) 2018-12-15 2018-12-15 Movable type non-inductive panoramic sensing method based on 3D laser

Publications (2)

Publication Number Publication Date
CN109856643A CN109856643A (en) 2019-06-07
CN109856643B true CN109856643B (en) 2022-10-04

Family

ID=66891261

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811536086.4A Active CN109856643B (en) 2018-12-15 2018-12-15 Movable type non-inductive panoramic sensing method based on 3D laser

Country Status (1)

Country Link
CN (1) CN109856643B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110634143B (en) * 2019-08-28 2022-08-05 国网福建省电力有限公司 Electric power production area warning method based on laser scanning point cloud
CN110568449B (en) * 2019-10-14 2021-04-16 自然资源部第二海洋研究所 Wind-borne rough sea surface laser reflection and transmission matrix calculation method
CN110898353A (en) * 2019-12-09 2020-03-24 国网智能科技股份有限公司 Panoramic monitoring and linkage control method and system for fire-fighting robot of transformer substation
CN111412833B (en) * 2020-03-30 2021-07-30 广东电网有限责任公司电力科学研究院 Alarming method, system and equipment for positioning safe distance of three-dimensional scene of transformer substation
CN112465959B (en) * 2020-12-17 2022-07-01 国网四川省电力公司电力科学研究院 Transformer substation three-dimensional live-action model inspection method based on local scene updating
CN113298163A (en) * 2021-05-31 2021-08-24 国网湖北省电力有限公司黄石供电公司 Target identification monitoring method based on LiDAR point cloud data
CN114067472B (en) * 2021-11-29 2024-06-25 广东电网有限责任公司 Substation entering authorization management system and method
CN114879160B (en) * 2022-07-12 2022-10-14 合肥派光感知信息技术有限公司 Rail foreign matter invasion real-time monitoring method and system based on three-dimensional point cloud data

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106324619A (en) * 2016-10-28 2017-01-11 武汉大学 Automatic obstacle avoiding method of substation inspection robot
CN106406338A (en) * 2016-04-14 2017-02-15 中山大学 Omnidirectional mobile robot autonomous navigation apparatus and method based on laser range finder
WO2017197617A1 (en) * 2016-05-19 2017-11-23 深圳市速腾聚创科技有限公司 Movable three-dimensional laser scanning system and movable three-dimensional laser scanning method
CN108802758A (en) * 2018-05-30 2018-11-13 北京应互科技有限公司 A kind of Intelligent security monitoring device, method and system based on laser radar

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107818288B (en) * 2016-09-13 2019-04-09 腾讯科技(深圳)有限公司 Sign board information acquisition method and device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106406338A (en) * 2016-04-14 2017-02-15 中山大学 Omnidirectional mobile robot autonomous navigation apparatus and method based on laser range finder
WO2017197617A1 (en) * 2016-05-19 2017-11-23 深圳市速腾聚创科技有限公司 Movable three-dimensional laser scanning system and movable three-dimensional laser scanning method
CN106324619A (en) * 2016-10-28 2017-01-11 武汉大学 Automatic obstacle avoiding method of substation inspection robot
CN108802758A (en) * 2018-05-30 2018-11-13 北京应互科技有限公司 A kind of Intelligent security monitoring device, method and system based on laser radar

Also Published As

Publication number Publication date
CN109856643A (en) 2019-06-07

Similar Documents

Publication Publication Date Title
CN109856643B (en) Movable type non-inductive panoramic sensing method based on 3D laser
CN108447075B (en) Unmanned aerial vehicle monitoring system and monitoring method thereof
CN115597659B (en) Intelligent safety management and control method for transformer substation
CN108550234B (en) Label matching and fence boundary management method and device for double base stations and storage medium
CN110889350A (en) Line obstacle monitoring and alarming system and method based on three-dimensional imaging
CN103235562A (en) Patrol-robot-based comprehensive parameter detection system and method for substations
JP6524529B2 (en) Building limit judging device
CN104897132A (en) System for measuring vehicle distance through single camera, and measurement method thereof
CN110940316A (en) Navigation method and system for fire-fighting robot of transformer substation in complex environment
CN113935379B (en) Human body activity segmentation method and system based on millimeter wave radar signals
CN115272425A (en) Railway construction site area intrusion detection method and system based on three-dimensional point cloud
CN113160292B (en) Laser radar point cloud data three-dimensional modeling device and method based on intelligent mobile terminal
CN105892451A (en) Femtosecond laser processing dynamic abnormity diagnosis system and method based on internet remote monitoring
CN117274378A (en) Indoor positioning system and method based on AI vision fusion three-dimensional scene
CN113965733A (en) Binocular video monitoring method, system, computer equipment and storage medium
CN107607939B (en) Optical target tracking and positioning radar device based on real map and image
CN117253203A (en) Obstacle detecting system based on visual sensor
CN106303412A (en) Refuse dump displacement remote real time monitoring apparatus and method based on monitoring image
CN115984770A (en) Remote monitoring method for converter station based on image recognition
Qian et al. Real-time power line safety distance detection system based on LOAM SLAM
CN110375654A (en) The monitoring method of real-time detection bridge three-D displacement
CN113792645A (en) AI eyeball fusing image and laser radar
CN111046765B (en) Dangerous early warning method and system for high-speed rail
CN112697064A (en) Intelligent track deformation identification system based on vision and laser radar
Sui Research on object location method of inspection robot based on machine vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20231031

Address after: 350000 Shuikou Hydropower building, 92 Beihuan East Road, Jin'an District, Fuzhou City, Fujian Province

Patentee after: Super high voltage branch of State Grid Fujian Electric Power Co.,Ltd.

Patentee after: STATE GRID FUJIAN ELECTRIC POWER Co.,Ltd.

Address before: 350000 Shuikou Hydropower building, 92 Beihuan East Road, Jin'an District, Fuzhou City, Fujian Province

Patentee before: STATE GRID FUJIAN MAINTENANCE Co.

Patentee before: STATE GRID FUJIAN ELECTRIC POWER Co.,Ltd.

Patentee before: SHANGHAI VKINGTELE COMMUNICATION SCIENCE AND TECHNOLOGY CO.,LTD.