CN112549034B - Robot task deployment method, system, equipment and storage medium - Google Patents


Info

Publication number: CN112549034B
Authority: CN (China)
Prior art keywords: robot, inspection, task, point cloud, dimensional
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: CN202011519589.8A
Other languages: Chinese (zh)
Other versions: CN112549034A
Inventors: 苏启奖, 黄炎, 王柯, 吴昊, 麦晓明
Current and original assignee: China Southern Power Grid Power Technology Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Application filed by China Southern Power Grid Power Technology Co Ltd
Priority: CN202011519589.8A (granted as CN112549034B)
Related application: PCT/CN2021/136043 (published as WO2022135138A1)


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00: Programme-controlled manipulators
    • B25J 9/16: Programme controls
    • B25J 9/1602: characterised by the control system, structure, architecture
    • B25J 9/1605: simulation of manipulator lay-out, design, modelling of manipulator
    • B25J 9/1656: characterised by programming, planning systems for manipulators
    • B25J 9/1664: characterised by motion, path, trajectory planning
    • B25J 9/1679: characterised by the tasks executed

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)

Abstract

The application discloses a robot task deployment method, system, equipment, and storage medium. A walkable path network for the robot is formed within a scene three-dimensional point cloud model, and a pre-established robot three-dimensional model is controlled to walk in that network according to an inspection task, so that deployment information is obtained directly from the inspection task. The robot therefore does not need to be driven to each position point in the inspection field to obtain deployment information, which resolves the problems of heavy workload, very low efficiency, and inaccurate task deployment. After the task is deployed, the actual robot is controlled to inspect automatically according to the inspection task to obtain an inspection result, and an association relation is established between the inspection result and the pre-established scene three-dimensional point cloud model, so that the inspection result can be viewed within that model. This facilitates and improves information interaction between the user, the inspection task, and the corresponding equipment.

Description

Robot task deployment method, system, equipment and storage medium
Technical Field
The application relates to the technical field of inspection robots, and in particular to a robot task deployment method, system, equipment, and storage medium.
Background
Before a transformer-substation inspection robot performs inspection, corresponding inspection tasks must be deployed at specific position points according to the inspection requirements. During inspection, the robot can then be controlled to reach each position point, execute the inspection task deployed there, and complete the actions of the given task, such as meter reading or infrared detection.
At present, when deploying a task, the inspection robot must be driven to the corresponding position point in the inspection field, the relevant information of that point (such as coordinates and pose) must be recorded, and the robot's execution terminal must be controlled to complete the actions corresponding to the inspection task (such as adjusting the pan-tilt head to the corresponding angle or focal length); only then is the deployment of a single task complete. All inspection actions become possible only after every deployment task has been completed in sequence.
However, this task deployment method involves a heavy workload and usually requires manual on-site control, resulting in very low efficiency and inaccurate deployment, which may adversely affect subsequent inspection tasks. Moreover, the inspection result cannot be intuitively associated with the inspection task and is difficult to view, so information interaction between the user, the inspection task, and the corresponding equipment is poor.
Disclosure of Invention
The application provides a robot task deployment method, system, equipment, and storage medium, which solve the technical problems of heavy workload, very low efficiency, inaccurate deployment, and poor information interaction between the user, the inspection task, and the corresponding equipment.
In view of the above, a first aspect of the present application provides a robot task deployment method, including the following steps:
determining all walkable paths of the robot based on a pre-established scene three-dimensional point cloud model, thereby forming a walkable path network;
walking in the walkable path network according to an issued inspection task through a pre-established robot three-dimensional model, thereby obtaining deployment information corresponding to the inspection task;
completing task deployment of the robot according to the inspection task and the corresponding deployment information;
and controlling the robot to inspect automatically according to the inspection task to obtain an inspection result, and establishing an association relation between the inspection result and the pre-established scene three-dimensional point cloud model, so that the inspection result can be viewed in that model.
Preferably, the determining all walkable paths of the robot based on the pre-established scene three-dimensional point cloud model comprises:
collecting full scene point cloud data and image data of the actual inspection scene of the robot;
establishing a scene three-dimensional point cloud model based on the full scene point cloud data and the image data;
in the scene three-dimensional point cloud model, clustering and segmenting the full scene point cloud data with the components of the actual scene equipment as a reference, thereby obtaining a component three-dimensional point cloud model, and placing the full scene point cloud data in one-to-one correspondence with the scene three-dimensional point cloud model and the component three-dimensional point cloud model respectively.
Preferably, after obtaining the component three-dimensional point cloud model and placing the full scene point cloud data in one-to-one correspondence with the scene three-dimensional point cloud model and the component three-dimensional point cloud model, the method comprises:
importing corresponding preset component attribute information into the model components of the component three-dimensional point cloud model, wherein the preset component attribute information comprises component nameplate information and corresponding inspection history data;
and establishing, in the component three-dimensional point cloud model, an association relation between each model component and its corresponding preset component attribute information, so that the associated preset component attribute information can be viewed through the model component.
Preferably, the part nameplate information includes an actual part name, an actual part material property, and an actual part setting parameter.
Preferably, before the walking in the walkable path network according to the issued inspection task through the pre-established robot three-dimensional model, the method comprises:
establishing a robot three-dimensional model based on the robot and the execution terminal corresponding to the robot;
and importing preset robot attribute information of the robot and the execution terminal into the robot three-dimensional model, wherein the preset robot attribute information comprises actual robot material attributes and execution terminal material attributes.
Preferably, the walking in the walkable path network according to the issued inspection task through the pre-established robot three-dimensional model, thereby obtaining the deployment information corresponding to the inspection task, specifically comprises:
establishing a three-dimensional coordinate system on the pre-established scene three-dimensional point cloud model;
controlling the pre-established robot three-dimensional model to travel along a preset inspection route on the walkable path network according to the inspection task, thereby determining the inspection point locations and model components corresponding to the inspection task;
and obtaining deployment information of the robot at each inspection point location according to the three-dimensional coordinate system, wherein the deployment information comprises the coordinate values of the inspection point location, the pose of the robot, and the execution angle and execution focal length of the robot's execution terminal.
Preferably, the obtaining deployment information of the robot at each inspection point location according to the three-dimensional coordinate system specifically comprises:
calculating coordinate values of the inspection point locations and the model components based on the three-dimensional coordinate system;
calculating the point location distance between adjacent inspection point locations according to their coordinate values;
and comparing the point location distance with a preset point location distance; when the point location distance is smaller than the preset point location distance, treating the adjacent inspection point locations as the same inspection point location, so that the deployment information of that single inspection point location serves as the deployment information of the adjacent inspection point locations.
In a second aspect, the present application provides a robotic task deployment system comprising:
the path determining module is used for determining all walkable paths of the robot based on a pre-established scene three-dimensional point cloud model so as to form a walkable path network;
the acquisition module is used for walking in the walkable path network according to an issued inspection task through a pre-established robot three-dimensional model, so as to obtain deployment information corresponding to the inspection task;
the deployment module is used for completing task deployment of the robot according to the inspection task and the corresponding deployment information;
and the first association module is used for controlling the robot to automatically inspect according to the inspection task so as to obtain an inspection result, and establishing an association relationship between the inspection result and the pre-established scene three-dimensional point cloud model so as to check the inspection result in the pre-established scene three-dimensional point cloud model.
In a third aspect, the present application provides an electronic device, comprising a processor and a memory for storing program instructions, wherein the processor is configured to carry out the steps of the robot task deployment method described above when executing the computer program stored in the memory.
In a fourth aspect, the present application provides a storage medium storing a computer program which, when executed by a processor, performs the steps of the robot task deployment method as described above.
According to the technical scheme, the embodiment of the application has the following advantages:
according to the robot task deployment method, the system, the equipment and the storage medium, the walking path network of the robot is formed in the scene three-dimensional point cloud model, the robot three-dimensional model which is pre-established is controlled to walk in the walking path network according to the inspection task, and the deployment information can be obtained according to the inspection task, so that the deployment information can be obtained without driving the robot to a corresponding position point in the inspection field, and the problems of great workload, extremely low efficiency and inaccurate deployment of the task are solved. Meanwhile, after the task is deployed, an actual robot is controlled according to the inspection task to automatically inspect, an inspection result can be obtained, an incidence relation is established between the inspection result and a pre-established scene three-dimensional point cloud model, and the inspection result can be checked in the scene three-dimensional point cloud model, so that information interaction between a user and the inspection task and corresponding equipment is facilitated, and the information interaction is improved.
Drawings
Fig. 1 is a flowchart of a robot task deployment method according to a first embodiment of the present application;
fig. 2 is a flowchart of a robot task deployment method according to a second embodiment of the present application;
fig. 3 is a flowchart of a robot task deployment method according to a third embodiment of the present application;
fig. 4 is a schematic structural diagram of a robot task deployment system according to an embodiment of the present application.
Detailed Description
To make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some of the embodiments of the present application, not all of them. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application.
For easy understanding, please refer to fig. 1, the present application provides a robot task deployment method, including the following steps:
S1: determining all walkable paths of the robot based on a pre-established scene three-dimensional point cloud model, thereby forming a walkable path network;
S2: walking in the walkable path network according to an issued inspection task through a pre-established robot three-dimensional model, thereby obtaining deployment information corresponding to the inspection task;
S3: completing task deployment of the robot according to the inspection task and the corresponding deployment information;
S4: controlling the robot to inspect automatically according to the inspection task to obtain an inspection result, and establishing an association relation between the inspection result and the pre-established scene three-dimensional point cloud model, so that the inspection result can be viewed in that model.
In this embodiment, the walkable path network of the robot is formed in the scene three-dimensional point cloud model, and the pre-established robot three-dimensional model is controlled to walk in that network according to the inspection task. Because the inspection task carries inspection instructions, the pre-established robot three-dimensional model can obtain deployment information from the inspection task, so the robot does not need to be driven to the corresponding position points in the inspection field, which resolves the problems of heavy workload, very low efficiency, and inaccurate task deployment.
Meanwhile, after the task is deployed, the actual robot is controlled to inspect automatically according to the inspection task to obtain an inspection result, and an association relation is established between the inspection result and the pre-established scene three-dimensional point cloud model. The inspection result can then be viewed in the model (the viewing mode is not limited to clicking and the like), which facilitates and improves information interaction between the user, the inspection task, and the corresponding equipment.
The above is a detailed description of the first embodiment of the robot task deployment method provided by the present application; a detailed description of the second embodiment follows.
For convenience of understanding, please refer to fig. 2, the present application provides a robot task deployment method, including the following steps:
S101: collecting full scene point cloud data and image data of the actual inspection scene of the robot;
It should be noted that the full scene point cloud data and the image data of the actual inspection scene can be acquired by a lidar and a camera, respectively.
S102: establishing a scene three-dimensional point cloud model based on the full scene point cloud data and the image data;
It should be noted that the full scene point cloud data and the image data are fused; during fusion, global optimization of the scene is completed through laser frame matching, pose error estimation, and visual image detection, and the scene three-dimensional point cloud model is established.
S103: in the scene three-dimensional point cloud model, clustering and segmenting the full scene point cloud data with the components of the actual scene equipment as a reference, thereby obtaining a component three-dimensional point cloud model, and placing the full scene point cloud data in one-to-one correspondence with the scene three-dimensional point cloud model and the component three-dimensional point cloud model respectively.
It should be noted that the scene three-dimensional point cloud model includes both equipment and grounds, which therefore need to be separated. In this embodiment, components of the actual scene equipment (such as casings, porcelain bushings, and the like) are used as references, and the full scene point cloud data are clustered and segmented to obtain a component-level three-dimensional model based on the scene three-dimensional point cloud model. The full scene point cloud data are then placed in one-to-one correspondence with the scene three-dimensional point cloud model and the component three-dimensional point cloud model respectively, making the simulated inspection environment more realistic.
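The clustering-and-segmentation step above can be sketched as a naive Euclidean clustering pass over the point cloud. This is an illustrative stand-in only: the patent does not name a specific clustering algorithm, and the function name, `radius` parameter, and sample data below are assumptions.

```python
import numpy as np

def euclidean_cluster(points, radius):
    """Naive O(n^2) Euclidean clustering: points closer than `radius` are
    grouped (transitively) into the same component-level cluster."""
    n = len(points)
    labels = [-1] * n          # -1 marks an unvisited point
    cluster = 0
    for seed in range(n):
        if labels[seed] != -1:
            continue
        stack = [seed]          # flood-fill outward from each unvisited seed
        labels[seed] = cluster
        while stack:
            j = stack.pop()
            dists = np.linalg.norm(points - points[j], axis=1)
            for k in np.where(dists <= radius)[0]:
                if labels[k] == -1:
                    labels[k] = cluster
                    stack.append(k)
        cluster += 1
    return labels

# Two well-separated blobs stand in for two equipment components.
pts = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0],
                [5.0, 5.0, 0.0], [5.1, 5.0, 0.0]])
print(euclidean_cluster(pts, radius=0.5))  # [0, 0, 1, 1]
```

In a real deployment a spatial index (k-d tree) or a library routine would replace the O(n^2) scan, since substation scans contain millions of points.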
S104: importing corresponding preset component attribute information into the model components of the component three-dimensional point cloud model, wherein the preset component attribute information comprises component nameplate information and corresponding inspection history data;
It is understood that the component nameplate information includes the actual component name, actual component material attributes, and actual component setting parameters; the inspection history data are previous inspection data, and their presentation form is not limited to curves, tables, and the like.
S105: establishing, in the component three-dimensional point cloud model, an association relation between each model component and its corresponding preset component attribute information, so that the associated preset component attribute information can be viewed through the model component.
It can be understood that establishing this association relation allows the associated preset component attribute information to be viewed through the model component; in a typical example, clicking a model component displays its associated preset component attribute information, which facilitates viewing and improves information interactivity.
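The association between a model component and its attribute record can be as simple as a keyed lookup. A minimal sketch follows; the component identifier, nameplate fields, and history entries are all hypothetical examples, not data from the patent.

```python
# Hypothetical attribute records keyed by model-component identifier.
component_attrs = {
    "bushing_bay3_01": {
        "nameplate": {"name": "110 kV porcelain bushing",
                      "material": "porcelain",
                      "rated_voltage_kV": 110},
        "inspection_history": [  # previous inspection data (curve/table source)
            {"date": "2020-11-02", "temperature_C": 34.1},
            {"date": "2020-12-01", "temperature_C": 35.6},
        ],
    },
}

def on_component_clicked(component_id):
    """Return the preset attribute information associated with a clicked
    model component, or an empty record when none is registered."""
    return component_attrs.get(component_id, {})

record = on_component_clicked("bushing_bay3_01")
print(record["nameplate"]["name"])  # 110 kV porcelain bushing
```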
S106: determining all walkable paths of the robot based on the scene three-dimensional point cloud model, thereby forming a walkable path network;
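One plausible representation of the walkable path network formed in S106 is an undirected graph of waypoints, over which a route can be searched. The waypoint names and adjacency below are invented for illustration; in practice the nodes would be derived from the scene three-dimensional point cloud model.

```python
from collections import deque

# Walkable path network as an undirected graph of waypoints (hypothetical data).
path_network = {
    "gate": ["junction_a"],
    "junction_a": ["gate", "transformer_1", "junction_b"],
    "junction_b": ["junction_a", "transformer_2"],
    "transformer_1": ["junction_a"],
    "transformer_2": ["junction_b"],
}

def route(network, start, goal):
    """Breadth-first search for a shortest walkable route through the network."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in network[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # goal unreachable from start

print(route(path_network, "gate", "transformer_2"))
# ['gate', 'junction_a', 'junction_b', 'transformer_2']
```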
S107: establishing a robot three-dimensional model based on the robot and the execution terminal corresponding to the robot;
It can be understood that the execution terminal is the device with which the robot performs the inspection task, such as a camera, a pan-tilt head, and the like.
S108: importing preset robot attribute information of the robot and the execution terminal into the robot three-dimensional model, wherein the preset robot attribute information comprises actual robot material attributes and execution terminal material attributes;
It can be understood that importing the preset robot attribute information after the robot three-dimensional model is established from the robot and its corresponding execution terminal makes the established model closer to the real situation.
S109: walking in the walkable path network according to the issued inspection task through the robot three-dimensional model, thereby obtaining deployment information corresponding to the inspection task;
S110: completing task deployment of the robot according to the inspection task and the corresponding deployment information;
It can be understood that importing the inspection task and the corresponding deployment information into the robot's background management system completes rapid, automatic deployment of the inspection point locations, without driving the robot to the actual positions or manually controlling the complicated deployment process at the detection end.
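A deployment record handed to the background management system might bundle the fields listed later in S210 (coordinates, pose, execution angle, execution focal length). The schema and registry function below are hypothetical sketches, not the patent's actual interface.

```python
# Hypothetical deployment record for one inspection point location.
deployment = {
    "task_id": "infrared-check-bay3",
    "point": {"x": 12.40, "y": 3.75, "z": 0.0},      # inspection point coordinates
    "pose_deg": 45.0,                                 # robot heading at the point
    "terminal": {"pan_deg": 10.0, "tilt_deg": -5.0,   # execution angle
                 "focal_mm": 96.0},                   # execution focal length
}

def deploy_task(registry, task_id, record):
    """Register one task's deployment information with the management system."""
    registry[task_id] = record
    return registry

registry = deploy_task({}, deployment["task_id"], deployment)
print(sorted(registry))  # ['infrared-check-bay3']
```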
S111: controlling the robot to inspect automatically according to the inspection task to obtain an inspection result, and establishing an association relation between the inspection result and the scene three-dimensional point cloud model, so that the inspection result can be viewed in the model.
It should be noted that, after viewing the inspection result, the user can appropriately fine-tune the deployment information according to the robot's actual execution, thereby completing accurate task deployment.
In this embodiment, the scene three-dimensional point cloud model is built by modeling the actual inspection scene, the walkable path network of the robot is formed in that model, and the robot three-dimensional model is built from the actual robot and its corresponding execution terminal, which improves the accuracy of task deployment.
The robot three-dimensional model is controlled to walk in the walkable path network according to the inspection task. Because the inspection task carries inspection instructions, the robot three-dimensional model can obtain deployment information from it, so the robot does not need to be driven to the corresponding position points in the inspection field, which resolves the problems of heavy workload, very low efficiency, and inaccurate task deployment.
Meanwhile, after the task is deployed, the actual robot is controlled to inspect automatically according to the inspection task to obtain an inspection result, and an association relation is established between the inspection result and the scene three-dimensional point cloud model. The inspection result can then be viewed in the model (the viewing mode is not limited to clicking and the like), which facilitates and improves information interaction between the user, the inspection task, and the corresponding equipment.
The above is a detailed description of the second embodiment of the robot task deployment method provided by the present application; a detailed description of the third embodiment follows.
For convenience of understanding, please refer to fig. 3, the present application provides a robot task deployment method, including the following steps:
S201: collecting full scene point cloud data and image data of the actual inspection scene of the robot;
It should be noted that the full scene point cloud data and the image data of the actual inspection scene can be acquired by a lidar and a camera, respectively.
S202: establishing a scene three-dimensional point cloud model based on the full scene point cloud data and the image data;
It should be noted that the full scene point cloud data and the image data are fused; during fusion, global optimization of the scene is completed through laser frame matching, pose error estimation, and visual image detection, and the scene three-dimensional point cloud model is established.
S203: in the scene three-dimensional point cloud model, clustering and segmenting the full scene point cloud data with the components of the actual scene equipment as a reference, thereby obtaining a component three-dimensional point cloud model, and placing the full scene point cloud data in one-to-one correspondence with the scene three-dimensional point cloud model and the component three-dimensional point cloud model respectively.
It should be noted that the scene three-dimensional point cloud model includes both equipment and grounds, which therefore need to be separated. In this embodiment, components of the actual scene equipment (such as casings, porcelain bushings, and the like) are used as references, and the full scene point cloud data are clustered and segmented to obtain a component-level three-dimensional model based on the scene three-dimensional point cloud model. The full scene point cloud data are then placed in one-to-one correspondence with the scene three-dimensional point cloud model and the component three-dimensional point cloud model respectively, making the simulated inspection environment more realistic.
S204: importing corresponding preset component attribute information into the model components of the component three-dimensional point cloud model, wherein the preset component attribute information comprises component nameplate information and corresponding inspection history data;
It is understood that the component nameplate information includes the actual component name, actual component material attributes, and actual component setting parameters; the inspection history data are previous inspection data, and their presentation form is not limited to curves, tables, and the like.
S205: establishing, in the component three-dimensional point cloud model, an association relation between each model component and its corresponding preset component attribute information, so that the associated preset component attribute information can be viewed through the model component.
It can be understood that establishing this association relation allows the associated preset component attribute information to be viewed through the model component; in a typical example, clicking a model component displays its associated preset component attribute information, which facilitates viewing and improves information interactivity.
S206: establishing a robot three-dimensional model based on the robot and the execution terminal corresponding to the robot;
It can be understood that the execution terminal is the device with which the robot performs the inspection task, such as a camera, a pan-tilt head, and the like.
S207: importing preset robot attribute information of the robot and the execution terminal into the robot three-dimensional model, wherein the preset robot attribute information comprises actual robot material attributes and execution terminal material attributes;
It can be understood that importing the preset robot attribute information after the robot three-dimensional model is established from the robot and its corresponding execution terminal makes the established model closer to the real situation.
S208: establishing a three-dimensional coordinate system on the scene three-dimensional point cloud model;
s209: controlling the robot three-dimensional model to walk along a preset inspection route on the walking path network according to the inspection task, so as to determine inspection point positions and model components corresponding to the inspection task;
it can be understood that the inspection task has instruction information and comprises a preset inspection route, the robot three-dimensional model performs inspection work along the inspection route according to the inspection task, and the robot three-dimensional model performs positioning at the position point location to be inspected to detect the model component, so that the corresponding inspection point location and the model component are determined.
S210: and acquiring deployment information of the inspection point positions corresponding to the robot according to the three-dimensional coordinate system, wherein the deployment information comprises coordinate values of the inspection point positions, the pose of the robot, and the execution angle and the execution focal length of an execution terminal of the robot.
It should be noted that, after the inspection point location and the model component corresponding to the inspection task are determined, since the inspection point location and the model component are both in the three-dimensional coordinate system, the coordinate values of the inspection point location and the coordinate values of the model component can be determined according to the known coordinates of the three-dimensional coordinate system.
The pose (orientation) of the robot can be determined from the coordinate deviations of the three-dimensional model of the robot with respect to the X-axis and Y-axis of the three-dimensional coordinate system.
The execution angle can be determined according to the coordinate deviation of the execution terminal (such as a holder) relative to the X-axis, the Y-axis and the Z-axis of the three-dimensional coordinate system.
The execution focal length of the execution terminal can be obtained according to internal parameters in the execution terminal (such as a camera) and the distance between the execution terminal and the model part.
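The three derivations in S210 (robot pose from the X/Y deviation, execution angle from the X/Y/Z deviation, execution focal length from the terminal's internal parameters and the distance to the model component) can be sketched as below. The pinhole framing rule, the sensor width, the target size, and the function name are illustrative assumptions, not values taken from the patent.

```python
import math

def deployment_at_point(robot_xy, robot_heading_xy, target_xyz, robot_z=0.0,
                        sensor_width_mm=6.4, target_size_m=0.5):
    """Derive one deployment record as described in S210 (sketch)."""
    dx = target_xyz[0] - robot_xy[0]
    dy = target_xyz[1] - robot_xy[1]
    dz = target_xyz[2] - robot_z
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    # Robot pose: yaw of the heading vector relative to the scene X axis.
    yaw = math.degrees(math.atan2(robot_heading_xy[1], robot_heading_xy[0]))
    # Execution angles: pan/tilt of the terminal (e.g. a gimbal) toward the target.
    pan = math.degrees(math.atan2(dy, dx))
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    # Execution focal length: pinhole model, framing the target across the sensor.
    focal_mm = sensor_width_mm * dist / target_size_m
    return {"yaw_deg": yaw, "pan_deg": pan, "tilt_deg": tilt,
            "distance_m": round(dist, 3), "focal_mm": round(focal_mm, 1)}

print(deployment_at_point((0, 0), (1, 0), (3.0, 4.0, 2.0)))
```

The focal-length line encodes the statement above: with fixed internal parameters (sensor width) and a desired framing, the required focal length grows linearly with the distance between the execution terminal and the model component.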
S211: finishing task deployment of the robot according to the inspection task and the corresponding deployment information;
s212: and controlling the robot to automatically inspect according to the inspection task so as to obtain an inspection result, and establishing an association relation between the inspection result and the scene three-dimensional point cloud model so as to check the inspection result in the scene three-dimensional point cloud model.
Further, acquiring the deployment information of the inspection point locations corresponding to the robot according to the three-dimensional coordinate system in step S210 specifically includes:
S2101: calculating the coordinate values of the inspection point locations and the model components based on the three-dimensional coordinate system;
S2102: calculating the point location distance between adjacent inspection point locations according to their coordinate values;
S2103: comparing the point location distance with a preset point location distance; when the point location distance is smaller than the preset point location distance, the corresponding adjacent inspection point locations are treated as the same inspection point location, so that the deployment information of that single inspection point location serves as the deployment information of the adjacent inspection point locations.
It can be understood that when adjacent inspection point locations are dense, closer together than the preset point location distance, they can be regarded as one inspection point location, and the visual range of the execution terminal can likewise be regarded as the same range; this simplifies the task deployment work and improves deployment efficiency.
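The merge described in S2101 to S2103 can be sketched as a single greedy pass along the ordered inspection points; the value of the preset point location distance and the greedy strategy are assumptions, as the patent only states the comparison rule.

```python
import math

def merge_inspection_points(points, min_spacing=1.0):
    """Merge adjacent inspection points that are closer than the preset
    point location distance (min_spacing); the survivor's deployment
    information is reused for the merged neighbors."""
    merged = []
    for p in points:
        if merged and math.dist(merged[-1], p) < min_spacing:
            continue  # same inspection point location: reuse previous deployment
        merged.append(p)
    return merged

pts = [(0, 0, 0), (0.4, 0, 0), (2.0, 0, 0), (2.3, 0.2, 0)]
print(merge_inspection_points(pts))  # -> [(0, 0, 0), (2.0, 0, 0)]
```

Four candidate points collapse to two deployment records, which is exactly the efficiency gain the paragraph above describes.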
The above is a detailed description of the third embodiment of the robot task deployment method provided by the present invention; the following is a detailed description of an embodiment of the robot task deployment system provided by the present invention.
For ease of understanding, referring to fig. 4, the present application provides a robotic task deployment system comprising:
a path determining module 100, configured to determine all walkable paths of the robot based on a pre-established scene three-dimensional point cloud model, so as to form a walkable path network;
the acquiring module 101 is used for walking in the walkable path network according to an issued inspection task through the pre-established robot three-dimensional model, so as to acquire the deployment information corresponding to the inspection task;
the deployment module 102 is used for completing task deployment of the robot according to the inspection task and the corresponding deployment information;
the first association module 103 is used for controlling the robot to inspect automatically according to the inspection task so as to obtain an inspection result, and for establishing an association between the inspection result and the pre-established scene three-dimensional point cloud model, so that the inspection result can be viewed in the pre-established scene three-dimensional point cloud model.
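The path determining module's output, the walkable path network, can be sketched as a connectivity search over a 2-D occupancy grid projected from the scene point cloud. The grid abstraction, the 4-connectivity, and the function name are assumptions; the patent does not describe how walkable paths are extracted from the model.

```python
from collections import deque

def walkable_network(ground_cells, start):
    """Collect every grid cell reachable from `start` across free ground
    cells (4-connected BFS). `ground_cells` would be obtained by projecting
    drivable ground points of the scene point cloud onto a grid (assumed)."""
    network, frontier = {start}, deque([start])
    while frontier:
        x, y = frontier.popleft()
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (nx, ny) in ground_cells and (nx, ny) not in network:
                network.add((nx, ny))
                frontier.append((nx, ny))
    return network

cells = {(0, 0), (1, 0), (2, 0), (2, 1), (5, 5)}  # (5, 5) is unreachable
print(sorted(walkable_network(cells, (0, 0))))
```

The isolated cell is excluded, reflecting that the walkable path network only contains paths the robot can actually reach.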
Further, the system further comprises:
the acquisition module is used for acquiring full-scene point cloud data and image data of the actual inspection scene of the robot;
the first establishing module is used for establishing the scene three-dimensional point cloud model based on the full-scene point cloud data and the image data;
and the second establishing module is used for clustering and dividing the full-scene point cloud data in the scene three-dimensional point cloud model, taking the components of the actual scene equipment as a reference, so as to obtain a component three-dimensional point cloud model, whereby the full-scene point cloud data correspond one-to-one with both the scene three-dimensional point cloud model and the component three-dimensional point cloud model.
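The second establishing module's clustering step can be sketched with a simple Euclidean single-link grouping: points closer than a radius are placed in the same component cluster. The patent does not specify the clustering algorithm or its parameters, so this O(n²) grouping and its radius are purely illustrative.

```python
import math

def cluster_points(points, radius=0.3):
    """Label each point with a component cluster id; points within `radius`
    of any point already in a cluster join that cluster (single-link)."""
    labels = [-1] * len(points)
    cluster_id = 0
    for i in range(len(points)):
        if labels[i] != -1:
            continue
        labels[i] = cluster_id
        stack = [i]
        while stack:  # flood-fill the cluster through chained neighbors
            j = stack.pop()
            for k, q in enumerate(points):
                if labels[k] == -1 and math.dist(points[j], q) <= radius:
                    labels[k] = cluster_id
                    stack.append(k)
        cluster_id += 1
    return labels

pts = [(0, 0, 0), (0.1, 0, 0), (0.2, 0.1, 0), (5, 5, 0), (5.1, 5, 0)]
print(cluster_points(pts))  # -> [0, 0, 0, 1, 1]
```

Each resulting cluster would become one component three-dimensional point cloud model, keeping the one-to-one correspondence with the full-scene data described above.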
Further, the system further comprises:
the first importing module is used for importing corresponding preset component attribute information for the model components in the component three-dimensional point cloud model, wherein the preset component attribute information comprises component nameplate information and the corresponding inspection history data;
in this embodiment, the part nameplate information includes the actual part name, the actual part material properties, and the actual part setup parameters.
And the second association module is used for establishing the association relationship between the model component and the corresponding preset component attribute information in the three-dimensional point cloud model of the component, so that the associated preset component attribute information is checked through the model component.
Further, the system further comprises:
the third establishing module is used for establishing a robot three-dimensional model based on the robot and an executing terminal corresponding to the robot;
and the second import module is used for importing preset robot attribute information of the robot and the execution terminal into the three-dimensional model of the robot, and the preset robot attribute information comprises actual robot material attributes and execution terminal material attributes.
Further, the system further comprises:
the fourth establishing module is used for establishing a three-dimensional coordinate system on a pre-established scene three-dimensional point cloud model;
the determining module is used for controlling the pre-established robot three-dimensional model to walk along the preset inspection route on the walkable path network according to the inspection task, so as to determine the inspection point locations and model components corresponding to the inspection task;
the obtaining module 101 is further configured to obtain deployment information of the inspection point corresponding to the robot according to the three-dimensional coordinate system, where the deployment information includes a coordinate value of the inspection point, a pose of the robot, and an execution angle and an execution focal length of an execution terminal of the robot.
Further, the system further comprises:
the first calculation module is used for calculating coordinate values of the inspection point location and the model component based on the three-dimensional coordinate system;
the second calculation module is used for calculating the point location distance between adjacent inspection point locations according to their coordinate values;
and the comparison module is used for comparing the point location distance with the preset point location distance, and when the point location distance is smaller than the preset point location distance, the corresponding adjacent inspection point locations are taken as the same inspection point location, so that the deployment information of the same inspection point location is taken as the deployment information of the corresponding adjacent inspection point locations.
The present invention also provides an electronic device, comprising: a processor and a memory, the memory for storing program instructions, the processor being adapted to carry out the steps of the robot task deployment method as in the above embodiments when executing the computer program stored in the memory.
The invention also provides a storage medium storing a computer program which, when executed by a processor, implements the steps of the robot task deployment method as in the above embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence or in the part contributing over the prior art, or all or part of it, may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (9)

1. A robot task deployment method is characterized by comprising the following steps:
determining all walkable paths of the robot based on a pre-established scene three-dimensional point cloud model, thereby forming a walkable path network;
walking in the walkable path network according to an issued inspection task through a pre-established robot three-dimensional model, so as to obtain deployment information corresponding to the inspection task;
finishing task deployment of the robot according to the inspection task and the corresponding deployment information;
controlling the robot to automatically inspect according to the inspection task so as to obtain an inspection result, and establishing an association relationship between the inspection result and the pre-established scene three-dimensional point cloud model so as to check the inspection result in the pre-established scene three-dimensional point cloud model;
the method comprises the following steps of, before walking in the network of the walking-possible path according to a transmitted patrol task through a pre-established three-dimensional model of the robot:
establishing a robot three-dimensional model based on the robot and an execution terminal corresponding to the robot;
and importing preset robot attribute information of the robot and the execution terminal into the three-dimensional robot model, wherein the preset robot attribute information comprises actual robot material attributes and execution terminal material attributes.
2. The method of claim 1, wherein determining all walkable paths of the robot based on the pre-established three-dimensional point cloud model of the scene comprises:
collecting full scene point cloud data and image data of an actual patrol scene of the robot;
establishing a scene three-dimensional point cloud model based on the full scene point cloud data and the image data;
in the scene three-dimensional point cloud model, the parts of actual scene equipment are taken as a reference, the full scene point cloud data are clustered and divided, so that a part three-dimensional point cloud model is obtained, and the full scene point cloud data are respectively in one-to-one correspondence with the scene three-dimensional point cloud model and the part three-dimensional point cloud model.
3. The robotic task deployment method of claim 2, wherein after the obtaining a part three-dimensional point cloud model and thereby achieving the one-to-one correspondence of the full scene point cloud data with the scene three-dimensional point cloud model and the part three-dimensional point cloud model, respectively, comprises:
importing corresponding preset component attribute information to a model component in the three-dimensional point cloud model of the component, wherein the preset component attribute information comprises component nameplate information and corresponding routing inspection historical data;
and establishing an association relation between the model component and the corresponding preset component attribute information in the three-dimensional point cloud model of the component, so that the associated preset component attribute information can be checked through the model component.
4. The robotic task deployment method of claim 3, wherein the part nameplate information includes actual part name, actual part material properties, and actual part setup parameters.
5. The robot task deployment method according to claim 3, wherein the walking in the walkable path network according to the issued inspection task through the pre-established robot three-dimensional model so as to obtain the deployment information corresponding to the inspection task specifically comprises:
establishing a three-dimensional coordinate system on the pre-established scene three-dimensional point cloud model;
controlling the pre-established robot three-dimensional model to travel along a preset inspection route on the walkable path network according to the inspection task, so as to determine inspection point locations and model components corresponding to the inspection task;
and acquiring deployment information of the inspection point location corresponding to the robot according to a three-dimensional coordinate system, wherein the deployment information comprises coordinate values of the inspection point location, the pose of the robot, and an execution angle and an execution focal length of an execution terminal of the robot.
6. The robot task deployment method according to claim 5, wherein the obtaining deployment information of the inspection point locations corresponding to the robot according to the three-dimensional coordinate system specifically includes:
calculating coordinate values of the inspection point location and the model component based on a three-dimensional coordinate system;
calculating the point location distance of the corresponding adjacent inspection point location according to the coordinate values of the adjacent inspection point locations;
and comparing the point location distance with a preset point location distance, and when the point location distance is smaller than the preset point location distance, taking the corresponding adjacent inspection point locations as the same inspection point location, so that the deployment information of the same inspection point location is taken as the deployment information of the corresponding adjacent inspection point locations.
7. A robotic task deployment system, comprising:
the path determining module is used for determining all walkable paths of the robot based on a pre-established scene three-dimensional point cloud model so as to form a walkable path network;
the acquisition module is used for walking in the walkable path network according to an issued inspection task through a pre-established robot three-dimensional model, so as to acquire deployment information corresponding to the inspection task;
the deployment module is used for completing task deployment of the robot according to the inspection task and the corresponding deployment information;
the first association module is used for controlling the robot to automatically inspect according to the inspection task so as to obtain an inspection result, and establishing an association relationship between the inspection result and the pre-established scene three-dimensional point cloud model so as to check the inspection result in the pre-established scene three-dimensional point cloud model;
the third establishing module is used for establishing a robot three-dimensional model based on the robot and an executing terminal corresponding to the robot;
and the second import module is used for importing preset robot attribute information of the robot and the execution terminal into the three-dimensional model of the robot, and the preset robot attribute information comprises actual robot material attributes and execution terminal material attributes.
8. An electronic device, comprising: a processor and a memory for storing program instructions, the processor being configured to implement the steps of the robot task deployment method of any of claims 1-6 when executing the computer program stored in the memory.
9. A storage medium, characterized in that it stores a computer program which, when being executed by a processor, carries out the steps of the robot task deployment method according to any one of claims 1 to 6.
CN202011519589.8A 2020-12-21 2020-12-21 Robot task deployment method, system, equipment and storage medium Active CN112549034B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011519589.8A CN112549034B (en) 2020-12-21 2020-12-21 Robot task deployment method, system, equipment and storage medium
PCT/CN2021/136043 WO2022135138A1 (en) 2020-12-21 2021-12-07 Robot task deployment method and system, device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011519589.8A CN112549034B (en) 2020-12-21 2020-12-21 Robot task deployment method, system, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112549034A CN112549034A (en) 2021-03-26
CN112549034B true CN112549034B (en) 2021-09-03

Family

ID=75031636

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011519589.8A Active CN112549034B (en) 2020-12-21 2020-12-21 Robot task deployment method, system, equipment and storage medium

Country Status (2)

Country Link
CN (1) CN112549034B (en)
WO (1) WO2022135138A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112549034B (en) * 2020-12-21 2021-09-03 南方电网电力科技股份有限公司 Robot task deployment method, system, equipment and storage medium
CN113190019B (en) * 2021-05-26 2023-05-16 立得空间信息技术股份有限公司 Virtual simulation-based routing inspection robot task point arrangement method and system
CN113500605B (en) * 2021-09-13 2022-01-25 中科开创(广州)智能科技发展有限公司 Inspection task visualization method and device, computer equipment and storage medium
CN114089770B (en) * 2021-11-23 2024-04-12 广东电网有限责任公司 Inspection point position generation method and related device for substation inspection robot
CN115755954B (en) * 2022-10-28 2023-07-25 佳源科技股份有限公司 Routing inspection path planning method, system, computer equipment and storage medium

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100779510B1 (en) * 2007-05-23 2007-11-27 문용선 Patrol robot and control system therefor
CN106710001A (en) * 2016-12-29 2017-05-24 山东鲁能智能技术有限公司 Substation inspection robot based centralized monitoring and simulation system and method thereof
CN108422435A (en) * 2018-03-21 2018-08-21 青岛理工大学 Remote monitoring and control system based on augmented reality
CN108805327A (en) * 2018-04-23 2018-11-13 西安科技大学 The method and system of robot path planning and environment rebuilt based on virtual reality
CN108839016A (en) * 2018-06-11 2018-11-20 深圳市百创网络科技有限公司 Robot method for inspecting, storage medium, computer equipment and crusing robot
CN109240311A (en) * 2018-11-19 2019-01-18 国网四川省电力公司电力科学研究院 Outdoor power field construction operation measure of supervision based on intelligent robot
CN110648420A (en) * 2019-09-20 2020-01-03 云南恒协科技有限公司 Intelligent inspection system for scheduling master station system equipment
CN110722559A (en) * 2019-10-25 2020-01-24 国网山东省电力公司信息通信公司 Auxiliary inspection positioning method for intelligent inspection robot
CN111897332A (en) * 2020-07-30 2020-11-06 国网智能科技股份有限公司 Semantic intelligent substation robot humanoid inspection operation method and system
CN111968262A (en) * 2020-07-30 2020-11-20 国网智能科技股份有限公司 Semantic intelligent substation inspection operation robot navigation system and method

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105182792B (en) * 2015-08-10 2017-10-03 西南科技大学 Robot manipulating task analogue system and method under a kind of nuclear radiation environment
CN110544298B (en) * 2019-08-29 2023-06-30 中国南方电网有限责任公司 Substation modeling method, device, computer equipment and storage medium
CN110908370B (en) * 2019-10-31 2023-03-21 华能国际电力股份有限公司海门电厂 Unmanned inspection task planning method and system for thermal power plant
CN111552306A (en) * 2020-04-10 2020-08-18 安徽继远软件有限公司 Unmanned aerial vehicle path generation method and device supporting pole tower key component inspection
CN112549034B (en) * 2020-12-21 2021-09-03 南方电网电力科技股份有限公司 Robot task deployment method, system, equipment and storage medium


Also Published As

Publication number Publication date
WO2022135138A1 (en) 2022-06-30
CN112549034A (en) 2021-03-26

Similar Documents

Publication Publication Date Title
CN112549034B (en) Robot task deployment method, system, equipment and storage medium
CN111442722B (en) Positioning method, positioning device, storage medium and electronic equipment
CN110689585B (en) Multi-phase external parameter combined calibration method, device, equipment and medium
CN112258567B (en) Visual positioning method and device for object grabbing point, storage medium and electronic equipment
CN111415409B (en) Modeling method, system, equipment and storage medium based on oblique photography
CN108053473A (en) A kind of processing method of interior three-dimensional modeling data
CN105637435A (en) A method and a device for verifying one or more safety volumes for a movable mechanical unit
CN112652016A (en) Point cloud prediction model generation method, pose estimation method and device
CN110443850B (en) Target object positioning method and device, storage medium and electronic device
CN110648363A (en) Camera posture determining method and device, storage medium and electronic equipment
CN111462029A (en) Visual point cloud and high-precision map fusion method and device and electronic equipment
CN107504917B (en) Three-dimensional size measuring method and device
CN111578951B (en) Method and device for generating information in automatic driving
CN110796738A (en) Three-dimensional visualization method and device for tracking state of inspection equipment
CN109064499B (en) Multilayer frame anti-seismic experiment high-speed video measurement method based on distributed analysis
CN110000793A (en) A kind of motion planning and robot control method, apparatus, storage medium and robot
CN113063421A (en) Navigation method and related device, mobile terminal and computer readable storage medium
CN111583338B (en) Positioning method and device for unmanned equipment, medium and unmanned equipment
CN105468881A (en) Live scenery distance calculation method and device based on aerial photographing images
CN113473118B (en) Data timestamp alignment method, device, equipment and storage medium
CN111985266A (en) Scale map determination method, device, equipment and storage medium
CN113592897B (en) Point cloud data labeling method and device
CN111784797A (en) Robot networking interaction method, device and medium based on AR
CN113218392A (en) Indoor positioning navigation method and navigation device
WO2024164812A1 (en) Multi-sensor fusion-based slam method and device, and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant