CN112327898B - Unmanned aerial vehicle well inspection navigation method and device and unmanned aerial vehicle - Google Patents

Unmanned aerial vehicle well inspection navigation method and device and unmanned aerial vehicle

Info

Publication number
CN112327898B
Authority
CN
China
Prior art keywords
unmanned aerial vehicle
vehicle body
current
distance
Prior art date
Legal status
Active
Application number
CN202011228810.4A
Other languages
Chinese (zh)
Other versions
CN112327898A (en)
Inventor
唐崇
仲兆峰
黄立明
李基源
Current Assignee
Hitachi Building Technology Guangzhou Co Ltd
Original Assignee
Hitachi Building Technology Guangzhou Co Ltd
Priority date
Filing date
Publication date
Application filed by Hitachi Building Technology Guangzhou Co Ltd
Priority to CN202011228810.4A
Publication of CN112327898A
Application granted
Publication of CN112327898B

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/08 Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D 1/0808 Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • G05D 1/10 Simultaneous control of position or course in three dimensions
    • G05D 1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application relates to a hoistway inspection navigation method and device for an unmanned aerial vehicle, and to the unmanned aerial vehicle. The hoistway inspection navigation method comprises: outputting a rising instruction, and acquiring initial position information of the unmanned aerial vehicle body when the unmanned aerial vehicle body is at the initial station; acquiring current position information of the unmanned aerial vehicle body during flight, and processing the initial position information and the current position information to obtain a position deviation; correcting the ascending route of the unmanned aerial vehicle body according to the position deviation; on detecting that the unmanned aerial vehicle body has reached the next station along the corrected ascending route, entering the acquisition flow for the point cloud data and image data of the current station; and, when the acquisition flow is completed, outputting a take-off instruction, until all ascending stations have completed the acquisition flow. The take-off instruction is used for instructing the unmanned aerial vehicle body to move from the current station to the next station. With this method, the unmanned aerial vehicle can complete autonomous inspection navigation even though the GPS signal in the hoistway is weak.

Description

Unmanned aerial vehicle well inspection navigation method and device and unmanned aerial vehicle
Technical Field
The application relates to the technical field of hoistway inspection, and in particular to a hoistway inspection navigation method and device for an unmanned aerial vehicle, and to the unmanned aerial vehicle itself.
Background
The hoistway is an important component of elevator equipment: it provides an enclosed space that performs the functions of sound insulation, shock absorption and protection of safe elevator operation. The unmanned aerial vehicle, as a novel information acquisition carrier, has the advantages of high flexibility, strong operability, low cost and low requirements on the operating environment, and can be used to carry out inspection work.
In the implementation process, the inventors found that the conventional technology has at least the following problem: traditional unmanned aerial vehicle navigation methods, which generally rely on GPS, cannot be used inside an elevator hoistway, where the GPS signal is weak.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a hoistway inspection navigation method and apparatus for an unmanned aerial vehicle, and an unmanned aerial vehicle capable of realizing inspection of an elevator hoistway.
In order to achieve the above object, in one aspect, an embodiment of the present invention provides a method for inspecting and navigating a hoistway of an unmanned aerial vehicle, including the steps of:
outputting a rising instruction, and acquiring initial position information of the unmanned aerial vehicle body under the condition that the unmanned aerial vehicle body is at an initial station;
acquiring current position information of an unmanned aerial vehicle body in the flight process, and processing initial position information and current position information to obtain position deviation;
correcting the ascending route of the unmanned aerial vehicle body according to the position deviation;
detecting that the unmanned aerial vehicle body has reached the next station along the corrected ascending route, and entering the acquisition flow for the point cloud data and image data of the current station; when the acquisition flow is completed, outputting a take-off instruction, and repeating until all ascending stations have completed the acquisition flow; the take-off instruction is used for instructing the unmanned aerial vehicle body to move from the current station to the next station.
In one embodiment, the current location information includes a first current horizontal coordinate of the four walls of the hoistway relative to the unmanned aerial vehicle body; the initial position information comprises first initial horizontal coordinates of four walls of a well relative to the unmanned aerial vehicle body;
the step of processing the initial position information and the current position information to obtain the position deviation includes:
and confirming the coordinate difference value of the first current horizontal coordinate and the first initial horizontal coordinate as the position deviation.
In one embodiment, the step of processing the initial position information and the current position information to obtain the position deviation includes:
acquiring first current horizontal coordinates in a preset time period, and acquiring average horizontal coordinates according to each first current horizontal coordinate;
and confirming the coordinate difference value between the average horizontal coordinate and the first initial horizontal coordinate as the position deviation.
In one embodiment, the step of obtaining initial position information of the unmanned aerial vehicle body includes:
acquiring an initial distance value and an initial scanning angle returned by the beam light in a scanning period through a laser radar;
obtaining a first initial horizontal coordinate according to the initial distance value and the initial scanning angle;
the step of obtaining the current horizontal position of the unmanned aerial vehicle body in the flight process comprises the following steps:
acquiring a current distance value returned by the beam light in a scanning period through a laser radar;
acquiring the attitude variation of the unmanned aerial vehicle body through an inertial measurement unit; the attitude change amount comprises a rolling angle and a pitch angle;
and obtaining a first current horizontal coordinate according to the current distance value, the rolling angle and the pitch angle.
In one embodiment, the initial position information includes a second initial horizontal coordinate of the drone body relative to the laser emitting device; wherein, the laser emission device is arranged at the pit of the well;
the step of obtaining initial position information of the unmanned aerial vehicle body comprises the following steps:
acquiring initial position coordinates transmitted by the photoelectric position sensor, and confirming the initial position coordinates as the second initial horizontal coordinates; wherein the photoelectric position sensor is arranged on the unmanned aerial vehicle body, and the initial position coordinates are obtained by the photoelectric position sensor responding to the laser emitted by the laser emitting device when the unmanned aerial vehicle body reaches the initial station.
In one embodiment, the current position information includes a second current horizontal coordinate of the drone body relative to the laser emitting device;
the step of obtaining the current horizontal position of the unmanned aerial vehicle body in the flight process comprises the following steps:
acquiring a current position coordinate transmitted by a photoelectric position sensor, and acquiring an attitude variation of the unmanned aerial vehicle body through an inertial measurement unit; the attitude change amount comprises a rolling angle and a pitch angle; the current position coordinates are obtained by a photoelectric position sensor and responding to laser emitted by a laser emitting device in the flying process;
acquiring a distance value between the gravity center of the unmanned aerial vehicle body and an induction surface of the photoelectric position sensor;
and processing the current position coordinate, the distance value, the rolling angle and the pitch angle to obtain a second current horizontal coordinate.
In one embodiment, in the step of processing the current position coordinate, the distance value, the roll angle and the pitch angle to obtain the second current horizontal coordinate, the second current horizontal coordinate is obtained based on the following formula:
$$X_{ti} = X_{t,\mathrm{offset}} - L_a \cdot \cos\left(\arctan\sqrt{\tan^2\theta_t + \tan^2\Phi_t}\right) \cdot \tan\theta_t$$
$$Y_{ti} = Y_{t,\mathrm{offset}} - L_a \cdot \cos\left(\arctan\sqrt{\tan^2\theta_t + \tan^2\Phi_t}\right) \cdot \tan\Phi_t$$
wherein $X_{ti}$ is the abscissa of the second current horizontal coordinate; $Y_{ti}$ is the ordinate of the second current horizontal coordinate; $X_{t,\mathrm{offset}}$ and $Y_{t,\mathrm{offset}}$ are the abscissa and ordinate of the current position coordinate transmitted by the photoelectric position sensor; $L_a$ is the distance value; $\theta_t$ is the pitch angle; $\Phi_t$ is the roll angle.
In one embodiment, the step of processing the initial position information and the current position information to obtain the position deviation includes:
And confirming the coordinate difference value of the second current horizontal coordinate and the second initial horizontal coordinate as the position deviation.
In one embodiment, the method further comprises the steps of:
and under the condition that all ascending stations complete the acquisition process, entering a return process.
In one embodiment, the return process includes:
inputting a descending instruction, and correcting a descending route of the unmanned aerial vehicle body according to the position deviation;
detecting that the unmanned aerial vehicle body reaches the next station according to the corrected descending route, and entering a collection flow of point cloud data and image data of the current station; outputting a descending instruction until all stations finish the acquisition process under the condition of finishing the acquisition process; the descending instruction is used for indicating the unmanned aerial vehicle body to move from the current station to the next station;
and under the condition that all descending stations complete the acquisition process, entering a landing process.
In one embodiment, the step of detecting that the unmanned aerial vehicle body arrives at the next station according to the corrected ascending route includes:
acquiring the attitude variation of the unmanned aerial vehicle body through an inertial measurement unit; the attitude change amount comprises a rolling angle and a pitch angle;
acquiring a first distance between the unmanned aerial vehicle body and a pit, a second distance between the beam center of the laser radar and the radiation surface of a lower right-angle transmitting prism and a third distance between the intersection point of the rotation axes of the swing arms and the beam center, wherein the first distance is output by the laser radar; the lower right-angle transmitting prism and the laser radar are arranged on any swing arm; the swing arm is arranged on the unmanned aerial vehicle body and is used for swinging the laser radar;
Processing the first distance, the second distance, the third distance, the rolling angle and the pitch angle to obtain the current height of the unmanned aerial vehicle body;
and if the current height is the same as the height of the next station, confirming that the unmanned aerial vehicle body reaches the next station according to the corrected ascending route.
In one embodiment, in the step of processing the first distance, the second distance, the third distance, the roll angle, and the pitch angle to obtain the current height of the unmanned aerial vehicle body, the current height is obtained based on the following formula:
$$Z = H \cdot \cos\left(\arctan\sqrt{\tan^2\theta + \tan^2\Phi}\right) \pm L_b \cdot \sin\left(\arctan\sqrt{\tan^2\theta + \tan^2\Phi}\right) - \left(L_c - L_c \cdot \cos\left(\arctan\sqrt{\tan^2\theta + \tan^2\Phi}\right)\right)$$
wherein $Z$ is the current height; $\theta$ is the pitch angle; $\Phi$ is the roll angle; $H$ is the first distance; $L_b$ is the second distance; $L_c$ is the third distance.
In one aspect, an embodiment of the present invention further provides a hoistway inspection navigation device of an unmanned aerial vehicle, including:
the initial position information acquisition module is used for outputting a rising instruction and acquiring initial position information of the unmanned aerial vehicle body under the condition that the unmanned aerial vehicle body is at an initial station;
the position deviation acquisition module is used for acquiring the current position information of the unmanned aerial vehicle body in the flight process, and processing the initial position information and the current position information to obtain position deviation;
The correction module is used for correcting the flight route of the unmanned aerial vehicle body according to the position deviation;
the acquisition module is used for detecting that the unmanned aerial vehicle body reaches the next station according to the corrected flight route and entering the acquisition process of the point cloud data and the image data of the current station; outputting a take-off instruction until all stations finish the acquisition process under the condition of finishing the acquisition process; the take-off instruction is used for indicating the unmanned aerial vehicle body to move from the current station to the next station.
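For readers who prefer a concrete sketch, the following Python outline shows one possible way to organize the four modules above; the class and method names are illustrative assumptions, not identifiers taken from the patent.

```python
class HoistwayInspectionNavigator:
    """Illustrative grouping of the four modules described above.
    All attribute and method names are assumptions made for this sketch."""

    def __init__(self, drone):
        self.drone = drone
        self.initial = None           # initial position information (reference)
        self.deviation = (0.0, 0.0)   # latest position deviation (dx, dy)

    def acquire_initial_position(self):
        # Initial-position-information acquisition module: output the rising
        # instruction and record the reference position at the initial station.
        self.drone.output_rising_instruction()
        self.initial = self.drone.read_horizontal_position()

    def update_deviation(self):
        # Position-deviation acquisition module: compare current and initial positions.
        cur = self.drone.read_horizontal_position()
        self.deviation = (cur[0] - self.initial[0], cur[1] - self.initial[1])

    def correct_route(self):
        # Correction module: steer against the measured deviation.
        self.drone.adjust_horizontal(-self.deviation[0], -self.deviation[1])

    def collect_at_station(self):
        # Acquisition module: gather point cloud and image data, then command
        # take-off toward the next station.
        self.drone.collect_point_cloud()
        self.drone.collect_images()
        self.drone.output_takeoff_instruction()
```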
In one aspect, the embodiment of the invention further provides an unmanned aerial vehicle, which comprises an unmanned aerial vehicle body, a memory and a processor, wherein the memory and the processor are arranged on the unmanned aerial vehicle body, the memory stores a computer program, and the processor realizes the steps of any one of the methods when executing the computer program.
In one embodiment, the system further comprises a laser radar, a photoelectric position sensor, an inertial measurement unit and an image acquisition device which are arranged on the unmanned aerial vehicle body;
the processor is respectively connected with the laser radar, the photoelectric position sensor, the inertial measurement unit and the image acquisition equipment.
In another aspect, embodiments of the present invention also provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method of any of the above.
One of the above technical solutions has the following advantages and beneficial effects:
according to the hoistway inspection navigation method of the unmanned aerial vehicle, the initial position information is acquired at the initial station, and the current position information of the unmanned aerial vehicle body during flight is processed together with the initial position information to obtain the position deviation. The ascending route of the unmanned aerial vehicle body is corrected according to the position deviation, so that the unmanned aerial vehicle can ascend vertically in the hoistway without colliding with the hoistway walls. When the unmanned aerial vehicle body reaches the next station, the acquisition flow is entered, until all stations have completed the acquisition flow. With this method, the unmanned aerial vehicle can complete autonomous inspection navigation even though the GPS signal in the hoistway is weak.
Drawings
The foregoing and other objects, features and advantages of the application will be apparent from the following more particular description of preferred embodiments of the application, as illustrated in the accompanying drawings. Like reference numerals refer to like parts throughout the drawings; the drawings are not necessarily drawn to scale, the emphasis instead being placed on illustrating the principles of the application.
Fig. 1 is a schematic flow diagram of a method of inspection navigation of a hoistway of an unmanned aerial vehicle in one embodiment;
FIG. 2 is a flow chart illustrating steps for obtaining a position deviation in one embodiment;
fig. 3 is a flowchart illustrating a step of acquiring initial position information of an unmanned aerial vehicle body according to an embodiment;
FIG. 4 is a first flowchart illustrating a step of acquiring a current horizontal position of a unmanned aerial vehicle body during a flight process according to an embodiment;
FIG. 5 is a second flow chart of steps for obtaining a current horizontal position of a drone body during flight in one embodiment;
FIG. 6 is a flow diagram of a return flow in one embodiment;
FIG. 7 is a schematic diagram of a step of detecting that the unmanned aerial vehicle body arrives at a next station according to the corrected ascending route in an embodiment;
fig. 8 is a block diagram of a hoistway inspection navigation device of the unmanned aerial vehicle in one embodiment;
FIG. 9 is a front view of the drone in one embodiment;
FIG. 10 is a left side view of the drone in one embodiment;
fig. 11 is a three-view diagram illustrating a radar cradle head structure of a drone in one embodiment.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
In one embodiment, as shown in fig. 1, a method for inspecting and navigating a hoistway of an unmanned aerial vehicle is provided, including the steps of:
s110, outputting a rising instruction, and acquiring initial position information of the unmanned aerial vehicle body under the condition that the unmanned aerial vehicle body is at an initial station;
the ascending instruction is used for indicating the unmanned aerial vehicle body to ascend and fly vertically. Specifically, the ascending instruction may be output to a propeller motor of the unmanned aerial vehicle. The initial station is the first station that the unmanned aerial vehicle body arrived, namely the position when the unmanned aerial vehicle body has not taken off the pit yet. The initial position information may be used as a reference to determine the positions of the remaining stations. It should be noted that the heights of the stations are different. The site positions are preset according to a well design drawing, and can be the upper and lower frames of each door opening, the upper and lower frames of the ring beam and the like. The initial position information may be any data representing position information in the art.
Specifically, whether the unmanned aerial vehicle body has arrived at the initial station may be detected by any means in the art. In a specific example, the altitude of the unmanned aerial vehicle body can be obtained through a laser radar, and whether the unmanned aerial vehicle body has reached the initial station is confirmed according to the altitude. In another embodiment, whether the initial station has been reached can also be confirmed by setting a marker at each station and identifying the marker.
Before the rising instruction is output, the unmanned aerial vehicle needs to be instructed to carry out an initialization and self-check step.
S120, acquiring current position information of the unmanned aerial vehicle body in the flight process, and processing initial position information and current position information to obtain position deviation;
Specifically, the current position information of the unmanned aerial vehicle body may be real-time position information of the unmanned aerial vehicle body during flight. The current position information can be obtained by any means in the art; for example, the positions of the hoistway walls can be obtained by a laser radar, from which the current position information of the unmanned aerial vehicle body is derived. The laser radar is arranged on the unmanned aerial vehicle body, and the dimensions of each cross section of the hoistway are constant, that is, the length and width of each cross section and the position of its center relative to the walls do not change with height. On this basis, the current position information of the unmanned aerial vehicle body can be reflected by the positions of the hoistway walls. Another example: a photoelectric position sensor is provided on the unmanned aerial vehicle body and a laser emitting device is provided at the hoistway pit; the light beam emitted by the laser emitting device is received by the sensing surface of the photoelectric position sensor, which outputs corresponding position information. When the unmanned aerial vehicle body moves, the beam emitted by the laser emitting device falls on different positions of the sensing surface, and different position information is output. Therefore, the current position information of the unmanned aerial vehicle body can also be reflected by the position information output by the photoelectric position sensor. In one specific example, the current position information and the initial position information are both horizontal position information.
Further, the current position information and the initial position information may be processed by any means in the art to obtain the position deviation. For example, when the current position information and the initial position information are characterized by coordinates, the position deviation may be characterized by an X-axis coordinate difference and a Y-axis coordinate difference. For another example, the distance between the current position and the initial position can be obtained according to the current position information and the initial position information, and the position deviation can be represented by the distance.
S130, correcting the ascending route of the unmanned aerial vehicle body according to the position deviation;
in particular, the ascending course may be modified by the positional deviation. In one example, if the positional deviation is greater than a set point, the ascent route of the drone body is adjusted to maintain the positional deviation within the set point.
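As a minimal illustration of this set-point rule, the sketch below (Python, with assumed gain and threshold values) returns a horizontal correction command that opposes the measured deviation:

```python
def correct_ascent_route(dx, dy, set_point=0.05, gain=1.0):
    """Return a horizontal velocity command (vx, vy) that pushes the drone
    back toward the reference vertical line whenever the deviation exceeds
    the set point. The 0.05 m threshold and unit gain are assumptions."""
    vx = -gain * dx if abs(dx) > set_point else 0.0
    vy = -gain * dy if abs(dy) > set_point else 0.0
    return vx, vy
```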
S140, detecting that the unmanned aerial vehicle body reaches the next station according to the corrected ascending route, and entering a collection flow of point cloud data and image data of the current station; outputting a take-off instruction until all ascending stations finish the acquisition process under the condition of finishing the acquisition process; the take-off instruction is used for indicating the unmanned aerial vehicle body to move from the current station to the next station.
The point cloud data may include geometric location and color information of the current site, among other things.
Specifically, image data of the hoistway station can be collected by an image acquisition device, and point cloud data of the hoistway can be collected by the laser radar. Whether the unmanned aerial vehicle body has reached the next station along the corrected ascending route can be detected by any means in the art. It should be noted that, when the acquisition flow is completed, a take-off instruction may be output, until the distance to the top floor is smaller than a preset distance; that is, the distance to the top floor is used as the condition for ending the inspection.
In a specific example, the current height may be directly obtained by the distance detection sensor, and when the current height is the same as the height of the next station, it is confirmed that the unmanned aerial vehicle body arrives at the next station according to the corrected ascending route. In another specific example, the step of detecting that the unmanned aerial vehicle body arrives at the next station according to the corrected ascending route includes: acquiring the attitude variation of the unmanned aerial vehicle body through an inertial measurement unit; the attitude change amount comprises a rolling angle and a pitch angle; acquiring a first distance between the unmanned aerial vehicle body and a pit, a second distance between the beam center of the laser radar and the radiation surface of a lower right-angle transmitting prism and a third distance between the intersection point of the rotation axes of the swing arms and the beam center, wherein the first distance is output by the laser radar; the lower right-angle transmitting prism and the laser radar are arranged on any swing arm; the swing arm is arranged on the unmanned aerial vehicle body and is used for swinging the laser radar; processing the first distance, the second distance, the third distance, the rolling angle and the pitch angle to obtain the current height of the unmanned aerial vehicle body; and if the current height is the same as the height of the next station, confirming that the unmanned aerial vehicle body reaches the next station according to the corrected ascending route. It should be noted that, in the step of processing the first distance, the second distance, the third distance, the roll angle and the pitch angle to obtain the current height of the unmanned aerial vehicle body, the current height is obtained based on the following formula:
$$Z = H \cdot \cos\left(\arctan\sqrt{\tan^2\theta + \tan^2\Phi}\right) \pm L_b \cdot \sin\left(\arctan\sqrt{\tan^2\theta + \tan^2\Phi}\right) - \left(L_c - L_c \cdot \cos\left(\arctan\sqrt{\tan^2\theta + \tan^2\Phi}\right)\right)$$
wherein $Z$ is the current height; $\theta$ is the pitch angle; $\Phi$ is the roll angle; $H$ is the first distance; $L_b$ is the second distance; $L_c$ is the third distance.
It should be noted that the process of collecting the point cloud data and image data of the current station may be any such acquisition process known in the art. When the acquisition process is completed, a take-off instruction is output to instruct the unmanned aerial vehicle body to move to the next station. The action of moving from the current station to the next station is repeated until all ascending stations have completed the acquisition flow. An ascending station is a station at which the acquisition flow needs to be carried out during the ascent.
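A compact sketch of the overall ascent flow S110 to S140 is given below; the drone interface (method names, the station-height list, the top-clearance value) is assumed for illustration and is not part of the patent.

```python
def run_ascent_inspection(drone, station_heights, top_clearance=1.0):
    """Illustrative ascent loop for S110-S140; all drone methods are assumed."""
    drone.initialize_and_self_check()
    drone.output_rising_instruction()
    x0, y0 = drone.read_horizontal_position()        # S110: reference at the initial station

    for height in station_heights:
        while drone.current_height() < height:        # climb toward the next station
            x, y = drone.read_horizontal_position()   # S120: current position in flight
            drone.correct_ascent_route(x - x0, y - y0) # S130: hold the vertical line
        drone.collect_point_cloud_and_images()        # S140: acquisition at this station
        if drone.distance_to_top() < top_clearance:   # end condition described above
            break
        drone.output_takeoff_instruction()            # move on to the next station
```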
According to the hoistway inspection navigation method of the unmanned aerial vehicle, the initial position information is acquired at the initial station, and the current position information of the unmanned aerial vehicle body during flight is processed together with the initial position information to obtain the position deviation. The ascending route of the unmanned aerial vehicle body is corrected according to the position deviation, so that the unmanned aerial vehicle can ascend vertically in the hoistway without colliding with the hoistway walls. When the unmanned aerial vehicle body reaches the next station, the acquisition flow is entered, until all stations have completed the acquisition flow. With this method, the unmanned aerial vehicle can complete autonomous inspection navigation even though the GPS signal in the hoistway is weak.
In one embodiment, the current location information includes a first current horizontal coordinate of the four walls of the hoistway relative to the unmanned aerial vehicle body; the initial position information comprises first initial horizontal coordinates of four walls of a well relative to the unmanned aerial vehicle body;
the step of processing the initial position information and the current position information to obtain the position deviation includes:
and confirming the coordinate difference value of the first current horizontal coordinate and the first initial horizontal coordinate as the position deviation.
The first current horizontal coordinate of the hoistway walls relative to the unmanned aerial vehicle body may take the unmanned aerial vehicle body as the origin, or may take another reference (such as the working starting point) as the origin; it is the real-time horizontal coordinate during the flight of the unmanned aerial vehicle body. The first initial horizontal coordinate of the hoistway walls relative to the unmanned aerial vehicle body may likewise take the unmanned aerial vehicle body or another reference as the origin; it is the horizontal coordinate of the unmanned aerial vehicle body at the initial station.
Specifically, the coordinate difference between the X-axis coordinate of the first current horizontal coordinate and the X-axis coordinate of the first initial horizontal coordinate, and the coordinate difference between the Y-axis coordinate of the first current horizontal coordinate and the Y-axis coordinate of the first initial horizontal coordinate are confirmed as the positional deviation.
In one embodiment, as shown in fig. 2, the step of processing the initial position information and the current position information to obtain the position deviation includes:
s210, acquiring first current horizontal coordinates in a preset time period, and acquiring average horizontal coordinates according to each first current horizontal coordinate;
specifically, each first current horizontal coordinate in the preset time period is obtained, and the average horizontal coordinate of each first current horizontal coordinate is calculated.
And S220, confirming the coordinate difference value between the average horizontal coordinate and the first initial horizontal coordinate as the position deviation.
Specifically, the coordinate difference between the X-axis coordinate of the average horizontal coordinate and the X-axis coordinate of the first initial horizontal coordinate and the coordinate difference between the Y-axis coordinate of the average horizontal coordinate and the Y-axis coordinate of the first initial horizontal coordinate are confirmed as the positional deviation.
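A short Python sketch of S210 and S220, assuming the coordinates are handled as (x, y) tuples:

```python
def averaged_position_deviation(current_coords, initial_coord):
    """Average the first current horizontal coordinates collected over the
    preset time period (S210) and return the coordinate difference from the
    first initial horizontal coordinate (S220)."""
    n = len(current_coords)
    avg_x = sum(x for x, _ in current_coords) / n
    avg_y = sum(y for _, y in current_coords) / n
    return avg_x - initial_coord[0], avg_y - initial_coord[1]
```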
In one embodiment, as shown in fig. 3, the step of acquiring initial position information of the unmanned aerial vehicle body includes:
s310, acquiring an initial distance value and an initial scanning angle of the beam light returned in one scanning period through a laser radar;
Specifically, when the unmanned aerial vehicle body is located at the initial station, the laser radar emits beam light all around, and the initial distance value and the initial scanning angle of any beam returned in one scanning period are obtained.
S320, obtaining a first initial horizontal coordinate according to the initial distance value and the initial scanning angle;
specifically, the first initial horizontal coordinate may be obtained by the following formula:
$$X_{1i} = r_{1i} \cdot \cos\varepsilon_i$$
$$Y_{1i} = r_{1i} \cdot \sin\varepsilon_i$$
wherein $r_i$ and $\varepsilon_i$ are, respectively, the distance value and the scanning angle value returned by each beam of the laser radar in one scanning period, and $r_{1i}$ is the initial distance value returned by the laser radar beam at the initial station in one scanning period.
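In Python, the conversion of one initial-station scan into first initial horizontal coordinates can be sketched as follows (angles assumed to be in radians):

```python
import math

def initial_wall_coordinates(distances, angles):
    """Apply X_1i = r_1i*cos(eps_i), Y_1i = r_1i*sin(eps_i) to every beam
    returned by the laser radar in one scanning period at the initial station."""
    return [(r * math.cos(eps), r * math.sin(eps))
            for r, eps in zip(distances, angles)]
```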
As shown in fig. 4, the step of obtaining the current horizontal position of the unmanned aerial vehicle body in the flight process includes:
s410, acquiring a current distance value returned by the beam light in a scanning period through a laser radar;
Specifically, when the unmanned aerial vehicle body is located at the next station, the laser radar emits beam light all around, and the current distance value returned by any beam in one scanning period is obtained.
S420, acquiring the attitude change quantity of the unmanned aerial vehicle body through an inertia measurement unit; the attitude change amount comprises a rolling angle and a pitch angle;
and S430, obtaining a first current horizontal coordinate according to the current distance value, the rolling angle and the pitch angle.
Specifically, the first current horizontal coordinate may be obtained by the following formula:
$$X_{ti} = r_{ti} \cdot \cos\left(\arctan\sqrt{\tan^2\theta_t + \tan^2\Phi_t}\right) \cdot \cos\varepsilon_i$$
$$Y_{ti} = r_{ti} \cdot \cos\left(\arctan\sqrt{\tan^2\theta_t + \tan^2\Phi_t}\right) \cdot \sin\varepsilon_i$$
wherein $r_{ti}$ is the current distance value returned by the laser radar beam at station $t$ in one scanning period; $\theta_t$ is the pitch angle of the unmanned aerial vehicle body at station $t$; $\Phi_t$ is the roll angle of the unmanned aerial vehicle body at station $t$.
Based on this, the positional deviation is:
$$\Delta X_{ti} = r_{ti} \cdot \cos\left(\arctan\sqrt{\tan^2\theta_t + \tan^2\Phi_t}\right) \cdot \cos\varepsilon_i - r_{1i} \cdot \cos\varepsilon_i$$
$$\Delta Y_{ti} = r_{ti} \cdot \cos\left(\arctan\sqrt{\tan^2\theta_t + \tan^2\Phi_t}\right) \cdot \sin\varepsilon_i - r_{1i} \cdot \sin\varepsilon_i$$
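The two formulas above reduce each beam's deviation to a tilt-compensated comparison with its initial-station return; a Python sketch (angles in radians) is given below:

```python
import math

def beam_deviation(r_t, r_1, eps, theta_t, phi_t):
    """Deviation (dX_ti, dY_ti) of one lidar beam at station t from its
    initial-station value. r_t and r_1 are the current and initial distance
    returns of the beam, eps its scan angle, theta_t/phi_t the pitch and roll."""
    c = math.cos(math.atan(math.sqrt(math.tan(theta_t) ** 2 + math.tan(phi_t) ** 2)))
    dx = r_t * c * math.cos(eps) - r_1 * math.cos(eps)
    dy = r_t * c * math.sin(eps) - r_1 * math.sin(eps)
    return dx, dy
```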
in one embodiment, the initial position information comprises a second initial horizontal coordinate of the drone body relative to the laser emitting device; wherein, the laser emission device is arranged at the pit of the well;
the step of obtaining initial position information of the unmanned aerial vehicle body comprises the following steps:
acquiring initial position coordinates transmitted by the photoelectric position sensor, and confirming the initial position coordinates as the second initial horizontal coordinates; wherein the photoelectric position sensor is arranged on the unmanned aerial vehicle body, and the initial position coordinates are obtained by the photoelectric position sensor responding to the laser emitted by the laser emitting device when the unmanned aerial vehicle body reaches the initial station.
Specifically, the photoelectric position sensor is arranged on the unmanned aerial vehicle body, and the laser emission device is arranged in a pit of a well; the beam of light emitted upwardly from the pit of the hoistway by the laser emitting device may be received by a sensing surface of the photoelectric position sensor, which responds to the beam and outputs an initial position coordinate. In one specific example, the initial horizontal coordinate may be confirmed as (0, 0), that is, the initial position information is confirmed as the reference coordinate.
In one embodiment, the current position information includes a second current horizontal coordinate of the drone body relative to the laser emitting device;
as shown in fig. 5, the step of obtaining the current horizontal position of the unmanned aerial vehicle body in the flight process includes:
s510, acquiring current position coordinates transmitted by a photoelectric position sensor, and acquiring the attitude change quantity of the unmanned aerial vehicle body through an inertial measurement unit; the attitude change amount comprises a rolling angle and a pitch angle; the current position coordinates are obtained by a photoelectric position sensor and responding to laser emitted by a laser emitting device in the flying process;
specifically, if the unmanned aerial vehicle body moves, laser irradiates on different positions of the sensing surface of the photoelectric position sensor, so that the current position coordinates transmitted by the photoelectric position sensor are different.
S520, acquiring a distance value between the gravity center of the unmanned aerial vehicle body and the sensing surface of the photoelectric position sensor;
the distance value between the gravity center of the unmanned aerial vehicle body and the sensing surface of the photoelectric position sensor can be a preset value, and the distance value can be directly called when the unmanned aerial vehicle is needed through a memory or other positions stored in the unmanned aerial vehicle body in advance.
And S530, processing the current position coordinate, the distance value, the rolling angle and the pitch angle to obtain a second current horizontal coordinate.
In one embodiment, in the step of processing the current position coordinate, the distance value, the roll angle and the pitch angle to obtain the second current horizontal coordinate, the second current horizontal coordinate is obtained based on the following formula:
$$X_{ti} = X_{t,\mathrm{offset}} - L_a \cdot \cos\left(\arctan\sqrt{\tan^2\theta_t + \tan^2\Phi_t}\right) \cdot \tan\theta_t$$
$$Y_{ti} = Y_{t,\mathrm{offset}} - L_a \cdot \cos\left(\arctan\sqrt{\tan^2\theta_t + \tan^2\Phi_t}\right) \cdot \tan\Phi_t$$
wherein $X_{ti}$ is the abscissa of the second current horizontal coordinate; $Y_{ti}$ is the ordinate of the second current horizontal coordinate; $X_{t,\mathrm{offset}}$ and $Y_{t,\mathrm{offset}}$ are the abscissa and ordinate of the current position coordinate transmitted by the photoelectric position sensor; $L_a$ is the distance value; $\theta_t$ is the pitch angle; $\Phi_t$ is the roll angle.
If the second initial horizontal coordinate is set as the reference value, that is, (0, 0), the second current horizontal coordinate is also referred to as the positional deviation.
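The formula can be written directly as a small Python helper (a sketch; angles in radians, distances in the same unit as the sensor output):

```python
import math

def second_current_horizontal_coordinate(x_offset, y_offset, l_a, theta_t, phi_t):
    """Compute (X_ti, Y_ti) from the photoelectric position sensor reading
    (x_offset, y_offset), the distance l_a between the drone's center of
    gravity and the sensing surface, and the pitch/roll angles."""
    c = math.cos(math.atan(math.sqrt(math.tan(theta_t) ** 2 + math.tan(phi_t) ** 2)))
    return x_offset - l_a * c * math.tan(theta_t), y_offset - l_a * c * math.tan(phi_t)
```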
In one embodiment, the step of processing the initial position information and the current position information to obtain the position deviation comprises:
and confirming the coordinate difference value of the second current horizontal coordinate and the second initial horizontal coordinate as the position deviation.
Specifically, the coordinate difference between the X-axis coordinate of the second current horizontal coordinate and the X-axis coordinate of the second initial horizontal coordinate, and the coordinate difference between the Y-axis coordinate of the second current horizontal coordinate and the Y-axis coordinate of the second initial horizontal coordinate, are confirmed as the positional deviation.
In one embodiment, the method further comprises the steps of:
and under the condition that all ascending stations complete the acquisition process, entering a return process.
Specifically, the return route may be to land back to the departure point.
In one embodiment, as shown in fig. 6, the return flow includes:
s610, inputting a descending instruction, and correcting a descending route of the unmanned aerial vehicle body according to the position deviation;
specifically, under the condition that all ascending stations complete the acquisition process, a descending instruction is input, and the descending route of the unmanned aerial vehicle body is corrected according to the output position deviation.
S620, detecting that the unmanned aerial vehicle body reaches the next station according to the corrected descending route, and entering a collection flow of point cloud data and image data of the current station; outputting a descending instruction until all stations finish the acquisition process under the condition of finishing the acquisition process; the descending instruction is used for indicating the unmanned aerial vehicle body to move from the current station to the next station;
specifically, all stations can be collected again in the descending route according to the height sequence, so that verification is facilitated. Furthermore, a new site can be added to the original site.
S630, entering a landing process under the condition that all descending stations complete the acquisition process.
Specifically, the landing procedure may be any landing procedure in the field, and is not limited herein.
In one embodiment, as shown in fig. 7, the step of detecting that the unmanned aerial vehicle body arrives at the next station according to the corrected ascending route includes:
s710, acquiring the attitude variation of the unmanned aerial vehicle body through an inertial measurement unit; the attitude change amount comprises a rolling angle and a pitch angle;
specifically, the inertial measurement unit may be a 9-axis MEMS inertial measurement unit including a three-axis gyroscope, a three-axis accelerometer, and a three-axis magnetic field meter.
S720, acquiring a first distance between the unmanned aerial vehicle body and a pit, a second distance between the beam center of the laser radar and a radiation surface of a lower right-angle transmitting prism and a third distance between a rotation axis intersection point of each swing arm and the beam center, wherein the first distance is output by the laser radar; the lower right-angle transmitting prism and the laser radar are arranged on any swing arm; the swing arm is arranged on the unmanned aerial vehicle body and is used for swinging the laser radar;
Specifically, the lower right-angle transmitting prism can reflect the beam of the laser radar to the pit of the hoistway, so that the height of the unmanned aerial vehicle relative to the pit can be measured. Furthermore, an upper right-angle transmitting prism is also provided, which can reflect the beam of the laser radar to the top of the hoistway, for measuring the height of the unmanned aerial vehicle body relative to the top. The position information of each station may be referenced to either the pit or the top.
S730, processing the first distance, the second distance, the third distance, the rolling angle and the pitch angle to obtain the current height of the unmanned aerial vehicle body;
specifically, in the step of processing the first distance, the second distance, the third distance, the roll angle and the pitch angle to obtain the current height of the unmanned aerial vehicle body, the current height is obtained based on the following formula:
$$Z = H \cdot \cos\left(\arctan\sqrt{\tan^2\theta + \tan^2\Phi}\right) \pm L_b \cdot \sin\left(\arctan\sqrt{\tan^2\theta + \tan^2\Phi}\right) - \left(L_c - L_c \cdot \cos\left(\arctan\sqrt{\tan^2\theta + \tan^2\Phi}\right)\right)$$
wherein $Z$ is the current height; $\theta$ is the pitch angle; $\Phi$ is the roll angle; $H$ is the first distance; $L_b$ is the second distance; $L_c$ is the third distance.
And S740, if the current height is the same as the height of the next station, confirming that the unmanned aerial vehicle body reaches the next station according to the corrected ascending route.
Specifically, if the current height of the unmanned aerial vehicle is the same as the height of the next station, the unmanned aerial vehicle body is confirmed to reach the next station.
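A sketch of this height calculation and arrival test in Python follows; the handling of the plus-or-minus sign and the arrival tolerance are assumptions made for illustration:

```python
import math

def current_height(h, l_b, l_c, theta, phi, prism_below=True):
    """Tilt-corrected height Z above the pit (S730). h is the first distance
    output by the lidar, l_b the second distance (beam center to the lower
    right-angle prism radiation surface), l_c the third distance (swing-arm
    axis intersection to beam center); theta and phi are pitch and roll."""
    a = math.atan(math.sqrt(math.tan(theta) ** 2 + math.tan(phi) ** 2))
    sign = 1.0 if prism_below else -1.0       # the +/- term of the formula
    return h * math.cos(a) + sign * l_b * math.sin(a) - (l_c - l_c * math.cos(a))

def reached_next_station(z, station_height, tolerance=0.02):
    """S740: the station counts as reached when the current height matches the
    station height; the 2 cm tolerance is an illustrative assumption."""
    return abs(z - station_height) <= tolerance
```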
It should be understood that, although the steps in the flowcharts of fig. 1-7 are shown in order as indicated by the arrows, these steps are not necessarily performed in order as indicated by the arrows. The steps are not strictly limited to the order of execution unless explicitly recited herein, and the steps may be executed in other orders. Moreover, at least some of the steps in fig. 1-7 may include multiple sub-steps or stages that are not necessarily performed at the same time, but may be performed at different times, nor do the order in which the sub-steps or stages are performed necessarily occur sequentially, but may be performed alternately or alternately with at least a portion of the sub-steps or stages of other steps or steps.
In one embodiment, as shown in fig. 8, there is provided a hoistway inspection navigation apparatus of an unmanned aerial vehicle, including:
the initial position information acquisition module is used for outputting a rising instruction and acquiring initial position information of the unmanned aerial vehicle body under the condition that the unmanned aerial vehicle body is at an initial station;
the position deviation acquisition module is used for acquiring the current position information of the unmanned aerial vehicle body in the flight process, and processing the initial position information and the current position information to obtain position deviation;
the correction module is used for correcting the flight route of the unmanned aerial vehicle body according to the position deviation;
the acquisition module is used for detecting that the unmanned aerial vehicle body reaches the next station according to the corrected flight route and entering the acquisition process of the point cloud data and the image data of the current station; outputting a take-off instruction until all stations finish the acquisition process under the condition of finishing the acquisition process; the take-off instruction is used for indicating the unmanned aerial vehicle body to move from the current station to the next station.
In one embodiment, the positional deviation acquisition module further comprises:
and the first position deviation acquisition module is used for confirming the coordinate difference value of the first current horizontal coordinate and the first initial horizontal coordinate as the position deviation.
In one embodiment, the positional deviation acquisition module further comprises:
the second position deviation acquisition module is used for acquiring first current horizontal coordinates in preset time length and acquiring average horizontal coordinates according to each first current horizontal coordinate; and confirming the coordinate difference value between the average horizontal coordinate and the first initial horizontal coordinate as the position deviation.
The specific limitation of the inspection navigation device of the unmanned aerial vehicle can be referred to the limitation of the inspection navigation method of the unmanned aerial vehicle, and the description thereof is omitted here. All or part of each module in the hoistway inspection navigation device of the unmanned aerial vehicle can be realized by software, hardware and a combination thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
In one embodiment, a drone is provided, comprising a drone body, a memory disposed on the drone body, the memory storing a computer program, and a processor that when executing the computer program performs the steps of any of the methods described above.
In one embodiment, the system further comprises a laser radar, a photoelectric position sensor, an inertial measurement unit and an image acquisition device which are arranged on the unmanned aerial vehicle body;
the processor is respectively connected with the laser radar, the photoelectric position sensor, the inertial measurement unit and the image acquisition equipment.
Specifically, the image capturing device may be a camera; the photoelectric position sensor is used for receiving laser of the laser emitting device arranged at the pit and outputting position information.
To further illustrate the drone of the present application, the following is further described with particular reference to the following examples:
the unmanned aerial vehicle comprises a propeller 1, an unmanned aerial vehicle body 2, a fixed landing gear 3, an upper emission right-angle prism 4, a lower emission right-angle prism 5, a two-dimensional laser radar 6, a radar cradle head 7, an HDR camera 8, a camera cradle head 9, an IMU module 10, an onboard processor 11, a power supply 12, a wireless image transmission and communication module 13, an SOS module 14, an RC module 15 (no image) and a 16 laser alignment system. The main structure is shown in fig. 9, 10 and 11.
The overall layout of the unmanned aerial vehicle ensures that the gravity center of the unmanned aerial vehicle is located at the geometric center as far as possible.
The upper and lower right-angle transmitting prisms 4/5 are mounted, like the laser radar 6, on the swing arm 7d of the cradle head 7, so that their position relative to the radar remains fixed at all times; they reflect a small part of the beam of the two-dimensional laser radar 6 to the top/pit of the hoistway, for measuring the height of the unmanned aerial vehicle relative to the hoistway top and pit.
The two-dimensional laser radar 6 is fixed on the swing arm 7d of the cradle head 7, and its center of gravity is adjusted to pass through the axes of the two swing-arm motors. In the initial condition, at the working starting point, the laser radar center is adjusted to coincide with the Z axis through the center of gravity of the unmanned aerial vehicle.
The radar cradle head 7 is a mechanical framework arranged on the bottom plate of the unmanned aerial vehicle and used for mounting the laser radar 6, and mainly comprises a lifting device 7a, a bracket 7b, a swing arm 7c and a swing arm 7 d. The lifting device 7a enables the bracket 7b to vertically lift and finely adjust the up-down height of the cradle head. The swing arm 7c can rotate left and right around the bracket 7b, the swing arm 7d can rotate back and forth around the swing arm 7c, and each axle center is provided with a motor. The radar 6 is mounted on the swing arm 7d to swing therewith.
The camera cradle head 9 is mounted on top of the unmanned aerial vehicle for hanging the mechanical architecture of the HDR camera 8, and mainly consists of a rotating column 9a, a bracket 9b, a swing arm 9c and a swing arm type camera mounting clamping groove 9 d. The bracket 9b can rotate around the center of the rotating column 9a, so that the camera can shoot at 360 degrees without dead angles on the same horizontal plane. The swing arm 9c rotates left and right around the bracket 9b, the swing arm type camera mounting clamping groove 9d rotates back and forth around the swing arm 9c, and each axle center is provided with a motor. The HDR camera 8 is mounted on the swing arm type camera mounting groove 9d to swing therewith.
The IMU module (i.e., the above inertial measurement unit) 10 adopts a 9-axis MEMS inertial measurement unit (a three-axis gyroscope, a three-axis accelerometer, a three-axis magnetic field meter), can output three-axis acceleration, three-axis rotational speed, and three-axis geomagnetic field intensity, can output a roll angle Φ, a pitch angle θ, and a yaw angle ψ without drift, and adopts an anti-vibration gyroscope design.
The SOS module 14 sends out a distress signal in case of emergency by means of red signal flashing lights and ultrasound waves.
The RC module 15 is used for an operator to take back control of the unmanned aerial vehicle in an emergency, bringing the unmanned aerial vehicle to the ground under manual control when it behaves abnormally during operation or when the SOS module 14 sends out a distress signal.
The laser alignment system 16 includes a laser emitting device 16a and a photoelectric position sensor device 16b. The photoelectric position sensor may be an area-array CCD. The laser emitting device 16a is installed in the pit of the hoistway. The photoelectric position sensor device 16b is mounted on the laser radar, with the center of its target surface aligned with the center of gravity of the laser radar and coinciding with the Z axis through the center of gravity of the unmanned aerial vehicle. Like the laser radar 6, it swings with the radar cradle head 7, so its position relative to the radar 6 remains fixed at all times.
In order to further illustrate the inspection navigation method of the unmanned aerial vehicle well, the following is further described with specific example:
step one: establishing a coordinate system
An unmanned aerial vehicle body coordinate system Bcoor is established. The unmanned aerial vehicle is placed horizontally at the center of the hoistway pit, and the laser beam of the laser emitting device 16a is adjusted to be aligned with the center of the spot on the photoelectric position sensor device 16b. Taking the center of gravity of the unmanned aerial vehicle as the origin (the geometric center of the unmanned aerial vehicle is designed to coincide with the center of gravity as far as possible), the positive X axis of the body coordinate system is defined, within the plane of the unmanned aerial vehicle, as the positive X direction of the three-axis acceleration output by the IMU (namely the inertial measurement unit); the positive Y axis is obtained by rotating this direction 90 degrees counterclockwise within the plane of the unmanned aerial vehicle; and the positive Z axis is the upward direction perpendicular to the plane of the unmanned aerial vehicle.
A world coordinate system Gcoor with the working starting point as the origin is established. Translating the unmanned aerial vehicle body coordinate system Bcoor vertically down to the pit plane gives the origin $O_0(0, 0, 0)$ of the world coordinate system Gcoor used for unmanned aerial vehicle operation. At this moment, the coordinate of the laser radar center in the world coordinate system is $O_1(0, 0, Z_1)$, where $Z_1$ is the height of the laser radar above the hoistway pit at the working starting point; the coordinate of the origin of the body coordinate system Bcoor in the world coordinate system is $O_B(0, 0, Z_B)$, where $Z_B$ is the height of the center of gravity of the unmanned aerial vehicle above the hoistway pit at the working starting point. The coordinate systems and navigation-related parameters are shown in figs. 9, 10 and 11.
Step two: initializing work
And (5) turning on a power supply, and starting initialization work and unmanned aerial vehicle self-checking work.
Step three: first station workstation data acquisition
At the working starting point, the coordinate of the laser radar center in the world coordinate system is $O_1(0, 0, Z_1)$, and the first-station data acquisition is performed. The photoelectric position sensor records the deviation value (0, 0) of the position relative to the initial laser spot position.

The laser radar data are:

$$X_{1i} = r_{1i} \cdot \cos\varepsilon_i, \qquad Y_{1i} = r_{1i} \cdot \sin\varepsilon_i$$

wherein $r_i$ and $\varepsilon_i$ are, respectively, the distance value and the scanning angle value returned by each beam of the laser radar in one scanning period, and $r_{1i}$ is the distance value returned by each beam in one scanning period at the first station (i.e., the initial station). The height dimension of the data acquired at the first station is $Z_1$.
Step four: route positioning in flight after data acquisition of first station
When the unmanned aerial vehicle is displaced, the IMU module 10 outputs the roll angle $\Phi_t$, the pitch angle $\theta_t$ and the yaw angle $\psi_t$ (all angles positive counterclockwise), the laser radar scanner outputs the height $H_t$ above the pit, the photoelectric position sensor device 16b outputs the offset $(X_{t,\mathrm{offset}}, Y_{t,\mathrm{offset}})$, and the laser radar scans the relative position changes of the profile of the hoistway walls.
1) The first horizontal positioning method:
the laser radar scans the relative position of the profile of the walls of the hoistway to determine if the drone deviates from the course.
In the scanning matching process, comparing the relative deviation value of the coordinates of the data points collected by the laser radar and the coordinates of the data points on the four walls of the well collected by the first station laser radar, or the relative deviation value of the average value of the set of data frames and the first station data in a period of time.
ΔX_ti = r_ti·cos(arctan((tan²θ_t + tan²Φ_t)^(1/2)))·cosε_i − r_1i·cosε_i
ΔY_ti = r_ti·cos(arctan((tan²θ_t + tan²Φ_t)^(1/2)))·sinε_i − r_1i·sinε_i
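Illustrative note (not part of the original patent text): a minimal Python sketch of positioning method one, in which each tilted laser radar range is projected onto the horizontal plane with the combined roll/pitch term cos(arctan((tan²θ_t + tan²Φ_t)^(1/2))) and compared beam-by-beam with the first-station reference scan; the function and variable names are assumptions.

import math

def deviation_scan_matching(r_t, r_1, angles, theta_t, phi_t):
    """Per-beam deviation (dX_ti, dY_ti) between the current scan r_t and the
    first-station reference scan r_1, taken at the same scan angles eps_i.
    theta_t: pitch angle; phi_t: roll angle (radians)."""
    k = math.cos(math.atan(math.sqrt(math.tan(theta_t) ** 2 + math.tan(phi_t) ** 2)))
    dx = [rt * k * math.cos(e) - r1 * math.cos(e) for rt, r1, e in zip(r_t, r_1, angles)]
    dy = [rt * k * math.sin(e) - r1 * math.sin(e) for rt, r1, e in zip(r_t, r_1, angles)]
    return dx, dy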
2) Horizontal positioning, method two:
ΔX_ti = X_t_offset − L_a·cos(arctan((tan²θ_t + tan²Φ_t)^(1/2)))·tanθ_t
ΔY_ti = Y_t_offset − L_a·cos(arctan((tan²θ_t + tan²Φ_t)^(1/2)))·tanΦ_t
where L_a is the distance from the center of gravity of the unmanned aerial vehicle to the sensing surface of the photoelectric position sensor device 16b at the work starting point.
(ΔX_ti, ΔY_ti) is the estimate of the unmanned aerial vehicle's deviation from the course (i.e. the position deviation).
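Illustrative note (not part of the original patent text): a minimal Python sketch of positioning method two, which takes the photoelectric position sensor reading and subtracts the spot displacement caused purely by body tilt over the lever arm L_a; the names are assumptions.

import math

def deviation_psd(x_offset, y_offset, theta_t, phi_t, L_a):
    """Course deviation (dX_t, dY_t) from the photoelectric position sensor
    reading (x_offset, y_offset). L_a: distance from the centre of gravity to
    the sensing surface at the work starting point; theta_t: pitch; phi_t: roll."""
    k = math.cos(math.atan(math.sqrt(math.tan(theta_t) ** 2 + math.tan(phi_t) ** 2)))
    dx = x_offset - L_a * k * math.tan(theta_t)
    dy = y_offset - L_a * k * math.tan(phi_t)
    return dx, dy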
Step five: hovering second station data acquisition
When the laser radar reaches the set acquisition position Z_2est, the unmanned aerial vehicle hovers, takes photographs and collects laser radar data. The IMU module 10 outputs a roll angle Φ_2, a pitch angle θ_2 and a yaw angle ψ_2 (all angles are positive counterclockwise), and the laser radar scanner outputs the height H_2 above the pit.
Z_2est = H_2·cos(arctan((tan²θ_2 + tan²Φ_2)^(1/2))) ± L_b·sin(arctan((tan²θ_2 + tan²Φ_2)^(1/2))) − (L_c − L_c·cos(arctan((tan²θ_2 + tan²Φ_2)^(1/2))))
where L_b is the distance from the laser radar beam center to the emitting surface of the lower right-angle emission prism 5, and L_c is the distance from the intersection point of the rotation axes of swing arm 7c and swing arm 7d to the laser radar beam center.
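Illustrative note (not part of the original patent text): the hover-height estimate combines the laser radar slant range H with the tilt angles and the two lever arms L_b and L_c. A minimal Python sketch, assuming the ± sign is chosen according to the tilt direction:

import math

def hover_height_estimate(H, theta, phi, L_b, L_c, sign=1.0):
    """Estimated laser radar height above the pit at a hover station.
    H: slant range to the pit output by the laser radar; L_b: distance from the
    beam centre to the lower right-angle emission prism surface; L_c: distance
    from the swing-arm axis intersection to the beam centre; sign: +1 or -1 for
    the +/- term in the patent formula."""
    a = math.atan(math.sqrt(math.tan(theta) ** 2 + math.tan(phi) ** 2))
    return H * math.cos(a) + sign * L_b * math.sin(a) - (L_c - L_c * math.cos(a))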
After hovering, motors apply corresponding directional torque to the radar cradle head 7 and the camera cradle head 9, so that the laser radar and the camera are prevented from tilting and shaking together with the unmanned aerial vehicle. That is, swing arm 7b rotates left-right by −Φ_2 about bracket 7a, and swing arm 7c rotates forward-backward by −θ_2 about swing arm 7b; swing arm 9c rotates by −Φ_2 about bracket 9b, and the swing-arm camera mounting groove 9d rotates forward-backward by −θ_2 about swing arm 9c. After the camera and the radar have thus been restored to their initial pose, data acquisition begins.
At this time, the photoelectric position sensor records the deviation value (X_2_offset, Y_2_offset) of the laser spot, and the laser radar 6 measures the precise distance Z_2precise to the pit through the lower right-angle emission prism 5.
The laser radar data are X_2i = r_2i·cosε_i + X_2_offset, Y_2i = r_2i·sinε_i + Y_2_offset.
where r_2i is the distance value returned by each beam of the second-station laser radar within one scanning period. The height at which the second station collects data is Z_2precise.
Meanwhile, the rotating column 9a in the camera cradle head 9 rotates 360 degrees to collect image data of the hoistway.
After data acquisition is completed, swing arm 7b rotates left-right by Φ_2 about bracket 7a, and swing arm 7c rotates forward-backward by θ_2 about swing arm 7b; swing arm 9c rotates by Φ_2 about bracket 9b, and the swing-arm camera mounting groove 9d rotates forward-backward by θ_2 about swing arm 9c. The camera and the radar thus return to a pose that is fixed relative to the unmanned aerial vehicle body.
Step six: Repeated take-off and hovering
After the second-station data acquisition is completed, the unmanned aerial vehicle takes off again and performs a fixed-point hover when it reaches the set acquisition position Z_nest.
The IMU module 10 outputs a roll angle Φ_n, a pitch angle θ_n and a yaw angle ψ_n (all angles are positive counterclockwise), and the laser radar scanner outputs the height H_n above the pit.
The hover height estimate at station n is Z_nest = H_n·cos(arctan((tan²θ_n + tan²Φ_n)^(1/2))) ± L_b·sin(arctan((tan²θ_n + tan²Φ_n)^(1/2))) − (L_c − L_c·cos(arctan((tan²θ_n + tan²Φ_n)^(1/2))))
After hovering, motors apply corresponding directional torque to the radar cradle head 7 and the camera cradle head 9, so that the laser radar and the camera are prevented from tilting and shaking together with the unmanned aerial vehicle. That is, swing arm 7b rotates left-right by −Φ_n about bracket 7a, and swing arm 7c rotates forward-backward by −θ_n about swing arm 7b; swing arm 9c rotates by −Φ_n about bracket 9b, and the swing-arm camera mounting groove 9d rotates forward-backward by −θ_n about swing arm 9c. After the camera and the radar have thus been restored to their initial pose, data acquisition begins.
Z_nprecise is the precise distance to the pit measured by the n-th-station laser radar 6 through the lower right-angle emission prism 5, i.e. the height of the data collected at the n-th station is Z_nprecise, and the laser radar data are X_ni = r_ni·cosε_i + X_n_offset, Y_ni = r_ni·sinε_i + Y_n_offset, where (X_n_offset, Y_n_offset) is the deviation value relative to the initial laser-spot position recorded by the photoelectric position sensor at the n-th station.
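Illustrative note (not part of the original patent text): at every station after the first, the point cloud is assembled by adding the photoelectric position sensor offsets to the polar-to-Cartesian conversion and attaching the precisely measured height. A short Python sketch under those assumptions; the names are hypothetical.

import math

def station_point_cloud(ranges, angles, x_offset, y_offset, z_precise):
    """n-th station laser radar data: X_ni = r_ni*cos(eps_i) + X_n_offset,
    Y_ni = r_ni*sin(eps_i) + Y_n_offset, all attached to height Z_nprecise."""
    return [(r * math.cos(e) + x_offset, r * math.sin(e) + y_offset, z_precise)
            for r, e in zip(ranges, angles)]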
Meanwhile, the rotating column 9a in the camera cradle head 9 rotates 360 degrees to collect image data of the hoistway.
After data acquisition is completed, swing arm 7b rotates left-right by Φ_n about bracket 7a, and swing arm 7c rotates forward-backward by θ_n about swing arm 7b; swing arm 9c rotates by Φ_n about bracket 9b, and the swing-arm camera mounting groove 9d rotates forward-backward by θ_n about swing arm 9c. The camera and the radar thus return to a pose that is fixed relative to the unmanned aerial vehicle body.
Take-off and fixed-point hovering are repeated until all stations have been collected, or until the preset distance Z_Sest from the top floor is reached (this value is estimated by fusing the output of the upper right-angle emission prism 4 with the IMU), at which point the final acquisition operation is performed.
Step seven: Return flight and landing
Data acquisition may additionally be performed during the return flight, or the unmanned aerial vehicle may return and land directly. Both a direct return landing and an emergency landing under abnormal conditions can rely on the laser radar for navigation and obstacle avoidance.
In one embodiment, a computer readable storage medium is provided having a computer program stored thereon, which when executed by a processor, performs the steps of:
outputting a rising instruction, and acquiring initial position information of the unmanned aerial vehicle body under the condition that the unmanned aerial vehicle body is at an initial station;
acquiring current position information of an unmanned aerial vehicle body in the flight process, and processing initial position information and current position information to obtain position deviation;
correcting the ascending route of the unmanned aerial vehicle body according to the position deviation;
detecting, according to the corrected ascending route, that the unmanned aerial vehicle body has reached the next station, and entering the acquisition process for the point cloud data and image data of the current station; when the acquisition process is completed, outputting a take-off instruction, until all ascending stations have completed the acquisition process; the take-off instruction is used for instructing the unmanned aerial vehicle body to move from the current station to the next station.
In one embodiment, the step of processing the initial position information and the current position information to obtain a position deviation further comprises the steps of:
and confirming the coordinate difference value of the first current horizontal coordinate and the first initial horizontal coordinate as the position deviation.
In one embodiment, the step of processing the initial position information and the current position information to obtain a position deviation further comprises the steps of:
acquiring first current horizontal coordinates in a preset time period, and acquiring average horizontal coordinates according to each first current horizontal coordinate;
and confirming the coordinate difference value between the average horizontal coordinate and the first initial horizontal coordinate as the position deviation.
In one embodiment, the step of obtaining initial position information of the drone body when executed by the processor further performs the steps of:
acquiring an initial distance value and an initial scanning angle returned by the beam light in a scanning period through a laser radar;
obtaining a first initial horizontal coordinate according to the initial distance value and the initial scanning angle.
In one embodiment, the step of obtaining the current horizontal position of the drone body during flight, when executed by the processor, further performs the steps of:
acquiring a current distance value returned by the beam light in a scanning period through a laser radar;
acquiring the attitude variation of the unmanned aerial vehicle body through an inertial measurement unit; the attitude change amount comprises a rolling angle and a pitch angle;
and obtaining a first current horizontal coordinate according to the current distance value, the rolling angle and the pitch angle.
In one embodiment, the step of obtaining initial position information of the drone body when executed by the processor further performs the steps of:
acquiring initial position coordinates transmitted by the photoelectric position sensor, and confirming the initial position coordinates as second initial horizontal coordinates; wherein the photoelectric position sensor is arranged on the unmanned aerial vehicle body, and the initial position coordinates are obtained by the photoelectric position sensor in response to the laser emitted by the laser emitting device when the unmanned aerial vehicle body reaches the initial station.
In one embodiment, the step of obtaining the current horizontal position of the drone body during flight, when executed by the processor, further performs the steps of:
acquiring a current position coordinate transmitted by the photoelectric position sensor, and acquiring the attitude variation of the unmanned aerial vehicle body through an inertial measurement unit; the attitude change amount comprises a rolling angle and a pitch angle; the current position coordinate is obtained by the photoelectric position sensor in response to the laser emitted by the laser emitting device during flight;
acquiring a distance value between the gravity center of the unmanned aerial vehicle body and an induction surface of the photoelectric position sensor;
and processing the current position coordinate, the distance value, the rolling angle and the pitch angle to obtain a second current horizontal coordinate.
In one embodiment, the step of processing the initial position information and the current position information to obtain a position deviation further comprises the steps of:
and confirming the coordinate difference value of the second current horizontal coordinate and the second initial horizontal coordinate as the position deviation.
In one embodiment, the computer program when executed by the processor further performs the steps of:
and under the condition that all ascending stations complete the acquisition process, entering a return process.
In one embodiment, the return process, when executed by the processor, further performs the steps of:
inputting a descending instruction, and correcting a descending route of the unmanned aerial vehicle body according to the position deviation;
detecting, according to the corrected descending route, that the unmanned aerial vehicle body has reached the next station, and entering the acquisition process for the point cloud data and image data of the current station; when the acquisition process is completed, outputting a descending instruction, until all stations have completed the acquisition process; the descending instruction is used for instructing the unmanned aerial vehicle body to move from the current station to the next station;
and under the condition that all descending stations complete the acquisition process, entering a landing process.
In one embodiment, the step of detecting that the drone body arrives at the next station according to the corrected ascending route is performed by the processor further implements the steps of:
Acquiring the attitude variation of the unmanned aerial vehicle body through an inertial measurement unit; the attitude change amount comprises a rolling angle and a pitch angle;
acquiring a first distance between the unmanned aerial vehicle body and a pit, a second distance between the beam center of the laser radar and the radiation surface of a lower right-angle transmitting prism and a third distance between the intersection point of the rotation axes of the swing arms and the beam center, wherein the first distance is output by the laser radar; the lower right-angle transmitting prism and the laser radar are arranged on any swing arm; the swing arm is arranged on the unmanned aerial vehicle body and is used for swinging the laser radar;
processing the first distance, the second distance, the third distance, the rolling angle and the pitch angle to obtain the current height of the unmanned aerial vehicle body;
and if the current height is the same as the height of the next station, confirming that the unmanned aerial vehicle body reaches the next station according to the corrected ascending route.
Those skilled in the art will appreciate that implementing all or part of the above-described methods may be accomplished by way of a computer program stored on a non-transitory computer-readable storage medium which, when executed, may include the steps of the embodiments of the methods described above. Any reference to memory, storage, database or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory. The non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM) or flash memory. The volatile memory may include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus dynamic random access memory (RDRAM) and direct Rambus dynamic random access memory (DRDRAM).
The technical features of the above-described embodiments may be combined arbitrarily. For brevity of description, not all possible combinations of the technical features in the above-described embodiments are described; however, as long as there is no contradiction between the combinations of these technical features, they should be considered to fall within the scope of this description.
The above examples illustrate only a few embodiments of the application, which are described in detail and are not to be construed as limiting the scope of the application. It should be noted that it will be apparent to those skilled in the art that several variations and modifications can be made without departing from the spirit of the application, which are all within the scope of the application. Accordingly, the scope of protection of the present application is to be determined by the appended claims.

Claims (14)

1. A hoistway inspection navigation method of an unmanned aerial vehicle, characterized by comprising the following steps:
outputting a rising instruction, and acquiring initial position information of the unmanned aerial vehicle body under the condition that the unmanned aerial vehicle body is at an initial station;
acquiring current position information of the unmanned aerial vehicle body in the flight process, and processing the initial position information and the current position information to obtain position deviation;
Correcting the ascending route of the unmanned aerial vehicle body according to the position deviation;
detecting, according to the corrected ascending route, that the unmanned aerial vehicle body has reached the next station, and entering the acquisition process for the point cloud data and image data of the current station; when the acquisition process is completed, outputting a take-off instruction, until all ascending stations have completed the acquisition process; the take-off instruction is used for instructing the unmanned aerial vehicle body to move from the current station to the next station;
the step of detecting that the unmanned aerial vehicle body reaches the next station according to the corrected ascending route comprises the following steps:
acquiring the attitude variation of the unmanned aerial vehicle body through an inertial measurement unit; the attitude change amount comprises a rolling angle and a pitch angle;
acquiring a first distance between the unmanned aerial vehicle body and a pit, a second distance between the beam center of the laser radar and the radiation surface of a lower right-angle transmitting prism and a third distance between the intersection point of the rotation axes of the swing arms and the beam center, wherein the first distance is output by the laser radar; the lower right-angle transmitting prism and the laser radar are arranged on any swing arm; the swing arm is arranged on the unmanned aerial vehicle body and is used for swinging the laser radar;
Processing the first distance, the second distance, the third distance, the roll angle and the pitch angle based on the following formula to obtain the current height of the unmanned aerial vehicle body:
Z = H·cos(arctan((tan²θ + tan²Φ)^(1/2))) ± L_b·sin(arctan((tan²θ + tan²Φ)^(1/2))) − (L_c − L_c·cos(arctan((tan²θ + tan²Φ)^(1/2))));
where Z is the current height; θ is the pitch angle; Φ is the roll angle; H is the first distance; L_b is the second distance; L_c is the third distance;
and if the current height is the same as the height of the next station, confirming that the unmanned aerial vehicle body reaches the next station according to the corrected ascending route.
2. The method of claim 1, wherein the current position information comprises first current horizontal coordinates of the hoistway walls relative to the unmanned aerial vehicle body; the initial position information comprises first initial horizontal coordinates of the hoistway walls relative to the unmanned aerial vehicle body;
the step of processing the initial position information and the current position information to obtain a position deviation includes:
and confirming the coordinate difference value of the first current horizontal coordinate and the first initial horizontal coordinate as the position deviation.
3. The method of claim 2, wherein the step of processing the initial position information and the current position information to obtain a position deviation comprises:
Acquiring the first current horizontal coordinates in a preset time period, and acquiring average horizontal coordinates according to each first current horizontal coordinate;
and confirming the coordinate difference value of the average horizontal coordinate and the first initial horizontal coordinate as the position deviation.
4. The method of claim 2, wherein the step of obtaining initial position information of the unmanned aerial vehicle body comprises:
acquiring an initial distance value and an initial scanning angle returned by the beam light in a scanning period through a laser radar;
obtaining the first initial horizontal coordinate according to the initial distance value and the initial scanning angle;
the step of obtaining the current horizontal position of the unmanned aerial vehicle body in the flight process comprises the following steps:
acquiring a current distance value returned by the beam light in a scanning period through a laser radar;
acquiring the attitude variation of the unmanned aerial vehicle body through an inertial measurement unit; the attitude change amount comprises a rolling angle and a pitch angle;
and obtaining the first current horizontal coordinate according to the current distance value, the rolling angle and the pitch angle.
5. The method of claim 1, wherein the initial position information comprises a second initial horizontal coordinate of the drone body relative to a laser transmitter; wherein the laser emission device is arranged at the pit of the well;
The step of obtaining the initial position information of the unmanned aerial vehicle body comprises the following steps:
acquiring initial position coordinates transmitted by a photoelectric position sensor, and confirming the initial position coordinates as the second initial horizontal coordinates; wherein the photoelectric position sensor is arranged on the unmanned aerial vehicle body, and the initial position coordinates are obtained by the photoelectric position sensor in response to the laser emitted by the laser emitting device when the unmanned aerial vehicle body reaches the initial station.
6. The method of claim 5, wherein the current location information comprises a second current horizontal coordinate of the drone body relative to a laser transmitter;
the step of obtaining the current horizontal position of the unmanned aerial vehicle body in the flight process comprises the following steps:
acquiring current position coordinates transmitted by a photoelectric position sensor, and acquiring the attitude variation of the unmanned aerial vehicle body through an inertial measurement unit; the attitude change amount comprises a rolling angle and a pitch angle; the current position coordinates are obtained by responding to laser emitted by the laser emitting device by the photoelectric position sensor in the flight process;
acquiring a distance value between the gravity center of the unmanned aerial vehicle body and the sensing surface of the photoelectric position sensor;
And processing the current position coordinate, the distance value, the rolling angle and the pitch angle to obtain the second current horizontal coordinate.
7. The method according to claim 6, wherein in the step of processing the current position coordinates, the distance values, the roll angle, and the pitch angle to obtain the second current horizontal coordinates, the second current horizontal coordinates are obtained based on the following formula:
X_ti = X_t_offset − L_a·cos(arctan((tan²θ_t + tan²Φ_t)^(1/2)))·tanθ_t;
Y_ti = Y_t_offset − L_a·cos(arctan((tan²θ_t + tan²Φ_t)^(1/2)))·tanΦ_t;
where X_ti is the abscissa of the second current horizontal coordinate; Y_ti is the ordinate of the second current horizontal coordinate; L_a is the distance value; θ_t is the pitch angle; Φ_t is the roll angle.
8. The method of claim 6, wherein the step of processing the initial position information and the current position information to obtain a position deviation comprises:
and confirming the coordinate difference value of the second current horizontal coordinate and the second initial horizontal coordinate as the position deviation.
9. A hoistway inspection navigation method of an unmanned aerial vehicle according to any of claims 1 to 8, further comprising the steps of:
And under the condition that all ascending stations complete the acquisition process, entering a return process.
10. The method for inspection navigation of a hoistway of an unmanned aerial vehicle according to claim 9, wherein the return process comprises:
inputting a descending instruction, and correcting a descending route of the unmanned aerial vehicle body according to the position deviation;
detecting, according to the corrected descending route, that the unmanned aerial vehicle body has reached the next station, and entering the acquisition process for the point cloud data and image data of the current station; when the acquisition process is completed, outputting a descending instruction, until all stations have completed the acquisition process; the descending instruction is used for instructing the unmanned aerial vehicle body to move from the current station to the next station;
and entering a landing process under the condition that all descending stations complete the acquisition process.
11. A hoistway inspection navigation device of an unmanned aerial vehicle, characterized by comprising:
the initial position information acquisition module is used for outputting a rising instruction and acquiring initial position information of the unmanned aerial vehicle body under the condition that the unmanned aerial vehicle body is at an initial station;
the position deviation acquisition module is used for acquiring current position information of the unmanned aerial vehicle body in the flight process, and processing the initial position information and the current position information to obtain position deviation;
The correction module is used for correcting the flight route of the unmanned aerial vehicle body according to the position deviation;
the acquisition module is used for detecting, according to the corrected flight route, that the unmanned aerial vehicle body has reached the next station, and entering the acquisition process for the point cloud data and image data of the current station; when the acquisition process is completed, outputting a take-off instruction, until all stations have completed the acquisition process; the take-off instruction is used for instructing the unmanned aerial vehicle body to move from the current station to the next station;
wherein, the collection module includes:
the attitude change amount acquisition unit is used for acquiring the attitude change amount of the unmanned aerial vehicle body through the inertia measurement unit; the attitude change amount comprises a rolling angle and a pitch angle;
the distance calculation unit is used for obtaining a first distance between the unmanned aerial vehicle body and the pit, a second distance between the beam center of the laser radar and the radiation surface of the lower right-angle transmitting prism and a third distance between the intersection point of the rotation axes of the swing arms and the beam center, wherein the first distance is output by the laser radar; the lower right-angle transmitting prism and the laser radar are arranged on any swing arm; the swing arm is arranged on the unmanned aerial vehicle body and is used for swinging the laser radar;
The altitude calculating unit is configured to process the first distance, the second distance, the third distance, the roll angle, and the pitch angle based on the following formula, to obtain a current altitude of the unmanned aerial vehicle body:
Z = H·cos(arctan((tan²θ + tan²Φ)^(1/2))) ± L_b·sin(arctan((tan²θ + tan²Φ)^(1/2))) − (L_c − L_c·cos(arctan((tan²θ + tan²Φ)^(1/2))));
where Z is the current height; θ is the pitch angle; Φ is the roll angle; H is the first distance; L_b is the second distance; L_c is the third distance;
and the station arrival confirming unit is used for confirming that the unmanned aerial vehicle body arrives at the next station according to the corrected ascending route if the current height is the same as the height of the next station.
12. An unmanned aerial vehicle comprising an unmanned aerial vehicle body, and a memory and a processor provided on the unmanned aerial vehicle body, the memory storing a computer program, wherein the processor, when executing the computer program, implements the steps of the method of any one of claims 1 to 10.
13. The unmanned aerial vehicle of claim 12, further comprising a lidar, a photoelectric position sensor, an inertial measurement unit, and an image acquisition device disposed on the unmanned aerial vehicle body;
the processor is respectively connected with the laser radar, the photoelectric position sensor, the inertial measurement unit and the image acquisition equipment.
14. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 10.
CN202011228810.4A 2020-11-06 2020-11-06 Unmanned aerial vehicle well inspection navigation method and device and unmanned aerial vehicle Active CN112327898B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011228810.4A CN112327898B (en) 2020-11-06 2020-11-06 Unmanned aerial vehicle well inspection navigation method and device and unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN112327898A CN112327898A (en) 2021-02-05
CN112327898B true CN112327898B (en) 2023-08-29





Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant