WO2020139195A1 - Drone for surface defects inspection - Google Patents

Drone for surface defects inspection

Info

Publication number
WO2020139195A1
Authority
WO
WIPO (PCT)
Prior art keywords
drone
surface defects
visual data
flight
sensors
Prior art date
Application number
PCT/SG2019/050635
Other languages
French (fr)
Inventor
Wei Jun Jay ANG
Tze Huan Jake GOH
Chien Ming Mervin HOON
Kok Wee Keith NG
Yu Da TAN
Wi-Soon Mark TOH
Original Assignee
Performance Rotors Pte. Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Performance Rotors Pte. Ltd.
Priority to SG11202100153SA
Publication of WO2020139195A1

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0017Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information
    • G08G5/0021Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information located in the aircraft
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U30/00Means for producing lift; Empennages; Arrangements thereof
    • B64U30/20Rotors; Rotor supports
    • B64U30/29Constructional aspects of rotors or rotor supports; Arrangements thereof
    • B64U30/299Rotor guards
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/95Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N21/954Inspecting the inner surface of hollow bodies, e.g. bores
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0094Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0047Navigation or guidance aids for a single aircraft
    • G08G5/0069Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0073Surveillance aids
    • G08G5/0086Surveillance aids for monitoring terrain
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00Type of UAV
    • B64U10/10Rotorcrafts
    • B64U10/13Flying platforms
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations

Definitions

  • the present disclosure generally relates to a drone for defects inspection. More particularly, the present disclosure describes various embodiments of a drone for inspecting surface defects within an enclosed environment.
  • Many drones at present rely on a global navigation satellite system, such as Global Positioning System (GPS) for navigation. These drones can be used in various applications including inspection of surface defects, such as cracks on external walls of buildings or facades.
  • Drones that rely on GPS navigation tend to work well in outdoor environments where the drones can detect GPS signals with minimal interference or disruption.
  • In enclosed environments, however, GPS signals are often blocked, deviated, and/or erroneous.
  • Enclosed environments that are near power transformers and electrical transmission systems also interfere with drone navigation, especially if the drones also rely on magnetic compasses for navigation.
  • a drone for inspecting surface defects in an enclosed environment comprises: a body comprising a set of guard frames; a set of propellers for moving the drone, each propeller configured to be mounted to one of the guard frames; a set of sensors for cooperatively navigating the drone in the enclosed environment; and an inspection module configured to be mounted to the body, the inspection module comprising a set of cameras for capturing visual data of the surface defects, wherein the visual data is subsequently processed to thereby inspect the surface defects.
  • a system for inspecting surface defects in an enclosed environment comprises a set of drones, each drone comprising: a body comprising a set of guard frames; a set of propellers for moving the drone, each propeller mounted to one of the guard frames; a set of sensors for cooperatively navigating the drone in the enclosed environment; and an inspection module mounted to the body, the inspection module comprising a set of cameras for capturing visual data of the surface defects.
  • the system further comprises a computer system configured for receiving the visual data from the drones.
  • An advantage of the present disclosure is that the sensors of the drone enable it to estimate its position in an unknown enclosed environment and to navigate itself therein. More specifically, in GPS-denied enclosed environments, the drone 100 is able to localize its position and navigate, making it suitable for surface defects inspection, particularly in deep wells or underground tunnels. The drone could potentially replace human inspectors, improving safety as the inspectors are no longer exposed to hazardous environments.
  • Figure 1 is an illustration of a system for inspecting surface defects.
  • Figure 2A to Figure 2C are illustrations of a drone for inspecting surface defects.
  • Figure 3 is an illustration of a flight data graph of drone height relative to ground.
  • Figure 4 is an illustration of an inspection module for inspecting surface defects.
  • Figure 5A to Figure 5C are illustrations of a housing of the drone.
  • Figure 6A and Figure 6B are illustrations of flight data graphs from a flight stability test performed on the drone.
  • Figure 7 is an illustration of a flight data graph from a height sensor test performed on the drone.
  • Figure 8 is an illustration of a flight data graph from an optical flow test performed on the drone.
  • Figure 9 is an illustration of a flight data graph from a tunnel centering test performed on the drone.
  • Figure 10A to Figure 10D are illustrations of a user interface of a ground station.
  • Figure 11 is an illustration of inspection results from inspecting surface defects.
  • depiction of a given element or consideration or use of a particular element number in a particular figure or a reference thereto in corresponding descriptive material can encompass the same, an equivalent, or an analogous element or element number identified in another figure or descriptive material associated therewith.
  • references to "an embodiment / example", "another embodiment / example", "some embodiments / examples", "some other embodiments / examples", and so on, indicate that the embodiment(s) / example(s) so described may include a particular feature, structure, characteristic, property, element, or limitation, but that not every embodiment / example necessarily includes that particular feature, structure, characteristic, property, element or limitation. Furthermore, repeated use of the phrase "in an embodiment / example" or "in another embodiment / example" does not necessarily refer to the same embodiment / example.
  • the terms "a" and "an" are defined as one or more than one.
  • the use of "/" in a figure or associated text is understood to mean "and/or" unless otherwise indicated.
  • the term "set" is defined as a non-empty finite organization of elements that mathematically exhibits a cardinality of at least one (e.g. a set as defined herein can correspond to a unit, singlet, or single-element set, or a multiple-element set), in accordance with known mathematical definitions.
  • the recitation of a particular numerical value or value range herein is understood to include or be a recitation of an approximate numerical value or value range.
  • the system 50 includes a set of drones 100 and a computer system 200.
  • a drone 100 is an unmanned aerial vehicle (UAV) which is an aircraft without a human pilot on board.
  • the computer system 200 may include a set of ground stations 220 communicative with one or more drones 100.
  • the computer system 200 may further include a remote server 240, such as a cloud-based server, communicative with the ground stations 220.
  • each drone 100 includes a body 102 having a set of guard frames 104.
  • Each guard frame 104 may be removably mounted to the body 102 such that the guard frame 104 is removable for replacement.
  • Each guard frame 104 includes a plurality of guard spars 106. In many embodiments as shown in Figure 2C, the number of guard spars 106 is five, as this optimizes rigidity, support, and structural strength of the guard frame 104. It will be appreciated that fewer or more guard spars 106 may be possible, as will be readily understood by the skilled person.
  • the drone 100 further includes a set of propellers 108 for moving the drone 100.
  • the propellers 108 operate to generate propulsion thrust for flight motion of the drone 100.
  • the propellers 108 simultaneously generate prop wash downwards away from the propellers 108 to counteract the upwards airlift of the drone 100.
  • the drone 100 using multiple propellers 108 for flight motion may be referred to as a multicopter.
  • the drone 100 uses four propellers 108 and may be referred to as a quadcopter, although fewer or more propellers 108 may be possible.
  • detailed flight mechanics of a quadcopter are not described herein as this would be readily understood by the skilled person.
  • Each propeller 108 is configured to be mounted to one of the guard frames 104. Specifically, each propeller 108 is configured to be mounted underneath the guard spars 106 of the respective guard frame 104. By positioning the propellers 108 underneath the guard spars 106, the prop wash generated by the propellers 108 during flight motion of the drone 100 will be unobstructed, thereby increasing the efficiency of the propellers 108.
  • the guard frame 104 and guard spars 106 protect the propellers 108 from coming into contact with other objects or obstacles, particularly when the drone 100 is in flight motion within the enclosed environment, as such contact could result in damage to the propellers 108.
  • each propeller 108 may be removably mounted to the respective guard frame 104, such that the propellers 108 can be replaced if they are damaged.
  • Each propeller 108 includes a plurality of propeller blades 108a and a propeller motor 108b for rotating the propeller blades 108a to generate the propulsion thrust.
  • two-bladed or three-bladed propellers 108 are used in the drone 100.
  • the size of the propeller motors 108b is dependent at least on the overall weight of the drone 100, including the propellers 108 themselves and the avionics carried on the drone 100. In selecting the suitable propeller motors 108b, the size of the propeller blades 108a, e.g. blade length, was first selected based on an estimated overall weight of the drone 100.
  • Both 4-inch and 5-inch propeller blades 108a and corresponding propeller motors 108b were considered.
  • 4-inch propellers 108 may be used to keep the drone 100 small and light, but the avionics carried on the drone 100 may increase the overall weight of the drone 100 to beyond the operation limit of the 4-inch propellers 108.
  • 5-inch propellers 108 may be used to generate more propulsion thrust, but the size of the blades 108a and motors 108b adds to the overall weight of the drone 100.
  • One factor in this consideration was how the 4-inch or 5-inch propellers 108 affected the flight time of the drone 100. A flight time of 10 minutes was selected as a benchmark parameter. For controllability, the drone 100 should have a thrust-to-weight ratio of approximately 2; a rough check of this ratio is sketched below.
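  • As a rough back-of-envelope check of the thrust-to-weight requirement (a minimal sketch with illustrative numbers; the per-motor figure is not stated in the patent):

      # Thrust-to-weight check: a controllable multicopter should have total
      # available thrust of about twice its weight.
      def required_thrust_per_motor(total_mass_kg, num_motors=4, ratio=2.0, g=9.81):
          """Return the thrust (in newtons) each motor must produce."""
          total_thrust_n = ratio * total_mass_kg * g
          return total_thrust_n / num_motors

      # Example: the simulated 1.134 kg "5-inch" drone.
      print(required_thrust_per_motor(1.134))  # ~5.56 N per motor (~567 gf)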
  • Drones using different models of 4-inch and 5-inch propeller motors 108b were simulated and tested.
  • the 4-inch propeller motors 108b included the Lumenier RX1806, T-Motor F30, and Xnova RM1407.
  • the 5-inch propeller motors 108b included the DJI Snail, T-Motor F40, and Brotherhobby 2205.
  • Operating temperatures of the drone 100, particularly from operation of the propellers 108, should be kept low to prevent the drone 100 from overheating, which would likely damage the drone 100.
  • the technical specifications of the various propeller motors 108b were considered to identify the motors 108b that are able to achieve lower operating temperatures.
  • a comparison of the 4-inch propeller motors 108b showed that the Lumenier RX1806 is the largest motor, but it has a relatively low Kv rating (number of revolutions per minute (rpm) that a motor turns when one volt is applied with no load attached to that motor), which provides good propulsion thrust and efficiency. Based on the technical specifications, the Lumenier RX1806 has higher efficiency compared to other 4-inch propeller motors 108b. It was inferred that the Lumenier RX1806 would achieve the lowest operating temperature and would thus be suitable for the propellers 108 of the drone 100.
  • a comparison of the 5-inch propeller motors 108b showed the DJI Snail is the lightest motor and is able to achieve the lowest operating temperature. It was inferred that the DJI Snail would be suitable for the propellers 108 of the drone 100.
  • Both the Lumenier RX1806 4-inch propeller motor 108b and the DJI Snail 5-inch propeller motor 108b were selected for a flight simulation analysis performed on eCalc, an electric motor calculator. The flight time of 10 minutes was selected as a benchmark parameter in this simulation analysis.
  • the drone 100 was simulated to have an overall weight of 1 kg when fitted with the Lumenier RX1806 4-inch propeller motors 108b and corresponding 4-inch propeller blades 108a. The simulation result showed that the "4-inch" drone 100 flew for 5.4 minutes.
  • the drone 100 was then simulated with the DJI Snail 5-inch propeller motors 108b and corresponding 5-inch propeller blades 108a fitted.
  • the "5-inch" drone 100 weighed approximately 1.134 kg. However, although the overall weight is greater, the simulation result showed that the "5-inch" drone 100 achieved better performance in terms of flight time, estimated operating temperature, and thrust-to-weight ratio.
  • the results from our simulation analysis also showed that the overall weight of the "4-inch" drone 100 should be reduced by 0.35 kg in order to achieve the benchmark 10-minute flight time, while the overall weight of the "5-inch" drone 100 should be reduced by only 0.134 kg in order to achieve the benchmark 10-minute flight time.
  • the DJI Snail 5-inch propellers 108 were considered to be suitable for the drone 100.
  • the drone 100 was assembled with the DJI Snail 5-inch propellers 108 and a live flight test was performed.
  • a flight data log of the drone 100 was extracted from the flight controller software, such as Pixhawk.
  • Figure 3 shows a graph of the flight data log, plotting height of the drone 100 above ground against time. It can be seen from the graph that the drone 100 was able to achieve a flight time of 10 minutes, thus verifying that the DJI Snail 5-inch propellers 108 are suitable for the drone 100.
  • there may be other models of 5-inch propellers 108 suitable for the drone 100 which will allow for a heavier drone 100 to fly without achieving the maximum propulsion thrust of the 5-inch propellers 108.
  • using 4-inch propellers 108 would not be suitable as the "4-inch" drone 100 barely achieved the maximum propulsion thrust that the 4-inch propellers 108 should normally achieve.
  • the body 102 provides physical protection to the drone 100, especially for the propellers 108 and avionics carried on the drone 100.
  • Other factors in designing the body 102 include weight, durability, ease of fabrication, and aerodynamics.
  • the body of the Lumenier Danaus drone was used as a reference in designing the body 102 of the drone 100 as it was easy to fabricate.
  • the body 102 has dimensions of approximately 330 mm by 300 mm with an inverted propeller design as described above.
  • each guard frame 104 has more guard spars 106 for better protection of the propellers 108 from debris without compromising on the structural support required.
  • the body 102 also provides structural support around the boundary of the centre of the body 102 for mounting of other components, such as a housing 110 for avionics.
  • a finite element analysis was performed to optimize the design of the body 102. Theoretical maximum loadings were applied on the body 102 in an effort to reduce the weight of the body 102. Although only a few grams could be reduced, every gram saved would improve the flight time of the drone 100.
  • the finite element analysis was performed by simulating the drone 100 falling from a height of 30 m, specifically simulating power loss when the drone 100 is flying at its designed maximum operating altitude. Based on results of the finite element analysis and the ultimate tensile strength (which is the maximum tensile stress that a material can withstand before failing or breaking) of carbon fibre (which is a suitable material for the body 102), the safety factor was calculated to be approximately 2.032.
  • the safety factor is a ratio of ultimate tensile strength over the maximum observed stress.
  • the calculated safety factor of at least 2 is greater than the recommended safety factor of 1.5 for aircraft and spacecraft, as indicated in the document "The Ultimate Factor of Safety for Aircraft and Spacecraft, Its History, Applications and Misconceptions" on the NASA Technical Reports Server. Accordingly, the drone 100 is safe for its intended application of surface defects inspection; a short worked check of the safety factor follows below.
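  • As a worked illustration of the safety factor relation above (a sketch only; the UTS value is illustrative and not taken from the patent):

      # safety_factor = ultimate_tensile_strength / maximum_observed_stress
      def safety_factor(uts_mpa, max_stress_mpa):
          return uts_mpa / max_stress_mpa

      # With the reported factor of ~2.032, a carbon-fibre layup with e.g. 600 MPa
      # UTS would correspond to a maximum simulated stress of about 295 MPa.
      print(600 / 2.032)  # ~295.3 MPa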
  • the drone 100 further includes a set of sensors 112 for cooperatively navigating the drone 100 within the enclosed environment.
  • the sensors 112 may include a plurality of obstacle detection sensors for detecting obstacles around the drone 100 within the enclosed environment.
  • the obstacle detection sensors are disposed around the periphery of the drone 100 to detect obstacles at a predefined distance away from the drone 100, and thus enable the drone 100 to avoid these obstacles during navigation.
  • the predefined distance may be 0.5 m and may be set by the user during operation of the drone 100.
  • the obstacle detection sensors are time-of-flight sensors.
  • Other types of obstacle detection sensors were also considered, such as ultrasonic sensors and infrared sensors.
  • An ultrasonic sensor emits a high-frequency sound pulse and measures how long it takes for the echo of the sound to reflect back from the obstacle.
  • ultrasonic sensors are not suitable because the reflected echo would suffer interference from external sound waves, especially since the drone 100 is intended to be operated in an enclosed environment where sounds can be reflected from all directions.
  • An infrared sensor works in accordance with the infrared reflection principle to detect obstacles.
  • infrared sensors are not suitable because they have a low proximity range of less than a metre, and the consistency of accurate measurements worsens the further the infrared sensors are from the obstacles.
  • a time-of-flight sensor is a range imaging sensor that measures the time of flight of a light signal emitted from a light source at the sensor to the obstacle and reflected back off the obstacle. The distance between the sensor and the obstacle can be calculated since the speed of light is known.
  • the light source may be a laser or a light-emitting diode (LED).
  • Time-of-flight sensors using lasers may be known as a class of LiDAR sensors, and these are comparatively more accurate and involve fewer compromises on measurement results compared to the ultrasonic and infrared sensors. This is because the emitted laser is able to travel in a straight line and across a long distance. Additionally, the time-of-flight sensors can operate even in enclosed environments with minimal lighting, as is the case in deep wells and tunnels. The underlying distance calculation is sketched below.
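  • The time-of-flight ranging principle reduces to one line: distance = (speed of light × round-trip time) / 2. A minimal sketch; real sensors such as the TeraRanger Multiflex perform this on-board:

      C = 299_792_458.0  # speed of light in m/s

      def tof_distance_m(round_trip_time_s):
          """Distance to the obstacle from the measured round-trip time of the light pulse."""
          return C * round_trip_time_s / 2.0

      # A 0.5 m obstacle distance corresponds to a ~3.34 ns round trip:
      print(tof_distance_m(3.336e-9))  # ~0.5 m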
  • An example of the time-of-flight sensors is the TeraRanger Multiflex, which contains eight sensors, each having a maximum proximity range of 2 m, making it suitable for the drone 100 to detect obstacles at approximately 0.5 m away. At least six time-of-flight sensors are fitted across both sides of the drone 100 for optimal obstacle avoidance and navigation, although fewer or more sensors may be possible.
  • the sensors 112 include an optical flow sensor for detecting forward motion of the drone 100.
  • the sensors 112 may further include a height sensor for detecting a ground within the enclosed environment.
  • the height sensor may be a time-of-flight sensor or an altimeter.
  • a flight controller 130 combines sensor data from the sensors 112 and navigates the drone 100 accordingly while avoiding detected obstacles.
  • the time-of-flight obstacle detection sensors enable the drone 100 to detect the two side walls of the tunnel (Y-direction) and to stabilize the drone 100 in the centre of the tunnel.
  • the least-squares line method may be used to plot a line that represents the walls based on the sensor data.
  • the optical flow sensor complements the navigation by tracking motion of the drone 100 along the tunnel (X-direction), while the height sensor detects the ground of the tunnel underneath the drone 100, thus stabilizing the drone 100 at a suitable height above the ground (Z-direction).
  • the flight controller 130 computes the flight path of the drone 100 and the flight path data can be communicated to the ground station 220.
  • the drone 100 further includes an inspection module 114 as shown in Figure 4.
  • the inspection module 114 is configured to be mounted to the body 102 and may be configured to be removably mounted to the body 102 so that it can be replaced if necessary.
  • the inspection module 114 includes a set of cameras 116 for capturing visual data of the surface defects.
  • the visual data may include image data, e.g. a set of images, and/or video data, e.g. a series of images, of the surface defects.
  • the set of cameras 116 includes a thermal camera 116a.
  • the set of cameras 116 includes an optical camera 116b.
  • the cameras 116 include the thermal camera 116a and optical camera 116b as shown in Figure 4.
  • the thermal camera 116a is an infrared camera for capturing thermal images in the infrared spectrum.
  • the optical camera 116b captures normal optical images / videos in the visible light spectrum.
  • the inspection module 114 may include a light source, such as a dimmable LED, for providing suitable illumination to the optical camera 116b for capturing the optical images. It will be appreciated that there may be other types of cameras 116 or visual sensors implemented in the inspection module 114 for capturing visual data of the surface defects.
  • the inspection module 114 further includes a gimbal 118 for stabilizing the cameras 116 while the drone 100 is in motion.
  • the gimbal 118 was designed to optimize its shape and performance while minimizing its overall size.
  • a small-sized thermal camera 116a and/or optical camera 116b were selected.
  • the thermal camera 116a is an AMG8833 sensor which is small and discreet and is able to capture proper thermal images.
  • although the AMG8833 sensor has only an 8x8 resolution array, this can be improved to 70x70 through interpolation and programming, as will be readily understood by the skilled person; a minimal interpolation sketch is shown below.
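  • Upsampling the AMG8833's 8x8 thermal array to 70x70 can be done by interpolation; a minimal sketch using SciPy's cubic zoom (the patent does not specify the interpolation method, so cubic interpolation is an assumption here):

      import numpy as np
      from scipy.ndimage import zoom

      raw = np.random.uniform(20.0, 35.0, size=(8, 8))  # stand-in for one AMG8833 frame (deg C)

      upscaled = zoom(raw, 70 / 8, order=3)  # order=3 -> cubic interpolation
      print(upscaled.shape)                  # (70, 70)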
  • An example of the optical camera 116b is the Foxeer Legend 1.
  • the gimbal 118 may be designed to operate on two axes or three axes. However, the 3-axis gimbal is larger and heavier than the 2-axis gimbal and may not be suitable for use with the drone 100. As such, the 2-axis gimbal 118, which operates on two axes, was selected for the drone 100. The centre of gravity of the cameras 116 was obtained before fitting the gimbal 118, so that the gimbal 118 can be appropriately balanced and further so that less energy is required to stabilize the cameras 116 while the drone 100 is in motion. Figure 4 shows the cameras 116 fitted with the gimbal 118.
  • the drone 100 further includes the housing 110 for the avionics as well as for the sensors 112 and inspection module 114.
  • the housing 110 may be configured to be removably mounted to the body 102, such as by way of an attachment mechanism 120 that may be lockable to secure the housing 110 while the drone 100 is moving.
  • the attachment mechanism 120 allows the user to easily access and remove the housing 110, such as to access the internal avionics for repairs and/or replacements.
  • the inspection module 114 is housed within the housing 110 so that it is protected by the housing 110.
  • the housing 110 includes an optically clear window 122 so that the cameras 116 can still capture visual data even when housed within the housing 110.
  • the inspection module 114 is disposed underneath the housing 110.
  • the housing 110 provides physical protection to the avionics housed inside. Additionally, the housing 110 is configured to be water resistant. Seals such as rubber gaskets or silicone sealant are provided at interfacing joints of the housing 110 to achieve the water resistance.
  • the housing 110 may be formed of two halves or shells coupleable together at interfacing joints which are lined with sealing gaskets. Other interfacing joints may be between the housing 110 and a battery compartment 124.
  • the housing 110 may be formed such that a large space is provided underneath the housing 110 for disposing the battery compartment 124, the large space allowing for a large high-capacity battery to be used, such as a lithium-ion polymer battery, to maximize the flight time of the drone 100.
  • the housing 110 may be rated with an Ingress Protection rating of IP55, i.e. the housing 110 is protected from limited dust ingress and from low-pressure water jets from any direction.
  • the housing 110 was tested for its water resistance and the IP55 rating without any avionics housed inside.
  • a water spray test was performed using rubber gaskets as the seals at the interfacing joints, and the result was that there was no water ingress into the housing 110.
  • Another water spray test was performed using silicone sealant as the seals at the interfacing joints, and the result was better water resistance and ingress protection.
  • the material for the housing 110 is selected to withstand environmental elements such as heat, rain, and humid conditions, and to ensure minimal or zero water ingress which can harm the crucial avionics housed inside.
  • the material should also be mechanically strong to securely mount to the body 102 and to provide physical protection to the avionics. Density and weight of the material are also factors in the material selection as they affect the maximum flight time of the drone 100. Other factors which may be considered in the material selection include ability to withstand operating temperatures, and mechanical properties such as bending stiffness, toughness, and vibration damping.
  • the material of the housing 110 is epoxy resin as it has a higher glass transition temperature range and maximum service temperature. It will be appreciated that other materials may be possible in consideration of the aforementioned factors.
  • the size of the housing 110 is sufficiently large to accommodate larger avionics, allow greater freedom for wire management within the housing 110, and increase empty spaces between the various avionics. Larger empty spaces between the avionics can improve heat dissipation from the avionics.
  • Other avionics which the housing 110 may house include a geolocation module 126 having a GPS module and/or a magnetic compass, electrical interface 128, flight controller 130, electronic speed controller 132, flight telemetry communication module 134, visual data communication module 136, and power delivery module 138.
  • the avionics are positioned within the housing 110 in consideration of their electromagnetic outputs as well as their thermal characteristics.
  • the housing 110 is separated into three regions - the front region 110a for low thermal activity such as for temperatures below 35 °C, the middle region 110b for moderate thermal activity such as for temperatures from 35 °C to 45 °C, and the rear region 110c for high thermal activity such as for temperatures up to 60 °C.
  • Avionics of different operating frequencies are distributed throughout the housing 110, but avionics of similar operating temperatures are grouped together at the appropriate regions 110a to 110c.
  • the inspection module 114 and geolocation module 126 may be positioned at the front region 110a; the electrical interface 128, flight controller 130, and flight telemetry communication module 134 may be positioned at the middle region 110b; and the battery compartment 124, electronic speed controller 132, visual data communication module 136, and power delivery module 138 may be positioned at the rear region 110c.
  • the electrical interface 128 is configured to provide electrical connections between the avionics of the housing 110 and the sensors 112.
  • the electrical interface 128 may be in the form of an array of pogo pins.
  • the pogo pin array is easily connected and disconnected so that the housing 110 is modular and can be easily swapped to other bodies 102 of other drones 100, or conversely swapping the housing 110 of one drone 100 with another housing 110.
  • the visual data communication module 136 is configured for communicating the visual data to the ground station 220, which may occur via a communication protocol such as a 5.8 GHz video link.
  • the power delivery module 138 is configured to transfer power from the battery compartment 124 to the avionics of the drone 100, including the electrical interface 128, sensors 112, and cameras 116.
  • the flight telemetry communication module 134 is configured for communicating with the ground station 220, which may include a remote controller that controls flight motion of the drone 100.
  • the flight telemetry communication module 134 may communicate with the ground station 220 via a telemetry communication protocol such as MAVLink Telemetry.
  • An example of the flight telemetry communication module 134 is an RFD900 radio modem; a minimal telemetry-reading sketch over MAVLink is shown below.
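  • A minimal sketch of reading flight telemetry over a MAVLink connection using pymavlink; the serial port and baud rate are assumptions (an RFD900 typically appears as a serial device on the ground station):

      from pymavlink import mavutil

      master = mavutil.mavlink_connection('/dev/ttyUSB0', baud=57600)  # assumed port/baud
      master.wait_heartbeat()  # block until the drone's heartbeat is received

      for _ in range(10):
          msg = master.recv_match(type='ATTITUDE', blocking=True)
          print(f"roll={msg.roll:.3f} rad, pitch={msg.pitch:.3f} rad, yaw={msg.yaw:.3f} rad")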
  • the ground station 220 or remote controller sends control signals to the flight telemetry communication module 134.
  • the flight controller 130 processes the control signals in cooperation with sensor data from the sensors 112 and optionally with other information such as from one or more inertial measurement units (IMU), a gyroscope, and the geolocation module 126.
  • the flight controller 130 performs the necessary computations and sends the results to the electronic speed controller 132.
  • the electronic speed controller 132 is configured to vary the speed, direction, and braking of the propellers 108. Using the results from the flight controller 130, the electronic speed controller 132 controls the propellers 108 according to the control input of the user at the ground station 220 using the remote controller.
  • the flight controller 130 is selected based on versatility to tweaks and changes so that the drone 100 can be customized for deployment in an enclosed environment.
  • the flight controller 130 supports companion computers via connections through high-bandwidth communication ports, and has multiple redundancy IMUs and a 32-bit flight processor.
  • the flight controller 130 is a Pixhawk 2.1 supported by the versatile PX4 firmware.
  • the Pixhawk 2.1 flight controller 130 is compatible and integrated with a companion computer such as a single-board computer (SBC).
  • the SBC may be an Intel Edison computer-on-module as it is compact and lightweight, it can be easily integrated with the Pixhawk 2.1 flight controller 130, and it is computationally powerful enough to perform algorithms with sensor data from the sensors 112, particularly to localize the drone 100 in an unknown enclosed environment and navigate the drone 100 in the enclosed environment.
  • the flight controller 130 uses various algorithms to achieve semi-autonomous flight for the drone 100.
  • the flight controller 130 may use the least-squares matching (LSQM) algorithm which is based on the least-squares line method for finding a line with the sensor data. Assuming that the drone 100 is deployed in a square tunnel, the flight controller 130 attempts to solve a linear system using sensor data from the multiple sensors 112, specifically the ranging (time-of-flight) sensors on both sides of the drone 100, and with the least-squares line method finds two lines that represent the walls of the tunnel.
  • the angle of the line corresponding to the tunnel wall can be estimated.
  • the distance and angular orientation of the drone 100 relative to the tunnel walls can be computed as the positions of all the sensors 112 around the drone 100 are known.
  • the flight controller 130 then controls flight motion of the drone 100 based on its computations using the sensor input data; a minimal line-fitting sketch is shown below.
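  • A minimal sketch of the least-squares line idea: fit a line y = m*x + c to the points returned by the side-mounted ranging sensors (in the drone's body frame), then recover the drone's angle and perpendicular distance to that wall. The sensor geometry and readings here are hypothetical, not the flight controller's actual implementation:

      import numpy as np

      # Three ToF sensors mounted along the drone's x-axis, each measuring the
      # lateral (y) distance to the right-hand tunnel wall.
      sensor_x = np.array([-0.10, 0.0, 0.10])   # sensor positions along the body (m)
      wall_y = np.array([0.52, 0.50, 0.47])     # measured distances to the wall (m)

      m, c = np.polyfit(sensor_x, wall_y, 1)    # least-squares fit of y = m*x + c

      angle_rad = np.arctan(m)                  # drone's yaw relative to the wall
      distance_m = c / np.sqrt(1 + m * m)       # perpendicular distance to the wall
      print(f"angle: {np.degrees(angle_rad):.1f} deg, distance: {distance_m:.2f} m")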
  • the flight controller 130 may use the second order extended Kalman filter (EKF2) algorithm that fuses or combines all the sensor input data from all the sensors 112.
  • the EKF2 algorithm then linearizes the fused sensor input data for calculations of the drone's current estimated position and attitude in 3D space and time. Using this linearized function, the flight controller 130 compares the current data with past data, as well as taking into consideration the user's control input from the ground station 220 / remote controller, and decides on its autonomous control output. The control output is sent to the electronic speed controller 132 which then controls the propellers 108 to physically change the position and attitude of the drone 100, thus demonstrating the semi-autonomous capability of the drone 100 (a deliberately simplified filtering sketch follows below).
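  • The on-board EKF2 fuses many sensor streams at once; as a deliberately simplified illustration of the same predict-update idea (a 1D Kalman filter on height, not the actual PX4 EKF2):

      def kalman_1d(measurements, process_var=1e-3, meas_var=4e-2):
          x, p = measurements[0], 1.0          # initial state estimate and variance
          estimates = []
          for z in measurements:
              p += process_var                 # predict: state unchanged, uncertainty grows
              k = p / (p + meas_var)           # Kalman gain
              x += k * (z - x)                 # update with the new measurement
              p *= (1 - k)
              estimates.append(x)
          return estimates

      # Noisy height readings fuse into a smoothed estimate near 1.30 m.
      print(kalman_1d([1.31, 1.28, 1.33, 1.30, 1.29])[-1])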
  • the first test was a flight stability test on whether the drone 100 is controllable and stable in flight motion.
  • the gains of the drone 100 were tuned in manual flight during the test.
  • Figure 6A shows the test results for a poorly-tuned drone 100.
  • the plot of the estimated roll angle fluctuated at a high frequency because the drone 100 vibrated under very high gains, with the controls overcompensating while balancing the drone 100 in mid-air.
  • the drone 100 was then properly tuned by reducing the control gains and thus reducing the overcompensation, and the test results for the properly-tuned drone 100 are shown in Figure 6B.
  • the estimated roll angle did not fluctuate at a high frequency as the drone 100 was not overcompensating.
  • the drone 100 was not vibrating in mid-air and this means that the properly-tuned drone 100 is stable in mid-air.
  • the reduction in vibration of the drone 100 is beneficial for the sensors 112 on the drone 100 to measure good sensor input data so that the flight controller 130 can compute the position of the drone 100 more accurately.
  • the second test was a height sensor test on the altitude holding stability of the drone 100 using the height sensor for relative height estimation.
  • the height sensor used was a time-of-flight sensor mounted at the bottom of the drone 100.
  • Figure 7 shows the measured height of the drone 100 above the ground based on sensor data from the height sensor.
  • the drone 100 was hovered at an estimated height of 1.3 m above ground. Once the drone 100 had stabilized at that height, the drone 100 was flown over a box, which resulted in a decrease in measured height.
  • because the flight controller 130 was running a height-holding algorithm, it compensated by climbing to restore its measured height relative to the box, thus increasing the height of the drone 100 above ground.
  • the drone 100 was then hovered above the box before landing; a minimal height-hold loop is sketched below.
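  • A minimal proportional height-hold step of the kind exercised in this test (a sketch, not the actual flight controller code; gain and limits are illustrative):

      def height_hold_step(target_m, measured_m, kp=1.5, max_climb=0.5):
          """Return a commanded climb rate (m/s) from the height error."""
          error = target_m - measured_m
          return max(-max_climb, min(max_climb, kp * error))

      # Flying over a box: measured height suddenly drops, so the controller
      # commands a climb until the 1.3 m setpoint is restored.
      print(height_hold_step(1.3, 0.9))   # 0.5 m/s (saturated climb)
      print(height_hold_step(1.3, 1.28))  # ~0.03 m/s (nearly settled)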
  • the third test was an optical flow test on the ability of the drone 100 to estimate its position in an enclosed environment or indoors using the optical flow sensor. This test was performed in an outdoor environment where the drone 100 is not bounded by any walls.
  • Figure 8 shows the XY position of the drone 100 estimated by the optical flow sensor. Even though the plot appears messy, it was concluded that the drone 100 held its position, drifting occasionally where there were poor readings from the optical flow sensor. Such drifting appears as spots in the plot, where these spots are areas of high-density estimated sensor data. This test confirmed that optical flow position hold is plausible, and that the drone 100 can estimate its position indoors using the optical flow sensor.
  • the fourth test was a tunnel centering test on whether the drone 100 can hold its position in enclosed environments or confined spaces using the time-of-flight sensors on the sides of the drone 100.
  • This test was performed on the drone 100 in a mock-up tunnel and tested the drone’s ability to centre itself in the tunnel and to hold its flight heading or direction.
  • using the optical flow sensor for the X position and the time-of-flight sensors for the Y position, the XY position of the drone 100 between the two tunnel walls can be estimated.
  • Figure 9 shows the plot of the XY position of the drone 100 from a hover test done in the mock-up tunnel.
  • the Y position of the drone 100 can be better estimated as the drone 100 is bounded by the tunnel walls, resulting in more stable hovering.
  • the plot also demonstrated the flight stability of the drone 100 as it drifted within a small circle of a 20 cm radius.
  • the drone 100 was evaluated to be able to estimate its position in an unknown enclosed environment and to navigate itself in the enclosed environment. More specifically, the drone 100 is able to achieve such position estimate and navigation in the enclosed environment, e.g. tunnel, where GPS signals are weak or non-existent, i.e. the enclosed environment is GPS-denied. This means that the drone 100 is suitable for use in many types of enclosed environments for surface defects inspection, particularly in deep wells or underground tunnels where there is poor or no GPS reception and where normal drones could not navigate since they are reliant on GPS.
  • the drone 100 is communicative with the ground station 220, such as for sending flight control input signals from a remote controller and extraction of flight data from the drone 100.
  • the ground station 220 may be communicative with one or more drones 100, meaning the ground station 220 is able to control and extract flight data from the one or more drones 100.
  • the system 50 may include one or more ground stations 220 where each ground station 220 is communicative with its own set of drones 100.
  • the ground stations 220 form part of the computer system 200 which also includes the remote server 240 which is communicative with each ground station 220.
  • the ground station 220 sends the visual data of the surface defects for processing by the remote server 240 to thereby inspect the surface defects, and the remote server 240 returns the inspection results to the ground station 220.
  • the inspection results may include a confidence level of each surface defect which indicates how likely the inspected surface defect is a true defect.
  • the ground station 220 has a user interface that provides the user with visualization of the drone's current parameters, a 2D map representation based on the drone's localized position data from the sensors 112, and the visual data of surface defects captured by the cameras 116. Operation of the user interface should be intuitive to the user with minimal training / guidance, especially for a user who is not technically trained in drone control. To minimize the number of interactive elements shown on the user interface, which simplifies operation of the user interface and saves time, operation of the user interface may be split across three distinct screens. Specifically, these are the Main Screen 300 as shown in Figure 10A, the Recording Screen 320 as shown in Figure 10B, and the Playback Screen 340 as shown in Figure 10C.
  • the Main Screen 300 shows a number of log files 302, each containing the drone’s flight position data and the visual data of the surface defects that are recorded over a particular time period. The user may select a log file 302 to view the past data in the Playback Screen 340.
  • the Main Screen 300 also shows a Record function 304 for the user to start a new recording or logging of the drone’s current position data and surface defects visual data, as shown in the Recording Screen 320.
  • the Recording Screen 320 shows a map representation 322 based on the flight position data received from the drone 100. No representation of the drone’s flight path is shown in the map representation 322 immediately after the recording starts, since the drone 100 is not yet moving along any flight path. However, after some time, the map representation 322 becomes one as shown in Figure 10D, showing the flight path of the drone 100 across a time period.
  • the current position of the drone 100 is represented by a solid dot on the map representation 322, together with the corresponding XY coordinates 324 on the bottom left of the map representation 322.
  • the map representation 322 includes a grid to allow the user to estimate the scale of the map representation 322 and the size of each grid square 326 is shown on the bottom right of the map representation 322.
  • the map representation 322 also provides standard zooming and panning functionalities.
  • the Recording Screen 320 displays current parameters 328 of the drone 100.
  • the current parameters 328 may include, but are not limited to, battery voltage, vertical speed, ground speed, and yaw angle.
  • the Recording Screen 320 displays the visual data captured by the cameras 116, such as in the form of a live video feed or stream 330.
  • the user may activate a Stop function 332 and the position and visual data recorded thus far will be stored on a new log file 302.
  • the Playback Screen 340 shows the recorded flight position data 342 and recorded visual data 344 of the surface defects 346.
  • the recorded visual data 344 is communicated to the remote server 240 for processing and inspecting the surface defects 346 during playback of the recorded log file 302. In some other embodiments, this is done during recording while the drone 100 is in flight motion.
  • the ground station 220 receives the visual data from the drone 100 and simultaneously sends the visual data to the remote server 240 for further processing. This allows the ground station 220 to obtain live inspection results of the surface defects while the drone 100 is still flying.
  • boxes 348 are rendered for each surface defect 346 and each box 348 indicates the confidence level of that surface defect 346 being a true one. For example, a confidence level of 0.91 indicates that the inspected surface defect 346 has a more than 90% chance of being a true and potentially problematic defect. Appropriate follow-up actions can then be planned accordingly to address the high-confidence surface defects 346.
  • the Playback Screen 340 may further include a line graph 350 of confidence level against captured frame.
  • the recorded visual data 344 is in the form of a video feed which contains a series of frames.
  • the horizontal axis of the line graph 350 represents the frames while the vertical axis represents the confidence level, ranging from 0 to 1, of the surface defects 346 in each frame.
  • the user interface thus allows the user to quickly inspect a surface, such as an internal wall of a tunnel or deep well, for surface defects. Potential surface defects with their respective confidence levels are boxed and highlighted to the user so that appropriate rectification actions may be taken. Notably, the time taken to complete an inspection process in an enclosed environment depends on the flight speed of the drone 100 and the processing time required by the remote server 240 to process the captured visual data.
  • the ground station 220 is communicative with the remote server 240 via a communication network such as across the Internet.
  • the communication network is a medium or environment through which content, notifications, and/or messages are communicated among various entities.
  • Some non-limiting examples of the communication network include a virtual private network (VPN), wireless fidelity (Wi-Fi) network, light fidelity (Li-Fi) network, local area network (LAN), wide area network (WAN), metropolitan area network (MAN), satellite network, Internet, fiber optic network, coaxial cable network, infrared (IR) network, radio frequency (RF) network, and any combination thereof.
  • Various entities in the communication network may connect in accordance with various wired and wireless communication protocols, such as Transmission Control Protocol / Internet Protocol (TCP/IP), User Datagram Protocol (UDP), 2nd to 5th Generation (2G to 5G) communication protocols, Long Term Evolution (LTE) communication protocols, and any combination thereof.
  • Examples of the ground station 220 and remote server 240 include computers, laptops, mini-computers, mainframe computers, any non-transient and tangible machines that can execute machine-readable code, cloud-based servers, distributed server networks, and networks of computer systems.
  • a server is a physical or cloud data processing system on which a server program runs. The server may be implemented in hardware or software, or a combination thereof.
  • the ground station 220 and remote server 240 each includes a processor, a memory, and various other modules or components. The modules and components thereof are configured for performing various operations or steps and are configured as part of the processor. Such operations or steps are performed in response to non-transitory instructions operative or executed by the processor.
  • the memory is used to store instructions and perhaps data which are read during program execution.
  • the memory may be referred to in some contexts as computer-readable storage media and/or non-transitory computer-readable media.
  • Non-transitory computer-readable media include all computer-readable media, with the sole exception being a transitory propagating signal per se.
  • the remote server 240 processes the visual data of the surface defects to thereby inspect the surface defects, such as including calculating a confidence level of the surface defects which may include cracks of various shapes and sizes.
  • the remote server 240 provides an application programming interface (API) cooperative with the user interface of the ground station 220 so that the ground station 220 can send the captured visual data to the remote server 240 for subsequent processing.
  • Various inspection models, trained with pre-existing training data including prior known data associated with surface defects, may be implemented on the remote server 240 to process the visual data and inspect the surface defects. Sufficient training data is necessary to train a robust inspection model, and more training data facilitates training with a larger number of parameters without overfitting. Having more variance in the training data will also allow the model to generalize better when operating on new / unknown data under a variety of different conditions, such as viewpoint, illumination level, and noise patterns.
  • the inspection model implemented on the remote server 240 is a geometric features model.
  • This model provides a logical system to capture geometric features or characteristics of the surface defects from the visual data, particularly a video feed containing a series of images.
  • This model includes a pre-processing stage, a single image detection stage, and a video processing stage.
  • the images are processed to highlight geometric features and to filter out unwanted features.
  • a texture detection system is used to filter out any inputs that are not surface defects such as cracks.
  • Transfer learning was used on a residual neural network (ResNet) which was fine-tuned with images of various textures, such as concrete textures that show the presence of cracks, to improve the accuracy of detecting surface defects; a minimal fine-tuning sketch is shown below.
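  • A minimal PyTorch sketch of the usual transfer-learning recipe on a ResNet (the patent does not specify the framework, ResNet variant, or training details, so these are assumptions):

      import torch
      import torch.nn as nn
      from torchvision import models

      # Pretrained backbone (weights API requires torchvision >= 0.13).
      model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

      for param in model.parameters():               # freeze the pretrained features
          param.requires_grad = False

      model.fc = nn.Linear(model.fc.in_features, 2)  # new head: crack / no-crack

      optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
      criterion = nn.CrossEntropyLoss()
      # ...then train only the new head on labelled texture images.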
  • techniques such as adaptive thresholding, light normalization, and shadow reduction filters are used to process the images, which tends to yield better results for feature extraction in the later stages of processing; a minimal pre-processing sketch follows below.
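  • A minimal OpenCV sketch along the lines described: a CLAHE pass for light normalization followed by adaptive thresholding. Parameter values and the synthetic input are illustrative, not from the patent:

      import cv2
      import numpy as np

      # Synthetic stand-in for a captured wall image: a dark "crack" on grey concrete.
      img = np.full((240, 320), 160, np.uint8)
      cv2.line(img, (20, 30), (300, 210), 40, 2)

      clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
      normalized = clahe.apply(img)  # even out illumination
      binary = cv2.adaptiveThreshold(normalized, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                     cv2.THRESH_BINARY_INV, 11, 2)  # dark cracks -> white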
  • the pre-processed images are used to train a model to detect surface defects from a single image.
  • Morphological filters are applied in this stage to extract the edge information from each image. These filters attempt to filter out noise while preserving details of the surface defects.
  • Some examples of morphological filters include rotation-invariant Gabor, Canny Edge, Otsu, black hat, white hat, and any combination thereof.
  • a classifier is also used to classify a set of positive examples of images with surface defects having these features and a set of negative examples containing other objects or images without surface defects. The classifier may be based on a support vector machine (SVM) or logistic regression. Validation results were computed for all permutations of the morphological filters and classifiers, and the permutation of Canny Edge morphological filter and logistic regression classifier gave the most accurate results and was selected for this stage of processing the visual data; a sketch of this pairing is shown below.
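  • The selected pairing, Canny edge features fed to a logistic regression classifier, sketched with OpenCV and scikit-learn. The feature extraction (downsampled edge maps) and the dummy training data are assumptions; the patent does not detail them:

      import cv2
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      def edge_features(gray, size=(32, 32)):
          edges = cv2.Canny(gray, 100, 200)                 # Canny edge map
          return cv2.resize(edges, size).flatten() / 255.0  # fixed-length feature vector

      rng = np.random.default_rng(0)
      patches = [rng.integers(0, 256, (64, 64), dtype=np.uint8) for _ in range(20)]
      labels = rng.integers(0, 2, 20)  # 1 = crack, 0 = no crack (dummy labels)

      X = np.stack([edge_features(p) for p in patches])
      clf = LogisticRegression(max_iter=1000).fit(X, labels)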
  • the visual data or video is processed by extracting the images or frames at regular intervals and processing the extracted images accordingly.
  • a sliding window approach may be used in this stage. Specifically, a window of fixed size, smaller than the image, is scanned across the image, and each window crop is sent to the classifier. The sliding windows of each frame are then evaluated to find surface defects. For all windows found to have surface defects, the model pre-processes that image or frame as described above and the surface defect is highlighted in the original image. While the sliding window approach is effective in finding surface defects from the captured visual data or video, it may be slow because one image needs to be scanned multiple times depending on the size of the sliding window. A minimal sliding-window sketch follows below.
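  • The sliding-window scan as a minimal generator: a fixed-size window steps across the frame and each crop is handed to the classifier (window size, stride, and the stand-in classifier are illustrative assumptions):

      import numpy as np

      def sliding_windows(image, win=64, stride=32):
          h, w = image.shape[:2]
          for y in range(0, h - win + 1, stride):
              for x in range(0, w - win + 1, stride):
                  yield x, y, image[y:y + win, x:x + win]

      frame = np.zeros((256, 320), dtype=np.uint8)  # stand-in video frame
      hits = [(x, y) for x, y, crop in sliding_windows(frame)
              if crop.mean() > 10]                  # stand-in for the real classifier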
  • the inspection model implemented on the remote server 240 is an automated neural network model. This model is able to learn the features of surface defects on its own and process new visual data provided to it.
  • the automated neural network model is based on a simple convolutional neural network (CNN).
  • Different neural network structures for the CNN were tested for crack prediction given an input image.
  • a complex structure and two simple structures were tested out.
  • the results showed that simple structures generally have a higher accuracy because surface defects such as concrete surface cracks contain less information than other objects; a minimal example of such a simple structure is sketched below.
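  • A simple CNN of the kind found to work well for crack classification; a minimal PyTorch sketch (the patent does not give the exact layer structure, so this one is an assumption):

      import torch
      import torch.nn as nn

      class SimpleCrackCNN(nn.Module):
          def __init__(self):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                  nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
              )
              self.classifier = nn.Linear(32 * 16 * 16, 2)  # crack / no-crack

          def forward(self, x):  # x: (N, 1, 64, 64) greyscale patches
              x = self.features(x)
              return self.classifier(x.flatten(1))

      logits = SimpleCrackCNN()(torch.randn(4, 1, 64, 64))
      print(logits.shape)  # torch.Size([4, 2])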
  • the automated neural network model is based on a generative adversarial network (GAN).
  • a generator is used to generate fake images, while a discriminator is used to distinguish real and fake images. By competing with each other, the discriminator learns the features of input images with cracks better and improves the classification. Similar to the CNN, simpler models with fewer parameters tend to perform better at classifying cracks. Tests on the GAN showed that the true positive rate was 82.3% and the true negative rate was 87.1%, suggesting that the GAN is suitable for use as the automated neural network model.
  • the automated neural network model is based on a "you only look once" (YOLO) neural network.
  • the YOLO neural network was implemented and fine-tuned with a training dataset of 500 manually-labelled data elements. Specifically, this training dataset was split into 80% as training data and 20% as testing data. The training data was used to train the YOLO neural network and the testing data was used to test the trained YOLO neural network.
  • the resultant IoU (Intersection over Union) metric was approximately 80% and the average loss improved from 80 to 0.5 after 3000 epochs of training. It can thus be seen that the YOLO neural network is accurate in inspecting surface defects; the IoU computation is sketched below.
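  • The IoU metric used to evaluate the detector, computed for two axis-aligned boxes given as (x1, y1, x2, y2); the example boxes are illustrative:

      def iou(box_a, box_b):
          ax1, ay1, ax2, ay2 = box_a
          bx1, by1, bx2, by2 = box_b
          ix1, iy1 = max(ax1, bx1), max(ay1, by1)   # intersection corners
          ix2, iy2 = min(ax2, bx2), min(ay2, by2)
          inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
          union = ((ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter)
          return inter / union if union else 0.0

      print(iou((0, 0, 10, 10), (2, 2, 12, 12)))  # ~0.47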
  • Figure 11 illustrates the inspection results on cracks using the YOLO neural network.
  • the automated neural network model has the advantage of continued deep learning with continual input of new visual data. Deep learning could continuously improve with more manually labelled data and training. As the drone 100 is continuously used to inspect surface defects, more data can be collected, which helps with the deep learning and further improves the neural network model.
  • the system 50 including the drone 100 and computer system 200 is able to achieve high accuracy of approximately 80% in the inspection of surface defects based on visual data captured by the drone 100 deployed in an enclosed environment.
  • the processing time in inspecting the visual data is also quicker, thus reducing the total inspection time required in detecting and locating the surface defects, consequently improving overall efficiency.
  • the drone 100 is able to navigate, through use of the sensors 112 such as time-of-flight sensors, in GPS-denied enclosed environments where GPS signals are weak or non-existent.
  • Possible enclosed environments where the drone 100 can be deployed include confined spaces like deep wells, under viaducts / bridges, storage / process tanks such as brewery tanks, service / motorway tunnels, indoor spaces, under forest / jungle canopy, interior of ships or tankers, such as hull interiors.
  • the drone 100 is also able to navigate in enclosed environments which in or are near to facilities that interfere with magnetic compasses.
  • Some examples include metallic structures like steel towers, stacks, chimneys, and vessels, as well as power transformers, electrical transmission systems, electrical cables, generators, and turbines. Tests have also shown that the drone 100 is able to localize itself in an unknown enclosed environment and navigate well in the enclosed environment, even if it is dark as the sensors 1 12 do not require illumination to function.
  • the drone 100 is thus a viable alternative to human inspectors for inspection of surface defects in enclosed environments.
  • the drone 100 is smaller and lighter compared to a human inspector, allowing for inspection of narrower confined spaces where human inspectors cannot access.
  • Replacing human inspectors with the drone 100 advantageously avoids the hazardous risk posed to human lives since the human inspectors can be redeployed to the ground stations 220. This improves safety for the human inspectors as they are no longer exposed to hazardous environments.
  • Human inspectors with minimal training on drone control can be redeployed as a drone pilot as the drone 100 is designed with semi-autonomous flight capability and can be easily operated by the pilot. The pilot only needs to give small control inputs to control the drone 100.
  • the pilot need not focus too much attention on controlling the drone 100, and can instead focus his attention on checking the surface defects inspection results computed by the remote server 240.
  • embodiments of the present disclosure in relation to a drone for inspecting surface defects in an enclosed environment are described with reference to the provided figures.
  • the description of the various embodiments herein is not intended to call out or be limited only to specific or particular representations of the present disclosure, but merely to illustrate non limiting examples of the present disclosure.
  • the present disclosure serves to address at least one of the mentioned problems and issues associated with the prior art.


Abstract

The present disclosure relates to a drone (100) for inspecting surface defects in an enclosed environment. The drone (100) comprises: a body (102) comprising a set of guard frames (104); a set of propellers (108) for moving the drone (100), each propeller (108) configured to be mounted to one of the guard frames (104); a set of sensors (112) for cooperatively navigating the drone (100) in the enclosed environment; and an inspection module (114) configured to be mounted to the body (102), the inspection module (114) comprising a set of cameras (116) for capturing visual data of the surface defects, wherein the visual data is subsequently processed to thereby inspect the surface defects.

Description

DRONE FOR SURFACE DEFECTS INSPECTION
Cross Reference to Related Application(s)
The present disclosure claims the benefit of Singapore Patent Application No. 10201811696Q filed on 27 December 2018, which is incorporated in its entirety by reference herein.
Technical Field
The present disclosure generally relates to a drone for defects inspection. More particularly, the present disclosure describes various embodiments of a drone for inspecting surface defects within an enclosed environment.
Background
Many drones at present rely on a global navigation satellite system, such as Global Positioning System (GPS) for navigation. These drones can be used in various applications including inspection of surface defects, such as cracks on external walls of buildings or facades. Drones that rely on GPS navigation tend to work well in outdoor environments where the drones can detect GPS signals with minimal interference or disruption. However, in enclosed environments such as deep wells or underground tunnels, GPS signals are often blocked, deviated, and/or erroneous. Enclosed environments that are near power transformers and electrical transmission systems also interfere with drone navigation, especially if the drones also rely on magnetic compasses for navigation.
In order to inspect surface defects in these enclosed environments, such as cracks on internal wall surfaces of deep wells, human inspectors have to be deployed inside. Compared to using drones, inspection by human inspectors is often slower and likely less accurate. More importantly, this is potentially hazardous to the human inspectors as they are exposed to risk of injury or even death, especially because they are working in confined spaces. Such risk is exacerbated if the enclosed environments have minimal illumination, which is common in deep wells and tunnels without artificial lighting.
There may also be hazardous gases in the confined spaces especially in chemical industries. While these hazardous gases can be purged from the confined spaces before deploying human inspectors, the purging process is time consuming and would affect the manufacturing output of these industries, since output has to be stopped for the purging and consequent inspection. Moreover, purging of the hazardous gases would reduce oxygen supply in the confined spaces, which could impair the human inspectors’ judgement in such environments, potentially resulting in injuries and deaths.
Therefore, in order to address or alleviate at least one of the aforementioned problems and/or disadvantages, there is a need to provide an improved drone for inspecting surface defects in an enclosed environment.
Summary
According to a first aspect of the present disclosure, there is a drone for inspecting surface defects in an enclosed environment. The drone comprises: a body comprising a set of guard frames; a set of propellers for moving the drone, each propeller configured to be mounted to one of the guard frames; a set of sensors for cooperatively navigating the drone in the enclosed environment; and an inspection module configured to be mounted to the body, the inspection module comprising a set of cameras for capturing visual data of the surface defects, wherein the visual data is subsequently processed to thereby inspect the surface defects.
According to a second aspect of the present disclosure, there is a system for inspecting surface defects in an enclosed environment. The system comprises a set of drones, each drone comprising: a body comprising a set of guard frames; a set of propellers for moving the drone, each propeller mounted to one of the guard frames; a set of sensors for cooperatively navigating the drone in the enclosed environment; and an inspection module mounted to the body, the inspection module comprising a set of cameras for capturing visual data of the surface defects. The system further comprises a computer system configured for receiving the visual data from the drones.
An advantage of the present disclosure is that the sensors of the drone enable it to estimate its position in an unknown enclosed environment and to navigate itself in the enclosed environment. More specifically, in GPS-denied enclosed environments, the drone is able to localize its position and navigate in such enclosed environments, making it suitable for surface defects inspection, particularly in deep wells or underground tunnels. The drone could potentially replace human inspectors, which advantageously avoids the hazardous risk posed to human lives, thus improving safety for the human inspectors as they are no longer exposed to hazardous environments.
A drone for inspecting surface defects in an enclosed environment according to the present disclosure is thus disclosed herein. Various features, aspects, and advantages of the present disclosure will become more apparent from the following detailed description of the embodiments of the present disclosure, by way of non-limiting examples only, along with the accompanying drawings.
Brief Description of the Drawings
Figure 1 is an illustration of a system for inspecting surface defects.
Figure 2A to Figure 2C are illustrations of a drone for inspecting surface defects.
Figure 3 is an illustration of a flight data graph of drone height relative to ground.
Figure 4 is an illustration of an inspection module for inspecting surface defects.
Figure 5A to Figure 5C are illustrations of a housing of the drone.
Figure 6A and Figure 6B are illustrations of flight data graphs from a flight stability test performed on the drone.
Figure 7 is an illustration of a flight data graph from a height sensor test performed on the drone.
Figure 8 is an illustration of a flight data graph from an optical flow test performed on the drone.
Figure 9 is an illustration of a flight data graph from a tunnel centering test performed on the drone.
Figure 10A to Figure 10D are illustrations of a user interface of a ground station.
Figure 11 is an illustration of inspection results from inspecting surface defects.
Detailed Description
For purposes of brevity and clarity, descriptions of embodiments of the present disclosure are directed to a drone for inspecting surface defects in an enclosed environment, in accordance with the drawings. While aspects of the present disclosure will be described in conjunction with the embodiments provided herein, it will be understood that they are not intended to limit the present disclosure to these embodiments. On the contrary, the present disclosure is intended to cover alternatives, modifications and equivalents to the embodiments described herein, which are included within the scope of the present disclosure as defined by the appended claims. Furthermore, in the following detailed description, specific details are set forth in order to provide a thorough understanding of the present disclosure. However, it will be recognized by an individual having ordinary skill in the art, i.e. a skilled person, that the present disclosure may be practiced without specific details, and/or with multiple details arising from combinations of aspects of particular embodiments. In a number of instances, well-known systems, methods, procedures, and components have not been described in detail so as to not unnecessarily obscure aspects of the embodiments of the present disclosure.
In embodiments of the present disclosure, depiction of a given element or consideration or use of a particular element number in a particular figure or a reference thereto in corresponding descriptive material can encompass the same, an equivalent, or an analogous element or element number identified in another figure or descriptive material associated therewith.
References to "an embodiment / example", "another embodiment / example", "some embodiments / examples", "some other embodiments / examples", and so on, indicate that the embodiment(s) / example(s) so described may include a particular feature, structure, characteristic, property, element, or limitation, but that not every embodiment / example necessarily includes that particular feature, structure, characteristic, property, element or limitation. Furthermore, repeated use of the phrase "in an embodiment / example" or "in another embodiment / example" does not necessarily refer to the same embodiment / example.
The terms “comprising”, “including”, “having”, and the like do not exclude the presence of other features / elements / steps than those listed in an embodiment. Recitation of certain features / elements / steps in mutually different embodiments does not indicate that a combination of these features / elements / steps cannot be used in an embodiment.
As used herein, the terms "a" and "an" are defined as one or more than one. The use of "/" in a figure or associated text is understood to mean "and/or" unless otherwise indicated. The term "set" is defined as a non-empty finite organization of elements that mathematically exhibits a cardinality of at least one (e.g. a set as defined herein can correspond to a unit, singlet, or single-element set, or a multiple-element set), in accordance with known mathematical definitions. The recitation of a particular numerical value or value range herein is understood to include or be a recitation of an approximate numerical value or value range.

System 50
In representative or exemplary embodiments of the present disclosure, there is a system 50 for inspecting surface defects in an enclosed environment, as shown in Figure 1. The system 50 includes a set of drones 100 and a computer system 200. A drone 100 is an unmanned aerial vehicle (UAV) which is an aircraft without a human pilot on board. The computer system 200 may include a set of ground stations 220 communicative with one or more drones 100. The computer system 200 may further include a remote server 240, such as a cloud-based server, communicative with the ground stations 220.
Drone 100
With reference to Figure 2A to Figure 2C, each drone 100 includes a body 102 having a set of guard frames 104. Each guard frame 104 may be removably mounted to the body 102 such that the guard frame 104 is removable for replacement. Each guard frame 104 includes a plurality of guard spars 106. In many embodiments as shown in Figure 2C, the number of guard spars 106 is five, as this optimizes rigidity, support, and structural strength of the guard frame 104. It will be appreciated that fewer or more guard spars 106 may be possible, as will be readily understood by the skilled person.
The drone 100 further includes a set of propellers 108 for moving the drone 100. The propellers 108 operate to generate propulsion thrust for flight motion of the drone 100, directing prop wash downwards and away from the propellers 108 to produce the upward lift of the drone 100. The drone 100 using multiple propellers 108 for flight motion may be referred to as a multicopter. In many embodiments, the drone 100 uses four propellers 108 and may be referred to as a quadcopter, although fewer or more propellers 108 may be possible. For purposes of brevity, detailed flight mechanics of a quadcopter are not described herein as these would be readily understood by the skilled person. Each propeller 108 is configured to be mounted to one of the guard frames 104. Specifically, each propeller 108 is configured to be mounted underneath the guard spars 106 of the respective guard frame 104. By positioning the propellers 108 underneath the guard spars 106, the prop wash generated by the propellers 108 during flight motion of the drone 100 will be unobstructed, thereby increasing the efficiency of the propellers 108. The guard frame 104 and guard spars 106 protect the propellers 108 from coming into contact with other objects or obstacles, particularly when the drone 100 is in flight motion within the enclosed environment, as such contact could result in damage to the propellers 108. The guarded propellers 108 are also less likely to damage surrounding infrastructure or cause injury to nearby workers, allowing the drone 100 to be safely used in the enclosed environment. Further, each propeller 108 may be removably mounted to the respective guard frame 104, such that the propellers 108 can be replaced if they are damaged.
Each propeller 108 includes a plurality of propeller blades 108a and a propeller motor 108b for rotating the propeller blades 108a to generate the propulsion thrust. In many embodiments, two-bladed or three-bladed propellers 108 are used in the drone 100. The size of the propeller motors 108b is dependent at least on the overall weight of the drone 100, including the propellers 108 themselves and the avionics carried on the drone 100. In selecting the suitable propeller motors 108b, the size of the propeller blades 108a, e.g. blade length, was first selected based on an estimated overall weight of the drone 100.
Both 4-inch and 5-inch propeller blades 108a and corresponding propeller motors 108b were considered. 4-inch propellers 108 may be used to keep the drone 100 small and light, but the avionics carried on the drone 100 may increase the overall weight of the drone 100 to beyond the operating limit of the 4-inch propellers 108. 5-inch propellers 108 may be used to generate more propulsion thrust, but the size of the blades 108a and motors 108b adds to the overall weight of the drone 100. One factor in this consideration was how the 4-inch or 5-inch propellers 108 affected the flight time of the drone 100. A flight time of 10 minutes was selected as a benchmark parameter. For controllability, the drone 100 should have a thrust-to-weight ratio of approximately 2.
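As a rough worked example of the thrust-to-weight requirement (the 1.134 kg mass is taken from the simulation analysis described below; the per-motor arithmetic is our own illustration):

\[ T_{req} = 2\,m g = 2 \times 1.134\,\text{kg} \times 9.81\,\text{m/s}^2 \approx 22.3\,\text{N} \approx 2.27\,\text{kgf}, \]

so each of the four propellers 108 must deliver roughly 0.57 kgf of thrust at full load.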
Drones using different models of 4-inch and 5-inch propeller motors 108b were simulated and tested. Specifically, the 4-inch propeller motors 108b included the Lumenier RX1806, T-Motor F30, and Xnova RM1407, and the 5-inch propeller motors 108b included the DJI Snail, T-Motor F40, and Brotherhobby 2205. As the drone 100 could be operating in an enclosed environment with poor ventilation, the dissipation of heat from the drone 100 would be less efficient. Operating temperatures of the drone 100, particularly from operation of the propellers 108, should be kept low to prevent the drone 100 from overheating which would likely damage the drone 100. The technical specifications of the various propeller motors 108b were considered to identify the motors 108b that are able to achieve lower operating temperatures.
A comparison of the 4-inch propeller motors 108b showed that the Lumenier RX1806 is the largest motor, but it has a relatively low Kv rating (number of revolutions per minute (rpm) that a motor turns when one volt is applied with no load attached to that motor), which provides good propulsion thrust and efficiency. Based on the technical specifications, the Lumenier RX1806 has higher efficiency compared to other 4-inch propeller motors 108b. It was inferred that the Lumenier RX1806 would achieve the lowest operating temperature and would thus be suitable for the propellers 108 of the drone 100. A comparison of the 5-inch propeller motors 108b showed that the DJI Snail is the lightest motor and is able to achieve the lowest operating temperature. It was inferred that the DJI Snail would be suitable for the propellers 108 of the drone 100.
Both the Lumenier RX1806 4-inch propeller motor 108b and DJI Snail 5-inch propeller motor 108b were selected for a flight simulation analysis performed on eCalc, which is an electric motor calculator. The flight time of 10 minutes was selected as a benchmark parameter in this simulation analysis. The drone 100 was simulated to have an overall weight of 1 kg when fitted with the Lumenier RX1806 4-inch propeller motors 108b and corresponding 4-inch propeller blades 108a. The simulation result showed that the "4-inch" drone 100 flew for 5.4 minutes. The drone 100 was then simulated with the DJI Snail 5-inch propeller motors 108b and corresponding 5-inch propeller blades 108a fitted. Due to the larger size of the 5-inch propellers 108, the "5-inch" drone 100 weighed approximately 1.134 kg. However, although the overall weight is greater, the simulation result showed that the "5-inch" drone 100 achieved better performance in terms of flight time, estimated operating temperature, and thrust-to-weight ratio.
The results from our simulation analysis also showed that the overall weight of the "4-inch" drone 100 should be reduced by 0.35 kg in order to achieve the benchmark 10-minute flight time, while the overall weight of the "5-inch" drone 100 should be reduced by only 0.134 kg to achieve the same benchmark. On balance of the better performance against the increased weight, the DJI Snail 5-inch propellers 108 were considered to be suitable for the drone 100.
The drone 100 was assembled with the DJI Snail 5-inch propellers 108 and a live flight test was performed. A flight data log of the drone 100 was extracted from a flight controller software, such as Pixhawk. Figure 3 shows a graph of the flight data log, plotting height of the drone 100 above ground against time. It can be seen from the graph that the drone 100 was able to achieve a flight time of 10 minutes, thus verifying that the DJI Snail 5-inch propellers 108 are suitable for the drone 100. It will be appreciated that there may be other models of 5-inch propellers 108 suitable for the drone 100 which will allow a heavier drone 100 to fly without reaching the maximum propulsion thrust of the 5-inch propellers 108. In contrast, 4-inch propellers 108 would not be suitable, as the "4-inch" drone 100 could only just fly at the maximum propulsion thrust that 4-inch propellers 108 normally deliver, leaving no thrust margin.
The body 102, including the guard frames 104 and guard spars 106, provides physical protection to the drone 100, especially for the propellers 108 and avionics carried on the drone 100. Other factors in designing the body 102 include weight, durability, ease of fabrication, and aerodynamics. The body of the Lumenier Danaus drone was used as a reference in designing the body 102 of the drone 100 as it was easy to fabricate. The body 102 has dimensions of approximately 330 mm by 300 mm with an inverted propeller design as described above. Compared to the Lumenier Danaus, each guard frame 104 has more guard spars 106 for better protection of the propellers 108 from debris without compromising on the structural support required. The body 102 also provides structural support around the boundary of the centre of the body 102 for mounting of other components, such as a housing 110 for avionics.
A finite element analysis was performed to optimize the design of the body 102. Theoretical maximum loadings were applied on the body 102 in an effort to reduce the weight of the body 102. Although only a few grams could be reduced, every gram saved would improve the flight time of the drone 100. The finite element analysis was performed by simulating the drone 100 falling from a height of 30 m, specifically simulating power loss when the drone 100 is flying at its designed maximum operating altitude. Based on results of the finite element analysis and the ultimate tensile strength (which is the maximum tensile stress that a material can withstand before failing or breaking) of carbon fibre (which is a suitable material for the body 102), the safety factor was calculated to be approximately 2.032. The safety factor is a ratio of ultimate tensile strength over the maximum observed stress. The calculated safety factor of at least 2 is greater than the recommended safety factor of 1.5 for aircraft and spacecraft, as indicated in the document "The Ultimate Factor of Safety for Aircraft and Spacecraft Its History, Applications and Misconceptions" on the NASA Technical Reports Server. Accordingly, the drone 100 is safe for use for its intended application of surface defects inspection.
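Expressed as a formula (the 2.032 value is taken from the analysis above; the stress symbols are our own notation):

\[ SF = \frac{\sigma_{UTS}}{\sigma_{max}} \approx 2.032 > 1.5, \]

where $\sigma_{UTS}$ is the ultimate tensile strength of the carbon fibre and $\sigma_{max}$ is the maximum stress observed in the simulated 30 m fall.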
In various embodiments of the present disclosure, the drone 100 further includes a set of sensors 112 for cooperatively navigating the drone 100 within the enclosed environment. The sensors 112 may include a plurality of obstacle detection sensors for detecting obstacles around the drone 100 within the enclosed environment. The obstacle detection sensors are disposed around the periphery of the drone 100 to detect obstacles at a predefined distance away from the drone 100, and thus enable the drone 100 to avoid these obstacles during navigation. The predefined distance may be 0.5 m and may be set by the user during operation of the drone 100.
In some embodiments, the obstacle detection sensors are time-of-flight sensors. Other types of obstacle detection sensors were also considered, such as ultrasonic sensors and infrared sensors. An ultrasonic sensor emits a high-frequency sound pulse and measures how long it takes for the echo of the sound to reflect back from the obstacle. However, ultrasonic sensors are not suitable because the reflected echo would suffer interference from external sound waves, especially since the drone 100 is intended to be operated in an enclosed environment where sounds can be reflected from all directions. An infrared sensor works in accordance with the infrared reflection principle to detect obstacles. However, infrared sensors are not suitable because they have a low proximity range of less than a metre, and the consistency of accurate measurements worsens the further the infrared sensors are from the obstacles.
A time-of-flight sensor is a range imaging sensor that measures the time of flight of a light signal emitted from a light source at the sensor to the obstacle and then reflecting off the obstacle. The distance between the sensor and the obstacle can be calculated since the speed of light is known. The light source may be a laser or a light-emitting diode (LED). Time-of-flight sensors using lasers may be known as a class of LiDAR sensors, and these are comparatively more accurate and involve fewer compromises on measurement results compared to the ultrasonic and infrared sensors. This is because the emitted laser is able to travel in a straight line and across a long distance. Additionally, the time-of-flight sensors can operate even in enclosed environments with minimal lighting, such as in deep wells and tunnels.
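The underlying relation is simply (standard time-of-flight geometry, not specific to any sensor model):

\[ d = \frac{c\,\Delta t}{2}, \]

where $d$ is the distance to the obstacle, $c$ is the speed of light, and $\Delta t$ is the measured round-trip time of the light signal. For example, detecting an obstacle at the predefined 0.5 m distance corresponds to a round-trip time of only about 3.3 ns, which is why such sensors require very precise timing circuitry.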
One suitable model of time-of-flight sensors is the TeraRanger Multiflex which contains eight sensors, each having a maximum proximity range of 2 m, making it suitable for the drone 100 to detect obstacles at approximately 0.5 m away. At least six time-of-flight sensors are fitted across both sides of the drone 100 for optimal obstacle avoidance and navigation, although fewer or more sensors may be possible.
In some embodiments, the sensors 112 include an optical flow sensor for detecting forward motion of the drone 100. The sensors 112 may further include a height sensor for detecting a ground within the enclosed environment. The height sensor may be a time-of-flight sensor or an altimeter. A flight controller 130 combines sensor data from the sensors 112 and navigates the drone 100 accordingly while avoiding detected obstacles.
As an example, when the drone 100 is deployed in a dark tunnel, the time-of-flight obstacle detection sensors enable the drone 100 to detect the two side walls of the tunnel (Y-direction) and to stabilize the drone 100 in the centre of the tunnel. The least-squares line method may be used to plot a line that represents the walls based on the sensor data. The optical flow sensor complements the navigation by tracking motion of the drone 100 along the tunnel (X-direction), while the height sensor detects the ground of the tunnel underneath the drone 100, thus stabilizing the drone 100 at a suitable height above the ground (Z-direction). The flight controller 130 computes the flight path of the drone 100 and the flight path data can be communicated to the ground station 220. A minimal sketch of this centring idea is shown below.
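The sketch below illustrates the Y-direction centring and Z-direction height hold in simplified form (sensor layout, gains, and sign conventions are our own assumptions, not the flight controller 130's actual firmware):

```python
# Illustrative tunnel-centring sketch; all values are invented.

def centring_setpoints(left_dists, right_dists, height, target_height=1.3):
    """Compute corrective velocities from time-of-flight readings.

    left_dists / right_dists: distances (m) from the side-facing sensors
    to the tunnel walls; height: distance (m) from the downward sensor.
    """
    mean_left = sum(left_dists) / len(left_dists)
    mean_right = sum(right_dists) / len(right_dists)

    # Positive error: the right wall is further away, i.e. the drone
    # sits closer to the left wall, so command motion to the right.
    y_error = (mean_right - mean_left) / 2.0
    vy = 0.5 * y_error                        # proportional Y correction

    # Hold a fixed height above the nearest surface below (Z-direction).
    vz = 0.8 * (target_height - height)
    return vy, vz

# Example: drone 0.1 m off-centre towards the left wall and 0.1 m too low.
print(centring_setpoints([0.9, 0.9], [1.1, 1.1], 1.2))  # approx (0.05, 0.08)
```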
In various embodiments of the present disclosure, the drone 100 further includes an inspection module 114 as shown in Figure 4. The inspection module 114 is configured to be mounted to the body 102 and may be configured to be removably mounted to the body 102 so that it can be replaced if necessary. The inspection module 114 includes a set of cameras 116 for capturing visual data of the surface defects. The visual data may include image data, e.g. a set of images, and/or video data, e.g. a series of images, of the surface defects. In one embodiment, the set of cameras 116 includes a thermal camera 116a. In another embodiment, the set of cameras 116 includes an optical camera 116b. In yet another embodiment, the cameras 116 include the thermal camera 116a and optical camera 116b as shown in Figure 4. The thermal camera 116a is an infrared camera for capturing thermal images in the infrared spectrum. The optical camera 116b captures normal optical images / videos in the visible light spectrum. The inspection module 114 may include a light source, such as a dimmable LED, for providing suitable illumination to the optical camera 116b for capturing the optical images. It will be appreciated that there may be other types of cameras 116 or visual sensors implemented in the inspection module 114 for capturing visual data of the surface defects.

In some embodiments, the inspection module 114 further includes a gimbal 118 for stabilizing the cameras 116 while the drone 100 is in motion. The gimbal 118 was designed to optimize its shape and performance while minimizing its overall size. To minimize the weight and size of the inspection module 114, a small thermal camera 116a and/or optical camera 116b were selected. For example, the thermal camera 116a is an AMG8833 sensor which is small and discreet and is able to capture proper thermal images. Although the AMG8833 sensor has only an 8x8 resolution array, this can be improved to 70x70 through interpolation and programming, as will be readily understood by the skilled person and as sketched below. An example of the optical camera 116b is the Foxeer Legend 1.
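The interpolation step mentioned above could look like the following (our own minimal sketch; the disclosure does not specify the actual interpolation routine):

```python
# Upsample an AMG8833-style 8x8 thermal array to 70x70 by bicubic
# interpolation (illustrative only; input values are placeholders).
import numpy as np
from scipy.ndimage import zoom

raw = 20.0 + 15.0 * np.random.rand(8, 8)   # fake 8x8 temperature grid (deg C)
upsampled = zoom(raw, 70 / 8, order=3)     # order=3 gives bicubic interpolation
print(upsampled.shape)                     # (70, 70)
```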
The gimbal 118 may be designed to operate on two axes or three axes. However, the 3-axis gimbal is larger and heavier than the 2-axis gimbal and may not be suitable for use with the drone 100. As such, the 2-axis gimbal 118 was selected for the drone 100. The centre of gravity of the cameras 116 was obtained before fitting the gimbal 118, so that the gimbal 118 can be appropriately balanced and further so that less energy is required to stabilize the cameras 116 while the drone 100 is in motion. Figure 4 shows the cameras 116 fitted with the gimbal 118.
In some embodiments, the drone 100 further includes the housing 110 for the avionics as well as for the sensors 112 and inspection module 114. The housing 110 may be configured to be removably mounted to the body 102, such as by way of an attachment mechanism 120 that may be lockable to secure the housing 110 while the drone 100 is moving. The attachment mechanism 120 allows the user to easily access and remove the housing 110, such as to access the internal avionics for repairs and/or replacements. In one embodiment as shown in Figure 2A, the inspection module 114 is housed within the housing 110 so that it is protected by the housing 110. The housing 110 includes an optically clear window 122 so that the cameras 116 can still capture visual data even when housed within the housing 110. In another embodiment as shown in Figure 2B, the inspection module 114 is disposed underneath the housing 110.

With reference to Figure 5A and Figure 5B, the housing 110 provides physical protection to the avionics housed inside. Additionally, the housing 110 is configured to be water resistant. Seals such as rubber gaskets or silicone sealant are provided at interfacing joints of the housing 110 to achieve the water resistance. For example, the housing 110 may be formed of two halves or shells coupleable together at interfacing joints which are lined with sealing gaskets. Other interfacing joints may be between the housing 110 and a battery compartment 124. The housing 110 may be formed such that a large space is provided underneath the housing 110 for disposing the battery compartment 124, the large space allowing for a large high-capacity battery to be used, such as a lithium-ion polymer battery, to maximize the flight time of the drone 100. The housing 110 may be rated with an Ingress Protection rating of IP55, i.e. the housing 110 is protected from limited dust ingress and from low pressure water jets from any direction.
The housing 110 was tested for its water resistance and the IP55 rating without any avionics housed inside. A water spray test was performed using rubber gaskets as the seals at the interfacing joints, and the result was that there was no water ingress into the housing 110. Another water spray test was performed using silicone sealant as the seals at the interfacing joints, and the result was better water resistance and ingress protection.
The material for the housing 110 is selected to withstand environmental elements such as heat, rain, and humid conditions, and to ensure minimal or zero water ingress which can harm the crucial avionics housed inside. The material should also be mechanically strong to securely mount to the body 102 and to provide physical protection to the avionics. Density and weight of the material are also factors in the material selection as they affect the maximum flight time of the drone 100. Other factors which may be considered in the material selection include the ability to withstand operating temperatures, and mechanical properties such as bending stiffness, toughness, and vibration damping. In many embodiments, the material of the housing 110 is epoxy resin as it has a higher glass transition temperature and maximum service temperature. It will be appreciated that other materials may be possible in consideration of the aforementioned factors. The size of the housing 110 is sufficiently large to accommodate larger avionics, allow greater freedom for wire management within the housing 110, and increase empty spaces between the various avionics. Larger empty spaces between the avionics can improve heat dissipation from the avionics. Other avionics which the housing 110 may house include a geolocation module 126 having a GPS module and/or a magnetic compass, an electrical interface 128, flight controller 130, electronic speed controller 132, flight telemetry communication module 134, visual data communication module 136, and power delivery module 138.
To enhance robustness of the drone 100, the avionics are positioned within the housing 110 in consideration of their electromagnetic outputs as well as their thermal characteristics. In some embodiments with reference to Figure 5C, the housing 110 is separated into three regions - the front region 110a for low thermal activity such as for temperatures below 35 °C, the middle region 110b for moderate thermal activity such as for temperatures from 35 °C to 45 °C, and the rear region 110c for high thermal activity such as for temperatures up to 60 °C. Avionics of different operating frequencies are distributed throughout the housing 110, but avionics of similar operating temperatures are grouped together at the appropriate regions 110a to 110c. In the example shown in Figure 5C, the inspection module 114 and geolocation module 126 may be positioned at the front region 110a; the electrical interface 128, flight controller 130, and flight telemetry communication module 134 may be positioned at the middle region 110b; and the battery compartment 124, electronic speed controller 132, visual data communication module 136, and power delivery module 138 may be positioned at the rear region 110c.
The electrical interface 128 is configured to provide electrical connections between the avionics of the housing 110 and the sensors 112. For example, the electrical interface 128 may be in the form of an array of pogo pins. The pogo pin array is easily connected and disconnected so that the housing 110 is modular and can be easily swapped to the bodies 102 of other drones 100, or conversely the housing 110 of one drone 100 can be swapped with another housing 110. The visual data communication module 136 is configured for communicating the visual data to the ground station 220, which may occur via a communication protocol such as a 5.8 GHz video link. The power delivery module 138 is configured to transfer power from the battery compartment 124 to the avionics of the drone 100, including the electrical interface 128, sensors 112, and cameras 116.
The flight telemetry communication module 134 is configured for communicating with the ground station 220, which may include a remote controller that controls flight motion of the drone 100. The flight telemetry communication module 134 may communicate with the ground station 220 via a telemetry communication protocol such as MAVLink Telemetry. An example of the flight telemetry communication module 134 is an RFD900 radio modem. The ground station 220 or remote controller sends control signals to the flight telemetry communication module 134. The flight controller 130 processes the control signals in cooperation with sensor data from the sensors 112 and optionally with other information such as from one or more inertial measurement units (IMU), a gyroscope, and the geolocation module 126. The flight controller 130 performs the necessary computations and sends the results to the electronic speed controller 132. The electronic speed controller 132 is configured to vary the speed, direction, and braking of the propellers 108. Using the results from the flight controller 130, the electronic speed controller 132 controls the propellers 108 according to the control input of the user at the ground station 220 using the remote controller.
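On the ground station side, telemetry over a MAVLink-compatible link can be read with the open-source pymavlink library; a minimal sketch (the serial port name and baud rate are assumptions for an RFD900-style radio link, not values taken from the disclosure):

```python
# Read attitude telemetry from a MAVLink link (illustrative only).
from pymavlink import mavutil

# An RFD900 radio typically enumerates as a serial port; port name and
# baud rate here are assumed, not specified in the disclosure.
link = mavutil.mavlink_connection('/dev/ttyUSB0', baud=57600)
link.wait_heartbeat()                              # wait for the drone
msg = link.recv_match(type='ATTITUDE', blocking=True)
print(msg.roll, msg.pitch, msg.yaw)                # radians
```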
The flight controller 130 is selected based on its versatility for tweaks and changes so that the drone 100 can be customized for deployment in an enclosed environment. The flight controller 130 supports companion computers via connections through high-bandwidth communication ports, has multiple redundant IMUs, and has a 32-bit flight processor. In many embodiments, the flight controller 130 is a Pixhawk 2.1 with the versatile supporting PX4 firmware. The Pixhawk 2.1 flight controller 130 is compatible and integrated with a companion computer such as a single-board computer (SBC). The SBC may be an Intel Edison computer-on-module as it is compact and lightweight, it can be easily integrated with the Pixhawk 2.1 flight controller 130, and it is computationally powerful enough to run algorithms on sensor data from the sensors 112, particularly to localize the drone 100 in an unknown enclosed environment and navigate the drone 100 in the enclosed environment.
The flight controller 130 uses various algorithms to achieve semi-autonomous flight for the drone 100. The flight controller 130 may use the least-squares matching (LSQM) algorithm, which is based on the least-squares line method for fitting a line to the sensor data. Assuming that the drone 100 is deployed in a square tunnel, the flight controller 130 attempts to solve a linear system using sensor data from the multiple sensors 112, specifically the ranging or time-of-flight sensors on both sides of the drone 100, and applies the least-squares line method to find two lines that represent the walls of the tunnel. Combining the lines produced by the LSQM algorithm with the difference in readings from the front and rear time-of-flight sensors on one side of the drone 100, the angle of the line corresponding to the tunnel wall can be estimated. The distance and angular orientation of the drone 100 relative to the tunnel walls can then be computed as the positions of all the sensors 112 around the drone 100 are known. A numerical sketch of this wall fit is given below.
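The following is a minimal numerical sketch of fitting a wall line to side-sensor readings (the sensor mounting positions and readings are invented for illustration and do not reflect the drone 100's actual geometry):

```python
# Least-squares wall fit from two side-facing time-of-flight sensors.
import math
import numpy as np

# Mounting positions of the sensors along the body X-axis (m), and
# their measured distances to the right wall (m); values are assumed.
sensor_x = np.array([0.10, -0.10])   # front and rear sensors
readings = np.array([1.05, 0.95])

# Least-squares line y = m*x + c through the (x, distance) points
# represents the wall in the body frame.
m, c = np.polyfit(sensor_x, readings, 1)

wall_angle = math.degrees(math.atan(m))  # yaw of the drone relative to the wall
wall_dist = c                            # lateral offset at the body origin
print(f"angle to wall: {wall_angle:.1f} deg, distance: {wall_dist:.2f} m")
```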
The flight controller 130 then controls flight motion of the drone 100 based on its computations using the sensor input data. The flight controller 130 may use the second order extended Kalman filter (EKF2) algorithm that fuses or combines the sensor input data from all the sensors 112. The EKF2 algorithm then linearizes the fused sensor input data to calculate the drone's current estimated position and altitude in 3D space and time. Using this linearized function, the flight controller 130 compares the current data with past data, takes into consideration the user's control input from the ground station 220 / remote controller, and decides on its autonomous control output. The control output is sent to the electronic speed controller 132 which then controls the propellers 108 to physically change the position and altitude of the drone 100, thus demonstrating the semi-autonomous capability of the drone 100.
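For reference, the general extended Kalman filter recursion that EKF2 builds on has the standard textbook form (generic notation; not taken from the PX4 implementation):

\[ \hat{x}_{k|k-1} = f(\hat{x}_{k-1|k-1}, u_k), \qquad P_{k|k-1} = F_k P_{k-1|k-1} F_k^\top + Q_k, \]
\[ K_k = P_{k|k-1} H_k^\top \left( H_k P_{k|k-1} H_k^\top + R_k \right)^{-1}, \]
\[ \hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k \left( z_k - h(\hat{x}_{k|k-1}) \right), \qquad P_{k|k} = (I - K_k H_k) P_{k|k-1}, \]

where $z_k$ stacks the fused measurements (time-of-flight ranges, optical flow, height, IMU), $u_k$ is the control input, and $F_k$, $H_k$ are the Jacobians of the motion and measurement models.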
Four flight tests were performed on the drone 100 to test its semi-autonomous flight capabilities. The four tests were performed successively, with an increasing number of semi-autonomous controls added after every test. Detailed flight log data from every flight test of the drone 100 was obtained using the flight controller 130 and associated firmware. The flight log data are illustrated in the Figures described below.
The first test was a flight stability test on whether the drone 100 is controllable and stable in flight motion. The gains of the drone 100 were tuned in manual flight during the test. Figure 6A shows the test results for a poorly-tuned drone 100. The plot of the estimated roll angle fluctuated at a high frequency because the drone 100 vibrated from very high gains that overcompensated the controls when balancing the drone 100 in mid-air. The drone 100 was then properly tuned by reducing the control gains and thus reducing the overcompensation, and the test results for the properly-tuned drone 100 are shown in Figure 6B. The estimated roll angle did not fluctuate at a high frequency as the drone 100 was not overcompensating. Additionally, the drone 100 was not vibrating in mid-air, meaning that the properly-tuned drone 100 is stable in mid-air. The reduction in vibration of the drone 100 is beneficial as it allows the sensors 112 on the drone 100 to measure cleaner sensor input data so that the flight controller 130 can compute the position of the drone 100 more accurately.
The second test was a height sensor test on the altitude holding stability of the drone 100 using the height sensor for relative height estimation. In this test, the height sensor used was a time-of-flight sensor mounted at the bottom of the drone 100. Figure 7 shows the measured height of the drone 100 above the ground based on sensor data from the height sensor. During the test, the drone 100 was hovered at an estimated height of 1.3 m above ground. Once the drone 100 had stabilized at that height, the drone 100 was flown over a box, which resulted in a decrease in measured height. As the flight controller 130 was running a height holding algorithm, it compensated by increasing the height relative to the box, thus increasing the height of the drone 100 above ground. The drone 100 was then hovered above the box before landing. This test confirmed that the height sensor is taken into account by the height holding algorithm, as the drone 100 attempted to hold a predefined height above the nearest obstacle underneath it, which is usually the ground.

The third test was an optical flow test on the ability of the drone 100 to estimate its position in an enclosed environment or indoors using the optical flow sensor. This test was performed in an outdoor environment where the drone 100 is not bounded by any walls. Figure 8 shows the XY position of the drone 100 estimated by the optical flow sensor. Even though the plot seems messy, the drone 100 was concluded to be holding its position while drifting occasionally where there were poor readings from the optical flow sensor. Such drifting shows up as spots in the plot, where these spots are areas of high-density estimated sensor data. This test confirmed that the optical flow position hold is plausible, and that the drone 100 can estimate its position indoors using the optical flow sensor.
The fourth test was a tunnel centering test on whether the drone 100 can hold its position in enclosed environments or confined spaces using the time-of-flight sensors on the sides of the drone 100. This test was performed on the drone 100 in a mock-up tunnel and tested the drone's ability to centre itself in the tunnel and to hold its flight heading or direction. Using the optical flow sensor for the X position and the time-of-flight sensors for the Y position, the XY position of the drone 100 between the two tunnel walls can be estimated. Figure 9 shows the plot of the XY position of the drone 100 from a hover test done in the mock-up tunnel. The Y position of the drone 100 can be better estimated as the drone 100 is bounded by the tunnel walls, resulting in more stable hovering. The plot also proved the flight stability of the drone 100 as it drifted only within a small circle of 20 cm radius.
Based on the successful results of the tests, the drone 100 was evaluated to be able to estimate its position in an unknown enclosed environment and to navigate itself in the enclosed environment. More specifically, the drone 100 is able to achieve such position estimation and navigation in the enclosed environment, e.g. a tunnel, where GPS signals are weak or non-existent, i.e. the enclosed environment is GPS-denied. This means that the drone 100 is suitable for use in many types of enclosed environments for surface defects inspection, particularly in deep wells or underground tunnels where there is poor or no GPS reception and where normal drones could not navigate since they are reliant on GPS.

Ground Station 220
As shown in Figure 1, in the system 50, the drone 100 is communicative with the ground station 220, such as for sending flight control input signals from a remote controller and extracting flight data from the drone 100. The ground station 220 may be communicative with one or more drones 100, meaning the ground station 220 is able to control and extract flight data from the one or more drones 100. Further, the system 50 may include one or more ground stations 220 where each ground station 220 is communicative with its own set of drones 100. The ground stations 220 form part of the computer system 200 which also includes the remote server 240 which is communicative with each ground station 220. Specifically, the ground station 220 sends the visual data of the surface defects for processing by the remote server 240 to thereby inspect the surface defects, and the remote server 240 returns the inspection results to the ground station 220. The inspection results may include a confidence level of each surface defect which indicates how likely the inspected surface defect is a true defect.
The ground station 220 has a user interface that provides the user with visualization of the drone's current parameters, a 2D map representation based on the drone's localized position data from the sensors 112, and the visual data of surface defects captured by the cameras 116. Operation of the user interface should be intuitive to the user with minimal training / guidance, especially for a user who is not technically trained in drone control. To minimize the number of interactive elements shown on the user interface, thereby simplifying its operation and saving time, the user interface may be split across three distinct screens. Specifically, these are the Main Screen 300 as shown in Figure 10A, the Recording Screen 320 as shown in Figure 10B, and the Playback Screen 340 as shown in Figure 10C.
The Main Screen 300 shows a number of log files 302, each containing the drone’s flight position data and the visual data of the surface defects that are recorded over a particular time period. The user may select a log file 302 to view the past data in the Playback Screen 340. The Main Screen 300 also shows a Record function 304 for the user to start a new recording or logging of the drone’s current position data and surface defects visual data, as shown in the Recording Screen 320.
The Recording Screen 320 shows a map representation 322 based on the flight position data received from the drone 100. No representation of the drone's flight path is shown in the map representation 322 immediately after the recording starts, since the drone 100 is not yet moving along any flight path. However, after some time, the map representation 322 becomes like the one shown in Figure 10D, showing the flight path of the drone 100 across a time period. The current position of the drone 100 is represented by a solid dot on the map representation 322, together with the corresponding XY coordinates 324 on the bottom left of the map representation 322. The map representation 322 includes a grid to allow the user to estimate the scale of the map representation 322, and the size of each grid square 326 is shown on the bottom right of the map representation 322. The map representation 322 also provides standard zooming and panning functionalities.
While the drone 100 is in flight motion during recording, the Recording Screen 320 displays current parameters 328 of the drone 100. The current parameters 328 may include, but are not limited to, battery voltage, vertical speed, ground speed, and yaw angle. Additionally, during recording, the Recording Screen 320 displays the visual data captured by the cameras 1 16, such as in the form of a live video feed or stream 330. To stop the recording, the user may activate a Stop function 332 and the position and visual data recorded thus far will be stored on a new log file 302.
During playback of a recorded log file 302, the Playback Screen 340 shows the recorded flight position data 342 and recorded visual data 344 of the surface defects 346. In some embodiments, the recorded visual data 344 is communicated to the remote server 240 for processing and inspecting the surface defects 346 during playback of the recorded log file 302. In some other embodiments, this is done during recording while the drone 100 is in flight motion. Particularly, the ground station 220 receives the visual data from the drone 100 and simultaneously sends the visual data to the remote server 240 for further processing. This allows the ground station 220 to obtain live inspection results of the surface defects while the drone 100 is still flying.
As shown in the Playback Screen 340, after the ground station 220 has received the inspection results from the remote server 240, boxes 348 are rendered for each surface defect 346 and each box 348 indicates the confidence level of that surface defect 346 being a true one. For example, a confidence level of 0.91 indicates that the inspected surface defect 346 has more than a 90% chance of being a true and potentially problematic defect. Appropriate follow-up actions can then be planned accordingly to address the high-confidence surface defects 346.
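Rendering the boxes 348 could be done along the following lines (a sketch using OpenCV; the ground station 220's actual rendering code is not disclosed, and the detection tuple format is our own assumption):

```python
# Draw confidence-labelled defect boxes onto a video frame.
import cv2

def draw_defect_boxes(frame, detections):
    """detections: list of (x, y, w, h, confidence) from the remote server."""
    for (x, y, w, h, conf) in detections:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)
        cv2.putText(frame, f"{conf:.2f}", (x, y - 5),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 0, 255), 1)
    return frame
```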
The Playback Screen 340 may further include a line graph 350 of confidence level against captured frame. For example, the recorded visual data 344 is in the form of a video feed which contains a series of frames. The horizontal axis of the line graph 350 represents the frames while the vertical axis represents the confidence level, ranging from 0 to 1, of the surface defects 346 in each frame.
The user interface thus allows the user to quickly inspect a surface, such as an internal wall of a tunnel or deep well, for surface defects. Potential surface defects with their respective confidence levels are boxed and highlighted to the user so that appropriate rectification actions may be taken. Notably, the time taken to complete an inspection process in an enclosed environment depends on the flight speed of the drone 100 and the processing time required by the remote server 240 to process the captured visual data.
The ground station 220 is communicative with the remote server 240 via a communication network such as across the Internet. More generally, the communication network is a medium or environment through which content, notifications, and/or messages are communicated among various entities. Some non-limiting examples of the communication network include a virtual private network (VPN), wireless fidelity (Wi-Fi) network, light fidelity (Li-Fi) network, local area network (LAN), wide area network (WAN), metropolitan area network (MAN), satellite network, Internet, fiber optic network, coaxial cable network, infrared (IR) network, radio frequency (RF) network, and any combination thereof. Various entities in the communication network may connect in accordance with various wired and wireless communication protocols, such as Transmission Control Protocol / Internet Protocol (TCP/IP), User Datagram Protocol (UDP), 2nd to 5th Generation (2G to 5G) communication protocols, Long Term Evolution (LTE) communication protocols, and any combination thereof.
Some non-limiting examples of the ground station 220 and remote server 240 include computers, laptops, mini-computers, mainframe computers, any non-transient and tangible machines that can execute a machine-readable code, cloud-based servers, distributed server networks, and a network of computer systems. As used herein, a server is a physical or cloud data processing system on which a server program runs. The server may be implemented in hardware or software, or a combination thereof. Additionally, the ground station 220 and remote server 240 each includes a processor, a memory, and various other modules or components. The modules and components thereof are configured for performing various operations or steps and are configured as part of the processor. Such operations or steps are performed in response to non-transitory instructions operative or executed by the processor. The memory is used to store instructions and perhaps data which are read during program execution. The memory may be referred to in some contexts as computer-readable storage media and/or non-transitory computer-readable media. Non-transitory computer-readable media include all computer-readable media, with the sole exception being a transitory propagating signal per se.
Remote Server 240
The remote server 240 processes the visual data of the surface defects to thereby inspect the surface defects, such as by calculating a confidence level of the surface defects, which may include cracks of various shapes and sizes. The remote server 240 provides an application programming interface (API) cooperative with the user interface of the ground station 220 so that the ground station 220 can send the captured visual data to the remote server 240 for subsequent processing. Various inspection models trained with pre-existing training data, including prior known data associated with surface defects, may be implemented on the remote server 240 to process the visual data and inspect the surface defects. Sufficient training data is necessary to train a robust inspection model, and more training data facilitates training with a larger number of parameters without overfitting. Having more variance in the training data will also allow the model to generalize better when operating on new / unknown data under a variety of different conditions, such as viewpoint, illumination level, and noise patterns.
In some embodiments, the inspection model implemented on the remote server 240 is a geometric features model. This model provides a logical system to capture geometric features or characteristics of the surface defects from the visual data, particularly a video feed containing a series of images. This model includes a pre-processing stage, a single image detection stage, and a video processing stage.
In the pre-processing stage, the images are processed to highlight geometric features and to filter out unwanted features. A texture detection system is used to filter out any inputs that are not surface defects such as cracks. Transfer learning was used on a residual neural network (ResNet) which was fine-tuned with images of various textures, such as concrete textures that show the presence of cracks, to improve accuracy of detecting surface defects. Further, in order to reduce the possible variance in lighting conditions and the occlusions caused by shadows, techniques such as adaptive thresholding, light normalization, and shadow reduction filters are used to process the images. This tends to yield better results for feature extraction in the later stages of processing.
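Two of the named techniques could be implemented as follows (a hedged sketch using OpenCV; the parameter values are our own assumptions, not the model's tuned settings):

```python
# Illumination-robust pre-processing: normalization + adaptive threshold.
import cv2
import numpy as np

# Placeholder greyscale frame; in practice this is a frame captured by
# the optical camera 116b.
img = np.random.randint(0, 256, (480, 640), dtype=np.uint8)

# Light normalization: spread intensities across the full 0-255 range.
norm = cv2.normalize(img, None, 0, 255, cv2.NORM_MINMAX)

# Adaptive thresholding: each pixel is thresholded against its local
# neighbourhood, which suppresses uneven illumination and soft shadows.
binary = cv2.adaptiveThreshold(norm, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                               cv2.THRESH_BINARY, 31, 5)
```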
In the single image detection stage, the pre-processed images are used to train a model to detect surface defects from a single image. Morphological filters are applied in this stage to extract the edge information from each image. These filters attempt to filter out noise while preserving details of the surface defects. Some examples of morphological filters include rotation-invariant Gabor, Canny Edge, Otsu, black hat, white hat, and any combination thereof. A classifier is also used to classify a set of positive examples of images with surface defects having these features and a set of negative examples containing other objects or images without surface defects. The classifier may be based on a support vector machine (SVM) or logistic regression. Validation results were computed for all permutations of the morphological filters and classifiers, and the combination of the Canny Edge morphological filter and the logistic regression classifier gave the most accurate results and was selected for this stage of processing the visual data.
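A minimal sketch of the selected combination, Canny edge features fed to a logistic regression classifier (our own illustration with random placeholder patches, not the trained production model):

```python
# Canny Edge features + logistic regression crack classifier.
import cv2
import numpy as np
from sklearn.linear_model import LogisticRegression

def edge_features(gray_patch):
    """Flatten a Canny edge map of a greyscale patch into a feature vector."""
    edges = cv2.Canny(gray_patch, 50, 150)
    return edges.flatten() / 255.0

# Placeholder labelled patches; in practice these come from the
# pre-processing stage described above (1 = crack, 0 = no crack).
patches = [np.random.randint(0, 256, (64, 64), dtype=np.uint8) for _ in range(40)]
labels = np.random.randint(0, 2, 40)

X = np.array([edge_features(p) for p in patches])
clf = LogisticRegression(max_iter=1000).fit(X, labels)
prob_crack = clf.predict_proba(X[:1])[0, 1]   # confidence for one patch
```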
In the video processing stage, the visual data or video is processed by extracting images or frames at regular intervals and processing the extracted images accordingly. A sliding window approach may be used in this stage. Specifically, a window of fixed size smaller than the image is scanned across the image, and each window is sent to the classifier as a single image. The classified windows of each frame are then aggregated to locate surface defects. For each window found to contain a surface defect, the model pre-processes that image or frame as described above and the surface defect is highlighted in the original image. While the sliding window approach is effective in finding surface defects in the captured visual data or video, it may be slow because each image must be scanned multiple times, the number of scans depending on the size of the sliding window.
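A minimal sliding-window sketch is given below; the window size and stride are chosen as illustrative assumptions.

```python
# Editor's sketch: scan a fixed-size window across a frame and collect the
# coordinates of windows flagged by a per-patch classifier.
import numpy as np

def sliding_windows(frame: np.ndarray, win: int = 128, stride: int = 64):
    """Yield (x, y, patch) windows scanned across one video frame."""
    h, w = frame.shape[:2]
    for y in range(0, h - win + 1, stride):
        for x in range(0, w - win + 1, stride):
            yield x, y, frame[y:y + win, x:x + win]

def detect_in_frame(frame: np.ndarray, classify) -> list:
    """Return coordinates of windows the classifier flags as surface defects.

    `classify` is any per-patch predictor, e.g. the logistic regression
    classifier sketched above, wrapped to return True/False per window.
    """
    return [(x, y) for x, y, patch in sliding_windows(frame) if classify(patch)]
```

Note the trade-off this sketch makes explicit: a smaller stride finds defects more reliably but multiplies the number of classifier calls per frame, which is the slowness referred to above.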
In some embodiments, the inspection model implemented on the remote server 240 is an automated neural network model. This model is able to learn the features of surface defects on its own and process new visual data provided to it.
In one embodiment, the automated neural network model is based on a simple convolutional neural network (CNN). Different neural network structures for the CNN were tested for crack prediction given an input image: one complex structure and two simple structures. The results showed that simple structures generally achieve higher accuracy, because surface defects such as concrete surface cracks contain less information than other objects. Further, vanishing gradients become a problem when a neural network structure is too complex and the training data size is too small, resulting in poorer robustness for complex structures.
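A minimal sketch of a simple CNN of this kind, written with TensorFlow/Keras, is shown below; the layer sizes and input shape are illustrative assumptions, not the tested structures from the disclosure.

```python
# Editor's sketch: a deliberately small binary crack classifier, reflecting
# the finding that simple structures suffice for crack prediction.
from tensorflow import keras
from tensorflow.keras import layers

def build_simple_cnn(input_shape=(128, 128, 1)) -> keras.Model:
    model = keras.Sequential([
        keras.Input(shape=input_shape),
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # crack / no-crack
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model
```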
In one embodiment, the automated neural network model is based on a generative adversarial network (GAN). A generator is used to generate fake images, while a discriminator is used to distinguish real images from fake ones. Through this competition, the discriminator learns the features of input images with cracks better and improves the classification. As with the CNN, simpler models with fewer parameters tend to perform better at classifying cracks. Tests on the GAN showed a true positive rate of 82.3% and a true negative rate of 87.1%, suggesting that the GAN is suitable for use as the automated neural network model.
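A minimal sketch of a discriminator of the kind described follows, again with TensorFlow/Keras; the architecture is an illustrative assumption, and the adversarial training loop against the generator is omitted for brevity.

```python
# Editor's sketch: a small GAN discriminator. After adversarial training
# against a generator of fake crack images, its learned features support
# real-vs-fake (and, by extension, crack) classification.
from tensorflow import keras
from tensorflow.keras import layers

def build_discriminator(input_shape=(128, 128, 1)) -> keras.Model:
    return keras.Sequential([
        keras.Input(shape=input_shape),
        layers.Conv2D(32, 3, strides=2, padding="same"),
        layers.LeakyReLU(),
        layers.Conv2D(64, 3, strides=2, padding="same"),
        layers.LeakyReLU(),
        layers.Flatten(),
        layers.Dense(1, activation="sigmoid"),  # real vs. fake image
    ])
```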
In one embodiment, the automated neural network model is based on a “you only look once” (YOLO) neural network, one of the fastest object detection algorithms available. The YOLO neural network was implemented and fine-tuned with a training dataset of 500 manually-labelled data elements. Specifically, this dataset was split into 80% training data and 20% testing data. The training data was used to train the YOLO neural network and the testing data was used to test the trained network. The resultant IoU (Intersection over Union) metric was approximately 80%, and the average loss improved from 80 to 0.5 after 3000 epochs of training. It can thus be seen that the YOLO neural network is accurate in inspecting surface defects. Figure 11 illustrates the inspection results on cracks using the YOLO neural network.
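The IoU metric referred to above is a standard computation; a minimal sketch for axis-aligned boxes is shown below.

```python
# Editor's sketch: Intersection over Union for two boxes given as
# (x_min, y_min, x_max, y_max). An IoU near 1 means the predicted box
# closely matches the labelled defect box.
def iou(box_a, box_b) -> float:
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Intersection rectangle (empty if the boxes do not overlap).
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = ((ax2 - ax1) * (ay2 - ay1)
             + (bx2 - bx1) * (by2 - by1) - inter)
    return inter / union if union else 0.0

# The 80/20 split of the 500 labelled elements described above amounts to:
# train, test = data[:400], data[400:]
```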
Comparing the geometric features model and the automated neural network model, both achieved similar speeds in processing the visual data of the surface defects (approximately 500 images processed per hour). However, the geometric features model tends to be more prone to noise, which can lead to a higher false positive rate, and it is unable to indicate a confidence level for the detected surface defects. The automated neural network model has the advantage of continued deep learning with continual input of new visual data: it can keep improving with more manually-labelled data and training. As the drone 100 is continuously used to inspect surface defects, more data can be collected, which supports this deep learning and further improves the neural network model.
Therefore, the system 50 including the drone 100 and computer system 200 is able to achieve high accuracy of approximately 80% in the inspection of surface defects based on visual data captured by the drone 100 deployed in an enclosed environment. The processing time for inspecting the visual data is also shorter, reducing the total inspection time required to detect and locate the surface defects and consequently improving overall efficiency.
Additionally, in contrast with existing drones that are reliant on GPS reception and signals, the drone 100 is able to navigate, through use of the sensors 112 such as time-of-flight sensors, in GPS-denied enclosed environments where GPS signals are weak or non-existent. Possible enclosed environments where the drone 100 can be deployed include confined spaces such as deep wells, under viaducts or bridges, storage or process tanks such as brewery tanks, service or motorway tunnels, indoor spaces, under forest or jungle canopy, and the interior of ships or tankers, such as hull interiors. The drone 100 is also able to navigate in enclosed environments which are in or near facilities that interfere with magnetic compasses. Some examples include metallic structures like steel towers, stacks, chimneys, and vessels, as well as power transformers, electrical transmission systems, electrical cables, generators, and turbines. Tests have also shown that the drone 100 is able to localize itself in an unknown enclosed environment and navigate well within it, even in darkness, as the sensors 112 do not require illumination to function.
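Purely as an illustrative aside, and not taken from the disclosure, the sketch below shows one way per-direction time-of-flight distance readings could be mapped to a simple avoidance command; the sensor interface, thresholds, and velocities are hypothetical.

```python
# Editor's sketch: map per-direction ToF distances to a repulsive
# body-frame velocity command for GPS-denied obstacle avoidance.
MIN_CLEARANCE_M = 0.5  # assumed minimum standoff distance

def avoidance_command(tof_readings_m: dict) -> dict:
    """tof_readings_m: e.g. {"front": 1.2, "back": 3.0, "left": 0.4, "right": 2.5}

    Returns velocity set-points (m/s) that push the drone away from any
    obstacle closer than the minimum clearance.
    """
    vx = vy = 0.0
    if tof_readings_m.get("front", float("inf")) < MIN_CLEARANCE_M:
        vx -= 0.2  # back away from an obstacle ahead
    if tof_readings_m.get("back", float("inf")) < MIN_CLEARANCE_M:
        vx += 0.2
    if tof_readings_m.get("left", float("inf")) < MIN_CLEARANCE_M:
        vy += 0.2  # slide right
    if tof_readings_m.get("right", float("inf")) < MIN_CLEARANCE_M:
        vy -= 0.2
    return {"vx": vx, "vy": vy}
```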
The drone 100 is thus a viable alternative to human inspectors for inspection of surface defects in enclosed environments. The drone 100 is smaller and lighter than a human inspector, allowing for inspection of narrower confined spaces that human inspectors cannot access. Replacing human inspectors with the drone 100 advantageously avoids the hazardous risk posed to human lives, since the human inspectors can be redeployed to the ground stations 220. This improves safety for the human inspectors as they are no longer exposed to hazardous environments. Human inspectors with minimal training on drone control can be redeployed as drone pilots, as the drone 100 is designed with semi-autonomous flight capability and can be easily operated by the pilot. The pilot only needs to give small control inputs to control the drone 100. As such, the pilot need not focus much attention on controlling the drone 100, and can instead focus on checking the surface defects inspection results computed by the remote server 240.

In the foregoing detailed description, embodiments of the present disclosure in relation to a drone for inspecting surface defects in an enclosed environment are described with reference to the provided figures. The description of the various embodiments herein is not intended to call out or be limited only to specific or particular representations of the present disclosure, but merely to illustrate non-limiting examples of the present disclosure. The present disclosure serves to address at least one of the mentioned problems and issues associated with the prior art. Although only some embodiments of the present disclosure are disclosed herein, it will be apparent to a person having ordinary skill in the art in view of this disclosure that a variety of changes and/or modifications can be made to the disclosed embodiments without departing from the scope of the present disclosure. Therefore, the scope of the disclosure as well as the scope of the following claims is not limited to embodiments described herein.

Claims
1. A drone for inspecting surface defects in an enclosed environment, the drone comprising:
a body comprising a set of guard frames;
a set of propellers for moving the drone, each propeller configured to be mounted to one of the guard frames;
a set of sensors for cooperatively navigating the drone in the enclosed environment; and
an inspection module configured to be mounted to the body, the inspection module comprising a set of cameras for capturing visual data of the surface defects,
wherein the visual data is subsequently processed to thereby inspect the surface defects.
2. The drone according to claim 1, wherein the set of sensors comprises a plurality of time-of-flight sensors for detecting obstacles around the drone in the enclosed environment.
3. The drone according to claim 1 or 2, wherein the set of sensors comprises an optical flow sensor for detecting forward motion of the drone.
4. The drone according to any one of claims 1 to 3, wherein the set of sensors comprises a height sensor for detecting a ground in the enclosed environment.
5. The drone according to any one of claims 1 to 4, wherein the inspection module comprises a gimbal for stabilizing the cameras while the drone is in motion.
6. The drone according to any one of claims 1 to 5, wherein the set of cameras comprises an optical camera and/or a thermal camera.
7. The drone according to any one of claims 1 to 6, further comprising a housing for avionics, wherein the housing is configured to be removably mounted to the body.
8. The drone according to claim 7, wherein the housing is configured to be water-resistant.
9. The drone according to claim 7 or 8, wherein the inspection module is housed within the housing.
10. The drone according to any one of claims 1 to 9, wherein the inspection module is configured to be removably mounted to the body.
11. The drone according to any one of claims 1 to 10, wherein each propeller is configured to be removably mounted to the respective guard frame.
12. The drone according to any one of claims 1 to 11, wherein each propeller is configured to be mounted underneath guard spars of the respective guard frame.
13. The drone according to any one of claims 1 to 12, further comprising a visual data communication module configured for communicating the visual data to a ground station for said inspecting of the surface defects.
14. The drone according to any one of claims 1 to 13, further comprising a flight telemetry communication module configured for communicating with a ground station for controlling motion of the drone.
15. A system for inspecting surface defects in an enclosed environment, the system comprising:
a set of drones, each drone comprising:
a body comprising a set of guard frames;
a set of propellers for moving the drone, each propeller mounted to one of the guard frames;
a set of sensors for cooperatively navigating the drone in the enclosed environment; and
an inspection module mounted to the body, the inspection module comprising a set of cameras for capturing visual data of the surface defects; and
a computer system configured for receiving the visual data from the drones and processing the visual data to thereby inspect the surface defects.
16. The system according to claim 15, wherein the computer system comprises a ground station configured for receiving the visual data from the drones and communicating the visual data to a remote server for said processing of the visual data to thereby inspect the surface defects.
17. The system according to claim 16, wherein the ground station is further configured for receiving, from the remote server, inspection results from the inspection of the surface defects.
18. The system according to claim 17, wherein the inspection results comprise confidence levels of the surface defects.
19. The system according to any one of claims 16 to 18, wherein the computer system further comprises the remote server configured to process the visual data to thereby inspect the surface defects.
20. The system according to claim 19, wherein the remote server is configured to process the visual data using an automated neural network model.
PCT/SG2019/050635 2018-12-27 2019-12-24 Drone for surface defects inspection WO2020139195A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
SG11202100153SA SG11202100153SA (en) 2018-12-27 2019-12-24 Drone for surface defects inspection

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SG10201811696Q 2018-12-27
SG10201811696Q 2018-12-27

Publications (1)

Publication Number Publication Date
WO2020139195A1 true WO2020139195A1 (en) 2020-07-02

Family

ID=71130180

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SG2019/050635 WO2020139195A1 (en) 2018-12-27 2019-12-24 Drone for surface defects inspection

Country Status (2)

Country Link
SG (1) SG11202100153SA (en)
WO (1) WO2020139195A1 (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9382002B1 (en) * 2012-12-04 2016-07-05 United Dynamics Advanced Technologies Corporation Drone for inspection of enclosed space and method thereof
US20170139410A1 (en) * 2014-07-02 2017-05-18 Mitsubishi Heavy Industries, Ltd. Indoor monitoring system and method for structure
US20170090481A1 (en) * 2015-09-24 2017-03-30 Kespry, Inc. Enhanced distance detection system
US20180149947A1 (en) * 2016-11-28 2018-05-31 Korea Institute Of Civil Engineering And Building Technology Unmanned aerial vehicle system for taking close-up picture of facility and photography method using the same
US20180217614A1 (en) * 2017-01-19 2018-08-02 Vtrus, Inc. Indoor mapping and modular control for uavs and other autonomous vehicles, and associated systems and methods
CN206709853U (en) * 2017-04-06 2017-12-05 南京航空航天大学 Drawing system is synchronously positioned and builds in a kind of multi-rotor unmanned aerial vehicle room
CN108688805A (en) * 2018-06-11 2018-10-23 视海博(中山)科技股份有限公司 The unmanned plane detected safely applied to restricted clearance

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
TRIPICCHIO P. ET AL.: "Confined spaces industrial inspection with micro aerial vehicles and laser range finder localization", INTERNATIONAL JOURNAL OF MICRO AIR VEHICLES, vol. 10, no. 2, 16 May 2018 (2018-05-16), pages 207 - 224, XP055722908, DOI: 10.1177/1756829318757471 *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113093617A (en) * 2021-04-06 2021-07-09 安徽理工大学 Four-axis unmanned aerial vehicle motor turns to judgement system based on DSP
CN113109355A (en) * 2021-04-09 2021-07-13 上海赤塞智能科技有限公司 High stability structure surface inspection device
CN113588871B (en) * 2021-06-17 2023-12-22 梁君 Construction engineering crack detection device
CN113588871A (en) * 2021-06-17 2021-11-02 山东胜源建筑工程有限公司 Construction engineering crack detection device
CN113762183A (en) * 2021-09-13 2021-12-07 墙管家建筑科技(上海)有限公司 Intelligent checking and analyzing system for existing building safety and operation method
TWI806430B (en) * 2022-02-16 2023-06-21 財團法人工業技術研究院 Defect detection method and defect detection device
CN114812398B (en) * 2022-04-10 2023-10-03 同济大学 High-precision real-time crack detection platform based on unmanned aerial vehicle
CN114812398A (en) * 2022-04-10 2022-07-29 同济大学 High-precision real-time crack detection platform based on unmanned aerial vehicle
WO2024006372A1 (en) * 2022-06-29 2024-01-04 Electrical Components International, Inc. Next generation quality inspection
US11995812B2 (en) 2022-06-29 2024-05-28 Electrical Components International, Inc. Next generation quality inspection
CN116087235A (en) * 2023-04-07 2023-05-09 四川川交路桥有限责任公司 Multi-source coupling bridge damage detection method and system
CN116087235B (en) * 2023-04-07 2023-06-20 四川川交路桥有限责任公司 Multi-source coupling bridge damage detection method and system
KR102663155B1 (en) * 2023-10-17 2024-05-07 주식회사 디와이스코프코리아 Apparatus and method for providing monitoring and building structure safety inspection based on artificial intelligence using a smartphone

Also Published As

Publication number Publication date
SG11202100153SA (en) 2021-02-25

Similar Documents

Publication Publication Date Title
WO2020139195A1 (en) Drone for surface defects inspection
US20220003213A1 (en) Unmanned Aerial Vehicle Wind Turbine Inspection Systems And Methods
US11237572B2 (en) Collision avoidance system, depth imaging system, vehicle, map generator and methods thereof
US10914590B2 (en) Methods and systems for determining a state of an unmanned aerial vehicle
CN110308457B (en) Unmanned aerial vehicle-based power transmission line inspection system
US10377485B2 (en) System and method for automatically inspecting surfaces
US11127202B2 (en) Search and rescue unmanned aerial system
Nuske et al. Autonomous exploration and motion planning for an unmanned aerial vehicle navigating rivers
CN112904877A (en) Automatic fan blade inspection system and method based on unmanned aerial vehicle
EP3850456B1 (en) Control and navigation systems, pose optimisation, mapping, and localisation techniques
US20220050460A1 (en) Control and navigation systems
Hrabar An evaluation of stereo and laser‐based range sensing for rotorcraft unmanned aerial vehicle obstacle avoidance
JP6140458B2 (en) Autonomous mobile robot
CN109031312A (en) Flying platform positioning device and localization method suitable for chimney inside processing
JP6014484B2 (en) Autonomous mobile robot
JP2016173709A (en) Autonomous mobile robot
Langåker et al. An autonomous drone-based system for inspection of electrical substations
Tsintotas et al. Safe UAV landing: A low-complexity pipeline for surface conditions recognition
Duecker et al. RGB-D camera-based navigation for autonomous underwater inspection using low-cost micro AUVs
US11869236B1 (en) Generating data for training vision-based algorithms to detect airborne objects
CN109270957A (en) A kind of plant protection system and its flying vehicles control method and apparatus
JP7130409B2 (en) Control device
Krause Multi-purpose environment awareness approach for single line laser scanner in a small rotorcraft UA
WO2022124392A1 (en) Aircraft and aircraft control method
Castelar Wembers et al. LiDAR‐based automated UAV inspection of wind turbine rotor blades

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19901529

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19901529

Country of ref document: EP

Kind code of ref document: A1