WO2019240797A1 - System and method for specular detection and removal - Google Patents


Info

Publication number
WO2019240797A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
camera
edges
drone
data
Prior art date
Application number
PCT/US2018/037554
Other languages
French (fr)
Inventor
Joshua S. MCCONKEY
Original Assignee
Siemens Energy, Inc.
Priority date
Filing date
Publication date
Application filed by Siemens Energy, Inc. filed Critical Siemens Energy, Inc.
Priority to PCT/US2018/037554
Publication of WO2019240797A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/77 Retouching; Inpainting; Scratch removal
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10048 Infrared image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Definitions

  • the corrected image contains all the relevant pixels, but with no specular pollution.
  • the corrected image can be used for image processing and temperature detection (e.g., min -max analysis) without specular issues causing incorrect analysis.
  • Fig. 7 illustrates the temperature data taken along line 7-7 which passes through the specular reflection of the sun 50a, 50b in both Figs. 4 and 5.
  • the temperature data in the curves is normalized such that the values shown are the temperature variation from the average.
  • the first curve 75 includes temperature values taken from the first image 30 that oscillate around an average temperature of about 32 degrees C. At the point where the sun’s reflection 50a is located, the temperature rises rapidly and well beyond a typical temperature variation. However, without the second image 40, it is difficult to say that the high temperature is an anomaly and not a true hot spot 65a on the object 10.
  • the second curve 80 in Fig. 7 illustrates temperature data for the same line 7-7 taken from the second image 40 shown in Fig. 5. Again, a value of zero represents an average temperature and again, at the point where the line passes through the reflection of the sun 50b a large spike in temperature is present.
  • Had the spike in temperature in the first curve 75 of Fig. 7 been a true hot spot 65a, the spike would appear in the same position in the second curve 80. However, as is clearly seen, the spike in temperature has moved a significant distance. This is indicative of a specular reflection.
  • the third curve 85 of Fig. 7 illustrates the results of a pixel-by-pixel subtraction of the second curve 80 from the first curve 75.
  • the temperature values of the first curve should be about the same as the temperature values of the second curve.
  • the third curve should be very close to zero at all points except where there is a specular reflection.
  • the specular reflection 50a in the first curve 75 will appear as a large hot spot spike in the third curve 85, while the reflection 50b in the second curve 80 will appear in the third curve 85 as a large cold spot spike.
  • the third curve 85 clearly identifies where any specular reflections occur in the first curve 75 and therefore where any specular reflections are located in the first image 30 (Figs. 2 and 4).
  • the data of the first curve 75, the second curve 80, and the third curve 85 are combined to arrive at the corrected or fourth curve 90 of Fig. 7.
  • the data from the first curve 75 is translated to the fourth curve 90.
  • data for any axial positions that show a positive spike in the third curve 85 is omitted. This omitted region is identified as region 95.
  • Data from these same axial positions in the second curve 80 does not include the same specular reflection. Therefore, this data, identified as replacement data 100, is copied to the fourth curve 90 and replaces the specular reflection data from the first curve 75.
  • The curves of Fig. 7 show how the analysis and correction of specular data is performed along a line. This process is extended to all the pixels in the captured images 30, 40 to arrive at a corrected image 105 illustrated in Fig. 6. As can be seen, there is no reflection of the sun 50 visible in Fig. 6.
  • a first image is captured and the drone shifts to a second position a few feet from the first position to capture the second image.
  • Software on the drone 15 or remote from the drone 15 removes any specular reflections and analyzes a single corrected image or provides a single image to a user for analysis.
  • while the specular reflection source used herein is the sun 45, other sources are also possible.
  • street lights, navigation lights, or other light sources could also create a specular reflection and would be filtered or corrected in the same manner as the specular reflection created by the sun 45.
  • the system is used to image a number of identical or similar objects.
  • the user predefines the location and distance from which each image is captured and those locations and distances are repeated for every object to assure that the area of interest always appears the same in the two images.
  • known corners or points can be aligned to allow the pixel-by-pixel analysis without the need to detect edges 55 or define polygons 70 within the defined edges 55.
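
The Fig. 7 walk-through above can be condensed to a few lines of code. The patent gives no implementation, so the following Python/NumPy sketch is illustrative only: the function name and the spike threshold are assumptions, with the positive-spike region standing in for region 95 and the spliced-in samples for replacement data 100.

```python
import numpy as np

def correct_line_profile(curve1, curve2, threshold=5.0):
    """Sketch of the Fig. 7 line analysis: subtract the second curve from
    the first, flag the positive spike (the specular reflection in the
    first image), and splice in the second curve's data over that region."""
    c1 = np.asarray(curve1, dtype=float)
    c2 = np.asarray(curve2, dtype=float)
    diff = c1 - c2                  # "third curve": near zero away from spikes
    region = diff > threshold       # positive spike -> specular data in curve 1
    corrected = c1.copy()           # "fourth curve" starts as the first curve
    corrected[region] = c2[region]  # replacement data taken from curve 2
    return corrected, region
```

Applied to two normalized temperature profiles whose only disagreement is a shifted reflection, the returned profile is spike-free along its whole length.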

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Radiation Pyrometers (AREA)

Abstract

A method of detecting and removing specular data to create a corrected image includes capturing a first image of a scene including an object from a first position, capturing a second image of the scene including the object from a second position, the second position being offset a non-zero distance from the first position, comparing the first image to the second image to find specular images that are in different positions in the second image than in the first image with respect to the object, and removing the specular images from the first image to define the corrected image.

Description

SYSTEM AND METHOD FOR SPECULAR DETECTION AND REMOVAL
TECHNICAL FIELD
[0001] The present disclosure is directed, in general, to the detection and removal of specular reflections, and more specifically to the automated detection and removal of specular reflections using remote imaging.
BACKGROUND
[0002] Imaging and in particular infrared (IR) imaging can be used to determine the temperature of objects. However, IR imaging is susceptible to specular reflections as the image will indicate the temperature of the reflected object rather than the actual temperature of the underlying object.
SUMMARY
[0003] A method of detecting and removing specular data to create a corrected image includes capturing a first image of a scene including an object from a first position, capturing a second image of the scene including the object from a second position, the second position being offset a non-zero distance from the first position, comparing the first image to the second image to find specular images that are in different positions in the second image than in the first image with respect to the object, and removing the specular images from the first image to define the corrected image.
[0004] In another construction, a method of detecting and removing specular data to create a corrected image includes capturing a first image of a scene including an object from a first position, capturing a second image of the scene including the object from a second position, the second position being offset a non-zero distance from the first position, subtracting the second image from the first image to define an anomaly image that includes any specular images that were captured within the first image and the second image, identifying specular data within the anomaly image, and replacing portions of the first image corresponding to specular data in the anomaly image with non-specular data from the second image to define the corrected image.
[0005] In another construction, a method of detecting and removing specular data to create a corrected image includes capturing a first IR image of a scene including an object from a first position, capturing a second IR image of the scene including the object from a second position, the second position being offset a non-zero distance from the first position, and identifying edges of the object in the first image. The method also includes using the edges in the first image to identify a first surface of interest in the first image, identifying edges of the object in the second image, and using the edges in the second image to identify a second surface of interest in the second image, each of the first surface and the second surface corresponding to a surface of interest of the object. The method further includes distorting the second image to align the second surface to the first surface, comparing the IR image of the first surface to the IR image of the second surface to find specular images that are in different positions in the second image than in the first image with respect to the edges of the object in the first image, and removing the specular images from the first surface to define the corrected image.
[0006] The foregoing has outlined rather broadly the technical features of the present disclosure so that those skilled in the art may better understand the detailed description that follows.
Additional features and advantages of the disclosure will be described hereinafter that form the subject of the claims. Those skilled in the art will appreciate that they may readily use the conception and the specific embodiments disclosed as a basis for modifying or designing other structures for carrying out the same purposes of the present disclosure. Those skilled in the art will also realize that such equivalent constructions do not depart from the spirit and scope of the disclosure in its broadest form.
[0007] Also, before undertaking the Detailed Description below, it should be understood that various definitions for certain words and phrases are provided throughout this specification and those of ordinary skill in the art will understand that such definitions apply in many, if not most, instances to prior as well as future uses of such defined words and phrases. While some terms may include a wide variety of embodiments, the appended claims may expressly limit these terms to specific embodiments.
BRIEF DESCRIPTION OF THE DRAWINGS
[0008] Fig. 1 is a schematic illustration of a scene suitable for IR imaging.
[0009] Fig. 2 is an image of a first object taken from a first position.
[0010] Fig. 3 is an image of the first object taken from a second position.
[0011] Fig. 4 is the image of Fig. 2 with a surface of interest identified.
[0012] Fig. 5 is the image of Fig. 3 with the surface of interest identified.
[0013] Fig. 6 is a corrected image created from the combination of Figs. 4 and 5.
[0014] Fig. 7 is a series of temperature graphs taken along line 7-7 of Figs. 4, 5, and 6.
[0015] Before any embodiments of the invention are explained in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the following drawings. The invention is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting.
DETAILED DESCRIPTION
[0016] Various technologies that pertain to systems and methods will now be described with reference to the drawings, where like reference numerals represent like elements throughout.
The drawings discussed below, and the various embodiments used to describe the principles of the present disclosure in this patent document are by way of illustration only and should not be construed in any way to limit the scope of the disclosure. Those skilled in the art will understand that the principles of the present disclosure may be implemented in any suitably arranged apparatus. It is to be understood that functionality that is described as being carried out by certain system elements may be performed by multiple elements. Similarly, for instance, an element may be configured to perform functionality that is described as being carried out by multiple elements. The numerous innovative teachings of the present application will be described with reference to exemplary non-limiting embodiments.
[0017] Also, it should be understood that the words or phrases used herein should be construed broadly, unless expressly limited in some examples. For example, the terms “including,” “having,” and “comprising,” as well as derivatives thereof, mean inclusion without limitation. The singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. Further, the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. The term “or” is inclusive, meaning and/or, unless the context clearly indicates otherwise. The phrases “associated with” and “associated therewith,” as well as derivatives thereof, may mean to include, be included within, interconnect with, contain, be contained within, connect to or with, couple to or with, be communicable with, cooperate with, interleave, juxtapose, be proximate to, be bound to or with, have, have a property of, or the like.
[0018] Also, although the terms "first", "second", "third" and so forth may be used herein to refer to various elements, information, functions, or acts, these elements, information, functions, or acts should not be limited by these terms. Rather these numeral adjectives are used to distinguish different elements, information, functions or acts from each other. For example, a first element, information, function, or act could be termed a second element, information, function, or act, and, similarly, a second element, information, function, or act could be termed a first element, information, function, or act, without departing from the scope of the present disclosure.
[0019] In addition, the term “adjacent to” may mean: that an element is relatively near to but not in contact with a further element; or that the element is in contact with the further portion, unless the context clearly indicates otherwise. Further, the phrase “based on” is intended to mean “based, at least in part, on” unless explicitly stated otherwise. Terms “about” or “substantially” or like terms are intended to cover variations in a value that are within normal industry manufacturing tolerances for that dimension. If no industry standard is available, a variation of 20 percent would fall within the meaning of these terms unless otherwise stated.
[0020] Fig. 1 schematically illustrates a potential scene where infrared (IR) imaging may be useful. For purposes of measuring temperature, the imaging is performed in the far-infrared range of the electromagnetic spectrum, which typically extends from 9,000 nm to 14,000 nm. Imaging in this spectral range is commonly referred to as passive thermography, and the images are sometimes referred to as thermographs or thermograms. Of course, images at shorter IR wavelengths, 900 nm and higher, might also be of value.
[0021] With continued reference to Fig. 1, the scene illustrates a system that is used to image an object 10 in the form of an automobile, although other objects are also well-suited to this type of imaging. For example, remote industrial objects such as transformers, power distribution components, transmission components, switch gear, etc. are all well-suited to this process.
Therefore, while an automobile is used as an example object 10 in the present description, the object 10 should not be limited to an automobile.
[0022] In the scene of Fig. 1, a drone 15 carries an IR imaging device 20 (e.g., an IR camera) that is arranged to capture images in the desired wavelengths. In one construction, the drone 15 is remotely piloted; in another, it is preprogrammed to fly a predetermined route. As illustrated, the drone 15 flies to a first position 25 and captures a first image 30 of the automobile 10, shown in Fig. 2. The drone 15 then shifts its position slightly to a second position 35 and captures a second image 40 shown in Fig. 3. When the first image 30 is captured, the sun 45 is positioned such that a reflection of the sun 50 is captured in the first image 30. When the drone 15 shifts to the second position 35, the reflection of the sun 50 also shifts, as illustrated in Fig. 1. When capturing the two images 30, 40, the imaging device 20 or IR camera is generally aimed in the same direction (e.g., northeast). In addition, the positions 25, 35 are such that the imaging device 20 is far enough away from the object 10 that the object 10 remains in frame. Figs. 2 and 3 also illustrate how the reflection of the sun 50 shifts when the drone 15 moves from the first position 25 to the second position 35.
[0023] While Fig. 1 illustrates the drone 15 being used to capture the images 30, 40, the same process could be applied to images captured using virtually any system. For example, a user could take the first image 30 with a handheld device, shift his or her position slightly and take the second image 40. In addition, the imaging device 20 could be arranged to provide a binocular image with the two images 30, 40 being taken from slightly different angles simultaneously or could be taken with two separate imaging devices 20 spaced apart from one another. In preferred arrangements, the two images 30, 40 should be taken from two different positions 25, 35 that are spaced at least 150 mm from one another.
[0024] Fig. 2 illustrates the first image 30 captured by the drone 15 in Fig. 1. The image 30 is analyzed to detect edges 55 that outline features of the object 10. The edge processing defines fixed actual structural edges 55 of the object 10 and avoids defining edges created by optical artifacts. In this example, the area of interest is a hood 60 of the automobile 10 and the edges 55 assist in identifying that area. In addition, a reflection of the sun 50a is clearly visible in Fig. 2 as a small hot spot 65a. Fig. 3 is the second image 40 of the automobile hood 60 taken from the second position 35. Again, the image 40 is analyzed to detect edges 55 of the object 10. Also, a reflection of the sun 50b is clearly visible as a hot spot 65b. However, the hot spot 65b appears to have shifted across the hood 60 of the automobile 10 a distance that is related to the distance between the first position 25 and the second position 35.
[0025] As illustrated in Fig. 4, the detected edges 55 are then used to define a surface or area of interest 70. Typically, the surface is defined by a polygon 70 that is created based on the defined edges 55 and that includes the region of interest on the object 10, which in this example is the hood 60 of the automobile 10. In some cases, a simple rectangle may be suitable, with more complicated surfaces requiring a more complicated polygon 70. Existing algorithms, such as second-order spatial differential operators (e.g., the Laplacian), can be used to define the polygons 70.
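The second-order spatial differential edge detection mentioned above can be illustrated with a discrete Laplacian. The patent supplies no code, so this Python/NumPy sketch is a minimal, assumption-laden example: the function name, kernel, and threshold are illustrative choices, not the patented implementation.

```python
import numpy as np

def laplacian_edges(image, threshold=1.0):
    """Flag structural edges by thresholding the magnitude of a discrete
    Laplacian (a second-order spatial differential) of the image."""
    kernel = np.array([[0.0,  1.0, 0.0],
                       [1.0, -4.0, 1.0],
                       [0.0,  1.0, 0.0]])
    h, w = image.shape
    padded = np.pad(image.astype(float), 1, mode="edge")
    # Dependency-free 3x3 convolution: shift-and-accumulate.
    lap = sum(kernel[i, j] * padded[i:i + h, j:j + w]
              for i in range(3) for j in range(3))
    return np.abs(lap) > threshold
```

In the workflow above, the boolean edge map would play the role of the edges 55 that seed the polygon 70; the Laplacian responds only at intensity discontinuities, so flat surfaces such as the interior of the hood 60 stay unflagged.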
[0026] The polygon 70 of Fig. 4 is then superimposed onto the second image 40 as illustrated in Fig. 5 and the second image 40 is distorted in the X and Y directions through smooth stretching and vertex matching until the edges 55 of Fig. 5 align with the polygon 70 as they did in Fig. 4. At this point, the same surface should be located within the polygon 70 in both Figs. 4 and 5, thereby allowing for a pixel-by-pixel analysis or an analysis of smaller multi-pixel regions that can be used to remove any specular reflections that would otherwise skew the data.
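Under simplifying assumptions, the "smooth stretching and vertex matching" step can be approximated by fitting an affine transform that maps the polygon vertices found in the second image onto the corresponding vertices in the first. The helper names below are hypothetical:

```python
import numpy as np

def fit_affine(src_pts, dst_pts):
    """Least-squares affine transform mapping polygon vertices in the
    second image (src) onto the matching vertices in the first (dst).

    src_pts, dst_pts: (N, 2) arrays of (x, y) vertices, N >= 3.
    Returns a 2x3 matrix A such that dst ~= A @ [x, y, 1].
    """
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    ones = np.ones((src.shape[0], 1))
    # Solve [x y 1] @ M = [x' y'] for M (3x2), then transpose to 2x3.
    M, *_ = np.linalg.lstsq(np.hstack([src, ones]), dst, rcond=None)
    return M.T

def apply_affine(A, pts):
    """Apply the fitted 2x3 affine transform to (N, 2) points."""
    pts = np.asarray(pts, dtype=float)
    ones = np.ones((pts.shape[0], 1))
    return np.hstack([pts, ones]) @ A.T
```

An affine fit handles the X and Y stretching described above; a more complicated warp (e.g., a projective or piecewise warp) could be substituted for strongly curved surfaces.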
[0027] Alternatively, the polygon 70 created for the first image 30 can be different than a polygon created for the second image 40. The differences can be accepted, and an internal shape can be created that resides within each polygon 70, and includes an area of interest, and does not include any strong discontinuities (edges). The pixel-by-pixel analysis or the analysis of smaller multi-pixel regions can then be used to remove any specular reflections within the area of interest that would otherwise skew the data.
[0028] The preferred pixel-by-pixel analysis includes a pixel-by-pixel subtraction which leads to data that represents the differences between each pixel in the two images 30, 40. While an image of this data (difference image) could be produced, it is an intermediate step and is not necessary to complete the process. A smoothing or noise filter can be applied to remove small differences between the images 30, 40 if desired.
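The subtraction and optional smoothing described above can be sketched as follows; the function name and the 3x3 box filter are illustrative choices, not requirements of the disclosure:

```python
import numpy as np

def difference_image(first, second, smooth=True):
    """Pixel-by-pixel subtraction of the second image from the first.

    Optionally applies a 3x3 box mean as the smoothing/noise filter so
    that small differences between the two images are suppressed.
    """
    diff = np.asarray(first, dtype=float) - np.asarray(second, dtype=float)
    if smooth:
        padded = np.pad(diff, 1, mode="edge")
        # 3x3 box average built from shifted partial sums.
        h, w = diff.shape
        diff = sum(padded[i:i + h, j:j + w]
                   for i in range(3) for j in range(3)) / 9.0
    return diff
```

The returned array is the intermediate "difference image": near zero wherever the two images agree, with isolated spikes wherever only one image contains a reflection.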
[0029] Next, the pixel values of the difference image are analyzed to identify any positive or negative spikes. In this example, a positive spike indicates a specular reflection in the first image 30, and a negative spike indicates a specular reflection in the second image 40 (the image subtracted from the first image 30). The positive spike data is removed from the first image 30 and replaced with data for the same pixels from the second image 40 to produce corrected image data that can be used to produce a corrected image.
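The spike detection and replacement steps can be sketched in a few lines of numpy; the function name and threshold are hypothetical:

```python
import numpy as np

def remove_specular(first, second, threshold):
    """Replace positive-spike (specular) pixels in the first image with
    the corresponding pixels from the second image.

    `threshold` is the difference value above which a pixel of the
    difference image is treated as a specular spike. Negative spikes
    (reflections present only in the second image) are simply ignored.
    """
    first = np.asarray(first, dtype=float)
    second = np.asarray(second, dtype=float)
    diff = first - second
    corrected = first.copy()
    specular = diff > threshold             # positive spikes only
    corrected[specular] = second[specular]  # pull clean data from image 2
    return corrected
```

Because the two reflections fall on different pixels, the pixels replaced in the first image are, by construction, reflection-free in the second image.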
[0030] The corrected image contains all the relevant pixels, but with no specular pollution. The corrected image can be used for image processing and temperature detection (e.g., min-max analysis) without specular issues causing incorrect analysis.
[0031] Fig. 7 illustrates the temperature data taken along line 7-7 which passes through the specular reflection of the sun 50a, 50b in both Figs. 4 and 5. The temperature data in the curves is normalized such that the values shown are the temperature variation from the average.
Therefore, a temperature measurement that falls on the average temperature of the object would show as zero in these curves. As can be seen, the first curve 75 includes temperature values taken from the first image 30 that oscillate around the object's average temperature of about 32 degrees C. At the point where the sun's reflection 50a is located, the temperature rises rapidly and well beyond a typical temperature variation. However, without the second image 40, it is difficult to say whether the high temperature is an anomaly or a true hot spot 65a on the object 10.
[0032] The second curve 80 in Fig. 7 illustrates temperature data for the same line 7-7 taken from the second image 40 shown in Fig. 5. Again, a value of zero represents the average temperature and, again, at the point where the line passes through the reflection of the sun 50b, a large spike in temperature is present. Had the spike in temperature in the first curve 75 of Fig. 7 been a true hot spot 65a, the spike would appear in the same position in the second curve 80. However, as is clearly seen, the spike in temperature has moved a significant distance, which is indicative of a specular reflection.
[0033] The third curve 85 of Fig. 7 illustrates the results of a pixel-by-pixel subtraction of the second curve 80 from the first curve 75. Where there is no specular reflection, the temperature values of the first curve should be about the same as the temperature values of the second curve. Thus, the third curve should be very close to zero at all points except where there is a specular reflection. The specular reflection 50a in the first curve 75 will appear as a large hot spot spike in the third curve 85, while the reflection 50b in the second curve 80 will appear in the third curve 85 as a large cold spot spike. By ignoring the cold spots, the third curve 85 clearly identifies where any specular reflections occur in the first curve 75 and therefore where any specular reflections are located in the first image 30 (Figs. 2 and 4).
[0034] To complete the removal of the undesired specular reflection, the data of the first curve 75, the second curve 80, and the third curve 85 are combined to arrive at the corrected or fourth curve 90 of Fig. 7. In the example of Fig. 7, the data from the first curve 75 is translated to the fourth curve 90. However, data for any axial positions that show a positive spike in the third curve 85 is omitted. This omitted region is identified as region 95. Data from these same axial positions in the second curve 80 does not include the same specular reflection. Therefore, this data (replacement data 100) is copied to the fourth curve 90 and replaces the specular reflection data from the first curve 75.
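The curve arithmetic of Fig. 7 can be sketched in a few lines; the profile values below are hypothetical stand-ins for the normalized temperature data along line 7-7:

```python
import numpy as np

# Hypothetical 1-D temperature profiles along line 7-7, normalized to
# zero mean as in Fig. 7: flat except for one reflection in each curve.
n = 20
first_curve = np.zeros(n)
first_curve[5] = 12.0     # reflection 50a in the first image
second_curve = np.zeros(n)
second_curve[12] = 12.0   # shifted reflection 50b in the second image

# Third curve: pixel-by-pixel subtraction. The first-image reflection
# appears as a hot spike at index 5, the second-image reflection as a
# cold spike at index 12.
third_curve = first_curve - second_curve

# Fourth (corrected) curve: keep the first curve, but wherever the third
# curve shows a positive spike, copy replacement data from the second.
threshold = 5.0
fourth_curve = np.where(third_curve > threshold, second_curve, first_curve)
```

The fourth curve is flat: the hot spike at index 5 is replaced by the unaffected second-curve value at that position, and the cold spike at index 12 is ignored, mirroring region 95 and replacement data 100.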
[0035] The curves of Fig. 7 show how the analysis and correction of specular data is performed along a line. This process is extended to all the pixels in the captured images 30, 40 to arrive at a corrected image 105 illustrated in Fig. 6. As can be seen, there is no reflection of the sun 50 visible in Fig. 6.
[0036] While many different uses are contemplated where specular reflections need to be removed, one application is the inspection of remote equipment such as power transformers. Periodic images of this equipment can show hot spots forming that are indicative of improper operation or a pending failure. The use of drones 15, including remotely piloted or autonomous drones 15, could allow for more frequent and efficient image capture, and the elimination of specular reflections would allow for a more automated analysis of the images. In one application, the drone 15 is programmed to fly to a particular position with respect to a transformer. A first image is captured, and the drone 15 shifts to a second position a few feet from the first position to capture the second image. Software on the drone 15, or remote from the drone 15, removes any specular reflections and analyzes a single corrected image or provides a single corrected image to a user for analysis.
[0037] While the example of a specular reflection source used herein is the sun 45, other sources are also possible. For example, street lights, navigation lights, or other light sources could also create a specular reflection and would be filtered or corrected in the same manner as the specular reflection created by the sun 45.
[0038] In another application, the system is used to image a number of identical or similar objects. The user predefines the location and distance from which each image is captured and those locations and distances are repeated for every object to assure that the area of interest always appears the same in the two images. In this case, known corners or points can be aligned to allow the pixel-by-pixel analysis without the need to detect edges 55 or define polygons 70 within the defined edges 55.
[0039] Although an exemplary embodiment of the present disclosure has been described in detail, those skilled in the art will understand that various changes, substitutions, variations, and improvements disclosed herein may be made without departing from the spirit and scope of the disclosure in its broadest form.
[0040] None of the description in the present application should be read as implying that any particular element, step, act, or function is an essential element, which must be included in the claim scope: the scope of patented subject matter is defined only by the allowed claims.
Moreover, none of these claims are intended to invoke a means plus function claim construction unless the exact words "means for" are followed by a participle.

Claims

CLAIMS

What is claimed is:
1. A method of detecting and removing specular data to create a corrected image, the method comprising:
capturing a first image of a scene including an object from a first position;
capturing a second image of the scene including the object from a second position, the second position being offset a non-zero distance from the first position;
comparing the first image to the second image to find specular images that are in different positions in the second image than in the first image with respect to the object; and
removing the specular images from the first image to define the corrected image.
2. The method of claim 1, wherein the first image and the second image are infrared images.
3. The method of claim 1, wherein the non-zero distance is at least 75 mm.
4. The method of claim 1, wherein the comparing step includes subtracting the second image from the first image to produce an anomaly image and identifying specular images from the anomaly image that includes any specular images that were captured within the first image and the second image.
5. The method of claim 4, wherein the removing step includes replacing portions of the first image corresponding to specular data in the anomaly image with image data from the second image.
6. The method of claim 4, further comprising converting the first image into temperature data and converting the second image into temperature data and wherein the subtracting step includes subtracting the temperature data of the second image from the temperature data of the first image.
7. The method of claim 1, further comprising identifying edges of the object in the first image and identifying edges of the object in the second image.
8. The method of claim 7, further comprising aligning the edges of the object from the second image with edges of the object from the first image and defining an area of interest based at least in part on the edges of the object from the first image.
9. The method of claim 1, further comprising positioning an infrared (IR) camera on a flyable drone and flying the drone to the first position to capture the first image and the second position to capture the second image.
10. The method of claim 1, further comprising positioning a first infrared (IR) camera and a second IR camera on a flyable drone and flying the drone to the position wherein the first IR camera is in the first position to capture the first image and the second IR camera is in the second position to capture the second image.
11. The method of claim 10, wherein the first IR camera and the second IR camera are each part of a single binocular vision camera.
12. A method of detecting and removing specular data to create a corrected image, the method comprising:
capturing a first image of a scene including an object from a first position;
capturing a second image of the scene including the object from a second position, the second position being offset a non-zero distance from the first position;
subtracting the second image from the first image to define an anomaly image that includes any specular images that were captured within the first image and the second image;
identifying specular data within the anomaly image; and
replacing portions of the first image corresponding to specular data in the anomaly image with non-specular data from the second image to define the corrected image.
13. The method of claim 12, wherein the first image and the second image are infrared images.
14. The method of claim 12, wherein the non-zero distance is at least 75 mm.
15. The method of claim 12, further comprising converting the first image into temperature data and converting the second image into temperature data and wherein the subtracting step includes subtracting the temperature data of the second image from the temperature data of the first image.
16. The method of claim 12, further comprising identifying edges of the object in the first image and identifying edges of the object in the second image.
17. The method of claim 16, further comprising aligning the edges of the object from the second image with edges of the object from the first image and defining an area of interest based at least in part on the edges of the object from the first image.
18. The method of claim 12, further comprising positioning an infrared (IR) camera on a flyable drone and flying the drone to the first position to capture the first image and the second position to capture the second image.
19. The method of claim 12, further comprising positioning a first infrared (IR) camera and a second IR camera on a flyable drone and flying the drone to the position wherein the first IR camera is in the first position to capture the first image and the second IR camera is in the second position to capture the second image.
20. The method of claim 19, wherein the first IR camera and the second IR camera are each part of a single binocular vision camera.
21. A method of detecting and removing specular data to create a corrected image, the method comprising:
capturing a first IR image of a scene including an object from a first position;
capturing a second IR image of the scene including the object from a second position, the second position being offset a non-zero distance from the first position;
identifying edges of the object in the first image;
using the edges in the first image to identify a first surface of interest in the first image;
identifying edges of the object in the second image;
using the edges in the second image to identify a second surface of interest in the second image, each of the first surface and the second surface corresponding to a surface of interest of the object;
distorting the second image to align the second surface to the first surface;
comparing the IR image of the first surface to the IR image of the second surface to find specular images that are in different positions in the second image than in the first image with respect to the edges of the object in the first image; and
removing the specular images from the first surface to define the corrected image.
22. The method of claim 21, wherein the non-zero distance is at least 75 mm.
23. The method of claim 21, wherein the comparing step includes subtracting the second image from the first image to produce an anomaly image and identifying specular images from the anomaly image that includes any specular images that were captured within the first image and the second image.
24. The method of claim 23, wherein the removing step includes replacing portions of the first image corresponding to specular data in the anomaly image with image data from the second image.
25. The method of claim 21, further comprising converting the first image into temperature data and converting the second image into temperature data and wherein the subtracting step includes subtracting the temperature data of the second image from the temperature data of the first image.
26. The method of claim 21, further comprising positioning an IR camera on a flyable drone and flying the drone to the first position to capture the first image and the second position to capture the second image.
27. The method of claim 21, further comprising positioning a first IR camera and a second IR camera on a flyable drone and flying the drone to the position wherein the first IR camera is in the first position to capture the first image and the second IR camera is in the second position to capture the second image.
28. The method of claim 27, wherein the first IR camera and the second IR camera are each part of a single binocular vision camera.
PCT/US2018/037554 2018-06-14 2018-06-14 System and method for specular detection and removal WO2019240797A1 (en)

Publications (1)

Publication Number Publication Date
WO2019240797A1 (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6298149B1 (en) * 1996-03-21 2001-10-02 Cognex Corporation Semiconductor device image inspection with contrast enhancement
US20080317378A1 (en) * 2006-02-14 2008-12-25 Fotonation Ireland Limited Digital image enhancement with reference images


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
ALESSANDRO ARTUSI ET AL: "A Survey of Specularity Removal Methods", COMPUTER GRAPHICS FORUM, vol. 30, no. 8, 2 August 2011 (2011-08-02), GB, pages 2208 - 2230, XP055289481, ISSN: 0167-7055, DOI: 10.1111/j.1467-8659.2011.01971.x *
KYLILI ANGELIKI ET AL: "Infrared thermography (IRT) applications for building diagnostics: A review", APPLIED ENERGY, ELSEVIER SCIENCE PUBLISHERS, GB, vol. 134, 6 September 2014 (2014-09-06), pages 531 - 549, XP029059953, ISSN: 0306-2619, DOI: 10.1016/J.APENERGY.2014.08.005 *
LEE S W ET AL: "Detection of specularity using colour and multiple views", IMAGE AND VISION COMPUTING, ELSEVIER, GUILDFORD, GB, vol. 10, no. 10, 1 December 1992 (1992-12-01), pages 643 - 653, XP026655889, ISSN: 0262-8856, [retrieved on 19921201], DOI: 10.1016/0262-8856(92)90009-R *
RECAI SINEKLI ET AL: "A Comparison of Edge Detection Techniques to use in 2D Image Registration based on ICP Algorithm", 1ST INTERNATIONAL MEDITERRANEAN SCIENCE AND ENGINEERING CONGRESS (IMSEC 2016), 1 October 2016 (2016-10-01), XP055505246 *


Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 18738086; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase (Ref country code: DE)
122 EP: PCT application non-entry in European phase (Ref document number: 18738086; Country of ref document: EP; Kind code of ref document: A1)