US20190339206A1 - System and method for damage detection by cast shadows - Google Patents
- Publication number
- US20190339206A1 (Application No. US15/971,227)
- Authority
- US
- United States
- Prior art keywords
- processor
- component
- light
- image data
- feature
- Prior art date
- 2018-05-04
- Legal status
- Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8806—Specially adapted optical and illumination features
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/001—Industrial image inspection using an image reference approach
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8806—Specially adapted optical and illumination features
- G01N2021/8829—Shadow projection or structured background, e.g. for deflectometry
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Chemical & Material Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Analytical Chemistry (AREA)
- Biochemistry (AREA)
- General Health & Medical Sciences (AREA)
- Immunology (AREA)
- Pathology (AREA)
- Theoretical Computer Science (AREA)
- Quality & Reliability (AREA)
- Signal Processing (AREA)
- Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
Description
- The present disclosure is directed to an automated inspection system for detection of coating imperfections. Particularly, the disclosure is directed to an automated inspection system for detection of coating imperfections based on the method of “shape from shadows” (also called computational illumination or multi-flash imaging).
- Gas turbine engine components, such as blades, vanes, disks, gears, and the like, may suffer irregularities during manufacture, such as spallation, machining defects, or inadequate coating, or may suffer wear and damage during operation, for example, due to erosion, hot corrosion (sulfidation), cracks, dents, nicks, gouges, and other damage, such as from foreign object damage. Detecting this damage may be achieved by images or videos for aircraft engine blade inspection, power turbine blade inspection, internal inspection of mechanical devices, and the like. A variety of techniques for inspecting by use of images or videos may include capturing and displaying images or videos to human inspectors for manual defect detection and interpretation. Human inspectors may then decide whether any defect exists within those images or videos. When human inspectors look at many similar images of very similar blades of an engine stage or like components of a device, they may not detect defects, for example, because of fatigue or distraction experienced by the inspector. Missing a defect may lead to customer dissatisfaction, transportation of an expensive engine back to service centers, lost revenue, or even engine failure. Additionally, manual inspection of components may be time consuming and expensive. Emerging 3D depth sensors might provide an alternative approach; however, it may be particularly difficult, time consuming, or expensive to directly 3D scan a component to an accuracy sufficient to detect shallow spallation or small manufacturing defects.
- In accordance with the present disclosure, there is provided an inspection system comprising an imaging device mounted so as to image a component surface; at least one controllable light mounted at low oblique angles around the component and configured to illuminate the component surface and cast shadows in a feature on the component surface; and a processor coupled to the imaging device and the at least one controllable light; the processor comprising a tangible, non-transitory memory configured to communicate with the processor, the tangible, non-transitory memory having instructions stored therein that, in response to execution by the processor, cause the processor to perform operations comprising: controlling, by the processor, the at least one controllable light to cast the shadows; receiving, by the processor, image data for the component from the imaging device; determining, by the processor, a feature based on a dissimilarity between the image data and a reference model; classifying, by the processor, the feature dissimilarity; and determining, by the processor, a probability that the feature dissimilarity indicates damage to the component.
- In another and alternative embodiment, the inspection system further comprises removing specular reflections.
- In another and alternative embodiment, the processor is further configured to control at least one of a position of said at least one controllable light and an orientation of said at least one controllable light, with respect to the component surface.
- In another and alternative embodiment, controlling the at least one controllable light to cast the shadows further comprises illuminating the at least one controllable light independently.
- In another and alternative embodiment, controlling the at least one controllable light to cast the shadows further comprises illuminating the component surface from multiple directions.
- In another and alternative embodiment, the processor is further configured to compute a surface model from the image data to form a proxy model.
- In another and alternative embodiment, the processor is further configured to determine a feature based on a dissimilarity between the image data and a proxy model.
- In another and alternative embodiment, the imaging device is configured as at least one of a high dynamic range camera and a multi-polarization camera.
- In another and alternative embodiment, the feature comprises a shallow surface defect.
- In another and alternative embodiment, the feature comprises a coating imperfection.
- In another and alternative embodiment, the inspection system further comprises at least one filter associated with the at least one controllable light and the imaging device, wherein the at least one filter provides attenuation to at least one of intensity, frequency, and polarization.
- In accordance with the present disclosure, there is provided a method for inspection of a component, comprising imaging a component surface with an imaging device; mounting one or more controllable lights at low oblique angles around the component; illuminating the component surface; casting one or more shadows in a feature on the component surface; and detecting a defect based on the shadows.
- In another and alternative embodiment, the method for inspection of a component further comprises coupling a processor to the imaging device and the one or more controllable lights; the processor comprising a tangible, non-transitory memory configured to communicate with the processor, the tangible, non-transitory memory having instructions stored therein that, in response to execution by the processor, cause the processor to perform operations comprising: controlling the one or more controllable lights to cast the one or more shadows; receiving image data for the component from the imaging device; determining a feature based on dissimilarity between the image data and a reference model; classifying the feature; and determining a probability that the feature indicates damage to the component.
- In another and alternative embodiment, the method for inspection of a component further comprises removing specular reflections.
- In another and alternative embodiment, the method for inspection of a component further comprises archiving the image data and the feature for one or more of future damage progression detection, damage trending and condition-based maintenance.
- In another and alternative embodiment, the method for inspection of a component further comprises controlling at least one of a position of at least one of the one or more controllable lights and an orientation of at least one of the one or more controllable lights, with respect to the component surface.
- In another and alternative embodiment, the method for inspection of a component further comprises illuminating each of the at least one light in the array independently.
- In another and alternative embodiment, the method for inspection of a component further comprises illuminating the component surface from multiple directions.
- In another and alternative embodiment, the imaging device is configured as at least one of a high dynamic range camera and a multi-polarization camera.
- In another and alternative embodiment, the method for inspection of a component further comprises computing a surface model from the image data to form a proxy model; and determining a feature based on dissimilarity between the image data and the proxy model.
- An array of controllable lights is arranged around a part at low oblique angles. The position and orientation of the lights with respect to the part are controllable. The lights are triggered independently in order to capture images and detect defects from the cast shadows they create. A model is registered and used to detect differences between the shadow images and the model. Examples of the model include an as-designed CAD model, an as-built model, a previous-condition model, and the like. As an alternative, a low-order surface model is computed from the data as a proxy for an a priori model, and the differences between the proxy model and the captured shadow images can be computed.
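- As a concrete illustration of the proxy-model alternative, the sketch below fits a low-order (quadratic) surface to measured height or intensity data by least squares and returns the residual map. The quadratic basis, the NumPy implementation, and the function name are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def quadratic_proxy(z):
    """Fit z(x, y) ~ a + b*x + c*y + d*x^2 + e*x*y + f*y^2 by least
    squares. The fitted surface serves as a low-order proxy for an a
    priori model; the residual map flags local departures from it."""
    h, w = z.shape
    y, x = np.mgrid[0:h, 0:w].astype(np.float64)
    # Design matrix of the six quadratic basis terms, one row per pixel.
    A = np.stack([np.ones_like(x), x, y, x * x, x * y, y * y],
                 axis=-1).reshape(-1, 6)
    coeffs, *_ = np.linalg.lstsq(A, z.ravel(), rcond=None)
    proxy = (A @ coeffs).reshape(h, w)
    return proxy, z - proxy  # proxy model and residual (difference) map
```

- Because damage is assumed to be small and local, it contributes little to the global fit and therefore survives in the residuals.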
- Other details of the system and method for damage detection by cast shadows are set forth in the following detailed description and the accompanying drawings wherein like reference numerals depict like elements.
- FIG. 1 is a schematic diagram of an exemplary inspection system in accordance with various embodiments.
- FIG. 2 is a process map of an exemplary inspection system in accordance with various embodiments.
- FIG. 3 is a schematic diagram of an exemplary inspection system.
- Referring to FIG. 1, a schematic illustration of an automated inspection system 10 for detecting a defect or damage to a component 20 is shown, in accordance with various embodiments. The automated inspection system 10 may be configured to effectively perform 3D imaging of a component 20, and particularly for detection of coating imperfections. While component 20 may be any natural or manufactured object, in particular it may include a component on an aircraft, such as an engine component, such as a fan or an airfoil (e.g., a fan, blade, or vane), a combustor liner, and the like. Component 20 may be scanned or sensed by one or more sensors 12 to obtain data 14 about the component 20. Data 14 may be obtained, for example, from a 1D or 2D sensor. In various embodiments, data 14 may be obtained by rotating, panning, or positioning the sensor(s) 12 relative to the component 20 to capture data 14 from multiple viewpoint angles, perspectives, and/or depths. Further, the component 20 may be rotated or positioned relative to the sensor(s) 12 to obtain data 14 from multiple viewpoints, perspectives, and/or depths. An array of sensors 12 positioned around component 20 may be used to obtain data 14 from multiple viewpoints. Thus, either of the sensor(s) 12 or component 20 may be moved or positioned relative to the other and relative to various directions or axes of a coordinate system to obtain sensor information from various viewpoints, perspectives, and/or depths. Further, sensor 12 may scan, sense, or capture information from a single position relative to component 20.
- A sensor 12 may include a one-dimensional (1D) or 2D sensor and/or a combination and/or array thereof. Sensor 12 may be operable anywhere in the electromagnetic spectrum compatible with illumination 36 (FIG. 3). Sensor 12 may provide various characteristics of the sensed electromagnetic spectrum, including intensity, spectral characteristics, polarization, etc.
- In various embodiments, sensor 12 may include an image capture device, such as an optical device having one or more optical lenses, apertures, filters, and the like. Exemplary image capture devices include a DSLR camera, a surveillance camera, a high-dynamic-range camera, a mobile video camera, an industrial microscope, or another imaging device or image sensor capable of capturing 2D still images or video images. Sensor 12 may include two or more physically separated cameras that may view a component from different angles to obtain visual stereo image data.
- In various embodiments, sensor 12 may include a line sensor, a linear image sensor, or other 1D sensor. Further, sensor 12 may include a 2D sensor. Automated inspection system 10 may synthesize 2D or 3D information from the 1D sensor data, and inspection system 10 may extract 1D information or synthesize 3D information from the 2D sensor data. The extraction may be achieved by retaining only a subset of the data, such as keeping only the data that is in focus. The synthesizing may be achieved by tiling or mosaicking the data. Even further, sensor 12 may include a position and/or orientation sensor, such as an inertial measurement unit (IMU), that may provide position and/or orientation information about component 20 with respect to a coordinate system or another sensor 12. The position and/or orientation information may be beneficially employed in aligning 1D, 2D, or 3D information to a reference model, as discussed elsewhere herein.
- Data 14 from sensor(s) 12 may be transmitted to one or more processors 16 (e.g., computer systems having a central processing unit and memory) for recording, processing, and storing the data received from sensors 12. Processor 16 may include a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof. Processor 16 may be in communication (such as electrical communication) with sensors 12 and may be configured to receive input, such as image information, from sensors 12. Processor 16 may receive data 14 about component 20 captured and transmitted by the sensor(s) 12 via a communication channel. Upon receiving the data 14, the processor 16 may process data 14 from sensors 12 to determine if damage or defects are present on the component 20.
- In various embodiments, processor 16 may receive or construct 3D information or image data 30 corresponding to the component 20. The 3D information may be represented as one or more 2D datasets. Processor 16 may further include a reference model 22 stored, for example, in memory of processor 16. Reference model 22 may be generated from a CAD model, a 3D CAD model, and/or 3D information, such as from a 3D scan or 3D information of an original component or an undamaged component. Reference model 22 may also be generated from the current data 14. Reference model 22 may be a theoretical model or may be based on historical information about component 20. Reference model 22 may be represented as one or more 2D datasets. Reference model 22 may be adjusted and updated as component 20 and/or similar components are scanned and inspected. Thus, reference model 22 may be a learned model of a component and may include, for example, 3D information including shape and surface features of the component.
- In various embodiments, processor 16 of automated inspection system 10 may classify the damage and determine the probability of damage and/or if the damage meets or exceeds a threshold 24. Threshold 24 may be an input parameter based on reference model 22, based on user input, based on current data 14, and the like. Processor 16 may provide an output 26 to a user interface 28 indicating the status of the component 20. User interface 28 may include a display. The automated inspection system 10 may display an indication of the damage to component 20, which may include an image and/or a report. In addition to reporting any defects in the component, output 26 may also relay information about the type of defect, the location of the defect, the size of the defect, etc. If defects are found in the inspected component 20, an indicator may be displayed on user interface 28 to alert personnel or users of the defect.
- With reference to FIG. 2, a method 200 for detecting defects is provided, in accordance with various embodiments. Processor 16 may be capable of carrying out the steps of FIG. 2. One or more sensors 12 may capture data about a component 20. Method 200, performed by processor 16 of automated inspection system 10, may include receiving data from a sensor/camera (step 202). Method 200 may include generating current condition information from the sensor data (step 204); the current condition information may correspond to the component. Method 200 may include aligning the current condition information with a reference model (step 206), determining a feature dissimilarity between the current condition information and the reference model (step 208), classifying the feature dissimilarity (step 210), determining damage (step 212), and displaying an output (step 214).
- Step 202 may further comprise receiving 1D or 2D data from a sensor 12. In an exemplary embodiment, the entire forward surface of a gas turbine engine fan blade can be captured. In yet another exemplary embodiment, the entire pressure or suction surface of a turbine blade can be captured.
- Step 204 may comprise constructing a complete image of component 20 by tiling or mosaicking information from one or more sensors 12 or multiple viewpoints. Step 204 may comprise merging data 14 from multiple viewpoints. In various embodiments, step 204 may comprise merging a first data from a 1D sensor and a second data from a 2D sensor and processing the 1D and 2D data to produce 3D information 30.
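- A minimal sketch of the tiling/mosaicking idea in step 204, assuming each tile's placement in the composite is already known (e.g., from the scanning stage or the IMU mentioned earlier); the overlap averaging and all names here are illustrative assumptions.

```python
import numpy as np

def mosaic(tiles, offsets, out_shape):
    """Paste each 2D tile at its known (row, col) offset in a composite
    image; pixels covered by several tiles are averaged."""
    acc = np.zeros(out_shape, dtype=np.float64)   # running sum of tiles
    cnt = np.zeros(out_shape, dtype=np.float64)   # coverage count
    for tile, (r, c) in zip(tiles, offsets):
        h, w = tile.shape
        acc[r:r + h, c:c + w] += tile
        cnt[r:r + h, c:c + w] += 1.0
    return acc / np.maximum(cnt, 1.0)             # uncovered pixels stay 0
```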
- Step 206 may comprise aligning 2D current condition information with a reference model 22.
- Step 206 may further comprise aligning the 3D information with a reference model 22, such as a 3D point cloud, by an iterative closest point (ICP) algorithm modified to suppress misalignment from damaged areas of the component 20. The alignment may be performed by an optimization method, i.e., minimizing an objective function over a dataset, which may include mathematical terms in the ICP objective function or constraints to reject features or damage as outliers. The alignment may also be performed by a 3D modification to a random sample consensus (RANSAC) algorithm, the scale-invariant feature transform (SIFT), speeded-up robust features (SURF), or another suitable alignment method. Step 206 may further include comparing the 3D information 30 to the reference model 22 to align the features from the 3D information 30 with the reference model 22 by identifying affine and/or scale-invariant features, diffeomorphic alignment/scale cascaded alignment, and the like. Step 206 may further include registering the features.
- Step 208 may further comprise computing features, such as surface and shape characteristics, of the component 20 by methods to identify and extract features. For example, processor 16 may determine differences or dissimilarities between the information 30 and the reference model 22. Step 208 may further comprise identifying features and determining differences or dissimilarities between the identified features in the information 30 and the reference model 22 using a statistical algorithm such as a histogram of gradients (HoG), a histogram of oriented gradients (HoOG), a histogram of gradients in 3D (HoG3D), a histogram of oriented gradients in 3D (HoOG3D), 3D Zernike moments, or other algorithms. In a HoOG3D method, processor 16 may define the orientation of edges and surfaces of information 30 by dividing the information 30 into portions or cells and assigning to each cell a value, where each point or pixel contributes a weighted orientation or gradient to the cell value. By grouping cells and normalizing the cell values, a histogram of the gradients can be produced and used to extract or estimate information about an edge or a surface of the component 20. Thus, the features of the information 30, such as surface and edge shapes, may be identified. Other algorithms, such as 3D Zernike moments, may similarly be used to recognize features in 3D information 30 by using orthogonal moments to reconstruct, for example, surface and edge geometry of component 20. Step 208 may further comprise determining differences or dissimilarities between the identified features in the information 30 and the reference model 22. The dissimilarities may be expressed, for example, by the distance between two points or vectors. Other approaches to expressing dissimilarities may include computing mathematical models of information 30 and reference model 22 in a common basis (comprising modes) and expressing the dissimilarity as a difference of coefficients of the basis functions (modes). Differences or dissimilarities between the information 30 and the reference model 22 may represent various types of damage to component 20.
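- As one hedged example of a gradient-histogram dissimilarity for step 208, the sketch below compares HoG descriptors of an inspection image and a co-registered reference rendering with a Euclidean distance. scikit-image's `hog` and the cell sizes are assumptions; the 3D variants named above (HoG3D/HoOG3D, Zernike moments) would follow the same pattern on volumetric data.

```python
import numpy as np
from skimage.feature import hog  # pip install scikit-image

def hog_dissimilarity(inspected, reference, cell=(16, 16)):
    """Scalar dissimilarity between two same-sized grayscale images:
    the distance between their histogram-of-oriented-gradients vectors."""
    params = dict(orientations=9, pixels_per_cell=cell,
                  cells_per_block=(2, 2), feature_vector=True)
    f_new = hog(inspected, **params)   # current-condition descriptor
    f_ref = hog(reference, **params)   # reference-model descriptor
    return float(np.linalg.norm(f_new - f_ref))
```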
- Step 210 may further comprise classifying the feature dissimilarities identified in step 208. The automated inspection system 10 may include categories of damage or defect types for component 20. For example, damage may be categorized into classes such as warping, stretching, edge defects, erosion, nicks, cracks, and/or cuts. Step 210 may further comprise identifying the damage type based on the dissimilarities between the information 30 and the reference model 22. Step 210 may further comprise classifying the feature dissimilarities into categories of, for example, systemic damage or localized damage. Systemic damage may include warping or stretching of component 20. Localized damage may include edge defects, erosion, nicks, cracks, or cuts on a surface of component 20. Classifying the feature dissimilarities may be accomplished by, for example, a support vector machine (SVM), a decision tree, a deep neural network, a recurrent ensemble learning machine, or another classification method.
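- A minimal classification sketch for step 210 using one of the methods named above (an SVM); the feature matrix, label scheme, and scikit-learn pipeline are illustrative assumptions, shown here with random stand-in data.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def train_damage_classifier(features, labels):
    """features: (n_samples, n_features) dissimilarity descriptors from
    step 208; labels: damage classes (e.g. 0=none, 1=nick, 2=crack).
    probability=True lets step 212 consume per-class probabilities."""
    clf = make_pipeline(StandardScaler(),
                        SVC(kernel="rbf", probability=True))
    return clf.fit(features, labels)

# Illustrative use with random stand-in data:
rng = np.random.default_rng(0)
clf = train_damage_classifier(rng.normal(size=(40, 8)),
                              rng.integers(0, 3, size=40))
probs = clf.predict_proba(rng.normal(size=(1, 8)))  # feeds step 212
```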
- Step 212 may further comprise determining whether the feature difference or dissimilarity represents damage to
component 20. Step 212 may comprise determining a probability of damage represented by the feature dissimilarity and/or classification. Step 212 may comprise determining damage by comparing the probability of damage to a threshold. Damage may be determined if the probability meets or exceeds a threshold. Theautomated inspection system 10 may determine if the damage is acceptable or unacceptable, and may determine if thecomponent 20 should be accepted or rejected, wherein a rejected component would indicate that the component should be repaired or replaced. - Various types of damage such as missing material, cracks, delamination, creep, spallation, and the like can be detected automatically by using a deep learning classifier trained from available data, such as a library of user characterized damage examples, by using statistical estimation algorithms, by image or video classification algorithms, and the like. Deep learning is the process of training or adjusting the weights of a deep neural network. In an embodiment the deep neural network is a deep convolutional neural network. Deep convolutional neural networks are trained by presenting an error map or partial error map to an input layer and, a damage/no-damage label (optionally, a descriptive label, e.g., missing material, crack, spallation, and the like), to an output layer. The training of a deep convolutional network proceeds layer-wise and does not require a label until the output layer is trained. The weights of the deep network's layers are adapted, typically by a stochastic gradient descent algorithm, to produce a correct classification. The deep learning training may use only partially labeled data, only fully labeled data, or only implicitly labeled data, or may use unlabeled data for initial or partial training with only a final training on labeled data.
- In another embodiment statistical estimation or regression techniques to determine if damage is present in the error map. Statistical estimation regression techniques can include principal components analysis (PCA), robust PCA (RPCA), support vector machines (SVM), linear discriminant analysis (LDA), expectation maximization (EM), Boosting, Dictionary Matching, maximum likelihood (ML) estimation, maximum a priori (MAP) estimation, least squares (LS) estimation, non-linear LS (NNLS) estimation, and Bayesian Estimation based on the error map.
- Step 214 may further comprise transmitting, displaying, or storing the 2D or 3D information, feature differences or dissimilarities, classification of the feature differences or dissimilarities, a damage report, and/or a determination or recommendation that the
component 20 be accepted or rejected. Step 214 may further comprise displaying an image, a model, a combined image and model, a 2D perspective from a model, and the like, of the damaged component for further evaluation by a user or by a subsequent automated system. - Referring also to
- Referring also to FIG. 3, an exemplary automated inspection system 10 can be seen. In another exemplary embodiment, the system 10 can include an automated inspection system for detection of coating imperfections based on the method of "shape from shadows" for applications such as gas turbine engine blade coating inspection. The component 20 can be a blade of a fan, a blade of a compressor, a blade of a turbine, a combustor liner, or another component with a surface coating. The exemplary embodiment shown in FIG. 3 includes a component 20 with a surface coating 32. The sensor 12 is shown as a camera 12 configured to capture images of the surface coating 32. The camera 12 can be a high dynamic range camera or a multi-polarization camera to capture the necessary image data 14 of the surface 32.
- An array of controllable light(s) 36 is mounted at low oblique angles 38 around the component 20. The light(s) 36 may be operable anywhere in the electromagnetic spectrum compatible with sensor(s) 12. In particular, light(s) 36 and/or sensor(s) 12 may operate at any one frequency in the electromagnetic spectrum (monochromatic), one band of frequencies (polychromatic), or one or more combinations of the foregoing. Light(s) 36 and/or sensor(s) 12 may employ filters (not shown) to achieve operation in the desired frequencies and/or bands. In one non-limiting embodiment, light(s) 36 and sensor(s) 12 operate at a frequency or frequencies outside the spectrum of ambient illumination, such that ambient illumination does not interfere with light(s) 36 and sensor(s) 12. The array of light(s) 36 is configured to illuminate the component surface 32 and cast at least one shadow 40 to be detected as a feature dissimilarity, or simply feature 42, on the component surface 32. The feature 42 can result from a shallow surface defect, damage, a crack, and the like formed on the surface 32. The processor 16 is coupled to the imaging device 12 and the array of light(s) 36. The array of light(s) 36 is arranged to illuminate the component surface 32 from multiple directions. The light(s) 36 can be controlled independently, such that the light(s) 36 cast the shadows 40. The cast shadows 40 represent 3D information about the surface 32.
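- The sketch below illustrates one standard way to recover the cast-shadow information from independently triggered lights, in the spirit of multi-flash imaging: the pixel-wise maximum over the differently lit images approximates a shadow-free view, and dividing each image by it exposes the shadows each light casts. The ratio formulation and threshold are assumptions, not the patent's stated algorithm.

```python
import numpy as np

def per_light_shadow_masks(flash_images, shadow_thresh=0.5):
    """flash_images: same-sized grayscale images of the surface, each lit
    by one oblique light 36. Ratios near 1 indicate directly lit coating;
    values well below 1 mark pixels shadowed under that light."""
    stack = np.stack([im.astype(np.float64) for im in flash_images])
    max_composite = stack.max(axis=0) + 1e-9   # approx. shadow-free image
    ratios = stack / max_composite             # one ratio image per light
    return ratios < shadow_thresh              # boolean shadow mask per light
```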
- The processor 16 can be configured to determine damage to the surface coating 32 based on image or video analytics. The processor 16 can be configured to automatically report damage and archive the damage data for trending and condition-based maintenance.
- The processor 16 can be configured to receive the data for the surface 32 of the component 20 from the imaging device 12. The processor 16 can be configured to perform operations such as controlling the lighting array 36 to cast the shadows 40. The processor 16 can receive image data 30 for the component 20 from the imaging device 12. The processor 16 can determine a feature dissimilarity 42 between the image data 30 and a reference model 22. The processor 16 can classify the feature dissimilarity 42 and determine a probability that the feature dissimilarity 42 indicates damage to the component. The processor 16 can include operations to remove specular reflections. A specular reflection is a type of surface reflectance often described as a mirror-like reflection of light from the surface; in specular reflection, the incident light is reflected into a single outgoing direction. The processor 16 can include operations to control at least one of a position of at least one light in the array 36 and an orientation of at least one light in the array 36, with respect to the component surface 32. The processor 16 can include operations to illuminate each of the light(s) in the array independently. The processor 16 can include operations to illuminate the component surface from multiple directions. The processor 16 can include operations to compute a surface model from the image data to form a proxy model. The processor 16 can include operations to determine a feature dissimilarity between the image data and a proxy model.
- There has been provided a system and method for damage detection by cast shadows. While the system and method for damage detection by cast shadows has been described in the context of specific embodiments thereof, other unforeseen alternatives, modifications, and variations may become apparent to those skilled in the art having read the foregoing description. Accordingly, it is intended to embrace those alternatives, modifications, and variations which fall within the broad scope of the appended claims.
Claims (18)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/971,227 US10473593B1 (en) | 2018-05-04 | 2018-05-04 | System and method for damage detection by cast shadows |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/971,227 US10473593B1 (en) | 2018-05-04 | 2018-05-04 | System and method for damage detection by cast shadows |
Publications (2)
Publication Number | Publication Date |
---|---|
US20190339206A1 (en) | 2019-11-07 |
US10473593B1 US10473593B1 (en) | 2019-11-12 |
Family
ID=68385026
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/971,227 Active US10473593B1 (en) | 2018-05-04 | 2018-05-04 | System and method for damage detection by cast shadows |
Country Status (1)
Country | Link |
---|---|
US (1) | US10473593B1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10670539B1 (en) * | 2018-12-11 | 2020-06-02 | General Electric Company | Coating quality inspection system and method |
Family Cites Families (157)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5283641A (en) * | 1954-12-24 | 1994-02-01 | Lemelson Jerome H | Apparatus and methods for automated analysis |
US3804397A (en) | 1971-08-19 | 1974-04-16 | Gco | Automatic positioning vacuum cup |
JPS5677704A (en) * | 1979-11-30 | 1981-06-26 | Hitachi Ltd | Inspection system for surface defect of substance |
US4402053A (en) | 1980-09-25 | 1983-08-30 | Board Of Regents For Education For The State Of Rhode Island | Estimating workpiece pose using the feature points method |
US4873651A (en) * | 1987-04-21 | 1989-10-10 | Case Western Reserve University | Method and apparatus for reconstructing three-dimensional surfaces from two-dimensional images |
US5119678A (en) | 1989-12-26 | 1992-06-09 | General Electric Company | Pulse echo and through transmission ultra-sound |
US5064291A (en) * | 1990-04-03 | 1991-11-12 | Hughes Aircraft Company | Method and apparatus for inspection of solder joints utilizing shape determination from shading |
JP2583146B2 (en) * | 1990-05-28 | 1997-02-19 | 鐘紡株式会社 | Top cleanliness inspection method |
US5345514A (en) | 1991-09-16 | 1994-09-06 | General Electric Company | Method for inspecting components having complex geometric shapes |
US6462813B1 (en) | 1996-04-12 | 2002-10-08 | Perceptron, Inc. | Surface defect inspection system and method |
DE19710743C2 (en) | 1997-03-14 | 1999-02-25 | Siemens Ag | Process for the non-destructive detection of cracks and for measuring the crack depths in turbine blades |
US5963328A (en) * | 1997-08-28 | 1999-10-05 | Nissan Motor Co., Ltd. | Surface inspecting apparatus |
GB9805861D0 (en) | 1998-03-20 | 1998-05-13 | Rolls Royce Plc | A method and an apparatus for inspecting articles |
GB9806322D0 (en) | 1998-03-26 | 1998-05-20 | Rolls Royce Plc | Interpretation of thermal paint |
US6177682B1 (en) * | 1998-10-21 | 2001-01-23 | Novacam Technologies Inc. | Inspection of ball grid arrays (BGA) by using shadow images of the solder balls |
WO2000037926A1 (en) * | 1998-12-21 | 2000-06-29 | Hottinger Maschinenbau Gmbh | Method and device for object recognition |
US6593574B2 (en) | 1999-09-16 | 2003-07-15 | Wayne State University | Hand-held sound source gun for infrared imaging of sub-surface defects in materials |
CA2382675C (en) | 1999-09-16 | 2009-01-06 | Wayne State University | Miniaturized contactless sonic ir device for remote non-destructive inspection |
US7064332B2 (en) | 1999-09-16 | 2006-06-20 | Wayne State University | Hand-held sound source for sonic infrared imaging of defects in materials |
US7724925B2 (en) | 1999-12-02 | 2010-05-25 | Thermal Wave Imaging, Inc. | System for generating thermographic images using thermographic signal reconstruction |
US7690840B2 (en) | 1999-12-22 | 2010-04-06 | Siemens Energy, Inc. | Method and apparatus for measuring on-line failure of turbine thermal barrier coatings |
US20020167660A1 (en) * | 2001-05-09 | 2002-11-14 | Testship Automatic Test Solutions Ltd. | Illumination for integrated circuit board inspection |
US20030107646A1 (en) | 2001-08-17 | 2003-06-12 | Byoungyi Yoon | Method and system for adjusting display angles of a stereoscopic image based on a camera location |
US6804622B2 (en) | 2001-09-04 | 2004-10-12 | General Electric Company | Method and apparatus for non-destructive thermal inspection |
US7122801B2 (en) | 2002-08-28 | 2006-10-17 | Wayne State University | System and method for generating chaotic sound for sonic infrared imaging of defects in materials |
WO2004020993A2 (en) | 2002-08-28 | 2004-03-11 | Wayne State University | System for infrared imaging by inducing acoustic chaos |
US7010987B2 (en) | 2002-10-31 | 2006-03-14 | Alstom (Switzerland) Ltd | Non-destructive method of detecting defects in braze-repaired cracks |
US6838670B2 (en) | 2002-11-12 | 2005-01-04 | Siemens Westinghouse Power Corporation | Methods and system for ultrasonic thermographic non-destructive examination for enhanced defect determination |
US7075084B2 (en) | 2002-12-20 | 2006-07-11 | The Boeing Company | Ultrasonic thermography inspection method and apparatus |
US6907358B2 (en) | 2003-01-30 | 2005-06-14 | General Electric Company | Eddy current inspection method |
US7738725B2 (en) | 2003-03-19 | 2010-06-15 | Mitsubishi Electric Research Laboratories, Inc. | Stylized rendering using a multi-flash camera |
US7184073B2 (en) | 2003-04-11 | 2007-02-27 | Satyam Computer Services Limited Of Mayfair Centre | System and method for warning drivers based on road curvature |
US7064330B2 (en) | 2003-04-30 | 2006-06-20 | United Technologies Corporation | Infrared defect detection via broad-band acoustics |
US20040240600A1 (en) | 2003-05-30 | 2004-12-02 | Siemens Westinghouse Power Corporation | Positron annihilation for inspection of land based industrial gas turbine components |
US7162070B2 (en) * | 2003-06-06 | 2007-01-09 | Acushnet Company | Use of patterned, structured light to detect and measure surface defects on a golf ball |
DE10346481B4 (en) * | 2003-10-02 | 2013-01-17 | Daimler Ag | Three-dimensional reconstruction of surface profiles |
US7026811B2 (en) | 2004-03-19 | 2006-04-11 | General Electric Company | Methods and apparatus for eddy current inspection of metallic posts |
US7190162B2 (en) | 2004-07-23 | 2007-03-13 | General Electric Company | Methods and apparatus for inspecting a component |
US7489811B2 (en) | 2004-10-08 | 2009-02-10 | Siemens Energy, Inc. | Method of visually inspecting turbine blades and optical inspection system therefor |
US7164146B2 (en) | 2004-10-22 | 2007-01-16 | Northrop Grumman Corporation | System for detecting structural defects and features utilizing blackbody self-illumination |
JP4023494B2 (en) | 2005-01-18 | 2007-12-19 | ソニー株式会社 | IMAGING DEVICE, IMAGING METHOD, AND IMAGING DEVICE DESIGNING METHOD |
US7240556B2 (en) | 2005-03-14 | 2007-07-10 | The Boeing Company | Angle beam shear wave through-transmission ultrasonic testing apparatus and method |
US7233867B2 (en) | 2005-04-06 | 2007-06-19 | General Electric Company | Eddy current inspection method and system |
US7313961B2 (en) | 2005-04-26 | 2008-01-01 | General Electric Company | Method and apparatus for inspecting a component |
US7272529B2 (en) | 2005-07-08 | 2007-09-18 | Honeywell International, Inc. | Dual wall turbine blade ultrasonic wall thickness measurement technique |
US7367236B2 (en) | 2005-07-21 | 2008-05-06 | The Boeing Company | Non-destructive inspection system and associated method |
US8449176B2 (en) | 2005-08-01 | 2013-05-28 | Thermal Wave Imaging, Inc. | Automated binary processing of thermographic sequence data |
US7415882B2 (en) | 2005-12-19 | 2008-08-26 | The Boeing Company | Methods and systems for inspection of composite assemblies |
US7689030B2 (en) | 2005-12-21 | 2010-03-30 | General Electric Company | Methods and apparatus for testing a component |
US7602963B2 (en) | 2006-01-10 | 2009-10-13 | General Electric Company | Method and apparatus for finding anomalies in finished parts and/or assemblies |
US7732768B1 (en) | 2006-03-02 | 2010-06-08 | Thermoteknix Systems Ltd. | Image alignment and trend analysis features for an infrared imaging system |
US8477154B2 (en) | 2006-03-20 | 2013-07-02 | Siemens Energy, Inc. | Method and system for interactive virtual inspection of modeled objects |
US8244025B2 (en) | 2006-03-20 | 2012-08-14 | Siemens Energy, Inc. | Method of coalescing information about inspected objects |
US7716987B2 (en) | 2006-07-31 | 2010-05-18 | University Of Dayton | Non-contact thermo-elastic property measurement and imaging system for quantitative nondestructive evaluation of materials |
US20090000382A1 (en) | 2006-07-31 | 2009-01-01 | University Of Dayton | Non-contact acousto-thermal method and apparatus for detecting incipient damage in materials |
US7549339B2 (en) | 2006-09-05 | 2009-06-23 | United Technologies Corporation | Inverse thermal acoustic imaging part inspection |
US7966883B2 (en) | 2006-12-06 | 2011-06-28 | Lockheed Martin Corporation | Non-destructive inspection using laser-ultrasound and infrared thermography |
US7447598B2 (en) | 2007-01-30 | 2008-11-04 | The Boeing Company | Methods and systems for automatically assessing and reporting structural health |
US7901185B2 (en) | 2007-02-21 | 2011-03-08 | United Technologies Corporation | Variable rotor blade for gas turbine engine |
US7757558B2 (en) | 2007-03-19 | 2010-07-20 | The Boeing Company | Method and apparatus for inspecting a workpiece with angularly offset ultrasonic signals |
US20090010507A1 (en) | 2007-07-02 | 2009-01-08 | Zheng Jason Geng | System and method for generating a 3d model of anatomical structure using a plurality of 2d images |
US8208711B2 (en) | 2007-09-07 | 2012-06-26 | General Electric Company | Method for automatic identification of defects in turbine engine blades |
US20090252987A1 (en) | 2008-04-02 | 2009-10-08 | United Technologies Corporation | Inspection and repair process using thermal acoustic imaging |
US7823451B2 (en) | 2008-05-06 | 2010-11-02 | The Boeing Company | Pulse echo/through transmission ultrasonic testing |
US8131107B2 (en) | 2008-05-12 | 2012-03-06 | General Electric Company | Method and system for identifying defects in NDT image data |
JP5253066B2 (en) | 2008-09-24 | 2013-07-31 | キヤノン株式会社 | Position and orientation measurement apparatus and method |
GB0903232D0 (en) | 2009-02-25 | 2009-04-08 | Saipem Spa | A method for testing pipeline welds |
US8108168B2 (en) | 2009-03-12 | 2012-01-31 | Etegent Technologies, Ltd. | Managing non-destructive evaluation data |
US8221825B2 (en) | 2009-03-30 | 2012-07-17 | Alstom Technology Ltd. | Comprehensive method for local application and local repair of thermal barrier coatings |
FR2949220B1 (en) | 2009-08-21 | 2011-09-09 | Snecma | METHOD AND SYSTEM FOR DETECTING THE INGESTION OF AN OBJECT BY AN AIRCRAFT TURBOJET DURING A MISSION |
US8440974B2 (en) | 2009-09-16 | 2013-05-14 | Siemens Energy, Inc. | System and method for analysis of ultrasonic power coupling during acoustic thermography |
US8204294B2 (en) | 2009-11-25 | 2012-06-19 | Toyota Motor Engineering & Manufacturing North America, Inc. | Systems and methods for detecting defects in coatings utilizing color-based thermal mismatch |
US9080453B2 (en) | 2010-03-17 | 2015-07-14 | Thermal Wave Imaging, Inc. | Thermographic detection of internal passageway blockages |
US9479759B2 (en) | 2010-03-29 | 2016-10-25 | Forstgarten International Holding Gmbh | Optical stereo device and autofocus method therefor |
US9151698B2 (en) | 2010-04-23 | 2015-10-06 | Siemens Aktiengesellschaft | Testing system for examining turbine blades |
US9476842B2 (en) | 2010-05-03 | 2016-10-25 | United Technologies Corporation | On-the-fly dimensional imaging inspection |
CA2795532A1 (en) | 2010-05-04 | 2011-11-10 | Creaform Inc. | Object inspection with referenced volumetric analysis sensor |
FR2960642B1 (en) | 2010-05-28 | 2012-07-13 | Snecma | NON-DESTRUCTIVE CONTROL METHOD AND DEVICE FOR IMPLEMENTING THE METHOD |
US8692887B2 (en) | 2010-08-27 | 2014-04-08 | General Electric Company | Thermal imaging method and apparatus for evaluating coatings |
FR2965353B1 (en) | 2010-09-28 | 2013-08-23 | Astrium Sas | METHOD AND DEVICE FOR NON-DESTRUCTIVE CONTROL OF WINDMILL BLADES |
US8983794B1 (en) | 2010-10-04 | 2015-03-17 | The Boeing Company | Methods and systems for non-destructive composite evaluation and repair verification |
US9497388B2 (en) | 2010-12-17 | 2016-11-15 | Pelco, Inc. | Zooming factor computation |
US8431917B2 (en) | 2010-12-22 | 2013-04-30 | General Electric Company | System and method for rotary machine online monitoring |
WO2012122481A1 (en) | 2011-03-09 | 2012-09-13 | Rolls-Royce Corporation | Intelligent airfoil component grain defect inspection |
US8897502B2 (en) | 2011-04-29 | 2014-11-25 | Aptina Imaging Corporation | Calibration for stereoscopic capture system |
US9146205B2 (en) | 2011-05-10 | 2015-09-29 | Areva Inc. | Vibrothermographic weld inspections |
US9482585B2 (en) | 2011-05-16 | 2016-11-01 | General Electric Company | Method and system for multi-functional embedded sensors |
GB201109533D0 (en) | 2011-06-08 | 2011-07-20 | Rolls Royce Plc | Temperature indicating paint |
US8713998B2 (en) | 2011-06-14 | 2014-05-06 | The Boeing Company | Autonomous non-destructive evaluation system for aircraft structures |
US9488592B1 (en) | 2011-09-28 | 2016-11-08 | Kurion, Inc. | Automatic detection of defects in composite structures using NDT methods |
US8792705B2 (en) | 2011-11-03 | 2014-07-29 | United Technologies Corporation | System and method for automated defect detection utilizing prior data |
US8781209B2 (en) | 2011-11-03 | 2014-07-15 | United Technologies Corporation | System and method for data-driven automated borescope inspection |
US8744166B2 (en) | 2011-11-03 | 2014-06-03 | United Technologies Corporation | System and method for multiple simultaneous automated defect detection |
US8761490B2 (en) | 2011-11-03 | 2014-06-24 | United Technologies Corporation | System and method for automated borescope inspection user interface |
US8781210B2 (en) | 2011-11-09 | 2014-07-15 | United Technologies Corporation | Method and system for automated defect detection |
US9471057B2 (en) | 2011-11-09 | 2016-10-18 | United Technologies Corporation | Method and system for position control based on automated defect detection feedback |
SG191452A1 (en) | 2011-12-30 | 2013-07-31 | Singapore Technologies Dynamics Pte Ltd | Automatic calibration method and apparatus |
US9116071B2 (en) | 2012-01-31 | 2015-08-25 | Siemens Energy, Inc. | System and method for visual inspection and 3D white light scanning of off-line industrial gas turbines and other power generation machinery |
US9154743B2 (en) | 2012-01-31 | 2015-10-06 | Siemens Energy, Inc. | System and method for optical inspection of off-line industrial gas turbines and other power generation machinery while in turning gear mode |
US10697941B2 (en) | 2012-03-20 | 2020-06-30 | Baylor University | Method and system of non-destructive testing for composites |
US8913825B2 (en) | 2012-07-16 | 2014-12-16 | Mitsubishi Electric Research Laboratories, Inc. | Specular edge extraction using multi-flash imaging |
TWI460394B (en) | 2012-07-20 | 2014-11-11 | Test Research Inc | Three-dimensional image measuring apparatus |
US8855404B2 (en) | 2012-08-27 | 2014-10-07 | The Boeing Company | Methods and systems for inspecting a workpiece |
AU2013330961A1 (en) | 2012-10-19 | 2015-04-30 | Resodyn Corporation | Methods and systems for detecting flaws in an object |
US9251582B2 (en) | 2012-12-31 | 2016-02-02 | General Electric Company | Methods and systems for enhanced automated visual inspection of a physical asset |
US9285296B2 (en) | 2013-01-02 | 2016-03-15 | The Boeing Company | Systems and methods for stand-off inspection of aircraft structures |
JP5800434B2 (en) | 2013-01-11 | 2015-10-28 | Ckd株式会社 | Inspection system monitoring system |
US20140198185A1 (en) | 2013-01-17 | 2014-07-17 | Cyberoptics Corporation | Multi-camera sensor for three-dimensional imaging of a circuit board |
GB201302815D0 (en) | 2013-02-19 | 2013-04-03 | Rolls Royce Plc | Determining the deterioration of a gas turbine engine in use |
CN105453243B (en) | 2013-03-15 | 2018-05-22 | 鲁道夫技术公司 | Optoacoustic substrate assessment system and method |
US9453500B2 (en) | 2013-03-15 | 2016-09-27 | Digital Wind Systems, Inc. | Method and apparatus for remote feature measurement in distorted images |
FR3006447B1 (en) | 2013-05-30 | 2015-05-29 | Snecma | IMPROVED ULTRASOUND TRANSMISSION INSPECTION METHOD |
CA2820732C (en) | 2013-06-27 | 2017-11-21 | Systemes Tecscan Inc. | Method and apparatus for scanning an object |
US9244004B2 (en) | 2013-08-08 | 2016-01-26 | Stichting Sron—Netherlands Institute For Space Research | Method and system for inspection of composite assemblies using terahertz radiation |
US10373301B2 (en) | 2013-09-25 | 2019-08-06 | Sikorsky Aircraft Corporation | Structural hot spot and critical location monitoring system and method |
US10048230B2 (en) | 2013-11-14 | 2018-08-14 | The Boeing Company | Structural bond inspection |
US20150138342A1 (en) | 2013-11-19 | 2015-05-21 | United Technologies Corporation | System and method to determine visible damage |
US20150185128A1 (en) | 2013-12-26 | 2015-07-02 | The Boeing Company | Detection and Assessment of Damage to Composite Structure |
US9300865B2 (en) | 2014-01-24 | 2016-03-29 | Goodrich Corporation | Random imaging |
US9476798B2 (en) | 2014-02-21 | 2016-10-25 | General Electric Company | On-line monitoring of hot gas path components of a gas turbine |
US9734568B2 (en) | 2014-02-25 | 2017-08-15 | Kla-Tencor Corporation | Automated inline inspection and metrology using shadow-gram images |
JP6440367B2 (en) | 2014-02-27 | 2018-12-19 | 三菱重工業株式会社 | Wind turbine blade damage detection method and wind turbine |
US9305345B2 (en) | 2014-04-24 | 2016-04-05 | General Electric Company | System and method for image based inspection of an object |
US9483820B2 (en) | 2014-05-20 | 2016-11-01 | General Electric Company | Method and system for detecting a damaged component of a machine |
US11051000B2 (en) | 2014-07-14 | 2021-06-29 | Mitsubishi Electric Research Laboratories, Inc. | Method for calibrating cameras with non-overlapping views |
US9467628B2 (en) | 2014-08-26 | 2016-10-11 | Sensors Unlimited, Inc. | High dynamic range image sensor |
US20170234837A1 (en) | 2014-10-24 | 2017-08-17 | Renishaw Plc | Acoustic apparatus and method |
US11060979B2 (en) | 2014-12-19 | 2021-07-13 | General Electric Company | System and method for engine inspection |
EP3243166B1 (en) | 2015-01-06 | 2020-07-08 | Sikorsky Aircraft Corporation | Structural masking for progressive health monitoring |
JP2016139467A (en) * | 2015-01-26 | 2016-08-04 | 株式会社日立ハイテクノロジーズ | Sample observation method and sample observation device |
WO2016123508A1 (en) | 2015-01-29 | 2016-08-04 | The Regents Of The University Of California | Patterned-illumination systems adopting a computational illumination |
US9800798B2 (en) | 2015-02-13 | 2017-10-24 | Qualcomm Incorporated | Systems and methods for power optimization for imaging devices with dual cameras |
US9808933B2 (en) | 2015-04-03 | 2017-11-07 | GM Global Technology Operations LLC | Robotic system with reconfigurable end-effector assembly |
US10504218B2 (en) | 2015-04-21 | 2019-12-10 | United Technologies Corporation | Method and system for automated inspection utilizing a multi-modal database |
EP3288709B1 (en) | 2015-04-29 | 2023-03-08 | Magna International Inc. | Flexible fixturing |
ITUB20152385A1 | 2015-07-22 | 2017-01-22 | Alenia Aermacchi Spa | METHOD AND NON-DESTRUCTIVE THERMOGRAPHIC INSPECTION SYSTEM FOR DETECTION AND MEASUREMENT OF DEFECTS IN COMPOSITE MATERIAL STRUCTURES |
US10502719B2 (en) | 2015-08-21 | 2019-12-10 | The Boeing Company | Analysis of a structure modeled with inconsistencies mapped thereon |
US9838583B2 (en) * | 2015-09-21 | 2017-12-05 | Siemens Energy, Inc. | Method and apparatus for verifying lighting setup used for visual inspection |
US10191479B2 (en) | 2015-09-29 | 2019-01-29 | General Electric Company | Methods and systems for network-based detection of component wear |
GB2544058A (en) | 2015-11-03 | 2017-05-10 | Rolls Royce Plc | Inspection apparatus and methods of inspecting gas turbine engines |
US10197473B2 (en) | 2015-12-09 | 2019-02-05 | General Electric Company | System and method for performing a visual inspection of a gas turbine engine |
US9785919B2 (en) | 2015-12-10 | 2017-10-10 | General Electric Company | Automatic classification of aircraft component distress |
GB2545271A (en) | 2015-12-11 | 2017-06-14 | Airbus Operations Ltd | Determining physical characteristics of a structure |
US10126272B2 (en) | 2015-12-29 | 2018-11-13 | General Electric Company | Systems and methods for ultrasonic inspection of turbine components |
US9594059B1 (en) | 2015-12-31 | 2017-03-14 | The Boeing Company | System and method for automated bond testing |
US9519844B1 (en) | 2016-01-22 | 2016-12-13 | The Boeing Company | Infrared thermographic methods for wrinkle characterization in composite structures |
CN105741348B (en) | 2016-01-28 | 2018-06-12 | 北京航空航天大学 | A kind of threedimensional model edit methods of structure adaptive |
US9928592B2 (en) | 2016-03-14 | 2018-03-27 | Sensors Unlimited, Inc. | Image-based signal detection for object metrology |
US20170262985A1 (en) | 2016-03-14 | 2017-09-14 | Sensors Unlimited, Inc. | Systems and methods for image-based quantification for allergen skin reaction |
US20170262979A1 (en) | 2016-03-14 | 2017-09-14 | Sensors Unlimited, Inc. | Image correction and metrology for object quantification |
US10007971B2 (en) | 2016-03-14 | 2018-06-26 | Sensors Unlimited, Inc. | Systems and methods for user machine interaction for image-based metrology |
US20170262977A1 (en) | 2016-03-14 | 2017-09-14 | Sensors Unlimited, Inc. | Systems and methods for image metrology and user interfaces |
US20170258391A1 (en) | 2016-03-14 | 2017-09-14 | Sensors Unlimited, Inc. | Multimodal fusion for object detection |
US10068326B2 (en) | 2016-03-18 | 2018-09-04 | Siemens Energy, Inc. | System and method for enhancing visual inspection of an object |
US11027332B2 (en) | 2016-04-15 | 2021-06-08 | United States Of America As Represented By The Administrator Of Nasa | System and method for in-situ characterization and inspection of additive manufacturing deposits using transient infrared thermography |
US9950815B2 (en) * | 2016-07-01 | 2018-04-24 | United Technologies Corporation | Systems and methods for detecting damage |
KR102560780B1 (en) | 2016-10-05 | 2023-07-28 | 삼성전자주식회사 | Image processing system including plurality of image sensors and electronic device including thereof |
- 2018-05-04: US US15/971,227, patent US10473593B1 (en), status: Active
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11079285B2 (en) | 2018-05-04 | 2021-08-03 | Raytheon Technologies Corporation | Automated analysis of thermally-sensitive coating and method therefor |
US11880904B2 (en) | 2018-05-04 | 2024-01-23 | Rtx Corporation | System and method for robotic inspection |
US11268881B2 (en) | 2018-05-04 | 2022-03-08 | Raytheon Technologies Corporation | System and method for fan blade rotor disk and gear inspection |
US10902664B2 (en) | 2018-05-04 | 2021-01-26 | Raytheon Technologies Corporation | System and method for detecting damage using two-dimensional imagery and three-dimensional model |
US10914191B2 (en) | 2018-05-04 | 2021-02-09 | Raytheon Technologies Corporation | System and method for in situ airfoil inspection |
US10928362B2 (en) | 2018-05-04 | 2021-02-23 | Raytheon Technologies Corporation | Nondestructive inspection using dual pulse-echo ultrasonics and method therefor |
US10943320B2 (en) | 2018-05-04 | 2021-03-09 | Raytheon Technologies Corporation | System and method for robotic inspection |
CN111080627A (en) * | 2019-12-20 | 2020-04-28 | 南京航空航天大学 | 2D +3D large airplane appearance defect detection and analysis method based on deep learning |
CN113125440A (en) * | 2019-12-30 | 2021-07-16 | 纬创资通股份有限公司 | Method and device for judging object defects |
CN111640112A (en) * | 2020-06-11 | 2020-09-08 | 云从科技集团股份有限公司 | Image detection method, system, platform, device, medium, and image processing apparatus |
CN111951234B (en) * | 2020-07-27 | 2021-07-30 | 上海微亿智造科技有限公司 | Model detection method |
CN111951234A (en) * | 2020-07-27 | 2020-11-17 | 上海微亿智造科技有限公司 | Model detection method |
CN111968084A (en) * | 2020-08-08 | 2020-11-20 | 西北工业大学 | Method for quickly and accurately identifying defects of aero-engine blade based on artificial intelligence |
DE102021124153A1 (en) | 2021-09-17 | 2023-03-23 | Homag Plattenaufteiltechnik Gmbh | Method and device for checking the quality of an edge of a panel-shaped workpiece |
Also Published As
Publication number | Publication date |
---|---|
US10473593B1 (en) | 2019-11-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10473593B1 (en) | System and method for damage detection by cast shadows | |
US10914191B2 (en) | System and method for in situ airfoil inspection | |
US9950815B2 (en) | Systems and methods for detecting damage | |
US11268881B2 (en) | System and method for fan blade rotor disk and gear inspection | |
US20190339207A1 (en) | System and method for flexibly holding workpiece and reporting workpiece location | |
CN110378900B (en) | Method, device and system for detecting product defects | |
EP3006893B1 (en) | Methods for improving the accuracy of dimensioning-system measurements | |
CN113538432B (en) | Part defect detection method and system based on image processing | |
US8238635B2 (en) | Method and system for identifying defects in radiographic image data corresponding to a scanned object | |
CN106469448B (en) | Automated industrial inspection with 3D vision | |
US10958843B2 (en) | Multi-camera system for simultaneous registration and zoomed imagery | |
WO2013061976A1 (en) | Shape inspection method and device | |
US10488371B1 (en) | Nondestructive inspection using thermoacoustic imagery and method therefor | |
CN112088304A (en) | Inspection apparatus and inspection method | |
KR20200014438A (en) | Apparatus and method for optimizing examination outside of the subject | |
US20170125271A1 (en) | Position detection apparatus, position detection method, information processing program, and storage medium | |
WO2020125528A1 (en) | Anchor object detection method and apparatus, electronic device, and storage medium | |
WO2010059679A2 (en) | Constructing enhanced hybrid classifiers from parametric classifier families using receiver operating characteristics | |
JP7327984B2 (en) | Information processing device, information processing method, and program | |
Ieamsaard et al. | Automatic optical inspection of solder ball burn defects on head gimbal assembly | |
Svalina et al. | Possibilities of evaluating the dimensional acceptability of workpieces using computer vision | |
CN117934453B (en) | Method and system for diagnosing defects of backlight foreign matters of mobile phone screen | |
JP7399632B2 (en) | Photography processing device and photography processing method | |
US20240104715A1 (en) | Production-speed component inspection system and method | |
Khatyreva et al. | Unsupervised anomaly detection for industrial manufacturing using multiple perspectives of free falling parts |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: UNITED TECHNOLOGIES CORPORATION, CONNECTICUT Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:XIONG, ZIYOU;FINN, ALAN MATTHEW;REEL/FRAME:045716/0820 Effective date: 20180501 |
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
AS | Assignment |
Owner name: RAYTHEON TECHNOLOGIES CORPORATION, MASSACHUSETTS Free format text: CHANGE OF NAME;ASSIGNOR:UNITED TECHNOLOGIES CORPORATION;REEL/FRAME:054062/0001 Effective date: 20200403 |
AS | Assignment |
Owner name: RAYTHEON TECHNOLOGIES CORPORATION, CONNECTICUT Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE AND REMOVE PATENT APPLICATION NUMBER 11886281 AND ADD PATENT APPLICATION NUMBER 14846874. TO CORRECT THE RECEIVING PARTY ADDRESS PREVIOUSLY RECORDED AT REEL: 054062 FRAME: 0001. ASSIGNOR(S) HEREBY CONFIRMS THE CHANGE OF ADDRESS;ASSIGNOR:UNITED TECHNOLOGIES CORPORATION;REEL/FRAME:055659/0001 Effective date: 20200403 |
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
AS | Assignment |
Owner name: RTX CORPORATION, CONNECTICUT Free format text: CHANGE OF NAME;ASSIGNOR:RAYTHEON TECHNOLOGIES CORPORATION;REEL/FRAME:064714/0001 Effective date: 20230714 |