CN113390605B - Full-field measurement method for wing deformation of wind tunnel test airplane


Info

Publication number
CN113390605B
Authority
CN
China
Prior art keywords
image
camera
wing
pixel
test
Prior art date
Legal status
Active
Application number
CN202110819196.7A
Other languages
Chinese (zh)
Other versions
CN113390605A (en)
Inventor
王斌
张文清
王众
王强
梁杰
周鑫
顾正华
宋巍巍
王盼
Current Assignee
Equipment Design and Testing Technology Research Institute of China Aerodynamics Research and Development Center
Original Assignee
Equipment Design and Testing Technology Research Institute of China Aerodynamics Research and Development Center
Priority date
Filing date
Publication date
Application filed by Equipment Design and Testing Technology Research Institute of China Aerodynamics Research and Development Center
Priority to CN202110819196.7A
Publication of CN113390605A
Application granted
Publication of CN113390605B
Legal status: Active

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01M: TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M9/00: Aerodynamic testing; Arrangements in or on wind tunnels
    • G01M9/06: Measuring arrangements specially adapted for aerodynamic testing
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01M: TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M9/00: Aerodynamic testing; Arrangements in or on wind tunnels
    • G01M9/08: Aerodynamic models
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T90/00: Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Fluid Mechanics (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a full-field measurement method for wing deformation of a wind tunnel test airplane, which comprises the following steps: manufacturing a spot pattern on the wing surface; calibrating the stereo vision measurement system to obtain the internal parameters P_a of camera a, the internal parameters P_b of camera b, the calibration parameters P_s of the stereo vision measurement system, and the transformation matrix R from the measurement coordinate system to the airplane model coordinate system; acquiring a test image sequence from the images shot by camera a and camera b before and during the blowing test; and calculating the full-field wing deformation measurement result based on the test image sequence. The method solves the problems of the prior wind tunnel test wing deformation measurement techniques, which can only measure discrete points and therefore yield sparse measurement data with low spatial resolution; it obtains dense measurement data with higher spatial resolution and achieves full-field measurement of airplane wing deformation and full-field measurement of the three-dimensional shape of the wing surface in the deformed state.

Description

Full-field measurement method for wing deformation of wind tunnel test airplane
Technical Field
The invention relates to the field of wind tunnel tests, in particular to a full-field measurement method for wing deformation of a wind tunnel test airplane.
Background
In the process of aircraft development, scaled-model wind tunnel tests need to be carried out in a wind tunnel. In the wind tunnel test, because the airplane model is not a rigid body, the aerodynamic load causes the wings to deform considerably, which affects the accuracy of the wind tunnel model test. As shown in Fig. 1, a front view of the airplane, the broken line is the undeformed profile of the wing before the blowing test and the solid line is the deformed profile of the wing during the blowing test. Therefore, the aeroelastic deformation of the airplane wing must be accurately measured during the wind tunnel model test so that the wind tunnel test data can be corrected afterwards. In addition, emerging design methods, for example the use of tools such as CAD and ANSYS to study aerodynamic-structural coupling problems, urgently require measured three-dimensional shape data of the wing before and after deformation. Therefore, in the wind tunnel model test, the three-dimensional shapes of the wing before and after deformation need to be measured.
In the prior art, the techniques for measuring the wing before and after deformation are mainly the Optotrak optical measurement system and photogrammetry. The principle of the Optotrak optical measurement system is shown in Fig. 2: LED light-emitting mark points are embedded in holes drilled in the surface of the wing to be measured, and the coordinates of the LED mark points are then obtained with linear-array CCDs using the triangulation principle, giving a discrete wing deformation result. However, a wing deformation measurement method that drills holes in the model surface to embed LED light-emitting mark points is difficult to apply where the smoothness of the model surface is critical and drilling into the model surface is not allowed. The principle of photogrammetry is shown in Fig. 3: a number of white disc-shaped artificial mark points are pasted on the wing surface, a stereo vision measurement system is formed by two cameras, images of the artificial mark points on the wing surface are shot synchronously, the spatial three-dimensional coordinates of the artificial mark points are calculated according to the stereo vision measurement principle, and the discrete wing deformation is then calculated.
The common disadvantage of these two types of methods is that they can only obtain discrete point measurement results, with sparse measurement data and low spatial resolution, and cannot obtain a full-field measurement result of wing deformation, which would have important application value for improving the accuracy of later data correction.
Disclosure of Invention
The invention aims to provide a full-field measurement method for wing deformation of a wind tunnel test airplane, in order to solve the problems of the prior art wing deformation measurement techniques for wind tunnel test airplanes, which can only measure discrete points and yield sparse measurement data with low spatial resolution; the method obtains dense measurement data with higher spatial resolution and achieves full-field measurement of airplane wing deformation and full-field measurement of the three-dimensional shape of the wing surface in the deformed state.
The invention is realized by the following technical scheme:
a wind tunnel test airplane wing deformation full-field measurement method comprises the following steps:
s1, manufacturing a spot pattern on the surface of the wing;
S2, calibrating the stereo vision measurement system to obtain the internal parameters P_a of camera a, the internal parameters P_b of camera b, the calibration parameters P_s of the stereo vision measurement system, and the transformation matrix R from the measurement coordinate system to the airplane model coordinate system;
s3, acquiring a test image sequence according to the shot images of the camera a and the camera b before and during the blowing test;
and S4, calculating the wing deformation full-field measurement result based on the test image sequence.
The measurement principle of this application is as follows: in stereo vision, the full-field three-dimensional shape of the measured object can be reconstructed from the texture information in the stereo images shot by camera a and camera b through image block matching. Based on this principle, spot patterns are made on the wing surface to artificially increase the image texture information, replacing the artificial mark points used in photogrammetry; the full-field three-dimensional point cloud of the wing surface can then be calculated through dense pixel matching and three-dimensional stereo vision reconstruction and serves as the wing three-dimensional shape measurement result; the three-dimensional point clouds of the undeformed wing surface before the blowing test and of the deformed wing surface under aerodynamic load during the test are then differenced to obtain the full-field measurement result of wing deformation. The internal parameters P_a of camera a, the internal parameters P_b of camera b, the calibration parameters P_s of the stereo vision measurement system and the transformation matrix R from the measurement coordinate system to the airplane model coordinate system, all obtained when calibrating the stereo vision measurement system, play an important role in the final calculation of the full-field wing deformation measurement result and effectively guarantee the result that is finally obtained.
Further, the method for manufacturing the speckle pattern in step S1 includes:
S11, making a plurality of holes with apertures of 0.0001-1000 mm, randomly distributed and of different sizes, in cloth or paper to serve as a mask;
S12, covering the mask on the surface of the airplane wing and applying the A-color paint to form an A-color coating; removing the mask to obtain the A-color spot pattern;
S13, after the A-color paint has cured, applying the B-color paint to form a B-color coating that completely covers the A-color spot pattern;
S14, after the B-color coating has cured and dried, grinding and polishing the B-color coating until the A-color spots are exposed from it, thereby obtaining a spot pattern with a smooth surface.
In wind tunnel tests, particularly in transonic wind tunnels, clear requirements are placed on the smoothness of the model surface. The spot patterns made on the surface of the model cannot have large bulges, otherwise, the flow field on the surface of the model can be disturbed, so that the traditional Optotrak optical measurement system and the photogrammetry technology are not suitable for wind tunnel tests. At present, the methods for manufacturing the spot patterns which can be found out mainly include methods such as a paint spraying method, chemical etching, mask printing and the like. The paint spraying method is to spray paint droplets above the measured object, and the scattered droplets form a disordered spot pattern on the surface of the measured object. Because the paint droplets are very small, the method can be used for manufacturing the spot pattern with certain smoothness, but the small paint droplets are difficult to form large spots, and when the shooting field is large and the shooting distance is long, a distinguishable spot pattern image cannot be obtained, so that the method is not suitable for wind tunnel test application. The chemical corrosion method is to make disordered textures on the metal surface through chemical corrosion, and the method can damage the surface structure of the wing and is obviously not suitable for wind tunnel test application. The conventional mask printing is: a plurality of holes are randomly punched on cloth or paper to form a mask, the mask is covered on the surface of a measured object, and a spot pattern is made by painting a coating with a larger color contrast with the measured object. The method can be used for manufacturing spots of any size and can be used in large-view-field and long-distance shooting occasions. In practical application, the spots with small size and dense distribution can be selected as much as possible to form the spot pattern through experiments under the condition of ensuring the resolution of the spots in the shot image. However, because the spots have a certain thickness, large protrusions are easily generated on the surface of the model, the requirement of smoothness of the surface of the model cannot be met, and the flow field on the surface of the model is interfered. Therefore, the conventional mask printing method cannot be directly used in the wind tunnel test.
In order to overcome the problem that the existing spot manufacturing method is not suitable for the wind tunnel test of the airplane, the invention provides a spot pattern manufacturing method capable of meeting the smoothness requirement of a wind tunnel test model, which comprises the following steps: firstly, holes with the aperture of 0.0001-1000 mm, randomly distributed positions and different sizes are manufactured on cloth or paper to be used as masks; then, covering a mask on the surface of the airplane wing, coating the A color coating to prepare an A color coating, taking down the mask to obtain an A color spot pattern, and coating the B color coating after the A color coating is solidified to prepare a B color coating so that the B color coating completely covers the A color spots; and then, after the B color coating is solidified and dried, polishing the B color coating by using a polishing tool to enable the A color spots to be exposed out of the B color coating and meet the requirement on the smoothness of a test model, and finally obtaining the spot pattern with a smooth surface. Wherein, the colors of A and B need to be different, and the color difference is preferably larger. The method fills the technical blank of the prior art for manufacturing the surface spots of the wing in the wind tunnel test process of the airplane.
Further, the specific method of step S2 includes:
S21, calibrating camera a to obtain the internal parameters P_a of camera a;
S22, calibrating camera b to obtain the internal parameters P_b of camera b;
S23, establishing the measurement system coordinate system [i, j, k] with camera a as the reference, and calibrating the stereo vision measurement system formed by camera a and camera b to obtain the calibration parameters P_s of the stereo vision measurement system;
S24, calculating the transformation matrix R that converts the measurement system coordinate system [i, j, k] into the model coordinate system [x, y, z] according to the artificial mark points on the wing surface.
The calibration process of the stereo vision measurement system is thus defined in detail and provides a sufficient basis for the final calculation of the full-field wing deformation measurement result.
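The calibration chain in S21-S24 can be prototyped with standard computer-vision tooling. The sketch below is illustrative only and assumes OpenCV and pre-extracted calibration-target points; the function choices, the Kabsch-style rigid fit used for the matrix R and all variable names are assumptions of this example, not the implementation of the patent.

```python
import cv2
import numpy as np

def calibrate_single_camera(obj_pts, img_pts, image_size):
    """Intrinsic calibration (P_a or P_b): camera matrix + distortion coefficients."""
    rms, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, image_size, None, None)
    return K, dist, rms

def calibrate_stereo(obj_pts, pts_a, pts_b, Ka, da, Kb, db, image_size):
    """Extrinsic calibration P_s of the pair, with camera a as reference frame [i, j, k]."""
    flags = cv2.CALIB_FIX_INTRINSIC
    rms, _, _, _, _, R_ab, t_ab, _, _ = cv2.stereoCalibrate(
        obj_pts, pts_a, pts_b, Ka, da, Kb, db, image_size, flags=flags)
    return R_ab, t_ab

def measurement_to_model_transform(pts_measure, pts_model):
    """Rigid transform (rotation + translation) mapping measurement coords [i, j, k]
    to model coords [x, y, z], estimated from >= 3 artificial mark points."""
    c_m, c_x = pts_measure.mean(0), pts_model.mean(0)
    H = (pts_measure - c_m).T @ (pts_model - c_x)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:           # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_x - R @ c_m
    return R, t
```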
Further, the method for acquiring the test image sequence in step S3 includes:
S31, before the blowing test, camera a and camera b shoot spot pattern images of the undeformed wing surface in the windless state, recorded as I_a^0 and I_b^0 respectively;
S32, during the blowing test, camera a and camera b shoot spot pattern images of the deformed wing surface in the windy state at time t, recorded as I_a^t and I_b^t respectively, where k represents the number of shots;
S33, storing I_a^0 and the images I_a^t shot by camera a at the successive times t in chronological order to obtain the image sequence {I_a^0, I_a^1, ..., I_a^k} captured by camera a, and storing I_b^0 and the images I_b^t shot by camera b at the successive times t in chronological order to obtain the image sequence {I_b^0, I_b^1, ..., I_b^k} captured by camera b.
Further, the method for calculating the full-field measurement result of the wing deformation in step S4 includes:
s41, preprocessing the image;
s42, calculating three-dimensional point cloud on the surface of the wing before the blowing test;
s43, calculating three-dimensional point cloud on the surface of the wing at the t moment in the blowing test;
and S44, calculating the full-field measurement result of the wing deformation at the time t.
The image preprocessing comprises:
S411, performing distortion correction on all images in the image sequence shot by camera a and all images in the image sequence shot by camera b using the internal parameters P_a and P_b respectively, to obtain the distortion-corrected image sequences of camera a and camera b;
S412, performing stereo correction on the distortion-corrected image sequences according to the calibration parameters P_s of the stereo vision measurement system, so that homonymous pixels are aligned in the image ordinate direction in the stereo image pairs shot by camera a and camera b;
S413, extracting the spot pattern region of interest T, eliminating background image pixels not located on the wing surface, and retaining the pixels inside the spot pattern region on the wing surface.
The method for extracting the spot pattern interesting region T comprises the following steps:
S4131, distinguishing the speckle pattern image from the test section background image using the pixel neighborhood gradient mean G:

G = (1/s²) · Σ_{(u,v)∈Ω} sqrt( (∂I(u,v)/∂u)² + (∂I(u,v)/∂v)² ),

where s is the neighborhood size, with a value range of 1-1000 pixels, Ω is the s×s neighborhood of the pixel, ∂I(u,v)/∂u is the 1st derivative of the (u,v)-th pixel in the image u coordinate direction, and ∂I(u,v)/∂v is the 1st derivative of the (u,v)-th pixel in the image v coordinate direction;
S4132, setting a Flag matrix Flag of the same size as the image, and setting a dimensionless threshold T_g with a value range of 0.0001-10000;
S4133, traversing the whole image: when the pixel at image position (i, j) satisfies G(i, j) > T_g, it belongs to the speckle pattern image and Flag(i, j) is set to 1; otherwise it belongs to the test section background image and Flag(i, j) is set to 0;
S4134, after the image has been traversed, first performing an erosion operation on the Flag matrix Flag to eliminate interference noise and then performing a dilation operation to obtain T; in T, 1 indicates that the pixel belongs to the speckle pattern and is to be processed, and 0 indicates that it does not.
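As a rough illustration of S4131-S4134, the following sketch computes the neighborhood gradient mean with a box filter and applies the threshold, erosion and dilation. The use of OpenCV, the Sobel derivatives and the 5x5 structuring element are assumptions made for the example, not values prescribed by the method.

```python
import cv2
import numpy as np

def extract_speckle_roi(img, s=30, t_g=30.0):
    """Return the region-of-interest mask T (1 = speckle pixel, 0 = background)."""
    img = img.astype(np.float32)
    # 1st derivatives in the u (row) and v (column) directions
    gu = cv2.Sobel(img, cv2.CV_32F, 0, 1, ksize=3)
    gv = cv2.Sobel(img, cv2.CV_32F, 1, 0, ksize=3)
    mag = np.sqrt(gu**2 + gv**2)
    # mean of the gradient magnitude over an s x s neighborhood -> G
    G = cv2.blur(mag, (s, s))
    flag = (G > t_g).astype(np.uint8)          # Flag matrix
    kernel = np.ones((5, 5), np.uint8)
    flag = cv2.erode(flag, kernel)             # remove isolated interference noise
    roi = cv2.dilate(flag, kernel)             # restore the speckle extent -> T
    return roi
```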
The calculation method of the three-dimensional point cloud on the surface of the wing before the blowing test comprises the following steps:
S421, extracting the stereo-corrected pre-test images from the corrected image sequences of camera a and camera b respectively to form a stereo image pair, and performing dense stereo matching on the pixels of this stereo image pair located in T using a dense matching method;
S422, adopting the ICGN optical flow algorithm, selecting an affine transformation model to model the deformation of the image block, and performing sub-pixel matching on the image block to obtain the pixel coordinates of the sub-pixel matching result;
S423, traversing in turn the pixels of the stereo-corrected image of camera a located in T as block centers to obtain the sub-pixel dense matching result;
S424, calculating by inverse mapping the sub-pixel image coordinates, before stereo correction, of the pixel coordinates of the sub-pixel matching result according to the mapping relation of the stereo correction process;
S425, calculating from the sub-pixel image coordinates the three-dimensional point cloud in the measurement system coordinate system [i, j, k] according to the calibration parameters P_s of the stereo vision measurement system, and converting it according to the transformation matrix R into the three-dimensional point cloud [X_0, Y_0, Z_0] in the airplane model coordinate system.
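A hedged sketch of how S424-S425 could look in practice: the sub-pixel matches, mapped back to the un-rectified images, are triangulated with the stereo calibration and then moved into the model frame. cv2.triangulatePoints, the decomposition of P_s into (R_ab, t_ab) and the (R_model, t_model) form of the transformation R are assumptions of this example, not the patent's prescribed routines.

```python
import cv2
import numpy as np

def reconstruct_cloud(pts_a, pts_b, Ka, Kb, R_ab, t_ab, R_model, t_model):
    """pts_a, pts_b: Nx2 sub-pixel matches in the distortion-corrected (un-rectified)
    images of camera a and camera b. Returns an Nx3 cloud in the model frame."""
    P1 = Ka @ np.hstack([np.eye(3), np.zeros((3, 1))])     # camera a = reference [i, j, k]
    P2 = Kb @ np.hstack([R_ab, t_ab.reshape(3, 1)])        # camera b pose from the pair calibration
    X_h = cv2.triangulatePoints(P1, P2,
                                pts_a.T.astype(np.float64),
                                pts_b.T.astype(np.float64))
    X = (X_h[:3] / X_h[3]).T                               # measurement-frame point cloud
    return X @ R_model.T + t_model                         # model-frame cloud [x, y, z]
```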
Specifically, the process of obtaining the full-field measurement result of wing deformation is as follows:
Firstly, distortion correction is performed on all images in the sequence shot by camera a according to the internal parameters P_a of camera a, giving the distortion-corrected image sequence of camera a, and distortion correction is performed on all images in the sequence shot by camera b according to the internal parameters P_b of camera b, giving the distortion-corrected image sequence of camera b.
Secondly, the stereoscopic-vision-oriented image block sub-pixel matching method is adopted to densely match all pixels inside the spot pattern region of the distortion-corrected stereo image pair {I'_a, I'_b}, three-dimensional reconstruction is performed according to the stereo vision calibration parameters P_s to obtain the wing three-dimensional point cloud [I, J, K] in the measurement coordinate system [i, j, k], and the point cloud is then transformed by the matrix R from the measurement system coordinate system [i, j, k] to the model coordinate system [x, y, z] to obtain the wing three-dimensional point cloud [X, Y, Z]^T. Processing the pre-blowing-test image pair {I_a^0, I_b^0} in this way gives the three-dimensional point cloud [X_0, Y_0, Z_0] of the undeformed wing, and processing the image pair {I_a^t, I_b^t} taken at time t during the blowing test gives the three-dimensional point cloud [X_t, Y_t, Z_t] of the deformed wing at time t.
The stereoscopic-vision-oriented image block sub-pixel matching method comprises the following steps:
1. According to the stereo vision calibration parameters P_s, stereo correction is performed on the distortion-corrected stereo image pair {I'_a, I'_b} to obtain the stereo-corrected image pair {I_a^r, I_b^r}, so that the homonymous pixels of {I'_a, I'_b} are aligned along the image ordinate u in {I_a^r, I_b^r}; the superscript r indicates stereo correction.
The principle is as follows: according to the epipolar geometry constraint in stereo vision, a point O on the object surface and its homonymous pixel points O1 and O2 in the stereo image pair lie on the epipolar line. Stereo correction aligns the epipolar lines of the stereo image pair in the horizontal direction so that homonymous pixels have the same ordinate u, as shown in Fig. 7.
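For illustration, a stereo correction of this kind could be sketched with OpenCV's rectification routines as follows; the alpha value, the interpolation mode and the returned bookkeeping are assumptions of the example rather than the patent's own procedure.

```python
import cv2

def rectify_pair(img_a, img_b, Ka, da, Kb, db, R_ab, t_ab):
    """Rectify a distortion-corrected pair so homonymous pixels share the same row."""
    size = img_a.shape[1], img_a.shape[0]                  # (width, height)
    Ra, Rb, Pa, Pb, Q, _, _ = cv2.stereoRectify(Ka, da, Kb, db, size, R_ab, t_ab,
                                                alpha=0)
    map_ax, map_ay = cv2.initUndistortRectifyMap(Ka, da, Ra, Pa, size, cv2.CV_32FC1)
    map_bx, map_by = cv2.initUndistortRectifyMap(Kb, db, Rb, Pb, size, cv2.CV_32FC1)
    rect_a = cv2.remap(img_a, map_ax, map_ay, cv2.INTER_LINEAR)
    rect_b = cv2.remap(img_b, map_bx, map_by, cv2.INTER_LINEAR)
    # keep the maps so sub-pixel matches can later be mapped back to the
    # un-rectified images by inverse interpolation (the S424 step)
    return rect_a, rect_b, (map_ax, map_ay), (map_bx, map_by), Q
```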
2. A manual or automatic method is adopted to extract the spot pattern area in the image as the region of interest T, so as to eliminate background image pixels not located on the wing surface and retain only the pixels inside the spot pattern area on the wing surface, thereby reducing the amount of pixel matching data.
The method for automatically extracting the spot pattern region of interest T is as follows: the speckle pattern image and the test section background image are distinguished using the pixel neighborhood gradient mean

G = (1/s²) · Σ_{(u,v)∈Ω} sqrt( (∂I(u,v)/∂u)² + (∂I(u,v)/∂v)² ),

where s is the neighborhood size with a value range of 1-1000 pixels, Ω is the s×s neighborhood of the pixel, ∂I(u,v)/∂u is the 1st derivative of the (u,v)-th pixel in the image u coordinate direction, and ∂I(u,v)/∂v is the 1st derivative of the (u,v)-th pixel in the image v coordinate direction. A Flag matrix Flag of the same size as the image is given, and a threshold T_g with a value range of 0.0001-10000 is set. The whole image is traversed: when the pixel at image position (i, j) satisfies G(i, j) > T_g, the pixel belongs to the speckle pattern image and Flag(i, j) is set to 1; otherwise Flag(i, j) is set to 0. After the image has been traversed, an erosion operation is first performed on the Flag matrix to eliminate interference noise, and a dilation operation is then performed to obtain T; in T, 1 indicates that the pixel belongs to the speckle pattern and is to be processed, and 0 indicates that it does not.
The principle of automatically extracting the spot pattern interesting region T is as follows: the background of the test section is a uniform flat area, the pixel neighborhood gradient mean value G of the image in the area is small, the image in the speckle pattern area is represented as mottled texture, rapid light and shade change exists, and the pixel neighborhood gradient G is large, so that the pixel neighborhood gradient mean value G can be used for effectively distinguishing the speckle pattern image from the background image of the test section and extracting the speckle pattern area T.
3. Given a pixel a(m) in image I_a^r with image coordinates q_m = (u_m, v_m), where m is the pixel number, u is the image ordinate ranging from 1 to h (h is the image height, with a value range of 1-1000000 pixels) and v is the image abscissa ranging from 1 to w (w is the image width, with a value range of 1-1000000 pixels), an image block a(m+1) of size L×L is defined with q_m as its center, where L has a value range of 1-100000 pixels. The image block a(m+1) and the image I_b^r are convolved using the FFT (fast Fourier transform), and the boundary of the result is removed to obtain a convolution image of the same size as I_b^r. On the convolution image, a rectangle of size τ×w is placed centered on the straight line u = u_m defined by the ordinate u_m of pixel a(m): its upper boundary is level with the straight line u = u_m - τ/2, its lower boundary with the straight line u = u_m + τ/2, its left boundary with the straight line v = 1 and its right boundary with the straight line v = w, where τ has a value range of 0-1000 pixels. The maximum-value pixel, denoted pixel b, is searched for within this rectangular range on the convolution image, and its image coordinates q_b = (u_b, v_b) are the integer-pixel initial matching result of pixel a(m).
When searching for the maximum value in the convolution image to find the best match, the calibration errors of the cameras and of the stereo vision system cannot be avoided, and a relaxation amount τ is therefore introduced. That is, the homonymous pixel of a(m) lies within a rectangle of width τ that is axisymmetric about the straight line u = u_m. Finding the maximum value within this rectangular range yields the best match with image block a(m+1), i.e. the integer-pixel matching result of pixel a(m) in image I_b^r.
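The integer-pixel initial match with the relaxation amount τ can be pictured with the following sketch. It substitutes normalized cross-correlation via cv2.matchTemplate for the FFT convolution described above and assumes rectified single-channel images of identical dtype; the block size, τ and all names are illustrative assumptions.

```python
import cv2
import numpy as np

def integer_pixel_match(img_a_rect, img_b_rect, um, vm, L=41, tau=3):
    """Integer-pixel match of the L x L block centred on (um, vm) of image a,
    searched only in a band of height tau around row um of image b."""
    h = L // 2
    block = img_a_rect[um - h:um + h + 1, vm - h:vm + h + 1]
    score = cv2.matchTemplate(img_b_rect, block, cv2.TM_CCORR_NORMED)
    # matchTemplate scores are indexed by the block's top-left corner,
    # so shift the epipolar band by h before masking
    band = np.full_like(score, -np.inf)
    r0 = max(um - h - tau // 2, 0)
    r1 = min(um - h + tau // 2 + 1, score.shape[0])
    band[r0:r1, :] = score[r0:r1, :]
    top, left = np.unravel_index(np.argmax(band), band.shape)
    return top + h, left + h          # (u_b, v_b): integer-pixel match in image b
```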
4. Taking the image coordinates q_b of pixel b as the initial position, the ICGN (inverse compositional Gauss-Newton) optical flow algorithm is adopted, an affine transformation model is selected to model the deformation of the image block, and sub-pixel matching of image block a(m+1) is performed in image I_b^r; the resulting sub-pixel image coordinates are the sub-pixel matching result of pixel a(m).
5. According to the mapping relation of the stereo correction process from (I'_a, I'_b) to (I_a^r, I_b^r), the sub-pixel coordinates q_m^s of pixel a(m) in image I'_a and the sub-pixel coordinates q_b^s of its match in image I'_b are calculated by inverse mapping, and (q_m^s, q_b^s) is taken as the sub-pixel matching result of the stereo image pair (I'_a, I'_b); the superscript s denotes sub-pixel coordinates.
6. According to steps 3-5, all pixels of image I_a^r located in T are traversed to obtain the sub-pixel dense matching result.
Finally, taking the three-dimensional point cloud [X_0, Y_0, Z_0] of the undeformed wing in the airplane model coordinate system before the blowing test as the reference, the z-direction displacement of the deformed wing at time t is calculated as Δ_z = Z'_t - Z_0, where Z'_t = f([X_0, Y_0], [X_t, Y_t, Z_t]) denotes the z-direction coordinates interpolated from [X_t, Y_t, Z_t] by the three-dimensional point cloud interpolation operation f(·), with [X_0, Y_0] as input.
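A minimal sketch of the final displacement calculation Δ_z = Z'_t - Z_0, with scipy's griddata standing in for the interpolation operator f(·); the linear interpolation method and the array layout are assumptions of the example.

```python
import numpy as np
from scipy.interpolate import griddata

def wing_deformation(cloud_0, cloud_t):
    """cloud_0, cloud_t: Nx3 and Mx3 point clouds [X, Y, Z] in the model frame.
    Returns the z-direction displacement at each undeformed point."""
    # interpolate the deformed cloud's z values at the undeformed (x, y) positions
    Zt_on_0 = griddata(cloud_t[:, :2], cloud_t[:, 2], cloud_0[:, :2], method='linear')
    # points outside the deformed cloud's hull come back as NaN
    return Zt_on_0 - cloud_0[:, 2]     # delta_z = Z'_t - Z_0
```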
Furthermore, the sub-pixel dense matching result can be obtained by a gridded discrete point matching and interpolation compensation method, in which the image is divided into a plurality of grids of size Su × Sv, where Su and Sv are both in pixels.
The inventors found that directly performing dense matching of all pixels in the whole image leads to high algorithmic complexity and long computation times, so the invention provides a method for rapidly calculating the three-dimensional point cloud of the airplane wing: instead of performing dense matching on all pixels in the spot pattern region T, a gridded discrete point matching and interpolation compensation method is used to calculate the three-dimensional point cloud of the wing surface. The specific method is as follows: the image is divided into m × n grids, where m has a value range of 1-10000 and n has a value range of 1-10000, and the grid size is Su × Sv pixels with Su > 1 and Sv > 1; only one pixel in each grid is matched and the other pixels are not matched, giving a sparse three-dimensional point cloud; the sparse three-dimensional point cloud is then interpolated by an interpolation compensation method to obtain the dense three-dimensional point cloud of the wing surface; the interpolation compensation method uses RBF radial basis function interpolation or B-spline interpolation.
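The gridded discrete point matching and interpolation compensation idea can be sketched as follows, with scipy's RBFInterpolator standing in for the radial basis function interpolant; the kernel, the cell-corner sampling rule and the variable names are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def grid_sample_pixels(roi_mask, Su=4, Sv=4):
    """Pick one pixel per Su x Sv cell (here the cell corner) inside the speckle ROI."""
    us, vs = np.nonzero(roi_mask)
    keep = (us % Su == 0) & (vs % Sv == 0)
    return np.column_stack([us[keep], vs[keep]])

def densify_sparse_cloud(sparse_xyz, dense_xy):
    """sparse_xyz: Mx3 cloud from the gridded matches; dense_xy: Nx2 full-field
    (x, y) positions at which the z coordinate is to be compensated."""
    rbf = RBFInterpolator(sparse_xyz[:, :2], sparse_xyz[:, 2],
                          kernel='thin_plate_spline', smoothing=0.0)
    z_dense = rbf(dense_xy)
    return np.column_stack([dense_xy, z_dense])
```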
Compared with the prior art, the invention has the following advantages and beneficial effects:
1. The full-field measurement method for wing deformation of a wind tunnel test airplane of the invention solves the problems of the prior art, which can only measure discrete points of wing deformation and yields sparse measurement data with low spatial resolution; it obtains dense measurement data with higher spatial resolution and realizes full-field measurement of airplane wing deformation and full-field measurement of the three-dimensional shape of the wing surface in the deformed state;
2. The method provides an automatic extraction method for the spot pattern region of interest T, which effectively reduces the manual workload, improves the efficiency of spot pattern image preprocessing, reduces the amount of spot pattern pixel data to be processed and increases the calculation speed of the three-dimensional shape of the wing surface;
3. The method provides a stereoscopic-vision-oriented image block sub-pixel matching method that uses the epipolar geometry constraint of stereo vision and performs image block matching on stereo-corrected images, which reduces the search space, the amount of computation and the image block matching errors; the proposed two-stage matching of integer pixels and sub-pixels balances algorithm speed and matching precision;
4. The method provides a gridded discrete point matching and interpolation compensation method, which further reduces the pixel matching computation in full-field wing deformation measurement and increases the calculation speed of the wing three-dimensional point cloud;
5. The method provides a brand-new way of making the spot pattern on the model surface, producing a spot pattern that is smooth, clear and identifiable, meeting the wind tunnel test requirement on the smoothness of the model surface and filling a gap in the prior art.
Drawings
The accompanying drawings, which are included to provide a further understanding of the embodiments of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the principles of the invention. In the drawings:
FIG. 1 is a schematic view of aeroelastic deformation of an aircraft wing;
FIG. 2 is a schematic diagram of LED light-emitting mark points arranged on the surface of an airfoil by a prior Optotrak measurement system;
FIG. 3 is a schematic view of a wing deformation measurement principle performed by a conventional photogrammetry technique;
FIG. 4 is a schematic view of a wind tunnel test aircraft wing deformation full-field measurement device used in an embodiment of the invention;
FIG. 5 is a schematic diagram of a measurement according to an embodiment of the present invention;
FIG. 6 is a diagram illustrating an initial matching process for integer pixels according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of a sub-pixel matching process according to an embodiment of the present invention;
FIG. 8 is a diagram of a Flag tag matrix and a region of interest T according to an embodiment of the present invention;
FIG. 9 is a schematic view of the measuring device setup for a half-model test according to an embodiment of the present invention;
FIG. 10 is a comparison of pixel-by-pixel matching and discrete point matching in accordance with one embodiment of the present invention;
FIG. 11 is a schematic diagram of three-dimensional point interpolation compensation according to an embodiment of the present invention.
Reference numbers and corresponding part names in the figures:
1-wing; 1-1-undeformed wing; 1-2-deformed wing; 2-LED light-emitting mark points; 3-artificial mark points; 4-camera a; 5-camera b; 6-speckle pattern; 7-light source a; 8-light source b; 9-image acquisition controller; 10-computer; 11-fuselage; 12-first image of the stereo pair; 13-pixel a (13) to be processed in image 12; 14-image block a of size L×L centered on pixel a (13); 15-second image of the stereo pair; 16-1-convolution image a (16-1) of image block a (14) with image 15; 16-2-convolution image b (16-2) obtained after removing the boundary of convolution image a (16-1); 17-straight line passing through pixel a (13) and having the same ordinate as pixel a (13); 18-rectangle of width τ defined on convolution image b (16-2), axisymmetric about the straight line (17); 19-integer-pixel matching result of pixel a (13); 20-convolution operation of image block a (14) with the second image; 21-operation of removing the boundary of convolution image a (16-1); 22-ICGN-based image block sub-pixel matching process; 23-sub-pixel matching result of pixel a (13); 24-Flag matrix; 25-region of interest T; 26-wind tunnel incoming flow direction; 27-wind tunnel test section; 28-left wall facing the wind tunnel incoming flow direction; 29-lower wall facing the wind tunnel incoming flow direction; 30-right wall facing the wind tunnel incoming flow direction; 31-upper wall facing the wind tunnel incoming flow direction; 32-optical observation window of camera a; 33-optical observation window of camera b; 34-pixel.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail below with reference to examples and accompanying drawings, and the exemplary embodiments and descriptions thereof are only used for explaining the present invention and are not meant to limit the present invention.
Example 1:
a wind tunnel test airplane wing deformation full-field measurement method comprises the following steps:
s1, manufacturing a spot pattern on the surface of the wing;
S2, calibrating the stereo vision measurement system to obtain the internal parameters P_a of camera a, the internal parameters P_b of camera b, the calibration parameters P_s of the stereo vision measurement system, and the transformation matrix R from the measurement coordinate system to the airplane model coordinate system;
s3, acquiring a test image sequence according to the shot images of the camera a and the camera b before and during the blowing test;
and S4, calculating the wing deformation full-field measurement result based on the test image sequence.
The method for manufacturing the speckle pattern in step S1 includes:
S11, making a plurality of holes with apertures of 0.0001-1000 mm, randomly distributed and of different sizes, in cloth or paper to serve as a mask;
S12, covering the mask on the surface of the airplane wing and applying the A-color paint to form an A-color coating; removing the mask to obtain the A-color spot pattern;
S13, after the A-color paint has cured, applying the B-color paint to form a B-color coating that completely covers the A-color spot pattern;
S14, after the B-color coating has cured and dried, grinding and polishing the B-color coating until the A-color spots are exposed from it, obtaining a spot pattern with a smooth surface.
The specific method of step S2 includes:
S21, calibrating camera a to obtain the internal parameters P_a of camera a;
S22, calibrating camera b to obtain the internal parameters P_b of camera b;
S23, establishing the measurement system coordinate system [i, j, k] with camera a as the reference, and calibrating the stereo vision measurement system formed by camera a and camera b to obtain the calibration parameters P_s of the stereo vision measurement system;
S24, calculating the transformation matrix R that converts the measurement system coordinate system [i, j, k] into the model coordinate system [x, y, z] according to the artificial mark points on the wing surface.
The method for acquiring the test image sequence in the step S3 includes:
S31, before the blowing test, camera a and camera b shoot spot pattern images of the undeformed wing surface in the windless state, recorded as I_a^0 and I_b^0 respectively;
S32, during the blowing test, camera a and camera b shoot spot pattern images of the deformed wing surface in the windy state at time t, recorded as I_a^t and I_b^t respectively, where k represents the number of shots;
S33, storing I_a^0 and the images I_a^t shot by camera a at the successive times t in chronological order to obtain the image sequence {I_a^0, I_a^1, ..., I_a^k} captured by camera a, and storing I_b^0 and the images I_b^t shot by camera b at the successive times t in chronological order to obtain the image sequence {I_b^0, I_b^1, ..., I_b^k} captured by camera b.
The method for calculating the full-field measurement result of the wing deformation in the step S4 includes:
s41, preprocessing the image;
s42, calculating three-dimensional point cloud on the surface of the wing before the blowing test;
s43, calculating three-dimensional point cloud on the surface of the wing at the t moment in the blowing test;
and S44, calculating the full-field measurement result of the wing deformation at the time t.
The image preprocessing of step S41 includes:
S411, performing distortion correction on all images in the image sequence shot by camera a and all images in the image sequence shot by camera b using the internal parameters P_a and P_b respectively, to obtain the distortion-corrected image sequences of camera a and camera b;
S412, performing stereo correction on the distortion-corrected image sequences according to the calibration parameters P_s of the stereo vision measurement system, so that homonymous pixels are aligned in the image ordinate direction in the stereo image pairs shot by camera a and camera b;
S413, extracting the spot pattern region of interest T, eliminating background image pixels not located on the wing surface, and retaining the pixels inside the spot pattern region on the wing surface.
The method for extracting the speckle pattern region of interest T of step S413 includes:
S4131, distinguishing the speckle pattern image from the test section background image using the pixel neighborhood gradient mean G:

G = (1/s²) · Σ_{(u,v)∈Ω} sqrt( (∂I(u,v)/∂u)² + (∂I(u,v)/∂v)² ),

where s is the neighborhood size, with a value range of 1-1000 pixels, Ω is the s×s neighborhood of the pixel, ∂I(u,v)/∂u is the 1st derivative of the (u,v)-th pixel in the image u coordinate direction, and ∂I(u,v)/∂v is the 1st derivative of the (u,v)-th pixel in the image v coordinate direction;
S4132, setting a Flag matrix Flag of the same size as the image, and setting a dimensionless threshold T_g with a value range of 0.0001-10000;
S4133, traversing the whole image: when the pixel at image position (i, j) satisfies G(i, j) > T_g, it belongs to the speckle pattern image and Flag(i, j) is set to 1; otherwise it belongs to the test section background image and Flag(i, j) is set to 0;
S4134, after the image has been traversed, first performing an erosion operation on the Flag matrix Flag to eliminate interference noise and then performing a dilation operation to obtain T; in T, 1 indicates that the pixel belongs to the speckle pattern and is to be processed, and 0 indicates that it does not.
The calculation method of the three-dimensional point cloud on the surface of the wing before the blowing test comprises the following steps:
S421, extracting the stereo-corrected pre-test images from the corrected image sequences of camera a and camera b respectively to form a stereo image pair, and performing dense stereo matching on the pixels of this stereo image pair located in T using a dense matching method;
S422, adopting the ICGN optical flow algorithm, selecting an affine transformation model to model the deformation of the image block, and performing sub-pixel matching on the image block to obtain the pixel coordinates of the sub-pixel matching result;
S423, traversing in turn the pixels of the stereo-corrected image of camera a located in T as block centers to obtain the sub-pixel dense matching result;
S424, calculating by inverse mapping the sub-pixel image coordinates, before stereo correction, of the pixel coordinates of the sub-pixel matching result according to the mapping relation of the stereo correction process;
S425, calculating from the sub-pixel image coordinates the three-dimensional point cloud in the measurement system coordinate system [i, j, k] according to the calibration parameters P_s of the stereo vision measurement system, and converting it according to the transformation matrix R into the three-dimensional point cloud [X_0, Y_0, Z_0] in the airplane model coordinate system.
The sub-pixel dense matching result is obtained by a gridded discrete point matching and interpolation compensation method, in which the image is divided into a plurality of grids of size Su × Sv, where Su and Sv are both in pixels.
The full-field measurement result Δ_z of wing deformation at time t is calculated as follows: taking the three-dimensional point cloud [X_0, Y_0, Z_0] in the airplane model coordinate system before the blowing test as the reference, the z-direction displacement of the deformed wing at time t is Δ_z = Z'_t - Z_0,
where Z'_t = f([X_0, Y_0], [X_t, Y_t, Z_t]) denotes the z-direction coordinates interpolated from [X_t, Y_t, Z_t] by the three-dimensional point cloud interpolation operation f(·), with [X_0, Y_0] as input.
Example 2:
This example illustrates a half-model test conducted in a research wind tunnel.
Fig. 9 shows a schematic diagram of the apparatus setup for the half-model test of this example. In Fig. 9, the reference numerals have the following meanings: the arrow 26 indicates the wind tunnel incoming flow direction and 27 is the wind tunnel test section; the four walls of the test section, named facing the incoming flow direction, are the left wall 28, the lower wall 29, the right wall 30 and the upper wall 31. The aircraft fuselage 11 is fixed in the middle of the upper wall 31 of the test section, and the speckle pattern 6 is made on the surface of the wing 1. The middle section of the right wall 30 is provided with the optical observation window 32 for camera a and the optical observation window 33 for camera b; camera a is arranged at window 32 and camera b at window 33, and the installation angles of the two cameras are adjusted so that their fields of view completely cover the wing 1. Camera a and camera b form a stereo vision measurement unit. In practical applications, one or two additional sets of stereo vision measurement units can be added as appropriate to expand the measurement field of view, for example for full-model test measurements.
In this embodiment, the dimensions of the machined metal wing are: wingspan 400 mm, wing root width 80 mm, wing tip width 50 mm and thickness 1 mm, so the wing has good elasticity. Two monochrome industrial cameras with a resolution of 4 megapixels and a frame rate of 10 fps are selected, each with lens parameters: aperture F 2.0 and focal length f 13 mm. Two 50 W LED light sources are selected to provide illumination for the cameras. The cameras are connected to the image acquisition controller by cables, and the captured images are input into a computer through a Camera-Link data acquisition card for data processing to calculate the full-field deformation of the wing.
The specific method of this embodiment is described in detail below:
First, making the spot pattern on the wing surface
Selecting cotton cloth for manufacturing the mask, wherein the specific manufacturing method comprises the following steps: an electric iron is used for burning holes with diameters of 2mm, 5mm and 8mm on the cotton cloth, and the holes distributed at random positions are used as masks. Then, covering a mask on the surface of the wing, selecting black self-spray paint, spraying a black coating on the mask to ensure that all holes are covered by black paint, and then uncovering the mask; adopting infrared heating equipment to accelerate the solidification of the coating; after the black coating is solidified, white self-spray paint is used for spraying and painting the white coating, and the white coating completely covers the whole wing; and (3) accelerating the solidification of the coating by adopting infrared heating equipment, and after the white coating is dried, carefully polishing the surface coating of the wing by using fine abrasive paper to enable black spots to be exposed from the white coating. Finally, 5 artificial mark points are made on the surface of the airplane body and used for correlating the airplane model coordinate system [ x, y, z ].
The practical application shows that: the spot pattern manufactured by the method can meet the requirement of smoothness of the surface of the model, and the clear and visible spot pattern can be shot by a camera.
Second, calibration of the stereo vision measurement system
According to the camera calibration method, camera a is calibrated to obtain its internal parameters P_a, camera b is calibrated to obtain its internal parameters P_b, the measurement system coordinate system [i, j, k] is established with camera a as the reference, and the stereo vision measurement system formed by camera a and camera b is calibrated to obtain the calibration parameters P_s of the stereo vision measurement system. The transformation matrix R from the measurement coordinate system to the airplane model coordinate system is then calculated from the calibration parameters of the stereo vision measurement system and the artificial mark points made on the fuselage surface.
Third, test image sequence acquisition
Before the blowing test, the image acquisition controller 9 synchronously triggers camera a and camera b to shoot images of the spot pattern on the surface of the undeformed wing 1-1 in the windless state, recorded as I_a^0 and I_b^0 respectively. During the blowing test, the image acquisition controller 9 synchronously triggers camera a and camera b to shoot images of the spot pattern on the surface of the deformed wing 1-2 in the windy state at time t, recorded as I_a^t and I_b^t respectively. I_a^0 and the images I_a^t are stored in chronological order in the image sequence of camera a, and I_b^0 and the images I_b^t are stored in chronological order in the image sequence of camera b. In this example, a sequence of 10 frames of wing deformation images was taken during the wind tunnel test at intervals of 0.1 s.
Fourth, full-field measurement and calculation of wing deformation
1. Image pre-processing
a) Distortion correction and stereo correction of a photographed image
Firstly, distortion correction is performed on all images in the image sequence of camera a according to its internal parameters P_a, giving the distortion-corrected image sequence of camera a, and distortion correction is performed on all images in the image sequence of camera b according to its internal parameters P_b, giving the distortion-corrected image sequence of camera b.
Then, stereo correction is performed on the distortion-corrected images according to the stereo calibration parameters P_s, so that homonymous pixels in the stereo images shot by camera a and camera b are aligned in the direction of the image ordinate u; stereo correction of the distortion-corrected sequence of camera a gives its stereo-corrected image sequence, and stereo correction of the distortion-corrected sequence of camera b gives its stereo-corrected image sequence.
The principle is as follows: according to the epipolar geometry constraint in stereo vision, a point O on the object surface and its homonymous pixel points O1 and O2 in the stereo image pair lie on the epipolar line. Stereo correction aligns the epipolar lines of the stereo image pair in the horizontal direction so that homonymous pixels have the same ordinate u; as shown in Fig. 7, the homonymous pixel 19 of pixel 13 lies on the straight line 17.
Practical application shows that after distortion correction and stereo correction, pixels in a stereo image pair at the same position on the surface of the wing have similar vertical coordinates in the corrected image.
b) Extraction of speckle Pattern region of interest T
The pixels in the spot pattern region of the image are extracted in advance as the region of interest T and the invalid pixels of the test section background are removed, which effectively reduces the amount of image data to be processed and speeds up the subsequent pixel matching. As shown in Fig. 8, a Flag matrix 24 of the same size as the image is given. The pixel neighborhood size s is set to 30 and the pixel neighborhood gradient mean G is calculated, where ∂I(u,v)/∂u is the 1st derivative of the (u,v)-th pixel in the image u coordinate direction and ∂I(u,v)/∂v is the 1st derivative of the (u,v)-th pixel in the image v coordinate direction; the threshold is set to T_g = 30. The whole image is traversed: when the pixel at image position (i, j) satisfies G(i, j) > T_g, it belongs to the speckle pattern image and Flag(i, j) is set to 1; otherwise it belongs to the test section background image and Flag(i, j) is set to 0. After the image has been traversed, an erosion operation is first performed on the Flag matrix Flag to eliminate interference noise, and a dilation operation is then performed to obtain T (indicated by reference numeral 25 in Fig. 8). In T, 1 indicates that the pixel belongs to the speckle pattern and is to be processed, and 0 indicates that it does not. The images of both stereo-corrected sequences are processed in this way and T is found for each.
In practical use, because the test section background is a relatively uniform flat area, the pixel neighborhood gradient mean G of the image in that area is small, while the image in the speckle pattern area appears as a mottled texture with rapid light-dark changes and a large gradient mean G; the method therefore effectively distinguishes the speckle pattern image from the test section background image and is used to extract the speckle pattern region T. Practical experiments show that with the threshold T_g = 30, the T extracted by the automatic method completely covers the speckle pattern region of the wing surface, and the number of pixels in T exceeds the total number of pixels in the speckle pattern region by no more than 5%. The automatic method can therefore replace the manual method for extracting T, reducing the manual workload and improving the image preprocessing efficiency.
2. Calculating the three-dimensional point cloud of the wing surface before the blowing test
From the stereo-corrected image sequences of camera a and camera b, the images photographed before the blowing test are extracted and combined into a stereo image pair. First, a dense matching method is used to perform dense stereo matching on the pixels of this stereo image pair that lie in T. The basic operation is as follows: take a pixel a (13) of camera a's image that lies in T, and an image block a (14) of size 41 x 41 centered on pixel a (13); the image coordinates of pixel a (13) are q_13 = (u_13, v_13). The image block a (14) is convolved with camera b's image using the FFT (fast Fourier transform) to obtain convolution image a (16-1); because the convolution uses an edge-extension operation, image a (16-1) is larger than camera b's image, so the extension boundary is first removed to obtain convolution image b (16-2). In this example, τ is taken to be 3. Practical application shows that this approach achieves fast pixel matching effectively.
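A minimal sketch of the FFT-based integer-pixel search follows, assuming plain zero-mean cross-correlation of the 41 x 41 block against the whole of camera b's image; the normalization, the edge-extension handling and the role of τ in the embodiment are simplified away, and the function name and SciPy routine are illustrative.

```python
import numpy as np
from scipy.signal import fftconvolve

def integer_pixel_match(img_a, img_b, u13, v13, half=20):
    """Integer-pixel position in img_b best matching the (2*half+1)^2 block of
    img_a centred on (u13, v13); (u, v) are taken here as (column, row)."""
    block = img_a[v13 - half:v13 + half + 1,
                  u13 - half:u13 + half + 1].astype(np.float64)
    block -= block.mean()
    ref = img_b.astype(np.float64) - img_b.mean()
    # Cross-correlation computed via FFT: convolve with the block flipped in u and v.
    corr = fftconvolve(ref, block[::-1, ::-1], mode="same")
    v23, u23 = np.unravel_index(np.argmax(corr), corr.shape)
    return u23, v23
```

Because the stereo correction places homonymous pixels on the same row, the search can in practice be restricted to the neighborhood of row v13, which further reduces the computation.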
After the integer-pixel matching of pixel a (13) is completed, the integer-pixel result is taken as the initial value, and the ICGN optical flow algorithm, with an affine transformation model selected to model the deformation of the image block, is used to perform sub-pixel matching of the image block, giving the sub-pixel matching result pixel coordinate 23: q_23 = (u_23, v_23). The pixels of camera a's image that lie in T are traversed in turn as block centers, giving the dense matching result. Finally, according to the mappings used in the stereo correction process from the distortion-corrected images of camera a and camera b to their stereo-corrected images, the sub-pixel image coordinates of {q_13, q_23} before stereo correction are estimated by inverse mapping; during the inverse mapping, the sub-pixel coordinates are estimated by B-spline interpolation. Then, according to the stereo vision calibration parameter P_s, the corresponding three-dimensional point in the measurement system coordinate system [i, j, k] is calculated, and the result is converted by the transformation matrix R into the three-dimensional point cloud [X_0, Y_0, Z_0] in the aircraft model coordinate system.
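As an illustrative sketch only: once a correspondence q_13 in camera a and q_23 in camera b is known in the original image coordinates, the three-dimensional point can be obtained by standard triangulation and then expressed in the model frame. The 3 x 4 projection matrices below, and the rotation-plus-translation standing in for the transformation matrix R, are placeholders rather than the calibration results of this embodiment.

```python
import cv2
import numpy as np

# Placeholder 3x4 projection matrices of camera a and camera b in the
# measurement coordinate system [i, j, k] (what P_s would provide).
proj_a = np.hstack([np.eye(3), np.zeros((3, 1))])
proj_b = np.hstack([np.eye(3), np.array([[-0.3], [0.0], [0.0]])])

def triangulate_to_model(q13, q23, R_model=np.eye(3), t_model=np.zeros(3)):
    """Triangulate one correspondence and express it in the aircraft model frame."""
    pa = np.asarray(q13, dtype=np.float64).reshape(2, 1)
    pb = np.asarray(q23, dtype=np.float64).reshape(2, 1)
    X_h = cv2.triangulatePoints(proj_a, proj_b, pa, pb)   # homogeneous 4x1 result
    X_meas = (X_h[:3] / X_h[3]).ravel()                   # point in [i, j, k]
    return R_model @ X_meas + t_model                      # point in [x, y, z]

point_xyz = triangulate_to_model((512.3, 400.7), (498.1, 400.7))
```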
Because this stereoscopic-vision-oriented sub-pixel image block matching algorithm uses a pixel-by-pixel dense matching method, the matching computation over every pixel in T is very large. To reduce the computation and increase the speed, the application also provides a gridded discrete point matching and interpolation compensation method. As shown in fig. 10, the shaded pixels are those that need to be matched and the blank pixels are those that do not; the image is divided into a number of grids of size Su x Sv, where Su and Sv are in pixels.
In fig. 10, the left side shows matching on a grid with Su = 1 and Sv = 1, i.e. the pixel-by-pixel dense matching method; the right side shows a grid with Su = 4 and Sv = 4, where only the pixel at the upper-left corner of each grid cell is matched, i.e. the gridded discrete point matching method. This approach reduces the amount of computation by increasing the interval between matched pixels (with Su = 1 and Sv = 1 the matching interval in the horizontal and vertical directions is 1 pixel; with Su = 4 and Sv = 4 it is 4 pixels). Compared with the pixel-by-pixel dense matching on the left of fig. 10, the gridded discrete point matching on the right reduces the number of pixels to be processed to 1/16. Because the wing surface is continuous and smooth, a discrete matching result obtained with an increased matching interval can be used to compute a sparse three-dimensional point cloud, and a dense full-field three-dimensional point cloud can then be obtained by interpolation compensation. The method uses RBF interpolation to compensate the sparse three-dimensional point cloud and compute the dense three-dimensional point cloud of the wing surface. Experimental results show that, with Su = 4 and Sv = 4, the average difference between the point cloud computed in this way and the result of the dense matching method is within 0.5%, while the computation is 8.1 times faster; moreover, compared with the dense matching result, the resulting point cloud has lower noise, is locally smoother, and is closer to the true wing surface.
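A sketch of the gridded discrete point matching and interpolation compensation is given below, assuming SciPy's RBFInterpolator as the radial basis function interpolator; the stride selection, kernel choice and function names are illustrative rather than those of the embodiment.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def gridded_match_indices(mask_T, Su=4, Sv=4):
    """Pick one candidate pixel per Su x Sv cell (the top-left corner) inside T."""
    rows, cols = np.nonzero(mask_T)
    keep = (rows % Sv == 0) & (cols % Su == 0)
    return rows[keep], cols[keep]

def densify_point_cloud(sparse_xyz, query_xy):
    """Interpolate z over (x, y) to turn the sparse cloud from discrete matching
    into a dense full-field surface (RBF interpolation compensation)."""
    rbf = RBFInterpolator(sparse_xyz[:, :2], sparse_xyz[:, 2],
                          kernel="thin_plate_spline")
    return np.column_stack([query_xy, rbf(query_xy)])
```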
3. Calculating the three-dimensional point cloud of the wing surface at time t during the blowing test
From the stereo-corrected image sequences of camera a and camera b, the images photographed at time t are extracted and combined into the stereo image pair at time t. The three-dimensional point cloud [X_t, Y_t, Z_t] of the wing surface at time t is then calculated according to the method of step 2 (calculating the three-dimensional point cloud of the wing surface before the blowing test).
4. Calculating the full-field wing deformation measurement result at time t
As shown in FIG. 5, the wing deformation is defined as the displacement Δ_z of the wing in the z coordinate direction under aerodynamic load. After the three-dimensional point cloud [X_0, Y_0, Z_0] of the undeformed wing 1-1 and the three-dimensional point cloud [X_t, Y_t, Z_t] of the deformed wing 1-2 are aligned in the x and y coordinates, their difference in the z direction gives the displacement Δ_z. However, because the position of the wing changes, the x and y coordinates of [X_0, Y_0, Z_0] and [X_t, Y_t, Z_t] are not aligned; therefore a three-dimensional point cloud interpolation compensation method is needed, which takes the [X_0, Y_0] coordinates of [X_0, Y_0, Z_0] as input and interpolates within the point cloud data [X_t, Y_t, Z_t] to obtain Z'_t, which is used to calculate Δ_z = Z'_t - Z_0. The interpolation operation is defined as Z'_t = f([X_0, Y_0], [X_t, Y_t, Z_t]).
FIG. 11 shows an example, in which 38 is a three-dimensional point of [X_t, Y_t, Z_t] and 37 is its (x, y) coordinate. Given an input coordinate 39: (x0, y0), the corresponding three-dimensional point in [X_t, Y_t, Z_t] is 40: (x0, y0, z_t); the three-dimensional point 40 can be interpolated from the neighboring three-dimensional points. In this example, the z coordinate Z'_t of the three-dimensional point 40 is calculated using RBF (radial basis function) interpolation.
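A sketch of the interpolation step Z'_t = f([X_0, Y_0], [X_t, Y_t, Z_t]) and of Δ_z = Z'_t - Z_0 is given below, again assuming SciPy's RBFInterpolator as the radial basis function interpolator; the local-neighbors setting and variable names are illustrative.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def wing_deformation(cloud_0, cloud_t, neighbors=50):
    """Full-field z-displacement of the wing at time t.

    cloud_0 : (N, 3) undeformed cloud [X_0, Y_0, Z_0] in the model frame.
    cloud_t : (M, 3) deformed cloud [X_t, Y_t, Z_t] in the model frame.
    """
    # Z'_t: the deformed surface height interpolated at the undeformed (x, y) positions.
    f = RBFInterpolator(cloud_t[:, :2], cloud_t[:, 2], neighbors=neighbors)
    z_t_prime = f(cloud_0[:, :2])
    return z_t_prime - cloud_0[:, 2]      # delta_z at every undeformed point
```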
Because the true deformation of the wing cannot be measured during the blowing test to serve as an accuracy reference, in this embodiment the wing tip was deformed under windless conditions by manually applying a load, and the deformation was measured with a laser tracker (measurement accuracy 0.001 mm); this measured value was used as the reference and compared with the measurement result of the present device. The comparison shows that the measurement error (standard deviation) of the device is 0.01 mm. The measurement accuracy is limited by the camera resolution, and higher accuracy can be achieved when a higher-resolution camera is employed.
The measured results of this embodiment show that full-field measurement of aircraft wing deformation is achieved. The wing region contains about 1.014 million pixels. With the pixel-by-pixel dense matching method, 1.014 million three-dimensional points are calculated, giving 1.014 million wing deformation measurement values. With the gridded discrete point matching method and Su = 8, Sv = 8, about 15 thousand values are actually matched, and 30 thousand three-dimensional points and 30 thousand wing deformation measurement values are obtained through interpolation compensation. Both far exceed the roughly 1000 measurement points of the Optotrak system and the roughly 100 measurement points of model deformation measurement devices based on photogrammetry. The invention therefore effectively increases the number of measurements and improves the spatial resolution of wing deformation measurement.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. In addition, the term "connected" used herein may be directly connected or indirectly connected via other components without being particularly described.
The above-mentioned embodiments are intended to illustrate the objects, technical solutions and advantages of the present invention in further detail, and it should be understood that the above-mentioned embodiments are merely exemplary embodiments of the present invention, and are not intended to limit the scope of the present invention, and any modifications, equivalent substitutions, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (2)

1. A wind tunnel test airplane wing deformation full-field measurement method is characterized by comprising the following steps:
S1, making a speckle pattern on the wing surface;
S2, calibrating the stereo vision measurement system to obtain the internal parameter P_a of camera a, the internal parameter P_b of camera b, the calibration parameter P_s of the stereo vision measurement system, and the transformation matrix R for converting the measurement system coordinate system into the aircraft model coordinate system;
S3, acquiring the test image sequences from the images photographed by camera a and camera b before and during the blowing test;
S4, calculating the full-field wing deformation measurement result based on the test image sequences;
the specific method of step S2 includes:
S21, calibrating camera a to obtain the internal parameter P_a of camera a;
S22, calibrating camera b to obtain the internal parameter P_b of camera b;
S23, establishing the measurement system coordinate system [i, j, k] with camera a as the reference, and calibrating the stereo vision measurement system consisting of camera a and camera b to obtain the calibration parameter P_s of the stereo vision measurement system;
S24, calculating a transformation matrix R for converting the coordinate system [ i, j, k ] of the measurement system into the coordinate system [ x, y, z ] of the model according to the artificial mark points on the surface of the wing;
the method for acquiring the test image sequence in the step S3 includes:
S31, before the blowing test, camera a and camera b photograph speckle pattern images of the undeformed wing surface in the windless state, which are recorded as the before-test images of camera a and camera b respectively;
S32, during the blowing test, camera a and camera b photograph speckle pattern images of the deformed wing surface in the windy state at each time t, which are recorded as the time-t images of camera a and camera b, wherein k represents the number of shots;
S33, storing, in chronological order, the before-test image of camera a and the images photographed by camera a at the successive times t to obtain the image sequence photographed by camera a; and storing, in chronological order, the before-test image of camera b and the images photographed by camera b at the successive times t to obtain the image sequence photographed by camera b;
The method for calculating the full-field measurement result of the wing deformation in the step S4 includes:
S41, preprocessing the images;
S42, calculating the three-dimensional point cloud of the wing surface before the blowing test;
S43, calculating the three-dimensional point cloud of the wing surface at time t during the blowing test;
S44, calculating the full-field wing deformation measurement result at time t;
the image preprocessing comprises:
S411, performing distortion correction, using the internal parameters P_a and P_b respectively, on all images in the image sequence photographed by camera a and all images in the image sequence photographed by camera b, to obtain the distortion-corrected image sequences of camera a and camera b;
S412, performing stereo correction on the distortion-corrected image sequences according to the calibration parameter P_s of the stereo vision measurement system, so that homonymous pixels in the stereo image pairs photographed by camera a and camera b are aligned in the image ordinate direction, obtaining the stereo-corrected image sequences;
S413, extracting the speckle pattern region of interest T, eliminating the background image pixels that are not located on the wing surface, and retaining the pixels in the speckle pattern region of the wing surface;
the method for extracting the spot pattern interesting region T comprises the following steps:
s4131, distinguishing a speckle pattern image and a test section background image by using a pixel neighborhood gradient mean value G;
Figure FDA0003756728040000022
wherein s is the neighborhood size, the value range is 1-1000, and the unit is a pixel;
Figure FDA0003756728040000023
is the 1 st derivative of the (u, v) th pixel in the image u coordinate direction,
Figure FDA0003756728040000024
is the 1 st derivative of the (u, v) th pixel in the image v coordinate direction;
s4132, setting a Flag matrix Flag equal to the size of the image, and setting a threshold value
Figure FDA0003756728040000025
Figure FDA0003756728040000026
The value range of (a) is 0.0001-10000;
s4133, traversing the whole image, and obtaining the pixel at the position of the image (i, j)
Figure FDA0003756728040000027
Then, for the speckle pattern image, the Flag (i, j) is set to 1; otherwise, setting the Flag (i, j) to be 0 for the background image of the test segment;
s4134, after the image is traversed, firstly carrying out corrosion operation on the Flag matrix Flag to eliminate interference noise, and then carrying out expansion operation to obtain T; in T, 1 represents that the pixel is a speckle pattern to be processed, and 0 represents that the pixel is a non-speckle pattern to be processed;
the calculation method of the three-dimensional point cloud on the surface of the wing before the blowing test comprises the following steps:
S421, extracting, from the stereo-corrected image sequences of camera a and camera b respectively, the images photographed before the blowing test, composing them into a stereo image pair, and using a dense matching method to perform dense stereo matching on the pixels of the stereo image pair that lie in T;
S422, adopting the ICGN optical flow algorithm, selecting an affine transformation model to model the deformation of the image block, and performing sub-pixel matching on the image block to obtain the pixel coordinates of the sub-pixel matching result;
S423, traversing in turn the pixels of camera a's stereo-corrected image that lie in T as block centers, to obtain the sub-pixel dense matching result;
S424, calculating, by inverse mapping according to the mapping relation of the stereo correction process, the sub-pixel image coordinates of the sub-pixel matching result pixel coordinates before stereo correction;
S425, according to the calibration parameter P_s of the stereo vision measurement system, calculating from the sub-pixel image coordinates the three-dimensional point cloud in the measurement system coordinate system [i, j, k], and converting it, according to the transformation matrix R, into the three-dimensional point cloud [X_0, Y_0, Z_0] in the aircraft model coordinate system;
the traversal for the sub-pixel dense matching result is carried out by dividing the image into a plurality of grids of size Su x Sv and using a gridded discrete point matching and interpolation compensation method, wherein Su and Sv are in pixels;
full field measurement result delta of wing deformation at time t z The calculating method comprises the following steps: before blowing test, three-dimensional point cloud [ X ] under the coordinate system of airplane model 0 ,Y 0 ,Z 0 ]For reference, calculating the displacement delta of the morphing wing in the z direction at the time t z =Z' t -Z 0
Wherein, Z' t =f([X 0 ,Y 0 ],[X t ,Y t ,Z t ]) Is represented by [ X ] 0 ,Y 0 ]For input, by interpolation of a three-dimensional point cloud f (-) from [ X t ,Y t ,Z t ]And interpolating the calculated z-direction coordinates.
2. The wind tunnel test aircraft wing deformation full-field measurement method according to claim 1, wherein the method for making the speckle pattern in step S1 comprises the following steps:
S11, making, in a piece of cloth or paper used as a mask, a plurality of holes with apertures of 0.0001-1000 mm, randomly distributed positions, and different sizes;
S12, covering the surface of the aircraft wing with the mask and applying paint of color A to obtain a color-A coating; removing the mask to obtain a color-A speckle pattern;
S13, after the color-A paint has cured, applying paint of color B to obtain a color-B coating that completely covers the color-A speckle pattern;
S14, after the color-B coating has cured and dried, grinding and polishing the color-B coating until the color-A speckles are exposed through the color-B coating, thereby obtaining a speckle pattern with a smooth surface.