CN115900571A - Liquid rocket engine structure deformation measuring method based on optical image observation - Google Patents

Publication number
CN115900571A
CN115900571A (application CN202211288266.1A)
Authority
CN
China
Prior art keywords
camera
image
images
coordinates
displacement
Prior art date
Legal status
Pending (the legal status is an assumption, not a legal conclusion)
Application number
CN202211288266.1A
Other languages
Chinese (zh)
Inventor
王珺
闫松
张志伟
袁军社
李学龙
陈穆林
肖勇
Current Assignee
Northwestern Polytechnical University
Xian Aerospace Propulsion Institute
Original Assignee
Northwestern Polytechnical University
Xian Aerospace Propulsion Institute
Priority date
Filing date
Publication date
Application filed by Northwestern Polytechnical University and Xian Aerospace Propulsion Institute
Priority to CN202211288266.1A
Publication of CN115900571A


Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a method for measuring the structural deformation of a liquid rocket engine based on optical image observation. The method observes with high-precision industrial cameras, measures the displacement of several key points using the binocular-vision depth estimation principle, and is calibrated on a displacement simulation platform, thereby realizing structural deformation measurement of the liquid rocket engine. The method offers both high detection speed and high detection accuracy and has high practical application value.

Description

Liquid rocket engine structure deformation measuring method based on optical image observation
Technical Field
The invention belongs to the field of intelligent photoelectric and optical measurement, and particularly relates to a method for measuring structural deformation of a liquid rocket engine in a static test scene.
Background
The structural performance of the liquid rocket engine determines the loading capacity and the working condition of the carrier rocket. Therefore, before the liquid rocket engine is assembled, a static test needs to be carried out on the liquid rocket engine to complete the verification of the mechanical property of the liquid rocket engine. The static test of the liquid engine is used for researching and verifying the strength characteristic of the engine under the stress condition by applying static load to the engine, and the classical structural deformation measuring method mainly comprises a laser measuring method, a strain rod measuring method and the like.
The method achieves the purpose of structural deformation measurement by measuring the displacement and the strain of a key position under the application of static force, is mostly a contact type measurement method, needs to arrange an instrument and a connection system on the surface of a measured object in advance, has the limitations of complex preparation work, limited number of measurement points, interference of environmental factors and the like, and is difficult to realize the all-dimensional measurement of a large-size object.
In recent years, with the continuous development of computing devices and intelligent photoelectric technologies, measurement technologies based on optical image observation are applied to various fields such as aerospace, medical health, intelligent manufacturing and the like. Among them, the stereo optical measurement system composed of a plurality of camera sets has gained wide attention due to its characteristics such as high precision, easy operation, repeatability, etc.
Disclosure of Invention
Technical problem to be solved
In order to avoid the defects of the prior art, the invention provides a liquid rocket engine structure deformation measuring method based on optical image observation.
Technical scheme
A liquid rocket engine structure deformation measuring method based on optical image observation, characterized in that: according to the actual requirements of multipoint displacement measurement at key points of a liquid rocket engine, several groups of binocular vision systems are designed to detect multipoint displacement; when a binocular vision system starts working, the optical image of the measured object is imaged on the industrial camera's image sensor through the lens; the image sensor then converts the optical signal into an analog electrical signal, which an analog-to-digital converter converts into a digital signal; the digital information is then processed by the image processor and stored in memory; finally, the image is output through a digital or video interface. The method comprises the following steps:
Step 1: calibrate the binocular vision system using the Zhang Zhengyou calibration method;
Step 2: simultaneously acquire multiple field-of-view images in hard-trigger mode, each field of view containing multiple target points, and number the images acquired by different cameras to distinguish them;
Step 3: perform distortion correction and stereo rectification with the parameters obtained from camera-group calibration, so that corresponding epipolar lines of the images captured by the left and right cameras lie on the same horizontal line;
Step 4: define a region of interest in the rectified images, calculate the target-point disparity value to obtain real coordinates, and calculate the displacement value from the position changes at different times;
Step 5: by analyzing the rocket engine static test scene, construct a multipoint displacement simulation system from several electric triaxial displacement platforms to simulate key-point displacement in the static test, and calibrate and tune the system; then test in a real static test, integrate the displacement data measured by the system, construct a three-dimensional graph in software, and display it visually on a platform to realize structural deformation measurement of the liquid rocket engine.
The further technical scheme of the invention is as follows: the step 1 is as follows:
1a: a checkerboard calibration plate made of float glass is used, the size of the checkerboard is 20mm, and the side length manufacturing error is less than 0.01mm;
1b: placing a chessboard calibration plate by using a fixing device, fixing the camera to be unchanged, moving the calibration plate, and shooting multi-angle images by using the camera to obtain corresponding images under left and right visual angles of a binocular camera;
1c: inputting the obtained left and right images into a software system in pair, and performing distortion correction to obtain a primary internal and external parameter matrix;
1d: eliminating the calibration image with the reprojection error larger than 0.1 pixel, ensuring the reprojection error to be within the range of 0.1 pixel, and obtaining an internal and external parameter matrix and a distortion coefficient matrix shown as a formula (1):
Figure BDA0003900736450000031
wherein f is x ,f y Denotes the focal length, c x ,c y The coordinates of the main point are represented by,
Figure BDA0003900736450000032
represents a rotation matrix describing the orientation of the coordinate axis of the world coordinate system relative to the camera coordinate axis, and>
Figure BDA0003900736450000033
representing translation vectors describing the coordinates of the origin of space, k, in the camera coordinate system 1 ,k 2 Is the distortion coefficient.
The further technical scheme of the invention is as follows: the step 3 is as follows:
3a: the distortion correction is respectively carried out on the acquired left camera image and the acquired right camera image, and known image pixels are obtained
Figure BDA00039007364500000310
Substituting the coordinates into formulas (2) and (3), calculating pixel coordinates (u, v) after correction, and obtaining an image after distortion correction, wherein the flow is shown as follows;
Figure BDA0003900736450000034
Figure BDA0003900736450000035
wherein (u, v) represents the pixel coordinates after correcting the distortion,
Figure BDA0003900736450000036
showing the pixel coordinates at actual radial distortion, (x, y) showing the coordinates of the continuous image at ideal no distortion, (u) 0 ,v 0 ) Representing the principal point of the camera, k 1 ,k 2 Represents a distortion coefficient;
3b: the left camera image and the right camera image after the distortion correction are subjected to three-dimensional correction, and the method comprises the following steps:
1): finding optical center positions c for left and right cameras, respectively 1 ,c 2 As shown in equations (4) and (5), the internal and external parameters R, t obtained by equation (1) are substituted into:
Figure BDA0003900736450000037
Figure BDA0003900736450000038
wherein c is 1 ,c 2 Respectively representing the optical center positions, R, of the left and right cameras 1 ,R 2 Respectively representing left and right camera rotation matrices, K 1 ,K 2 Respectively, the camera internal reference matrix is represented,
Figure BDA0003900736450000039
respectively representing left and right camera translation matrices;
2): determining new camera rotation matrix, substituting the results obtained from formulas (4) and (5) into the following formula, and calculating new camera rotation matrix R n
Figure BDA0003900736450000041
/>
Wherein
Figure BDA0003900736450000042
r 2 =R 1 (3,:)×r 1 ,r 3 =r 1 ×r 2
3): calculating new camera internal reference matrix K n
Figure BDA0003900736450000043
4): solving a transformation matrix T 1 ,T 2
Figure BDA0003900736450000044
Figure BDA0003900736450000045
5): by transformation matrix T 1 ,T 2 And respectively changing the two images to obtain the images after the three-dimensional correction.
The further technical scheme of the invention is as follows: the step 4 is as follows:
4a: respectively demarcating interested areas on the corrected left camera image and the corrected right camera image, and determining the position of the center of the angular point target in the images, wherein the steps are as follows:
1): defining an interested area from the collected image;
2): filtering the region of interest to remove noise;
3): detecting the foreground area by using an OTSU algorithm to further reduce the retrieval range, and drawing a maximum external rectangle by using contour detection;
4): defining a retrieval mask and carrying out corrosion treatment;
5): performing sub-pixel corner detection on the image, and selecting the mode of parallax as final parallax;
(4b) The method comprises the following steps Obtaining the pixel position of the target point according to the left and right camera images to obtain the parallax disparity (x) r ,x l ) Calculating the real coordinates (x, y, z) of the target point through the projection matrix:
the operation flow is as follows, and the formulas are shown as (10), (11) and (12):
[X,Y,Z,W] T =Q×[x,y,disparity(x r ,x l ),1] T , (10)
Figure BDA0003900736450000046
wherein:
Figure BDA0003900736450000051
c x ,c y is the coordinate of the main point of the left camera in the image, f is the focal length, T x Is a translation between the projection centers of the two cameras, c' x The coordinates of the main point of the right camera in the image are obtained, so that the coordinates (x, y, z) of the key point can be obtained from the formula;
(4c) The method comprises the following steps By capturing images of the target point at different times, the displacement value is calculated using the real coordinates X = (X, y, z) already determined, assuming the time t 1 >t 2 The displacement X can be obtained, as shown in equation (13):
Figure BDA0003900736450000052
(4d) The method comprises the following steps Steps 1 to 3 are repeated for each target point of the camera group.
Advantageous effects
According to the requirements of structural deformation measurement in the liquid rocket engine static test scene, the method introduces computer vision into key-point displacement measurement for the static test. Its main advantages, non-contact operation, repeatability, and ease of use, greatly reduce the manpower and material cost of the test scene. First, after the camera groups are calibrated with the Zhang Zhengyou calibration method, optical mark points are attached to the key points, and a spatial coordinate calculation algorithm detects the movement of the mark points to represent the key-point displacement. The algorithm has two advantages. In measurement efficiency, a sparse disparity calculation algorithm and user-defined regions of interest reduce computation time: the measurement time of a single camera group is optimized to 0.2 s per measurement, meeting real-time requirements. In measurement accuracy, an improved threshold-segmentation algorithm and a sub-pixel localization algorithm for the geometric center of the optical mark point achieve a measurement resolution of 0.05 mm/m, meeting the displacement measurement requirements of the static test. Finally, an electric simulated-displacement platform increases test repeatability, enables calibration and tuning of the system, and visually displays the measured displacement data, making it easy to verify the accuracy of the structural deformation measurement method.
The invention overcomes the drawbacks of traditional contact measurement methods, such as difficult operation, poor repeatability, and complex layout, and realizes non-contact measurement with a higher detection speed.
Drawings
The drawings, in which like reference numerals refer to like parts throughout, are for the purpose of illustrating particular embodiments only and are not to be considered limiting of the invention.
FIG. 1 is a block diagram of the process flow of the present invention;
FIG. 2 is a display diagram of the binocular vision system;
FIG. 3 is a schematic diagram of the multi-camera group layout;
FIG. 4 is a stereo rectification result graph;
FIG. 5 is a left camera image with a demarcated region of interest;
FIG. 6 is a diagram of corner detection results;
FIG. 7 is a diagram of the electric simulation displacement platform;
FIG. 8 is an effect diagram verifying the multi-camera group layout on the electric displacement platform;
FIG. 9 is a structural deformation display view.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
FIG. 1 is a block diagram of the overall flow of a method for measuring structural deformation of a liquid rocket engine based on optical image observation;
Step 1: reduce repeated-test errors by designing a standardized calibration process, and acquire the intrinsic and extrinsic parameters of the high-resolution industrial cameras.
As shown in FIG. 2, the binocular vision system adopted by the present invention uses an industrial camera, model MV-CH120-10UM, with a frame rate of 23 frames/s, a focal length of 16 mm, and a resolution of 4096 × 3000 pixels. According to the actual requirements of multipoint displacement measurement at key points of a liquid rocket engine, several groups of binocular vision systems are designed to detect multipoint displacement. When a binocular vision system starts working, the optical image of the object to be measured is imaged on the industrial camera's image sensor through the lens; the image sensor then converts the optical signal into an analog electrical signal, which an analog-to-digital converter converts into a digital signal; the digital information is then processed by the image processor and stored in memory; finally, the image is output through a digital or video interface.
The invention calibrates the binocular vision system with the Zhang Zhengyou calibration method, using a checkerboard pattern. In the actual calibration process, shaking of the checkerboard, an uneven surface, or unclear squares easily increase the calibration error and affect the results of the whole experiment; and if calibration-plate images are partially missing or captured from a single angle, the computed intrinsic and extrinsic camera parameters carry larger errors. A calibration procedure for the Zhang Zhengyou method based on binocular optical image correction is therefore designed to reduce the calibration error, with the following specific steps:
(1a): Use a checkerboard calibration plate made of float glass, with a square size of 20 mm and a side-length manufacturing error of less than 0.01 mm. Such a plate reduces the influence of diffuse light reflection, and strictly controlling the side-length error ensures the accuracy of the calibration parameters;
(1b): Place the checkerboard calibration plate with a fixture, keep the camera position fixed, move the calibration plate, and capture multi-angle images with the camera. Since this operation aims at correcting lens distortion, calibration-plate images at different inclination angles should be captured; 20 images need to be collected during calibration, yielding corresponding images under the left and right viewing angles of the binocular camera;
(1c): Input the obtained left and right images in pairs into a software system (Matlab) and perform distortion correction to obtain a preliminary intrinsic and extrinsic parameter matrix;
(1d): The reprojection error is an important parameter for judging the accuracy of the calibration result. To meet the measurement accuracy requirement, the reprojection error of the calibration result must be less than 0.1 pixel, so calibration images with a reprojection error greater than 0.1 pixel are removed, ensuring the reprojection error stays within 0.1 pixel, and the intrinsic/extrinsic parameter matrices and distortion coefficient matrix shown in formula (1) are obtained, taking one camera as an example:
$$K=\begin{bmatrix} f_x & 0 & c_x\\ 0 & f_y & c_y\\ 0 & 0 & 1\end{bmatrix},\quad R=\begin{bmatrix} r_{11} & r_{12} & r_{13}\\ r_{21} & r_{22} & r_{23}\\ r_{31} & r_{32} & r_{33}\end{bmatrix},\quad t=\begin{bmatrix} t_x\\ t_y\\ t_z\end{bmatrix},\quad (k_1, k_2) \tag{1}$$

where $f_x, f_y$ denote the focal lengths, $c_x, c_y$ the principal-point coordinates (relative to the image plane), $R$ the rotation matrix describing the orientation of the world coordinate axes relative to the camera coordinate axes, $t$ the translation vector giving the world-origin coordinates in the camera coordinate system, and $k_1, k_2$ the distortion coefficients.
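The reprojection-error screen of step (1d) can be sketched in a few lines. The following numpy sketch is illustrative only; the array layout and function names are assumptions, not the patent's implementation:

```python
import numpy as np

def reprojection_errors(detected, reprojected):
    """Per-image RMS reprojection error in pixels.

    detected, reprojected: arrays of shape (n_images, n_corners, 2) holding
    detected corner pixel coordinates and their model reprojections.
    """
    d = np.asarray(detected, dtype=float) - np.asarray(reprojected, dtype=float)
    return np.sqrt((d ** 2).sum(axis=2).mean(axis=1))

def filter_calibration_images(detected, reprojected, max_error_px=0.1):
    """Indices of calibration images whose RMS reprojection error is within
    the 0.1-pixel threshold required by step (1d)."""
    errors = reprojection_errors(detected, reprojected)
    return np.flatnonzero(errors <= max_error_px)
```

Images rejected here would be removed from the calibration set before recomputing the intrinsic/extrinsic parameters.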
Step 2: simultaneously acquire multiple field-of-view images in hard-trigger mode, each field of view containing multiple target points, and number the images of different fields of view to distinguish them.
FIG. 3 is a schematic diagram of the multi-camera group constructed according to the actual working conditions. The multi-camera group forms a combined camera network that can cover multiple target points on different planes, and each camera group can capture all target points within its depth of field, meeting the rocket engine static test's requirements for key points on different planes. An FY6900 signal generator produces a 15 Hz square-wave signal, and hard triggering drives the multiple cameras to capture images simultaneously. The acquired images are numbered by field of view to distinguish them and read into a Python program.
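The numbering-and-distinguishing step above can be sketched in plain Python. The filename convention used here ('<fov>_<side>_<frame>.png') is a hypothetical example, since the patent does not specify a naming scheme:

```python
from collections import defaultdict

def group_frames(filenames):
    """Group captured frames by (field-of-view id, camera side), sorted by
    frame index, assuming names like 'fov1_left_00012.png' where all cameras
    triggered by the same hardware pulse share one frame index."""
    groups = defaultdict(list)
    for name in filenames:
        stem = name.rsplit(".", 1)[0]
        fov, side, frame = stem.split("_")
        groups[(fov, side)].append((int(frame), name))
    # Drop the sort keys, keeping only the ordered filenames per camera.
    return {key: [n for _, n in sorted(vals)] for key, vals in groups.items()}
```

Frames with the same index across the left and right lists of one field of view then form the stereo pairs processed in the following steps.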
Step 3: perform distortion correction and stereo rectification with the parameters obtained from camera-group calibration, so that corresponding epipolar lines of the images captured by the left and right cameras lie on the same horizontal line.
The invention takes a group of cameras as an example, and each group of cameras uses the same steps to obtain the target point displacement. The steps are as follows:
(3a): Perform distortion correction on the acquired left and right camera images separately. Substituting the known distorted image pixel coordinates $(\breve{u}, \breve{v})$ into formulas (2) and (3), the corrected pixel coordinates $(u, v)$ are computed, yielding the distortion-corrected image:

$$\breve{u} = u + (u - u_0)\left(k_1 r^2 + k_2 r^4\right) \tag{2}$$

$$\breve{v} = v + (v - v_0)\left(k_1 r^2 + k_2 r^4\right) \tag{3}$$

where $(u, v)$ are the pixel coordinates after distortion correction, $(\breve{u}, \breve{v})$ the pixel coordinates under actual radial distortion, $(x, y)$ the ideal distortion-free continuous image coordinates with $r^2 = x^2 + y^2$, $(u_0, v_0)$ the camera principal point, and $k_1, k_2$ the distortion coefficients.
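The radial-distortion model of formulas (2) and (3) maps ideal pixel coordinates to observed distorted ones. A minimal sketch of that forward model, with illustrative coefficient values and a hypothetical function name:

```python
def distort_pixel(u, v, x, y, u0, v0, k1, k2):
    """Apply the radial distortion of formulas (2), (3): map the ideal pixel
    coordinates (u, v), with normalized image coordinates (x, y), to the
    observed distorted pixel coordinates."""
    r2 = x * x + y * y
    factor = k1 * r2 + k2 * r2 * r2
    u_d = u + (u - u0) * factor
    v_d = v + (v - v0) * factor
    return u_d, v_d
```

Undistorting an observed pixel (the direction actually needed in step 3a) inverts this mapping, typically by fixed-point iteration or a precomputed remap table.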
(3b): Perform stereo rectification on the distortion-corrected left and right camera images (the rectified left and right images are shown in FIG. 4), as follows:
1): finding optical center positions c for left and right cameras, respectively 1 ,c 2 As shown in the formulas (4) and (5), the internal and external parameters R, t obtained by the formula (1) are substituted.
Figure BDA0003900736450000091
Figure BDA0003900736450000092
Wherein c is 1 ,c 2 Respectively representing the optical center positions, R, of the left and right cameras 1 ,R 2 Respectively representing left and right camera rotation matrices, K 1 ,K 2 Respectively, represent a reference matrix within the camera,
Figure BDA0003900736450000093
representing the left and right camera translation matrices, respectively.
2): determining new camera rotation matrix, substituting the results obtained from formulas (4) and (5) into the following formula, and obtaining new camera rotation matrix R n
Figure BDA0003900736450000094
Wherein
Figure BDA0003900736450000095
r 2 =R 1 (3,:)×r 1 ,r 3 =r 1 ×r 2
3): solving new camera internal parameter matrix K n As shown in equation (7).
Figure BDA0003900736450000096
4): solving a transformation matrix T 1 ,T 2 As shown in equations (8) and (9).
Figure BDA0003900736450000097
Figure BDA0003900736450000098
5): by transformation matrix T 1 ,T 2 And respectively changing the two images to obtain the images after the three-dimensional correction.
Step 4: define a region of interest in the rectified image, calculate the target-point disparity value, compute the spatial coordinates of the optical mark point, and calculate the displacement value from the position changes at different times.
(4a): Define regions of interest on the rectified left and right camera images respectively, and determine the position of the corner-point target center in the images, as follows:
1): Define a region of interest in the acquired image, taking the left camera as an example, as shown in FIG. 5; the right camera's region of interest for the same target point is defined in the same way.
2): Filter the region of interest to remove noise.
3): Detect the foreground region with the OTSU algorithm to further narrow the search range, and draw the maximum circumscribed rectangle using contour detection.
4): Define a search mask and apply erosion.
5): Perform sub-pixel corner detection on the image and select the mode of the disparity values as the final disparity; the result is shown in FIG. 6, where the white dot marks the detected center.
(4b): Obtain the pixel positions of the target point from the left and right camera images to get the disparity $\mathrm{disparity}(x_r, x_l)$, and calculate the real coordinates $(x, y, z)$ of the target point through the projection matrix, as shown in formulas (10), (11) and (12):

$$[X, Y, Z, W]^T = Q \times [x, y, \mathrm{disparity}(x_r, x_l), 1]^T \tag{10}$$

$$(x, y, z) = \left(\frac{X}{W}, \frac{Y}{W}, \frac{Z}{W}\right) \tag{11}$$

$$Q = \begin{bmatrix} 1 & 0 & 0 & -c_x \\ 0 & 1 & 0 & -c_y \\ 0 & 0 & 0 & f \\ 0 & 0 & -1/T_x & (c_x - c'_x)/T_x \end{bmatrix} \tag{12}$$

where $c_x, c_y$ are the coordinates of the left camera principal point in the image, $f$ is the focal length, $T_x$ is the translation between the two camera projection centers (a negative value), and $c'_x$ is the coordinate of the right camera principal point in the image; the key-point coordinates $(x, y, z)$ follow from the above.
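The reprojection of formulas (10) to (12) can be sketched directly. The intrinsic values in the example below are placeholders, not the patent's calibrated parameters:

```python
import numpy as np

def triangulate(x, y, disparity, cx, cy, f, Tx, cx_prime):
    """Recover real coordinates (x, y, z) from a pixel position and its
    disparity via the reprojection matrix Q of formulas (10)-(12)."""
    Q = np.array([
        [1.0, 0.0, 0.0, -cx],
        [0.0, 1.0, 0.0, -cy],
        [0.0, 0.0, 0.0, f],
        [0.0, 0.0, -1.0 / Tx, (cx - cx_prime) / Tx],
    ])
    X, Y, Z, W = Q @ np.array([x, y, disparity, 1.0])
    return X / W, Y / W, Z / W
```

With identical principal points the depth reduces to the familiar $z = f\,|T_x| / \mathrm{disparity}$, so larger disparities map to nearer points.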
(4c): By capturing images of the target point at different times, the displacement value is calculated from the real coordinates $X = (x, y, z)$ already determined; assuming times $t_1 > t_2$, the displacement $\Delta X$ is obtained as shown in formula (13):

$$\Delta X = \sqrt{(x_{t_1}-x_{t_2})^2 + (y_{t_1}-y_{t_2})^2 + (z_{t_1}-z_{t_2})^2} \tag{13}$$
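Assuming the displacement in formula (13) is the Euclidean distance between the key-point positions at the two times, step (4c) is a one-liner with the standard library (the function name is illustrative):

```python
import math

def displacement(p1, p2):
    """Euclidean displacement between two (x, y, z) key-point positions."""
    return math.dist(p1, p2)
```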
(4d): Repeat steps 1 to 3 for each target point of the camera group.
Step 5: by analyzing the liquid rocket engine static test scene, construct a multipoint displacement simulation system from several electric triaxial displacement platforms to simulate key-point displacement in the static test, and calibrate the system. Integrate the displacement data measured by the system, construct a three-dimensional graph in software, and display it visually on a platform.
Given the complexity and non-repeatability of the liquid rocket engine static test, and considering the scalability of visual displacement monitoring results and the economy of repeated experiments, a displacement platform is built in the laboratory to simulate key-point displacement in the static test.
For the test requirements of the liquid engine, as shown in FIG. 7, the dedicated controller is on the left and the three-dimensional displacement platform on the right. An electrically driven XYZ three-axis motion platform is adopted, controlled in real time by a Zolix SC300 controller or a PLC, and the displacement along each axis is recorded precisely. The displacement stage has an XYZ travel of 20 mm and a repeat positioning accuracy of no more than 1.5 μm, which meets the experimental requirements.
The invention verifies the movement accuracy of target points in different planes with the electric displacement stage, demonstrating the effectiveness of the method. Because the liquid engine test requires multiple target points on different planes to be detected simultaneously, as shown in FIG. 8, the left camera group captures two target points on one plane and the right camera group captures three target points on another plane; the target points seen by the two groups are not coplanar. This arrangement of the multi-camera group and the electric displacement stage simulates the test environment and meets the actual requirements, so the measurement accuracy can first be tuned on the displacement stage and the system then transferred to the real application scene.
The electric displacement stage drives the target points to move, and images of the key-point displacement are recorded synchronously in several camera groups by hardware pulse triggering. After steps 1 to 4 are performed for each camera group, visualization software produces the final target-point graph, as shown in FIG. 9: the horizontal axis is time in seconds (s) and the vertical axis is displacement in millimeters (mm), i.e., a display of the multipoint displacements. Through visual analysis of the displacement of different key points, the structural deformation of the liquid rocket engine is observed from the key-position displacements, facilitating subsequent stress-characteristic testing of the engine.
While the invention has been described with reference to specific embodiments, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention.

Claims (4)

1. A liquid rocket engine structure deformation measuring method based on optical image observation is characterized in that: designing a plurality of groups of binocular vision systems to detect multipoint displacement according to the actual requirements of multipoint displacement measurement of key points of a liquid rocket engine, and imaging an optical image of an object to be measured on an industrial camera image sensor through a lens when the binocular vision systems are started to work; then the image sensor converts the optical signal into an analog electrical signal, and converts the analog signal into a digital signal through a digital-to-analog converter; then the digital information is processed by the image processor and stored in the memory; finally, inputting through a digital interface or a video interface; the method comprises the following steps:
step 1: calibrating a binocular vision system by adopting a Zhangyingyou calibration method;
Step 2: simultaneously acquiring images of a plurality of fields of view by hardware triggering, each field of view containing a plurality of target points, and numbering the images acquired by the different cameras to distinguish them;
Step 3: performing distortion correction and stereo rectification according to the parameters obtained from the camera-group calibration, so that corresponding epipolar lines of the images captured by the left and right cameras lie on the same horizontal line;
Step 4: delineating a region of interest in the corrected images, calculating the disparity value of each target point to obtain its real coordinates, and calculating displacement values from the position changes at different moments;
Step 5: analyzing the static test scene of the rocket engine, constructing a multipoint displacement simulation system from a plurality of electric triaxial displacement platforms to simulate the displacement of the key points in the static test, and calibrating and adjusting the system; then testing in a real static test, integrating the displacement data measured by the system, constructing a three-dimensional graph through software, and displaying it visually on a platform, thereby realizing the structural deformation measurement of the liquid rocket engine.
2. The method for measuring structural deformation of a liquid rocket engine based on optical image observation according to claim 1, wherein step 1 is as follows:
1a: using a checkerboard calibration plate made of float glass, with a checker square size of 20 mm and a side-length manufacturing error of less than 0.01 mm;
1b: placing the checkerboard calibration plate with a fixture, keeping the cameras fixed, moving the calibration plate, and capturing multi-angle images with the cameras to obtain corresponding images from the left and right viewpoints of the binocular camera;
1c: inputting the obtained left and right images into the software system in pairs and performing distortion correction to obtain preliminary internal and external parameter matrices;
1d: eliminating calibration images whose reprojection error exceeds 0.1 pixel, so that the reprojection error stays within 0.1 pixel, and obtaining the internal and external parameter matrices and the distortion coefficients shown in formula (1):

$$K=\begin{bmatrix}f_x&0&c_x\\0&f_y&c_y\\0&0&1\end{bmatrix},\quad R=\begin{bmatrix}r_{11}&r_{12}&r_{13}\\r_{21}&r_{22}&r_{23}\\r_{31}&r_{32}&r_{33}\end{bmatrix},\quad t=\begin{bmatrix}t_x\\t_y\\t_z\end{bmatrix},\quad (k_1,k_2) \tag{1}$$

wherein $f_x, f_y$ denote the focal lengths, $(c_x, c_y)$ the principal point coordinates, $R$ the rotation matrix describing the orientation of the world coordinate axes relative to the camera coordinate axes, $t$ the translation vector giving the coordinates of the world origin in the camera coordinate system, and $k_1, k_2$ the distortion coefficients.
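Sub-step 1d can be sketched as follows: a calibration image is kept only if the RMS distance between its detected corners and the corners reprojected through the parameters of formula (1) stays within 0.1 pixel. The pinhole projection is standard; the function names and the data layout are illustrative assumptions.

```python
import numpy as np

def project(points_w, K, R, t):
    """Pinhole projection of (N, 3) world points with intrinsics K and pose R, t."""
    pc = points_w @ R.T + t          # world -> camera coordinates
    uv = pc @ K.T                    # camera -> homogeneous pixel coordinates
    return uv[:, :2] / uv[:, 2:3]    # dehomogenize to (u, v)

def filter_images(obj_pts, img_pts_per_image, K, poses, max_err=0.1):
    """Keep only the indices of images whose RMS reprojection error
    is within max_err pixels (0.1 px in sub-step 1d)."""
    kept = []
    for i, (img_pts, (R, t)) in enumerate(zip(img_pts_per_image, poses)):
        err = np.linalg.norm(project(obj_pts, K, R, t) - img_pts, axis=1)
        if np.sqrt(np.mean(err ** 2)) <= max_err:
            kept.append(i)
    return kept
```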
3. The method for measuring structural deformation of a liquid rocket engine based on optical image observation according to claim 1, wherein step 3 is as follows:
3a: performing distortion correction on the acquired left and right camera images separately; substituting the known distorted pixel coordinates $(\breve{u},\breve{v})$ into formulas (2) and (3) and solving for the corrected pixel coordinates $(u,v)$ yields the distortion-corrected image:

$$\breve{u}=u+(u-u_0)\left[k_1(x^2+y^2)+k_2(x^2+y^2)^2\right] \tag{2}$$

$$\breve{v}=v+(v-v_0)\left[k_1(x^2+y^2)+k_2(x^2+y^2)^2\right] \tag{3}$$

wherein $(u,v)$ are the pixel coordinates after correcting the distortion, $(\breve{u},\breve{v})$ the pixel coordinates under actual radial distortion, $(x,y)$ the continuous image coordinates under ideal distortion-free imaging, $(u_0,v_0)$ the principal point of the camera, and $k_1,k_2$ the distortion coefficients;
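Formulas (2) and (3) map the ideal coordinates to the distorted ones, so distortion correction has to invert them numerically. A common fixed-point iteration on the normalized image coordinates is sketched below; this particular solver is an illustrative choice, not necessarily the one used by the patent.

```python
import numpy as np

def distort(x, y, k1, k2):
    """Forward radial distortion of normalized coordinates (the model behind
    formulas (2)-(3))."""
    r2 = x * x + y * y
    s = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * s, y * s

def undistort(xd, yd, k1, k2, iters=20):
    """Invert the radial model by fixed-point iteration: repeatedly divide the
    distorted coordinates by the distortion factor evaluated at the current
    estimate. Converges quickly for the small k1, k2 of industrial lenses."""
    x, y = xd, yd                      # initial guess: the distorted coordinates
    for _ in range(iters):
        r2 = x * x + y * y
        s = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / s, yd / s
    return x, y
```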
3b: performing stereo rectification on the distortion-corrected left and right camera images, as follows:
1): finding the optical center positions $c_1, c_2$ of the left and right cameras, as shown in formulas (4) and (5), by substituting the internal and external parameters obtained from formula (1):

$$c_1=-(K_1R_1)^{-1}K_1t_1=-R_1^{\top}t_1 \tag{4}$$

$$c_2=-(K_2R_2)^{-1}K_2t_2=-R_2^{\top}t_2 \tag{5}$$

wherein $c_1, c_2$ denote the optical center positions of the left and right cameras, $R_1, R_2$ their rotation matrices, $K_1, K_2$ their internal parameter matrices, and $t_1, t_2$ their translation vectors;
2): determining the new camera rotation matrix $R_n$ by substituting the results of formulas (4) and (5):

$$R_n=\begin{bmatrix}r_1^{\top}\\r_2^{\top}\\r_3^{\top}\end{bmatrix} \tag{6}$$

wherein $r_1=\dfrac{c_1-c_2}{\left\|c_1-c_2\right\|}$, $r_2=R_1(3,:)\times r_1$, $r_3=r_1\times r_2$;
3): calculating the new camera internal parameter matrix $K_n$:

$$K_n=\frac{K_1+K_2}{2} \tag{7}$$
4): solving the transformation matrices $T_1, T_2$:

$$T_1=K_nR_n(K_1R_1)^{-1} \tag{8}$$

$$T_2=K_nR_n(K_2R_2)^{-1} \tag{9}$$
5): transforming the two images with the transformation matrices $T_1, T_2$ respectively to obtain the stereo-rectified images.
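Sub-steps 1) to 5) of the stereo rectification can be sketched directly from formulas (4) to (9). One detail left implicit in the claim is normalizing $r_2$ and $r_3$ so that $R_n$ is a proper rotation; the sketch below adds that, and is otherwise illustrative.

```python
import numpy as np

def rectify_transforms(K1, R1, t1, K2, R2, t2):
    """Return (T1, T2): homographies that map each distortion-corrected image
    onto the common rectified image plane, per formulas (4)-(9)."""
    c1 = -R1.T @ t1                            # optical centers, formulas (4)-(5)
    c2 = -R2.T @ t2
    r1 = (c1 - c2) / np.linalg.norm(c1 - c2)   # new x-axis along the baseline
    r2 = np.cross(R1[2], r1)                   # R1(3,:) x r1, formula (6)
    r2 /= np.linalg.norm(r2)                   # normalize so Rn is a rotation
    r3 = np.cross(r1, r2)
    Rn = np.stack([r1, r2, r3])
    Kn = (K1 + K2) / 2.0                       # shared intrinsics, formula (7)
    T1 = Kn @ Rn @ np.linalg.inv(K1 @ R1)      # formula (8)
    T2 = Kn @ Rn @ np.linalg.inv(K2 @ R2)      # formula (9)
    return T1, T2
```

Applying `T1` and `T2` to the homogeneous pixel coordinates of any 3-D point yields identical image rows in both views, which is exactly the epipolar alignment step 3 of claim 1 requires.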
4. The method for measuring structural deformation of a liquid rocket engine based on optical image observation according to claim 1, wherein step 4 is as follows:
4a: delineating regions of interest on the corrected left and right camera images respectively, and determining the position of the corner-target center in the images, as follows:
1): delineating a region of interest in the acquired image;
2): filtering the region of interest to remove noise;
3): detecting the foreground area with the Otsu algorithm to further narrow the search range, and drawing the maximum bounding rectangle by contour detection;
4): defining a search mask and applying erosion;
5): performing sub-pixel corner detection on the image, and selecting the mode of the disparity values as the final disparity;
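Sub-step 3) uses Otsu's algorithm to separate the target foreground from the background. In practice `cv2.threshold` with the `THRESH_OTSU` flag would normally be used; the standalone histogram version below just makes the between-class-variance criterion explicit.

```python
import numpy as np

def otsu_threshold(gray):
    """Return the gray level maximizing the between-class variance
    of an 8-bit image (Otsu's criterion)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()                    # gray-level probabilities
    omega = np.cumsum(p)                     # class-0 probability up to each level
    mu = np.cumsum(p * np.arange(256))       # cumulative mean up to each level
    mu_t = mu[-1]                            # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b = np.nan_to_num(sigma_b)         # empty classes contribute nothing
    return int(np.argmax(sigma_b))
```

Pixels above the returned threshold form the foreground mask on which the bounding rectangle of sub-step 3) is drawn.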
4b: obtaining the pixel positions of the target point from the left and right camera images to get the disparity $d(x_r,x_l)$, and calculating the real coordinates $(x,y,z)$ of the target point from the projection matrix, as shown in formulas (10), (11) and (12):

$$[X,Y,Z,W]^{\top}=Q\times[x,y,d(x_r,x_l),1]^{\top} \tag{10}$$

$$(x,y,z)=\left(\frac{X}{W},\frac{Y}{W},\frac{Z}{W}\right) \tag{11}$$

$$Q=\begin{bmatrix}1&0&0&-c_x\\0&1&0&-c_y\\0&0&0&f\\0&0&-\dfrac{1}{T_x}&\dfrac{c_x-c'_x}{T_x}\end{bmatrix} \tag{12}$$

wherein $c_x, c_y$ are the coordinates of the left camera principal point in the image, $f$ is the focal length, $T_x$ is the translation between the projection centers of the two cameras, and $c'_x$ is the coordinate of the right camera principal point in the image; the key-point coordinates $(x,y,z)$ are thus obtained from the above formulas;
4c: capturing images of the target point at different times and using the real coordinates $X=(x,y,z)$ already obtained to calculate the displacement value; assuming times $t_1 > t_2$, the displacement $\Delta X$ is obtained as shown in formula (13):

$$\Delta X=\sqrt{(x_{t_1}-x_{t_2})^2+(y_{t_1}-y_{t_2})^2+(z_{t_1}-z_{t_2})^2} \tag{13}$$
4d: repeating steps 4a to 4c for each target point of the camera group.
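Formulas (10) to (13) can be sketched as follows. The layout of $Q$ follows formula (12), the standard reprojection matrix of a rectified stereo pair, with $T_x$ the signed translation between the projection centers (negative when the right camera sits in the $+x$ direction); the variable and function names are illustrative.

```python
import numpy as np

def make_Q(f, cx, cy, cx_r, Tx):
    """Reprojection matrix of formula (12)."""
    return np.array([[1.0, 0.0, 0.0, -cx],
                     [0.0, 1.0, 0.0, -cy],
                     [0.0, 0.0, 0.0, f],
                     [0.0, 0.0, -1.0 / Tx, (cx - cx_r) / Tx]])

def reproject(u, v, disparity, Q):
    """Formulas (10)-(11): homogeneous reprojection, then dehomogenization."""
    X, Y, Z, W = Q @ np.array([u, v, disparity, 1.0])
    return np.array([X, Y, Z]) / W

def displacement(p_t1, p_t2):
    """Formula (13): Euclidean displacement between two 3-D positions."""
    return float(np.linalg.norm(np.asarray(p_t1) - np.asarray(p_t2)))
```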
CN202211288266.1A 2022-10-20 2022-10-20 Liquid rocket engine structure deformation measuring method based on optical image observation Pending CN115900571A (en)

Publications (1)

Publication Number Publication Date
CN115900571A true CN115900571A (en) 2023-04-04


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination