CN115808150A - Curved surface deflection angle detection method and device based on binocular vision angle measurement - Google Patents


Publication number
CN115808150A
Authority
CN
China
Prior art keywords
target
deflection
plane
image
spot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211280967.0A
Other languages
Chinese (zh)
Inventor
高红伟
谷雨珊
张晨利
才永峰
何松亚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Andawell Aviation Equipment Co Ltd
Original Assignee
Beijing Andawell Aviation Equipment Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Andawell Aviation Equipment Co Ltd filed Critical Beijing Andawell Aviation Equipment Co Ltd
Priority claimed from CN202211280967.0A
Publication of CN115808150A

Landscapes

  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)

Abstract

The application provides a curved-surface deflection angle detection method based on binocular vision angle measurement, applied to a binocular vision angle measurement device. The method comprises the following steps: receiving a target surface deflection angle measurement request sent by user equipment, where the target surface deflection angle represents the deflection angle between a target surface and a reference surface, both of which are curved; performing surface-fixing marking on the target surface with a preset number of targets, the preset number being at least 3, where surface-fixing marking means determining the plane in which any point on the target surface lies, and each target carries a unique label; acquiring a reference binocular image of the target surface, comprising a first reference image and a second reference image; and performing target feature detection on the reference binocular image to obtain a reference surface target list. The method solves the problem of measuring the deflection angle of a target surface under curved-surface conditions with binocular recognition and detection technology, and determines whether the deflection angle of the target surface meets the user's actual requirements.

Description

Curved surface deflection angle detection method and device based on binocular vision angle measurement
Technical Field
The application relates to the technical field of angle measurement, and in particular to a curved-surface deflection angle detection method based on binocular vision angle measurement.
Background
Binocular recognition and detection computes the disparity of each pixel between the images acquired by the left and right cameras, and from this recovers the three-dimensional information of objects in real space. In fields such as aerospace, shipbuilding, and resource surveying, binocular recognition and detection technology is widely applied to angle detection of equipment.
When traditional binocular recognition and detection technology is applied to deflection-angle detection of a measured surface, a centroid-based angle solution is often used to identify target numbers, and the plane deflection angle is then obtained from the calibrated spatial coordinates of the targets. When the measured surface of the measured object is curved, for example when detecting the deflection angle of an aircraft control surface, or when inspecting the quality of curved mechanical components of industrial equipment, the deflection angle of the target surface must be measured under curved-surface conditions in order to determine whether it meets the user's actual requirements.
Therefore, a method for detecting a curved surface deflection angle based on binocular vision angle measurement is needed.
Disclosure of Invention
The application provides a curved-surface deflection angle detection method and device based on binocular vision angle measurement, which use binocular recognition and detection technology to measure the deflection angle of a target surface under curved-surface conditions, so as to detect whether the deflection angle of the target surface meets the user's actual requirements.
A first aspect of the application provides a curved-surface deflection angle detection method based on binocular vision angle measurement, applied to a binocular vision angle measurement device. The method comprises: receiving a target surface deflection angle measurement request sent by user equipment, where the target surface deflection angle represents the deflection angle between the deflected target surface and a reference surface, both of which are curved; performing surface-fixing marking on the target surface with a preset number of targets, the preset number being at least 3, where surface-fixing marking means determining the plane in which any point on the target surface lies, and each target carries a unique label; acquiring a reference binocular image of the target surface, comprising a first reference image and a second reference image; performing target feature detection on the reference binocular image to obtain a reference surface target list; in response to a deflection action of the target surface, acquiring a deflected binocular image of the target surface, comprising a first deflection image and a second deflection image; performing target feature detection on the deflected binocular image to obtain a deflection surface target list; taking the targets with the same labels in the reference surface target list and the deflection surface target list to obtain the left and right reference surface matched targets; acquiring the spatial coordinates of the matched targets, comprising a first reference spatial coordinate in the first reference image, a second reference spatial coordinate in the second reference image, a first deflection spatial coordinate in the first deflection image, and a second deflection spatial coordinate in the second deflection image; obtaining the target surface deflection angle from these spatial coordinates; and sending the target surface deflection angle to the user equipment, so that the user can judge whether the included angle between the target surface and the reference surface is within a preset deflection angle range.
In a possible implementation, performing target feature detection on the reference binocular image to obtain the reference surface target list specifically includes: performing target feature detection on the first reference image to obtain a first target group comprising at least one target; performing target feature detection on the second reference image to obtain a second target group comprising at least one target; and acquiring the reference surface target list, which consists of the targets with the same labels in the first and second target groups. Performing target feature detection on the deflected binocular image to obtain the deflection surface target list specifically includes: performing target feature detection on the first and second deflection images to obtain, correspondingly, a third target group and a fourth target group, each comprising at least one target; and acquiring the deflection surface target list, which consists of the targets with the same labels in the third and fourth target groups. This embodiment matches the targets with the same labels that are recognizable on both the reference surface and the deflection surface.
In one possible embodiment, the target feature detection comprises: performing spot detection on the grayscale versions of the first and second reference images to obtain spots that conform to preset spot features; denoising those spots to obtain the spots that meet a preset denoising requirement, yielding a first spot group; performing ellipse fitting on the contours of the first spot group to obtain first elliptical contours, constraining the contour of each spot in the group to an ellipse; correcting the contour of each spot in the first spot group to obtain first circular contours, constraining the contour of each spot in the group to a circle; and numbering each circular spot in the first spot group to obtain a numbered target group. Through this embodiment, the application obtains a target group in which each target carries a label.
In a possible embodiment, numbering any circular spot in the first spot group specifically includes: acquiring a plurality of sector images of the circular spot, obtained by dividing the circular spot into equal sectors of a preset first angle, starting from a preset reference line, such that any one sector image contains either only white ring segments or only non-white ring segments; and numbering the circular spot accordingly. Because each sector image is in one of only two states, this embodiment facilitates the subsequent numbering of targets.
In a possible implementation, denoising the spots that conform to the preset spot features and acquiring the spots that meet the preset denoising requirement, so as to obtain the first spot group, specifically includes: acquiring the number of contour points of each spot conforming to the preset spot features; and constructing the first spot group such that the number of contour points of every spot in it is greater than or equal to a preset contour-point threshold, where the number of contour points represents the amount of data within the spot's contour.
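The contour-point denoising step above can be sketched in a few lines. This is an illustrative reading of the patent, not its actual implementation: the `Blob` class and the `MIN_CONTOUR_POINTS` threshold value are assumptions (the patent leaves the threshold unspecified).

```python
# Hypothetical sketch: keep only blobs whose contour contains at least a
# threshold number of points, a proxy for the amount of data inside the
# blob's outline. `Blob` and `MIN_CONTOUR_POINTS` are illustrative names.
from dataclasses import dataclass, field

@dataclass
class Blob:
    label: int
    contour: list = field(default_factory=list)  # (x, y) contour points

MIN_CONTOUR_POINTS = 20  # assumed threshold, not given in the patent

def denoise_blobs(blobs):
    """Build the first spot group: drop blobs with too few contour points."""
    return [b for b in blobs if len(b.contour) >= MIN_CONTOUR_POINTS]

blobs = [Blob(1, [(0, 0)] * 30), Blob(2, [(0, 0)] * 5)]
print([b.label for b in denoise_blobs(blobs)])  # → [1]
```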
In a possible implementation, correcting the contour of each spot in the first (elliptical) spot group to obtain the first circular spot group specifically includes: acquiring the circumscribed-rectangle parameters of each spot's contour in the first spot group; calling the getAffineTransform function to obtain the affine transformation matrix required for correction; and applying that matrix with the warpAffine function to obtain the first circular spot group.
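The correction above relies on OpenCV's getAffineTransform, which solves for a 2x3 affine matrix from three point correspondences. The NumPy sketch below reproduces that solve (under the assumption of a rectified stereo pipeline as described; the point values are made up for illustration) and maps the bounding rectangle of an ellipse onto a square, which is what turns an elliptical spot circular:

```python
# Minimal NumPy sketch of the cv2.getAffineTransform step: solve the 2x3
# affine matrix M with dst = M @ [x, y, 1] from three point pairs.
import numpy as np

def get_affine_transform(src, dst):
    """Same linear system cv2.getAffineTransform solves, for 3 point pairs."""
    A = np.hstack([np.asarray(src, float), np.ones((3, 1))])  # 3x3
    X = np.linalg.solve(A, np.asarray(dst, float))            # 3x2
    return X.T                                                # 2x3 affine

# Bounding rectangle of an ellipse 100 wide, 50 tall -> 100x100 square.
src = [(0, 0), (100, 0), (0, 50)]
dst = [(0, 0), (100, 0), (0, 100)]
M = get_affine_transform(src, dst)

pt = np.array([50.0, 25.0, 1.0])  # ellipse centre, homogeneous form
print(M @ pt)                     # → [50. 50.]  (centre of the square)
```

In a real pipeline the resulting matrix would be passed to cv2.warpAffine to resample the image region itself.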
In a possible implementation, obtaining the target surface deflection angle from the spatial coordinates of the matched targets specifically includes: acquiring an approximate plane of the reference surface, obtained from the first and second reference spatial coordinates; acquiring an approximate plane of the deflection surface, obtained from the first and second deflection spatial coordinates; and acquiring the deflection angle, which is the included angle between the two approximate planes.
In a possible implementation, the spot detection on the grayscale images of the first and second reference images specifically includes: obtaining the spots that satisfy both an area filtering requirement and a roundness filtering requirement.
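The area and roundness filters above can be illustrated with the common circularity measure 4πA/P², which equals 1 for a perfect circle. The threshold values below are assumptions for the sketch, not values given in the patent:

```python
# Illustrative area + roundness filter for spot detection. Circularity is
# 4*pi*A / P**2 (1.0 for a perfect circle); thresholds are assumed values.
import math

def passes_filters(area, perimeter,
                   min_area=50.0, max_area=5000.0, min_circularity=0.8):
    circularity = 4.0 * math.pi * area / (perimeter ** 2)
    return min_area <= area <= max_area and circularity >= min_circularity

# Circle of radius 10: area = pi*r^2, perimeter = 2*pi*r -> circularity 1.0.
r = 10.0
print(passes_filters(math.pi * r * r, 2 * math.pi * r))  # → True
# Same area but a much longer perimeter (a thin shape) is rejected.
print(passes_filters(math.pi * r * r, 500.0))            # → False
```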
A second aspect of the application provides a curved-surface deflection angle detection device based on binocular vision angle measurement, comprising a receiving module, a surface-fixing marking module, a reference image acquisition module, a reference image detection module, a deflection image acquisition module, a deflection image detection module, a target matching module, a spatial coordinate acquisition module, a deflection angle calculation module, and an output module. The receiving module receives a target surface deflection angle measurement request sent by user equipment, where the target surface deflection angle represents the deflection angle between a target surface and a reference surface, both of which are curved. The surface-fixing marking module performs surface-fixing marking on the target surface with a preset number of targets, the preset number being at least 3, where surface-fixing marking means determining the plane in which any point on the target surface lies, and each target carries a unique label. The reference image acquisition module acquires a reference binocular image of the target surface, comprising a first reference image and a second reference image. The reference image detection module performs target feature detection on the reference binocular image to obtain a reference surface target list. The deflection image acquisition module, in response to a deflection action of the target surface, acquires a deflected binocular image of the target surface, comprising a first deflection image and a second deflection image. The deflection image detection module performs target feature detection on the deflected binocular image to obtain a deflection surface target list. The target matching module takes the targets with the same labels in the reference surface target list and the deflection surface target list to obtain the left and right reference surface matched targets. The spatial coordinate acquisition module acquires the spatial coordinates of the matched targets, comprising a first reference spatial coordinate in the first reference image, a second reference spatial coordinate in the second reference image, a first deflection spatial coordinate in the first deflection image, and a second deflection spatial coordinate in the second deflection image. The deflection angle calculation module obtains the target surface deflection angle from these spatial coordinates. The output module sends the target surface deflection angle to the user equipment, so that the user can judge whether the included angle between the target surface and the reference surface is within a preset deflection angle range.
A third aspect of the application provides an electronic device comprising a processor, a memory for storing instructions, and a transceiver for communicating with other devices, the processor being configured to execute the instructions stored in the memory to cause the electronic device to perform the method of any one of the above.
A fourth aspect of the present application provides a computer-readable storage medium having stored thereon instructions that, when executed, perform the method of any one of the above.
Compared with the prior art, the beneficial effect of this application is that binocular recognition and detection technology is used to measure the deflection angle of the target surface under curved-surface conditions, achieving detection of whether the deflection angle of the target surface meets the user's actual requirements.
Drawings
Fig. 1 is a schematic flowchart of a method for detecting a curved surface deflection angle based on binocular identification angle measurement according to an embodiment of the present application;
fig. 2 is a schematic flowchart of a method for detecting a curved surface deflection angle based on binocular identification angle measurement according to an embodiment of the present disclosure;
fig. 3 is a schematic flowchart of a method for detecting a curved surface deflection angle based on binocular identification angle measurement according to an embodiment of the present application;
fig. 4 is a schematic view of a fan-shaped cutting principle of a method for detecting a curved surface deflection angle based on binocular identification angle measurement according to an embodiment of the present application;
fig. 5 is a schematic flowchart of a method for detecting a curved surface deflection angle based on binocular identification angle measurement according to an embodiment of the present application;
fig. 6 is a schematic flowchart of a method for detecting a curved surface deflection angle based on binocular identification angle measurement according to an embodiment of the present application;
fig. 7 is a schematic flowchart of a method for detecting a curved surface deflection angle based on binocular identification angle measurement according to an embodiment of the present application;
fig. 8 is a schematic flowchart of a method for detecting a curved surface deflection angle based on binocular identification angle measurement according to an embodiment of the present application;
fig. 9 is a schematic flowchart of a method for detecting a curved surface deflection angle based on binocular identification angle measurement according to an embodiment of the present application;
FIG. 10 is a schematic diagram illustrating a method for detecting a deflection angle of a curved surface based on binocular identification angle measurement according to an embodiment of the present disclosure;
fig. 11 is a schematic structural diagram of a curved surface deflection angle detection apparatus based on binocular identification angle measurement according to an embodiment of the present application;
fig. 12 is a schematic structural diagram of a curved surface deflection angle detection apparatus based on binocular identification angle measurement according to an embodiment of the present application.
Detailed Description
In order to make those skilled in the art better understand the technical solutions in the present specification, the technical solutions in the embodiments of the present specification will be clearly and completely described below with reference to the drawings in the embodiments of the present specification, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments.
Furthermore, the terms "first," "second," and "third," etc. in the description of the present application are used for distinguishing between different objects and not necessarily for describing a particular order, and may explicitly or implicitly include one or more of the features.
The binocular vision angle measurement device in the embodiments of the application is used to acquire the various images in the embodiments, and circular targets are used to construct the three-dimensional space. The measured surface in this application is a surface of low curvature, onto which the targets are pasted.
The execution platform for the various functions in the embodiments of the application may be OpenCV, which provides the image processing and fitting functions used here. Ellipse fitting in the embodiments means: given a set of sample points in a plane, find the ellipse that approaches them as closely as possible. That is, a set of data in the image is fitted with an elliptical equation as the model, so that the equation satisfies the data as far as possible, and the parameters of the elliptical equation are computed. The center of the resulting best-fit ellipse is the target center sought.
The target surface in the embodiments of the application is a nearly planar surface with slight curvature, and the deflection surface refers to the new curved surface obtained after this slightly curved target surface deflects. The curved surface and the deflection curved surface both refer to the same target surface.
The spatial coordinate system in the embodiments of the application may be the camera coordinate system, a three-dimensional rectangular coordinate system whose origin is the camera's optical center and whose Z-axis is the optical axis.
The binocular vision angle measurement device in the embodiments of the application is calibrated with ten groups of photographs taken at different positions, from which the device's parameters are acquired. Optionally, after calibration, the reprojection error of the image is less than 0.5; for details, refer to conventional calibration methods for binocular vision angle measurement devices, which are not repeated here.
The embodiment of the application provides a method for detecting a curved surface deflection angle based on binocular vision angle measurement, which comprises the steps of S101-S106 as shown in FIG. 1.
Step S101, receiving a target plane deflection angle measurement request sent by user equipment, wherein the target plane deflection angle is used for representing a deflection angle between a deflected target plane and a reference plane, and the target plane and the reference plane are both curved surfaces.
For example, the target surface may be any slightly curved surface of a measured object or measured device to which the application applies. Typical scenarios for measuring the deflection angle of a curved surface include detecting an aircraft control surface, or the deflection of an aircraft relative to a ground-facing surface.
Step S102, performing surface-fixing marking on the target surface with a preset number of targets, the preset number being at least 3, where surface-fixing marking refers to determining the plane where any point on the target surface is located, and each target carries a unique label.
For example, a number of targets are attached and their spatial locations on the target surface are determined. Since three points define a plane, the minimum number of targets is 3. The number of targets can be increased appropriately; more targets improve the accuracy of the recognized angle of the target curved surface and the fault tolerance.
In one possible embodiment, the present method embodiment uses a target number of 12.
Step S103, acquiring a reference binocular image of the target surface, wherein the reference binocular image comprises a first reference image and a second reference image; and carrying out target feature detection on the reference binocular image to obtain a reference surface target list.
As shown in fig. 2, in a possible embodiment, target feature detection is performed on the reference binocular image to obtain a reference plane target list, which specifically includes steps S1031 to S1033.
And step S1031, performing target feature detection on the first reference image to obtain a first target group, wherein the first target group comprises at least one target.
Step S1032, performing target feature detection on the second reference image to obtain a second target group, where the second target group includes at least one target.
Step S1033, a reference plane target list is obtained, where the reference plane target list is a target with the same target label in the first target group and the second target group.
For example, for step S1033, all decoded data are traversed, taking the lowest value as the target label. If the lowest value is less than 65535 (0xFFFF), the target label and its spatial coordinates are stored in the corresponding data structure. After all spots have been identified, all target labels recognizable by the binocular vision angle measurement device are obtained. The first and second target groups are then traversed; if the same target label exists in both, the point is considered correctly identified, and the label and spatial coordinates are stored in the list of correctly identified targets for the device, referred to below as the reference surface target list.
In one possible embodiment, a target is removed from the reference surface target list when any of the following occurs for the targets in the first and second target groups: the target label fails to decode (for example, the standard deviation is too high), returning error code 0; or the target label is greater than 65535, returning error code 65535. A further case arises when a target is recognized incorrectly yet happens to match a label already in the reference surface target list, producing a duplicate. Removing such targets improves the data precision of the reference surface target list.
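The matching and error-code filtering described above can be sketched as follows. The dict-of-label representation and variable names are assumptions for illustration; only the error codes 0 and 65535 come from the text:

```python
# Hedged sketch: keep targets decoded in BOTH target groups, discarding the
# error codes (0 = failed decode, 65535 = out-of-range label) named above.
ERROR_LABELS = {0, 65535}

def build_target_list(group_a, group_b):
    """group_a/group_b map target label -> (x, y) image coordinates."""
    common = (set(group_a) & set(group_b)) - ERROR_LABELS
    return {label: (group_a[label], group_b[label]) for label in sorted(common)}

first_group = {3: (10.0, 12.0), 7: (40.0, 41.0), 0: (5.0, 5.0)}
second_group = {3: (9.5, 12.1), 7: (39.7, 41.2), 65535: (1.0, 1.0)}
print(sorted(build_target_list(first_group, second_group)))  # → [3, 7]
```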
Step S104, responding to the deflection action of the target surface, and acquiring a deflection binocular image of the target surface, wherein the deflection binocular image comprises a first deflection image and a second deflection image; and carrying out target feature detection on the deflection binocular image to obtain a deflection surface target list.
As shown in fig. 3, in one possible embodiment, the target feature detection is performed on the deflected binocular images to obtain the deflected plane target list, which specifically includes steps S1041-S1042.
Step S1041, performing target feature detection on the first deflection image and the second deflection image, and obtaining a third target group and a fourth target group correspondingly, where the third target group and the fourth target group both include at least one target.
For example, for step S1041, the target surface is rotated by a certain angle (within ±60°) relative to the reference pose, and the deflected binocular image of the target surface is acquired. For the target feature detection on the first and second deflection images, refer to the method for target feature detection on the first and second reference images described above; details are not repeated here.
Step S1042, a list of deflection targets is obtained, where the list of deflection targets is targets of the same target label in the third target group and the fourth target group.
Step S105, obtaining targets with the same target labels in the reference plane target list and the deflection plane target list to obtain left and right reference plane matching targets; and acquiring space coordinates of the left and right reference surfaces matching the target, wherein the space coordinates comprise a first reference space coordinate in the first reference image, a second reference space coordinate in the second reference image, a first deflection space coordinate in the first deflection image and a second deflection space coordinate in the second deflection image.
For example, in step S105, since the measurement environment of the target surface is curved and target recognition is affected by factors such as illumination and distance, recognition fails with some probability. On a curved surface, the number of targets successfully recognized on the reference surface does not necessarily match the number recognized on the moving (deflected) surface; the plane constructed from all reference surface target points then carries an error relative to the plane constructed from all moving surface target points. To eliminate this error, the reference surface target points are made consistent with the moving surface target points: before constructing the three-dimensional planes, any target point on the reference surface that was not recognized on the moving surface is removed, matching the reference surface targets. Conversely, owing to illumination and similar factors, some targets unrecognizable on the reference surface may become recognizable after the surface moves to form the moving surface; those targets must be removed as well. After reference surface target matching is complete, the reference surface target list is traversed and each target is projected into the three-dimensional camera space one by one; the spatial coordinates of the currently input left and right target points are acquired and stored in the reference surface spatial point list for subsequent calculation.
The two-dimensional target points of the deflection surface are traversed and projected into the three-dimensional camera space in the same way, and their spatial coordinates are stored in the moving surface spatial point list for subsequent calculation.
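The projection of matched left/right image points into camera space can be illustrated with the standard rectified-stereo back-projection, where depth follows Z = f·B/d from the horizontal disparity d and X, Y come from the pinhole model. This is a textbook sketch under the assumption of a calibrated, rectified pair; the intrinsics below are made-up example values, not the device's calibration:

```python
# Illustrative rectified-stereo back-projection: Z = f*B/d, then pinhole
# model for X and Y. f, baseline, cx, cy are assumed example intrinsics.
def triangulate(u_left, v_left, u_right,
                f=1000.0, baseline=0.12, cx=640.0, cy=360.0):
    d = u_left - u_right          # horizontal disparity in pixels
    Z = f * baseline / d          # depth along the optical axis (metres)
    X = (u_left - cx) * Z / f
    Y = (v_left - cy) * Z / f
    return (X, Y, Z)

X, Y, Z = triangulate(700.0, 400.0, 640.0)
print(Z)  # → 2.0   (f*B/d = 1000 * 0.12 / 60)
```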
Step S106, obtaining a target plane deflection angle according to the space coordinates of the left and right reference planes matched with the target; and sending the target plane deflection angle to user equipment so that a user can judge whether the included angle between the target plane and the reference plane is within a preset deflection angle range according to the target plane deflection angle.
For example, for step S106, obtaining the target surface deflection angle from the spatial coordinates of the matched targets specifically includes: using a plane-fitting algorithm, all points in the reference surface spatial point list are input to obtain an approximate plane of the reference surface in the binocular system space; the same operation on the moving surface yields an approximate plane of the moving surface in the binocular system space. From basic three-dimensional geometry, the included angle between two planes is computed from their normal vectors: the dot product of the two normals gives the magnitude of the angle, and the cross product gives its direction. The included angle between the reference surface and the moving surface is calculated on this principle.
In one possible embodiment, obtaining the target plane deflection angle according to the space coordinates of the left and right reference plane matching targets further comprises: when the measured angle is less than -90° or greater than 90°, calculating its supplementary angle as the return value (because a plane normal has two possible directions, the angles calculated from oppositely directed normals are supplementary, summing to 180°, in each calculation).
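A minimal sketch of this dot-product angle calculation, including the fold of supplementary angles back into [0°, 90°], might look as follows (names are illustrative; the inputs are plane normal vectors):

```python
import math

def plane_angle_deg(n1, n2):
    """Included angle between two planes given their normal vectors.

    The dot product yields the angle value; because each normal has two
    possible directions, a raw angle above 90 degrees is replaced by its
    supplement so the result always lies in [0, 90]."""
    dot = sum(a * b for a, b in zip(n1, n2))
    norm = math.sqrt(sum(a * a for a in n1)) * math.sqrt(sum(b * b for b in n2))
    cos_t = max(-1.0, min(1.0, dot / norm))   # clamp against rounding error
    ang = math.degrees(math.acos(cos_t))
    return 180.0 - ang if ang > 90.0 else ang
```

The clamp on the cosine guards against floating-point values marginally outside [-1, 1], which would otherwise make acos raise a domain error.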
In the embodiment of the application, a method for calculating an included angle between a reference plane and a deflection plane is provided.
The following: pasting 12 coding identification points on a detected control surface, firstly calculating three-dimensional space coordinates of the 12 coding identification points through the previous principle when the control surface is in a zero surface position, and fitting out a minimum approximation plane P1 of the 12 coding points through a least square method principle:
A1·x + B1·y + C1·z + D1 = 0
when the control surface is at the measured position, the three-dimensional space coordinates of the 12 coded points at that position are acquired, and a minimum approximation plane Pn of the 12 coded points is fitted by the least-squares method:
An·x + Bn·y + Cn·z + Dn = 0
as shown in fig. 4, the included angle between P1 and Pn is the deflection angle of the control surface, which is easily proved:
the included angle of the two planes is obtained, and the included angle is the deflection angle of the control surface. The angle between the two planes is known from the geometric principle as the normal vector angle of the two planes, because the normal vector of P1 is P1 (A) 1 ,B 1 ,C 1 ) The normal vector of Pn is Pn (A) n ,B n ,C n ) Therefore: the control surface deflection angle is determined.
As shown in the figure, in a possible implementation, the obtaining of the target plane deflection angle according to the spatial coordinates of the left and right reference planes matching the target specifically includes steps S1061-S1063.
Step S1061, obtaining an approximate plane of the reference plane, where the approximate plane of the reference plane is obtained according to the first reference space coordinate and the second reference space coordinate.
Step S1062, obtaining an approximate plane of the deflection surface, wherein the approximate plane of the deflection surface is obtained according to the first deflection space coordinate and the second deflection space coordinate.
Step S1063, obtaining a deflection angle, wherein the deflection angle is an included angle between an approximate plane of the reference plane and an approximate plane of the deflection plane.
As shown in fig. 5, in one possible embodiment, target feature detection includes steps S401-S406.
Step S401, performing blob detection on the grayscale images of the first reference image and the second reference image, and acquiring blobs which conform to preset blob features.
For example, a SimpleBlobDetector is used to perform blob detection on the grayscale image, finding blobs on the reference binocular image. Since the target is circular, filters need to be configured: a SimpleBlobDetector::Params structure is created, and area filtering and circularity filtering are enabled.
In a possible implementation manner, the speckle detection on the grayscale images of the first reference image and the second reference image specifically includes: and obtaining the spots meeting the area filtering requirement and the roundness filtering requirement.
For example, the area filter allows a maximum area of 20000 pixels and a minimum area of 1000 pixels; the circularity filter allows a minimum circularity of 0.3 and a maximum of 3; the minimum brightness threshold is 30 and the maximum is 150.
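These filter settings can be expressed as a simple predicate. The numeric values mirror the example above; the function itself is an illustrative sketch, not the SimpleBlobDetector API:

```python
def passes_blob_filters(area, circularity, brightness,
                        min_area=1000, max_area=20000,
                        min_circ=0.3, max_circ=3.0,
                        min_thresh=30, max_thresh=150):
    """Area / circularity / brightness filter mirroring the example
    parameter values quoted in the text (function name and signature
    are assumptions for illustration)."""
    return (min_area <= area <= max_area
            and min_circ <= circularity <= max_circ
            and min_thresh <= brightness <= max_thresh)
```

A candidate blob is kept only when all three quantities fall inside their configured ranges, which is how the detector discards both tiny noise specks and large non-circular regions.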
And S402, denoising the spots which accord with the preset spot characteristics, and acquiring the spots which accord with the preset denoising requirement to obtain a first spot group.
As shown in fig. 6, in a possible embodiment, denoising is performed on blobs which meet the preset blob feature, and blobs which meet the preset denoising requirement are obtained, so as to obtain a first blob group, which specifically includes steps S4021 to S4022.
Step S4021, acquiring the number of contour points of any one of the blobs according with the preset blob characteristics.
Step S4022, a first spot group is constructed, the number of contour points of any spot in the first spot group is greater than or equal to a preset threshold of the number of contour points, and the number of contour points is used for representing the size of data volume in the contour of the spot.
For example, a feature detector is used to detect the first reference image acquired by the binocular vision angle measuring equipment, and all blobs meeting the conditions are identified and stored in a vector<cv::KeyPoint> container. The container is traversed, and each blob is processed as follows: a square region with side length 10 times the blob radius, centered on the blob center, is cropped from the original image as a new image; the mean value of the image is calculated, and the new image is binarized using that mean as the lowest threshold to obtain a binarized image. After these steps, steps S4021-S4022 are performed: the functions Canny and findContours are called in sequence, with the parameters 50 and 150 passed to Canny. The findContours function is given the parameter RETR_CCOMP, so that all contours are detected and only two levels of hierarchy are established, and the parameter CHAIN_APPROX_SIMPLE, so that contours are compressed and only their vertices retained. Finally, the resulting contour data is filled into the data structure vector<vector<Point>>, and the contour hierarchy information into a vector<Vec4i>. The contour data is then traversed, and only top-level contours are examined in order to speed up the computation. A data-volume test is performed on the contour currently being processed: if the contour has fewer than 30 points, it is discarded;
step S403, performing ellipse fitting on the outline of the first spot group to obtain a first ellipse outline, where the first ellipse outline is used to define the outline of each spot in the first spot group as the same ellipse.
For example, an ellipse is fitted with the function fitEllipse to each contour that passes the data-volume test, the distance from each contour's center to the image center is recorded, and the contour with the smallest distance is taken as the successfully fitted elliptical contour. It should be noted that, since the target outline is a circle, the major axis of the fitted ellipse is taken as the accurate diameter value.
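The contour screening just described, discarding contours with fewer than 30 points and keeping the one nearest the image center, might look like this in outline (pure Python, with each contour represented as a list of (x, y) points; the names are assumptions):

```python
def select_target_contour(contours, image_center, min_points=30):
    """Pick the target contour: drop contours failing the data-volume
    test (fewer than `min_points` points), then keep the one whose
    centroid lies closest to the image center. Returns None if no
    contour passes."""
    best, best_d2 = None, None
    for c in contours:
        if len(c) < min_points:          # data-volume test
            continue
        cx = sum(p[0] for p in c) / len(c)
        cy = sum(p[1] for p in c) / len(c)
        d2 = (cx - image_center[0]) ** 2 + (cy - image_center[1]) ** 2
        if best_d2 is None or d2 < best_d2:
            best, best_d2 = c, d2
    return best
```

Using the squared distance avoids a square root per contour without changing which contour wins.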
Step S404, the contour of each spot in the first spot group is corrected to obtain a first circular contour, and the first circular contour is used to limit the contour of each spot in the first spot group to be the same circle.
As shown in fig. 7, in a possible embodiment, correcting the contour of each spot in the first elliptical spot group to obtain the first circular spot group specifically includes steps S4041-S4042;
step S4041, the circumscribed rectangle parameter of the outline of each spot in the first spot group is obtained.
Step S4042, the getAffineTransform function is called to obtain the affine transformation matrix required for correction, and the matrix is applied through the warpAffine function to obtain the first circular spot group.
For example, the circumscribed rectangle of each spot in the first elliptical spot group is projected to a square, getAffineTransform is used to obtain the affine transformation matrix between the shapes before and after transformation, and warpAffine is used to transform the picture, so that the outline of each spot in the first elliptical spot group becomes a circle, yielding the first circular spot group.
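The affine matrix that getAffineTransform returns is fully determined by three point correspondences; a pure-Python sketch of that computation (for illustration only, not the OpenCV implementation) is:

```python
def affine_from_3pts(src, dst):
    """Solve for the 2x3 affine matrix M mapping three source points to
    three destination points, so that [x', y'] = M . [x, y, 1].
    This is the computation cv::getAffineTransform performs."""
    (x0, y0), (x1, y1), (x2, y2) = src

    def solve(v0, v1, v2):
        # Cramer's rule on the 3x3 system with rows (x_i, y_i, 1).
        det = x0 * (y1 - y2) - y0 * (x1 - x2) + (x1 * y2 - x2 * y1)
        d0 = v0 * (y1 - y2) - y0 * (v1 - v2) + (v1 * y2 - v2 * y1)
        d1 = x0 * (v1 - v2) - v0 * (x1 - x2) + (x1 * v2 - x2 * v1)
        d2 = (x0 * (y1 * v2 - y2 * v1) - y0 * (x1 * v2 - x2 * v1)
              + v0 * (x1 * y2 - x2 * y1))
        return (d0 / det, d1 / det, d2 / det)

    row_x = solve(dst[0][0], dst[1][0], dst[2][0])
    row_y = solve(dst[0][1], dst[1][1], dst[2][1])
    return [row_x, row_y]
```

Taking three corners of the ellipse's circumscribed rectangle as `src` and the corresponding corners of a square as `dst` yields the matrix that warps the ellipse into a circle, which is the correction step described above.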
Step S405, any one circular spot in the first spot group is numbered, and a target group with the number is obtained.
As shown in fig. 8, in one possible embodiment, the numbering of any one of the circular spots in the first spot group specifically includes steps S4051-S4052.
Step S4051, acquiring a plurality of sector images of the circular spot, wherein the sector images are obtained by dividing the circular spot equally by a preset first angle, starting from a preset reference line, and any one of the sector images contains only a white ring or only a non-white ring;
in step S4052, any circular spot in the first spot group is numbered.
For step S4051 and step S4052, for example, as shown in fig. 9, a plurality of sector images of the circular spot are obtained by dividing the circular spot equally by a preset first angle of 24°, giving 15 sector images in total, so the maximum recognizable target label is 2^15 - 1 = 32767. Before a target can be correctly identified, it needs to be correctly cut. The circular spot is sampled over 360 one-degree steps, and each step is analyzed: if the number of white pixels lying between 2 and 4 times the spot radius exceeds 0.5 times the radius, a white ring is considered present at that angle, and the result is recorded into a vector<int>. A copy of the data is appended (i.e. the data at sequence numbers 0-359 corresponds to, and is identical with, the data at 360-719) to simplify subsequent calculation. Under a correct cut, all ring samples within each 24° sector should be uniform in color: either a white ring or a non-white ring is present, with no intermediate state. On this principle, all possible cuts are traversed, the cut with the lowest standard deviation is found, the starting angle of that cut is recorded, and the cutting line of the circular spot at that starting angle is taken as the preset reference line.
In one possible embodiment, after step S4051 and before numbering any one of the circular spots in the first spot group, the method further comprises merging the per-degree data of the lowest-standard-deviation cut: the samples within each sector are averaged to filter pattern noise in the plurality of sector images, and a sector whose average value is smaller than 0.5 is considered to contain no white ring; otherwise a white ring is considered present.
This embodiment provides a schematic diagram of targets cut into a plurality of sectors, such as circular target 1 and circular target 2 shown in fig. 10. For target 1, the sectors are ordered clockwise from the initial cutting line; the unshaded sectors are those in which a white ring is present. By traversing all possible sector orderings, this target identification method can identify at most 2^15 - 1 targets.
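The cutting and numbering scheme of steps S4051-S4052 can be sketched end to end: find the 24° cut whose sectors are most uniform (lowest total variance), average each sector, threshold at 0.5, and pack the 15 sector states into an integer label. The bit order below is an assumption for illustration:

```python
def decode_target(ring_flags, sectors=15):
    """Decode a ring-coded target ID from 360 per-degree samples.

    ring_flags[d] is True if a white ring was detected at degree d.
    The offset whose 24-degree sectors have the lowest total variance is
    taken as the cutting line; each sector's average is thresholded at
    0.5 and the 15 resulting bits are packed into an integer."""
    step = 360 // sectors                     # 24 degrees per sector
    data = [1.0 if f else 0.0 for f in ring_flags]
    data += data                              # duplicate, as in the text
    best_off, best_score = 0, None
    for off in range(step):                   # try every possible cut
        score = 0.0
        for s in range(sectors):
            seg = data[off + s * step: off + (s + 1) * step]
            mean = sum(seg) / step
            score += sum((v - mean) ** 2 for v in seg)   # variance term
        if best_score is None or score < best_score:
            best_off, best_score = off, score
    bits = 0
    for s in range(sectors):
        seg = data[best_off + s * step: best_off + (s + 1) * step]
        if sum(seg) / step >= 0.5:            # sector contains a white ring
            bits |= 1 << s
    return bits
```

With all 15 sectors white the decoder returns 2^15 - 1 = 32767, the maximum label the scheme supports; a rotation-invariant variant could additionally take the minimum over all cyclic shifts of the bit pattern.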
Through the above embodiments, the following beneficial effects can be achieved. The deflection angle of the target surface is measured with a binocular recognition detection technique even when curved surfaces are involved, so that whether the deflection angle of the target surface meets the user's actual requirements can be detected. The error of the angle measurement result under curved-surface conditions is reduced, and the precision of the measurement result is improved. Each sector image in a circular spot is restricted to only two states, which facilitates subsequent target numbering. With this method, more usable targets and target labels are available, making it easier to construct different algorithm schemes to meet the requirements of different measurement scenarios. The spots in the image are denoised and screened, which improves the accuracy of the finally obtained targets.
As shown in fig. 11, the present embodiment provides a curved surface deflection angle detection apparatus based on binocular vision angle measurement. The apparatus includes a receiving module 101, a fixed-surface marking module 102, a reference image acquisition module 103, a reference image detection module 104, a deflection image acquisition module 105, a deflection image detection module 106, a target matching module 107, a spatial coordinate acquisition module 108, a deflection angle calculation module 109, and an output module 110. The fixed-surface marking module 102, the reference image acquisition module 103 and the reference image detection module 104 are applied to the reference binocular image; the deflection image acquisition module 105, the deflection image detection module 106 and the target matching module 107 are applied to the deflection binocular image.
The receiving module 101 receives a target plane deflection angle measurement request sent by a user equipment, where the target plane deflection angle is used to indicate a deflection angle between a target plane and a reference plane, and both the target plane and the reference plane are curved surfaces.
The surface fixing marking module 102 is configured to perform surface fixing marking on the target surface by using a preset number of targets, where the preset number is at least 3, the surface fixing marking is a plane where any point on the target surface is determined, and the target corresponds to a unique target label.
The reference image acquiring module 103 acquires a reference binocular image of the target surface, wherein the reference binocular image comprises a first reference image and a second reference image.
The reference image detection module 104 is used for performing target feature detection on the reference binocular image to obtain the reference plane target list. The deflection image acquisition module 105 is used for acquiring a deflection binocular image of the target surface in response to the deflection action of the target surface, the deflection binocular image including a first deflection image and a second deflection image.
The deflection image detection module 106 is used for performing target feature detection on the deflection binocular image to obtain the deflection plane target list;
the target matching module 107 is used for acquiring targets with the same target labels in the reference plane target list and the deflection plane target list to obtain left and right reference plane matching targets;
a spatial coordinate obtaining module 108, configured to obtain spatial coordinates of the left and right reference planes matching the target, where the spatial coordinates include a first reference spatial coordinate in the first reference image, a second reference spatial coordinate in the second reference image, a first deflection spatial coordinate in the first deflection image, and a second deflection spatial coordinate in the second deflection image;
the deflection angle calculation module 109 is used for obtaining the deflection angle of the target surface according to the space coordinates of the left and right reference surfaces matched with the target;
the output module 110 sends the target plane deflection angle to the user equipment, so that the user can determine whether the included angle between the target plane and the reference plane is within the predetermined deflection angle range according to the target plane deflection angle.
In a possible implementation manner, the reference image detection module 104 performs blob detection on the grayscale images of the first reference image and the second reference image to obtain blobs which accord with preset blob features; denoising the spots which accord with the preset spot characteristics to obtain spots which accord with the preset denoising requirement so as to obtain a first spot group; performing ellipse fitting on the outline of the first spot group to obtain a first ellipse outline, wherein the first ellipse outline is used for limiting the outline of each spot in the first spot group to be the same ellipse; correcting the outline of each spot in the first spot group to obtain a first circular outline, wherein the first circular outline is used for limiting the outline of each spot in the first spot group to be the same circle; and numbering any one circular spot in the first spot group to obtain a target group with a number.
In a possible implementation manner, the reference image detection module 104 acquires a plurality of sector images of a circular spot, the sector images are obtained by equally dividing the circular spot according to a preset first angle, the circular spot is equally divided according to the preset reference line by the preset first angle, and any one sector image in the sector images only has a white circular ring or a non-white circular ring; any one of the circular spots in the first spot group is numbered.
In a possible implementation manner, the reference image detection module 104 obtains the number of contour points of any one of the blobs according to the preset blob features; and constructing a first spot group, wherein the number of contour points of any spot in the first spot group is greater than or equal to a preset threshold value of the number of contour points, and the number of contour points is used for representing the size of data volume in the contour of the spot.
In a possible implementation manner, the reference image detection module 104 acquires circumscribed rectangle parameters of the outline of each spot in the first spot group; and calls the getAffineTransform function to obtain the affine transformation matrix required for correction, applying the matrix through the warpAffine function to obtain the first circular spot group.
In a possible implementation manner, the deflection angle calculation module 109 acquires an approximate plane of the reference plane, where the approximate plane of the reference plane is obtained according to the first reference space coordinate and the second reference space coordinate; obtaining an approximate plane of the deflection surface, wherein the approximate plane of the deflection surface is obtained according to the first deflection space coordinate and the second deflection space coordinate; and acquiring a deflection angle, wherein the deflection angle is an included angle between an approximate plane of the reference plane and an approximate plane of the deflection plane.
In one possible implementation, the reference image detection module 104 obtains blobs that meet the area filtering requirement and the roundness filtering requirement.
It should be noted that: in the above embodiment, when the device implements the functions thereof, only the division of the functional modules is illustrated, and in practical applications, the functions may be distributed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to implement all or part of the functions described above. In addition, the apparatus and method embodiments provided by the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments for details, which are not described herein again.
Fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 12, the electronic device 1000 may include: at least one processor 1001, at least one network interface 1004, a user interface 1003, memory 1005, at least one communication bus 1002.
The communication bus 1002 is used to implement connection communication among these components.
The user interface 1003 may include a Display screen (Display) and a Camera (Camera), and the optional user interface 1003 may also include a standard wired interface and a wireless interface.
The network interface 1004 may optionally include a standard wired interface or a wireless interface (e.g., a WI-FI interface). The processor 1001 may include one or more processing cores. The processor 1001 connects various parts throughout the server 1000 using various interfaces and lines, and performs the various functions of the server 1000 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 1005 and by calling data stored in the memory 1005. Optionally, the processor 1001 may be implemented in at least one hardware form of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), or Programmable Logic Array (PLA). The processor 1001 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, user interface, application programs, and so on; the GPU is responsible for rendering and drawing the content to be displayed on the display screen; and the modem handles wireless communications. It is understood that the modem may also not be integrated into the processor 1001 and instead be implemented by a separate chip.
The Memory 1005 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). Optionally, the memory 1005 includes a non-transitory computer-readable medium. The memory 1005 may be used to store an instruction, a program, code, a set of codes, or a set of instructions. The memory 1005 may include a stored program area and a stored data area, wherein the stored program area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the various method embodiments described above, and the like; the stored data area may store the data referred to in the above respective method embodiments. The memory 1005 may optionally be at least one memory device located remotely from the processor 1001. As shown in fig. 12, the memory 1005, as a computer storage medium, may include a program of an operating system, a network communication module, a user interface module, and a curved surface deflection angle detection method and apparatus based on binocular vision angle measurement.
In the electronic device 1000 shown in fig. 12, the user interface 1003 is mainly used as an interface for providing input for a user, and acquiring data input by the user; and the processor 1001 may be configured to invoke an application program stored in the memory 1005 for a binocular vision angle measurement based curved surface deflection angle detection method, which when executed by the one or more processors, causes the electronic device to perform the method as described in one or more of the above embodiments.
A computer-readable storage medium has instructions stored thereon which, when executed by one or more processors, cause an electronic device to perform the method as described in one or more of the above embodiments.
It is clear to a person skilled in the art that the solution of the present application can be implemented by means of software and/or hardware. The "unit" and "module" in this specification refer to software and/or hardware that can perform a specific function independently or in cooperation with other components, where the hardware may be, for example, a Field-Programmable Gate Array (FPGA), an Integrated Circuit (IC), or the like.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one type of division of logical functions, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of some service interfaces, devices or units, and may be an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable memory. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method described in the embodiments of the present application. The aforementioned memory includes various media capable of storing program codes, such as a USB disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program which instructs associated hardware to perform the steps, and the program may be stored in a computer readable memory, and the memory may include: flash disks, read-Only memories (ROMs), random Access Memories (RAMs), magnetic or optical disks, and the like.
The above are merely exemplary embodiments of the present disclosure, and the scope of the present disclosure should not be limited thereby. That is, all equivalent changes and modifications made in accordance with the teachings of the present disclosure are intended to be included within the scope of the present disclosure. Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains.

Claims (10)

1. A curved surface deflection angle detection method based on binocular identification angle measurement is characterized by being applied to a binocular identification angle measurement device and comprising the following steps:
receiving a target plane deflection angle measurement request sent by user equipment, wherein the target plane deflection angle is used for representing a deflection angle between a deflected target plane and a reference plane, and the target plane and the reference plane are both curved surfaces;
performing surface fixing marking on the target surface by using a preset number of targets, wherein the preset number is at least 3, the surface fixing marking refers to determining a plane where any point on the target surface is located, and the targets correspond to unique target marks;
acquiring a reference binocular image of a target surface, wherein the reference binocular image comprises a first reference image and a second reference image;
target feature detection is carried out on the reference binocular image to obtain a reference surface target list;
acquiring a deflection binocular image of the target surface in response to the deflection action of the target surface, wherein the deflection binocular image comprises a first deflection image and a second deflection image;
target feature detection is carried out on the deflection binocular image to obtain a deflection plane target list;
acquiring targets with the same target labels in the reference plane target list and the deflection plane target list to obtain left and right reference plane matching targets;
acquiring space coordinates of the left and right reference surfaces matching the target, wherein the space coordinates comprise a first reference space coordinate in a first reference image, a second reference space coordinate in a second reference image, a first deflection space coordinate in a first deflection image and a second deflection space coordinate in a second deflection image;
obtaining the deflection angle of the target surface according to the space coordinates of the left and right reference surfaces matched with the target;
and sending the target plane deflection angle to user equipment so that a user can judge whether the included angle between the target plane and the reference plane is within a preset deflection angle range according to the target plane deflection angle.
2. The method according to claim 1, wherein the target feature detection is performed on the reference binocular image to obtain a reference plane target list, and specifically comprises:
performing the target feature detection on the first reference image to obtain a first target group, wherein the first target group comprises at least one target;
performing the target feature detection on the second reference image to obtain a second target group, wherein the second target group comprises at least one target;
acquiring the reference plane target list, wherein the reference plane target list is the targets with the same target labels in the first target group and the second target group;
the target feature detection is carried out on the deflection binocular image to obtain a deflection surface target list, and the method specifically comprises the following steps:
performing the target feature detection on the first deflection image and the second deflection image to obtain a third target group and a fourth target group correspondingly, wherein the third target group and the fourth target group both comprise at least one target;
obtaining the list of the targets with the deflection surfaces, wherein the list of the targets with the deflection surfaces is targets with the same target labels in the third target group and the fourth target group.
3. The method of claim 2, wherein the target feature detection comprises:
performing spot detection on the grayscale images of the first reference image and the second reference image to obtain spots that match preset spot features;
denoising the spots that match the preset spot features to obtain spots that meet a preset denoising requirement, thereby obtaining a first spot group;
performing ellipse fitting on the contours of the first spot group to obtain first ellipse contours, each first ellipse contour constraining the contour of a spot in the first spot group to an ellipse;
correcting the contour of each spot in the first spot group to obtain first circular contours, each first circular contour constraining the contour of a spot in the first spot group to a circle;
and numbering each circular spot in the first spot group to obtain a numbered target group.
4. The method of claim 3, wherein numbering any one of the circular spots in the first spot group comprises:
acquiring a plurality of sector images of a circular spot, wherein the sector images are obtained by dividing the circular spot into equal sectors at a preset first angle starting from a preset reference line, and any one of the sector images contains either a white ring segment or a non-white ring segment;
and numbering the circular spot in the first spot group according to the sector images.
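One way to realise claim 4's sector-based numbering is to sample the code ring once per sector (white segment = bit 1) and fold the resulting bit pattern into a rotation-invariant label. The 12-sector count, the sampling radius, the synthetic pattern, and the `decode_ring_code` helper below are assumptions for illustration, not the patent's actual encoding.

```python
import numpy as np

def decode_ring_code(img, center, radius, n_sectors=12):
    """Number a circular target from its sector code ring.

    Samples the ring once per sector (white segment -> bit 1) and returns
    the smallest integer over all cyclic shifts of the bit pattern, so the
    label does not depend on the target's in-plane rotation.
    """
    cx, cy = center
    bits = []
    for k in range(n_sectors):
        a = 2 * np.pi * (k + 0.5) / n_sectors      # centre angle of sector k
        x = int(round(cx + radius * np.cos(a)))
        y = int(round(cy + radius * np.sin(a)))
        bits.append(1 if img[y, x] > 127 else 0)
    rotations = [bits[s:] + bits[:s] for s in range(n_sectors)]
    return min(int("".join(map(str, r)), 2) for r in rotations)

# Synthetic coded target: white ring segments in the sectors where the
# (assumed) 12-bit pattern has a 1.
pattern = [1, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 0]
yy, xx = np.mgrid[0:200, 0:200]
ang = np.arctan2(yy - 100, xx - 100) % (2 * np.pi)
sector = (ang / (2 * np.pi) * 12).astype(int) % 12
r = np.hypot(xx - 100, yy - 100)
img = np.zeros((200, 200))
img[(r > 40) & (r < 60) & np.isin(sector, np.flatnonzero(pattern))] = 255

label = decode_ring_code(img, (100, 100), 50)
```

Taking the minimum over all cyclic shifts is what makes the number the same regardless of where the reference line happens to fall on the ring.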
5. The method according to claim 3, wherein denoising the spots that match the preset spot features to obtain the spots that meet the preset denoising requirement, thereby obtaining the first spot group, specifically comprises:
acquiring the number of contour points of each spot that matches the preset spot features;
and constructing the first spot group, wherein the number of contour points of every spot in the first spot group is greater than or equal to a preset contour point count threshold, the number of contour points characterizing the amount of data in the contour of the spot.
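The denoising of claim 5 amounts to a length filter on the detected contours. A minimal sketch, with the threshold value and helper name assumed:

```python
# `contours` would normally come from cv2.findContours; plain point lists
# stand in here so the sketch has no OpenCV dependency.
def filter_by_contour_points(contours, min_points=20):
    """Keep only spots whose contour has at least `min_points` points.

    Tiny noise specks yield short contours, so this single threshold
    (the value 20 is an assumption) discards them.
    """
    return [c for c in contours if len(c) >= min_points]

speck = [(0, 0), (1, 0), (0, 1)]                    # 3-point noise speck
target = [(i, (i * i) % 7) for i in range(40)]      # 40-point spot contour
first_spot_group = filter_by_contour_points([speck, target])
```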
6. The method according to claim 3, wherein correcting the contour of each spot in the first spot group to obtain the first circular contours specifically comprises:
acquiring the circumscribed-rectangle parameters of the contour of each spot in the first spot group;
and calling the getAffineTransform function to obtain the affine transformation matrix required for the correction, the affine transformation matrix being applied in the warpAffine function to obtain the first circular spot group.
7. The method according to claim 2, wherein obtaining the target plane deflection angle from the spatial coordinates of the left and right reference plane matching targets specifically comprises:
acquiring an approximate plane of the reference plane, the approximate plane of the reference plane being obtained from the first reference spatial coordinates and the second reference spatial coordinates;
acquiring an approximate plane of the deflection plane, the approximate plane of the deflection plane being obtained from the first deflection spatial coordinates and the second deflection spatial coordinates;
and acquiring the deflection angle, the deflection angle being the included angle between the approximate plane of the reference plane and the approximate plane of the deflection plane.
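Claim 7's computation — fit an approximate plane to each set of triangulated target coordinates, then take the included angle between the two planes — can be sketched with a least-squares (SVD) plane fit. The helper names and the toy 15-degree example are illustrative assumptions:

```python
import numpy as np

def fit_plane_normal(points):
    """Unit normal of the least-squares plane through >= 3 target points.

    The singular vector for the smallest singular value of the centred
    point cloud is the direction of least variance, i.e. the plane normal.
    """
    pts = np.asarray(points, float)
    centred = pts - pts.mean(axis=0)
    _, _, Vt = np.linalg.svd(centred)
    return Vt[-1]

def deflection_angle_deg(ref_points, defl_points):
    """Included angle between the two approximate planes, in degrees."""
    n1 = fit_plane_normal(ref_points)
    n2 = fit_plane_normal(defl_points)
    c = abs(np.dot(n1, n2))          # abs(): plane normals are sign-free
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

# Targets in the z = 0 reference plane, then the same targets after a
# 15-degree rotation about the y axis (toy data, assumed).
ref = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0]], float)
a = np.radians(15)
Ry = np.array([[np.cos(a), 0, np.sin(a)],
               [0, 1, 0],
               [-np.sin(a), 0, np.cos(a)]])
defl = ref @ Ry.T
angle = deflection_angle_deg(ref, defl)   # ~15.0 degrees
```

Fitting a plane to at least three targets is also why the claims require a preset number of at least 3 for the surface-fixing marking.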
8. The method according to claim 3, wherein performing the spot detection on the grayscale images of the first reference image and the second reference image specifically comprises:
obtaining the spots that meet an area filtering requirement and a roundness filtering requirement.
9. A curved surface deflection angle detection device based on binocular vision angle measurement is characterized by comprising a receiving module, a fixed surface marking module, a reference image acquisition module, a reference image detection module, a deflection image acquisition module, a deflection image detection module, a target matching module, a space coordinate acquisition module, a deflection angle calculation module and an output module;
the receiving module is used for receiving a target plane deflection angle measurement request sent by user equipment, wherein the target plane deflection angle is used for representing a deflection angle between a target plane and a reference plane, and the target plane and the reference plane are both curved surfaces;
the fixed surface marking module is used for performing surface-fixing marking on the target surface with a preset number of targets, the preset number being at least 3, wherein the surface-fixing marking refers to determining the plane in which any point on the target surface lies, and each target corresponds to a unique target label;
the reference image acquisition module is used for acquiring a reference binocular image of the target surface, wherein the reference binocular image comprises a first reference image and a second reference image;
the reference image detection module is used for carrying out target feature detection on the reference binocular image to obtain a reference surface target list;
the deflection image acquisition module is used for acquiring a deflection binocular image of the target surface in response to a deflection action of the target surface, wherein the deflection binocular image comprises a first deflection image and a second deflection image;
the deflection image detection module is used for performing target feature detection on the deflection binocular image to obtain a deflection plane target list;
the target matching module is used for acquiring the targets with the same target labels in the reference plane target list and the deflection plane target list to obtain the left and right reference plane matching targets;
the spatial coordinate acquisition module is used for acquiring the spatial coordinates of the left and right reference plane matching targets, wherein the spatial coordinates comprise a first reference spatial coordinate in the first reference image, a second reference spatial coordinate in the second reference image, a first deflection spatial coordinate in the first deflection image and a second deflection spatial coordinate in the second deflection image;
the deflection angle calculation module is used for obtaining the target plane deflection angle from the spatial coordinates of the left and right reference plane matching targets;
and the output module is used for sending the target plane deflection angle to the user equipment, so that the user can judge from the target plane deflection angle whether the included angle between the target plane and the reference plane lies within a preset deflection angle range.
10. An electronic device comprising a processor, a memory for storing instructions, and a transceiver for communicating with other devices, the processor being configured to execute the instructions stored in the memory to cause the electronic device to perform the method of any of claims 1-8.
CN202211280967.0A 2022-10-19 2022-10-19 Curved surface deflection angle detection method and device based on binocular vision angle measurement Pending CN115808150A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211280967.0A CN115808150A (en) 2022-10-19 2022-10-19 Curved surface deflection angle detection method and device based on binocular vision angle measurement


Publications (1)

Publication Number Publication Date
CN115808150A true CN115808150A (en) 2023-03-17

Family

ID=85482769




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination