CN111768441A - Method and system for monitoring traveling process of columnar object and computer equipment - Google Patents

Method and system for monitoring traveling process of columnar object and computer equipment

Info

Publication number
CN111768441A
Authority
CN
China
Prior art keywords
image data
detected
base point
measured
calibration parameters
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010603182.7A
Other languages
Chinese (zh)
Inventor
李四维
吕键
黄伯源
刘充
陈光磊
张飞豹
李明明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Institute Of Aeronautics And Astronautics Equipment & Technology
Original Assignee
Guangdong Institute Of Aeronautics And Astronautics Equipment & Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Institute Of Aeronautics And Astronautics Equipment & Technology filed Critical Guangdong Institute Of Aeronautics And Astronautics Equipment & Technology
Priority to CN202010603182.7A
Publication of CN111768441A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present application relates to a method, a system, and computer equipment for monitoring the traveling process of a columnar object. The method comprises emitting a group of stripe-structured light onto the surface of an object to be measured, acquiring a group of image data from the signal light reflected by that surface, and obtaining the central axis base point coordinates and direction vector of the object from the image data and the calibration parameters of the system, thereby realizing high-speed, high-precision monitoring and feedback during the traveling process of the columnar object. The method is simple to operate, fast and precise in measurement, and simple in data processing.

Description

Method and system for monitoring traveling process of columnar object and computer equipment
Technical Field
The present application relates to the field of three-dimensional measurement, and in particular, to a method, a system, and a computer device for monitoring a traveling process of a cylindrical object.
Background
In current high-precision manufacturing, inspection, and medical-device operations, a columnar object such as a probe or a catheter often needs to be aligned with and inserted into a target position. The process is disturbed by external factors such as mechanical motion errors and manual operation, so the entire traveling process requires continuous feedback and calibration. Therefore, there is a need for a non-contact, high-precision monitoring method and system that can monitor the spatial position and motion trajectory of the columnar object with high precision and feed the results back to the control system for calibration, ensuring that the columnar object is successfully aligned with and inserted into the target position.
Disclosure of Invention
Based on the above, the present application provides a method, a system, and a computer device for monitoring the traveling process of a columnar object, so as to monitor the spatial position and motion trajectory of the columnar object with high precision.
A method of monitoring a cylindrical object travel process, comprising:
emitting a group of stripe structure light to the surface of an object to be detected;
acquiring a group of image data according to the signal light reflected by the surface of the object to be detected;
and obtaining the coordinates and the direction vectors of the central axis base point of the object to be detected according to the image data and the calibration parameters of the system.
In one embodiment, the step of obtaining the coordinates of the central axis base point and the direction vector of the object to be measured according to the image data and the calibration parameters of the system includes:
determining the three-dimensional point cloud information of the object to be detected according to the image data and the calibration parameters of the system;
and fitting the three-dimensional point cloud information to obtain the central axis base point coordinates and the direction vectors of the object to be detected.
In one embodiment, the step of determining the three-dimensional point cloud information of the object to be measured according to the image data and the calibration parameters of the system includes:
determining phase information of the object to be detected according to the gray values of the image data;
and converting the phase information into the three-dimensional point cloud information according to the calibration parameters.
In one embodiment, the relationship between the phase information of the object to be measured and the gray values is:

φ = arctan( Σ_{n=1}^{N} I_n·sin(2πn/N) / Σ_{n=1}^{N} I_n·cos(2πn/N) )

where φ is the phase information of the object to be measured, I_n is the gray value corresponding to the nth image data, and N is the number of phase-shifted images in the group.
In one embodiment, the stripe-structured light is phase-shifted fringe light with a sinusoidal intensity distribution.
In one embodiment, the step of obtaining the central axis base point coordinates and the direction vector of the object to be measured according to the image data and the calibration parameters of the system comprises:
and calculating the relative spatial position of the object to be detected and the target according to the coordinate of the central axis base point of the object to be detected and the direction vector.
A system for monitoring a cylindrical object travel process, comprising:
the projection equipment is used for emitting a group of stripe structure light to the surface of the object to be measured;
the detector is connected with the projection equipment and used for acquiring a group of image data according to the signal light reflected by the surface of the object to be detected; and
and the processor is connected with the detector and used for obtaining the coordinate of the central axis base point and the direction vector of the object to be detected according to the image data and the calibration parameters of the system.
In one embodiment, the processor comprises:
the parameter setting module is connected with the projection equipment and is used for setting the parameters of the stripe-structured light and sending them to the projection equipment; and
the data processing module is connected with the detector and is used for determining the three-dimensional point cloud information of the object to be detected according to the image data and the calibration parameters of the system, and fitting the three-dimensional point cloud information to obtain the central axis base point coordinates and the direction vector of the object to be detected.
In one embodiment, the processor further comprises:
and the calculation module is connected with the data processing module and used for calculating the relative spatial position of the object to be detected and the target according to the central axis base point coordinates and the direction vector of the object to be detected.
A computer device comprises a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the step of obtaining the central axis base point coordinates and the direction vector of the object to be measured according to the image data and the calibration parameters of the system.
The method for monitoring the traveling process of the columnar object comprises emitting a group of stripe-structured light onto the surface of the object to be measured, acquiring a group of image data from the signal light reflected by that surface, and obtaining the central axis base point coordinates and direction vector of the object from the image data and the calibration parameters of the system, thereby realizing high-speed, high-precision monitoring and feedback during the traveling process of the columnar object. The method is simple to operate, fast and precise in measurement, and simple in data processing.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments or the conventional technologies of the present application, the drawings used in the description of the embodiments or the conventional technologies are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a flow chart of a method for monitoring a process of traveling a cylindrical object according to an embodiment of the present application;
FIG. 2 is a block diagram of a system for monitoring a traveling process of a cylindrical object according to an embodiment of the present application;
FIG. 3 is a block diagram of a system for monitoring a traveling process of a cylindrical object according to another embodiment of the present application;
FIG. 4 is a block diagram of a system for monitoring a traveling process of a cylindrical object applied to a high precision welding system according to a first embodiment of the present disclosure;
FIG. 5 is a block diagram of a system for monitoring the progress of a cylindrical object in a high-precision needle insertion system according to a second embodiment of the present application;
FIG. 6 is a graph of needle and catheter monitoring data acquired in a second embodiment as provided by one embodiment of the present application;
FIG. 7 is a graph of calculated phase data for a needle and catheter in a second embodiment as provided in one embodiment of the present application;
FIG. 8 is a graph of calculated three-dimensional point cloud data for a needle and catheter in a second embodiment as provided in one embodiment of the present application;
FIG. 9 shows a base point and a direction vector of the central axis of the needle and catheter calculated in a second embodiment as provided in one embodiment of the present application.
Description of the main element reference numerals
10. projection device; 20. detector; 30. processor; 31. parameter setting module; 32. data processing module; 33. calculation module; 40. welding equipment; 41. welding pin; 50. welding target; 60. mechanical arm; 70. injection device; 71. injection needle; 72. injection target; 73. catheter; 80. swing angle device; 90. three-dimensional displacement platform.
Detailed Description
In order to make the aforementioned objects, features, and advantages of the present application more comprehensible, embodiments of the present application are described in detail below with reference to the accompanying drawings. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. The present application can, however, be embodied in many forms different from those described herein, and those skilled in the art can make similar modifications without departing from its spirit; the application is therefore not limited to the embodiments disclosed below.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first acquisition module may be referred to as a second acquisition module, and similarly, a second acquisition module may be referred to as a first acquisition module, without departing from the scope of the present application. The first acquisition module and the second acquisition module are both acquisition modules, but are not the same acquisition module.
It will be understood that when an element is referred to as being "disposed on" another element, it can be directly on the other element or intervening elements may also be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein in the description of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
Referring to fig. 1, the present application provides a method for monitoring a traveling process of a cylindrical object. A method of monitoring a cylindrical object travel process, comprising:
and S10, emitting a group of stripe structured light to the surface of the object to be measured.
In step S10, the object to be measured may be a cylindrical object or another upright columnar object. In an optional embodiment, the object to be measured is a welding pin in a high-precision welding system. A group of stripe-structured light can be emitted onto the surface of the object to be measured by a digital projection system. In an optional embodiment, the stripe-structured light is phase-shifted fringe light with a sinusoidal intensity distribution. After the stripe-structured light is reflected by the surface of the object to be measured, different signal lights are formed.
And S20, acquiring a group of image data according to the signal light reflected by the surface of the object to be measured.
In step S20, synchronous acquisition can be performed by the detector 20. The detector 20 may be wired to the digital projection system to synchronize the switching and acquisition of the projection patterns. The image data carry gray value information.
And S30, obtaining the coordinate of the central axis base point and the direction vector of the object to be detected according to the image data and the calibration parameters of the system.
In step S30, the calibration parameters of the system may be stored in the processor 30. The processor 30 may be a single-chip microcomputer or a microprocessor, and its functions may also be implemented by a computer. The processor 30 may store the image data and calculate the central axis base point coordinates and direction vector of the object to be measured. The processor 30 may also include a parameter setting module 31 connected to the digital projection system, optionally by a data line. The parameter setting module 31 is used for importing projection patterns and setting projection parameters.
In one embodiment, step S30 includes determining the three-dimensional point cloud information of the object to be measured according to the image data and the calibration parameters of the system, and fitting the three-dimensional point cloud information to obtain the central axis base point coordinates and the direction vector of the object to be measured. The three-dimensional point cloud information comprises a plurality of image points, each corresponding to one piece of image data. Fitting all the three-dimensional point clouds yields a three-dimensional image of the object to be measured, from which the coordinates of a base point on the central axis and the direction vector of the central axis are obtained. These reflect the motion direction of the object to be measured, and the relative spatial position of the object and the target can be calculated from them. The relative spatial position can be fed back to the operating system to fine-tune the advancing direction of the object, ensuring smooth alignment and insertion of the object into the target.
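The fitting step just described can be sketched as a principal-axis fit (a sketch only, since the patent does not specify the fitting algorithm): the centroid of the three-dimensional point cloud serves as the base point, and the dominant singular vector of the centered cloud gives the direction vector. The function name and array layout below are assumptions:

```python
import numpy as np

def fit_cylinder_axis(points):
    """Fit a central axis to an N x 3 point cloud.

    Returns (base_point, direction): the centroid of the cloud, and a unit
    vector along its dominant direction of variation, which for a thin
    cylindrical cloud approximates the central axis.
    """
    points = np.asarray(points, dtype=float)
    base_point = points.mean(axis=0)  # centroid serves as the axis base point
    # SVD of the centered cloud: the first right singular vector is the
    # direction of greatest variance, i.e. the cylinder's long axis.
    _, _, vt = np.linalg.svd(points - base_point, full_matrices=False)
    direction = vt[0]
    # Fix a consistent orientation (here: positive z component).
    if direction[2] < 0:
        direction = -direction
    return base_point, direction

# Synthetic check: a thin helix around the line p(t) = (1, 2, 0) + t*(0, 0, 1)
t = np.linspace(-5, 5, 200)
cloud = np.stack([1 + 0.01 * np.cos(7 * t), 2 + 0.01 * np.sin(7 * t), t], axis=1)
base, d = fit_cylinder_axis(cloud)
```

A full implementation would typically refine this initial axis with a nonlinear cylinder fit, but the centroid-plus-SVD estimate is already accurate when the cloud is long and thin.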
In one embodiment, determining the three-dimensional point cloud information of the object to be measured according to the image data and the calibration parameters of the system comprises determining phase information of the object to be measured according to the gray values of the image data, and converting the phase information into the three-dimensional point cloud information according to the calibration parameters.
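The conversion from phase to three-dimensional coordinates depends on the system's calibration parameters, which the patent does not spell out. As an illustration only, a common simplification is a linear reference-plane model in which height is proportional to the phase difference from a flat reference; the constant k and the function below are assumptions, not the patent's calibration:

```python
import numpy as np

def phase_to_height(phase, reference_phase, k):
    """Convert unwrapped phase to height with a linear reference-plane model.

    h(x, y) = k * (phase - reference_phase), where k is a calibration
    constant obtained by measuring objects of known height. Real systems
    use a full projector-camera calibration rather than a single constant.
    """
    return k * (np.asarray(phase, float) - np.asarray(reference_phase, float))

heights = phase_to_height([1.2, 1.5], [1.0, 1.0], k=2.0)
```

Combined with the pixel's (x, y) position, the recovered height gives one three-dimensional point per pixel, which together form the point cloud to be fitted.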
In one embodiment, the relationship between the phase information of the object to be measured and the gray values is:

φ = arctan( Σ_{n=1}^{N} I_n·sin(2πn/N) / Σ_{n=1}^{N} I_n·cos(2πn/N) )

where φ is the phase information of the object to be measured, I_n is the gray value corresponding to the nth image data, and N is the number of phase-shifted images in the group.
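The N-step phase-shifting relation above can be evaluated per pixel directly from the captured gray values. A minimal sketch under assumed conventions (array shape (N, H, W), wrapped phase in (-π, π]; real systems add phase unwrapping before the point-cloud conversion):

```python
import numpy as np

def wrapped_phase(images):
    """Compute the wrapped phase from N phase-shifted fringe images.

    images: array of shape (N, H, W); I_n is the gray value of the nth image.
    Returns phase in (-pi, pi] via the standard N-step formula:
        phi = arctan( sum I_n sin(2*pi*n/N) / sum I_n cos(2*pi*n/N) )
    """
    images = np.asarray(images, dtype=float)
    n_steps = images.shape[0]
    shifts = 2 * np.pi * np.arange(n_steps) / n_steps
    # Weighted sums over the image stack; arctan2 picks the correct quadrant.
    num = np.tensordot(np.sin(shifts), images, axes=([0], [0]))
    den = np.tensordot(np.cos(shifts), images, axes=([0], [0]))
    return np.arctan2(num, den)

# Synthetic check: 4-step sinusoidal fringes with a known phase of 0.7 rad,
# I_n = 128 + 100 * cos(phi0 - delta_n).
phi0 = 0.7
deltas = 2 * np.pi * np.arange(4) / 4
imgs = np.array([128 + 100 * np.cos(phi0 - s) for s in deltas])[:, None, None]
phi = wrapped_phase(imgs)
```

Because the sums are ordinary array reductions, the phase of every pixel in the group of images is recovered in one vectorized pass.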
In this embodiment, the method for monitoring the traveling process of the columnar object comprises emitting a group of stripe-structured light onto the surface of the object to be measured, acquiring a group of image data from the signal light reflected by that surface, and obtaining the central axis base point coordinates and direction vector of the object from the image data and the calibration parameters of the system, thereby realizing high-speed, high-precision monitoring and feedback during the traveling process of the columnar object. The method is simple to operate, fast and precise in measurement, and simple in data processing.
Referring to fig. 2, the present application provides a system for monitoring a traveling process of a cylindrical object. The system for monitoring the traveling process of the columnar object comprises a projection device 10, a detector 20 and a processor 30.
The projection device 10 is used for emitting a group of stripe-structured light to the surface of an object to be measured. The detector 20 is connected to the projection device 10. The detector 20 is configured to obtain a set of image data according to the signal light reflected by the surface of the object to be measured. The processor 30 is connected to the detector 20. The processor 30 is configured to obtain the coordinate of the central axis base point and the direction vector of the object to be measured according to the image data and the calibration parameters of the system.
It is understood that the object to be measured may be a cylindrical object or another upright columnar object. In an optional embodiment, the object to be measured is the welding pin 41 in a high-precision welding system.
It is to be understood that the structure of the projection device 10 is not particularly limited as long as it can emit the stripe-structured light. Optionally, a group of stripe-structured light is emitted onto the surface of the object to be measured by a digital projection system. In an optional embodiment, the stripe-structured light is phase-shifted fringe light with a sinusoidal intensity distribution. After the stripe-structured light is reflected by the surface of the object to be measured, different signal lights are formed.
It is understood that the structure of the detector 20 is not particularly limited, as long as a set of image data can be obtained according to the signal light reflected by the surface of the object to be measured. The detector 20 may be wired to the digital projection system to synchronize the switching and acquisition of the projection pattern. The image data carries gray value information. The detector 20 and the processor 30 may be connected by a data line.
It is understood that the structure of the processor 30 is not particularly limited, as long as the central axis base point coordinates and the direction vector of the object to be measured can be obtained according to the image data and the calibration parameters of the system. The calibration parameters of the system may be stored in the processor 30. The processor 30 may be a single-chip microcomputer or a microprocessor, and its functions may also be implemented by a computer. The processor 30 may store the image data and calculate the central axis base point coordinates and direction vector of the object to be measured.
Referring to fig. 3, in one embodiment, the processor 30 includes a parameter setting module 31, a data processing module 32, and a calculating module 33.
The parameter setting module 31 is connected to the projection device 10. The parameter setting module 31 is configured to set parameters of the stripe structure light, and send the parameters to the projection device 10. The data processing module 32 is connected to the detector 20. The data processing module 32 is configured to determine three-dimensional point cloud information of the object to be detected according to the image data and calibration parameters of the system. The data processing module 32 is further configured to fit the three-dimensional point cloud information to obtain a central axis base point coordinate and a direction vector of the object to be measured. The calculation module 33 is connected to the data processing module 32. The calculating module 33 is configured to calculate a relative spatial position between the object to be measured and the target according to the coordinate of the central axis base point of the object to be measured and the direction vector.
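The relative spatial position computed by the calculation module 33 can be illustrated with simple point-to-axis geometry: given the fitted base point and direction vector, the perpendicular distance from the target to the axis indicates how far the approach is off line. A sketch under assumed inputs (the patent does not fix how the target is represented; a single 3-D point is used here):

```python
import numpy as np

def target_offset(base_point, direction, target):
    """Perpendicular offset of a target point from the fitted central axis.

    base_point, target: 3-vectors; direction: vector along the axis.
    Returns (distance, foot): the off-axis distance, and the closest point
    on the axis, which a control system could use for fine adjustment.
    """
    base_point = np.asarray(base_point, float)
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)      # normalize the axis direction
    v = np.asarray(target, float) - base_point
    t = v.dot(d)                   # projection length along the axis
    foot = base_point + t * d      # closest axis point to the target
    distance = np.linalg.norm(np.asarray(target, float) - foot)
    return distance, foot

# Axis through the origin along z; target offset by 3 in x and 4 in y at height 5.
dist, foot = target_offset([0, 0, 0], [0, 0, 1], [3, 4, 5])
```

In a feedback loop, the vector from the foot point to the target would be sent to the motion system (mechanical arm, swing angle device, or displacement platform) as the correction to apply.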
It is understood that the parameter setting module 31, the data processing module 32 and the calculation module 33 may be implemented by a computer.
The parameter setting module 31 may be connected to the digital projection system. Optionally, the parameter setting module 31 and the digital projection system may be connected by a data line. The parameter setting module 31 is used for importing projection patterns and setting projection parameters. The data processing module 32 and the detector 20 may be connected by a data line.
In one embodiment, obtaining the central axis base point coordinates and the direction vector of the object to be measured according to the image data and the calibration parameters of the system comprises determining the three-dimensional point cloud information of the object to be measured according to the image data and the calibration parameters, and fitting the three-dimensional point cloud information to obtain the central axis base point coordinates and the direction vector. The three-dimensional point cloud information comprises a plurality of image points, each corresponding to one piece of image data. Fitting all the three-dimensional point clouds yields a three-dimensional image of the object to be measured, from which the coordinates of a base point on the central axis and the direction vector of the central axis are obtained. These reflect the motion direction of the object to be measured, and the relative spatial position of the object and the target can be calculated from them. The relative spatial position can be fed back to the operating system to fine-tune the advancing direction of the object, ensuring smooth alignment and insertion of the object into the target.
In one embodiment, determining the three-dimensional point cloud information of the object to be measured according to the image data and the calibration parameters of the system comprises determining phase information of the object to be measured according to the gray values of the image data, and converting the phase information into the three-dimensional point cloud information according to the calibration parameters.
In one embodiment, the relationship between the phase information of the object to be measured and the gray values is:

φ = arctan( Σ_{n=1}^{N} I_n·sin(2πn/N) / Σ_{n=1}^{N} I_n·cos(2πn/N) )

where φ is the phase information of the object to be measured, I_n is the gray value corresponding to the nth image data, and N is the number of phase-shifted images in the group.
In this embodiment, in the system for monitoring the traveling process of the cylindrical object, the projection device 10 emits a group of stripe-structured light onto the surface of the object to be measured, the detector 20 obtains a group of image data from the signal light reflected by that surface, and the processor 30 obtains the central axis base point coordinates and the direction vector of the object to be measured according to the image data and the calibration parameters of the system, so that high-speed, high-precision monitoring and feedback can be realized during the traveling process of the columnar object. The system is simple to operate, fast and precise in measurement, and simple in data processing.
Referring to fig. 4, the present application provides a system for monitoring a traveling process of a cylindrical object applied in a high precision welding machine system. The system for monitoring the traveling process of the columnar object is used for monitoring the welding pin 41 and the welding target 50 on the welding equipment 40. The system for monitoring the traveling process of the columnar object obtains the spatial deviation between the two and feeds the spatial deviation back to the mechanical arm 60 to realize calibration.
Specifically, the system for monitoring the traveling process of the columnar object comprises a projection device 10, a detector 20 and a processor 30. The projection device 10 is used for emitting a group of stripe-structured light to the surface of an object to be measured. The detector 20 is connected to the projection device 10. The detector 20 is configured to obtain a set of image data according to the signal light reflected by the surface of the object to be measured. The processor 30 is connected to the detector 20. The processor 30 is configured to obtain the coordinate of the central axis base point and the direction vector of the object to be measured according to the image data and the calibration parameters of the system.
It is to be understood that the structure of the projection device 10 is not particularly limited as long as it can emit the stripe-structured light. Optionally, a group of stripe-structured light is emitted onto the surface of the object to be measured by a digital projection system. In an optional embodiment, the stripe-structured light is phase-shifted fringe light with a sinusoidal intensity distribution. After the stripe-structured light is reflected by the surface of the object to be measured, different signal lights are formed.
It is understood that the structure of the detector 20 is not particularly limited, as long as a set of image data can be obtained according to the signal light reflected by the surface of the object to be measured. The detector 20 may be wired to the digital projection system to synchronize the switching and acquisition of the projection pattern. The image data carries gray value information. The detector 20 and the processor 30 may be connected by a data line.
It is understood that the structure of the processor 30 is not particularly limited, as long as the central axis base point coordinates and the direction vector of the object to be measured can be obtained according to the image data and the calibration parameters of the system. The calibration parameters of the system may be stored in the processor 30. The processor 30 may be a single-chip microcomputer or a microprocessor, and its functions may also be implemented by a computer. The processor 30 may store the image data and calculate the central axis base point coordinates and direction vector of the object to be measured.
In order to ensure that the welding pin 41 and the welding target 50 can be detected at all times, the system for monitoring the traveling process of the columnar object needs to be fixed on the welding equipment 40, so that rotation of the mechanical arm 60 cannot move the welding pin 41 out of the monitoring range. In practical application, when the mechanical arm 60 moves the welding pin 41 to the vicinity above the welding target 50, the system starts to work, measures the relative spatial position between the pin and the target, and feeds it back to the mechanical arm 60 for fine alignment.
Referring to fig. 5, the present application provides a system for monitoring the traveling process of a cylindrical object applied to a high-precision needle insertion system. The system is used for monitoring the docking and insertion process of the injection needle 71 on the injection device 70 and the catheter 73 on the injection target 72. The system obtains the spatial deviation between them and feeds it back to the electrically controlled high-precision swing angle device 80 and the electrically controlled high-precision three-dimensional displacement platform 90 to realize calibration and insertion.
Specifically, the system for monitoring the traveling process of the columnar object comprises a projection device 10, a detector 20 and a processor 30. The projection device 10 is used for emitting a group of stripe-structured light to the surface of an object to be measured. The detector 20 is connected to the projection device 10. The detector 20 is configured to obtain a set of image data according to the signal light reflected by the surface of the object to be measured. The processor 30 is connected to the detector 20. The processor 30 is configured to obtain the coordinate of the central axis base point and the direction vector of the object to be measured according to the image data and the calibration parameters of the system.
It is to be understood that the structure of the projection device 10 is not particularly limited, as long as the stripe-structured light can be emitted. Optionally, a set of stripe-structured light may be emitted to the surface of the object to be measured by a digital projection system. In one optional embodiment, the stripe-structured light is sinusoidally distributed phase-shifted stripe-structured light. After the stripe-structured light is reflected by the surface of the object to be detected, different signal lights are formed.
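As a sketch of how such sinusoidally distributed phase-shifted fringe patterns might be generated for the projection device, the following helper builds N patterns whose n-th member is shifted by 2π·n/N; the function name, fringe period, and normalisation are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def make_fringe_patterns(width, height, period=32, n_steps=4):
    """Generate n_steps sinusoidal fringe patterns, the n-th shifted by
    2*pi*n/n_steps, with intensities normalised to [0, 1]."""
    x = np.arange(width)
    patterns = []
    for n in range(n_steps):
        shift = 2 * np.pi * n / n_steps
        row = 0.5 + 0.5 * np.cos(2 * np.pi * x / period + shift)
        patterns.append(np.tile(row, (height, 1)))  # identical rows: vertical fringes
    return patterns
```

Each returned array can then be sent to the digital projection system in sequence, synchronized with the detector's acquisition.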
It is understood that the structure of the detector 20 is not particularly limited, as long as a set of image data can be obtained according to the signal light reflected by the surface of the object to be measured. The detector 20 may be wired to the digital projection system to synchronize the switching and acquisition of the projection pattern. The image data carries gray value information. The detector 20 and the processor 30 may be connected by a data line.
It is understood that the structure of the processor 30 is not particularly limited, as long as the central axis base point coordinates and the direction vector of the object to be measured can be obtained from the image data and the calibration parameters of the system. The calibration parameters of the system may be stored in the processor 30. The processor 30 may be a single-chip microcomputer or a microprocessor, and its functions may also be implemented by a computer. The processor 30 may store the image data and calculate the central axis base point coordinates and the direction vector of the object to be measured.
In practical application, when the electrically controlled high-precision three-dimensional displacement platform 90 moves the injection needle 71 to a position just above the conduit 73 on the injection target 72, the system for monitoring the traveling process of the columnar object starts to work. It measures three-dimensional point cloud information of the injection needle 71 and the conduit 73, obtains the central axis and base point of each through cylinder fitting, and feeds the results back to the system for angle and position adjustment, so that the injection needle 71 is inserted into the conduit 73 smoothly.
As shown in fig. 6, the monitoring image data acquired by the detector 20 comprises N pictures in total, corresponding to N different phase-shifted fringe-structured light projections. As shown in fig. 7, from the image data I1, I2, ..., IN, the phase value φ carrying the three-dimensional information of the beam needle and trocar surfaces is calculated. As shown in fig. 8, the calibration parameters of the system are then used to convert φ into a three-dimensional point cloud. As shown in fig. 9, the three-dimensional point cloud data are fitted by a cylinder fitting method to obtain the central axis base point coordinates and direction vectors of the beam needle and the trocar, completely determining the relative spatial position of the two.
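The phase calculation from the N phase-shifted images can be sketched as below. The intensity model I_n = A + B·cos(φ + 2πn/N) and the resulting sign inside the arctangent are assumptions (conventions for the shift direction vary), so this is a minimal illustration rather than the patent's exact formula:

```python
import numpy as np

def wrapped_phase(images):
    """Recover the wrapped phase map from N phase-shifted fringe images.

    Assumes the intensity model I_n = A + B*cos(phi + 2*pi*n/N),
    n = 0..N-1; the minus sign inside arctan2 follows from that
    convention.
    """
    n_steps = len(images)
    deltas = 2 * np.pi * np.arange(n_steps) / n_steps
    num = sum(img * np.sin(d) for img, d in zip(images, deltas))
    den = sum(img * np.cos(d) for img, d in zip(images, deltas))
    return np.arctan2(-num, den)  # wrapped into (-pi, pi]
```

The wrapped phase would then be unwrapped and converted into a three-dimensional point cloud using the system calibration parameters.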
A computer device comprises a memory, a processor 30, and a computer program stored in the memory and executable on the processor 30. When executing the computer program, the processor 30 implements the step, described in any of the above embodiments, of obtaining the central axis base point coordinates and the direction vector of the object to be measured from the image data and the calibration parameters of the system.
It is understood that the structure of the processor 30 is not particularly limited, as long as the central axis base point coordinates and the direction vector of the object to be measured can be obtained from the image data and the calibration parameters of the system. The calibration parameters of the system may be stored in the processor 30. The processor 30 may be a single-chip microcomputer or a microprocessor, and its functions may also be implemented by a computer. The processor 30 may store the image data and calculate the central axis base point coordinates and the direction vector of the object to be measured.
In one embodiment, the processor 30 includes a parameter setting module 31, a data processing module 32, and a calculation module 33.
The parameter setting module 31 is connected to the projection device 10 and is configured to set the parameters of the stripe-structured light and send them to the projection device 10. The data processing module 32 is connected to the detector 20 and is configured to determine the three-dimensional point cloud information of the object to be detected according to the image data and the calibration parameters of the system, and to fit the three-dimensional point cloud information to obtain the central axis base point coordinates and the direction vector of the object to be measured. The calculation module 33 is connected to the data processing module 32 and is configured to calculate the relative spatial position of the object to be measured and the target according to the central axis base point coordinates and the direction vector of the object to be measured.
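The data processing module's fitting step could be approximated as follows. This sketch substitutes a simple principal-component axis estimate for the patent's full cylinder fitting (the centroid as base point, the direction of greatest variance as the axis), which is a reasonable approximation when the cylinder is much longer than its radius; the function name is hypothetical:

```python
import numpy as np

def fit_axis(points):
    """Estimate a base point and unit direction vector for the central
    axis of a roughly cylindrical point cloud.

    Simplified sketch: the centroid serves as the base point and the
    first principal component as the axis direction.
    """
    pts = np.asarray(points, dtype=float)
    base_point = pts.mean(axis=0)          # centroid of the cloud
    centered = pts - base_point
    # First right-singular vector = direction of greatest variance.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    direction = vt[0] / np.linalg.norm(vt[0])
    return base_point, direction
```

A full least-squares cylinder fit would additionally estimate the radius and constrain the points' distance to the axis, but the base point and direction vector are the quantities the monitoring system feeds back.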
It is understood that the parameter setting module 31, the data processing module 32 and the calculation module 33 may be implemented by a computer.
The parameter setting module 31 may be connected to the digital projection system. Optionally, the parameter setting module 31 and the digital projection system may be connected by a data line. The parameter setting module 31 is used for importing projection patterns and setting projection parameters. The data processing module 32 and the detector 20 may be connected by a data line.
In this embodiment, the computer device emits a set of stripe-structured light to the surface of the object to be measured through the projection device 10. The detector 20 obtains a set of image data according to the signal light reflected by the surface of the object to be measured. The processor 30 obtains the central axis base point coordinates and the direction vector of the object to be detected according to the image data and the calibration parameters of the system, so that high-speed, high-precision monitoring and feedback can be realized during the traveling process of the columnar object. The system is simple to operate, fast and accurate in measurement, and simple in data processing.
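Given the fitted base points and direction vectors of the object to be measured and its target (for example, the injection needle and the conduit), the relative spatial position that is fed back for adjustment can be summarised by two scalars: the angular misalignment of the axes and the lateral offset of the base points. The helper below is an illustrative assumption about how that comparison might look, not the patent's own implementation:

```python
import numpy as np

def relative_pose(base_a, dir_a, base_b, dir_b):
    """Return (angular misalignment in radians, lateral offset) between
    two fitted axes, e.g. an injection needle and a conduit."""
    da = np.asarray(dir_a, dtype=float)
    da = da / np.linalg.norm(da)
    db = np.asarray(dir_b, dtype=float)
    db = db / np.linalg.norm(db)
    # Angle between the two centre axes (sign of the vectors ignored).
    angle = np.arccos(np.clip(abs(float(da @ db)), 0.0, 1.0))
    # Component of the base-point difference perpendicular to axis A.
    diff = np.asarray(base_b, dtype=float) - np.asarray(base_a, dtype=float)
    lateral = diff - (diff @ da) * da
    return angle, float(np.linalg.norm(lateral))
```

Driving both values toward zero (via the swing-angle device and the displacement platform, respectively) corresponds to the alignment-then-insertion sequence described above.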
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; however, any such combination should be considered within the scope of this specification as long as no contradiction exists.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but should not be construed as limiting the scope of the claims. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within the scope of protection of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A method of monitoring the progress of a cylindrical object, comprising:
emitting a group of stripe structure light to the surface of an object to be detected;
acquiring a group of image data according to the signal light reflected by the surface of the object to be detected;
and obtaining the coordinates and the direction vectors of the central axis base point of the object to be detected according to the image data and the calibration parameters of the system.
2. The method of claim 1, wherein the step of obtaining the coordinates of the base point of the central axis and the direction vector of the object to be measured according to the image data and the calibration parameters of the system comprises:
determining the three-dimensional point cloud information of the object to be detected according to the image data and the calibration parameters of the system;
and fitting the three-dimensional point cloud information to obtain the central axis base point coordinates and the direction vectors of the object to be detected.
3. The method of claim 2, wherein the step of determining the three-dimensional point cloud information of the object to be measured according to the image data and calibration parameters of the system comprises:
determining the phase information of the object to be detected according to the gray values of the image data;
and converting the phase information into the three-dimensional point cloud information according to the calibration parameters.
4. The method according to claim 3, wherein the relationship between the phase information of the object to be measured and the gray values is as follows:

φ = arctan( Σ_{n=1}^{N} I_n·sin(2πn/N) / Σ_{n=1}^{N} I_n·cos(2πn/N) )

wherein φ is the phase information of the object to be measured, and I_n is the gray value corresponding to the n-th image data.
5. The method of claim 1, wherein the stripe-structured light is sinusoidally distributed phase-shifted stripe-structured light.
6. The method of claim 1, wherein the step of obtaining the coordinates of the base point of the central axis and the direction vector of the object to be measured according to the image data and the calibration parameters of the system is followed by the step of:
and calculating the relative spatial position of the object to be detected and the target according to the coordinate of the central axis base point of the object to be detected and the direction vector.
7. A system for monitoring the progress of a cylindrical object, comprising:
the projection equipment is used for emitting a group of stripe structure light to the surface of the object to be measured;
the detector is connected with the projection equipment and used for acquiring a group of image data according to the signal light reflected by the surface of the object to be detected; and
and the processor is connected with the detector and used for obtaining the coordinate of the central axis base point and the direction vector of the object to be detected according to the image data and the calibration parameters of the system.
8. The system for monitoring the course of travel of a cylindrical object as recited in claim 7, wherein the processor comprises:
the parameter setting module, connected to the projection device, for setting the parameters of the stripe-structured light and sending them to the projection device; and
the data processing module, connected to the detector, for determining the three-dimensional point cloud information of the object to be detected according to the image data and the calibration parameters of the system, and fitting the three-dimensional point cloud information to obtain the central axis base point coordinates and the direction vector of the object to be detected.
9. The system for monitoring the course of travel of a cylindrical object as recited in claim 7, wherein the processor further comprises:
and the calculation module is connected with the data processing module and used for calculating the relative spatial position of the object to be detected and the target according to the central axis base point coordinates and the direction vector of the object to be detected.
10. A computer device comprising a memory, a processor and a computer program stored on the memory and running on the processor, wherein the processor when executing the computer program implements the steps of obtaining the center axis base point coordinates and the direction vector of the object to be measured according to the image data and the calibration parameters of the system in the method of any one of claims 1 to 6.
CN202010603182.7A 2020-06-29 2020-06-29 Method and system for monitoring traveling process of columnar object and computer equipment Pending CN111768441A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010603182.7A CN111768441A (en) 2020-06-29 2020-06-29 Method and system for monitoring traveling process of columnar object and computer equipment


Publications (1)

Publication Number Publication Date
CN111768441A 2020-10-13

Family

ID=72722873

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010603182.7A Pending CN111768441A (en) 2020-06-29 2020-06-29 Method and system for monitoring traveling process of columnar object and computer equipment

Country Status (1)

Country Link
CN (1) CN111768441A (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101196465A (en) * 2007-12-14 2008-06-11 武汉大学 Laser double-mode micro-volume sample analyzing method and its device
CN103727927A (en) * 2013-12-19 2014-04-16 大连理工大学 High-velocity motion object pose vision measurement method based on structured light
CN104014905A (en) * 2014-06-06 2014-09-03 哈尔滨工业大学 Observation device and method of three-dimensional shape of molten pool in GTAW welding process
CN105783726A (en) * 2016-04-29 2016-07-20 无锡科技职业学院 Curve-welding-seam three-dimensional reconstruction method based on line structure light vision detection
CN105953747A (en) * 2016-06-07 2016-09-21 杭州电子科技大学 Structured light projection full view three-dimensional imaging system and method
US20170085860A1 (en) * 2015-09-22 2017-03-23 Purdue Research Foundation Calibration arrangement for structured light system using a tele-centric lens
CN108692661A (en) * 2018-05-08 2018-10-23 深圳大学 Portable three-dimensional measuring system based on Inertial Measurement Unit and its measurement method
CN109341527A (en) * 2018-10-22 2019-02-15 广东工业大学 A kind of the structured light projection three-dimension measuring system and method for auto shadows compensation
JP2019191003A (en) * 2018-04-25 2019-10-31 オムロン株式会社 Control system, control method, and program
CN111046776A (en) * 2019-12-06 2020-04-21 杭州成汤科技有限公司 Mobile robot traveling path obstacle detection method based on depth camera



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination