WO2024109846A1 - Optical target three-dimensional measurement system and method, electronic device and storage medium - Google Patents

Optical target three-dimensional measurement system and method, electronic device and storage medium

Info

Publication number
WO2024109846A1
WO2024109846A1 · PCT/CN2023/133443
Authority
WO
WIPO (PCT)
Prior art keywords
camera
coordinate system
measuring
measurement
optical target
Prior art date
Application number
PCT/CN2023/133443
Other languages
English (en)
French (fr)
Inventor
谷飞飞
段俊凯
宋展
Original Assignee
中国科学院深圳先进技术研究院
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中国科学院深圳先进技术研究院 (Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences)
Publication of WO2024109846A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Definitions

  • the present invention belongs to the field of visual measurement technology, and in particular relates to an optical target three-dimensional measurement system and method, an electronic device, and a storage medium.
  • Portable optical target 3D measurement technology is widely used in the fields of mechanical manufacturing, aerospace, industrial inspection and clinical medicine due to its advantages of being easy to carry, install, operate and meet the requirements of on-site online measurement. For example, it can detect the geometric information of the inner cavity and outer shape of large special-shaped workpieces in industry, and track and locate the position information of joints and spine in medical surgery.
  • the portable optical target measurement system is a contact optical measurement system, which generally includes a visual system, an optical target, a computer and related software.
  • the surface of the optical target contains several marking points, and a contact probe is fixed at the bottom.
  • the optical target is held and its position is adjusted so that the probe contacts a certain point to be measured on the surface of the object to be measured, and the surface marking point can be observed and captured by the visual system; then the three-dimensional coordinate information of the marking point is obtained through image processing methods and visual measurement technology, and then the position of the measuring point is inferred and calculated.
  • the visual system is generally composed of two or more cameras. In these systems, the field of view of the visual system is relatively limited, and in order to achieve measurement, the optical target must be fully imaged in two or more cameras, which greatly limits the application of portable optical target three-dimensional measurement technology and equipment in the measurement of large-scale objects.
  • the purpose of the embodiments of this specification is to provide an optical target three-dimensional measurement system, method, electronic device and storage medium.
  • the center line of the optical axis of the measuring camera intersects the center line of the optical axis of the reference camera;
  • the reference plate is fixed on the side of the measurement camera facing the reference camera; the reference camera tracks and locates the changed pose of the measurement camera by capturing the changed posture of the reference plate;
  • a motion mechanism, which drives the measuring camera to realize rotation and pitch motion;
  • the optical target is used to contact the object to be measured, and the information of the optical target is captured by the measuring camera.
  • the motion mechanism includes a pitch mechanism and a rotation mechanism.
  • the pitch mechanism is an angular displacement platform, which realizes the pitch motion of the measuring camera through angular displacement; and the rotation mechanism is a horizontal rotation mechanism, which realizes the rotation motion of the measuring camera.
  • the optical target includes a target body and a probe, and a plurality of marking points are arranged on the surface of the target body.
  • the present application provides a method for three-dimensional measurement of an optical target, the method comprising:
  • the optical target contacts the measuring point on the surface of the object to be measured, and the motion mechanism adjusts the position of the measuring camera to put the optical target in the best imaging position;
  • the measuring camera captures the marking point information on the target surface of the optical target
  • the three-dimensional coordinates of each marking point are obtained;
  • the three-dimensional information of the measured object is determined based on the three-dimensional coordinates of multiple measuring points.
  • the method further includes determining a motion conversion relationship between the reference camera coordinate system and the current measurement camera coordinate system, including:
  • determining the motion conversion relationship between the reference plate coordinate system and the measuring camera coordinate system according to the installation relationship between the reference plate and the measuring camera;
  • before the optical target three-dimensional measurement system starts working, obtaining the three-dimensional coordinates of each feature point on the reference plate in the reference plate coordinate system and the pixel coordinates of the imaging corner point corresponding to each feature point in the reference camera image;
  • estimating the motion transformation relationship between the reference camera coordinate system and the reference board coordinate system from the homography between the three-dimensional coordinates of the feature points and the pixel coordinates of the imaging corner points;
  • after the system starts working, when the position and posture of the measuring camera change, the image of the reference plate also changes, and the motion conversion relationship between the current reference plate coordinate system and the reference camera is determined;
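The chain of conversions above can be sketched with 4×4 homogeneous transforms. This is a minimal illustration, not the application's implementation: the variable names, the 5 cm plate offset and the 30° rotation below are all hypothetical values.

```python
import numpy as np

def make_T(R, t):
    """Pack rotation R (3x3) and translation t (3,) into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Fixed, pre-calibrated relation: reference-plate frame -> measurement-camera frame.
# (Hypothetical values; the plate is assumed to sit 5 cm from the camera body.)
T_meacam_from_plate = make_T(np.eye(3), np.array([0.0, 0.0, -0.05]))

# Pose of the plate in the reference-camera frame, re-estimated from the
# checkerboard whenever the measurement camera moves (a 30 deg rotation assumed).
theta = np.deg2rad(30.0)
R_track = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                    [np.sin(theta),  np.cos(theta), 0.0],
                    [0.0,            0.0,           1.0]])
T_refcam_from_plate = make_T(R_track, np.array([0.0, 0.1, 0.5]))

# Chain the two relations: reference camera <- plate <- measurement camera.
T_refcam_from_meacam = T_refcam_from_plate @ np.linalg.inv(T_meacam_from_plate)
```

Because the plate-to-camera transform is fixed at installation time, only the plate pose in the reference camera needs to be re-estimated as the measurement camera moves.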
  • calculating the three-dimensional coordinates of the measuring point according to the three-dimensional coordinates of each marking point includes:
  • the three-dimensional coordinates of each marking point are determined by the following steps:
  • the relationship between the spatial coordinates of each marking point and the corresponding pixel coordinates in the optical target image is determined
  • the three-dimensional coordinates of each marking point are determined according to the geometric relationship between each marking point and the relationship between the spatial coordinates of each marking point and the corresponding pixel coordinates in the optical target image.
  • the method further includes:
  • the position and posture information of the current measuring camera is read, and according to the motion conversion relationship between the reference camera coordinate system and the current measuring camera coordinate system, the three-dimensional coordinates of the measuring point in the measuring coordinate system are converted into the coordinates in the reference coordinate system.
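The final conversion step described above amounts to applying the tracked motion conversion to each measured point. A minimal sketch, assuming the transform is held as a 4×4 homogeneous matrix (the numeric transform below is illustrative only):

```python
import numpy as np

def to_reference_frame(P_meacam, T_refcam_from_meacam):
    """Convert a 3D point from the current measurement-camera frame into the
    fixed reference-camera frame using the tracked 4x4 motion conversion."""
    p_h = np.append(np.asarray(P_meacam, dtype=float), 1.0)
    return (T_refcam_from_meacam @ p_h)[:3]

# Illustrative transform: 90 deg rotation about Z plus a translation.
T = np.array([[0.0, -1.0, 0.0, 0.2],
              [1.0,  0.0, 0.0, 0.0],
              [0.0,  0.0, 1.0, 0.1],
              [0.0,  0.0, 0.0, 1.0]])
P_ref = to_reference_frame([0.3, 0.0, 1.0], T)
```

Applying this to every measuring point expresses all measurements in one common frame, regardless of the pose the measurement camera had when each point was captured.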
  • the present application provides an electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein when the processor executes the program, the optical target three-dimensional measurement method according to the second aspect is implemented.
  • the present application provides a readable storage medium having a computer program stored thereon, which, when executed by a processor, implements the optical target three-dimensional measurement method as in the second aspect.
  • due to lens distortion and similar factors, the imaging error gradually increases from the center to the edge of the image;
  • the measurement camera pose can be adjusted in real time during the measurement so that the target is imaged as close to the center of the image as possible, thereby improving the accuracy of 3D reconstruction;
  • the optical target in this application only needs to be imaged in one camera, which reduces the difficulty of placing the handheld target and is more flexible in use.
  • FIG. 1 is a schematic diagram of the structure of an optical target three-dimensional measurement system provided by the present application;
  • FIG. 2 is a schematic diagram of the structure of the visual module provided by the present application;
  • FIG. 3 is a schematic diagram of a pattern of a reference plate provided in the present application;
  • FIG. 4 is a schematic diagram of the distribution of marking points on the target surface provided by the present application;
  • FIG. 5 is a schematic diagram of the geometric relationship of the reference plate provided by the present application in the image plane of the reference camera;
  • FIG. 6 is a schematic diagram of reference plate images captured by the reference camera when the measurement camera provided in the present application is in different postures;
  • FIG. 7 is a schematic diagram of the process of the optical target three-dimensional measurement method provided by the present application;
  • FIG. 8 is a schematic diagram of monocular imaging of four-point collinear optical targets provided by the present application;
  • FIG. 9 is a schematic diagram of the structure of an electronic device provided in this application.
  • the visual part of the portable optical target three-dimensional measurement system generally adopts a stereoscopic vision system composed of two or more cameras, such as a handheld optical target and its measurement method for visual coordinate measurement, which realizes a spatial measurement method based on a binocular stereoscopic vision system by designing a plane target with several LED optical marking points installed on the surface.
  • a handheld spherical target and measurement method used in binocular stereoscopic vision measurement which designs a handheld spherical target with regular feature points arranged on the surface, and realizes three-dimensional measurement based on a binocular stereoscopic vision system.
  • the current portable optical target 3D measurement systems mostly use two or more cameras that collect images from fixed postures, which has the following disadvantages: the field of view is limited, and the optical target must be completely imaged in two or more cameras, which restricts the measurement of large-sized objects.
  • this application proposes an optical target three-dimensional measurement method and system.
  • the visual system adopts a combination mode of "reference camera + measurement camera".
  • the measurement camera realizes rotation and pitch movement through a motion mechanism, and a high-precision reference plate is pasted or installed on the surface.
  • the reference camera is fixed and the position information of the measurement camera is determined in real time by capturing the real-time status of the reference plate.
  • FIG. 1 shows a schematic structural diagram of an optical target three-dimensional measurement system applicable to an embodiment of the present application.
  • the optical target three-dimensional measurement system may include
  • the optical axis center line of the measuring camera intersects the optical axis center line of the reference camera;
  • a reference plate 30 is fixed to a side of the measuring camera facing the reference camera; the reference camera tracks and locates the transformed posture of the measuring camera by capturing the changed posture of the reference plate;
  • a motion mechanism 40 which drives the measuring camera 20 to realize rotation and pitch motion
  • the optical target 50 contacts the object to be measured, and the information of the optical target 50 is captured by the measuring camera 20 .
  • the motion mechanism 40 includes a pitch mechanism 410 and a rotation mechanism 420 .
  • the pitch mechanism 410 is an angular displacement platform, which realizes the pitch motion of the measuring camera through angular displacement;
  • the rotation mechanism 420 is a horizontal rotation mechanism, which realizes the rotation motion of the measuring camera.
  • the reference camera 10 , the measuring camera 20 , the reference plate 30 and the motion mechanism (or motion control mechanism) 40 can be collectively referred to as a vision module, as shown in FIG. 2 .
  • the reference camera 10 and the measuring camera 20 are installed vertically, with the optical axis center lines of the two cameras intersecting; the intersection point is as close as possible to the imaging optical center of the measuring camera, as shown in FIG. 2.
  • the motion mechanism 40 is composed of a pitch mechanism 410 and a rotation mechanism 420.
  • the pitch mechanism 410 is an angular displacement platform, which can achieve ⁇ 30° pitch motion of the measuring camera through angular displacement;
  • the rotation mechanism 420 is a horizontal rotation mechanism, which can achieve 360° rotation motion of the measuring camera.
  • the measuring camera 20 is fixedly mounted on the horizontal mounting surface above the motion mechanism 40, and a wide range of observation angles can be achieved through the designed motion mechanism.
  • the angular displacement centerline of the pitch mechanism 410 coincides with the rotation centerline of the rotation mechanism 420 and intersects with the optical axis centerline of the measuring camera 20 , and the intersection point is as close as possible to the imaging optical center position of the measuring camera 20 ; as shown in FIG. 2 .
  • although FIGS. 1 and 2 show the motion mechanism with the rotation mechanism on top and the pitch mechanism on the bottom, the arrangement may also be reversed, with the rotation mechanism on the bottom and the pitch mechanism on top; this is not limited here.
  • the reference plate 30 can be installed or pasted on the upper surface of the measuring camera 20 (i.e., the side facing the reference camera 10).
  • the reference plate 30 uses a high-precision reference plate and is used to track the position of the measuring camera 20.
  • the patterns that can be used as the reference plate 30 include a checkerboard pattern (i.e., black and white squares in both horizontal and vertical directions, as shown in FIG3 (a)), a dot pattern (i.e., dots arranged at equal intervals in both horizontal and vertical directions, as shown in FIG3 (b)), etc.
  • the checkerboard pattern shown in FIG3 (a) is used as an example for explanation, and the checkerboard intersections are used as feature points.
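When the checkerboard intersections serve as feature points, their 3D coordinates in the reference plate coordinate system follow directly from the board geometry. A small sketch; the 9×6 corner count and 10 mm square size are illustrative assumptions, not values from the application:

```python
import numpy as np

# Hypothetical board geometry: 9x6 inner corners, 10 mm square size.
ROWS, COLS, SQUARE = 6, 9, 0.010   # metres

def board_object_points(rows=ROWS, cols=COLS, square=SQUARE):
    """3D coordinates of the checkerboard corners in the reference-plate
    coordinate system (the plate defines the Z = 0 plane)."""
    xs, ys = np.meshgrid(np.arange(cols), np.arange(rows))
    pts = np.zeros((rows * cols, 3))
    pts[:, 0] = xs.ravel() * square   # X along the board columns
    pts[:, 1] = ys.ravel() * square   # Y along the board rows
    return pts                        # Z stays 0 on the plate

corners_3d = board_object_points()
```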
  • the optical target (or simply referred to as target) includes a target body and a probe, and a plurality of marking points are arranged on the surface of the target body.
  • the distribution of the marking points on the target surface is shown in Figure 1. It can be four points in a collinear manner, or it can be any other geometric distribution relationship, including multiple points in a collinear manner, multiple points in a coplanar manner, multiple points in multiple planes, etc. As long as the constraint relationship between the points is known, the three-dimensional coordinates of the marking points can be calculated, as shown in Figure 4 (including Figure 4 (a), Figure 4 (b), and Figure 4 (c)).
  • the marking points on the target surface can be passive physical marking points such as corner points, intersection points, reflective balls, etc., or optical marking points that actively emit light such as infrared LED lights, and no limitation is made here.
  • this embodiment greatly enlarges the measurement field of view through the new visual system structure of "reference camera + measurement camera", enabling the measurement of large-sized targets.
  • the optical target in the present application only needs to be imaged in one camera (ie, the measurement camera), which reduces the difficulty of placing the handheld target and is more flexible in use.
  • the motion transformation relationship between the reference camera coordinate system and the current measurement camera coordinate system is first determined, specifically including:
  • the motion conversion relationship between the reference plate coordinate system and the measuring camera coordinate system is determined
  • the three-dimensional coordinates of each feature point on the reference plate in the reference plate coordinate system and the pixel coordinates of the imaging corner point corresponding to each feature point in the reference camera image are obtained;
  • the motion transformation relationship between the reference camera coordinate system and the reference board coordinate system is estimated
  • the position and posture of the measuring camera changes, and the image on the reference plate also changes, so the motion conversion relationship between the current reference plate coordinate system and the reference camera is determined;
  • the conversion relationship between the reference camera coordinate system and the current measurement camera coordinate system is determined according to the conversion relationship between the reference plate coordinate system and the measuring camera coordinate system and the motion conversion relationship between the current reference plate coordinate system and the reference camera.
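The homography-based pose estimation between the reference camera and the board can be sketched as below. This is a standard Zhang-style planar decomposition, not the application's own code; the intrinsic matrix K, the board pose and the grid used in testing are hypothetical, and a production implementation would normalize coordinates before the DLT for numerical robustness.

```python
import numpy as np

def estimate_homography(pts_plane, pts_pixel):
    """DLT estimate of the homography H with pixel ~ H @ (X, Y, 1) for
    feature points lying on the reference plate (Z = 0 in the plate frame)."""
    A = []
    for (x, y), (u, v) in zip(pts_plane, pts_pixel):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 3)           # null vector of A, reshaped

def pose_from_homography(H, K):
    """Recover rotation R and translation t of the plate frame in the camera
    frame from H = s * K [r1 r2 t] (planar decomposition)."""
    M = np.linalg.inv(K) @ H
    s = 1.0 / np.linalg.norm(M[:, 0])
    if M[2, 2] < 0:                       # keep the plate in front of the camera
        s = -s
    r1, r2, t = s * M[:, 0], s * M[:, 1], s * M[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt = np.linalg.svd(R)           # snap to the nearest true rotation
    return U @ Vt, t
```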
  • the checkerboard pattern in Figure 3(a) is used as an example.
  • the reference camera and the measurement camera have been calibrated in advance using Zhang's calibration method;
  • the calibrated internal parameter matrices of the two are K ref and K mea, both of which are 3×3 matrices;
  • the calibrated lens distortion coefficients of the reference camera and the measurement camera are k ref and k mea, respectively.
  • the reference plate and the measurement camera are fixedly installed; let the reference plate coordinate system be O plane -X plane Y plane Z plane and the measurement camera coordinate system be O meacam -X meacam Y meacam Z meacam . Because of the fixed installation, the motion transformation relationship between the two coordinate systems is known and is recorded as a fixed rotation matrix and translation vector, so that any point expressed in the reference plate coordinate system can be converted into the measurement camera coordinate system.
  • before measurement, the pose between the reference camera coordinate system and the reference board coordinate system (i.e., the conversion relationship between the two) is calculated, and the result is saved in the memory for later use.
  • the calculation method is as follows: the characteristic corner points of the chessboard are expressed in the chessboard coordinate system, with coordinates known from the board geometry;
  • (u 0 ,v 0 ) is the pixel coordinate of the center point of the reference camera image, and (f 1 ,f 2 ) are the lens focal lengths; both are pre-calibrated internal parameters of the reference camera. The corner pixel coordinates are measured along the horizontal and vertical directions of the image.
  • removing lens distortion from the corner point coordinates: the normalized coordinates obtained above still contain lens distortion, so distortion correction is required. The distortion correction method is relatively mature and will not be repeated here; the Γ function represents the correction process, and k d represents the pre-calibrated distortion coefficients of the reference camera lens. The corrected corner coordinates are the undistorted normalized coordinates.
  • u 1 , u 2 , u 3 represent vectors along the x-axis, y-axis and z-axis of space, respectively; from them the rotation matrix and translation matrix from the reference plate coordinate system to the measurement camera coordinate system are obtained. dot() represents the dot product operation.
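The normalization and Γ-style correction just described can be sketched as below. The radial-only model with coefficients k1 and k2, and all numeric values, are assumptions for illustration, since the application does not reproduce its distortion model in this text:

```python
import numpy as np

def normalize(pixel, K):
    """Pixel -> normalized image coordinates: x = (u - u0)/f1, y = (v - v0)/f2."""
    u0, v0, f1, f2 = K[0, 2], K[1, 2], K[0, 0], K[1, 1]
    return np.array([(pixel[0] - u0) / f1, (pixel[1] - v0) / f2])

def undistort(x_d, k1, k2, iters=20):
    """Fixed-point analogue of the Γ correction: invert the radial model
    x_d = x_u * (1 + k1*r^2 + k2*r^4) to recover the undistorted point x_u."""
    x_d = np.asarray(x_d, dtype=float)
    x_u = x_d.copy()
    for _ in range(iters):
        r2 = x_u @ x_u                          # squared radius of current guess
        x_u = x_d / (1.0 + k1 * r2 + k2 * r2 * r2)
    return x_u
```

For moderate distortion the fixed-point iteration converges in a handful of steps; stronger models (tangential terms, higher-order radial terms) follow the same pattern.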
  • FIG. 6 shows examples of reference board images captured by the reference camera when the measurement camera is in different postures.
  • the reference board image captured by the reference camera is shown in Figure 6(a). If the measurement camera only has horizontal rotation movement, the reference board image captured by the reference camera is shown in Figure 6(b), where the chessboard squares only rotate without deformation; if the measurement camera has both horizontal rotation and pitch movement, the reference board image captured by the reference camera is shown in Figure 6(c), where the chessboard squares rotate and deform.
  • the reference camera captures the checkerboard image on the reference board; the changed chessboard corner point information is extracted and sorted, and the current reference plate coordinate system is defined accordingly. Following steps (1)-(3) above, the motion conversion relationship between the reference camera coordinate system and the current reference plate coordinate system can be obtained.
  • the motion transformation relationship between the measurement camera and the reference camera coordinate system can be obtained when the measurement camera is in any position, thereby realizing the unification of the three-dimensional reconstruction data collected at different positions of the measurement camera.
  • the reference camera captures the checkerboard image on the reference board and records the position of the feature points as the reference corner point information; when the measurement camera posture changes, the checkerboard image on the reference board changes accordingly.
  • the reference camera captures the checkerboard image on the reference board, extracts the changed checkerboard corner point information, and calculates the posture change of the current measurement camera in combination with the reference corner point information.
  • FIG. 7 which shows a schematic flow chart of the optical target three-dimensional measurement method applicable to the embodiment of the present application.
  • a method for three-dimensional measurement of an optical target may include:
  • the optical target contacts the measuring point on the surface of the measured object, and the motion mechanism adjusts the position of the measuring camera to place the optical target in an optimal imaging position.
  • the measuring camera captures the marking point information on the target surface of the optical target
  • S740 calculating the three-dimensional coordinates of the measuring point according to the three-dimensional coordinates of each marking point, may include:
  • the three-dimensional coordinates of each marking point are determined by the following steps:
  • the relationship between the spatial coordinates of each marking point and the corresponding pixel coordinates in the optical target image is determined
  • the three-dimensional coordinates of each marking point are determined according to the geometric relationship between the marking points and the relationship between the spatial coordinates of each marking point and its corresponding pixel coordinates in the optical target image.
  • a three-coordinate measuring machine or other method is used to calibrate the positional relationship between each marking point on the optical target and the probe in advance; that is, the distances between points A, B, C and D, and their distances to the probe, are calibrated beforehand.
  • the target is held and fine-tuned so that the probe Q of the target contacts the position of the measured point, and the measuring camera captures the target image.
  • the four marking points are imaged in the image; based on the image, the imaging points a, b, c and d corresponding to the marking points A, B, C and D are identified, from which the three-dimensional positions of the marking points are reconstructed.
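Formulas (12), (13) and (16) are not reproduced in this text, so the following is one standard way (not necessarily the application's exact derivation) to recover four collinear marker points from a single undistorted image, using the pre-calibrated inter-point distances: each marker must lie on its viewing ray, and collinearity makes the constraints linear in the first point P_A and the line direction u. All numeric values in the test are hypothetical.

```python
import numpy as np

def skew(v):
    """Cross-product matrix: skew(v) @ w == np.cross(v, w)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def reconstruct_collinear(K, pixels, dists):
    """Recover collinear marker points from one image.
    pixels: (N,2) pixel coordinates; dists: (N,) known distances from the
    first marker along the line. Returns (P_A, unit direction u)."""
    Kinv = np.linalg.inv(K)
    rows = []
    for (px, py), t in zip(pixels, dists):
        v = Kinv @ np.array([px, py, 1.0])      # viewing ray of this marker
        S = skew(v)
        rows.append(np.hstack([S, t * S]))      # v x (P_A + t*u) = 0
    A = np.vstack(rows)
    _, _, Vt = np.linalg.svd(A)
    x = Vt[-1]                                  # null vector = [P_A; u] up to scale
    x /= np.linalg.norm(x[3:])                  # metric scale: u must be unit length
    if x[2] < 0:                                # sign: markers are in front of camera
        x = -x
    return x[:3], x[3:]
```

The known metric spacing between the markers is what removes the monocular scale ambiguity: scaling (P_A, u) jointly is fixed by requiring u to be a unit vector.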
  • removing lens distortion from the imaged point coordinates: the normalized coordinates obtained contain lens distortion, so distortion correction is required.
  • the distortion correction method is relatively mature and will not be described here; the Γ′ function represents the correction process, and k′ d represents the pre-calibrated distortion coefficients of the measurement camera lens.
  • the three-dimensional coordinate PA of point A can be obtained by formula (12), and the three-dimensional coordinates PB and PC of points B and C can be obtained by formulas (16) and (13). Other points on the optical target can also be calculated in the same way.
  • the three-dimensional coordinates of the measuring point can be calculated.
  • the three-dimensional coordinates of the current measuring point Q are P Q (X Q , Y Q , Z Q ), and the distances from the four points A, B, C, and D on the target to the measuring point Q are calibrated in advance.
  • the distances are L AQ , L BQ , L CQ , and L DQ , respectively, then:
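For the collinear layout of FIG. 1, a natural reading (an assumption here, since the closed-form formula is not reproduced) is that the probe tip Q lies on the marker line at a pre-calibrated offset, so Q follows directly from the reconstructed line; the remaining calibrated distances L BQ, L CQ, L DQ then serve as a consistency check. All numbers below are illustrative.

```python
import numpy as np

def probe_point(P_A, u, L_AQ):
    """Probe tip Q on the marker line: Q = A + L_AQ * u, where u is the unit
    direction from A toward the probe (fixed when the target is calibrated)."""
    return np.asarray(P_A, dtype=float) + L_AQ * np.asarray(u, dtype=float)

# Illustrative calibration: markers A..D every 40 mm, probe 60 mm beyond D.
P_A = np.array([0.1, -0.05, 0.6])
u = np.array([1.0, 2.0, 2.0]) / 3.0            # unit direction of the marker line
marker_offsets = np.array([0.0, 0.04, 0.08, 0.12])
L_AQ = 0.18
Q = probe_point(P_A, u, L_AQ)

# Consistency check against the pre-calibrated distances L_AQ..L_DQ.
markers = P_A + np.outer(marker_offsets, u)
dists_to_Q = np.linalg.norm(markers - Q, axis=1)
```

For non-collinear marker layouts (FIG. 4(b)/(c)), Q would instead be solved from the distance constraints to at least four non-coplanar markers.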
  • the method further includes:
  • the position and posture information of the current measuring camera is read, and according to the motion conversion relationship between the reference camera coordinate system and the current measuring camera coordinate system, the three-dimensional coordinates of the measuring point in the measuring coordinate system are converted into the coordinates in the reference coordinate system.
  • S750 Determine three-dimensional information of the measured object according to the three-dimensional coordinates of the multiple measuring points.
  • information such as the inner cavity shape, shape and position tolerance, etc. of the object to be measured can be calculated according to requirements.
  • the optical target three-dimensional measurement method provided in the embodiment of the present application can adjust the measurement camera posture in real time during measurement so that the target is imaged at the center of the image as much as possible, thereby improving the three-dimensional reconstruction accuracy.
  • the reference camera is fixed, and the position and posture information of the measuring camera can be determined in real time by capturing the real-time status of the reference plate, which is simple and flexible to use.
  • FIG. 9 is a schematic diagram of the structure of an electronic device 900 suitable for implementing the embodiments of the present application.
  • the electronic device 900 includes a central processing unit (CPU) 901, which can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 902 or a program loaded from a storage part 908 into a random access memory (RAM) 903.
  • in the RAM 903, various programs and data required for the operation of the device 900 are also stored.
  • the CPU 901, the ROM 902, and the RAM 903 are connected to each other via a bus 904.
  • An input/output (I/O) interface 905 is also connected to the bus 904.
  • the following components are connected to the I/O interface 905: an input section 906 including a keyboard, a mouse, etc.; an output section 907 including a cathode ray tube (CRT), a liquid crystal display (LCD), a speaker, etc.; a storage section 908 including a hard disk, etc.; and a communication section 909 including a network interface card such as a LAN card, a modem, etc.
  • the communication section 909 performs communication processing via a network such as the Internet.
  • a drive 910 is also connected to the I/O interface 905 as needed.
  • a removable medium 911 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, etc., is installed on the drive 910 as needed, so that a computer program read therefrom is installed into the storage section 908 as needed.
  • an embodiment of the present disclosure includes a computer program product, which includes a computer program tangibly contained on a machine-readable medium, and the computer program includes a program code for executing the above-mentioned optical target three-dimensional measurement method.
  • the computer program can be downloaded and installed from a network through the communication part 909, and/or installed from the removable medium 911.
  • each box in the flow chart or block diagram can represent a module, a program segment, or a part of the code, and the aforementioned module, program segment, or a part of the code contains one or more executable instructions for realizing the specified logical function.
  • the functions marked in the box can also occur in a different order from the order marked in the accompanying drawings. For example, two boxes represented in succession can actually be executed substantially in parallel, and they can sometimes be executed in the opposite order, depending on the functions involved.
  • each box in the block diagram and/or flow chart, and the combination of the boxes in the block diagram and/or flow chart can be implemented with a dedicated hardware-based system that performs the specified function or operation, or can be implemented with a combination of dedicated hardware and computer instructions.
  • the units or modules involved in the embodiments described in the present application may be implemented by software or hardware.
  • the units or modules described may also be arranged in a processor.
  • the names of these units or modules do not constitute limitations on the units or modules themselves in certain circumstances.
  • a typical implementation device is a computer.
  • the computer may be, for example, a personal computer, a laptop, a mobile phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
  • the present application further provides a storage medium, which may be a storage medium included in the aforementioned system in the above embodiment; or a storage medium that exists independently and is not assembled into a device.
  • the storage medium stores one or more programs, and the aforementioned programs are used by one or more processors to execute the optical target three-dimensional measurement method described in the present application.
  • Storage media includes permanent and non-permanent, removable and non-removable media that can be implemented by any method or technology to store information.
  • Information can be computer-readable instructions, data structures, program modules or other data.
  • Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disk read-only memory (CD-ROM), digital versatile disk (DVD) or other optical storage, magnetic cassettes, magnetic tape magnetic disk storage or other magnetic storage devices or any other non-transmission media that can be used to store information that can be accessed by a computing device.
  • computer-readable media does not include temporary computer-readable media (transitory media), such as modulated data signals and carrier waves.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present application provides an optical target three-dimensional measurement system and method, an electronic device, and a storage medium. The system comprises: a reference camera; a measuring camera, the optical-axis centerline of the measuring camera intersecting the optical-axis centerline of the reference camera; a reference board fixed to the face of the measuring camera that faces the reference camera, the reference camera tracking and locating the changing pose of the measuring camera by capturing the changing attitude of the reference board; a motion mechanism that drives the measuring camera to perform rotation and pitching motion; and an optical target for contacting the object under measurement, information of the optical target being captured by the measuring camera. By means of the new "reference camera + measuring camera" vision-system structure, this solution greatly enlarges the measurement field of view and enables the measurement of large-sized objects.

Description

Optical target three-dimensional measurement system and method, electronic device, and storage medium
Technical Field
The present invention belongs to the technical field of vision measurement, and in particular relates to an optical target three-dimensional measurement system and method, an electronic device, and a storage medium.
Background Art
Portable optical-target three-dimensional measurement technology is easy to carry, install, and operate, supports on-site, in-situ measurement, and is therefore widely used in machine manufacturing, aerospace, industrial inspection, clinical medicine, and other fields, for example, inspecting the geometric form and position of the inner cavities and outer contours of large irregular workpieces in industry, or tracking and locating the joints and spine during surgery.
A portable optical-target measurement system is a contact-type optical measurement system that generally comprises a vision system, an optical target, a computer, and associated software. The surface of the optical target carries several marker points, and a contact probe is fixed to its bottom. In use, the operator holds the optical target and adjusts its position so that, while the probe touches a point to be measured on the surface of the object, the surface marker points can be observed and captured by the vision system; image processing and vision measurement techniques then yield the three-dimensional coordinates of the marker points, from which the position of the measured point is computed. To achieve three-dimensional localization, the vision system is generally composed of two or more cameras. In such systems the field of view is rather limited, and for a measurement to succeed the optical target must be fully imaged in two or more cameras, which greatly restricts the application of portable optical-target three-dimensional measurement technology and equipment to the measurement of large-sized objects.
Summary of the Invention
The purpose of the embodiments of this specification is to provide an optical target three-dimensional measurement system and method, an electronic device, and a storage medium.
To solve the above technical problems, the embodiments of the present application are implemented as follows.
In a first aspect, the present application provides an optical target three-dimensional measurement system, comprising:
a reference camera;
a measuring camera, the optical-axis centerline of the measuring camera intersecting the optical-axis centerline of the reference camera;
a reference board fixed to the face of the measuring camera that faces the reference camera, the reference camera tracking and locating the changing pose of the measuring camera by capturing the changing attitude of the reference board;
a motion mechanism that drives the measuring camera to perform rotation and pitching motion; and
an optical target for contacting the object under measurement, information of the optical target being captured by the measuring camera.
In one embodiment, the motion mechanism comprises a pitching mechanism and a rotating mechanism; the pitching mechanism is an angular-displacement stage that produces the pitching motion of the measuring camera through angular displacement, and the rotating mechanism is a horizontal rotating mechanism that produces the rotary motion of the measuring camera.
In one embodiment, the optical target comprises a target body and a probe, and the surface of the target body is provided with several marker points.
In a second aspect, the present application provides an optical target three-dimensional measurement method, comprising:
bringing an optical target into contact with a measured point on the surface of the object under measurement, a motion mechanism adjusting the pose of the measuring camera so that the optical target is at the optimal imaging position;
capturing, by the measuring camera, marker-point information on the surface of the target body of the optical target;
obtaining the three-dimensional coordinates of each marker point from the marker-point information and the calibrated marker-point information;
computing the three-dimensional coordinates of the measured point from the three-dimensional coordinates of the marker points; and
determining the three-dimensional information of the object under measurement from the three-dimensional coordinates of a plurality of measured points.
In one embodiment, the method further comprises determining the motion transformation between the reference-camera coordinate system and the current measuring-camera coordinate system, including:
determining the motion transformation between the reference-board coordinate system and the measuring-camera coordinate system from the mounting relationship between the reference board and the measuring camera;
before the optical target three-dimensional measurement system starts working, acquiring the three-dimensional coordinates of each feature point on the reference board in the reference-board coordinate system and the pixel coordinates of the imaged corner corresponding to each feature point in the reference coordinate system;
estimating the motion transformation between the reference-camera coordinate system and the reference-board coordinate system from the homographic correspondence between the three-dimensional coordinates of the feature points and the pixel coordinates of the imaged corners;
determining the motion transformation between the reference-camera coordinate system and the measuring-camera coordinate system from the transformation between the reference-board coordinate system and the measuring-camera coordinate system and the transformation between the reference-camera coordinate system and the reference-board coordinate system;
after the optical target three-dimensional measurement system starts working, when the pose of the measuring camera changes and the image of the reference board changes accordingly, determining the motion transformation between the current reference-board coordinate system and the reference camera; and
determining the transformation between the reference-camera coordinate system and the current measuring-camera coordinate system from the transformation between the reference-board coordinate system and the measuring-camera coordinate system and the transformation between the current reference-board coordinate system and the reference camera.
In one embodiment, computing the three-dimensional coordinates of the measured point from the three-dimensional coordinates of the marker points comprises:
acquiring the distance from each marker point to the measured point;
establishing a system of equations from the three-dimensional coordinates of the marker points and the corresponding distances; and
solving the system of equations to obtain the three-dimensional coordinates of the measured point.
In one embodiment, the three-dimensional coordinates of the marker points are determined through the following steps:
acquiring an optical target image captured by the measuring camera during measurement, the optical target image containing each marker point;
determining, from the pinhole imaging model, the relationship between the spatial coordinates of each marker point and the corresponding pixel coordinates in the optical target image; and
determining the three-dimensional coordinates of each marker point from the geometric relationships among the marker points and the relationship between their spatial coordinates and the corresponding pixel coordinates.
In one embodiment, after the three-dimensional coordinates of the measured point are obtained, the method further comprises:
reading the current pose information of the measuring camera and, according to the motion transformation between the reference-camera coordinate system and the current measuring-camera coordinate system, converting the three-dimensional coordinates of the measured point from the measurement coordinate system into coordinates in the reference coordinate system.
In a third aspect, the present application provides an electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the optical target three-dimensional measurement method of the second aspect when executing the program.
In a fourth aspect, the present application provides a readable storage medium on which a computer program is stored, the program implementing the optical target three-dimensional measurement method of the second aspect when executed by a processor.
As can be seen from the technical solutions provided by the embodiments of this specification above, the solution:
1) greatly enlarges the measurement field of view through the new "reference camera + measuring camera" vision-system structure, enabling the measurement of large-sized objects;
2) counters the effect of lens distortion and similar factors, under which imaging error grows from the image center toward the edges, by allowing the pose of the measuring camera to be adjusted in real time during measurement so that the target is imaged as close to the image center as possible, thereby improving three-dimensional reconstruction accuracy; and
3) compared with two-camera or multi-camera measurement systems, requires the optical target to be imaged in only one camera, which lowers the difficulty of positioning the hand-held target and makes it more flexible in use.
Brief Description of the Drawings
To explain the technical solutions of the embodiments of this specification or of the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are merely some embodiments recorded in this specification, and a person of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of the optical target three-dimensional measurement system provided by the present application;
Fig. 2 is a schematic structural diagram of the vision module provided by the present application;
Fig. 3 is a schematic diagram of patterns of the reference board provided by the present application;
Fig. 4 is a schematic diagram of marker-point distributions on the surface of the target body provided by the present application;
Fig. 5 is a schematic diagram of the imaging geometry of the reference board on the image plane of the reference camera provided by the present application;
Fig. 6 is a schematic diagram of reference-board images captured by the reference camera at different poses of the measuring camera provided by the present application;
Fig. 7 is a schematic flowchart of the optical target three-dimensional measurement method provided by the present application;
Fig. 8 is a schematic diagram of monocular imaging of the four-point collinear optical target provided by the present application;
Fig. 9 is a schematic structural diagram of the electronic device provided by the present application.
Detailed Description
To enable those skilled in the art to better understand the technical solutions in this specification, the technical solutions in the embodiments of this specification are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of this specification. Based on the embodiments in this specification, all other embodiments obtained by a person of ordinary skill in the art without creative effort shall fall within the scope of protection of this specification.
In the following description, specific details such as particular system structures and techniques are set forth for the purpose of illustration rather than limitation, in order to provide a thorough understanding of the embodiments of the present application. However, it should be clear to those skilled in the art that the present application can also be implemented in other embodiments without these specific details. In other cases, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so that unnecessary detail does not obscure the description of the present application.
Various improvements and changes may be made to the specific embodiments of this specification without departing from the scope or spirit of the present application, as will be obvious to those skilled in the art. Other embodiments derived from the specification of the present application will be obvious to the skilled person. The specification and examples of the present application are merely exemplary.
As used herein, "comprise", "include", "have", "contain", and the like are open terms, meaning including but not limited to.
Unless otherwise specified, "parts" in the present application are parts by mass.
In the related art, to achieve three-dimensional localization, the vision part of a portable optical-target three-dimensional measurement system generally uses a stereo vision system composed of two or more cameras. One example is a hand-held optical target for vision coordinate measurement and its measurement method, in which a planar target with several LED optical marker points mounted on its surface realizes spatial measurement based on a binocular stereo vision system. Another example is a hand-held spherical target and measurement method for binocular stereo vision measurement, in which a hand-held spherical target with regularly arranged surface feature points realizes three-dimensional measurement based on a binocular stereo vision system. Because a single camera cannot recover the depth of the marker points during measurement, products and research based on monocular optical-target localization systems are scarce. A few single-camera schemes exist, such as a portable spherical target for rapid calibration and measurement in monocular vision systems, which locates the sphere center monocularly using the spherical-target imaging model; however, although that method can locate the sphere center with a single camera, its accuracy is low and its performance unstable, so it cannot be applied in practice. Moreover, all of the above systems suffer from a limited measurement field of view: once the vision system is fixed, the measurement field of view is fixed as well. Two-camera and multi-camera systems in particular require the hand-held target to be imaged in every camera, which severely restricts the measurement range of target measurement systems and precludes the measurement of large-sized workpieces.
On this basis, current portable optical-target three-dimensional measurement systems mostly use two or more cameras that acquire images from fixed poses, with the following drawbacks:
1) Small field of view. In existing portable optical-target three-dimensional measurement systems, the two or more cameras adopt fixed viewing poses, so the measurement range is limited. In addition, all marker points on the optical target are required to be imaged in every camera, which further reduces the effective measurement range. Such systems are therefore unsuitable for the spatial measurement of large objects.
2) Low measurement accuracy. Because of the limited viewing range, the optical target can rarely be imaged in the central region of the image, yet it is well known that distortion is smallest and imaging accuracy highest at the image center. The closer the image is to the edge, the larger the distortion, which degrades the imaging and extraction accuracy of the target marker points and hence the three-dimensional measurement accuracy of the measured point.
3) Inflexible use. In existing portable optical-target three-dimensional measurement systems, the operator holds the optical target and must carefully adjust its pose to keep it within the effective imaging region of the vision system; if the pose is unsuitable, measurement is impossible. For some invisible surfaces, repeated adjustments or even relocation of the vision system may be needed before the target can be observed, but moving the vision system requires an extra calibration object to calibrate its relationship to the world coordinate system. Use is therefore rather inflexible.
To enable the measurement of large-sized objects, the present application proposes an optical target three-dimensional measurement method and system. The vision system of the present application adopts a combined "reference camera + measuring camera" mode: the measuring camera performs rotation and pitching motion through a motion mechanism and carries a high-precision reference board glued or mounted on its surface, while the reference camera remains fixed and determines the pose information of the measuring camera in real time by capturing the real-time state of the reference board.
The present invention is described in further detail below with reference to the drawings and embodiments.
Referring to Fig. 1, a schematic structural diagram of the optical target three-dimensional measurement system provided by an embodiment of the present application is shown.
As shown in Fig. 1, the optical target three-dimensional measurement system may include:
a reference camera 10;
a measuring camera 20, the optical-axis centerline of the measuring camera intersecting the optical-axis centerline of the reference camera;
a reference board 30 fixed to the face of the measuring camera that faces the reference camera, the reference camera tracking and locating the changing pose of the measuring camera by capturing the changing attitude of the reference board;
a motion mechanism 40 that drives the measuring camera 20 to perform rotation and pitching motion; and
an optical target 50 that contacts the object under measurement, information of the optical target 50 being captured by the measuring camera 20.
The motion mechanism 40 comprises a pitching mechanism 410 and a rotating mechanism 420; the pitching mechanism 410 is an angular-displacement stage that produces the pitching motion of the measuring camera through angular displacement, and the rotating mechanism 420 is a horizontal rotating mechanism that produces the rotary motion of the measuring camera.
It can be understood that the reference camera 10, the measuring camera 20, the reference board 30, and the motion mechanism (or motion control mechanism) 40 may collectively be called the vision module, as shown in Fig. 2.
Specifically, the reference camera 10 and the measuring camera 20 are mounted perpendicularly. The optical-axis centerline of the reference camera intersects the optical-axis centerline of the measuring camera, with the intersection point as close as possible to the imaging optical center of the measuring camera, as shown in Fig. 2.
The motion mechanism 40 consists of a pitching mechanism 410 and a rotating mechanism 420. The pitching mechanism 410 is an angular-displacement stage that can pitch the measuring camera by ±30° through angular displacement; the rotating mechanism 420 is a horizontal rotating mechanism that can rotate the measuring camera through 360°. The measuring camera 20 is fixedly mounted on the horizontal mounting surface on top of the motion mechanism 40, and the designed motion mechanism provides a wide range of observation angles.
The angular-displacement centerline of the pitching mechanism 410 coincides with the rotation centerline of the rotating mechanism 420 and intersects the optical-axis centerline of the measuring camera 20, with the intersection point as close as possible to the imaging optical center of the measuring camera 20, as shown in Fig. 2.
It can be understood that the arrangement in Figs. 1 and 2, with the reference camera above and the measuring camera below, may also be inverted, i.e., the measuring camera above and the reference camera below; no limitation is imposed here.
It can also be understood that the arrangement in Figs. 1 and 2, with the rotating mechanism above and the pitching mechanism below, may instead place the rotating mechanism below and the pitching mechanism above; no limitation is imposed here.
The reference board 30 may be mounted or glued on the upper surface of the measuring camera 20 (i.e., the face toward the reference camera 10); a high-precision reference board is used for pose tracking of the measuring camera 20. As shown in Fig. 3, patterns usable for the reference board 30 include a checkerboard pattern (squares alternating black and white in both directions, Fig. 3(a)) and a dot pattern (dots arranged at equal intervals in both directions, Fig. 3(b)). In this application the checkerboard pattern of Fig. 3(a) is used as an example, with the checkerboard intersections serving as feature points.
The optical target (or simply the target) comprises a target body and a probe, and the surface of the target body is provided with several marker points.
It can be understood that the marker points on the surface of the target body, distributed as shown in Fig. 1, may be four collinear points or any other geometric distribution, including multiple collinear points, multiple coplanar points, or multiple points on multiple faces, as long as the constraints among the points are known so that the spatial three-dimensional coordinates of the marker points can be derived, as shown in Fig. 4 (comprising Figs. 4(a), 4(b), and 4(c)).
It can also be understood that the marker points on the surface of the target body may be passive physical markers such as corner points, intersections, or reflective spheres, or actively luminous optical markers such as infrared LEDs; no limitation is imposed here.
This embodiment greatly enlarges the measurement field of view through the new "reference camera + measuring camera" vision-system structure, enabling the measurement of large-sized objects.
In addition, compared with two-camera or multi-camera measurement systems, the optical target of the present application needs to be imaged in only one camera (the measuring camera), which lowers the difficulty of positioning the hand-held target and makes it more flexible in use.
It can be understood that the motion transformation between the reference-camera coordinate system and the current measuring-camera coordinate system is determined first, specifically including:
determining the motion transformation between the reference-board coordinate system and the measuring-camera coordinate system from the mounting relationship between the reference board and the measuring camera;
before the optical target three-dimensional measurement system starts working, acquiring the three-dimensional coordinates of each feature point on the reference board in the reference-board coordinate system and the pixel coordinates of the imaged corner corresponding to each feature point in the reference coordinate system;
estimating the motion transformation between the reference-camera coordinate system and the reference-board coordinate system from the homographic correspondence between the three-dimensional coordinates of the feature points and the pixel coordinates of the imaged corners;
determining the motion transformation between the reference-camera coordinate system and the measuring-camera coordinate system from the transformation between the reference-board coordinate system and the measuring-camera coordinate system and the transformation between the reference-camera coordinate system and the reference-board coordinate system;
after the optical target three-dimensional measurement system starts working, when the pose of the measuring camera changes and the image of the reference board changes accordingly, determining the motion transformation between the current reference-board coordinate system and the reference camera; and
determining the transformation between the reference-camera coordinate system and the current measuring-camera coordinate system from the transformation between the reference-board coordinate system and the measuring-camera coordinate system and the transformation between the current reference-board coordinate system and the reference camera.
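The chain of coordinate transformations listed above reduces to composing two rigid transforms through the shared board frame. A minimal numpy sketch, where the names `T_meacam_plane` (board-to-measuring-camera transform known from mounting) and `T_refcam_plane` (board-to-reference-camera transform estimated from the homography) are illustrative, and 4×4 homogeneous matrices are assumed:

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def compose_refcam_from_meacam(T_refcam_plane, T_meacam_plane):
    """T_refcam_meacam = T_refcam_plane @ inv(T_meacam_plane): maps points from the
    measuring-camera frame into the reference-camera frame via the board frame."""
    return T_refcam_plane @ np.linalg.inv(T_meacam_plane)

# toy example: board rotated 90 deg about z in the reference camera, offset in z
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
T_refcam_plane = make_T(Rz, np.array([0.0, 0.0, 0.5]))
T_meacam_plane = make_T(np.eye(3), np.array([0.0, 0.0, 0.05]))  # board sits on the camera

T_refcam_meacam = compose_refcam_from_meacam(T_refcam_plane, T_meacam_plane)
# the measuring-camera origin expressed in the reference-camera frame
p = T_refcam_meacam @ np.array([0.0, 0.0, 0.0, 1.0])
```

Re-estimating only `T_refcam_plane` at each new pose and reusing the fixed `T_meacam_plane` is what lets the reference camera track the moving measuring camera.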
Specifically, the checkerboard pattern of Fig. 3(a) is taken as an example. The internal parameters of the reference camera and the measuring camera are calibrated in advance using Zhang's method; let the calibrated intrinsic matrices of the two cameras be K_ref and K_mea, both 3×3 matrices, and let the calibrated lens-distortion coefficients of the reference camera and the measuring camera be k_ref and k_cam, respectively. The reference board and the measuring camera are fixedly mounted with respect to each other. Let the reference-board coordinate system be O_plane-X_planeY_planeZ_plane (abbreviated CS_plane) and the measuring-camera coordinate system be O_meacam-X_meacamY_meacamZ_meacam (abbreviated CS_meacam). From the mounting relationship, the motion transformation between the two coordinate systems, denoted $T_{plane}^{meacam}$, is known, so that
$$\tilde{P}^{meacam} = T_{plane}^{meacam}\,\tilde{P}^{plane} \tag{1}$$
where the tilde denotes homogeneous coordinates.
Before the optical target three-dimensional measurement system (hereinafter also simply the system) starts working, the pose between the reference camera and the reference board (i.e., the transformation between the reference-camera coordinate system and the reference-board coordinate system) is computed and the result is stored in memory for later use. The computation is as follows. Let the three-dimensional coordinates of the checkerboard corner features in the checkerboard coordinate system be $P_i^{plane}$ ($i = 1, 2, \ldots, m$), where $m$ is the total number of corners on the board. Since the size and arrangement of each checkerboard square are known, $P_i^{plane}$ are known three-dimensional data with $Z_{plane} = 0$. The reference camera captures the checkerboard image on the reference board and extracts all corners on the checkerboard as reference feature points, yielding the pixel coordinates $p_i = (u_i, v_i)$ ($i = 1, 2, \ldots, m$) of the imaged corners corresponding to $P_i^{plane}$ under the reference camera. Fig. 5 shows the imaging geometry of the reference board on the image plane of the reference camera.
Let the motion transformation between the reference-camera coordinate system O_refcam-X_refcamY_refcamZ_refcam (abbreviated CS_refcam) and the reference-board coordinate system be $T_{plane}^{refcam}$. From the homographic correspondence between the imaged corners and $P_i^{plane}$, $T_{plane}^{refcam}$ is estimated by the following steps:
1) Normalization of the checkerboard pixel corners: the pixel corners $p_i = (u_i, v_i)$ are transformed into the reference-camera coordinate system:
$$x_i = \frac{u_i - u_0}{f_1}, \qquad y_i = \frac{v_i - v_0}{f_2} \tag{2}$$
where $(u_0, v_0)$ are the pixel coordinates of the image center of the reference camera and $(f_1, f_2)$ are the lens focal lengths, all internal parameters of the reference camera calibrated in advance; $u_i$ and $v_i$ denote the pixel coordinates of an image point along the horizontal and vertical directions of the image, respectively.
2) Removal of lens distortion from the corner coordinates: the normalized coordinates obtained above contain lens distortion and therefore require distortion correction. Distortion-correction methods are mature and are not repeated here; the correction process is represented by the function $\Gamma$. Let the corrected corner coordinates be $\hat{p}_i$:
$$\hat{p}_i = \Gamma(p_i^{n}, k_d), \qquad i = 1, 2, \ldots, m \tag{3}$$
where $p_i^{n} = (x_i, y_i)$ are the normalized coordinates of equation (2) and $k_d$ denotes the lens-distortion coefficients of the reference camera calibrated in advance.
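The normalization and distortion-correction steps above can be sketched in a few lines of numpy. The patent does not fix a particular distortion model, so a simple two-coefficient radial model with hypothetical coefficients `k1`, `k2` is assumed here, inverted by fixed-point iteration (a common correction scheme):

```python
import numpy as np

def normalize(uv, u0, v0, f1, f2):
    """Pixel corners -> normalized camera coordinates (the normalization step)."""
    uv = np.asarray(uv, dtype=float)
    return np.column_stack(((uv[:, 0] - u0) / f1, (uv[:, 1] - v0) / f2))

def undistort(xy, k1, k2, iters=5):
    """Gamma: invert an assumed radial model x_d = x_u * (1 + k1*r^2 + k2*r^4)
    by fixed-point iteration on the undistorted coordinates."""
    xy = np.asarray(xy, dtype=float)
    xu = xy.copy()
    for _ in range(iters):
        r2 = np.sum(xu ** 2, axis=1, keepdims=True)
        xu = xy / (1.0 + k1 * r2 + k2 * r2 ** 2)
    return xu

# round trip: distort a known normalized point, then correct it back
x_true = np.array([[0.10, -0.05]])
r2 = np.sum(x_true ** 2)
x_dist = x_true * (1.0 + 1e-1 * r2 + 1e-2 * r2 ** 2)
x_corr = undistort(x_dist, 1e-1, 1e-2)
```

For moderate distortion the fixed-point iteration converges in a handful of steps; a production system would use the calibrated model of its own camera instead of the assumed one.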
3) Computation of $T_{plane}^{refcam}$: first, the homography $H$ between $\hat{p}_i$ and $P_i^{plane}$ ($i = 1, 2, \ldots, m$) is computed, $H$ being a 3×3 matrix; the motion transformation between the reference-board coordinate system and the reference-camera coordinate system is then recovered from $H$ as follows:
$$u_1 = \frac{H(:,1)}{\|H(:,1)\|}$$
$$u_2' = H(:,2) - \operatorname{dot}(u_1, H(:,2))\cdot u_1, \qquad u_2 = \frac{u_2'}{\|u_2'\|}$$
$$u_3 = u_1 \times u_2 \tag{4}$$
$$R_{plane}^{refcam} = \begin{bmatrix} u_1 & u_2 & u_3 \end{bmatrix}, \qquad t_{plane}^{refcam} = \frac{H(:,3)}{\|H(:,1)\|}$$
where $u_1$, $u_2$, $u_3$ denote unit vectors along the spatial x-, y-, and z-axes, $R_{plane}^{refcam}$ and $t_{plane}^{refcam}$ denote the rotation matrix and translation vector from the reference-board coordinate system to the reference-camera coordinate system, and dot() denotes the inner product. This yields:
$$T_{plane}^{refcam} = \begin{bmatrix} R_{plane}^{refcam} & t_{plane}^{refcam} \\ 0 & 1 \end{bmatrix} \tag{5}$$
4) Computation of the motion transformation $T_{meacam}^{refcam}$ between the reference-camera coordinate system and the measuring-camera coordinate system: from equations (1) and (5) it can be derived that
$$T_{meacam}^{refcam} = T_{plane}^{refcam}\,\bigl(T_{plane}^{meacam}\bigr)^{-1} \tag{6}$$
After the system starts working, the measuring camera changes pose. When the pose of the measuring camera changes, the checkerboard image on the reference board changes accordingly; Fig. 6 shows examples of the reference-board images captured by the reference camera at different poses of the measuring camera. When the measuring camera is at its original position, the captured reference-board image is as shown in Fig. 6(a). If the measuring camera performs only horizontal rotation, the captured image is as shown in Fig. 6(b): the checkerboard squares are rotated but not deformed. If the measuring camera performs both horizontal rotation and pitching, the captured image is as shown in Fig. 6(c): the checkerboard is both rotated and deformed.
After the measuring camera changes pose, let the checkerboard image on the reference board captured by the reference camera be $I'$, and let the extracted and sorted corner information of the changed checkerboard be $\hat{p}_i'$ ($i = 1, 2, \ldots, m$). Denoting the current reference-board coordinate system by $CS_{plane'}$, steps 1)-3) above give the motion transformation $T_{plane'}^{refcam}$ between the reference-camera coordinate system and the current reference-board coordinate system, satisfying
$$T_{plane'}^{refcam} = \begin{bmatrix} R_{plane'}^{refcam} & t_{plane'}^{refcam} \\ 0 & 1 \end{bmatrix} \tag{7}$$
Combining equations (1) and (7) yields the motion transformation $T_{meacam'}^{refcam}$ between the reference-camera coordinate system and the new, current measuring-camera coordinate system:
$$T_{meacam'}^{refcam} = T_{plane'}^{refcam}\,\bigl(T_{plane}^{meacam}\bigr)^{-1} \tag{8}$$
With this method, the motion transformation between the measuring camera at an arbitrary pose and the reference-camera coordinate system can be obtained, unifying the three-dimensional reconstruction data acquired by the measuring camera at different poses.
In this embodiment, before the system starts working, the reference camera captures the checkerboard image on the reference board and records the feature-point positions as reference corner information; when the pose of the measuring camera changes, the checkerboard image on the reference board changes accordingly. The reference camera captures the checkerboard image on the reference board, extracts the changed checkerboard corner information, and computes the current pose change of the measuring camera in combination with the reference corner information.
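The homography-decomposition step above (recovering rotation and translation from the board-to-image homography expressed in normalized coordinates) can be sketched as follows; the synthetic check builds a homography from a known pose and recovers it. This is a sketch under the assumption that the intrinsics have already been removed from `H`:

```python
import numpy as np

def pose_from_homography(H):
    """Recover R, t from a homography H mapping board points (Z = 0, board units)
    to normalized, undistorted image points: H ~ [r1 r2 t] up to scale."""
    h1, h2, h3 = H[:, 0], H[:, 1], H[:, 2]
    s = np.linalg.norm(h1)              # scale of the homography
    u1 = h1 / s
    u2 = h2 - np.dot(u1, h2) * u1       # Gram-Schmidt against u1
    u2 /= np.linalg.norm(u2)
    u3 = np.cross(u1, u2)               # complete the right-handed frame
    R = np.column_stack((u1, u2, u3))
    t = h3 / s
    return R, t

# synthetic check: build H = [r1 r2 t] from a known pose and recover it
ang = 0.3
R_true = np.array([[np.cos(ang), -np.sin(ang), 0.0],
                   [np.sin(ang),  np.cos(ang), 0.0],
                   [0.0,          0.0,         1.0]])
t_true = np.array([0.1, -0.2, 1.5])
H = np.column_stack((R_true[:, 0], R_true[:, 1], t_true)) * 2.7  # arbitrary scale
R, t = pose_from_homography(H)
```

With noisy corner detections the recovered matrix is only approximately orthonormal; the Gram-Schmidt step (or an SVD-based orthonormalization) absorbs that error.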
Referring to Fig. 7, a schematic flowchart of the optical target three-dimensional measurement method provided by an embodiment of the present application is shown.
As shown in Fig. 7, an optical target three-dimensional measurement method may include:
S710: the optical target contacts a measured point on the surface of the object under measurement, and the motion mechanism adjusts the pose of the measuring camera so that the optical target is at the optimal imaging position.
S720: the measuring camera captures marker-point information on the surface of the target body of the optical target.
S730: the three-dimensional coordinates of each marker point are obtained from the marker-point information and the calibrated marker-point information.
S740: the three-dimensional coordinates of the measured point are computed from the three-dimensional coordinates of the marker points, which may include:
acquiring the distance from each marker point to the measured point;
establishing a system of equations from the three-dimensional coordinates of the marker points and the corresponding distances; and
solving the system of equations to obtain the three-dimensional coordinates of the measured point.
In one embodiment, the three-dimensional coordinates of the marker points are determined through the following steps:
acquiring an optical target image captured by the measuring camera during measurement, the optical target image containing each marker point;
determining, from the pinhole imaging model, the relationship between the spatial coordinates of each marker point and the corresponding pixel coordinates in the optical target image; and
determining the three-dimensional coordinates of each marker point from the geometric relationships among the marker points and the relationship between their spatial coordinates and the corresponding pixel coordinates in the optical target image.
Specifically, the four-point collinear target shown in Fig. 8 is taken as an example. Before measurement, the positional relationships among the marker points on the optical target and the probe tip are calibrated with a coordinate measuring machine or by other means; that is, the distances among points A, B, C, and D, and their distances to the probe tip, are calibrated in advance and known.
During measurement, the hand-held target is finely adjusted so that its probe tip Q contacts the position of the point to be measured, and the measuring camera captures an image of the target in which all four marker points are imaged. From this image, the imaged marker points a, b, c, and d corresponding to marker points A, B, C, and D are reconstructed.
First, the marker points are normalized:
$$x_\chi = \frac{u_\chi - u_0^{mea}}{f_1^{mea}}, \qquad y_\chi = \frac{v_\chi - v_0^{mea}}{f_2^{mea}}, \qquad \chi = a, b, c, d \tag{9}$$
where $(u_0^{mea}, v_0^{mea})$ are the pixel coordinates of the image center of the measuring camera and $(f_1^{mea}, f_2^{mea})$ are the lens focal lengths, all internal parameters of the measuring camera calibrated in advance.
Then lens distortion is removed from the point coordinates: the normalized coordinates $\chi_d$ contain lens distortion and therefore require distortion correction. Distortion-correction methods are mature and are not repeated here; the correction process is represented by the function $\Gamma'$. Let the corrected coordinates be $\chi_n$:
$$\chi_n = \Gamma'(\chi_d, k_d'), \qquad \chi = a, b, c, d \tag{10}$$
where $k_d'$ denotes the lens-distortion coefficients of the measuring camera calibrated in advance.
Under the pinhole imaging model, the spatial coordinates $P_\delta$ ($\delta = A, B, C, D$) of the four points A, B, C, D and the pixel coordinates $\chi_n$ ($\chi = a, b, c, d$) of their images a, b, c, d on the image plane of the measuring camera satisfy
$$s_\delta\,\tilde{\chi}_n = \begin{bmatrix} R & t \end{bmatrix}\tilde{P}_\delta \tag{11}$$
where $\tilde{\chi}_n$ and $\tilde{P}_\delta$ are the augmented (homogeneous) forms of $\chi_n$ and $P_\delta$, and $s_\delta$ is the unknown depth factor of point $\delta$, generally the Z coordinate of that point: $s_\delta = Z_\delta$. Taking the world coordinate system to coincide with the measuring-camera coordinate system, equation (11) simplifies to
$$P_\delta = s_\delta\,\tilde{\chi}_n = Z_\delta\,\tilde{\chi}_n \tag{12}$$
Further, taking target marker points A, B, C of Fig. 8 as an example, their collinearity gives
$$P_B = \lambda_A P_A + \lambda_C P_C \tag{13}$$
where $\lambda_A$ and $\lambda_C$ are known length coefficients; for example, if B is the midpoint of AC, then $\lambda_A = \lambda_C = 0.5$. Combining equations (12) and (13) gives
$$Z_B\,\tilde{b}_n = \lambda_A Z_A\,\tilde{a}_n + \lambda_C Z_C\,\tilde{c}_n \tag{14}$$
Taking the cross product of both sides of equation (14) with $\tilde{b}_n$ yields
$$\lambda_A Z_A\,(\tilde{a}_n \times \tilde{b}_n) + \lambda_C Z_C\,(\tilde{c}_n \times \tilde{b}_n) = 0 \tag{15}$$
from which it can be derived that
$$Z_C = -\frac{\lambda_A\,(\tilde{a}_n \times \tilde{b}_n)_j}{\lambda_C\,(\tilde{c}_n \times \tilde{b}_n)_j}\,Z_A \;\triangleq\; k\,Z_A \tag{16}$$
where the two cross-product vectors are parallel and $j$ indexes any of their nonzero components.
In addition, since the position distribution of the marker points was calibrated in advance when the optical target was manufactured, the length between A and C, for example, is known to be $L_{AC}$, so that
$$\|P_A - P_C\| = \|Z_A\,\tilde{a}_n - Z_C\,\tilde{c}_n\| = L_{AC} \tag{17}$$
Combining equations (16) and (17) gives
$$Z_A = \frac{L_{AC}}{\|\tilde{a}_n - k\,\tilde{c}_n\|} \tag{18}$$
Once the depth factor $Z_A$ is obtained, equation (12) gives the three-dimensional coordinates $P_A$ of point A; equations (16) and (13) then give the three-dimensional coordinates $P_B$ and $P_C$ of points B and C, and the other points on the optical target are computed in the same way.
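The single-view depth recovery above can be exercised end to end on synthetic data. In the sketch below the point names and the midpoint assumption $\lambda_A = \lambda_C = 0.5$ are illustrative; the projection is ideal (already undistorted normalized coordinates):

```python
import numpy as np

def collinear_depths(a_n, b_n, c_n, lam_A, lam_C, L_AC):
    """Given normalized image points of collinear markers A, B, C
    (with B = lam_A*A + lam_C*C) and the known length L_AC, return
    P_A, P_B, P_C in the measuring-camera frame."""
    at = np.append(a_n, 1.0)            # homogeneous (tilde) vectors
    bt = np.append(b_n, 1.0)
    ct = np.append(c_n, 1.0)
    ab = np.cross(at, bt)
    cb = np.cross(ct, bt)
    j = np.argmax(np.abs(cb))           # pick a well-conditioned component
    k = -lam_A * ab[j] / (lam_C * cb[j])       # Z_C = k * Z_A
    Z_A = L_AC / np.linalg.norm(at - k * ct)   # fix the scale from L_AC
    P_A = Z_A * at
    P_C = k * Z_A * ct
    P_B = lam_A * P_A + lam_C * P_C            # collinearity
    return P_A, P_B, P_C

# ground truth: three collinear points in front of the camera, B the midpoint
P_A_true = np.array([0.1, 0.2, 2.0])
P_C_true = np.array([-0.1, 0.0, 2.5])
P_B_true = 0.5 * (P_A_true + P_C_true)
proj = lambda P: P[:2] / P[2]           # ideal normalized projection
P_A, P_B, P_C = collinear_depths(proj(P_A_true), proj(P_B_true), proj(P_C_true),
                                 0.5, 0.5, np.linalg.norm(P_A_true - P_C_true))
```

The known inter-marker length is what removes the scale ambiguity inherent in a single view; without $L_{AC}$ only the depth ratios are recoverable.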
After the three-dimensional coordinates of all marker points on the optical target are obtained as above, the three-dimensional coordinates of the measured point can be computed. Let the current measured point Q have three-dimensional coordinates $P_Q = (X_Q, Y_Q, Z_Q)$. The distances from the four target points A, B, C, and D to the measured point Q are calibrated in advance and known; denoting them $L_{AQ}$, $L_{BQ}$, $L_{CQ}$, and $L_{DQ}$,
$$\|P_\delta - P_Q\| = L_{\delta Q}, \qquad \delta = A, B, C, D \tag{19}$$
This system contains only the three unknowns $(X_Q, Y_Q, Z_Q)$ and can be solved by the least-squares method.
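The distance constraints above define four spheres around the markers. One standard way to solve them, shown here as an illustrative choice rather than the patent's specific solver, is to subtract the first constraint from the others, which cancels the quadratic term and leaves a linear least-squares problem; note that this linearization needs non-coplanar markers for a unique solution (as in some of the distributions of Fig. 4), while a collinear layout would require a nonlinear solve with extra constraints:

```python
import numpy as np

def probe_tip(P, L):
    """Solve ||P_i - Q|| = L_i for Q by pairwise subtraction of the sphere
    equations. P: (n,3) marker coordinates, L: (n,) marker-to-tip distances."""
    P = np.asarray(P, dtype=float)
    L = np.asarray(L, dtype=float)
    A = 2.0 * (P[1:] - P[0])                      # rows: 2*(P_i - P_0)
    b = (np.sum(P[1:] ** 2, axis=1) - np.sum(P[0] ** 2)
         - L[1:] ** 2 + L[0] ** 2)
    Q, *_ = np.linalg.lstsq(A, b, rcond=None)
    return Q

# synthetic, non-coplanar markers and a known tip position
P = np.array([[0.10,  0.00, 1.00],
              [0.00,  0.10, 1.05],
              [-0.10, 0.00, 1.10],
              [0.00, -0.10, 0.90]])
Q_true = np.array([0.05, 0.02, 1.30])
L = np.linalg.norm(P - Q_true, axis=1)            # "calibrated" distances
Q = probe_tip(P, L)
```

With more than four markers the same call simply gains rows, and the least-squares solution averages out measurement noise.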
In one embodiment, after the three-dimensional coordinates of the measured point are obtained, the method further comprises:
reading the current pose information of the measuring camera and, according to the motion transformation between the reference-camera coordinate system and the current measuring-camera coordinate system, converting the three-dimensional coordinates of the measured point from the measurement coordinate system into coordinates in the reference coordinate system.
Specifically, after the current measured-point position is obtained, the current pose information of the measuring camera is read, and the motion transformation of equation (8) between the reference-camera coordinate system and the new, current measuring-camera coordinate system converts the three-dimensional coordinates of the measured point from the measurement coordinate system into the reference coordinate system, unifying the data.
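The final conversion into the reference frame is a single homogeneous transform applied to the tip coordinates. A sketch with an illustrative transform (the actual matrix would come from the tracking step):

```python
import numpy as np

def to_reference_frame(T_refcam_meacam, P_mea):
    """Map a point from the current measuring-camera frame into the
    reference-camera frame with a 4x4 homogeneous transform."""
    p = np.append(np.asarray(P_mea, dtype=float), 1.0)
    return (T_refcam_meacam @ p)[:3]

# illustrative transform: 180-deg yaw plus an offset between the two frames
T = np.eye(4)
T[:3, :3] = np.diag([-1.0, -1.0, 1.0])
T[:3, 3] = [0.0, 0.2, 0.3]
P_ref = to_reference_frame(T, [0.05, 0.02, 1.30])
```

Applying this per measured point is what lets points taken at different measuring-camera poses accumulate in one common coordinate system.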
S750: the three-dimensional information of the object under measurement is determined from the three-dimensional coordinates of a plurality of measured points.
Specifically, from the measured positions of the plurality of measured points, information such as the inner-cavity contour and the geometric form-and-position tolerances of the object under measurement, i.e., its three-dimensional information, can be computed as required.
With the optical target three-dimensional measurement method provided by the embodiments of the present application, the pose of the measuring camera can be adjusted in real time during measurement so that the target is imaged as close to the image center as possible, improving three-dimensional reconstruction accuracy.
In the embodiments of the present application the reference camera remains fixed and can determine the pose information of the measuring camera in real time by capturing the real-time state of the reference board, making the system simple and flexible to use.
Fig. 9 is a schematic structural diagram of an electronic device provided by an embodiment of the present invention, showing an electronic device 900 suitable for implementing embodiments of the present application.
As shown in Fig. 9, the electronic device 900 includes a central processing unit (CPU) 901, which can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 902 or a program loaded from a storage section 908 into a random access memory (RAM) 903. The RAM 903 also stores various programs and data required for the operation of the device 900. The CPU 901, the ROM 902, and the RAM 903 are connected to one another via a bus 904, to which an input/output (I/O) interface 905 is also connected.
The following components are connected to the I/O interface 905: an input section 906 including a keyboard, a mouse, and the like; an output section 907 including a cathode ray tube (CRT), a liquid crystal display (LCD), a speaker, and the like; a storage section 908 including a hard disk and the like; and a communication section 909 including a network interface card such as a LAN card or a modem. The communication section 909 performs communication processing via a network such as the Internet. A drive 910 is also connected to the I/O interface 905 as needed. A removable medium 911, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 910 as needed, so that a computer program read from it can be installed into the storage section 908 as needed.
In particular, according to embodiments of the present disclosure, the process described above with reference to Fig. 1 may be implemented as a computer software program. For example, an embodiment of the present disclosure includes a computer program product comprising a computer program tangibly embodied on a machine-readable medium, the computer program containing program code for performing the optical target three-dimensional measurement method described above. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 909, and/or installed from the removable medium 911.
The flowcharts and block diagrams in the accompanying drawings illustrate the possible architectures, functions, and operations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in a flowchart or block diagram may represent a module, a program segment, or a portion of code, the module, program segment, or portion of code containing one or more executable instructions for implementing the specified logical function. It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur in an order different from that noted in the drawings. For example, two blocks shown in succession may in fact be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved. It should also be noted that each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
The units or modules described in the embodiments of the present application may be implemented in software or in hardware, and the described units or modules may also be arranged in a processor. The names of these units or modules do not, in certain circumstances, constitute limitations on the units or modules themselves.
The systems, modules, or units set forth in the above embodiments may be implemented by a computer chip or entity, or by a product having a certain function. A typical implementation device is a computer, which may be, for example, a personal computer, a laptop, a mobile phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
As another aspect, the present application further provides a storage medium, which may be a storage medium included in the system of the above embodiments, or a storage medium that exists independently and is not assembled into a device. The storage medium stores one or more programs, which are used by one or more processors to execute the optical target three-dimensional measurement method described in the present application.
Storage media include permanent and non-permanent, removable and non-removable media in which information storage may be implemented by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media, such as modulated data signals and carrier waves.
It should be noted that the terms "comprise", "include", and any variants thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or device that includes the element.
The embodiments in this specification are described in a progressive manner; for identical or similar parts of the embodiments, reference may be made to one another, and each embodiment focuses on its differences from the others. In particular, since the system embodiments are substantially similar to the method embodiments, their description is relatively brief, and reference may be made to the corresponding description of the method embodiments.

Claims (10)

  1. An optical target three-dimensional measurement system, characterized in that the system comprises:
    a reference camera;
    a measuring camera, an optical-axis centerline of the measuring camera intersecting an optical-axis centerline of the reference camera;
    a reference board fixed to a face of the measuring camera that faces the reference camera, the reference camera tracking and locating the changing pose of the measuring camera by capturing the changing attitude of the reference board;
    a motion mechanism that drives the measuring camera to perform rotation and pitching motion; and
    an optical target for contacting an object under measurement, information of the optical target being captured by the measuring camera.
  2. The system according to claim 1, characterized in that the motion mechanism comprises a pitching mechanism and a rotating mechanism, the pitching mechanism being an angular-displacement stage that produces the pitching motion of the measuring camera through angular displacement, and the rotating mechanism being a horizontal rotating mechanism that produces the rotary motion of the measuring camera.
  3. The system according to claim 1, characterized in that the optical target comprises a target body and a probe, a surface of the target body being provided with several marker points.
  4. An optical target three-dimensional measurement method, characterized in that the method comprises:
    bringing an optical target into contact with a measured point on a surface of an object under measurement, a motion mechanism adjusting a pose of a measuring camera so that the optical target is at an optimal imaging position;
    capturing, by the measuring camera, marker-point information on a surface of a target body of the optical target;
    obtaining three-dimensional coordinates of each marker point from the marker-point information and calibrated marker-point information;
    computing three-dimensional coordinates of the measured point from the three-dimensional coordinates of the marker points; and
    determining three-dimensional information of the object under measurement from the three-dimensional coordinates of a plurality of measured points.
  5. The method according to claim 4, characterized in that the method further comprises determining a motion transformation between a reference-camera coordinate system and a current measuring-camera coordinate system, including:
    determining a motion transformation between a reference-board coordinate system and the measuring-camera coordinate system from a mounting relationship between the reference board and the measuring camera;
    before the optical target three-dimensional measurement system starts working, acquiring three-dimensional coordinates of each feature point on the reference board in the reference-board coordinate system and pixel coordinates of an imaged corner corresponding to each feature point in the reference coordinate system;
    estimating a motion transformation between the reference-camera coordinate system and the reference-board coordinate system from a homographic correspondence between the three-dimensional coordinates of the feature points and the pixel coordinates of the imaged corners;
    determining a motion transformation between the reference-camera coordinate system and the measuring-camera coordinate system from the transformation between the reference-board coordinate system and the measuring-camera coordinate system and the transformation between the reference-camera coordinate system and the reference-board coordinate system;
    after the optical target three-dimensional measurement system starts working, when the pose of the measuring camera changes and the image of the reference board changes accordingly, determining a motion transformation between the current reference-board coordinate system and the reference camera; and
    determining a transformation between the reference-camera coordinate system and the current measuring-camera coordinate system from the transformation between the reference-board coordinate system and the measuring-camera coordinate system and the transformation between the current reference-board coordinate system and the reference camera.
  6. The method according to claim 4, characterized in that computing the three-dimensional coordinates of the measured point from the three-dimensional coordinates of the marker points comprises:
    acquiring a distance from each marker point to the measured point;
    establishing a system of equations from the three-dimensional coordinates of the marker points and the corresponding distances; and
    solving the system of equations to obtain the three-dimensional coordinates of the measured point.
  7. The method according to claim 6, characterized in that the three-dimensional coordinates of the marker points are determined through the following steps:
    acquiring an optical target image captured by the measuring camera during measurement, the optical target image containing each marker point;
    determining, from a pinhole imaging model, a relationship between spatial coordinates of each marker point and corresponding pixel coordinates in the optical target image; and
    determining the three-dimensional coordinates of each marker point from geometric relationships among the marker points and the relationship between their spatial coordinates and the corresponding pixel coordinates in the optical target image.
  8. The method according to claim 4, characterized in that, after the three-dimensional coordinates of the measured point are obtained, the method further comprises:
    reading current pose information of the measuring camera and, according to the motion transformation between the reference-camera coordinate system and the current measuring-camera coordinate system, converting the three-dimensional coordinates of the measured point from the measurement coordinate system into coordinates in the reference coordinate system.
  9. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, characterized in that the processor, when executing the program, implements the optical target three-dimensional measurement method according to any one of claims 4-8.
  10. A readable storage medium on which a computer program is stored, characterized in that the program, when executed by a processor, implements the optical target three-dimensional measurement method according to any one of claims 4-8.
PCT/CN2023/133443 2022-11-25 2023-11-22 一种光学靶标三维测量***、方法、电子设备及存储介质 WO2024109846A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211488593.1A CN115854866A (zh) 2022-11-25 2022-11-25 一种光学靶标三维测量***、方法、电子设备及存储介质
CN202211488593.1 2022-11-25

Publications (1)

Publication Number Publication Date
WO2024109846A1 true WO2024109846A1 (zh) 2024-05-30

Family

ID=85666318

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/133443 WO2024109846A1 (zh) 2022-11-25 2023-11-22 一种光学靶标三维测量***、方法、电子设备及存储介质

Country Status (2)

Country Link
CN (1) CN115854866A (zh)
WO (1) WO2024109846A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115854866A (zh) * 2022-11-25 2023-03-28 中国科学院深圳先进技术研究院 一种光学靶标三维测量***、方法、电子设备及存储介质
CN117704967B (zh) * 2024-02-05 2024-05-07 中铁西南科学研究院有限公司 基于机器视觉的炮孔位置动态测量方法、标靶及测量***

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005322128A (ja) * 2004-05-11 2005-11-17 Rikogaku Shinkokai ステレオ3次元計測用キャリブレーション方法及び3次元位置算出方法
CN102062578A (zh) * 2010-12-13 2011-05-18 西安交通大学 一种用于视觉坐标测量的手持式光学靶标及其测量方法
CN102589437A (zh) * 2012-03-09 2012-07-18 天津大学 光笔式便携三坐标测量***中测头中心位置的标定方法
EP3584533A1 (en) * 2018-06-19 2019-12-25 Apodius GmbH Coordinate measurement system
CN112212788A (zh) * 2020-11-17 2021-01-12 华南农业大学 基于多台手机的视觉空间点三维坐标测量方法
CN115854866A (zh) * 2022-11-25 2023-03-28 中国科学院深圳先进技术研究院 一种光学靶标三维测量***、方法、电子设备及存储介质


Also Published As

Publication number Publication date
CN115854866A (zh) 2023-03-28

Similar Documents

Publication Publication Date Title
WO2024109846A1 (zh) 一种光学靶标三维测量***、方法、电子设备及存储介质
CN109035320B (zh) 基于单目视觉的深度提取方法
CN109146980B (zh) 基于单目视觉的优化的深度提取和被动测距方法
CN104616292B (zh) 基于全局单应矩阵的单目视觉测量方法
CN106408556B (zh) 一种基于一般成像模型的微小物体测量***标定方法
US9928595B2 (en) Devices, systems, and methods for high-resolution multi-view camera calibration
US9124873B2 (en) System and method for finding correspondence between cameras in a three-dimensional vision system
Zhang et al. A robust and rapid camera calibration method by one captured image
CN111862238B (zh) 一种全空间单目光笔式视觉测量方法
CN112132908B (zh) 一种基于智能检测技术的相机外参数标定方法及设备
Wu et al. A novel high precise laser 3D profile scanning method with flexible calibration
WO2018201677A1 (zh) 基于光束平差的远心镜头三维成像***的标定方法及装置
CN106500625B (zh) 一种远心立体视觉测量方法
WO2020208686A1 (ja) カメラ校正装置、カメラ校正方法、及びプログラムが格納された非一時的なコンピュータ可読媒体
JP2004163271A (ja) 非接触画像計測装置
Liu et al. Epipolar rectification method for a stereovision system with telecentric cameras
JP2023511735A (ja) 物体の姿勢の検出および測定システムを特徴付けるためのシステムおよび方法
Xu et al. A calibration method for non-overlapping cameras based on mirrored absolute phase target
Wang et al. An improved measurement model of binocular vision using geometrical approximation
Romero et al. A validation strategy for a target-based vision tracking system with an industrial robot
CN111915666A (zh) 基于移动终端的体积测量方法及装置
CN114359365B (zh) 一种具有高分辨率的汇聚式双目视觉测量方法
CN113592934B (zh) 一种基于单目相机的目标深度与高度测量方法及装置
CN112734838B (zh) 一种空间目标定位方法、设备及存储介质
CN115307865A (zh) 一种面向高温高超声速流场的模型变形测量方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23893934

Country of ref document: EP

Kind code of ref document: A1