CN102521822B - Active light-emitting type target for automatic calibration based on machine vision and calibrating method thereof - Google Patents


Info

Publication number
CN102521822B
CN102521822B (application CN201110328922.1A)
Authority
CN
China
Prior art keywords
target
coordinate system
point
small lamp
camera
Prior art date
Legal status
Expired - Fee Related
Application number
CN201110328922.1A
Other languages
Chinese (zh)
Other versions
CN102521822A (en)
Inventor
郭亚敏
肖舰
戚力
俞乾
张益昕
王顺
张旭苹
Current Assignee
Nanjing University
Original Assignee
Nanjing University
Priority date
Filing date
Publication date
Application filed by Nanjing University filed Critical Nanjing University
Priority to CN201110328922.1A priority Critical patent/CN102521822B/en
Publication of CN102521822A publication Critical patent/CN102521822A/en
Application granted granted Critical
Publication of CN102521822B publication Critical patent/CN102521822B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Processing (AREA)

Abstract

The invention relates to an active light-emitting target and a calibration method for automatic camera calibration based on machine vision. Small light-emitting lamps arranged in a rectangular lattice are embedded in a two-dimensional flat-plate target; all lamps are equally spaced in the horizontal and vertical directions, and the four lamps at the four vertices of the rectangle each have a different colour. Each lamp is controlled by an independent switch. The target has two working modes: at close range, all lamps on the target are switched on; at long range, the differently coloured lamps at the four corners are switched on and the remaining lamps are lit in every other row and column. The calibration method comprises the following steps: shooting groups of target images from arbitrarily chosen positions with the camera, taking at each position one image with the lamps on and one with the lamps off; removing the background of each image group by grayscale subtraction; extracting the luminous centre point (calibration point) of each lamp with the gray-level centre-of-gravity method; and substituting the resulting data directly into the camera calibration algorithm to compute the intrinsic and extrinsic parameters of the camera.

Description

Active light-emitting target for machine-vision-based automatic calibration and calibration method thereof
Technical field:
The present invention relates to an active light-emitting target and an associated calibration method usable for automatic calibration based on machine vision, and belongs to the field of computer vision.
Background technology:
In machine vision, camera calibration methods currently fall into two categories: traditional calibration and self-calibration. The former requires a calibration object, and among these methods the two-step method is the most mature. Self-calibration needs no reference object and solves for the camera parameters solely from the relations within an image sequence, but it still falls short in stability and precision.
To realize automatic calibration of camera parameters with a traditional method, the key is to identify the calibration points automatically. Most current schemes use a target with a particular geometry and colouring, exploiting the geometric and colour relations of the pattern to write a self-identification algorithm; black-and-white lines, circles and dot arrays at fixed geometric intervals are commonly used. The common drawback of these schemes is that they cannot eliminate interference from the background: if the background contains a pattern similar to the target, or other interfering features, calibration points are easily misjudged or missed. The identification algorithms also tend to be complicated, and some schemes place very high demands on target manufacture. A scheme that is low in cost, simple in algorithm and immune to environmental interference is therefore needed.
Summary of the invention
The object of the invention is to propose a target and a calibration method that realize automatic camera calibration through simple processing. The target is low in cost, the target background can be filtered out at calibration time so that calibration-point extraction is free of interference, and the calibration points can be identified and extracted automatically.
The technical scheme of the invention is an active light-emitting target and calibration method for automatic calibration based on machine vision. A two-dimensional flat-plate target is used, characterised in that several small light-emitting lamps arranged in a rectangular lattice are embedded in the target; all lamps are equally spaced in the horizontal and vertical directions; the lamps at the four vertices of the rectangle are distinguishable, each having a different colour, while the remaining lamps share one colour that differs from the four corner colours. Each lamp is controlled by an independent switch. The target has two working modes, far and near: when placed nearby, all lamps on the target are switched on; when placed far away, the distinguishable corner lamps are switched on and the remaining lamps are lit in every other row and column.
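The two working modes of the lamp array can be sketched as an on/off mask. This is a minimal illustration only: the 9×11 layout and the exact "every other row and column" thinning rule are assumptions for the sketch, not fixed by the text.

```python
import numpy as np

def lamp_mask(rows=9, cols=11, mode="near"):
    """Which lamps to switch on (True = lit).

    Near mode lights every lamp; far mode lights the four
    differently coloured corner lamps plus every other row and
    column of the remaining lamps, so the light spots stay
    separable in a low-resolution image.
    """
    if mode == "near":
        return np.ones((rows, cols), dtype=bool)
    mask = np.zeros((rows, cols), dtype=bool)
    mask[::2, ::2] = True                  # every other row and column
    for r in (0, rows - 1):
        for c in (0, cols - 1):
            mask[r, c] = True              # corner lamps always on
    return mask
```
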
The calibration steps are: shoot groups of target images at arbitrarily chosen positions, taking at each position one picture with the lamps on and one with them off; remove the background of each image group by grayscale subtraction; extract the luminous centre point of each lamp in the image with the gray-level centre-of-gravity method; establish a world coordinate system with a unified orientation and compute the world coordinates of all lamps; finally, substitute the results directly into the camera calibration algorithm and solve for the intrinsic and extrinsic parameters of the camera. The intrinsic parameters are the focal length, the pixel scale factor, the horizontal and vertical pixel unit lengths and the principal-point coordinates; the extrinsic parameters are the rotation and translation between the world coordinate system and the camera coordinate system.
The method of carrying out automatic camera calibration with the target of the invention comprises the following steps.
Step 1: place the target at different positions within the camera's field of view and shoot groups of colour images; at each position take one picture with the lamps on and one with them off.
Step 2: process each image group to obtain a background-free picture containing only the lamp light spots: the gray value of every pixel in the new picture is the gray value of the lamps-on picture minus the gray value of the lamps-off picture at that pixel.
Step 3: apply threshold segmentation to the resulting picture to remove the scattered halo around each lamp.
Step 4: for each light spot, compute its luminous centre of gravity with the gray-level centre-of-gravity method. Assuming the luminous intensity of a lamp is symmetric, this centre of gravity is the geometric centre of the lamp. The gray-level centre-of-gravity formulae are:

$$x_1 = \frac{\sum_{(x,y)} x\,f(x,y)}{\sum_{(x,y)} f(x,y)}, \qquad y_1 = \frac{\sum_{(x,y)} y\,f(x,y)}{\sum_{(x,y)} f(x,y)}$$

where $(x, y)$ is a point in the image coordinate system, $f(x, y)$ is the gray value at that point, and $(x_1, y_1)$ is the centre-of-gravity coordinate. Computing the centre points of all light spots yields all calibration-point coordinates in the image coordinate system.
Step 5: using the different colours of the four corner lamps, establish a world coordinate system with a unified orientation; the world coordinates of each lamp are obtained by measuring its actual distance from the origin. Computing the world coordinates of all lamps gives the complete data set in the world coordinate system.
The imaging model of the camera is usually approximated by the pinhole model. Take one vertex of the target as the origin, with the target plane as the $O_wX_wY_w$ plane, and establish the world coordinate system $O_wX_wY_wZ_w$. Take the camera centre $O_c$ as origin and establish the camera coordinate system $O_cX_cY_cZ_c$. Take the upper-left vertex $O_0$ of the image as origin and establish the image coordinate system $O_0U_0V_0$. Let $P$ be any point on the target, with coordinates $P_w(x_w, y_w, z_w)$ in the world coordinate system, $P_c(x_c, y_c, z_c)$ in the camera coordinate system and $P_0(x_0, y_0)$ in the image coordinate system. The correspondence between the world and image coordinate systems is:

$$s\begin{bmatrix}x_0\\y_0\\1\end{bmatrix} = A\begin{bmatrix}x_c\\y_c\\z_c\end{bmatrix} = A\,[R\ T]\begin{bmatrix}x_w\\y_w\\z_w\\1\end{bmatrix} = \begin{bmatrix}\alpha & c & u_0\\ 0 & \beta & v_0\\ 0 & 0 & 1\end{bmatrix}[R\ T]\begin{bmatrix}x_w\\y_w\\z_w\\1\end{bmatrix} \quad (1)$$

where $s$ is a scale factor. The matrix $[R\ T]$ holds the extrinsic parameters, $R$ and $T$ being the rotation matrix and the translation between the world and camera coordinate systems. The matrix $A$ holds the intrinsic parameters, where $(u_0, v_0)$ is the principal point, i.e. the intersection of the optical axis with the image plane, $\alpha$ and $\beta$ are the normalised focal lengths of the camera along the x and y directions, and $c$ is the skew (non-perpendicularity) factor between the camera's x and y axes.
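The pinhole correspondence of Eq. (1) can be checked numerically with a small projection routine. This is a sketch; the intrinsic values used in the usage example are arbitrary, not taken from the patent.

```python
import numpy as np

def project(A, R, T, Pw):
    """Project a world point through the pinhole model of Eq. (1):
    s*[x0, y0, 1]^T = A [R | T] [xw, yw, zw, 1]^T.
    A is the 3x3 intrinsic matrix, R a 3x3 rotation, T a 3-vector."""
    Pc = R @ np.asarray(Pw, float) + np.asarray(T, float)  # camera coordinates
    m = A @ Pc
    return m[:2] / m[2]                                    # divide out the scale s
```

For example, with $A$ having $\alpha=\beta=800$, $(u_0,v_0)=(320,240)$ and the camera 2 units in front of the target, the world origin projects exactly onto the principal point.
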
Step 6: substitute the calibration-point coordinates in the world and image coordinate systems obtained in Step 5 into the calibration algorithm of Zhang Zhengyou's two-step method, and compute the intrinsic and extrinsic parameters of the camera.
The invention designs an active light-emitting target. When this target is used for calibration, the background can be removed by image subtraction, the calibration points of the target can be located by comparing picture gray values, and substituting them into the calibration algorithm completes the automatic calibration of the camera.
The beneficial effects of the invention are: automatic camera calibration is realized by a simple method; accurate intrinsic and extrinsic parameters are obtained; the target is low in cost; and at calibration time the target background can be filtered out, so that calibration points are extracted without interference and are identified and extracted automatically.
Description of drawings
Fig. 1 is a schematic diagram of the camera's pinhole perspective transform.
Fig. 2 is a schematic diagram of the target.
Fig. 3 is a schematic diagram of the target working at long range.
Specific embodiments
Several small light-emitting lamps arranged in a rectangular lattice, for example 9 rows by 11 columns, are embedded in the target designed by the invention. All lamps are equally spaced horizontally and vertically; the lamps at the four vertices of the rectangle are distinguishable, each with a different colour, while the remaining lamps share one colour different from the corner colours. Each lamp is controlled by an independent switch, so the lamp array can later be switched under program control.
Concrete calibration process is as follows:
Step 1: place the target at different positions within the camera's field of view and shoot groups of colour images; at each position take one picture with the lamps on and one with them off. During shooting the lamps are controlled with a master switch, and within each image group the position of the target remains unchanged.
Step 2: first convert the colour pictures to grayscale, then subtract the two pictures of each group. The procedure is: traverse both pictures, and set the gray value of each pixel in the new picture to the absolute difference of the gray values of the two pictures at the corresponding pixel. This step removes the background common to the two target pictures, leaving only the lamp array in the new picture.
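Step 2 can be sketched as follows. The RGB-to-gray weights are an assumption, since the patent does not specify the conversion; the BT.601 luma weights used here are a common choice.

```python
import numpy as np

def lamp_difference(on_rgb, off_rgb):
    """Convert both colour shots to grayscale and take the per-pixel
    absolute difference, which cancels the static background and
    leaves only the lit lamps (Step 2)."""
    w = np.array([0.299, 0.587, 0.114])       # assumed BT.601 luma weights
    g_on = on_rgb.astype(float) @ w
    g_off = off_rgb.astype(float) @ w
    return np.abs(g_on - g_off)
```
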
Step 3: apply threshold segmentation to the new picture: traverse the picture and set to zero every pixel whose gray value is below a given threshold; pixels at or above the threshold remain unchanged. This step removes the scattered halo around each lamp.
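A minimal sketch of the thresholding in Step 3; the default threshold value is illustrative, since the patent does not fix one.

```python
import numpy as np

def threshold(diff, t=30):
    """Zero every pixel below the threshold t to strip the scattered
    halo around each lamp; pixels at or above t are kept unchanged
    (Step 3).  t=30 is an assumed default."""
    out = diff.copy()
    out[out < t] = 0
    return out
```
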
Step 4: for each light spot, compute its luminous centre of gravity with the gray-level centre-of-gravity method. Assuming the luminous intensity of a lamp is symmetric, this centre of gravity is the geometric centre of the lamp. The gray-level centre-of-gravity formulae are:

$$x_1 = \frac{\sum_{(x,y)} x\,f(x,y)}{\sum_{(x,y)} f(x,y)}, \qquad y_1 = \frac{\sum_{(x,y)} y\,f(x,y)}{\sum_{(x,y)} f(x,y)}$$

where $(x, y)$ is a point in the image coordinate system, $f(x, y)$ is the gray value at that point, and $(x_1, y_1)$ is the centre-of-gravity coordinate. Computing the centre points of all light spots yields the coordinates of all calibration points (light spots) in the image coordinate system.
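The gray-level centre of gravity of one light spot can be sketched as below; isolating each lamp's patch from the thresholded image (e.g. by connected-component labelling) is assumed to be done beforehand.

```python
import numpy as np

def gray_centroid(patch):
    """Gray-level centre of gravity of one lamp's blob (Step 4):
    x1 = sum(x*f)/sum(f), y1 = sum(y*f)/sum(f), where `patch` is
    the thresholded grayscale region containing a single spot."""
    ys, xs = np.indices(patch.shape)   # row (y) and column (x) indices
    total = patch.sum()
    return (xs * patch).sum() / total, (ys * patch).sum() / total
```
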
Step 5: compute the calibration-point coordinates in the world coordinate system, in two sub-steps. 1. Establish a unified world coordinate system: judge the colours of the four corner LED lamps in the original image; if the corner lamps are red, yellow, green and white, every picture uniformly takes the red LED lamp as origin, the line through the red and green lamps as the x axis, the line through the red and yellow lamps as the y axis, and the z axis oriented by the right-hand rule. 2. Compute each LED lamp's coordinates in this world coordinate system: every lamp has z = 0, and its x and y values are its physical distances from the origin. This step gives the coordinates of all calibration points (light spots) in the world coordinate system.
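Generating the world coordinates of the lamp grid in Step 5 can be sketched as follows; the 9×11 layout and the 20 mm lamp pitch are assumed values for illustration, the real pitch being whatever is physically measured on the target.

```python
import numpy as np

def world_points(rows=9, cols=11, pitch_mm=20.0):
    """World coordinates of the lamp grid (Step 5).  The red corner
    lamp is the origin, the red-green edge the x axis and the
    red-yellow edge the y axis; all lamps lie in the z=0 plane, so
    each coordinate is the grid index times the measured lamp pitch."""
    pts = [(c * pitch_mm, r * pitch_mm, 0.0)
           for r in range(rows) for c in range(cols)]
    return np.array(pts)
```
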
Step 6: substitute the data obtained in the world and image coordinate systems into the calibration algorithm of Zhang Zhengyou's two-step method and compute the camera parameters.
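The first stage of Zhang's two-step method estimates a homography per view from the world/image point pairs. A minimal direct-linear-transform (DLT) sketch is given below, without the coordinate normalisation a production implementation would add.

```python
import numpy as np

def homography_dlt(world_xy, image_uv):
    """Estimate the 3x3 homography H mapping planar world points
    (z=0) to image points as the least-squares solution of the
    standard DLT system; Zhang's method then derives the camera
    parameters from several such H."""
    rows = []
    for (X, Y), (u, v) in zip(world_xy, image_uv):
        rows.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
        rows.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
    _, _, vt = np.linalg.svd(np.array(rows, float))
    H = vt[-1].reshape(3, 3)      # null-space vector = flattened H
    return H / H[2, 2]
```
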
The imaging model of the camera is usually approximated by the pinhole model. Take one vertex of the target as the origin, with the target plane as the $O_wX_wY_w$ plane, and establish the world coordinate system $O_wX_wY_wZ_w$. Take the camera centre $O_c$ as origin and establish the camera coordinate system $O_cX_cY_cZ_c$. Take the upper-left vertex $O_0$ of the image as origin and establish the image coordinate system $O_0U_0V_0$. Let $P$ be any point on the target, with coordinates $P_w(x_w, y_w, z_w)$ in the world coordinate system, $P_c(x_c, y_c, z_c)$ in the camera coordinate system and $P_0(x_0, y_0)$ in the image coordinate system. The correspondence between the world and image coordinate systems is:

$$s\begin{bmatrix}x_0\\y_0\\1\end{bmatrix} = A\begin{bmatrix}x_c\\y_c\\z_c\end{bmatrix} = A\,[R\ T]\begin{bmatrix}x_w\\y_w\\z_w\\1\end{bmatrix} = \begin{bmatrix}\alpha & c & u_0\\ 0 & \beta & v_0\\ 0 & 0 & 1\end{bmatrix}[R\ T]\begin{bmatrix}x_w\\y_w\\z_w\\1\end{bmatrix} \quad (1)$$

where $s$ is a scale factor. The matrix $[R\ T]$ holds the extrinsic parameters, $R$ and $T$ being the rotation matrix and the translation between the world and camera coordinate systems. The matrix $A$ holds the intrinsic parameters, where $(u_0, v_0)$ is the principal point, i.e. the intersection of the optical axis with the image plane, $\alpha$ and $\beta$ are the normalised focal lengths of the camera along the x and y directions, and $c$ is the skew (non-perpendicularity) factor between the camera's x and y axes.
The concrete calibration algorithm is as follows. Since every lamp lies in the plane $z_w = 0$, the homography matrix $H$ between the target plane and the image is:

$$H = A\,[r_1\ r_2\ T] = \begin{bmatrix}\alpha & c & u_0\\ 0 & \beta & v_0\\ 0 & 0 & 1\end{bmatrix}[r_1\ r_2\ T] = [h_1\ h_2\ h_3] \quad (2)$$

where $r_1$ and $r_2$ are the first two columns of $R$, and $h_1, h_2, h_3$ are the columns of $H$.
Let the world coordinates of a calibration light spot be $M = [x_w, y_w, 1]^T$, its measured image coordinates $m = [u, v, 1]^T$, and the image coordinates re-projected through $H$ be $m' = [u', v', 1]^T$; then:

$$m' = \frac{1}{\bar h_3^T M}\begin{bmatrix}\bar h_1^T M\\ \bar h_2^T M\\ \bar h_3^T M\end{bmatrix} \quad (3)$$

where $\bar h_i^T$ denotes the $i$-th row of $H$.
Assuming the error between $m$ and $m'$ follows a normal distribution, the estimate of the homography matrix $H$ should maximise the likelihood function (4):

$$\prod_i \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{\|m_i - m'_i\|^2}{2\sigma^2}} \quad (4)$$
If the value of $\sigma$ is taken to be equal for all reference points, the estimation of $H$ becomes the nonlinear minimisation problem:

$$\min_H \sum_i \|m_i - m'_i\|^2 \quad (5)$$
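The cost of Eq. (5) can be written down directly; minimising it over H with any nonlinear least-squares routine then gives the maximum-likelihood homography under the Gaussian-noise assumption of Eq. (4). A sketch:

```python
import numpy as np

def reprojection_error(H, world_xy, image_uv):
    """Sum of squared distances between the measured image points m
    and the points m' re-projected through H via Eq. (3) -- the cost
    minimised in Eq. (5)."""
    err = 0.0
    for (X, Y), (u, v) in zip(world_xy, image_uv):
        p = H @ np.array([X, Y, 1.0])
        up, vp = p[0] / p[2], p[1] / p[2]
        err += (u - up) ** 2 + (v - vp) ** 2
    return err
```
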
From (1) and the orthonormality of $r_1$ and $r_2$ one obtains:

$$h_1^T A^{-T} A^{-1} h_2 = 0 \quad (6)$$

$$h_1^T A^{-T} A^{-1} h_1 = h_2^T A^{-T} A^{-1} h_2 \quad (7)$$
Let

$$B = A^{-T}A^{-1} = \begin{bmatrix}B_{11} & B_{12} & B_{13}\\ B_{12} & B_{22} & B_{23}\\ B_{13} & B_{23} & B_{33}\end{bmatrix} = \begin{bmatrix}\dfrac{1}{\alpha^2} & -\dfrac{c}{\alpha^2\beta} & \dfrac{cv_0 - u_0\beta}{\alpha^2\beta}\\[2mm] -\dfrac{c}{\alpha^2\beta} & \dfrac{c^2}{\alpha^2\beta^2} + \dfrac{1}{\beta^2} & -\dfrac{c(cv_0 - u_0\beta)}{\alpha^2\beta^2} - \dfrac{v_0}{\beta^2}\\[2mm] \dfrac{cv_0 - u_0\beta}{\alpha^2\beta} & -\dfrac{c(cv_0 - u_0\beta)}{\alpha^2\beta^2} - \dfrac{v_0}{\beta^2} & \dfrac{(cv_0 - u_0\beta)^2}{\alpha^2\beta^2} + \dfrac{v_0^2}{\beta^2} + 1\end{bmatrix} \quad (8)$$

Note that $B$ is symmetric.
Define the vector $b$ as:

$$b = [B_{11}\ B_{12}\ B_{22}\ B_{13}\ B_{23}\ B_{33}]^T \quad (9)$$
Then:

$$h_i^T B\, h_j = v_{ij}^T\, b \quad (10)$$
where

$$v_{ij} = [h_{i1}h_{j1},\; h_{i1}h_{j2} + h_{i2}h_{j1},\; h_{i2}h_{j2},\; h_{i3}h_{j1} + h_{i1}h_{j3},\; h_{i3}h_{j2} + h_{i2}h_{j3},\; h_{i3}h_{j3}]^T \quad (11)$$

$$h_i = [h_{i1}\ h_{i2}\ h_{i3}]^T \quad (12)$$

with $h_i$ the $i$-th column of $H$.
Combining (6) and (7) gives:

$$\begin{bmatrix}v_{12}^T\\ (v_{11} - v_{22})^T\end{bmatrix} b = 0 \quad (13)$$
Taking $m$ pictures yields $m$ sets of equations of the form (13); stacking them gives:

$$V b = 0 \quad (14)$$

where $V$ is a $2m \times 6$ matrix.
After solving for $b$, the intrinsic parameter matrix $A$ follows from:

$$\begin{aligned} v_0 &= \frac{B_{12}B_{13} - B_{11}B_{23}}{B_{11}B_{22} - B_{12}^2}\\ \lambda &= B_{33} - \frac{B_{13}^2 + v_0(B_{12}B_{13} - B_{11}B_{23})}{B_{11}}\\ \alpha &= \sqrt{\lambda / B_{11}}\\ \beta &= \sqrt{\frac{\lambda B_{11}}{B_{11}B_{22} - B_{12}^2}}\\ c &= -\frac{B_{12}\,\alpha^2\beta}{\lambda}\\ u_0 &= \frac{c\,v_0}{\beta} - \frac{B_{13}\,\alpha^2}{\lambda} \end{aligned} \quad (15)$$
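The closed-form extraction of Eq. (15) can be verified numerically: building $B$ exactly from a known $A$ must recover $A$. A sketch, with the index mapping $B_{11}\to$ `B[0,0]` and so on:

```python
import numpy as np

def intrinsics_from_B(B):
    """Closed-form intrinsic parameters from B = A^{-T} A^{-1}
    (Eq. (15), with the skew term c included)."""
    v0 = (B[0,1]*B[0,2] - B[0,0]*B[1,2]) / (B[0,0]*B[1,1] - B[0,1]**2)
    lam = B[2,2] - (B[0,2]**2 + v0*(B[0,1]*B[0,2] - B[0,0]*B[1,2])) / B[0,0]
    alpha = np.sqrt(lam / B[0,0])
    beta = np.sqrt(lam * B[0,0] / (B[0,0]*B[1,1] - B[0,1]**2))
    c = -B[0,1] * alpha**2 * beta / lam
    u0 = c * v0 / beta - B[0,2] * alpha**2 / lam
    return np.array([[alpha, c, u0], [0.0, beta, v0], [0.0, 0.0, 1.0]])
```
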
After solving for $A$, the extrinsic parameters are:

$$r_1 = \lambda A^{-1}h_1, \quad r_2 = \lambda A^{-1}h_2, \quad r_3 = r_1 \times r_2, \quad T = \lambda A^{-1}h_3, \quad \lambda = \frac{1}{\|A^{-1}h_1\|} = \frac{1}{\|A^{-1}h_2\|} \quad (16)$$
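Eq. (16) can likewise be sketched and checked against a synthetic homography built from known $A$, $R$ and $T$:

```python
import numpy as np

def extrinsics_from_H(A, H):
    """Rotation columns and translation from one homography (Eq. (16)):
    r1 = lam*A^{-1}h1, r2 = lam*A^{-1}h2, r3 = r1 x r2,
    T = lam*A^{-1}h3, with lam fixed by the unit norm of r1."""
    Ainv = np.linalg.inv(A)
    h1, h2, h3 = H[:, 0], H[:, 1], H[:, 2]
    lam = 1.0 / np.linalg.norm(Ainv @ h1)
    r1 = lam * Ainv @ h1
    r2 = lam * Ainv @ h2
    r3 = np.cross(r1, r2)
    T = lam * Ainv @ h3
    return np.column_stack([r1, r2, r3]), T
```
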

Claims (1)

1. An active light-emitting target calibration method for automatic calibration based on machine vision, using a two-dimensional flat-plate target, characterised in that several small light-emitting lamps arranged in a rectangular lattice are embedded in the target; all lamps are equally spaced in the horizontal and vertical directions; the four lamps at the four vertices of the rectangle have different colours, while the remaining lamps share one colour that differs from the four corner colours; each lamp is controlled by an independent switch; the target has two working modes, far and near: when placed nearby, all lamps on the target are switched on; when placed far away, the distinguishable corner lamps are switched on and the remaining lamps are lit in every other row and column;
The calibration steps are: the camera shoots groups of target images at arbitrarily chosen positions, taking at each position one picture with the lamps on and one with them off; the background of each image group is removed by grayscale subtraction; the luminous centre point of each lamp in the image is extracted with the gray-level centre-of-gravity method; a world coordinate system with a unified orientation is established and the world coordinates of all lamps are computed; finally, the results are substituted directly into the camera calibration algorithm to solve for the intrinsic and extrinsic parameters of the camera, wherein the intrinsic parameters are the focal length, the pixel scale factor, the horizontal and vertical pixel unit lengths and the principal-point coordinates, and the extrinsic parameters are the rotation and translation between the world and camera coordinate systems;
Step 1: place the target at different positions within the camera's field of view and shoot groups of colour images; at each position take one picture with the lamps on and one with them off;
Step 2: process each image group to obtain a background-free picture containing only the lamp light spots; the gray value of every pixel in this picture is the gray value of the lamps-on picture minus the gray value of the lamps-off picture at that pixel;
Step 3: apply threshold segmentation to the resulting picture to remove the scattered halo around each lamp;
Step 4: for each light spot, compute its luminous centre of gravity with the gray-level centre-of-gravity method; assuming the luminous intensity of a lamp is symmetric, this centre of gravity is the geometric centre of the lamp; the gray-level centre-of-gravity formulae are:

$$x_1 = \frac{\sum_{(x,y)} x\,f(x,y)}{\sum_{(x,y)} f(x,y)}, \qquad y_1 = \frac{\sum_{(x,y)} y\,f(x,y)}{\sum_{(x,y)} f(x,y)}$$

where $(x, y)$ is a point in the image coordinate system, $f(x, y)$ is the gray value at that point, and $(x_1, y_1)$ is the centre-of-gravity coordinate; computing the centre points of all light spots yields all calibration-point coordinates in the image coordinate system;
Step 5: using the different colours of the four corner lamps, establish a world coordinate system with a unified orientation; the world coordinates of each lamp are obtained by measuring its actual distance from the origin; computing the world coordinates of all lamps gives the complete data set in the world coordinate system;
The imaging of the camera is approximated by the pinhole model: take one vertex of the target as the origin, with the target plane as the $O_wX_wY_w$ plane, and establish the world coordinate system $O_wX_wY_wZ_w$; take the camera centre $O_c$ as origin and establish the camera coordinate system $O_cX_cY_cZ_c$; take the upper-left vertex $O_0$ of the image as origin and establish the image coordinate system $O_0U_0V_0$; let $P$ be any point on the target, with coordinates $P_w(x_w, y_w, z_w)$ in the world coordinate system, $P_c(x_c, y_c, z_c)$ in the camera coordinate system and $P_0(x_0, y_0)$ in the image coordinate system; the correspondence between the world and image coordinate systems is:

$$s\begin{bmatrix}x_0\\y_0\\1\end{bmatrix} = A\,[R\ T]\begin{bmatrix}x_w\\y_w\\z_w\\1\end{bmatrix} = \begin{bmatrix}\alpha & c & u_0\\ 0 & \beta & v_0\\ 0 & 0 & 1\end{bmatrix}[R\ T]\begin{bmatrix}x_w\\y_w\\z_w\\1\end{bmatrix} \quad (1)$$

where $s$ is a scale factor; the matrix $[R\ T]$ holds the extrinsic parameters, $R$ and $T$ being the rotation matrix and the translation between the world and camera coordinate systems; the matrix $A$ holds the intrinsic parameters, where $(u_0, v_0)$ is the principal point, i.e. the intersection of the optical axis with the image plane, $\alpha$ and $\beta$ are the normalised focal lengths of the camera along the x and y directions, and $c$ is the skew factor between the camera's x and y axes;
Step 6: substitute the calibration-point coordinate data in the world and image coordinate systems obtained in Step 5 into the calibration algorithm and compute the intrinsic and extrinsic parameters of the camera.
CN201110328922.1A 2011-10-25 2011-10-25 Active light-emitting type target for automatic calibration based on machine vision and calibrating method thereof Expired - Fee Related CN102521822B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201110328922.1A CN102521822B (en) 2011-10-25 2011-10-25 Active light-emitting type target for automatic calibration based on machine vision and calibrating method thereof


Publications (2)

Publication Number Publication Date
CN102521822A CN102521822A (en) 2012-06-27
CN102521822B true CN102521822B (en) 2013-11-06

Family

ID=46292726

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201110328922.1A Expired - Fee Related CN102521822B (en) 2011-10-25 2011-10-25 Active light-emitting type target for automatic calibration based on machine vision and calibrating method thereof

Country Status (1)

Country Link
CN (1) CN102521822B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102855728A (en) * 2012-09-21 2013-01-02 北京智安邦科技有限公司 Automatic calibration method of image-based fire detector
CN103413318B (en) * 2013-08-27 2016-03-02 北京航空航天大学 Target ball mirror surface center positioning method
CN106441234B (en) * 2016-09-22 2018-12-28 上海极清慧视科技有限公司 Detect scaling method in a kind of 3D machine vision space
CN106570904B (en) * 2016-10-25 2019-04-09 大连理工大学 A kind of multiple target relative pose recognition methods based on Xtion camera
CN107421473A (en) * 2017-05-26 2017-12-01 南京理工大学 The two beam laser coaxial degree detection methods based on image procossing
CN107507247B (en) * 2017-08-28 2018-09-11 哈尔滨拓博科技有限公司 A kind of real-time dynamic autoization scaling method of projected keyboard
CN109443206B (en) * 2018-11-09 2020-03-10 山东大学 System and method for measuring tail end pose of mechanical arm based on color spherical light source target
CN110111394A (en) * 2019-05-16 2019-08-09 湖南三一快而居住宅工业有限公司 Based on manipulator feature to the method and device of video camera automatic Calibration
CN111572377A (en) * 2020-05-13 2020-08-25 广州华立科技职业学院 Visual guidance method for automatic alignment of mobile robot charging station
CN115239795B (en) * 2022-09-23 2022-12-30 山东工程职业技术大学 Archery target ring hit position ring recording detection method, detection device and electronic equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101566465A (en) * 2009-05-18 2009-10-28 西安交通大学 Method for measuring object deformation in real time
CN101576379A (en) * 2009-05-12 2009-11-11 四川大学 Fast calibration method of active projection three dimensional measuring system based on two-dimension multi-color target
CN101655352A (en) * 2009-09-15 2010-02-24 西安交通大学 Three-dimensional speckle strain measurement device and measurement method thereof

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3687034B2 (en) * 2000-12-29 2005-08-24 東京特殊電線株式会社 Display device color calibration device and display device


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JP Laid-Open Patent Publication No. 2002-209230A, 2002.07.26

Also Published As

Publication number Publication date
CN102521822A (en) 2012-06-27

Similar Documents

Publication Publication Date Title
CN102521822B (en) Active light-emitting type target for automatic calibration based on machine vision and calibrating method thereof
CN101576379B (en) Fast calibration method of active projection three dimensional measuring system based on two-dimension multi-color target
CN106340044B (en) Join automatic calibration method and caliberating device outside video camera
JP5207719B2 (en) Label with color code, color code extraction means, and three-dimensional measurement system
Salvi et al. A robust-coded pattern projection for dynamic 3D scene measurement
CN100573040C (en) The scaling method of object surface three-dimensional contour structure light measurement system
CN101770646B (en) Edge detection method based on Bayer RGB images
CN106091983B (en) The complete scaling method of Vision Measuring System With Structured Light Stripe comprising scanning direction information
CN106651752A (en) Three-dimensional point cloud data registration method and stitching method
CN103049731B (en) Decoding method for point-distributed color coding marks
CN101667303A (en) Three-dimensional reconstruction method based on coding structured light
RU2013101791A (en) OBTAINING SPATIAL TOPOGRAPHIC IMAGES OF TRACES FROM THE TOOL USING A NONLINEAR PHOTOMETRIC STEREO METHOD
CN103033171B (en) Encoding mark based on colors and structural features
CN106353317B (en) Detect the detection device and method of target to be measured
JP2022059013A (en) Information processing apparatus, recognition support method, and computer program
CN113012096B (en) Display screen sub-pixel positioning and brightness extraction method, device and storage medium
CN105865329A (en) Vision-based acquisition system for end surface center coordinates of bundles of round steel and acquisition method thereof
CN101001520B (en) Measuring method and apparatus using color images
CN109194954A (en) Fish-eye camera performance parameter test method, apparatus, equipment and can storage medium
JP5771423B2 (en) Image color correction apparatus and image color correction method
CN115760704A (en) Paint surface defect detection method and device based on color active reference object and medium
CN101799924A (en) Method for calibrating projector by CCD (Charge Couple Device) camera
CN113706607A (en) Sub-pixel positioning method based on circular array diagram, computer equipment and device
CN110738170B (en) Image identification method for ammeter terminal fault identification
CN111784768A (en) Unmanned aerial vehicle attitude estimation method and system based on three-color four-lamp mark recognition

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20131106

Termination date: 20141025

EXPY Termination of patent right or utility model