CN109360245B - External parameter calibration method for multi-camera system of unmanned vehicle - Google Patents

External parameter calibration method for multi-camera system of unmanned vehicle

Info

Publication number
CN109360245B
CN109360245B (application number CN201811256308.7A)
Authority
CN
China
Prior art keywords
camera
relay
calibrated
cameras
common
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811256308.7A
Other languages
Chinese (zh)
Other versions
CN109360245A (en)
Inventor
周易
李发成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Motovis Technology Shanghai Co ltd
Original Assignee
Motovis Technology Shanghai Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Motovis Technology Shanghai Co ltd
Priority to CN201811256308.7A
Publication of CN109360245A
Application granted
Publication of CN109360245B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 — Image analysis
    • G06T7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A method for calibrating the external parameters of a multi-camera system of an unmanned vehicle comprises the following steps: arranging at least two relay camera groups between adjacent cameras to be calibrated on the vehicle; synchronizing the cameras to be calibrated and the relay cameras; moving a calibration plate around the vehicle so that it passes in front of all the cameras in sequence; starting the cameras to be calibrated and the relay cameras and shooting the moving calibration plate; detecting the 2D pixel coordinates of each feature pattern in each camera; and estimating the external parameters. The invention introduces relay cameras and establishes a redundant pose-graph optimization strategy based on them. This overcomes the accumulation of external-parameter error caused by the distance between cameras in traditional methods and yields accurate external-parameter estimates that meet the requirements of a SLAM system, without building a calibration structure proportional to the vehicle platform, thereby saving space as well as the cost of designing and manufacturing markers.

Description

External parameter calibration method for multi-camera system of unmanned vehicle
Technical Field
The invention belongs to the technical field of multi-camera systems, and particularly relates to an external parameter calibration method of a multi-camera system of an unmanned vehicle.
Background
As one of the most promising technologies in the world today, unmanned driving means that an automobile senses its surroundings and completes navigation tasks through on-board sensors, without human operation. PwC predicts that the popularization of unmanned-driving technology will reduce overall traffic accidents by ninety percent; the KPMG research center predicts that unmanned technology will drive improvements in productivity and energy efficiency, and that new business models will emerge.
Unmanned vehicles are typically equipped with sensors such as cameras, inertial measurement units (IMUs), lidar and the Global Positioning System (GPS). Among these, the camera perceives the richest external information, including the color, structure and texture of the scene as well as semantic information (roads, pedestrians, traffic signs, etc.). Whereas a human driver can only observe the traffic in one direction at a time, unmanned driving aims at 360-degree, blind-spot-free perception of the environment around the vehicle body. Because the field of view of a single camera is limited, a surround-view imaging system is typically composed of multiple cameras. Navigation tasks usually require the information from multiple cameras to be expressed in the same coordinate system, so the external parameters between the cameras must be calibrated. For small vehicles, manufacturers or developers can obtain the external parameters between the multi-view cameras through global positioning by building static markers (calibration plates). For large vehicles (e.g., heavy trucks with trailers), however, a limited number of cameras are usually mounted around the vehicle body, both to eliminate blind spots in the surround view and for cost reasons. Two problems then arise: (1) there are large separations between cameras; (2) some cameras have no (or only a small) overlap of fields of view. These practical conditions make external-parameter calibration of the multi-view cameras very difficult, and simply applying the calibration strategy used for small-vehicle surround-view cameras would impose great demands on the calibration site.
In addition, most existing external-parameter calibration techniques for surround-view systems were developed for the task of generating a high-quality 360-degree bird's-eye view, and the quality of the final image-stitching result is usually judged visually. In fact, a SLAM system demands much higher external-parameter calibration accuracy from a multi-view camera system than image stitching does.
The existing calibration scheme and the advantages and disadvantages thereof are as follows:
1. Patent: "Calibration method of a panoramic vision assisted parking system", publication number CN101425181B:
This patent generates a virtual bird's-eye view at a certain height above the automobile from the images of four wide-angle fisheye cameras arranged around the automobile. The positional relationship of each camera with respect to the virtual bird's-eye-view camera is determined by calculation from a homography of the ground plane. Since the position of the virtual bird's-eye-view camera is estimated by low-precision measurement, the resulting spatial relationships between the multi-view cameras suffice for seamless image stitching but cannot meet the design requirements of a SLAM system based on multi-view cameras.
2. Patent: "Dynamic calibration system, and joint optimization method and device in a dynamic calibration system", publication number CN105844624A:
This patent provides a dynamic calibration system and a joint optimization method and device within it. In the inter-camera external-parameter calibration step, several groups of static calibration objects with known relative spatial positions must be constructed artificially. The calibration task requires a vehicle carrying the cameras to pass the static markers along a designed trajectory, acquiring calibration data during the motion. Compared with CN101425181B, this method obtains the camera external parameters more accurately, but its requirements on the calibration scene are too strict, and it is not easy to extend to calibrating the surround-view multi-camera system of a large vehicle platform.
3. Patent: "Method for detecting and tracking driving obstacles of a heavy-duty truck based on binocular fisheye cameras", publication number CN105678787A:
This patent, in the field of active safety of traffic vehicles, arranges a pair of binocular fisheye cameras at the rear of a truck to detect obstacles behind it when reversing. Because depth must be measured, the fields of view of the two fisheye cameras overlap sufficiently. This configuration simplifies the external-parameter calibration of the binocular camera, which can be completed with existing open-source camera calibration toolboxes. For the surround-view multi-camera system of a large truck, however, this approach cannot be used, because the distance (baseline) between the cameras is large and their fields of view have no (or only a small) overlapping area.
Disclosure of Invention
Based on the above technical problem, an external parameter calibration method for a multi-camera system of an unmanned vehicle is provided.
In order to solve the technical problems, the invention adopts the following technical scheme:
a method for calibrating external parameters of a multi-camera system of an unmanned vehicle comprises the following steps:
110. arranging at least two relay camera groups between adjacent cameras to be calibrated on the vehicle, the relay camera groups being arranged side by side, each relay camera group comprising at least one relay camera in the vertical direction, a common-view area existing between adjacent cameras, the angle of the common-view area being 50° to 150°;
120. synchronizing the camera to be calibrated and the relay camera;
130. moving the calibration plate around the vehicle so that it passes in front of all the cameras in sequence, the front surface of the calibration plate facing the cameras to be calibrated and the relay cameras and carrying a plurality of feature patterns arranged in a matrix;
140. starting the camera to be calibrated and the relay camera, and shooting a moving calibration plate;
150. detecting 2D pixel coordinates of each characteristic pattern in each camera;
160. external parameter estimation:
161. performing common-view association on the 2D pixel coordinates and the 3D world coordinates of the feature patterns in the common-view areas to generate a pose graph whose nodes are the absolute camera poses and whose edges are the common-view relations, the absolute pose being represented by {R, t}, where R is a rotation matrix and t is a translation vector;
162. generating a minimum spanning tree by a breadth-first search algorithm according to the number of commonly viewed feature patterns and the maximum reprojection error: the edge with the larger number of commonly viewed feature patterns is chosen to connect two nodes; if the numbers are equal, the edge with the smaller maximum reprojection error is chosen;
163. taking the absolute pose of any one camera node to be calibrated in the minimum spanning tree, together with the relative poses between all nodes along the paths of the minimum spanning tree, as the set of external parameters to be calibrated, the relative poses being obtained from the formula T_f^{Wn} = T_f^{Wm} · T_f^{mn}, where T_f^{Wn} is the absolute pose of camera n at time f, T_f^{Wm} is the absolute pose of camera m at time f, and T_f^{mn} is the pose of camera n relative to camera m;
164. obtaining optimized estimates of the external parameters to be calibrated by a nonlinear least-squares method with the reprojection error as the energy.
The relay cameras are fixed on the vehicle, or fixed on tripods in a manner allowing left-right rotational adjustment.
The cameras to be calibrated and the relay cameras are synchronized by sending a synchronization clock signal.
The calibration plate is carried around the vehicle by hand.
The invention introduces relay cameras and establishes a redundant pose-graph optimization strategy based on them; it overcomes the accumulation of external-parameter error caused by the distance between cameras in traditional methods, obtains accurate external-parameter estimates that meet the requirements of a SLAM system, requires no calibration structure proportional to the vehicle platform, and thus saves space as well as the cost of designing and manufacturing markers.
Drawings
The invention is described in detail below with reference to the following figures and detailed description:
FIG. 1 is a schematic diagram of the principles of the present invention;
fig. 2 is a schematic structural diagram of a relay camera group according to the present invention;
fig. 3 is a schematic top view of a relay camera group according to the present invention;
FIG. 4 is a schematic diagram of the pose graph and minimum spanning tree of the present invention;
FIG. 5 is a schematic diagram of a calibration plate feature pattern employed in the present invention.
Detailed Description
A method for calibrating external parameters of a multi-camera system of an unmanned vehicle comprises the following steps:
110. As shown in fig. 1, at least two relay camera groups 30 are arranged between adjacent cameras 21 to be calibrated on the vehicle 20. The relay camera groups 30 are arranged side by side, and each includes at least one relay camera 31 in the vertical direction. A common-view area exists between adjacent cameras; it is sector-shaped, with an angle of 50° to 150°.
In each relay camera group 30, when there is a single relay camera 31, it may be fixed to the vehicle 20 mechanically or by magnetic adsorption. As shown in figs. 2 and 3, when there are two or more relay cameras 31, they may be arranged vertically on a tripod 33 by means of a camera mounting bracket 32, which allows the relay cameras to be adjusted by left-right rotation.
120. The cameras to be calibrated 21 and the relay cameras 31 are synchronized by transmitting a synchronization clock signal to each camera, so that they shoot at the same frame rate.
130. The calibration plate 40 is moved around the vehicle 20 so that it passes in front of all the cameras in sequence. Its front surface faces the cameras to be calibrated 21 and the relay cameras 31 and carries a plurality of feature patterns arranged in a matrix; in this embodiment, AprilTag feature patterns are adopted, see fig. 5.
In the present embodiment, the calibration plate 40 is carried around the vehicle 20 by hand; the moving path L is shown in fig. 1.
140. The cameras to be calibrated 21 and the relay cameras 31 are started to shoot the moving calibration plate 40.
150. Detecting the 2D pixel coordinate x_k = (u_k, v_k)^T of each feature pattern in each camera.
160. External parameter estimation:
161. The 2D pixel coordinates and the 3D world coordinates of the feature patterns in the common-view areas are associated to generate a pose graph whose nodes are the absolute camera poses and whose edges are the common-view relations.
A feature pattern in the common-view area of two adjacent cameras has a different 2D pixel coordinate in each of them; common-view association links the 3D world coordinate of that pattern to the two corresponding 2D pixel coordinates.
A common-view relation means that two cameras share a common-view area; the corresponding two nodes are then connected by an edge.
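The common-view association described above can be sketched in code. This is an illustrative sketch, not part of the patent; the data layout (detections keyed by camera, frame and pattern id) is a hypothetical choice:

```python
# Illustrative sketch: associate each feature pattern's known 3D board
# coordinate with its 2D detections in every camera that observed it.
# Any pattern seen by two or more cameras in the same frame yields a
# common-view association (an edge candidate in the pose graph).
from collections import defaultdict

def associate_common_view(detections, board_points):
    """detections: {(cam, frame, pattern): (u, v)};
    board_points: {pattern: (X, Y, Z)}, known from the printed board size."""
    by_frame_pattern = defaultdict(list)
    for (cam, frame, pattern), uv in detections.items():
        by_frame_pattern[(frame, pattern)].append((cam, uv))
    associations = []
    for (frame, pattern), obs in by_frame_pattern.items():
        if len(obs) >= 2:  # pattern lies in a common-view area
            associations.append({
                "frame": frame,
                "pattern": pattern,
                "world": board_points[pattern],
                "observations": dict(obs),
            })
    return associations
```

Each association ties one 3D world coordinate to the two (or more) 2D pixel coordinates under which it was observed, exactly the pairing that the pose-graph edges are built from.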
162. A minimum spanning tree is generated by a breadth-first search algorithm according to the number of commonly viewed feature patterns and the maximum reprojection error: the edge with the larger number of commonly viewed feature patterns is chosen to connect two nodes; if the numbers are equal, the edge with the smaller maximum reprojection error is chosen.
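The edge-selection rule of step 162 can be sketched as a breadth-first traversal that, when expanding a node, prefers edges with more commonly viewed patterns and breaks ties by the smaller maximum reprojection error. A minimal illustrative sketch (data structures are hypothetical, not from the patent):

```python
# Illustrative sketch of step 162: breadth-first spanning-tree construction
# over the pose graph, ordering candidate edges by (more co-viewed patterns,
# then smaller maximum reprojection error).
from collections import defaultdict, deque

def spanning_tree(nodes, edges):
    """nodes: list of camera ids; edges: {(a, b): (n_coview, max_reproj_err)}.
    Returns the chosen tree edges in traversal order."""
    adj = defaultdict(list)
    for (a, b), (n, err) in edges.items():
        adj[a].append((b, n, err))
        adj[b].append((a, n, err))
    root = nodes[0]
    visited = {root}
    tree = []
    queue = deque([root])
    while queue:
        u = queue.popleft()
        # best edges first: many co-viewed patterns, then low max error
        for v, n, err in sorted(adj[u], key=lambda e: (-e[1], e[2])):
            if v not in visited:
                visited.add(v)
                tree.append((u, v))
                queue.append(v)
    return tree
```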
In computer vision, the reprojection error is widely used: it is the error between the 2D pixel coordinate at which a camera actually observes a 3D pattern and the 2D pixel coordinate obtained by projecting the 3D coordinate of that pattern through the currently estimated camera pose. For example, the 3D coordinate of a certain feature pattern is transformed by the absolute pose of camera A into a 2D pixel coordinate, which is compared with the 2D pixel coordinate observed by camera A.
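For one feature point this quantity can be computed as follows. This is an illustrative sketch only; the patent does not specify the camera model, so a simple pinhole projection with intrinsic matrix K is assumed:

```python
# Illustrative sketch: reprojection error of one feature point under a
# pinhole camera model (an assumption; the patent leaves the model open).
import numpy as np

def reprojection_error(K, R, t, X_world, x_observed):
    """K: 3x3 intrinsics; {R, t}: absolute camera pose (world -> camera);
    X_world: 3D world coordinate; x_observed: detected 2D pixel coordinate."""
    X_cam = R @ X_world + t            # transform into the camera frame
    p = K @ X_cam
    x_proj = p[:2] / p[2]              # perspective division to pixels
    return np.linalg.norm(x_proj - x_observed)
```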
Wherein, the absolute pose is represented by { R, t }, R is a rotation matrix, and t is a translation vector.
The initial value of each camera's absolute pose is obtained by the PnP (Perspective-n-Point) algorithm, whose input parameters are the associated 3D world coordinates and 2D pixel coordinates. The 3D world coordinate system is defined at one corner of the calibration plate 40; the motion of the calibration plate 40 relative to a camera is equivalent to the motion of the camera relative to the calibration plate 40, and since the size of the feature patterns printed on the calibration plate 40 is known, the 3D world coordinates of the feature patterns are known.
163. The absolute pose of any one camera node to be calibrated in the minimum spanning tree, together with the relative poses between all nodes along the paths of the minimum spanning tree, is selected as the set of external parameters to be calibrated. The relative poses are obtained from the formula T_f^{Wn} = T_f^{Wm} · T_f^{mn}, where T_f^{Wn} is the absolute pose of camera n at time f, T_f^{Wm} is the absolute pose of camera m at time f, and T_f^{mn} is the pose of camera n relative to camera m.
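Representing each pose {R, t} as a 4x4 homogeneous transform makes the chaining formula a matrix product, and the relative pose is recovered as T^{mn} = (T^{Wm})^{-1} · T^{Wn}. A small illustrative sketch:

```python
# Illustrative sketch of step 163's pose chaining with 4x4 homogeneous
# transforms: T_Wn = T_Wm @ T_mn, hence T_mn = inv(T_Wm) @ T_Wn.
import numpy as np

def to_T(R, t):
    """Pack a rotation matrix R and translation vector t into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def relative_pose(T_Wm, T_Wn):
    """Solve T_Wn = T_Wm @ T_mn for the relative pose T_mn."""
    return np.linalg.inv(T_Wm) @ T_Wn
```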
164. Optimized estimates of the external parameters to be calibrated are obtained by a nonlinear least-squares method with the reprojection error as the energy.
When a relay camera group 30 includes a plurality of relay cameras 31 arranged vertically, it forms a dense pose graph (i.e., an over-constrained least-squares problem) with the cameras 21 to be calibrated, which helps obtain more accurate estimates.
As shown in fig. 4, take two cameras to be calibrated as an example, with two relay cameras arranged between them; in the figure, 1 and 2 denote the two cameras to be calibrated, 3 and 4 denote the two relay cameras, and the light-colored edges form the path of the minimum spanning tree. The external parameters to be calibrated are optimized by minimizing the energy function

E(θ) = Σ_f Σ_n Σ_k || x_k − x̂_k^n ||²,

where || x_k − x̂_k^n || is the reprojection error; x_k is the 2D pixel coordinate of feature pattern k observed by a camera; x̂_k^n is the 2D pixel coordinate obtained by projecting the 3D world coordinate X_k of feature pattern k through the absolute pose T_f^{Wn} of camera n at time f; n is the camera number and k the feature-pattern number. The parameter set

θ = { T_f^{W1}, T^{13}, T^{34}, T^{42} }

represents the external parameters to be calibrated, namely the absolute pose of camera 1 at each time f, the pose of camera 3 relative to camera 1, the pose of camera 4 relative to camera 3, and the pose of camera 4 relative to camera 2; the projected position coordinates of the feature points on the image are denoted x_k.
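Step 164 minimizes the summed squared reprojection error. As a toy illustration of the idea only (not the patent's solver, which optimizes all poses jointly), the following Gauss-Newton sketch refines a single camera translation with a numeric Jacobian, under an assumed pinhole model:

```python
# Illustrative toy version of step 164: Gauss-Newton descent on the summed
# squared reprojection error. Only a camera translation is refined here;
# the rotation is held fixed. All names and the pinhole model are assumptions.
import numpy as np

def project(K, R, t, X):
    p = K @ (R @ X + t)
    return p[:2] / p[2]

def refine_translation(K, R, t0, Xs, xs, iters=20):
    """Xs: list of 3D world points; xs: their observed 2D pixel coordinates."""
    t = np.asarray(t0, dtype=float)
    for _ in range(iters):
        r = np.concatenate([project(K, R, t, X) - x for X, x in zip(Xs, xs)])
        # numeric Jacobian of the stacked residual w.r.t. the 3 translation dofs
        J = np.zeros((len(r), 3))
        eps = 1e-6
        for j in range(3):
            dt = np.zeros(3)
            dt[j] = eps
            r2 = np.concatenate([project(K, R, t + dt, X) - x
                                 for X, x in zip(Xs, xs)])
            J[:, j] = (r2 - r) / eps
        t = t - np.linalg.lstsq(J, r, rcond=None)[0]  # Gauss-Newton step
    return t
```

In the patent's setting the unknowns are the full pose set θ rather than one translation, but the structure of the iteration (stack residuals, linearize, solve a least-squares step) is the same.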
The invention introduces relay cameras and establishes a redundant pose-graph optimization strategy based on them; it overcomes the accumulation of external-parameter error caused by the distance between cameras in traditional methods, obtains accurate external-parameter estimates that meet the requirements of a SLAM system, requires no calibration structure proportional to the vehicle platform, saves space as well as the design and manufacturing cost of markers, and is suitable for external-parameter calibration of surround-view multi-camera systems on large vehicle platforms.
Those skilled in the art should realize, however, that the above embodiments are illustrative only and do not limit the present invention; changes and modifications to them are intended to fall within the scope of the appended claims, provided they fall within the true spirit of the present invention.

Claims (4)

1. A method for calibrating external parameters of a multi-camera system of an unmanned vehicle is characterized by comprising the following steps:
110. arranging at least two relay camera groups between adjacent cameras to be calibrated on the vehicle, the relay camera groups being arranged side by side, each relay camera group comprising at least one relay camera in the vertical direction, a common-view area existing between adjacent cameras, the angle of the common-view area being 50° to 150°;
120. synchronizing the camera to be calibrated and the relay camera;
130. moving the calibration plate around the vehicle so that it passes in front of all the cameras in sequence, the front surface of the calibration plate facing the cameras to be calibrated and the relay cameras and carrying a plurality of feature patterns arranged in a matrix;
140. starting the camera to be calibrated and the relay camera, and shooting a moving calibration plate;
150. detecting 2D pixel coordinates of each characteristic pattern in each camera;
160. external parameter estimation:
161. performing common-view association on the 2D pixel coordinates and the 3D world coordinates of the feature patterns in the common-view areas to generate a pose graph whose nodes are the absolute camera poses and whose edges are the common-view relations, the absolute pose being represented by {R, t}, where R is a rotation matrix and t is a translation vector;
162. generating a minimum spanning tree by a breadth-first search algorithm according to the number of commonly viewed feature patterns and the maximum reprojection error: the edge with the larger number of commonly viewed feature patterns is chosen to connect two nodes; if the numbers are equal, the edge with the smaller maximum reprojection error is chosen;
163. taking the absolute pose of any one camera node to be calibrated in the minimum spanning tree, together with the relative poses between all nodes along the paths of the minimum spanning tree, as the set of external parameters to be calibrated, the relative poses being obtained from the formula T_f^{Wn} = T_f^{Wm} · T_f^{mn}, where T_f^{Wn} is the absolute pose of camera n at time f, T_f^{Wm} is the absolute pose of camera m at time f, and T_f^{mn} is the pose of camera n relative to camera m;
164. obtaining optimized estimates of the external parameters to be calibrated by a nonlinear least-squares method with the reprojection error as the energy.
2. The method for calibrating the external parameters of a multi-camera system of an unmanned vehicle according to claim 1, wherein the relay camera is fixed on the vehicle, or fixed on a tripod in a manner allowing left-right rotational adjustment.
3. The method for calibrating the external parameters of a multi-camera system of an unmanned vehicle according to claim 1 or 2, wherein the camera to be calibrated and the relay camera are synchronized by sending a synchronization clock signal.
4. The method of claim 3, wherein a manually held calibration plate is moved around the vehicle.
CN201811256308.7A 2018-10-26 2018-10-26 External parameter calibration method for multi-camera system of unmanned vehicle Active CN109360245B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811256308.7A CN109360245B (en) 2018-10-26 2018-10-26 External parameter calibration method for multi-camera system of unmanned vehicle


Publications (2)

Publication Number Publication Date
CN109360245A CN109360245A (en) 2019-02-19
CN109360245B true CN109360245B (en) 2021-07-06

Family

ID=65346751

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811256308.7A Active CN109360245B (en) 2018-10-26 2018-10-26 External parameter calibration method for multi-camera system of unmanned vehicle

Country Status (1)

Country Link
CN (1) CN109360245B (en)

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110163915B (en) * 2019-04-09 2021-07-13 深圳大学 Spatial three-dimensional scanning method and device for multiple RGB-D sensors
CN110244282B (en) * 2019-06-10 2021-06-15 宁波智能装备研究院有限公司 Multi-camera system and laser radar combined system and combined calibration method thereof
US10925687B2 (en) * 2019-07-12 2021-02-23 Synaptive Medical Inc. System and method for optical axis calibration
CN110910453B (en) * 2019-11-28 2023-03-24 魔视智能科技(上海)有限公司 Vehicle pose estimation method and system based on non-overlapping view field multi-camera system
DE102019132996A1 (en) * 2019-12-04 2021-06-10 Valeo Schalter Und Sensoren Gmbh Estimating a three-dimensional position of an object
CN111210478B (en) * 2019-12-31 2023-07-21 重庆邮电大学 Common-view-free multi-camera system external parameter calibration method, medium and system
CN111260733B (en) * 2020-01-13 2023-03-24 魔视智能科技(上海)有限公司 External parameter estimation method and system of vehicle-mounted all-around multi-camera system
CN111256689B (en) * 2020-01-15 2022-01-21 北京智华机器人科技有限公司 Robot positioning method, robot and storage medium
CN111768364B (en) * 2020-05-15 2022-09-20 成都飞机工业(集团)有限责任公司 Aircraft surface quality detection system calibration method
CN111815716A (en) * 2020-07-13 2020-10-23 北京爱笔科技有限公司 Parameter calibration method and related device
CN112233188B (en) * 2020-10-26 2024-03-12 南昌智能新能源汽车研究院 Calibration method of data fusion system of laser radar and panoramic camera
CN112489141B (en) * 2020-12-21 2024-01-30 像工场(深圳)科技有限公司 Production line calibration method and device for single-board single-image strip relay lens of vehicle-mounted camera
CN112598749B (en) * 2020-12-21 2024-02-27 西北工业大学 Calibration method for large-scene non-common-view multi-camera
CN113112551B (en) * 2021-04-21 2023-12-19 阿波罗智联(北京)科技有限公司 Camera parameter determining method and device, road side equipment and cloud control platform
CN115482287A (en) * 2021-05-31 2022-12-16 联发科技(新加坡)私人有限公司 Calibration sample plate, calibration system and calibration method thereof
CN113345031A (en) * 2021-06-23 2021-09-03 地平线征程(杭州)人工智能科技有限公司 Multi-camera external parameter calibration device and method, storage medium and electronic device
CN113256742B (en) * 2021-07-15 2021-10-15 禾多科技(北京)有限公司 Interface display method and device, electronic equipment and computer readable medium
CN114092564B (en) * 2021-10-29 2024-04-09 上海科技大学 External parameter calibration method, system, terminal and medium for non-overlapping vision multi-camera system
CN114299120B (en) * 2021-12-31 2023-08-04 北京银河方圆科技有限公司 Compensation method, registration method, and readable storage medium
CN117128985B (en) * 2023-04-27 2024-05-31 荣耀终端有限公司 Point cloud map updating method and equipment


Patent Citations (12)

Publication number Priority date Publication date Assignee Title
CN101073119A (en) * 2004-12-11 2007-11-14 三星电子株式会社 Information storage medium including meta data for multi-angle title, and apparatus and method for reproducing the same
CN101226638A (en) * 2007-01-18 2008-07-23 中国科学院自动化研究所 Method and apparatus for standardization of multiple camera system
CN101226638B (en) * 2007-01-18 2010-05-19 中国科学院自动化研究所 Method and apparatus for standardization of multiple camera system
CN101419055A (en) * 2008-10-30 2009-04-29 北京航空航天大学 Space target position and pose measuring device and method based on vision
CN201373736Y (en) * 2008-11-28 2009-12-30 北京航空航天大学 Initiative vision non-contact servo mechanism parameter measuring device
CN201881988U (en) * 2010-09-17 2011-06-29 长安大学 Vehicle lane changing auxiliary device
CN102478759A (en) * 2010-11-29 2012-05-30 中国空间技术研究院 Integration measuring method of wavefront distortion and optical axis vibration of space camera
WO2015170361A1 (en) * 2014-05-07 2015-11-12 野村ユニソン株式会社 Cable robot calibration method
CN106408650A (en) * 2016-08-26 2017-02-15 中国人民解放军国防科学技术大学 3D reconstruction and measurement method for spatial object via in-orbit hedgehopping imaging
CN206563649U (en) * 2017-03-24 2017-10-17 中国工程物理研究院应用电子学研究所 A kind of pupil on-line measurement device based on imaging conjugate
CN107401976A (en) * 2017-06-14 2017-11-28 昆明理工大学 A kind of large scale vision measurement system and its scaling method based on monocular camera
CN107346425A (en) * 2017-07-04 2017-11-14 四川大学 A kind of three-D grain photographic system, scaling method and imaging method

Non-Patent Citations (2)

Title
Pose-relay videometric method and ship deformation measurement system with camera-series; Yu Qi-feng; 2010 International Symposium on Optomechatronic Technologies; 2011-01-13; full text *
Algorithm research and implementation of 3D reconstruction technology based on a monocular RGB camera; Feng Qianqian; China Masters' Theses Full-text Database, Information Science and Technology; 2018-04-15; full text *

Also Published As

Publication number Publication date
CN109360245A (en) 2019-02-19

Similar Documents

Publication Publication Date Title
CN109360245B (en) External parameter calibration method for multi-camera system of unmanned vehicle
US10659677B2 (en) Camera parameter set calculation apparatus, camera parameter set calculation method, and recording medium
JP7073315B2 (en) Vehicles, vehicle positioning systems, and vehicle positioning methods
CN111986506B (en) Mechanical parking space parking method based on multi-vision system
CN111046743B (en) Barrier information labeling method and device, electronic equipment and storage medium
KR102550678B1 (en) Non-Rigid Stereo Vision Camera System
WO2017159382A1 (en) Signal processing device and signal processing method
JP5588812B2 (en) Image processing apparatus and imaging apparatus using the same
JP5455124B2 (en) Camera posture parameter estimation device
EP1462762A1 (en) Circumstance monitoring device of a vehicle
KR102295809B1 (en) Apparatus for acquisition distance for all directions of vehicle
CN104859538A (en) Vision-based object sensing and highlighting in vehicle image display systems
WO2005088971A1 (en) Image generation device, image generation method, and image generation program
CN101487895B (en) Reverse radar system capable of displaying aerial vehicle image
CN102163331A (en) Image-assisting system using calibration method
CA2526105A1 (en) Image display method and image display apparatus
CN103802725A (en) New method for generating vehicle-mounted driving assisting image
CN109883433B (en) Vehicle positioning method in structured environment based on 360-degree panoramic view
JP6910454B2 (en) Methods and systems for generating composite top-view images of roads
Rangesh et al. A multimodal, full-surround vehicular testbed for naturalistic studies and benchmarking: Design, calibration and deployment
JP2018139084A (en) Device, moving object device and method
CN112233188A (en) Laser radar-based roof panoramic camera and calibration method thereof
TW202020734A (en) Vehicle, vehicle positioning system, and vehicle positioning method
Kinzig et al. Real-time seamless image stitching in autonomous driving
CN111862210B (en) Object detection and positioning method and device based on looking-around camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant