WO2018235256A1 - Stereo measurement device and system - Google Patents

Stereo measurement device and system

Info

Publication number
WO2018235256A1
Authority
WO
WIPO (PCT)
Prior art keywords: camera, stereo, unit, control, distance
Prior art date
Application number
PCT/JP2017/023159
Other languages
French (fr)
Japanese (ja)
Inventor
媛 李
三好 雅則
秀行 粂
Original Assignee
株式会社日立製作所
Priority date
Filing date
Publication date
Application filed by 株式会社日立製作所
Priority to JP2019524821A (JP6734994B2)
Priority to PCT/JP2017/023159 (WO2018235256A1)
Publication of WO2018235256A1

Links

Images

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00: Measuring distances in line of sight; Optical rangefinders
    • G01C 3/02: Details
    • G01C 3/06: Use of electric means to obtain final indication
    • G: PHYSICS
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 15/00: Special procedures for taking photographs; Apparatus therefor
    • G: PHYSICS
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 35/00: Stereoscopic photography
    • G03B 35/08: Stereoscopic photography by simultaneous recording
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60: Control of cameras or camera modules
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/222: Studio circuitry; Studio devices; Studio equipment

Definitions

  • In the zoom and baseline control of FIG. 5, the reference s can be set in accordance with the camera zoom: if the measured distance is short, the zoom factor a is decreased, and if the distance is long, the zoom factor a is increased.
  • The control processing is executed repeatedly until image acquisition ends. When no further image is available, image acquisition ends in processing step S58 and the control processing also ends.
  • FIG. 6 is a diagram showing a flow of processing for performing rotation control using the pan control value and the tilt control value in the pan control unit 24 and the tilt control unit 25.
  • processing step S61 in the rotation control shown in FIG. 6, the left and right images from the left and right image acquiring units 11R and 11L are acquired, and in processing step S62, the inter-frame difference is calculated. If there is a difference, it indicates that there is movement such as human movement in the image.
  • Processing step S63 extracts the motion area. For example, the image area is divided into a plurality of areas, and movement of a person or the like is detected between the divided small areas.
  • processing step S64 if there is area movement, the amount of movement is calculated in processing step S65, and rotation control of pan and tilt is performed in processing step S66.
  • processing step S67 control processing is repeated until the end of image acquisition. If there is no image, the process proceeds to processing step S68, the image acquisition ends, and the control processing also ends.
  • the pan control value p and the tilt control value t use the movement distance in a triangular relationship, according to equation (6) It is possible to calculate the angle.
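Equation (6) itself is not reproduced in this publication, but the triangular relationship it refers to can be illustrated as follows: treating the object's horizontal and vertical displacement together with its measured distance as the legs of right triangles, the pan and tilt angles are the corresponding arctangents. This is a minimal sketch under that assumption; the function and variable names are hypothetical.

```python
import math

def pan_tilt_from_motion(dx_m: float, dy_m: float, distance_m: float):
    """Pan/tilt control angles (degrees) from the triangular relationship:
    the object moved dx_m horizontally and dy_m vertically at distance_m."""
    pan_deg = math.degrees(math.atan2(dx_m, distance_m))   # horizontal angle p
    tilt_deg = math.degrees(math.atan2(dy_m, distance_m))  # vertical angle t
    return pan_deg, tilt_deg

# Example: an object 10 m away moved 1 m to the right and 0.5 m upward.
print(pan_tilt_from_motion(1.0, 0.5, 10.0))  # approximately (5.71, 2.86)
```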
  • Ideally, the left and right cameras of a stereo camera are synchronized and the calculated distance is stable.
  • In practice, however, the left and right cameras may be asynchronous. In that case the distance measured while the object is moving is unstable, and distance-based control is difficult.
  • FIG. 7 is a diagram showing the flow of control processing in an asynchronous stereo camera.
  • In processing step S71, the first processing step in FIG. 7, the left and right camera images are acquired. In processing step S72, distortion correction of the cameras is performed; in processing step S73, disparity calculation is performed; and in processing step S74, inter-frame difference processing of the left and right cameras is performed.
  • If there is no difference in processing step S75 (Y side), the scene is judged to be stationary and the flow moves to processing step S710; if there is a difference (N side), it is judged to be moving and the flow moves to processing step S76.
  • When a difference is found (N side) in processing step S75, movement of a person or the like is assumed. In processing step S76, the motion area on the image is calculated; in processing step S77, the current pan and tilt values of the camera are calculated; in processing step S78, the calculated motion area is compared with the current values to obtain the difference and determine the position of the person or the like; and in processing step S79, the pan and tilt control values are calculated from that difference and rotation control is performed.
  • If the scene is stationary (Y side) in processing step S75, abnormal distance values are rejected in processing step S710, and in processing step S711 the distance data of five consecutive stationary frames is stored, using normal data only.
  • In processing step S712, stability is evaluated. If stable (Y side), the average distance is calculated in processing step S713 and the zoom and baseline control values are calculated; zoom and baseline control are then performed in processing step S714 using these control values.
  • The control processing is performed repeatedly until image acquisition ends; when no further image is available, image acquisition ends in processing step S716 and the control processing also ends. The branch logic of this flow is sketched below.
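The branch structure of FIG. 7 (motion leads to pan/tilt rotation control; a stationary scene accumulates distances and, once stable, triggers zoom/baseline control) can be condensed into a short sketch. This shows only the decision logic; the threshold value and the returned action labels are hypothetical assumptions, not values from the publication.

```python
from collections import deque
from typing import Optional

STABLE_FRAMES = 5       # five consecutive stationary frames, as in step S711
MOTION_THRESHOLD = 8.0  # hypothetical inter-frame difference threshold (S75)

def async_control_step(history: deque, frame_diff: float,
                       distance_m: Optional[float]) -> Optional[str]:
    """One iteration of the FIG. 7 flow, reduced to its branch logic.
    frame_diff: mean inter-frame difference (steps S74/S75).
    distance_m: distance from the disparity calculation, or None if the
    abnormal-value check of step S710 rejected it."""
    if frame_diff > MOTION_THRESHOLD:      # moving: rotation control (S76-S79)
        history.clear()                    # distance is unstable while moving
        return "pan_tilt_rotation"
    if distance_m is not None:             # stationary: store distance (S711)
        history.append(distance_m)
    if len(history) == STABLE_FRAMES:      # stability reached (S712)
        avg = sum(history) / len(history)  # S713: average distance
        history.clear()
        return f"zoom_baseline_for_{avg:.1f}m"  # S714
    return None

# Example: five quiet frames at about 12 m trigger a zoom/baseline update.
history = deque(maxlen=STABLE_FRAMES)
for _ in range(5):
    action = async_control_step(history, frame_diff=1.0, distance_m=12.0)
print(action)  # zoom_baseline_for_12.0m
```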
  • FIG. 8 is a view showing a configuration example of a stereo camera device 100 according to a second embodiment of the present invention.
  • the functions from the real video input unit 10 to the control amount calculation unit 17 in the configuration shown in FIG. 1 are the same as in the first embodiment, and thus the description thereof is omitted.
  • the stereo camera unit 101 which is a mechanical mechanism unit
  • the processing unit 102 which is an electronic processing function
  • a network 81 is communicably connected to each other via a network 81.
  • calibration is performed on a camera in the network 81.
  • the distance of the object measured by the camera is calculated.
  • it is a camera system configuration that can be remotely processed to control the camera via the network 81 using the calculated distance. Thereby, it becomes possible to measure remotely without depending on the installation place or the measurement place.
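The publication does not specify a protocol for the network 81. As one hedged possibility, the processing unit could push the calculated control values to the camera side as small JSON messages; the port number and message fields below are illustrative assumptions, not part of the patent.

```python
import json
import socket

def send_control_values(host: str, values: dict, port: int = 5000) -> None:
    """Push control values (e.g. zoom, baseline, pan, tilt) to the remote
    stereo camera unit over the network; the message format is an assumption."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(json.dumps(values).encode("utf-8"))

# Example (commented out; requires a listening camera unit):
# send_control_values("camera.example.local", {"zoom": 5.0, "baseline_cm": 20.0})
```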
  • FIG. 9 is a diagram showing the stereo camera device 100 according to a third embodiment of the present invention.
  • The third embodiment improves the accuracy of the measured distance when a wide range is scanned while controlling the camera.
  • With fixed camera parameters, the accuracy is high at near distances and decreases as the distance increases.
  • In the third embodiment, the near area 94 is first scanned with camera setting 91. The camera spec is then changed to setting 92 and the far area 95 is scanned; similarly, the spec is changed to setting 93 and the farther area 96 is scanned. By integrating the scanned results, the result for the entire area 97 is obtained, and in the image processing result for the entire area 97 the distance accuracy can be high over all areas (see the sketch below). In this way, various applications such as map construction and environment scanning for future automatic driving become possible.
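A hedged sketch of this integration step: the scene is scanned once per camera setting, each scan is trusted only inside its own distance band, and the per-band results are merged into the full-range result 97. The band boundaries and the NaN-based data layout are assumptions for illustration.

```python
import numpy as np

# Hypothetical distance bands (metres) for camera settings 91, 92, and 93.
BANDS = [(0.0, 20.0), (20.0, 100.0), (100.0, np.inf)]

def integrate_scans(depth_maps: list) -> np.ndarray:
    """Merge per-setting depth maps (same shape, np.nan where invalid) into a
    single full-range map: each scan contributes only within its own band."""
    merged = np.full(depth_maps[0].shape, np.nan)
    for depth, (near, far) in zip(depth_maps, BANDS):
        mask = np.isnan(merged) & (depth >= near) & (depth < far)
        merged[mask] = depth[mask]
    return merged

# Toy 1x3 example: the near scan sees 5 m, the mid scan 50 m, the far scan 300 m.
scans = [np.array([[5.0, np.nan, np.nan]]),
         np.array([[6.0, 50.0, np.nan]]),
         np.array([[7.0, 55.0, 300.0]])]
print(integrate_scans(scans))  # [[  5.  50. 300.]]
```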
  • FIG. 10 is a diagram showing the stereo camera device 100 according to a fourth embodiment of the present invention.
  • In the fourth embodiment, a sensing object is tracked and appropriate measurement is performed while the camera is controlled.
  • A moving object M, such as a person or a car, is the target object to be measured. The movement path g of the object M is assumed to lead gradually away from a near position.
  • The sensing system tracks the target object M while controlling the pan, tilt, zoom, and baseline of the camera. This makes it possible to measure a wide area, from near to very far, with one stereo camera, and to sense appropriately according to the movement of the object.
  • For example, at the near position the zoom is 1× and the baseline is 15 cm; at the middle-distance position the zoom is 5×, the baseline 20 cm, and the angle 20°; and at the far position the zoom is 10×, the baseline 25 cm, and the angle 10°. A distance-banded schedule of this kind is sketched below.
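The example values above suggest a simple distance-banded schedule of camera specs. A minimal sketch under that reading: the band limits are assumptions (the publication gives only the spec values), and no angle is stated for the near position, so 0 is used as a placeholder.

```python
from dataclasses import dataclass

@dataclass
class CameraSpec:
    zoom: float         # zoom factor
    baseline_cm: float  # distance between the two cameras
    angle_deg: float    # camera angle toward the object

# Upper band limits (metres) are illustrative assumptions.
SCHEDULE = [
    (10.0,         CameraSpec(zoom=1.0,  baseline_cm=15.0, angle_deg=0.0)),
    (50.0,         CameraSpec(zoom=5.0,  baseline_cm=20.0, angle_deg=20.0)),
    (float("inf"), CameraSpec(zoom=10.0, baseline_cm=25.0, angle_deg=10.0)),
]

def spec_for_distance(distance_m: float) -> CameraSpec:
    """Pick the camera spec whose band contains the tracked object M."""
    for upper, spec in SCHEDULE:
        if distance_m < upper:
            return spec
    return SCHEDULE[-1][1]

print(spec_for_distance(30.0))  # CameraSpec(zoom=5.0, baseline_cm=20.0, angle_deg=20.0)
```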
  • FIG. 11 is a diagram showing the stereo camera device 100 according to a fifth embodiment of the present invention. The fifth embodiment is an example of a sensing system.
  • A stereo camera 101 and a computer 111 are connected by a network 81. Captured images are sent to the computer 111 via the network 81, and the computer 111 performs the processing shown in FIG. 1 using the received images.
  • The zoom control unit 115 controls the focal length of the camera, the baseline control unit 116 controls the baseline stage 113, and the rotation control unit 117 controls the rotation stage 114. These units are connected to the computer 111 and perform control according to the processing results.
  • The image acquisition unit 11 of the camera acquires images at, for example, 30 frames per second; suppose that the image processing of the second frame is about to be performed. The image of the second frame is given from the image acquisition unit 11 to the feature point selection unit 13 at the same time.
  • The distance calculation unit 16 detects the distance to the target object from the left and right images as well as the movement of the target object in the images; when there is movement, it acts to reposition the camera via the control amount calculation unit 17 and the camera control unit 12.
  • When calculating the distance, the distance calculation unit 16 refers to the calibration data database DB1 to obtain the calibration data D2 used for correction; the latest calibration data D2 is the data obtained by the feature point selection unit 13 and the calibration calculation unit 14 from the image processing of at least the first frame. That is, the distance calculation unit 16 calculates the distance from the image at the current time and the latest calibration data D2 obtained by processing at a past time.
  • Likewise, the calibration data D2 obtained from the image of the second frame is stored in the calibration data database DB1 and used in the processing of the distance calculation unit 16 for the third frame.
  • The calibration data is thus recalculated at the timing when the camera is controlled. This makes it possible to calculate distance data with high accuracy and, according to the present distance of the measurement object, to control the camera appropriately for further measurement.
  • In this way, the stereo camera device 100 can calibrate for the latest situation and reflect it in the distance-based control; a sketch of this frame-by-frame timing follows.
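The timing described above, in which frame n is measured with the newest calibration data computed from an earlier frame and then contributes its own calibration for frame n+1, can be shown as a small loop. This is a structural sketch; the callables stand in for the feature point selection, calibration calculation, and distance calculation units.

```python
def run_pipeline(frames, calibrate, measure_distance, initial_calibration):
    """Measure each frame with the latest available calibration data (computed
    from a past frame), then register that frame's calibration in DB1 for the
    next frame, mirroring the Embodiment 5 timing."""
    db1 = [initial_calibration]  # calibration data database DB1
    distances = []
    for frame in frames:
        distances.append(measure_distance(frame, db1[-1]))  # past calibration
        db1.append(calibrate(frame))  # latest calibration data D2, used next
    return distances

# Toy example: "calibration" is just a per-frame offset.
out = run_pipeline([1.0, 2.0, 3.0],
                   calibrate=lambda f: f * 0.1,
                   measure_distance=lambda f, c: f + c,
                   initial_calibration=0.0)
print(out)  # [1.0, 2.1, 3.2]
```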

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Electromagnetism (AREA)
  • Remote Sensing (AREA)
  • Measurement Of Optical Distance (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Focusing (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
  • Accessories Of Cameras (AREA)
  • Automatic Focus Adjustment (AREA)
  • Studio Devices (AREA)

Abstract

Provided is a stereo measurement system that makes it possible to perform measurement over a wide range, from short to long distances, with a single stereo camera. A stereo measurement device processes right and left image information from a stereo camera unit that provides the right and left image information using right and left cameras and that includes a camera control unit for the right and left cameras. The stereo measurement device comprises: a feature point selection unit that selects feature points from the image scenes of the right and left image information; a calibration calculation unit that performs calibration using the selected feature points, thereby calculating calibration data; a distance calculation unit that calculates the distance to a target object to be sensed using the calculated calibration data and the right and left image information; and a control amount calculation unit that calculates, from the calculated distance and the image information, control amounts by which to control the camera control unit for the right and left cameras. The control amounts are provided to the camera control unit of the stereo camera unit for the right and left cameras.

Description

Stereo measurement device and system
The present invention relates to a stereo measurement device and system configured with a monocular camera or a plurality of cameras.
BACKGROUND: Conventionally, stereo measurement devices and systems that measure an object or an environment using a plurality of cameras have been put to practical use. For example, an in-vehicle stereo camera performs various measurements, such as detecting an intruder or a dangerous act, and controls such as voice notification and danger-avoidance actions are performed based on the measurement results. In automatic driving, there are examples of controlling a car by measuring the area in front of and behind it with a stereo camera. In recent years, stereo cameras have also been applied in the surveillance and marketing fields.
In three-dimensional measurement with a stereo camera, the same object is first photographed simultaneously by two cameras; distortion correction and rectification are applied to the two obtained images, and the disparity, that is, the difference between the positions at which the same three-dimensional point appears in the two images, is calculated. The shape and other properties of the three-dimensional object are then measured using the calculated disparity.
In such three-dimensional measurement, the measurement areas of the two cameras need to be approximately the same. Moreover, the camera parameters handled in this series of processes, such as the angle of view and the disparity depth, depend on the specifications of the two cameras, so the measurement range is fixed.
Regarding the measurement range, in the case of an in-vehicle camera for automatic driving, for example, the range from 2 m to 40 m ahead is said to be the measurement range. Consequently, distances closer than 2 m or farther than 40 m cannot be measured.
A simple solution to this limitation of the measurement range is to provide multiple pairs of cameras and switch between them: a near-field camera pair is used for distances within 2 m, and a far-field pair for distances of 40 m or more. However, providing multiple camera pairs raises the cost. In the case of an in-vehicle camera for automatic driving, switching while the vehicle is moving is also impossible. Furthermore, measuring an ultra-far range, for example 200 m or more, requires combining even more cameras, raising the cost further.
The following approaches are also known for solving this measurement-range problem.
Non-Patent Document 1 proposes a stereo camera with variable parameters; by controlling the camera zoom, both near and far ranges can be measured with a single unit, expanding the measurement range.
Patent Document 1 proposes widening the measurement range by combining multiple sets of stereo cameras.
Patent Document 2, like Patent Document 1, proposes widening the measurement range by combining multiple sets of stereo cameras. In Patent Document 2, however, the position of the measurement target is estimated from the focus value of the camera, calibration data prepared in advance is selected for the estimated position, and three-dimensional measurement is performed. When estimating the position, image information such as a human face is preferably used.
Patent Document 1: JP 2006-033282 A
Patent Document 2: JP 2012-058188 A
However, the known methods above still have the following problems to be solved.
First, with Non-Patent Document 1, it is difficult to control the camera appropriately according to the measurement target and measure over a wide range from a near area to an ultra-far area.
Patent Document 1 has the problem that the cost of the measurement system is high. In addition, since the specs of the combined cameras are fixed in advance, they cannot be changed afterwards, so versatility is also low.
With Patent Document 2, it is difficult to dynamically control the camera and appropriately measure an actually moving measurement object. As in Patent Document 1, the calibration data is prepared in advance, so versatility is low. Moreover, within the measurement range there are cases where focusing fails because of light, disturbance, and so on; in such cases the estimated position is unclear and the measurement is affected.
In view of the above, an object of the present invention is to provide a stereo measurement device and system that can appropriately control the camera according to the measurement target and measure over a wide region, from near to far, with a single stereo camera.
To this end, the present invention provides "a stereo measurement device for processing left and right image information from a stereo camera unit that provides the left and right image information with left and right cameras and that includes camera control units for the left and right cameras, the stereo measurement device comprising: a feature point selection unit that selects feature points from the image scenes of the left and right image information; a calibration calculation unit that performs calibration using the selected feature points and calculates calibration data; a distance calculation unit that calculates the distance to a target object to be sensed using the calculated calibration data and the left and right image information; and a control amount calculation unit that calculates, from the calculated distance and the image information, control amounts for controlling the camera control units of the left and right cameras, the control amounts being given to the camera control units of the left and right cameras of the stereo camera unit."
The present invention also provides "a stereo measurement system composed of the stereo measurement device and a stereo camera unit that provides left and right image information with left and right cameras and includes camera control units for the left and right cameras."
According to the present invention, a single stereo camera can measure a wide area, from near to very far.
According to embodiments of the present invention, a reduction in camera cost in the measurement system can also be expected. Furthermore, when tracking and sensing a measurement object, appropriate sensing can be performed while controlling the camera specs. For example, if a person is far away, the system can zoom the camera in and measure; the zoom factor can be controlled based on the currently measured position. When the person comes close, the system can zoom out and measure. Thus, by intelligently controlling such a stereo camera, wide-ranging and dynamic measurement becomes possible in the field without cost concerns, contributing to safe and secure control and measurement systems.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a diagram showing a configuration example of a stereo camera device 100 according to Embodiment 1 of the present invention.
FIG. 2 is a diagram showing a configuration example of the camera control unit 12.
FIG. 3 is a diagram showing an example of the flow of feature point extraction processing in the feature point selection unit 13.
FIG. 4 is a diagram showing a configuration example of the distance calculation unit 16.
FIG. 5 is a diagram showing the flow of processing in the zoom control unit 21 and the baseline control unit 22.
FIG. 6 is a diagram showing the flow of rotation control using the pan and tilt control values in the pan and tilt control units.
FIG. 7 is a diagram showing the flow of control processing in an asynchronous stereo camera.
FIG. 8 is a diagram showing a configuration example of a stereo camera device 100 according to Embodiment 2 of the present invention.
FIG. 9 is a diagram showing a stereo camera device 100 according to Embodiment 3 of the present invention.
FIG. 10 is a diagram showing a stereo camera device 100 according to Embodiment 4 of the present invention.
FIG. 11 is a diagram showing a stereo camera device 100 according to Embodiment 5 of the present invention.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. In the following description of the embodiments of the stereo measurement device and system of the present invention, a stereo camera device composed of two cameras is used as an example. Note that the stereo measurement device refers to the configuration of only the processing unit described below, while the stereo measurement system refers to the configuration of the processing unit and the stereo camera unit together.
FIG. 1 is a diagram showing a configuration example of the stereo camera device according to Embodiment 1 of the present invention. The stereo camera device 100 is roughly divided into a stereo camera unit 101 and a processing unit 102 implemented on a computer. The computer may be any device with processing capability, such as a personal computer, a small embedded computer, or a computing device connectable to a network.
The stereo camera unit 101 consists of a real video input unit 10 composed of left and right real video input units 10R and 10L, each having a lens and an imaging unit; an image acquisition unit 11 composed of left and right image acquisition units 11R and 11L; and a camera control unit 12 composed of left and right camera control units 12R and 12L. The stereo camera unit 101 can be regarded as the mechanical mechanism part, centered on the cameras.
The real video input unit 10 in the stereo camera unit 101 comprises two sets of imaging units and lenses, and inputs video from the left and right cameras of the stereo camera.
The imaging unit in the real video input unit 10 is a mechanism including an image sensor such as a CMOS (Complementary Metal Oxide Semiconductor) or CCD (Charge Coupled Device) device. The lens is a zoomable lens; by zooming, both distant regions and nearby regions can be imaged.
The image acquisition unit 11 has the function of converting the electronic signal input from the real video input unit 10 into an image. Various image formats, such as BMP, JPEG, and PNG, are possible; the image format is determined by the system.
The camera control unit 12 can adjust camera parameters such as the angle of view and orientation of the cameras in the real video input unit 10 and the distance between the cameras.
In Embodiment 1, the scene to be photographed is not restricted. A stereo camera for surveillance takes moving objects, the surveillance area, and so on as the shooting scene; an in-vehicle stereo camera takes the road, vehicles ahead, and so on. Both complex and simple scenes can be photographed, and the captured video is input. The camera of Embodiment 1 may also be configured so that the focal length, rotation angle, distance between the cameras, and the like can be changed. The stereo camera device 100 can thus operate with various kinds of equipment and in various environments.
FIG. 2 is a diagram showing a configuration example of the camera control unit 12. The camera control unit 12 consists of a zoom control unit 21, a baseline control unit 22, a rotation control unit 23, a tilt control unit 24, and a pan control unit 25. The zoom control unit 21 controls the lens mechanism using a zoom control value 26; it changes the focal length by moving the lens, thereby changing the angle of view, which is the imaging range of the camera. The baseline control unit 22 adjusts the distance between the cameras using a baseline control value 27. The tilt control unit 24 controls the tilt angle of the camera using a tilt control value 28, and the pan control unit 25 controls the pan angle of the camera using a pan control value 29; the tilt and pan control units control the camera orientation in the vertical and horizontal directions, respectively. The rotation control unit 23 can change the posture by rotating the camera.
The specific processing of the zoom control unit 21 and the baseline control unit 22 is described later with reference to FIG. 5, and that of the rotation control unit 23 with reference to FIG. 6.
The adjustments in the configurations of FIGS. 5 and 6 can all be performed independently or simultaneously, and can be tuned to the system specification and the needs of the scene to be photographed.
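As a minimal sketch of how the control values handled here (zoom 26, baseline 27, tilt 28, pan 29, and the rotation posture) could be carried in one structure that supports both independent and simultaneous adjustment, consider the following; the class and field names are hypothetical, not taken from the publication.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ControlValues:
    """One update for the camera control unit 12. None means 'leave unchanged',
    so adjustments can be applied independently or simultaneously."""
    zoom: Optional[float] = None         # zoom control value 26 (focal length)
    baseline_cm: Optional[float] = None  # baseline control value 27
    tilt_deg: Optional[float] = None     # tilt control value 28
    pan_deg: Optional[float] = None      # pan control value 29
    roll_deg: Optional[float] = None     # rotation (posture) control

def apply(state: dict, update: ControlValues) -> dict:
    """Merge an update into the current camera state."""
    for key, value in vars(update).items():
        if value is not None:
            state[key] = value
    return state

state = {"zoom": 1.0, "baseline_cm": 15.0, "tilt_deg": 0.0,
         "pan_deg": 0.0, "roll_deg": 0.0}
print(apply(state, ControlValues(zoom=5.0, baseline_cm=20.0)))
```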
In the stereo camera unit 101 of the Embodiment 1 stereo camera device 100 shown in FIG. 1, the real video input unit 10, the image acquisition unit 11, and the camera control unit 12 are each shown with the left and right camera functions formed integrally, but they may be separate. When separate, a right real video input unit 10R and a left real video input unit 10L, a right image acquisition unit 11R and a left image acquisition unit 11L, and a right camera control unit 12R and a left camera control unit 12L are provided individually.
The processing unit 102 of the stereo camera device 100, implemented on a computer, consists of a feature point selection unit 13, a calibration calculation unit 14, a calibration data database DB1, a distance calculation unit 16, and a control amount calculation unit 17. While the stereo camera unit 101 is the mechanical mechanism part centered on the cameras, the processing unit 102 can be regarded as the electronic processing function.
In the processing unit 102 of FIG. 1, the feature point selection unit 13 executes a process of selecting feature points for calibration using the image information input from the left and right image acquisition units 11R and 11L. Feature point selection is the process of extracting feature points from the shooting scene and selecting effective points from among them, and can be realized with existing methods.
FIG. 3 shows an example of the flow of feature point extraction. Conventionally, feature points are often extracted from a single scene captured at one position; Embodiment 1 shows an example in which feature point groups are extracted from a plurality of scenes.
In the concrete example of FIG. 3, shooting is performed while, for example, controlling the rotation of the camera, as shown in process 31, and images I1, I2, and I3 of a plurality of scenes are obtained in time series, as shown in process 32. Next, process 33 extracts feature points from the images I1, I2, and I3, and process 34 selects valid feature points from the extracted feature points.
Various feature point extraction methods are known; highly versatile examples include SIFT (Scale-Invariant Feature Transform) and HOG (Histograms of Oriented Gradients).
The feature point group Px extracted in process 33 may contain incorrectly matched feature points. Using wrong feature points adversely affects the calibration calculation, and the calibration data may contain large errors. Therefore, as in process 34, a process is needed that removes the wrong feature points and selects the valid feature points Pr. RANSAC (Random Sample Consensus) is often used for selecting valid feature points. In process 34, the point group Px represents wrong feature points, and Pr represents correct feature points exhibiting a predetermined regularity; the wrong feature points Px can be removed by RANSAC processing.
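A minimal OpenCV sketch of this selection step, assuming SIFT features matched between two scene images with a ratio test and RANSAC applied through fundamental-matrix estimation to discard the wrong points Px; the thresholds are illustrative, and the publication does not prescribe these particular functions.

```python
import cv2
import numpy as np

def select_feature_points(img_a: np.ndarray, img_b: np.ndarray):
    """Extract and match feature points, then keep only the RANSAC inliers Pr."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img_a, None)
    kp2, des2 = sift.detectAndCompute(img_b, None)

    # Ratio-test matching; mismatches here correspond to the wrong points Px.
    matcher = cv2.BFMatcher()
    matches = [m for m, n in matcher.knnMatch(des1, des2, k=2)
               if m.distance < 0.75 * n.distance]
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # RANSAC via fundamental-matrix estimation removes the wrong points Px.
    _, inlier_mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.99)
    keep = inlier_mask.ravel() == 1
    return pts1[keep], pts2[keep]  # the valid feature points Pr
```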
The calibration calculation unit 14 performs camera calibration using the feature points output from the feature point selection unit 13. Camera calibration is the process of calculating the camera's internal parameters and external parameters.
Among the internal parameters of the camera, f denotes the focal length, a the aspect ratio, s the skew, and (u0, v0) the center coordinates of the image coordinates. Among the external parameters, R is a rotation matrix indicating the orientation of the camera, and T is a translation matrix indicating the positional relationship between the cameras. As a 3x3 matrix (r11, r12, ..., r33), R is defined by Euler angles and expressed by three parameters, the camera installation angles pan θ, tilt φ, and roll ψ. Calibration techniques are required to calculate the internal and external parameters. Conventionally, the method described in Z. Zhang, "A flexible new technique for camera calibration", IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(11):1330-1334, 2000, is often used. Many methods for automatically estimating camera parameters from images have also been proposed, including the automatic calibration method described in Non-Patent Document 1.
The calculated camera parameters are recorded in the calibration data database DB1 as calibration data D1.
In Embodiment 1 of the present invention, recalibration is necessary each time the camera configuration changes. For example, whenever the zoom, baseline, rotation angle, or the like of the camera is changed, the feature point selection unit 13 selects feature points from the newly input scene; the calibration calculation unit 14 then calculates the calibration data D1 again and registers the latest calibration data D1 in the calibration data database DB1. The calibration data matching the controlled camera parameters is then used.
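A hedged sketch of the calibration calculation using Zhang's method as implemented in OpenCV; it assumes per-view 3D-2D correspondences are already available (in the device these come from the selected scene feature points), and the dictionary layout for the calibration data D1 is an assumption.

```python
import cv2

def calculate_calibration(object_points, image_points, image_size):
    """Zhang-style calibration from per-view 3D-2D correspondences. Returns the
    internal parameters (K, distortion) and the external parameters (R as
    Rodrigues vectors, T) to be recorded as calibration data D1 in DB1."""
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        object_points, image_points, image_size, None, None)
    return {"rms": rms, "K": K, "dist": dist, "R": rvecs, "T": tvecs}
```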
FIG. 4 is a diagram showing a configuration example of the distance calculation unit 16. The distance calculation unit 16 calculates distance data P1 using the left and right images from the left and right image acquisition units 11R and 11L and the calibration data D1. The distance data P1 is obtained by sequentially executing the processing of a distortion correction unit 44, a rectification unit 45, and a distance calculation unit 46.
The specific processing procedure of the distance calculation is described below using equations. As a premise, the camera's internal parameter matrix K and external parameter matrix D can be expressed by equations (1) and (2), respectively. In equation (1), K is the internal parameter matrix, f the focal length, a the aspect ratio, s the skew, and (uc, vc) the center coordinates of the image coordinates. In equation (2), D is the external parameter matrix; (r11, r12, r13, r21, r22, r23, r31, r32, r33) indicates the orientation of the camera, and (tx, ty, tz) the positional coordinates between the cameras.
K = \begin{pmatrix} f & s & u_c \\ 0 & a f & v_c \\ 0 & 0 & 1 \end{pmatrix} \qquad (1)
D = \begin{pmatrix} r_{11} & r_{12} & r_{13} & t_x \\ r_{21} & r_{22} & r_{23} & t_y \\ r_{31} & r_{32} & r_{33} & t_z \end{pmatrix} \qquad (2)
Using these two parameter matrices K and D and a scale constant λ, the image coordinates (u, v) and the world coordinates (X, Y, Z) are related by equation (3).
\[ \lambda \begin{pmatrix} u \\ v \\ 1 \end{pmatrix} = K D \begin{pmatrix} X \\ Y \\ Z \\ 1 \end{pmatrix} \tag{3} \]
 When the rotation part (r11, r12, ..., r33) of the external parameters D is defined by Euler angles, it is expressed by the three camera installation angles pan θ, tilt φ, and roll ψ. Therefore, the number of camera parameters required to relate image coordinates to world coordinates is 11: five internal parameters plus six external parameters.
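 As a worked sketch of equation (3), the following Python fragment projects a world point with assumed parameter values (f = 700 px, image center (320, 240), camera at the world origin):

    import numpy as np

    f, a, s, uc, vc = 700.0, 1.0, 0.0, 320.0, 240.0
    K = np.array([[f,  s,     uc],
                  [0., a * f, vc],
                  [0., 0.,    1.]])
    D = np.hstack([np.eye(3), np.zeros((3, 1))])   # camera at the world origin

    Xw = np.array([0.5, 0.2, 2.0, 1.0])            # world point (X, Y, Z, 1)
    uvw = K @ D @ Xw                               # λ(u, v, 1) of equation (3)
    u, v = uvw[0] / uvw[2], uvw[1] / uvw[2]        # here λ = Z = 2.0
    # u = 700*0.5/2 + 320 = 495.0, v = 700*0.2/2 + 240 = 310.0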
 The distortion correction unit 44, which performs the first processing within the distance calculation unit 16 of FIG. 4, corrects the distortion of the camera images using the camera parameters. The parallelization unit 45 then rectifies the left and right cameras. The three-dimensional measurement values of the measured object, including its distance, can be calculated by equation (4).
\[ X = \frac{B x_l}{d}, \quad Y = \frac{B y}{d}, \quad Z = \frac{B f}{d}, \quad d = x_l - x_r \tag{4} \]
 In equation (4), (xl, yl) and (xr, yr) are pixel coordinates on the left and right camera images; after rectification, yl = yr = y. f is the focal length and B is the baseline, that is, the distance between the cameras. d is the difference between the positions at which the same three-dimensional point projects onto the left and right camera images, in other words the disparity of that point. The relationship between the world coordinates (X, Y, Z) and the image coordinates (u, v) is given by equation (5). Obtaining the world coordinates (X, Y, Z) means that the distance data P1 has been acquired.
\[ \lambda \begin{pmatrix} u \\ v \\ 1 \end{pmatrix} = K \begin{pmatrix} X \\ Y \\ Z \end{pmatrix} \tag{5} \]
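 A minimal runnable sketch of equations (4) and (5): the values of f and B are assumptions, pixel coordinates are taken relative to the image center, and equation (5) is taken here as the standard rectified pinhole projection consistent with equation (3).

    f = 700.0   # focal length in pixels (assumed)
    B = 0.15    # baseline in meters (assumed)

    def triangulate(xl, yl, xr):
        """Equation (4): recover (X, Y, Z) from a rectified correspondence."""
        d = xl - xr              # disparity; yl == yr == y after rectification
        return B * xl / d, B * yl / d, B * f / d

    # 35 px of disparity places the point at Z = 0.15 * 700 / 35 = 3.0 m
    print(triangulate(420.0, 260.0, 385.0))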
 In FIG. 1, the control amount calculation unit 17 calculates control values using the calculated distance data P1 and the image data, and the camera control unit 12 performs control using each control value.
 FIG. 5 shows a flowchart of the control amount calculation and control processing in the zoom control unit 21 and the baseline control unit 22.
 In processing step S51, the first processing step of the zoom control unit 21 and the baseline control unit 22 shown in FIG. 5, images are input from the left and right cameras. In processing step S52, the distance data P1 measured by the distance calculation unit 16 is stored in the distance database DB2. In processing step S54, the stability of the distance data P1 is evaluated: for example, once the distance data P1 has been stored several times (here, five times), it is judged to be stable. In processing step S55, the average distance is calculated using the distance data judged to be stable, and in processing step S56, the control amounts for zoom and baseline in the zoom control unit 21 and the baseline control unit 22 are calculated.
 A simple way to calculate the zoom and baseline control amounts is zoom ratio a = distance d / reference s, where the reference s can be set according to the camera zoom. If the distance is short, the zoom ratio a is decreased; if the distance is long, the zoom ratio a is increased. The baseline control value can be obtained as b = (total baseline width f / reference s) × n (n = 0, 1, 2, ..., N), where N is the control granularity, an index of how many steps are controlled. In processing step S57, the control processing is repeated until image acquisition ends. When there are no more images, image acquisition is terminated in processing step S58 and the control processing also ends.
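 The two control formulas and the five-sample stability rule of steps S52 to S55 fit in a few lines of Python. The reference value, the baseline width, and the rule for choosing the step n are illustrative assumptions, since the text does not fix them:

    from collections import deque

    REFERENCE_S = 3.0       # reference distance s, set to match the camera zoom
    BASELINE_WIDTH = 0.30   # total baseline width (the formula's "f"), meters
    N_STEPS = 5             # control granularity N

    history = deque(maxlen=5)   # distance judged stable once 5 samples exist

    def control_amounts(distance_d):
        history.append(distance_d)              # steps S52 and S54
        if len(history) < history.maxlen:
            return None                         # not yet stable
        mean_d = sum(history) / len(history)    # average distance, step S55
        zoom_a = mean_d / REFERENCE_S           # zoom ratio a = d / s
        n = min(N_STEPS, round(zoom_a))         # one way to choose n <= N
        baseline_b = BASELINE_WIDTH / REFERENCE_S * n
        return zoom_a, baseline_b               # step S56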
 FIG. 6 shows the flow of rotation control processing using the pan control value in the pan control unit 24 and the tilt control value in the tilt control unit 25.
 In the first processing step S61 of the rotation control shown in FIG. 6, the left and right images are acquired from the left and right image acquisition units 11R and 11L, and in processing step S62 the inter-frame difference is calculated. A difference indicates that there is movement in the image, such as a person moving. Processing step S63 extracts the motion region: for example, the image is divided into a plurality of small regions, and movement of a person or the like between the small regions is detected. If there is region movement in processing step S64, the movement amount is calculated in processing step S65, and pan and tilt rotation control is performed in processing step S66. In processing step S67, the control processing is repeated until image acquisition ends. When there are no more images, the processing moves to step S68, image acquisition is terminated, and the control processing also ends.
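 Steps S62 and S63 amount to a standard frame-difference motion detector; a sketch using OpenCV, with the grid size and thresholds as assumed parameters:

    import cv2

    def motion_cells(prev_gray, cur_gray, grid=(8, 8), thresh=25, min_frac=0.05):
        """Return indices of grid cells that changed between frames."""
        diff = cv2.absdiff(prev_gray, cur_gray)               # step S62
        _, mask = cv2.threshold(diff, thresh, 1, cv2.THRESH_BINARY)
        h, w = mask.shape
        ch, cw = h // grid[0], w // grid[1]
        moved = []
        for i in range(grid[0]):                              # step S63
            for j in range(grid[1]):
                cell = mask[i * ch:(i + 1) * ch, j * cw:(j + 1) * cw]
                if cell.mean() > min_frac:     # enough changed pixels in cell
                    moved.append((i, j))
        return moved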
 For example, if the current position is (x0, y0) and the moved position is (x1, y1), the pan control value p and the tilt control value t can be calculated as angles from the movement distances by the trigonometric relationship in equation (6).
\[ p = \tan^{-1} \frac{x_1 - x_0}{f}, \qquad t = \tan^{-1} \frac{y_1 - y_0}{f} \tag{6} \]
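 Under the pinhole model, one plausible reading of the triangular relationship in equation (6) converts the pixel displacement into angles through the focal length; this reconstruction, and the focal length value, are assumptions for illustration:

    import math

    F_PIXELS = 700.0   # focal length in pixels (assumed)

    def pan_tilt(x0, y0, x1, y1):
        """One reading of equation (6): displacement over focal length."""
        p = math.degrees(math.atan2(x1 - x0, F_PIXELS))   # pan control value p
        t = math.degrees(math.atan2(y1 - y0, F_PIXELS))   # tilt control value t
        return p, t

    # A 100 px horizontal move corresponds to about 8.1 degrees of pan
    print(pan_tilt(320, 240, 420, 240))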
 Through the above series of processing, an ideal stereo camera is synchronized and the calculated distance should be stable. In practice, however, the left and right cameras suffer from asynchrony. In that case, the distance during movement is unstable, and distance-based control is difficult.
 For this reason, the present invention further proposes the implementation shown in FIG. 7, which shows the flow of control processing for an asynchronous stereo camera.
 In S71, the first processing step in FIG. 7, the left and right camera images are acquired. In processing step S72, camera distortion correction is performed. In processing step S73, disparity calculation is performed. Then, in processing step S74, inter-frame difference processing is performed for the left and right cameras. If there is no difference in processing step S75 (Y side), the scene is judged to be stationary and the processing moves to step S710; if there is a difference (N side), it is judged to be moving and the processing moves to step S76.
 If the judgment in processing step S75 finds a difference (N side), it is interpreted as movement of a person or the like. In that case, the motion region in the image is calculated in processing step S76. In processing step S77, the current pan and tilt values of the camera are obtained. In processing step S78, the calculated motion region is compared with the current values to calculate the difference and determine the position of the person or the like. In processing step S79, the pan and tilt control values are calculated based on that difference, and rotation control is performed.
 If the judgment in processing step S75 is stationary (Y side), distance outliers are screened out in processing step S710; for normal data only, the distance data of five consecutive stationary frames is stored in processing step S711, and its stability is evaluated in processing step S712. If stable (Y side), the average distance is calculated in processing step S713 and the zoom and baseline control values are calculated. Using these control values, zoom and baseline control is performed in processing step S714. In processing step S715, the control processing is repeated until image acquisition ends. When there are no more images, image acquisition is terminated in processing step S716 and the control processing also ends.
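 Pulling the two branches of FIG. 7 together, the control loop can be sketched as follows, reusing motion_cells(), pan_tilt(), and control_amounts() from the sketches above. grab_pair, rotate, and zoom_baseline are caller-supplied stand-ins for the camera and its control units, and the block-matching disparity step is an illustrative choice:

    import cv2
    import numpy as np

    def control_loop(grab_pair, rotate, zoom_baseline):
        """Skeleton of FIG. 7: motion -> rotation, stillness -> zoom/baseline.
        grab_pair() returns rectified (left, right) grayscale frames, or None
        when image acquisition ends (steps S715/S716)."""
        stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
        prev_left = None
        while True:
            pair = grab_pair()                              # step S71
            if pair is None:
                break
            left, right = pair                              # S72 assumed upstream
            disp = stereo.compute(left, right) / 16.0       # step S73
            if prev_left is not None:
                moved = motion_cells(prev_left, left)       # steps S74/S75
                if moved:                                   # N side: motion
                    i, j = moved[0]                         # steps S76-S78
                    rotate(*pan_tilt(left.shape[1] // 2, left.shape[0] // 2,
                                     j * left.shape[1] // 8,
                                     i * left.shape[0] // 8))   # step S79
                else:                                       # Y side: stationary
                    valid = disp[disp > 0]                  # step S710: outliers
                    if valid.size:
                        d = 0.15 * 700.0 / np.median(valid)  # Z = B*f/d
                        amounts = control_amounts(d)        # steps S711-S713
                        if amounts:
                            zoom_baseline(*amounts)         # step S714
            prev_left = left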
 The second embodiment will be described with reference to FIG. 8, which shows a configuration example of the stereo camera device 100 according to the second embodiment of the present invention.
 In the second embodiment, the functions from the real video input unit 10 to the control amount calculation unit 17 in the configuration shown in FIG. 1 are the same as in the first embodiment, and their description is therefore omitted.
 In the second embodiment, the stereo camera unit 101, which is the mechanical mechanism, and the processing unit 102, which provides the electronic processing functions, are communicably connected to each other via a network 81. Calibration is performed on the cameras on the network 81, the distance to the object measured by the cameras is calculated, and the calculated distance is then used to control the cameras via the network 81. This camera system configuration thus allows remote processing, making it possible to measure remotely regardless of the installation or measurement location.
 The third embodiment will be described with reference to FIG. 9, which shows the stereo camera device 100 according to the third embodiment of the present invention.
 According to the third embodiment, when the camera is controlled to scan a wide range, the accuracy of the scanned distances can be improved.
 With a single conventional camera, for example, accuracy is high at short distances and decreases with distance. In the third embodiment, one camera first scans the near region 94 with camera spec 91. The camera spec is then changed to 92 and the far region 95 is scanned. Likewise, the camera spec is changed to 93 and the still farther region 96 is scanned. By integrating the scan results, a result for the entire region 97 is obtained, in which the distance accuracy is high over all regions. This enables various future applications for autonomous driving, such as map construction and environment scanning.
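 One way to realize the integration of the three scans, hedged as an illustration (the accurate-range bounds per camera spec and the keep-nearest merge rule are assumptions):

    def integrate_scans(scans):
        """scans: list of ((z_min, z_max), {cell: distance}) from near to far.
        Keep each cell's measurement from the spec whose accurate range covers
        it, preferring the nearer, more accurate scan."""
        merged = {}
        for (z_min, z_max), cells in scans:
            for cell, z in cells.items():
                if cell not in merged and z_min <= z < z_max:
                    merged[cell] = z
        return merged

    # Example: spec 91 accurate to 5 m, spec 92 to 15 m, spec 93 beyond
    result = integrate_scans([
        ((0.0, 5.0),   {(0, 0): 2.1, (0, 1): 7.0}),   # region 94
        ((5.0, 15.0),  {(0, 1): 7.2}),                # region 95
        ((15.0, 1e9),  {(1, 1): 22.0}),               # region 96
    ])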
 The fourth embodiment will be described with reference to FIG. 10, which shows the stereo camera device 100 according to the fourth embodiment of the present invention. The fourth embodiment tracks a sensed object while controlling the camera so as to measure it appropriately. Here, a moving object M such as a person or a car is the target object to be measured. The movement path of the sensed object M is g, along which it gradually recedes from a short-distance position. The sensing system tracks the target object M while controlling the pan, tilt, zoom, and baseline of the camera. This makes it possible to measure a wide region, from close positions to very distant ones, with a single stereo camera, and to sense appropriately according to the movement of the target.
 In the example of FIG. 10, at the close position the zoom is 1×, the baseline 15 cm, and the angle 10°; at the middle-distance position the zoom is 5×, the baseline 20 cm, and the angle 20°; and at the far position the zoom is 10×, the baseline 25 cm, and the angle 10°.
 The fifth embodiment will be described with reference to FIG. 11, which shows the stereo camera device 100 according to the fifth embodiment of the present invention. The fifth embodiment is an example of a sensing system.
 In this system, the stereo camera 101 and the computer 111 are connected via a network 81. Images are sent to the computer 111 via the network 81, and the computer 111 performs the processing shown in FIG. 1 using the received images. The zoom control unit 115 controls the focal length of the cameras, the baseline control unit 116 controls the baseline stage 113, and the rotation control unit 117 controls the rotation stage 114. These units are connected to the computer 111 and perform their respective control according to the processing results.
 Next, the temporal relationship of the processing of each unit in the configuration of FIG. 1 will be described. Here, the image acquisition unit 11 of the camera acquires images at, for example, 30 frames per second, and the present moment is assumed to be the timing at which the image processing of the second frame is about to be performed.
 The image of the second frame is given from the image acquisition unit 11 to the feature point selection unit 13. At this time, the distance calculation unit 16 detects the distance to the object from the left and right images and the movement of the target object in the images, and when there is movement it acts, via the control amount calculation unit 17 and the camera control unit 12, to set a new camera position. The distance calculation unit 16 also refers to the calibration data database DB1 to obtain the calibration data D2 used for correction when calculating the distance; this latest calibration data D2 was obtained, at the latest, in the image processing of the first frame by the feature point selection unit 13 and the calibration calculation unit 14. That is, the distance calculation unit 16 calculates the distance from the image at the current time and the latest calibration data D2 obtained by processing at a past time. In the processing of the second frame, the calibration data obtained from the image acquired in the second frame is stored in the calibration data database DB1, and it is used in the processing of the distance calculation unit 16 in the third frame.
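 The timing described here amounts to a one-frame pipeline delay between computing calibration data and using it; a compact sketch, with calibrate and measure_distance as stand-ins for the processing of units 13-14 and 16:

    def process_frames(frames, calibrate, measure_distance):
        """Each frame is measured with calibration data computed on an earlier
        frame; the data the frame itself produces is stored for the next one."""
        db1 = None                                   # calibration data database
        for frame in frames:
            if db1 is not None:
                distance = measure_distance(frame, db1)   # past calibration
            db1 = calibrate(frame)                   # used from the next frame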
 The calibration data is calculated at the timing at which the camera is controlled. This makes it possible to calculate highly accurate distance data, to control the camera appropriately according to the current distance of the measurement target, and to continue measuring.
 Through the above, the stereo camera device 100 can calibrate to the latest situation and reflect it in the distance-based control.
10: real video input unit, 11: image acquisition unit, 12: camera control unit, 13: feature point selection unit, 14: calibration calculation unit, DB1: calibration data database, 16: distance calculation unit, 17: control amount calculation unit, 21: zoom control unit, 22: baseline control unit, 23: rotation control unit, 24: tilt control unit, 25: pan control unit, 26: zoom control value, 27: baseline control value, 28: tilt control value, 29: pan control value, 44: distortion correction unit, 45: parallelization unit, 46: distance calculation unit, P1: distance data, 81: network, 91, 92, 93: camera specs, 94, 95, 96: scan regions, 97: integrated scan result, 100: stereo camera device, M: measurement target, g: movement path, 111: computer, 101: stereo camera, 113: baseline stage, 114: rotation stage, 115: zoom control unit, 116: baseline control unit, 117: rotation control unit

Claims (14)

1. A stereo measurement device that processes left and right image information provided by left and right cameras from a stereo camera unit including a camera control unit for the left and right cameras, the stereo measurement device comprising: a feature point selection unit that selects feature points from image scenes of the left and right image information; a calibration calculation unit that performs calibration using the selected feature points and calculates calibration data; a distance calculation unit that calculates a distance to a target object to be sensed using the calculated calibration data and the left and right image information; and a control amount calculation unit that calculates, from the calculated distance and the image information, a control amount for controlling the camera control unit of the left and right cameras, wherein the control amount is given to the camera control unit of the left and right cameras of the stereo camera unit.
2. The stereo measurement device according to claim 1, wherein the camera control unit of the left and right cameras of the stereo camera unit comprises at least control units for zoom, baseline, and rotation, and the control amount calculation unit calculates at least control amounts for zoom, baseline, and rotation and gives them to the camera control unit.
3. The stereo measurement device according to claim 2, wherein the control unit for rotation includes a tilt control unit and a pan control unit.
4. The stereo measurement device according to any one of claims 1 to 3, wherein the feature point selection unit that selects feature points from the image scenes of the left and right image information extracts feature points from a plurality of image scenes and selects effective feature points from the extracted feature points.
5. The stereo measurement device according to any one of claims 1 to 4, wherein the distance calculation unit that calculates the distance to the target object to be sensed includes a distortion correction unit that corrects distortion of the camera images and a parallelization unit that rectifies the left and right cameras, and calculates the distance to the target object to be sensed from the image information after the rectification processing of the left and right cameras.
6. The stereo measurement device according to claim 2 or claim 3, wherein the control amount calculation unit, in calculating the control amounts for zoom and baseline, calculates an average distance to the target object to be sensed using the distance data measured by the distance calculation unit, and calculates the control amounts for zoom and baseline.
7. The stereo measurement device according to claim 2 or claim 3, wherein the control amount calculation unit, in calculating the control amounts for zoom and baseline, when increasing the accuracy of the calculated control amounts, evaluates the stability of the distance data measured by the distance calculation unit, calculates an average distance to the target object to be sensed using the distance data judged to be stable, and calculates the control amounts for zoom and baseline.
8. The stereo measurement device according to claim 2 or claim 3, wherein the control amount calculation unit, in calculating the control amount for rotation, calculates, when there is region movement of the target object to be sensed between small regions into which the image region is divided in advance, the movement amount thereof and performs rotation control.
9. The stereo measurement device according to claim 2 or claim 3, wherein the control amount calculation unit, in calculating the control amounts for zoom and baseline, evaluates the stability of the measured distance data, calculates an average distance to the target object to be sensed using the distance data judged to be stable, and calculates the control amounts for zoom and baseline; the control amount calculation unit, in calculating the control amount for rotation, calculates, when there is region movement of the target object to be sensed between small regions into which the image region is divided in advance, the movement amount thereof and calculates the control amount for rotation control; and the control amount calculation processing for zoom and baseline or the control amount calculation processing for rotation control is selected based on the inter-frame difference of the left and right cameras.
10. The stereo measurement device according to any one of claims 1 to 9, wherein the stereo camera unit and the processing unit are connected via the Internet.
11. The stereo measurement device according to any one of claims 1 to 10, wherein the stereo camera unit holds data of a plurality of camera specs for a plurality of different distances, and changes the camera spec according to the distance to obtain the left and right image information.
12. The stereo measurement device according to any one of claims 1 to 11, wherein the left and right cameras are asynchronous cameras, the camera control unit of the left and right cameras of the stereo camera unit comprises at least control units for zoom, baseline, and rotation, and the control amount calculation unit calculates at least control amounts for zoom, baseline, and rotation and gives them to the camera control unit, and wherein, for the images from the left and right cameras, when there is no inter-frame difference between the left and right cameras, the scene is judged to be stationary and the control amounts for zoom and baseline are calculated, and when there is an inter-frame difference between the left and right cameras, it is judged to be moving and the control amount for rotation is calculated.
13. The stereo measurement device according to any one of claims 1 to 12, wherein the camera control unit of the left and right cameras of the stereo camera unit comprises at least control units for zoom, baseline, and rotation, and the control amount calculation unit calculates at least control amounts for zoom, baseline, and rotation and gives them to the camera control unit, and wherein, for a moving object within the field of view of the left and right cameras, the control amounts for zoom, baseline, and rotation are calculated and given to the camera control unit, thereby tracking the moving object.
14. A stereo measurement system comprising the stereo measurement device according to any one of claims 1 to 13 and a stereo camera unit that provides left and right image information by means of left and right cameras and includes a camera control unit for the left and right cameras.
PCT/JP2017/023159 2017-06-23 2017-06-23 Stereo measurement device and system WO2018235256A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2019524821A JP6734994B2 (en) 2017-06-23 2017-06-23 Stereo measuring device and system
PCT/JP2017/023159 WO2018235256A1 (en) 2017-06-23 2017-06-23 Stereo measurement device and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/023159 WO2018235256A1 (en) 2017-06-23 2017-06-23 Stereo measurement device and system

Publications (1)

Publication Number Publication Date
WO2018235256A1 true WO2018235256A1 (en) 2018-12-27

Family

ID=64737655

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/023159 WO2018235256A1 (en) 2017-06-23 2017-06-23 Stereo measurement device and system

Country Status (2)

Country Link
JP (1) JP6734994B2 (en)
WO (1) WO2018235256A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111915638A (en) * 2019-05-09 2020-11-10 东芝泰格有限公司 Tracking device, information processing method, readable storage medium, and electronic apparatus
CN111915656A (en) * 2019-05-09 2020-11-10 东芝泰格有限公司 Tracking device, information processing method, readable storage medium, and electronic apparatus

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004096488A (en) * 2002-08-30 2004-03-25 Fujitsu Ltd Object detection apparatus, object detection method and object detection program
JP2004310595A (en) * 2003-04-09 2004-11-04 Ntt Data Corp Animal body detecting device, and animal body detecting method
JP2007263669A (en) * 2006-03-28 2007-10-11 Denso It Laboratory Inc Three-dimensional coordinates acquisition system
JP2010050582A (en) * 2008-08-20 2010-03-04 Tokyo Institute Of Technology Camera system for probing long distance target
JP2012103741A (en) * 2010-11-05 2012-05-31 Sony Corp Imaging apparatus, image processing apparatus, image processing method and program
JP2013105002A (en) * 2011-11-14 2013-05-30 Bi2−Vision株式会社 3d video photography control system, 3d video photography control method, and program
JP2016099941A (en) * 2014-11-26 2016-05-30 日本放送協会 System and program for estimating position of object
JP2017040549A (en) * 2015-08-19 2017-02-23 シャープ株式会社 Image processing device and error determination method


Also Published As

Publication number Publication date
JPWO2018235256A1 (en) 2020-05-21
JP6734994B2 (en) 2020-08-05

Similar Documents

Publication Publication Date Title
US10455141B2 (en) Auto-focus method and apparatus and electronic device
CN109922251B (en) Method, device and system for quick snapshot
KR102143456B1 (en) Depth information acquisition method and apparatus, and image collection device
KR102149276B1 (en) Method of image registration
JP4852591B2 (en) Stereoscopic image processing apparatus, method, recording medium, and stereoscopic imaging apparatus
CN106960454B (en) Depth of field obstacle avoidance method and equipment and unmanned aerial vehicle
US20070189750A1 (en) Method of and apparatus for simultaneously capturing and generating multiple blurred images
US8648961B2 (en) Image capturing apparatus and image capturing method
KR101095361B1 (en) Imaging apparatus, imaging control method, and recording medium
KR20150050172A (en) Apparatus and Method for Selecting Multi-Camera Dynamically to Track Interested Object
US9619886B2 (en) Image processing apparatus, imaging apparatus, image processing method and program
US20130162786A1 (en) Image processing apparatus, imaging apparatus, image processing method, and program
JP2002298142A (en) Person image detecting method, storage medium recording program for executing the method, person image detecting device, and image pick-up device having this device
JP6694281B2 (en) Stereo camera and imaging system
JP7378219B2 (en) Imaging device, image processing device, control method, and program
JP6694234B2 (en) Distance measuring device
JP2011185720A (en) Distance obtaining device
JP6734994B2 (en) Stereo measuring device and system
US20130093856A1 (en) Stereoscopic imaging digital camera and method of controlling operation of same
US20130076868A1 (en) Stereoscopic imaging apparatus, face detection apparatus and methods of controlling operation of same
JP2019062340A (en) Image shake correction apparatus and control method
WO2016047220A1 (en) Imaging device and imaging method
JP2018205008A (en) Camera calibration device and camera calibration method
CN111080689B (en) Method and device for determining face depth map
WO2019130409A1 (en) Stereo measurement device and method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17915178

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019524821

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17915178

Country of ref document: EP

Kind code of ref document: A1