CN104656683B - Binocular vision area target depth information extraction and cross section analysis system and method - Google Patents


Info

Publication number
CN104656683B
CN104656683B (application CN201510021300.2A)
Authority
CN
China
Prior art keywords
camera
motor
target
information
driver
Prior art date
Legal status
Expired - Fee Related
Application number
CN201510021300.2A
Other languages
Chinese (zh)
Other versions
CN104656683A (en)
Inventor
王孙安
陈先益
邸宏宇
程元皓
王冰心
Current Assignee
Xian Jiaotong University
Original Assignee
Xian Jiaotong University
Priority date
Filing date
Publication date
Application filed by Xian Jiaotong University filed Critical Xian Jiaotong University
Priority to CN201510021300.2A priority Critical patent/CN104656683B/en
Publication of CN104656683A publication Critical patent/CN104656683A/en
Application granted granted Critical
Publication of CN104656683B publication Critical patent/CN104656683B/en

Landscapes

  • Studio Devices (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a binocular vision area target depth information extraction and cross-section analysis system and method. The system comprises a left bracket, a right bracket, a first driver, a second driver, a third driver, a fourth driver, a fifth driver, a control board, a first stepper motor, a second stepper motor, a third stepper motor, a fourth stepper motor, a fifth stepper motor, a rotating plate, a first transmission gear set, a second transmission gear set, a third transmission gear set, a left rotating shaft, a right rotating shaft, a main-shaft electronic compass, a left electronic compass, a right electronic compass, a left camera bracket, a right camera bracket, a left camera, a right camera, a left electronic gyroscope, a right electronic gyroscope, a bracket plate, a main shaft and a bottom plate, wherein the third transmission gear set consists of a first gear and a second gear, the first transmission gear set consists of a third gear and a fourth gear, and the second transmission gear set consists of a fifth gear and a sixth gear. The system and method can be used to obtain the cross-section information along a preset trajectory in a region to be detected.

Description

Binocular vision area target depth information extraction and cross-section analysis system and method
Technical field
The invention belongs to the field of motion control and relates to a binocular vision area target depth information extraction and cross-section analysis system and method.
Background technology
With the development of science and technology and the continuous progress of artificial intelligence, autonomous intelligent vehicles of all kinds are now applied in many fields, for example agricultural picking robots, space exploration robots and explosive-disposal robots. The roads they travel are often complex, unstructured roads. For an intelligent mobile robot to move through such a complex road environment, it must obtain environmental information about the road surface in advance: surface roughness (bumps and depressions of the road surface), slope information and so on. Only when sufficient information about the road surface has been acquired can the robot plan an optimal walking path and pre-estimate its walking pose, so that it can walk more safely and stably.
At present there are few techniques for analyzing complex unstructured road surfaces, in particular for predicting and analyzing road roughness and cross-section. The existing methods are broadly divided into contact and non-contact measurement. Contact measurement mainly includes multi-wheel profilographs, trailer-type bump integrators, walking-type profilometers and the like, in which part of the instrument mechanism directly contacts the road surface to detect it; these are mainly used on structured roads. Non-contact methods mainly include vehicle-mounted bump integrators, axle-head acceleration measurement based on an inertial reference, laser profilometers in which a laser displacement sensor based on an inertial reference is used together with an acceleration sensor, and inertial measurement methods in which multiple acceleration sensors are distributed in a row along the longitudinal direction of the vehicle; of these the laser profilometer is the most widely used, and these methods are likewise mainly suitable for measuring structured roads. In addition, laser scanning radar can also obtain road-surface information well and can be applied both to structured roads and to the detection of complex unstructured walking surfaces. Machine vision is also commonly used at present to acquire information about the surrounding terrain, but it is mainly used for obstacle detection or three-dimensional terrain reconstruction and is rarely used for cross-section analysis of the ground surface or terrain.
However, in the prior art both the contact and the non-contact methods are mainly used on structured roads and are unsuitable for measuring more complex unstructured roads or field roads; they cannot give an intelligent mobile robot real-time information about the road conditions ahead and can only perform statistical analysis on previously surveyed data. Because an unstructured walking environment is complex and changeable, it is difficult to analyze the region the robot is about to pass through from an existing database. Although laser scanning radar can be used to detect different road environments, it is mainly used for obstacle detection, is weak at analyzing ground roughness and road slope, and the equipment is expensive. Existing machine-vision perception of the surrounding road environment can be adapted both to structured roads and to unstructured, complex road-surface environments, but current research focuses mainly on obstacle detection and three-dimensional reconstruction of the surroundings; three-dimensional reconstruction requires processing a large amount of data, which seriously reduces the travelling efficiency of an intelligent mobile robot and makes it difficult to meet the real-time requirements of walking. If the cross-section information of the region an intelligent mobile robot is about to travel through could be obtained efficiently, its travelling efficiency could be effectively improved, but the existing technology does not provide a system or method for obtaining such regional cross-section information.
The content of the invention
The object of the invention is to overcome the shortcomings of the above prior art and to provide a binocular vision area target depth information extraction and cross-section analysis system and method which can detect the roughness of a region to be measured and obtain the cross-section information of the region to be measured.
To achieve the above object, the binocular vision area target depth information extraction and cross-section analysis system of the invention comprises a left bracket, a right bracket, a first driver, a second driver, a third driver, a fourth driver, a fifth driver, a control board, a first stepper motor, a second stepper motor, a third stepper motor, a fourth stepper motor, a fifth stepper motor, a rotating plate, a first transmission gear set, a second transmission gear set, a third transmission gear set, a left rotating shaft, a right rotating shaft, a main-shaft electronic compass, a left electronic compass, a right electronic compass, a left camera bracket, a right camera bracket, a left camera, a right camera, a left electronic gyroscope, a right electronic gyroscope, a bracket plate, a main shaft and a bottom plate; the third transmission gear set consists of a first gear and a second gear, the first transmission gear set consists of a third gear and a fourth gear, and the second transmission gear set consists of a fifth gear and a sixth gear;
The lower end of the left bracket and the lower end of the right bracket are fixedly connected to the two ends of the bottom plate respectively; the upper end of the left bracket and the upper end of the right bracket are each fixed to the bottom of the bracket plate; the third stepper motor is fixed to the bottom of the bracket plate, and its output shaft passes through the bracket plate; the first gear is sleeved on the output shaft of the third stepper motor; the lower end of the main shaft passes through the bracket plate and the upper end of the main shaft passes through the rotating plate; the second gear is sleeved on the main shaft and meshes with the first gear; the main-shaft electronic compass is fixed to the top of the main shaft; the second stepper motor and the fourth stepper motor are fixed to the bottom of the rotating plate, and the output shaft of the second stepper motor and the output shaft of the third stepper motor both pass through the rotating plate; the third gear is sleeved on the output shaft of the second stepper motor, and the fifth gear is sleeved on the output shaft of the fourth stepper motor; the lower end of the left rotating shaft and the lower end of the right rotating shaft both pass through the rotating plate; the upper end of the left rotating shaft is connected to the lower end of the left camera bracket, and the upper end of the right rotating shaft is connected to the lower end of the right camera bracket; the fourth gear is sleeved on the left rotating shaft and meshes with the third gear; the sixth gear is sleeved on the right rotating shaft and meshes with the fifth gear; the left camera bracket and the right camera bracket are both L-shaped; the left electronic compass is fixed to the upper surface of the bottom of the left camera bracket, and the right electronic compass is fixed to the upper surface of the bottom of the right camera bracket. The first stepper motor is fixed to the inner side of the upper part of the left camera bracket; its output shaft passes through the side of the upper part of the left camera bracket and is fixed to the bottom of a first U-shaped support; the left camera is fixed in the first U-shaped support, and the left electronic gyroscope is fixed to the top of the first U-shaped support. The fifth stepper motor is fixed to the inner side of the upper part of the right camera bracket; its output shaft passes through the side of the upper part of the right camera bracket and is fixed to the bottom of a second U-shaped support; the right camera is fixed in the second U-shaped support, and the right electronic gyroscope is fixed to the top of the second U-shaped support;
The output of the left camera and the output of the right camera are connected to the input of an information processor; the output of the information processor is connected to the input of the control board; the output of the control board is connected to the input of the first driver, the input of the second driver, the input of the third driver, the input of the fourth driver and the input of the fifth driver; the output of the first driver is connected to the control terminal of the first stepper motor, the output of the second driver to the control terminal of the second stepper motor, the output of the third driver to the control terminal of the third stepper motor, the output of the fourth driver to the control terminal of the fourth stepper motor, and the output of the fifth driver to the control terminal of the fifth stepper motor; the outputs of the left electronic compass, the right electronic compass, the left electronic gyroscope, the right electronic gyroscope and the main-shaft electronic compass are all connected to the input of the information processor.
The center of the left electronic gyroscope, the center of the left camera, the center of the left electronic compass and the axis of the left rotating shaft lie on the same straight line.
The center of the right electronic gyroscope, the center of the right camera, the center of the right electronic compass and the axis of the right rotating shaft lie on the same straight line.
The control board is a control board based on an STM32 chip.
The control board is fixed on the left bracket;
The first driver, the second driver, the third driver, the fourth driver and the fifth driver are fixed on the bottom plate.
The cross-section analysis method based on binocular vision multi-target recognition of the invention comprises the following steps:
1) A walking trajectory line is first preset on the region to be analyzed, several points to be measured are then selected on the preset walking trajectory line, and the first point to be measured is taken as the target to be measured;
2) The left camera and the right camera then each collect image information of the detection region and forward the image information to the information processor; the information processor obtains the position information of the target to be measured from the image information, and from this position information obtains the azimuth and pitch angle of the target to be measured relative to the left camera, the azimuth and pitch angle of the target to be measured relative to the right camera, and the common orientation angle of the target to be measured relative to the left camera and the right camera; at the same time it produces a first drive signal, a second drive signal, a third drive signal, a fourth drive signal and a fifth drive signal and forwards them to the control board; according to the first, second, third, fourth and fifth drive signals respectively, the control board drives the first stepper motor, the second stepper motor, the third stepper motor, the fourth stepper motor and the fifth stepper motor through the first driver, the second driver, the third driver, the fourth driver and the fifth driver respectively; the first stepper motor drives the left camera to rotate in the vertical direction; the second stepper motor drives the left camera to rotate in the horizontal direction through the first transmission gear set and the left rotating shaft; the third stepper motor drives the left camera and the right camera to rotate in the horizontal direction about the main shaft through the third transmission gear set and the main shaft; the fourth stepper motor drives the right camera to rotate in the horizontal direction through the second transmission gear set and the right rotating shaft; the fifth stepper motor drives the right camera to rotate in the vertical direction;
Meanwhile, the left electronic compass acquires the azimuth information of the left camera in real time and forwards it to the information processor; from this azimuth information the information processor judges whether the current azimuth of the left camera equals the azimuth of the target to be measured relative to the left camera; when it does, the second drive signal is no longer produced and the second stepper motor stops;
The left electronic gyroscope acquires the pitch angle information of the left camera in real time and forwards it to the information processor; from this pitch angle information the information processor judges whether the current pitch angle of the left camera equals the pitch angle of the target to be measured relative to the left camera; when it does, the first drive signal is no longer produced and the first stepper motor stops;
The main-shaft electronic compass acquires the common orientation angle information of the left camera and the right camera in real time and forwards it to the information processor; from this information the information processor judges whether the current common orientation angle of the left camera and the right camera equals the common orientation angle of the target to be measured relative to the left camera and the right camera; when it does, the third drive signal is no longer produced and the third stepper motor stops;
The right electronic compass acquires the azimuth information of the right camera in real time and forwards it to the information processor; from this azimuth information the information processor judges whether the current azimuth of the right camera equals the azimuth of the target to be measured relative to the right camera; when it does, the fourth drive signal is no longer produced and the fourth stepper motor stops;
The right electronic gyroscope acquires the pitch angle information of the right camera in real time and forwards it to the information processor; from this pitch angle information the information processor judges whether the current pitch angle of the right camera equals the pitch angle of the target to be measured relative to the right camera; when it does, the fifth drive signal is no longer produced and the fifth stepper motor stops;
After the first stepper motor, the second stepper motor, the third stepper motor, the fourth stepper motor and the fifth stepper motor have all stopped, the left camera and the right camera are aligned with the target to be measured (a control-loop sketch of this alignment process is given after step 5);
3) After the left camera and the right camera are aligned with the target to be measured, the left camera and the right camera collect image information of the region where the target to be measured is located and forward it to the information processor; from this image information the information processor obtains, by binocular stereo vision, the roughness of the region where the target is located and the spatial position information of the target to be measured, and then stores the spatial position information of the target to be measured;
4) The next point to be measured on the preset walking trajectory line is selected as the new target to be measured, and step 3) is repeated to obtain the spatial position information of the new target to be measured, which is then stored;
5) Step 4) is repeated to obtain the spatial position information of every point to be measured on the preset walking trajectory line; association rule analysis is then performed on the spatial position information of the points to be measured on the preset walking trajectory line to obtain the cross-section information of the region to be measured.
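The alignment process of step 2) amounts to a simple per-axis closed loop: each stepper motor keeps stepping until the angle reported by its compass or gyroscope matches the target angle computed from the image, at which point its drive signal is withheld. The Python sketch below only illustrates that loop; the helper names (read_angle, step_motor) and the tolerance value are illustrative assumptions, not the firmware of the STM32-based control board.

```python
# Minimal sketch of the camera-alignment loop described in step 2), assuming
# hypothetical helpers: read_angle(sensor) returns the current compass/gyroscope
# reading in degrees, step_motor(motor, direction) issues one step pulse through
# the corresponding driver. Not the patented control-board firmware.

ANGLE_TOLERANCE = 0.2  # degrees; stop condition for each axis (assumed value)

# Each axis pairs one stepper motor with the sensor that provides its feedback.
AXES = {
    "left_pitch":    ("motor_1", "left_gyroscope"),
    "left_azimuth":  ("motor_2", "left_compass"),
    "common_yaw":    ("motor_3", "main_shaft_compass"),
    "right_azimuth": ("motor_4", "right_compass"),
    "right_pitch":   ("motor_5", "right_gyroscope"),
}

def align_cameras(target_angles, read_angle, step_motor):
    """Step every axis until its sensor reading matches the target angle."""
    done = set()
    while len(done) < len(AXES):
        for axis, (motor, sensor) in AXES.items():
            if axis in done:
                continue  # this motor's drive signal is no longer produced
            error = target_angles[axis] - read_angle(sensor)
            if abs(error) <= ANGLE_TOLERANCE:
                done.add(axis)  # angle reached: stop this stepper motor
            else:
                step_motor(motor, +1 if error > 0 else -1)
    # When all five motors have stopped, both cameras point at the target.
```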
The invention has the following advantages:
When obtaining the cross-section information of a region to be measured, the binocular vision area target depth information extraction and cross-section analysis system and method of the invention first draw a preset walking trajectory line on the region to be analyzed and select a number of points to be measured on it, and then treat each point in turn as the target to be measured and perform the following operations. The pitch angle and azimuth of the left camera are adjusted by the first stepper motor and the second stepper motor respectively, while the left electronic gyroscope and the left electronic compass acquire the current pitch angle and azimuth of the left camera in real time; when the current pitch angle and azimuth of the left camera equal the pitch angle and azimuth of the target relative to the left camera, the first stepper motor and the second stepper motor stop. The azimuth and pitch angle of the right camera are adjusted by the fourth stepper motor and the fifth stepper motor, while the right electronic compass and the right electronic gyroscope acquire the current azimuth and pitch angle of the right camera in real time; when these equal the azimuth and pitch angle of the target relative to the right camera, the fourth stepper motor and the fifth stepper motor stop. At the same time the common orientation angle of the left camera and the right camera is adjusted by the third stepper motor, while the main-shaft electronic compass acquires the current common orientation angle in real time; when it equals the common orientation angle of the target relative to the left camera and the right camera, the third stepper motor stops, so that the left camera and the right camera are aligned with the target to be measured. The information processor then obtains the spatial position information of the target from the image information of the region where the target is located collected by the left camera and the right camera. Finally, the cross-section information of the region to be analyzed is obtained directly from the spatial position information of the points to be measured on the preset walking trajectory line, which satisfies the pose pre-estimation capability required for the real-time walking of an intelligent mobile robot. The operation is simple and the practicality is very strong.
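To make the binocular ranging step concrete, the sketch below recovers a target's spatial position from the two cameras' azimuth and pitch angles and the baseline between the left and right rotating shafts. The coordinate frame, the angle conventions and the function name triangulate are illustrative assumptions; the patent itself only states that the spatial position is obtained by binocular stereo vision.

```python
import math

def triangulate(baseline, az_left, az_right, pitch_left):
    """Estimate the 3D position of a target seen by two aligned cameras.

    baseline   : distance between the left and right camera rotation axes (m)
    az_left    : azimuth of the target from the left camera (rad, positive = to the right)
    az_right   : azimuth of the target from the right camera (rad)
    pitch_left : pitch angle of the target from the left camera (rad)

    Assumed frame: origin midway between the cameras, x to the right along the
    baseline, y forward, z up.
    """
    xl, xr = -baseline / 2.0, baseline / 2.0
    tl, tr = math.tan(az_left), math.tan(az_right)
    if abs(tl - tr) < 1e-9:
        raise ValueError("bearing rays are parallel; angles inconsistent or target too far")
    # Intersect the two horizontal bearing rays x = x_cam + y * tan(azimuth).
    y = (xr - xl) / (tl - tr)                 # forward distance
    x = xl + y * tl                           # lateral offset
    horizontal_range = math.hypot(x - xl, y)  # range from the left camera in the x-y plane
    z = horizontal_range * math.tan(pitch_left)  # height from the left camera's pitch
    return x, y, z

# Example: cameras 0.3 m apart; target about 1.9 m ahead, slightly left and above.
print(triangulate(0.3, math.radians(3.0), math.radians(-6.0), math.radians(5.0)))
```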
Description of the drawings
Fig. 1 is a schematic structural diagram of the present invention;
Fig. 2 shows the cross-section of the region to be measured analyzed by the present invention.
In the figures, 1 is the left bracket, 2 is the first driver, 3 is the second driver, 4 is the third driver, 5 is the control board, 6 is the third stepper motor, 7 is the second stepper motor, 8 is the rotating plate, 9 is the first transmission gear set, 10 is the left rotating shaft, 11 is the main-shaft electronic compass, 12 is the left electronic compass, 13 is the left camera bracket, 14 is the left camera, 15 is the first stepper motor, 16 is the left electronic gyroscope, 17 is the right electronic gyroscope, 18 is the fifth stepper motor, 19 is the right camera, 20 is the right camera bracket, 21 is the right electronic compass, 22 is the right rotating shaft, 23 is the second transmission gear set, 24 is the third transmission gear set, 25 is the fourth stepper motor, 26 is the bracket plate, 27 is the main shaft, 28 is the fourth driver, 29 is the fifth driver, 30 is the bottom plate, and 31 is the right bracket.
Specific embodiment
The present invention is described in further detail below with reference to the accompanying drawings:
With reference to Fig. 1, the binocular vision area target depth information extraction and cross-section analysis system of the invention comprises a left bracket 1, a right bracket 31, a first driver 2, a second driver 3, a third driver 4, a fourth driver 28, a fifth driver 29, a control board 5, a first stepper motor 15, a second stepper motor 7, a third stepper motor 6, a fourth stepper motor 25, a fifth stepper motor 18, a rotating plate 8, a first transmission gear set 9, a second transmission gear set 23, a third transmission gear set 24, a left rotating shaft 10, a right rotating shaft 22, a main-shaft electronic compass 11, a left electronic compass 12, a right electronic compass 21, a left camera bracket 13, a right camera bracket 20, a left camera 14, a right camera 19, a left electronic gyroscope 16, a right electronic gyroscope 17, a bracket plate 26, a main shaft 27 and a bottom plate 30. The third transmission gear set 24 consists of a first gear and a second gear, the first transmission gear set 9 consists of a third gear and a fourth gear, and the second transmission gear set 23 consists of a fifth gear and a sixth gear. The lower end of the left bracket 1 and the lower end of the right bracket 31 are fixedly connected to the two ends of the bottom plate 30 respectively; the upper end of the left bracket 1 and the upper end of the right bracket 31 are each fixed to the bottom of the bracket plate 26; the third stepper motor 6 is fixed to the bottom of the bracket plate 26, and its output shaft passes through the bracket plate 26; the first gear is sleeved on the output shaft of the third stepper motor 6; the lower end of the main shaft 27 passes through the bracket plate 26 and its upper end passes through the rotating plate 8; the second gear is sleeved on the main shaft 27 and meshes with the first gear; the main-shaft electronic compass 11 is fixed to the top of the main shaft 27; the second stepper motor 7 and the fourth stepper motor 25 are fixed to the bottom of the rotating plate 8, and the output shaft of the second stepper motor 7 and the output shaft of the third stepper motor 6 both pass through the rotating plate; the third gear is sleeved on the output shaft of the second stepper motor 7, and the fifth gear is sleeved on the output shaft of the fourth stepper motor 25; the lower ends of the left rotating shaft 10 and the right rotating shaft 22 both pass through the rotating plate 8; the upper end of the left rotating shaft 10 is connected to the lower end of the left camera bracket 13, and the upper end of the right rotating shaft 22 is connected to the lower end of the right camera bracket 20; the fourth gear is sleeved on the left rotating shaft 10 and meshes with the third gear; the sixth gear is sleeved on the right rotating shaft 22 and meshes with the fifth gear; the left camera bracket 13 and the right camera bracket 20 are both L-shaped; the left electronic compass 12 is fixed to the upper surface of the bottom of the left camera bracket 13, and the right electronic compass 21 is fixed to the upper surface of the bottom of the right camera bracket 20. The first stepper motor 15 is fixed to the inner side of the upper part of the left camera bracket 13; its output shaft passes through the side of the upper part of the left camera bracket 13 and is fixed to the bottom of a first U-shaped support; the left camera 14 is fixed in the first U-shaped support, and the left electronic gyroscope 16 is fixed to the top of the first U-shaped support. The fifth stepper motor 18 is fixed to the inner side of the upper part of the right camera bracket 20; its output shaft passes through the side of the upper part of the right camera bracket 20 and is fixed to the bottom of a second U-shaped support; the right camera 19 is fixed in the second U-shaped support, and the right electronic gyroscope 17 is fixed to the top of the second U-shaped support. The output of the left camera 14 and the output of the right camera 19 are connected to the input of the information processor; the output of the information processor is connected to the input of the control board 5; the output of the control board 5 is connected to the inputs of the first driver 2, the second driver 3, the third driver 4, the fourth driver 28 and the fifth driver 29; the output of the first driver 2 is connected to the control terminal of the first stepper motor 15, the output of the second driver 3 to the control terminal of the second stepper motor 7, the output of the third driver 4 to the control terminal of the third stepper motor 6, the output of the fourth driver 28 to the control terminal of the fourth stepper motor 25, and the output of the fifth driver 29 to the control terminal of the fifth stepper motor 18; the outputs of the left electronic compass 12, the right electronic compass 21, the left electronic gyroscope 16, the right electronic gyroscope 17 and the main-shaft electronic compass 11 are all connected to the input of the information processor. The left electronic gyroscope 16, the left camera 14 and the left electronic compass 12 lie on the same straight line; the right electronic gyroscope 17, the right camera 19 and the right electronic compass 21 lie on the same straight line. The control board 5 is a control board based on an STM32 chip and is fixed on the left bracket 1. The first driver 2, the second driver 3, the third driver 4, the fourth driver 28 and the fifth driver 29 are fixed on the bottom plate 30. It should be noted that the left rotating shaft 10 and the right rotating shaft 22 are each fastened to the rotating plate 8 by two nuts sleeved on them, and the main shaft 27 is fastened to the bracket plate 26 by two nuts.
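Because each camera axis is driven through a transmission gear set, the control board has to convert a desired camera rotation into a number of step pulses. The following sketch shows that conversion under assumed example values (1.8 degree step angle, 16x microstepping, 3:1 gear reduction); the actual step angle and tooth counts are not specified in the patent.

```python
# Illustrative only: converting a desired camera rotation into stepper pulses.
# The step angle, microstepping and gear ratio below are assumed example values.

STEP_ANGLE_DEG = 1.8   # full-step angle of the stepper motor
MICROSTEPS = 16        # microstepping setting of the driver
GEAR_RATIO = 3.0       # assumed driven-gear : driving-gear reduction

def degrees_to_steps(delta_deg: float) -> int:
    """Number of (micro)steps the driver must issue to rotate the camera by delta_deg."""
    steps_per_motor_rev = 360.0 / STEP_ANGLE_DEG * MICROSTEPS
    motor_deg = delta_deg * GEAR_RATIO  # the motor turns GEAR_RATIO times as far as the camera
    return round(motor_deg / 360.0 * steps_per_motor_rev)

# Rotating a camera 10 degrees about its rotating shaft:
print(degrees_to_steps(10.0))  # -> 267 microsteps with the assumed values
```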
The cross-section analysis method based on binocular vision multi-target recognition of the invention comprises the following steps:
1) A walking trajectory line is first preset on the region to be analyzed, several points to be measured are then selected on the preset walking trajectory line, and the first point to be measured is taken as the target to be measured;
2) The left camera 14 and the right camera 19 then each collect image information of the detection region and forward the image information to the information processor; the information processor obtains the position information of the target to be measured from the image information, and from this position information obtains the azimuth and pitch angle of the target to be measured relative to the left camera 14, the azimuth and pitch angle of the target to be measured relative to the right camera 19, and the common orientation angle of the target to be measured relative to the left camera 14 and the right camera 19; at the same time it produces a first drive signal, a second drive signal, a third drive signal, a fourth drive signal and a fifth drive signal and forwards them to the control board 5; according to the first, second, third, fourth and fifth drive signals respectively, the control board 5 drives the first stepper motor 15, the second stepper motor 7, the third stepper motor 6, the fourth stepper motor 25 and the fifth stepper motor 18 through the first driver 2, the second driver 3, the third driver 4, the fourth driver 28 and the fifth driver 29 respectively; the first stepper motor 15 drives the left camera 14 to rotate in the vertical direction; the second stepper motor 7 drives the left camera 14 to rotate in the horizontal direction through the first transmission gear set and the left rotating shaft 10; the third stepper motor 6 drives the left camera 14 and the right camera 19 to rotate in the horizontal direction about the main shaft 27 through the third transmission gear set and the main shaft 27; the fourth stepper motor 25 drives the right camera 19 to rotate in the horizontal direction through the second transmission gear set and the right rotating shaft 22; the fifth stepper motor 18 drives the right camera 19 to rotate in the vertical direction;
Meanwhile, the left electronic compass 12 acquires the azimuth information of the left camera 14 in real time and forwards it to the information processor; from this azimuth information the information processor judges whether the current azimuth of the left camera 14 equals the azimuth of the target to be measured relative to the left camera 14; when it does, the second drive signal is no longer produced and the second stepper motor 7 stops;
The left electronic gyroscope 16 acquires the pitch angle information of the left camera 14 in real time and forwards it to the information processor; from this pitch angle information the information processor judges whether the current pitch angle of the left camera 14 equals the pitch angle of the target to be measured relative to the left camera 14; when it does, the first drive signal is no longer produced and the first stepper motor 15 stops;
The main-shaft electronic compass 11 acquires the common orientation angle information of the left camera 14 and the right camera 19 in real time and forwards it to the information processor; from this information the information processor judges whether the current common orientation angle of the left camera 14 and the right camera 19 equals the common orientation angle of the target to be measured relative to the left camera 14 and the right camera 19; when it does, the third drive signal is no longer produced and the third stepper motor 6 stops;
The right electronic compass 21 acquires the azimuth information of the right camera 19 in real time and forwards it to the information processor; from this azimuth information the information processor judges whether the current azimuth of the right camera 19 equals the azimuth of the target to be measured relative to the right camera 19; when it does, the fourth drive signal is no longer produced and the fourth stepper motor 25 stops;
The right electronic gyroscope 17 acquires the pitch angle information of the right camera 19 in real time and forwards it to the information processor; from this pitch angle information the information processor judges whether the current pitch angle of the right camera 19 equals the pitch angle of the target to be measured relative to the right camera 19; when it does, the fifth drive signal is no longer produced and the fifth stepper motor 18 stops;
After the first stepper motor 15, the second stepper motor 7, the third stepper motor 6, the fourth stepper motor 25 and the fifth stepper motor 18 have all stopped, the left camera 14 and the right camera 19 are aligned with the target to be measured;
3) After the left camera 14 and the right camera 19 are aligned with the target to be measured, the left camera 14 and the right camera 19 collect image information of the region where the target to be measured is located and forward it to the information processor; from this image information the information processor obtains, by binocular stereo vision, the roughness of the region where the target is located and the spatial position information of the target to be measured, and then stores the spatial position information of the target to be measured;
4) The next point to be measured on the preset walking trajectory line is selected as the new target to be measured, and step 3) is repeated to obtain the spatial position information of the new target to be measured, which is then stored;
5) Step 4) is repeated to obtain the spatial position information of every point to be measured on the preset walking trajectory line; association rule analysis is then performed on the spatial position information of the points to be measured on the preset walking trajectory line to obtain the cross-section information of the region to be measured.
It should be noted that when the cross-section information of the region to be analyzed is obtained from the spatial position information of the points to be measured on the preset walking trajectory line, an association rule algorithm is applied to the target points, so that the cross-section information along the preset trajectory in the region to be analyzed is obtained. The invention uses a machine-vision target detection and recognition algorithm to analyze and detect the roughness of the region, tracks and locates the points on the preset trajectory line within the region, and calculates the spatial position of each acquired target point using the binocular ranging principle, thereby obtaining the roughness of the region where the target to be measured is located and the spatial position information of the targets on the trajectory.
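For illustration, once the spatial positions of all measurement points on the preset trajectory have been stored, a cross-section profile can be assembled by ordering the points along the trajectory and pairing the travelled distance with the measured height; the slope between consecutive points follows directly. This is a minimal sketch of one plausible post-processing step, with the function name build_cross_section chosen here for illustration; it is not the association-rule algorithm itself, which the patent does not spell out.

```python
import math

def build_cross_section(points):
    """Turn stored 3D target positions into a cross-section profile.

    points : list of (x, y, z) positions of the measurement points, in the
             order they were selected along the preset walking trajectory.
    Returns a list of (distance_along_trajectory, height, slope_to_next) tuples.
    """
    profile = []
    distance = 0.0
    for i, (x, y, z) in enumerate(points):
        if i > 0:
            px, py, _ = points[i - 1]
            distance += math.hypot(x - px, y - py)  # horizontal distance travelled
        if i < len(points) - 1:
            nx, ny, nz = points[i + 1]
            run = math.hypot(nx - x, ny - y)
            slope = (nz - z) / run if run > 0 else 0.0  # rise over horizontal run
        else:
            slope = 0.0
        profile.append((distance, z, slope))
    return profile

# Example: four points along a trajectory with a small bump in the middle.
pts = [(0.0, 0.0, 0.00), (0.0, 0.5, 0.03), (0.0, 1.0, 0.05), (0.0, 1.5, 0.01)]
for d, h, s in build_cross_section(pts):
    print(f"s = {d:.2f} m, height = {h:.3f} m, slope = {s:+.3f}")
```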

Claims (6)

1. A binocular vision area target depth information extraction and cross-section analysis system, characterized by comprising a left bracket (1), a right bracket (31), a first driver (2), a second driver (3), a third driver (4), a fourth driver (28), a fifth driver (29), a control board (5), a first stepper motor (15), a second stepper motor (7), a third stepper motor (6), a fourth stepper motor (25), a fifth stepper motor (18), a rotating plate (8), a first transmission gear set (9), a second transmission gear set (23), a third transmission gear set (24), a left rotating shaft (10), a right rotating shaft (22), a main-shaft electronic compass (11), a left electronic compass (12), a right electronic compass (21), a left camera bracket (13), a right camera bracket (20), a left camera (14), a right camera (19), a left electronic gyroscope (16), a right electronic gyroscope (17), a bracket plate (26), a main shaft (27) and a bottom plate (30), wherein the third transmission gear set (24) consists of a first gear and a second gear, the first transmission gear set (9) consists of a third gear and a fourth gear, and the second transmission gear set (23) consists of a fifth gear and a sixth gear;
the lower end of the left bracket (1) and the lower end of the right bracket (31) are fixedly connected to the two ends of the bottom plate (30) respectively; the upper end of the left bracket (1) and the upper end of the right bracket (31) are each fixed to the bottom of the bracket plate (26); the third stepper motor (6) is fixed to the bottom of the bracket plate (26), and its output shaft passes through the bracket plate (26); the first gear is sleeved on the output shaft of the third stepper motor (6); the lower end of the main shaft (27) passes through the bracket plate (26), and the upper end of the main shaft (27) passes through the rotating plate (8); the second gear is sleeved on the main shaft (27) and meshes with the first gear; the main-shaft electronic compass (11) is fixed to the top of the main shaft (27); the second stepper motor (7) and the fourth stepper motor (25) are fixed to the bottom of the rotating plate (8), and the output shaft of the second stepper motor (7) and the output shaft of the third stepper motor (6) both pass through the rotating plate (8); the third gear is sleeved on the output shaft of the second stepper motor (7), and the fifth gear is sleeved on the output shaft of the fourth stepper motor (25); the lower end of the left rotating shaft (10) and the lower end of the right rotating shaft (22) both pass through the rotating plate (8); the upper end of the left rotating shaft (10) is connected to the lower end of the left camera bracket (13), and the upper end of the right rotating shaft (22) is connected to the lower end of the right camera bracket (20); the fourth gear is sleeved on the left rotating shaft (10) and meshes with the third gear; the sixth gear is sleeved on the right rotating shaft (22) and meshes with the fifth gear; the left camera bracket (13) and the right camera bracket (20) are both L-shaped; the left electronic compass (12) is fixed to the upper surface of the bottom of the left camera bracket (13), and the right electronic compass (21) is fixed to the upper surface of the bottom of the right camera bracket (20); the first stepper motor (15) is fixed to the inner side of the upper part of the left camera bracket (13), and its output shaft passes through the side of the upper part of the left camera bracket (13) and is fixed to the bottom of a first U-shaped support; the left camera (14) is fixed in the first U-shaped support, and the left electronic gyroscope (16) is fixed to the top of the first U-shaped support; the fifth stepper motor (18) is fixed to the inner side of the upper part of the right camera bracket (20), and its output shaft passes through the side of the upper part of the right camera bracket (20) and is fixed to the bottom of a second U-shaped support; the right camera (19) is fixed in the second U-shaped support, and the right electronic gyroscope (17) is fixed to the top of the second U-shaped support;
the output of the left camera (14) and the output of the right camera (19) are connected to the input of an information processor; the output of the information processor is connected to the input of the control board (5); the output of the control board (5) is connected to the input of the first driver (2), the input of the second driver (3), the input of the third driver (4), the input of the fourth driver (28) and the input of the fifth driver (29); the output of the first driver (2) is connected to the control terminal of the first stepper motor (15), the output of the second driver (3) to the control terminal of the second stepper motor (7), the output of the third driver (4) to the control terminal of the third stepper motor (6), the output of the fourth driver (28) to the control terminal of the fourth stepper motor (25), and the output of the fifth driver (29) to the control terminal of the fifth stepper motor (18); the outputs of the left electronic compass (12), the right electronic compass (21), the left electronic gyroscope (16), the right electronic gyroscope (17) and the main-shaft electronic compass (11) are all connected to the input of the information processor.
2. The binocular vision area target depth information extraction and cross-section analysis system according to claim 1, characterized in that the center of the left electronic gyroscope (16), the center of the left camera (14), the center of the left electronic compass (12) and the axis of the left rotating shaft (10) lie on the same straight line.
3. The binocular vision area target depth information extraction and cross-section analysis system according to claim 2, characterized in that the center of the right electronic gyroscope (17), the center of the right camera (19), the center of the right electronic compass (21) and the axis of the right rotating shaft (22) lie on the same straight line.
4. The binocular vision area target depth information extraction and cross-section analysis system according to claim 1, characterized in that the control board (5) is a control board based on an STM32 chip.
5. The binocular vision area target depth information extraction and cross-section analysis system according to claim 1, characterized in that:
the control board (5) is fixed on the left bracket (1);
the first driver (2), the second driver (3), the third driver (4), the fourth driver (28) and the fifth driver (29) are fixed on the bottom plate (30).
6. A cross-section analysis method based on binocular vision multi-target recognition, characterized in that, based on the binocular vision area target depth information extraction and cross-section analysis system according to claim 3, it comprises the following steps:
1) a walking trajectory line is first preset on the region to be analyzed, several points to be measured are then selected on the preset walking trajectory line, and the first point to be measured is taken as the target to be measured;
2) the left camera (14) and the right camera (19) then each collect image information of the detection region and forward the image information to the information processor; the information processor obtains the position information of the target to be measured from the image information, and from this position information obtains the azimuth and pitch angle of the target to be measured relative to the left camera (14), the azimuth and pitch angle of the target to be measured relative to the right camera (19), and the common orientation angle of the target to be measured relative to the left camera (14) and the right camera (19); at the same time it produces a first drive signal, a second drive signal, a third drive signal, a fourth drive signal and a fifth drive signal and forwards them to the control board (5); according to the first, second, third, fourth and fifth drive signals respectively, the control board (5) drives the first stepper motor (15), the second stepper motor (7), the third stepper motor (6), the fourth stepper motor (25) and the fifth stepper motor (18) through the first driver (2), the second driver (3), the third driver (4), the fourth driver (28) and the fifth driver (29) respectively; the first stepper motor (15) drives the left camera (14) to rotate in the vertical direction; the second stepper motor (7) drives the left camera (14) to rotate in the horizontal direction through the first transmission gear set and the left rotating shaft (10); the third stepper motor (6) drives the left camera (14) and the right camera (19) to rotate in the horizontal direction about the main shaft (27) through the third transmission gear set and the main shaft (27); the fourth stepper motor (25) drives the right camera (19) to rotate in the horizontal direction through the second transmission gear set and the right rotating shaft (22); the fifth stepper motor (18) drives the right camera (19) to rotate in the vertical direction;
meanwhile, the left electronic compass (12) acquires the azimuth information of the left camera (14) in real time and forwards it to the information processor; from this azimuth information the information processor judges whether the current azimuth of the left camera (14) equals the azimuth of the target to be measured relative to the left camera (14); when it does, the second drive signal is no longer produced and the second stepper motor (7) stops;
the left electronic gyroscope (16) acquires the pitch angle information of the left camera (14) in real time and forwards it to the information processor; from this pitch angle information the information processor judges whether the current pitch angle of the left camera (14) equals the pitch angle of the target to be measured relative to the left camera (14); when it does, the first drive signal is no longer produced and the first stepper motor (15) stops;
the main-shaft electronic compass (11) acquires the common orientation angle information of the left camera (14) and the right camera (19) in real time and forwards it to the information processor; from this information the information processor judges whether the current common orientation angle of the left camera (14) and the right camera (19) equals the common orientation angle of the target to be measured relative to the left camera (14) and the right camera (19); when it does, the third drive signal is no longer produced and the third stepper motor (6) stops;
the right electronic compass (21) acquires the azimuth information of the right camera (19) in real time and forwards it to the information processor; from this azimuth information the information processor judges whether the current azimuth of the right camera (19) equals the azimuth of the target to be measured relative to the right camera (19); when it does, the fourth drive signal is no longer produced and the fourth stepper motor (25) stops;
the right electronic gyroscope (17) acquires the pitch angle information of the right camera (19) in real time and forwards it to the information processor; from this pitch angle information the information processor judges whether the current pitch angle of the right camera (19) equals the pitch angle of the target to be measured relative to the right camera (19); when it does, the fifth drive signal is no longer produced and the fifth stepper motor (18) stops;
after the first stepper motor (15), the second stepper motor (7), the third stepper motor (6), the fourth stepper motor (25) and the fifth stepper motor (18) have all stopped, the left camera (14) and the right camera (19) are aligned with the target to be measured;
3) after the left camera (14) and the right camera (19) are aligned with the target to be measured, the left camera (14) and the right camera (19) collect image information of the region where the target to be measured is located and forward it to the information processor; from this image information the information processor obtains, by binocular stereo vision, the roughness of the region where the target is located and the spatial position information of the target to be measured, and then stores the spatial position information of the target to be measured;
4) the next point to be measured on the preset walking trajectory line is selected as the new target to be measured, and step 3) is repeated to obtain the spatial position information of the new target to be measured, which is then stored;
5) step 4) is repeated to obtain the spatial position information of every point to be measured on the preset walking trajectory line; association rule analysis is then performed on the spatial position information of the points to be measured on the preset walking trajectory line to obtain the cross-section information of the region to be measured.
CN201510021300.2A 2015-01-15 2015-01-15 Binocular vision area target depth information extraction and cross section analysis system and method Expired - Fee Related CN104656683B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510021300.2A CN104656683B (en) 2015-01-15 2015-01-15 Binocular vision area target depth information extraction and cross section analysis system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510021300.2A CN104656683B (en) 2015-01-15 2015-01-15 Binocular vision area target depth information extraction and cross section analysis system and method

Publications (2)

Publication Number Publication Date
CN104656683A CN104656683A (en) 2015-05-27
CN104656683B true CN104656683B (en) 2017-04-26

Family

ID=53247941

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510021300.2A Expired - Fee Related CN104656683B (en) 2015-01-15 2015-01-15 Binocular vision area target depth information extraction and cross section analysis system and method

Country Status (1)

Country Link
CN (1) CN104656683B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI672675B (en) * 2017-04-10 2019-09-21 鈺立微電子股份有限公司 Depth information processing device
CN113340241B (en) * 2021-06-09 2022-12-02 河南德朗智能科技有限公司 Binocular vision concrete joint surface roughness measurement method and system


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09135464A (en) * 1995-11-08 1997-05-20 Nippon Steel Corp Stereoscopic video image display device
CN1490594A (en) * 2003-08-22 2004-04-21 湖南大学 Multiple free degree artificial threedimensional binocular vision apparatus
CN2644114Y (en) * 2003-08-22 2004-09-29 湖南大学 Imitated multidirectional stereoscopic vision device
CN1600505A (en) * 2004-10-21 2005-03-30 上海交通大学 Servo binocular vision sensors on welding robot
CN201012496Y (en) * 2006-08-22 2008-01-30 燕山大学 Parallel-connection robot binocular active vision monitoring mechanism
CN203661165U (en) * 2013-12-10 2014-06-18 吉林大学 Multi freedom degree binocular stereo vision device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Design and Implementation of an Active Stereo Binocular Vision Platform; Yu Hongshan et al.; Industrial Instrumentation & Automation, No. 1, 2004-02-05; pp. 61-63, 38 *
Research on the Design and Control Method of a Binocular Vision System; Wang Fei; China Master's Theses Full-text Database (Electronic Journal), Information Science and Technology, No. 4, 2014-04-15; I138-904 *

Also Published As

Publication number Publication date
CN104656683A (en) 2015-05-27

Similar Documents

Publication Publication Date Title
CN108731736B (en) Wall radar photoelectricity robot system is climbed automatically for bridge tunnel Structural defect non-destructive testing diagnosis
CN101685005B (en) Machine for measuring outline dimension of vehicle
CN201297927Y (en) Automobile outline size measuring machine
CN104881027B (en) Wheel-track combined Intelligent Mobile Robot active obstacle system and control method
CN109281711B (en) A kind of subterranean tunnel safety patrol inspection robot
CN102322857B (en) Position and posture measuring system and method for mechanical equipment
CN109144057A (en) A kind of guide vehicle based on real time environment modeling and autonomous path planning
CN103207403B (en) Satellite navigation and inertial measurement combined orbit measuring system and method
CN107003675A (en) The method that processing surface is surveyed and drawn for autonomous robot vehicle
CN102288121A (en) Method for measuring and pre-warning lane departure distance based on monocular vision
CN105955279B (en) A kind of method for planning path for mobile robot and device based on image vision
CN106325287A (en) Intelligent mower straight line walking control system based on inertial/magnetic sensor MARG attitude detection
ES2953679T3 (en) Method of identifying points or lines of interest on a railway track
CN108645373A (en) A kind of dynamic 3 D tunnel cross-section shape changing detection and analysis system, method and device
CN104390644A (en) Method for detecting field obstacle based on field navigation image collection equipment
CN108153306A (en) A kind of autonomous road lossless detection method of robot system
CN207216418U (en) Agricultural robot automated driving system
CN204557216U (en) Wheel-track combined Intelligent Mobile Robot active obstacle system
CN104656683B (en) Binocular vision area target depth information extraction and cross section analysis system and method
JP2000338865A (en) Data gathering device for digital road map
CN104777500B (en) Roll car travel direction and Seeding location high-precision measuring system and method
CN204228168U (en) A kind of field navigation picture collecting device
CN106168470A (en) A kind of station platform clearance survey device and method
US11047368B2 (en) Systems and methods for maintaining wind turbine blades
DE102012015188A1 (en) Actively or passively driven wheel for e.g. muscle power or motor operated locomotion unit in automobile field, has power generation device connected with sensor and interface for supplying sensor and interface with electrical power

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20170426

Termination date: 20200115

CF01 Termination of patent right due to non-payment of annual fee