CN104656683A - Binocular vision area target depth information extraction and cross section analysis system and method - Google Patents


Info

Publication number
CN104656683A
CN104656683A (application number CN201510021300.2A)
Authority
CN
China
Prior art keywords
stepper motor
camera
target
angle
driver
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510021300.2A
Other languages
Chinese (zh)
Other versions
CN104656683B (en)
Inventor
王孙安
陈先益
邸宏宇
程元皓
王冰心
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian Jiaotong University
Original Assignee
Xian Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Jiaotong University filed Critical Xian Jiaotong University
Priority to CN201510021300.2A
Publication of CN104656683A
Application granted
Publication of CN104656683B
Legal status: Expired - Fee Related
Anticipated expiration

Landscapes

  • Studio Devices (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a binocular vision area target depth information extraction and cross-section analysis system and method. The system comprises a left bracket, a right bracket, a first driver, a second driver, a third driver, a fourth driver, a fifth driver, a control panel, a first stepper motor, a second stepper motor, a third stepper motor, a fourth stepper motor, a fifth stepper motor, a rotating plate, a first transmission gear set, a second transmission gear set, a third transmission gear set, a left rotating shaft, a right rotating shaft, a main-shaft electronic compass, a left electronic compass, a right electronic compass, a left camera bracket, a right camera bracket, a left camera, a right camera, a left electronic gyroscope, a right electronic gyroscope, a bracket plate, a main shaft and a bottom plate. The third transmission gear set consists of a first gear and a second gear; the first transmission gear set consists of a third gear and a fourth gear; the second transmission gear set consists of a fifth gear and a sixth gear. With the system and method, cross-section information along a preset trace line in an area to be measured can be obtained.

Description

Binocular vision area target depth information extraction and cross-section analysis system and method
Technical field
The invention belongs to the field of motion control and relates to a depth information extraction and cross-section analysis system and method for binocular vision area targets.
Background technology
With the continuous progress of science and technology and of artificial intelligence, autonomous intelligent vehicles are being applied in many fields, for example agricultural picking robots, outer-space exploration robots and bomb-disposal robots. The road environments in which they travel are often complicated unstructured roads. For an intelligent mobile robot to travel in such a complex road environment, it must obtain the environmental information of the road surface in advance: surface roughness (protrusions and depressions of the road surface), grade information and so on. Only with sufficient information about the road surface can the robot plan the best travelling path and pre-estimate its pose while travelling, and thus travel more safely and steadily.
At present there are few techniques for analyzing complicated unstructured road surfaces, and in particular for simultaneously predicting both the roughness and the cross-section of a road surface. Existing methods fall into contact and non-contact measurement. Contact measurement mainly includes multi-wheel profilographs, trailer-type bump integrators and walking profilometers, which detect the road surface through a mechanism in direct contact with the ground; these are mainly used on structured roads. Non-contact methods mainly include vehicle-mounted bump integrators, axle-head acceleration measurement based on an inertial reference, laser displacement sensors combined with acceleration sensors based on an inertial reference and used together with laser profilometers, and inertial measurement with multiple acceleration sensors distributed in a column along the longitudinal direction of the vehicle. Among these the laser profilometer method is the most widely adopted, and these methods are likewise mainly suitable for measuring structured roads. In addition, scanning laser radar can also obtain road surface information well, and can be applied both to structured roads and to the detection of complicated unstructured travelling surfaces. Machine vision is also commonly used to acquire information about the surrounding terrain, but it is mainly used for obstacle detection or three-dimensional terrain reconstruction and is rarely used for cross-section analysis of the ground surface or terrain.
However, the contact and non-contact methods of the prior art are mainly used on structured roads and are unsuitable for measuring more complicated unstructured roads or field surfaces: they cannot give an intelligent mobile robot real-time road information about the area ahead, and can only perform statistical analysis on previously surveyed data. Because the travelling environment of an unstructured road surface is complicated and changeable, the region about to be traversed can hardly be analyzed from existing data. Although laser scanning radar can be used to detect different road environments, it is mainly used for obstacle detection on the road surface and is weak at analyzing ground roughness and road grade, and the equipment is expensive. Existing machine vision methods for acquiring the surrounding road environment can be adapted both to structured roads and to complex unstructured road environments, but current research focuses mainly on obstacle detection and three-dimensional reconstruction of the surroundings. Three-dimensional reconstruction must process a large amount of data, which severely affects the travelling efficiency of an intelligent mobile robot and makes it difficult to meet the real-time requirements of travelling. If the cross-section information of the region an intelligent mobile robot travels through could be obtained effectively, its travelling efficiency could be improved significantly; however, the prior art does not disclose a system and method for obtaining such region cross-section information.
Summary of the invention
The object of the invention is to overcome the shortcomings of the above prior art and to provide a depth information extraction and cross-section analysis system and method for binocular vision area targets; the system and method can detect the roughness of an area to be measured and obtain its cross-section information.
To achieve the above object, the depth information extraction and cross-section analysis system for binocular vision area targets of the present invention comprises a left bracket, a right bracket, a first driver, a second driver, a third driver, a fourth driver, a fifth driver, a control panel, a first stepper motor, a second stepper motor, a third stepper motor, a fourth stepper motor, a fifth stepper motor, a rotating plate, a first transmission gear set, a second transmission gear set, a third transmission gear set, a left rotating shaft, a right rotating shaft, a main-shaft electronic compass, a left electronic compass, a right electronic compass, a left camera bracket, a right camera bracket, a left camera, a right camera, a left electronic gyroscope, a right electronic gyroscope, a bracket plate, a main shaft and a bottom plate. The third transmission gear set consists of a first gear and a second gear; the first transmission gear set consists of a third gear and a fourth gear; the second transmission gear set consists of a fifth gear and a sixth gear.
The lower ends of the left bracket and of the right bracket are fixedly connected to the two ends of the bottom plate, and their upper ends are fixed to the bottom of the bracket plate. The third stepper motor is fixed to the bottom of the bracket plate; its output shaft passes through the bracket plate, and the first gear is sleeved on this output shaft. The lower end of the main shaft passes through the bracket plate and its upper end passes through the rotating plate; the second gear is sleeved on the main shaft and meshes with the first gear, and the main-shaft electronic compass is fixed on top of the main shaft. The second stepper motor and the fourth stepper motor are fixed to the bottom of the rotating plate, and their output shafts pass through the rotating plate; the third gear is sleeved on the output shaft of the second stepper motor and the fifth gear on the output shaft of the fourth stepper motor. The lower ends of the left rotating shaft and of the right rotating shaft both pass through the rotating plate; the upper end of the left rotating shaft is connected to the lower end of the left camera bracket, and the upper end of the right rotating shaft to the lower end of the right camera bracket. The fourth gear is sleeved on the left rotating shaft and meshes with the third gear; the sixth gear is sleeved on the right rotating shaft and meshes with the fifth gear. The left camera bracket and the right camera bracket are L-shaped structures; the left electronic compass is fixed on the upper surface of the bottom of the left camera bracket, and the right electronic compass on the upper surface of the bottom of the right camera bracket. The first stepper motor is fixed on the inner side of the top of the left camera bracket; its output shaft passes through the side of the top of the left camera bracket and is fixed to the bottom of a first U-shaped support; the left camera is fixed in the first U-shaped support, and the left electronic gyroscope is fixed on top of the first U-shaped support. The fifth stepper motor is fixed on the inner side of the top of the right camera bracket; its output shaft passes through the side of the top of the right camera bracket and is fixed to the bottom of a second U-shaped support; the right camera is fixed in the second U-shaped support, and the right electronic gyroscope is fixed on top of the second U-shaped support.
The output end of the left camera and the output end of the right camera are connected to the input end of the information processor, and the output end of the information processor is connected to the input end of the control panel. The output end of the control panel is connected to the input ends of the first driver, the second driver, the third driver, the fourth driver and the fifth driver; the output end of the first driver is connected to the control end of the first stepper motor, the output end of the second driver to the control end of the second stepper motor, the output end of the third driver to the control end of the third stepper motor, the output end of the fourth driver to the control end of the fourth stepper motor, and the output end of the fifth driver to the control end of the fifth stepper motor. The output ends of the left electronic compass, the right electronic compass, the left electronic gyroscope, the right electronic gyroscope and the main-shaft electronic compass are all connected to the input end of the information processor.
The center of the left electronic gyroscope, the center of the left camera, the center of the left electronic compass and the axis of the left rotating shaft are located on the same line.
The center of the right electronic gyroscope, the center of the right camera, the center of the right electronic compass and the axis of the right rotating shaft are located on the same line.
The control panel is a control panel based on an STM32 chip.
The control panel is fixed on the left bracket.
The first driver, the second driver, the third driver, the fourth driver and the fifth driver are fixed on the bottom plate.
The binocular-vision-based cross-section analysis method of the present invention comprises the following steps:
1) First preset a travelling trace line on the area to be analyzed, then choose several test points on the preset trace line, and denote the first test point as the target to be measured;
2) The left camera and the right camera each acquire image information of the detection area and forward it to the information processor. The information processor obtains the position information of the target to be measured from the image information, and from that position information obtains the azimuth angle and pitch angle of the target relative to the left camera, the azimuth angle and pitch angle of the target relative to the right camera, and the common orientation angle of the target relative to the left camera and the right camera. At the same time it produces a first drive signal, a second drive signal, a third drive signal, a fourth drive signal and a fifth drive signal and forwards them to the control panel. According to these drive signals the control panel drives, through the first, second, third, fourth and fifth drivers respectively, the first, second, third, fourth and fifth stepper motors. The first stepper motor drives the left camera to rotate in the vertical direction; the second stepper motor drives the left camera to rotate in the horizontal direction through the first transmission gear set and the left rotating shaft; the third stepper motor drives the left camera and the right camera to rotate in the horizontal direction around the main shaft through the third transmission gear set and the main shaft; the fourth stepper motor drives the right camera to rotate in the horizontal direction through the second transmission gear set and the right rotating shaft; and the fifth stepper motor drives the right camera to rotate in the vertical direction.
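The azimuth and pitch angles of the target relative to each camera can be derived from the target's pixel coordinates once the camera intrinsics are known. The patent does not spell out this computation; the following is a minimal sketch under a pinhole camera model, where the focal lengths `fx`, `fy` (in pixels), the principal point `(cx, cy)` and the sign conventions are assumptions:

```python
import math

def target_angles(u, v, fx, fy, cx, cy):
    """Azimuth (yaw) and pitch of a pixel (u, v) relative to the camera's
    optical axis, under a pinhole model with focal lengths fx, fy in pixels
    and principal point (cx, cy). Angles are in degrees; azimuth is positive
    to the right, pitch positive upward (image v grows downward)."""
    azimuth = math.degrees(math.atan2(u - cx, fx))
    pitch = math.degrees(math.atan2(cy - v, fy))
    return azimuth, pitch
```

A target imaged exactly at the principal point yields zero azimuth and zero pitch, i.e. the camera is already aimed at it.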
At the same time, the left electronic compass acquires the azimuth information of the left camera in real time and forwards it to the information processor. From this azimuth information the information processor judges whether the current azimuth angle of the left camera equals the azimuth angle of the target relative to the left camera; when it does, the second drive signal is no longer produced and the second stepper motor stops.
The left electronic gyroscope acquires the pitch angle information of the left camera in real time and forwards it to the information processor. From this pitch angle information the information processor judges whether the current pitch angle of the left camera equals the pitch angle of the target relative to the left camera; when it does, the first drive signal is no longer produced and the first stepper motor stops.
The main-shaft electronic compass acquires the common orientation angle information of the left camera and the right camera in real time and forwards it to the information processor. From this information the information processor judges whether the current common orientation angle of the two cameras equals the common orientation angle of the target relative to the left camera and the right camera; when it does, the third drive signal is no longer produced and the third stepper motor stops.
The right electronic compass acquires the azimuth information of the right camera in real time and forwards it to the information processor. From this azimuth information the information processor judges whether the current azimuth angle of the right camera equals the azimuth angle of the target relative to the right camera; when it does, the fourth drive signal is no longer produced and the fourth stepper motor stops.
The right electronic gyroscope acquires the pitch angle information of the right camera in real time and forwards it to the information processor. From this pitch angle information the information processor judges whether the current pitch angle of the right camera equals the pitch angle of the target relative to the right camera; when it does, the fifth drive signal is no longer produced and the fifth stepper motor stops.
After the first, second, third, fourth and fifth stepper motors have all stopped, the left camera and the right camera are aimed at the target to be measured;
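Each of the five axes above follows the same closed-loop pattern: a sensor (compass or gyroscope) reports the current angle, and the drive signal is no longer produced once it matches the target angle. A minimal sketch of one such axis, where `read_angle` and `step_motor` are hypothetical callables standing in for the sensor read and a single-step motor command, and the step size and tolerance are assumptions:

```python
def align_axis(read_angle, step_motor, target_deg, tol_deg=0.1, max_steps=10000):
    """Drive one stepper axis until the sensor reports the target angle,
    mirroring the stop condition of the method: the drive signal stops
    once the measured angle equals the target (within a tolerance)."""
    for _ in range(max_steps):
        err = target_deg - read_angle()
        if abs(err) <= tol_deg:
            return True          # aligned: stop producing the drive signal
        step_motor(+1 if err > 0 else -1)  # step toward the target
    return False                 # did not converge within max_steps
```

In the real system five such loops run concurrently (azimuth and pitch per camera, plus the common orientation around the main shaft), and alignment is complete only when all five have stopped.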
3) After the left camera and the right camera are aimed at the target to be measured, they collect image information of the region around the target and forward it to the information processor. From this image information the information processor obtains, by binocular stereoscopic vision, the roughness of the region around the target and the spatial position information of the target, and then stores the spatial position information of the target;
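The core of recovering the target's spatial position by binocular stereoscopic vision is triangulation from the disparity between the two camera views. The patent does not give the formula; for a rectified stereo pair the standard relation is Z = f·B/d, sketched below (the argument names and the rectified-pair assumption are mine):

```python
def stereo_depth(xl, xr, baseline_m, focal_px):
    """Depth of a point from a rectified stereo pair.
    xl, xr: horizontal pixel coordinates of the same point in the left
    and right images; baseline_m: camera separation in metres;
    focal_px: focal length in pixels. Returns depth Z = f * B / d."""
    disparity = xl - xr
    if disparity <= 0:
        raise ValueError("point must have positive disparity (xl > xr)")
    return focal_px * baseline_m / disparity
```

For example, with a 0.1 m baseline, a 700-pixel focal length and a 10-pixel disparity, the point lies about 7 m away; depth resolution degrades quadratically with distance, which is why the system aims both cameras precisely at each test point before measuring.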
4) Choose the next test point on the preset trace line as the new target to be measured, repeat steps 2) and 3) to obtain the spatial position information of the new target, and store it;
5) Repeat step 4) until the spatial position information of every test point on the preset trace line has been obtained, then analyze the spatial position information of these test points together to obtain the cross-section information of the area to be measured.
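Turning the stored 3-D positions of the test points into a cross-section amounts to pairing distance along the preset trace line with elevation. The patent does not specify this post-processing; a minimal sketch under the assumption that the points are ordered along the trace and given as (x, y, z) coordinates in metres:

```python
import math

def cross_section_profile(points):
    """Build a cross-section from the stored test-point positions.
    points: list of (x, y, z) tuples ordered along the preset trace line.
    Returns (profile, grades): profile pairs cumulative horizontal distance
    along the trace with elevation z; grades holds the slope (rise/run)
    between consecutive test points."""
    profile, grades = [], []
    s = 0.0  # cumulative horizontal distance along the trace
    for i, (x, y, z) in enumerate(points):
        if i > 0:
            px, py, pz = points[i - 1]
            ds = math.hypot(x - px, y - py)   # horizontal step between points
            s += ds
            grades.append((z - pz) / ds if ds else 0.0)
        profile.append((s, z))
    return profile, grades
```

The resulting (distance, elevation) pairs are exactly the sectional view of Fig. 2, and the grade list gives the slope information a mobile robot needs for pose prediction.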
The present invention has the following beneficial effects:
When the depth information extraction and cross-section analysis system and method of the present invention obtain the cross-section information of an area to be measured, a travelling trace line is first preset on the area to be analyzed and several test points are chosen on it; each test point is then processed in turn as the target to be measured. First the pitch angle and azimuth angle of the left camera are adjusted by the first stepper motor and the second stepper motor respectively, while the left electronic gyroscope and the left electronic compass acquire the current pitch angle and azimuth angle of the left camera in real time; when these equal the pitch angle and azimuth angle of the target relative to the left camera, the first and second stepper motors stop. Likewise the azimuth angle and pitch angle of the right camera are adjusted by the fourth stepper motor and the fifth stepper motor, while the right electronic compass and the right electronic gyroscope acquire the current azimuth angle and pitch angle of the right camera in real time; when these equal the azimuth angle and pitch angle of the target relative to the right camera, the fourth and fifth stepper motors stop. At the same time the third stepper motor adjusts the common orientation angle of the left camera and the right camera, while the main-shaft electronic compass acquires this angle in real time; when it equals the common orientation angle of the target relative to the two cameras, the third stepper motor stops. The left camera and the right camera are then aimed at the target; the two cameras collect image information of the region around the target, and the information processor obtains the spatial position information of the target from this image information. Finally, the cross-section information of the area to be analyzed is obtained directly from the spatial position information of the test points on the preset trace line. The system and method thus support the pose prediction needed for the real-time travelling of an intelligent mobile robot, are simple to operate and are highly practical.
Accompanying drawing explanation
Fig. 1 is a structural schematic diagram of the present invention;
Fig. 2 is a sectional view of the area to be measured analyzed by the present invention.
In the figures: 1 is the left bracket, 2 the first driver, 3 the second driver, 4 the third driver, 5 the control panel, 6 the third stepper motor, 7 the second stepper motor, 8 the rotating plate, 9 the first transmission gear set, 10 the left rotating shaft, 11 the main-shaft electronic compass, 12 the left electronic compass, 13 the left camera bracket, 14 the left camera, 15 the first stepper motor, 16 the left electronic gyroscope, 17 the right electronic gyroscope, 18 the fifth stepper motor, 19 the right camera, 20 the right camera bracket, 21 the right electronic compass, 22 the right rotating shaft, 23 the second transmission gear set, 24 the third transmission gear set, 25 the fourth stepper motor, 26 the bracket plate, 27 the main shaft, 28 the fourth driver, 29 the fifth driver, 30 the bottom plate, 31 the right bracket.
Embodiment
The present invention is described in further detail below with reference to the accompanying drawings:
Referring to Fig. 1, the depth information extraction and cross-section analysis system for binocular vision area targets of the present invention comprises a left bracket 1, a right bracket 31, a first driver 2, a second driver 3, a third driver 4, a fourth driver 28, a fifth driver 29, a control panel 5, a first stepper motor 15, a second stepper motor 7, a third stepper motor 6, a fourth stepper motor 25, a fifth stepper motor 18, a rotating plate 8, a first transmission gear set 9, a second transmission gear set 23, a third transmission gear set 24, a left rotating shaft 10, a right rotating shaft 22, a main-shaft electronic compass 11, a left electronic compass 12, a right electronic compass 21, a left camera bracket 13, a right camera bracket 20, a left camera 14, a right camera 19, a left electronic gyroscope 16, a right electronic gyroscope 17, a bracket plate 26, a main shaft 27 and a bottom plate 30. The third transmission gear set 24 consists of a first gear and a second gear, the first transmission gear set 9 consists of a third gear and a fourth gear, and the second transmission gear set 23 consists of a fifth gear and a sixth gear.
The lower ends of the left bracket 1 and of the right bracket 31 are fixedly connected to the two ends of the bottom plate 30, and their upper ends are fixed to the bottom of the bracket plate 26. The third stepper motor 6 is fixed to the bottom of the bracket plate 26; its output shaft passes through the bracket plate 26, and the first gear is sleeved on this output shaft. The lower end of the main shaft 27 passes through the bracket plate 26 and its upper end passes through the rotating plate 8; the second gear is sleeved on the main shaft 27 and meshes with the first gear, and the main-shaft electronic compass 11 is fixed on top of the main shaft 27. The second stepper motor 7 and the fourth stepper motor 25 are fixed to the bottom of the rotating plate 8, and their output shafts pass through the rotating plate 8; the third gear is sleeved on the output shaft of the second stepper motor 7 and the fifth gear on the output shaft of the fourth stepper motor 25. The lower ends of the left rotating shaft 10 and of the right rotating shaft 22 both pass through the rotating plate 8; the upper end of the left rotating shaft 10 is connected to the lower end of the left camera bracket 13, and the upper end of the right rotating shaft 22 to the lower end of the right camera bracket 20. The fourth gear is sleeved on the left rotating shaft 10 and meshes with the third gear; the sixth gear is sleeved on the right rotating shaft 22 and meshes with the fifth gear. The left camera bracket 13 and the right camera bracket 20 are L-shaped structures; the left electronic compass 12 is fixed on the upper surface of the bottom of the left camera bracket 13, and the right electronic compass 21 on the upper surface of the bottom of the right camera bracket 20. The first stepper motor 15 is fixed on the inner side of the top of the left camera bracket 13; its output shaft passes through the side of the top of the left camera bracket 13 and is fixed to the bottom of a first U-shaped support; the left camera 14 is fixed in the first U-shaped support, and the left electronic gyroscope 16 is fixed on top of the first U-shaped support. The fifth stepper motor 18 is fixed on the inner side of the top of the right camera bracket 20; its output shaft passes through the side of the top of the right camera bracket 20 and is fixed to the bottom of a second U-shaped support; the right camera 19 is fixed in the second U-shaped support, and the right electronic gyroscope 17 is fixed on top of the second U-shaped support.
The output ends of the left camera 14 and of the right camera 19 are connected to the input end of the information processor, and the output end of the information processor is connected to the input end of the control panel 5. The output end of the control panel 5 is connected to the input ends of the first driver 2, the second driver 3, the third driver 4, the fourth driver 28 and the fifth driver 29; the output end of the first driver 2 is connected to the control end of the first stepper motor 15, the output end of the second driver 3 to the control end of the second stepper motor 7, the output end of the third driver 4 to the control end of the third stepper motor 6, the output end of the fourth driver 28 to the control end of the fourth stepper motor 25, and the output end of the fifth driver 29 to the control end of the fifth stepper motor 18. The output ends of the left electronic compass 12, the right electronic compass 21, the left electronic gyroscope 16, the right electronic gyroscope 17 and the main-shaft electronic compass 11 are all connected to the input end of the information processor. The left electronic gyroscope 16, the left camera 14 and the left electronic compass 12 are located on the same line, and the right electronic gyroscope 17, the right camera 19 and the right electronic compass 21 are located on the same line. The control panel 5 is based on an STM32 chip and is fixed on the left bracket 1; the first driver 2, the second driver 3, the third driver 4, the fourth driver 28 and the fifth driver 29 are fixed on the bottom plate 30. It should be noted that the left rotating shaft 10 and the right rotating shaft 22 are each connected to the rotating plate 8 by two nuts sleeved on them, and the main shaft 27 is connected to the bracket plate 26 by two nuts sleeved on it.
The binocular-vision-based cross-section analysis method of the present invention comprises the following steps:
1) First preset a travelling trace line on the area to be analyzed, then choose several test points on the preset trace line, and denote the first test point as the target to be measured;
2) The left camera 14 and the right camera 19 each acquire image information of the detection area and forward it to the information processor. The information processor obtains the position information of the target to be measured from the image information, and from that position information obtains the azimuth angle and pitch angle of the target relative to the left camera 14, the azimuth angle and pitch angle of the target relative to the right camera 19, and the common orientation angle of the target relative to the left camera 14 and the right camera 19. At the same time it produces a first drive signal, a second drive signal, a third drive signal, a fourth drive signal and a fifth drive signal and forwards them to the control panel 5. According to these drive signals the control panel 5 drives, through the first driver 2, the second driver 3, the third driver 4, the fourth driver 28 and the fifth driver 29 respectively, the first stepper motor 15, the second stepper motor 7, the third stepper motor 6, the fourth stepper motor 25 and the fifth stepper motor 18. The first stepper motor 15 drives the left camera 14 to rotate in the vertical direction; the second stepper motor 7 drives the left camera 14 to rotate in the horizontal direction through the first transmission gear set and the left rotating shaft 10; the third stepper motor 6 drives the left camera 14 and the right camera 19 to rotate in the horizontal direction around the main shaft 27 through the third transmission gear set and the main shaft 27; the fourth stepper motor 25 drives the right camera 19 to rotate in the horizontal direction through the second transmission gear set and the right rotating shaft 22; and the fifth stepper motor 18 drives the right camera 19 to rotate in the vertical direction.
Meanwhile, the left electronic compass 12 acquires the azimuth of the left camera 14 in real time and forwards it to the information processor, which judges whether the current azimuth of the left camera 14 equals the azimuth of the target relative to the left camera 14; once it does, the second drive signal is no longer generated and the second stepper motor 7 stops.
The left electronic gyroscope 16 acquires the pitch angle of the left camera 14 in real time and forwards it to the information processor, which judges whether the current pitch angle of the left camera 14 equals the pitch angle of the target relative to the left camera 14; once it does, the first drive signal is no longer generated and the first stepper motor 15 stops.
The main-shaft electronic compass 11 acquires the common heading angle of the left camera 14 and the right camera 19 in real time and forwards it to the information processor, which judges whether the current common heading angle equals the common heading angle of the target relative to the two cameras; once it does, the third drive signal is no longer generated and the third stepper motor 6 stops.
The right electronic compass 21 acquires the azimuth of the right camera 19 in real time and forwards it to the information processor, which judges whether the current azimuth of the right camera 19 equals the azimuth of the target relative to the right camera 19; once it does, the fourth drive signal is no longer generated and the fourth stepper motor 25 stops.
The right electronic gyroscope 17 acquires the pitch angle of the right camera 19 in real time and forwards it to the information processor, which judges whether the current pitch angle of the right camera 19 equals the pitch angle of the target relative to the right camera 19; once it does, the fifth drive signal is no longer generated and the fifth stepper motor 18 stops.
Once the first stepper motor 15, the second stepper motor 7, the third stepper motor 6, the fourth stepper motor 25 and the fifth stepper motor 18 have all stopped, the left camera 14 and the right camera 19 are aimed at the target to be measured.
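The stop conditions above amount to per-axis closed-loop feedback: each drive signal keeps being issued until the corresponding compass or gyroscope reading matches the computed target angle, at which point that motor stops. A schematic sketch of one such axis; the sensor reader, motor stepper, step size and tolerance below are hypothetical stand-ins, not the patent's hardware interface:

```python
def aim_axis(read_angle, step_motor, target_angle, tol=0.5):
    """Drive one axis until the sensor reading is within tol degrees of
    target_angle. read_angle() returns the current angle; step_motor(d)
    issues one step in direction d (+1/-1). Returns steps issued."""
    steps = 0
    while abs(read_angle() - target_angle) > tol:
        direction = 1 if target_angle > read_angle() else -1
        step_motor(direction)
        steps += 1
    return steps

# toy simulation: the "sensor" reads a stored angle, each step moves 0.1 deg
angle = [10.0]
n = aim_axis(lambda: angle[0],
             lambda d: angle.__setitem__(0, angle[0] + 0.1 * d),
             12.0)
```

The full mechanism runs five such loops (two pitch axes, two azimuth axes, one common heading axis) until all five motors have stopped.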
3) After the left camera 14 and the right camera 19 are aimed at the target, they acquire image information of the region around the target and forward it to the information processor. Using binocular stereo vision, the information processor obtains from this image information the roughness of the region around the target and the spatial position of the target, and then stores the spatial position of the target.
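The step above invokes binocular stereo vision without detailing the computation. For a rectified stereo pair, the standard result is that depth is Z = f·B/d, where f is the focal length in pixels, B the baseline between the two cameras, and d the disparity of the target between the left and right images. A minimal sketch under that assumption; the focal length, baseline and pixel coordinates are illustrative values, not taken from the patent:

```python
def triangulate(u_left, u_right, v, fx=800.0, baseline=0.12, cx=320.0, cy=240.0):
    """Recover a 3-D point (X, Y, Z) in the left-camera frame from a
    rectified stereo correspondence. baseline in metres; fy ~= fx assumed."""
    disparity = u_left - u_right
    if disparity <= 0:
        raise ValueError("target must have positive disparity")
    Z = fx * baseline / disparity   # depth along the optical axis
    X = (u_left - cx) * Z / fx      # lateral offset
    Y = (v - cy) * Z / fx           # vertical offset
    return X, Y, Z

# disparity of 48 px with the placeholder calibration
X, Y, Z = triangulate(400.0, 352.0, 240.0)
```

In the system described here the cameras verge on the target rather than staying parallel, so in practice the measured azimuth and pitch angles of both cameras would enter the triangulation; the rectified-pair formula is the simplest form of the same principle.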
4) The next measurement point on the preset trajectory line is chosen as the new target to be measured, and step 3) is repeated to obtain and store the spatial position of the new target.
5) Step 4) is repeated until the spatial position of every measurement point on the preset trajectory line has been obtained; association rule analysis is then applied to these spatial positions to obtain the cross-section profile of the region under test.
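Step 5) condenses the stored spatial positions along the trajectory into a cross-section profile. The patent names association rule analysis but does not specify it; as a simpler placeholder for the same data flow, the sketch below orders the points by horizontal arc length along the trajectory and pairs each with its depth coordinate. The representation is an assumption for illustration, not the patent's algorithm:

```python
import math

def profile_from_points(points):
    """points: list of (x, y, z) measured in order along the preset
    trajectory line. Returns (arc_length, depth) pairs, i.e. a 2-D
    cross-section curve of the region along the trajectory."""
    profile, s = [], 0.0
    for i, (x, y, z) in enumerate(points):
        if i > 0:
            px, py, _ = points[i - 1]
            s += math.hypot(x - px, y - py)  # horizontal distance walked
        profile.append((s, z))
    return profile

# three measurement points one metre apart with varying depth
curve = profile_from_points([(0, 0, 1.0), (1, 0, 1.2), (2, 0, 0.9)])
```

Plotting arc length against depth then gives the cross-section of the surface along the preset trajectory line.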
It should be noted that when the profile of the region to be analyzed is obtained from the spatial positions of the measurement points on the preset trajectory line, an association rule algorithm over the target points yields the profile of the region along the preset trajectory. The invention applies a machine-vision target detection and recognition algorithm to analyze the roughness of the region, tracks and localizes the points on the preset trajectory line within the region, and computes the spatial position of each obtained target point by the binocular ranging principle, thereby obtaining both the roughness of the region around the target and the spatial positions of the targets on the trajectory.

Claims (6)

1. A binocular-vision region-target depth-information extraction and cross-section analysis system, characterized in that it comprises a left bracket (1), a right bracket (31), a first driver (2), a second driver (3), a third driver (4), a fourth driver (28), a fifth driver (29), a control panel (5), a first stepper motor (15), a second stepper motor (7), a third stepper motor (6), a fourth stepper motor (25), a fifth stepper motor (18), a rotating plate (8), a first transmission gear set (9), a second transmission gear set (23), a third transmission gear set (24), a left rotating shaft (10), a right rotating shaft (22), a main-shaft electronic compass (11), a left electronic compass (12), a right electronic compass (21), a left camera bracket (13), a right camera bracket (20), a left camera (14), a right camera (19), a left electronic gyroscope (16), a right electronic gyroscope (17), a support plate (26), a main shaft (27) and a base plate (30); the third transmission gear set (24) consists of a first gear and a second gear, the first transmission gear set (9) consists of a third gear and a fourth gear, and the second transmission gear set (23) consists of a fifth gear and a sixth gear;
The lower ends of the left bracket (1) and the right bracket (31) are fixedly connected to the two ends of the base plate (30) respectively, and their upper ends are fixed to the bottom of the support plate (26); the third stepper motor (6) is fixed to the bottom of the support plate (26), its output shaft passing through the support plate (26); the first gear is sleeved on the output shaft of the third stepper motor (6); the lower end of the main shaft (27) passes through the support plate (26) and its upper end passes through the rotating plate (8); the second gear is sleeved on the main shaft (27) and meshes with the first gear; the main-shaft electronic compass (11) is fixed to the top of the main shaft (27); the second stepper motor (7) and the fourth stepper motor (25) are fixed to the bottom of the rotating plate (8), the output shafts of the second stepper motor (7) and the fourth stepper motor (25) both passing through the rotating plate (8); the third gear is sleeved on the output shaft of the second stepper motor (7) and the fifth gear is sleeved on the output shaft of the fourth stepper motor (25); the lower ends of the left rotating shaft (10) and the right rotating shaft (22) both pass through the rotating plate (8); the upper end of the left rotating shaft (10) is connected to the lower end of the left camera bracket (13), and the upper end of the right rotating shaft (22) is connected to the lower end of the right camera bracket (20); the fourth gear is sleeved on the left rotating shaft (10) and meshes with the third gear; the sixth gear is sleeved on the right rotating shaft (22) and meshes with the fifth gear; the left camera bracket (13) and the right camera bracket (20) are L-shaped structures; the left electronic compass (12) is fixed on the upper surface of the bottom of the left camera bracket (13), and the right electronic compass (21) is fixed on the upper surface of the bottom of the right camera bracket (20); the first stepper motor (15) is fixed to the inner side of the upper part of the left camera bracket (13), its output shaft passing through the side of the upper part of the left camera bracket (13) and being fixed to the bottom of a first U-shaped support; the left camera (14) is fixed in the first U-shaped support, and the left electronic gyroscope (16) is fixed on top of the first U-shaped support; the fifth stepper motor (18) is fixed to the inner side of the upper part of the right camera bracket (20), its output shaft passing through the side of the upper part of the right camera bracket (20) and being fixed to the bottom of a second U-shaped support; the right camera (19) is fixed in the second U-shaped support, and the right electronic gyroscope (17) is fixed on top of the second U-shaped support;
The output of the left camera (14) and the output of the right camera (19) are connected to the input of the information processor; the output of the information processor is connected to the input of the control panel (5); the outputs of the control panel (5) are connected to the inputs of the first driver (2), the second driver (3), the third driver (4), the fourth driver (28) and the fifth driver (29); the output of the first driver (2) is connected to the control terminal of the first stepper motor (15), the output of the second driver (3) to the control terminal of the second stepper motor (7), the output of the third driver (4) to the control terminal of the third stepper motor (6), the output of the fourth driver (28) to the control terminal of the fourth stepper motor (25), and the output of the fifth driver (29) to the control terminal of the fifth stepper motor (18); the outputs of the left electronic compass (12), the right electronic compass (21), the left electronic gyroscope (16), the right electronic gyroscope (17) and the main-shaft electronic compass (11) are all connected to the input of the information processor.
2. The binocular-vision region-target depth-information extraction and cross-section analysis system according to claim 1, characterized in that the center of the left electronic gyroscope (16), the center of the left camera (14), the center of the left electronic compass (12) and the axis of the left rotating shaft (10) lie on the same straight line.
3. The binocular-vision region-target depth-information extraction and cross-section analysis system according to claim 2, characterized in that the center of the right electronic gyroscope (17), the center of the right camera (19), the center of the right electronic compass (21) and the axis of the right rotating shaft (22) lie on the same straight line.
4. The binocular-vision region-target depth-information extraction and cross-section analysis system according to claim 1, characterized in that the control panel (5) is a control panel based on an STM32 chip.
5. The binocular-vision region-target depth-information extraction and cross-section analysis system according to claim 1, characterized in that:
the control panel (5) is fixed on the left bracket (1);
the first driver (2), the second driver (3), the third driver (4), the fourth driver (28) and the fifth driver (29) are fixed on the base plate (30).
6. A cross-section analysis method based on binocular-vision multi-target recognition, characterized in that it is based on the binocular-vision region-target depth-information extraction and cross-section analysis system according to claim 3 and comprises the following steps:
1) a running trajectory line is first preset on the region to be analyzed, several measurement points are chosen on the preset trajectory line, and the first measurement point is denoted as the target to be measured;
2) the left camera (14) and the right camera (19) each acquire image information of the region under test and forward it to the information processor; from this image information the information processor obtains the position of the target to be measured, and from that position computes the azimuth and pitch angle of the target relative to the left camera (14), the azimuth and pitch angle of the target relative to the right camera (19), and the common heading angle of the target relative to the left camera (14) and the right camera (19); at the same time it generates a first, a second, a third, a fourth and a fifth drive signal and forwards them to the control panel (5); according to these drive signals the control panel (5) operates the first stepper motor (15), the second stepper motor (7), the third stepper motor (6), the fourth stepper motor (25) and the fifth stepper motor (18) through the first driver (2), the second driver (3), the third driver (4), the fourth driver (28) and the fifth driver (29) respectively: the first stepper motor (15) rotates the left camera (14) in the vertical direction; the second stepper motor (7) rotates the left camera (14) in the horizontal direction through the first transmission gear set and the left rotating shaft (10); the third stepper motor (6) rotates the left camera (14) and the right camera (19) together in the horizontal direction around the main shaft (27) through the third transmission gear set and the main shaft (27); the fourth stepper motor (25) rotates the right camera (19) in the horizontal direction through the second transmission gear set and the right rotating shaft (22); and the fifth stepper motor (18) rotates the right camera (19) in the vertical direction;
meanwhile, the left electronic compass (12) acquires the azimuth of the left camera (14) in real time and forwards it to the information processor, which judges whether the current azimuth of the left camera (14) equals the azimuth of the target relative to the left camera (14); once it does, the second drive signal is no longer generated and the second stepper motor (7) stops;
the left electronic gyroscope (16) acquires the pitch angle of the left camera (14) in real time and forwards it to the information processor, which judges whether the current pitch angle of the left camera (14) equals the pitch angle of the target relative to the left camera (14); once it does, the first drive signal is no longer generated and the first stepper motor (15) stops;
the main-shaft electronic compass (11) acquires the common heading angle of the left camera (14) and the right camera (19) in real time and forwards it to the information processor, which judges whether the current common heading angle equals the common heading angle of the target relative to the two cameras; once it does, the third drive signal is no longer generated and the third stepper motor (6) stops;
the right electronic compass (21) acquires the azimuth of the right camera (19) in real time and forwards it to the information processor, which judges whether the current azimuth of the right camera (19) equals the azimuth of the target relative to the right camera (19); once it does, the fourth drive signal is no longer generated and the fourth stepper motor (25) stops;
the right electronic gyroscope (17) acquires the pitch angle of the right camera (19) in real time and forwards it to the information processor, which judges whether the current pitch angle of the right camera (19) equals the pitch angle of the target relative to the right camera (19); once it does, the fifth drive signal is no longer generated and the fifth stepper motor (18) stops;
once the first stepper motor (15), the second stepper motor (7), the third stepper motor (6), the fourth stepper motor (25) and the fifth stepper motor (18) have all stopped, the left camera (14) and the right camera (19) are aimed at the target to be measured;
3) after the left camera (14) and the right camera (19) are aimed at the target, they acquire image information of the region around the target and forward it to the information processor, which, using binocular stereo vision, obtains from this image information the roughness of the region around the target and the spatial position of the target, and then stores the spatial position of the target;
4) the next measurement point on the preset trajectory line is chosen as the new target to be measured, and step 3) is repeated to obtain and store the spatial position of the new target;
5) step 4) is repeated until the spatial position of every measurement point on the preset trajectory line has been obtained; association rule analysis is then applied to these spatial positions to obtain the cross-section profile of the region under test.
CN201510021300.2A 2015-01-15 2015-01-15 Binocular vision area target depth information extraction and cross section analysis system and method Expired - Fee Related CN104656683B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510021300.2A CN104656683B (en) 2015-01-15 2015-01-15 Binocular vision area target depth information extraction and cross section analysis system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510021300.2A CN104656683B (en) 2015-01-15 2015-01-15 Binocular vision area target depth information extraction and cross section analysis system and method

Publications (2)

Publication Number Publication Date
CN104656683A true CN104656683A (en) 2015-05-27
CN104656683B CN104656683B (en) 2017-04-26

Family

ID=53247941

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510021300.2A Expired - Fee Related CN104656683B (en) 2015-01-15 2015-01-15 Binocular vision area target depth information extraction and cross section analysis system and method

Country Status (1)

Country Link
CN (1) CN104656683B (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09135464A (en) * 1995-11-08 1997-05-20 Nippon Steel Corp Stereoscopic video image display device
CN1490594A (en) * 2003-08-22 2004-04-21 湖南大学 Multiple free degree artificial threedimensional binocular vision apparatus
CN2644114Y (en) * 2003-08-22 2004-09-29 湖南大学 Imitated multidirectional stereoscopic vision device
CN1600505A (en) * 2004-10-21 2005-03-30 上海交通大学 Servo binocular vision sensors on welding robot
CN201012496Y (en) * 2006-08-22 2008-01-30 燕山大学 Parallel-connection robot binocular active vision monitoring mechanism
CN203661165U (en) * 2013-12-10 2014-06-18 吉林大学 Multi freedom degree binocular stereo vision device


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Yu Hongshan et al.: "Design and Implementation of an Active Stereo Binocular Vision Platform", Industrial Instrumentation & Automation *
Wang Fei: "Research on the Design and Control Method of a Binocular Vision ***", China Master's Theses Full-text Database (Electronic Journal), Information Science and Technology series *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108765480A (en) * 2017-04-10 2018-11-06 钰立微电子股份有限公司 Advanced treatment device
CN108765480B (en) * 2017-04-10 2022-03-15 钰立微电子股份有限公司 Advanced treatment equipment
CN113340241A (en) * 2021-06-09 2021-09-03 河南德朗智能科技有限公司 Binocular vision concrete joint surface roughness measurement method and system
CN113340241B (en) * 2021-06-09 2022-12-02 河南德朗智能科技有限公司 Binocular vision concrete joint surface roughness measurement method and system

Also Published As

Publication number Publication date
CN104656683B (en) 2017-04-26

Similar Documents

Publication Publication Date Title
Vivacqua et al. Self-localization based on visual lane marking maps: An accurate low-cost approach for autonomous driving
US11288521B2 (en) Automated road edge boundary detection
CN102288121B (en) Method for measuring and pre-warning lane departure distance based on monocular vision
CN104266823A (en) Daytime tunnel portal section lighting standard calculating method based on safety visual cognition and system thereof
CN110780305A (en) Track cone bucket detection and target point tracking method based on multi-line laser radar
US9310804B1 (en) Use of prior maps for estimation of lane boundaries
CN107422730A (en) The AGV transportation systems of view-based access control model guiding and its driving control method
JP7059888B2 (en) Assistance control system
CN107817319A (en) It is a kind of to be used for urban road and the Non-Destructive Testing robot system of pipe engineering underground defect
US20220363263A1 (en) Automated bump and/or depression detection in a roadway
CN107589744B (en) Omnidirectional mobile unmanned platform method based on highway tunnel crack detection
CN103413313A (en) Binocular vision navigation system and method based on power robot
CN108627864A (en) Localization method and system, pilotless automobile system based on automobile key
US8885151B1 (en) Condensing sensor data for transmission and processing
CN205950750U (en) Transformer station inspection robot control system that navigates based on inertial navigation
CN108153306A (en) A kind of autonomous road lossless detection method of robot system
JP6298221B2 (en) Path detection system based on solar blind ultraviolet light signal
CN108995743A (en) Navigation vehicle and air navigation aid
CN111506069B (en) All-weather all-ground crane obstacle identification system and method
CN104656683A (en) Binocular vision area target depth information extraction and cross section analysis system and method
CN104777500B (en) Roll car travel direction and Seeding location high-precision measuring system and method
DE102012015188A1 (en) Actively or passively driven wheel for e.g. muscle power or motor operated locomotion unit in automobile field, has power generation device connected with sensor and interface for supplying sensor and interface with electrical power
CN207798777U (en) A kind of non-destructive testing robot system for urban road and pipe engineering underground defect
CN208842514U (en) Navigation vehicle
CN111044040A (en) All-terrain multi-sensor data acquisition platform for unmanned equipment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20170426

Termination date: 20200115