CN111968170A - Online binocular vision distance measurement method based on cross-correlation time delay estimation

Info

Publication number
CN111968170A
Authority
CN
China
Prior art keywords
image
cross
time delay
delay estimation
correlation
Prior art date
Legal status
Pending
Application number
CN202010870861.0A
Other languages
Chinese (zh)
Inventor
徐胜
苏成悦
陈元电
Current Assignee
Guangdong University of Technology
Original Assignee
Guangdong University of Technology
Priority date
Filing date
Publication date
Application filed by Guangdong University of Technology
Priority to CN202010870861.0A
Publication of CN111968170A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/55 Depth or shape recovery from multiple images
    • G06T7/593 Depth or shape recovery from multiple images from stereo images
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20024 Filtering details
    • G06T2207/20032 Median filtering
    • G06T2207/20048 Transform domain processing
    • G06T2207/20056 Discrete and fast Fourier transform [DFT, FFT]
    • G06T2207/20228 Disparity calculation for image-based rendering

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The invention discloses an online binocular vision distance measurement method based on cross-correlation time delay estimation. The method converts the displacement estimation between the left and right images into a time delay estimation through power-spectrum cross-correlation, which improves the robustness and real-time performance of image matching, and uses an online iteration technique to improve response speed. The binocular ranging method comprises the following steps: first, a fast Fourier transform is applied to the left and right images to obtain their frequency-domain functions; next, the image spectra are filtered to extract high-texture information; then, the cross-power spectrum of the left and right images is obtained with an online recursion technique; finally, the time delay estimate is obtained through the inverse fast Fourier transform, converted into a displacement estimate, and used in the triangulation formula to produce the distance measurement. The online recursive power-spectrum cross-correlation time delay estimation improves the robustness and speed of image matching; the algorithm is simple, has a small computational load, and is suitable for embedded-system applications.

Description

Online binocular vision distance measurement method based on cross-correlation time delay estimation
Technical Field
The invention relates to the technical field of unmanned aerial vehicles, in particular to an online binocular vision distance measuring method based on cross-correlation time delay estimation.
Background
With the rapid development of unmanned aerial vehicle (UAV) technology, UAVs are increasingly widely used in the civil field. Because of the complexity and uncertainty of the surrounding environment, UAV accidents occur frequently during use, so research on UAV ranging and obstacle avoidance is urgent. The common UAV obstacle avoidance technologies are ultrasonic ranging, infrared or laser ranging, and visual ranging. Ultrasonic ranging is effective only at short distances and places certain requirements on the reflecting surface of the obstacle, so it is poorly suited to obstacle measurement and avoidance. Infrared or laser ranging, also called time-of-flight (TOF) ranging, is easily disturbed by external signals and its obstacle-avoidance range is limited, so it is also poorly suited to UAVs. Navigation based on computer vision, by contrast, is unaffected by the outline of the obstacle or by external signal interference, has a wide measurement range, and, with the rapid development of computer hardware and intelligent algorithms, its obstacle-avoidance response speed keeps improving. Moreover, vision equipment is light and small, which suits UAVs with limited payload.
Among visual obstacle-avoidance technologies, binocular vision has the broadest application prospects. Binocular vision uses the same principle by which human eyes estimate distance: it can both locate an obstacle accurately and measure its distance from the UAV, giving obstacle avoidance high accuracy. Image matching is the most critical step in binocular ranging, and the accuracy and real-time performance of ranging depend heavily on the accuracy and speed of matching. Stereo matching algorithms struggle to deliver accuracy and real-time performance simultaneously: most accurate stereo matchers are global algorithms such as graph cuts, belief propagation, neural networks, and wavelet methods, but even after optimization their efficiency does not improve qualitatively, they are hard to accelerate in hardware, and matching remains slow. On practical embedded platforms, existing image matching algorithms face the following problems: matching efficiency must improve to meet the system's real-time requirements; matching results are strongly affected by changes in environment and illumination; matching irregularly shaped objects is difficult; and matching is disturbed by complex backgrounds.
The power-spectrum cross-correlation time delay estimation algorithm is a classic algorithm in time delay estimation for signal processing: it converts a spatial-domain problem into the frequency domain and yields high delay estimation accuracy. It is mostly used for radio-frequency and acoustic signals and rarely for image matching, because without hardware acceleration the Fourier transform consumes substantial computing resources. The DSPs and FPGAs on embedded platforms, however, are well suited to Fourier transforms and to hardware-accelerated computation, which makes power-spectrum cross-correlation practical for image matching: a reference image and a matching image captured by the cameras are correlated to locate the real-time image within the reference image, and the algorithm offers strong robustness to noise and distortion and high matching precision. In addition, the image power spectrum is a form of spectral analysis that comprehensively reflects the texture of an image; it describes how signal power varies with frequency and reflects the strength of each spatial frequency component. Filtering the image power spectrum therefore extracts texture information and suppresses noise, and the filtering itself is also amenable to DSP and FPGA hardware acceleration, improving real-time performance.
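The classic frequency-domain delay estimator described above can be sketched in a few lines. The snippet below is a generic phase-correlation (PHAT-weighted) estimator for 1-D signals, not the patent's exact formulation; all names and the normalisation constant are illustrative:

```python
import numpy as np

def estimate_delay(x, y):
    """Estimate the integer delay of y relative to x via the
    cross-power spectrum (frequency-domain correlation)."""
    n = len(x)
    X = np.fft.fft(x)
    Y = np.fft.fft(y)
    # Cross-power spectrum; normalising by its magnitude gives the
    # phase-correlation (PHAT-weighted) variant.
    R = np.conj(X) * Y
    R /= np.abs(R) + 1e-12
    corr = np.fft.ifft(R).real
    lag = int(np.argmax(corr))
    # Lags in the upper half of the axis correspond to negative shifts.
    return lag if lag <= n // 2 else lag - n

# y is x delayed (circularly) by 5 samples.
rng = np.random.default_rng(0)
x = rng.standard_normal(256)
y = np.roll(x, 5)
print(estimate_delay(x, y))  # → 5
```

The same peak-of-inverse-FFT idea carries over to images, where the lag becomes a 2-D pixel displacement.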
To further suit the limited computing resources and strict real-time requirements of embedded platforms, the matching algorithm is transformed into an online iterative recursion that updates continuously using the energies of adjacent pixel values, which preserves computing speed while greatly reducing memory usage.
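The general idea of online recursion over Fourier quantities can be illustrated with a sliding DFT, which updates one frequency bin from the previous frame in O(1) per new sample instead of recomputing the whole transform. This is a stand-in assumption for illustration, not the patent's update rule:

```python
import numpy as np

def sliding_dft_bin(signal, window, k):
    """Yield the k-th DFT bin of every length-`window` sliding frame,
    updating recursively from the previous frame in O(1) per sample."""
    w = np.exp(2j * np.pi * k / window)          # per-sample phase rotation
    n0 = np.arange(window)
    X = np.sum(signal[:window] * np.exp(-2j * np.pi * k * n0 / window))
    yield X
    for t in range(window, len(signal)):
        # Drop the sample leaving the frame, add the one entering,
        # then rotate the whole bin by one sample.
        X = (X - signal[t - window] + signal[t]) * w
        yield X

sig = np.sin(np.arange(64) * 0.3)
vals = list(sliding_dft_bin(sig, 16, 3))
# The last recursive value matches a direct DFT of the final frame.
direct = np.sum(sig[-16:] * np.exp(-2j * np.pi * 3 * np.arange(16) / 16))
print(np.allclose(vals[-1], direct))  # → True
```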
A search found Chinese patent CN110211169A (published September 6, 2019), "Reconstruction method based on multi-scale superpixel and phase-correlated narrow-baseline parallax", which uses power-spectrum cross-correlation for image matching but requires superpixel extraction in advance and does not recast the algorithm in online recursive form, so real-time performance cannot be guaranteed. Chinese patent CN105812769B (published September 7, 2016), "High-precision parallax tracker based on phase correlation", also uses power-spectrum cross-correlation for image matching, but features must be extracted in advance with the SURF algorithm and the algorithm is not recast in online recursive form, so real-time performance is difficult to guarantee.
Disclosure of Invention
The invention aims to overcome the following defects: binocular ranging results on a UAV platform are not robust and are easily disturbed, and existing algorithms are complex, demand substantial computing resources, and lack real-time performance on embedded platforms. Applying power-spectrum cross-correlation time delay estimation to the displacement estimation of images captured by the binocular camera improves robustness and interference resistance, while the online recursion technique reduces the computing-resource requirement and improves real-time performance.
In order to achieve the above purpose, the technical scheme provided by the invention is as follows: an online binocular vision distance measurement method based on cross-correlation time delay estimation, comprising the following steps:
s1, building an unmanned aerial vehicle binocular ranging platform, acquiring left and right views by the left and right cameras, and performing fast Fourier transform to obtain an image frequency spectrum;
s2, filtering the image frequency spectrum obtained in the step S1, extracting high texture features and removing noise;
s3, performing cross-correlation power spectrum calculation on the left and right image frequency spectrums obtained in the step S2 by using an online recursion technology;
s4, carrying out an inverse Fourier transform on the cross-power spectrum obtained in the step S3 to obtain a time delay estimate and converting it into a displacement estimate;
and S5, deducing a binocular ranging result according to the displacement difference of the left image and the right image obtained in the step S4.
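Steps S1 through S5 can be sketched end to end with NumPy. In this sketch the patent's fitted filter (S2) and online recursion (S3) are replaced by plain FFT operations with phase-correlation weighting, and the focal length, baseline, and synthetic 4-pixel shift are made-up illustration values:

```python
import numpy as np

def binocular_distance(left, right, f, T, k=1.0):
    """S1-S5 sketch: FFT both views (S1), cross-power spectrum with
    phase weighting (S3), inverse-FFT peak -> delay (S4), delay ->
    displacement x0 = k*tau (S4), and Z = f*T/x0 (S5)."""
    FL = np.fft.fft2(left)                       # S1: image spectra
    FR = np.fft.fft2(right)
    R = np.conj(FL) * FR                         # S3: cross-power spectrum
    R /= np.abs(R) + 1e-12                       # phase-correlation weighting
    corr = np.fft.ifft2(R).real                  # S4: back to spatial domain
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dx > left.shape[1] // 2:                  # wrap to a signed shift
        dx -= left.shape[1]
    x0 = k * abs(dx)                             # S4: delay -> displacement
    return f * T / x0                            # S5: triangulation

rng = np.random.default_rng(1)
left = rng.standard_normal((64, 64))
right = np.roll(left, -4, axis=1)                # right view shifted 4 px
print(binocular_distance(left, right, f=800.0, T=0.25))  # → 50.0
```

With an 800-pixel focal length and a 0.25 m baseline, a 4-pixel disparity maps to a 50 m range; real images would also need windowing and sub-pixel peak interpolation.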
Further, the specific process of step S1 is as follows:
Utilize a binocular camera to build the UAV ranging platform: the optical axes of the binocular camera point forward in parallel, offset by a fixed baseline, and the left and right cameras each acquire an image of size $M \times N$ pixels. The left image function is $I_L(x, y)$ and the right image function is $I_R(x, y)$, where $x, y$ are pixel coordinates and there is a displacement $x_0$ between the left and right images, i.e. $I_R(x, y) = I_L(x - x_0, y)$. Performing a Fourier transform on the left and right images gives the image spectra:

$$F_L(u,v) = \sum_{x=0}^{M-1}\sum_{y=0}^{N-1} I_L(x,y)\,e^{-j2\pi\left(\frac{ux}{M}+\frac{vy}{N}\right)}, \qquad F_R(u,v) = \sum_{x=0}^{M-1}\sum_{y=0}^{N-1} I_R(x,y)\,e^{-j2\pi\left(\frac{ux}{M}+\frac{vy}{N}\right)} \tag{1}$$

where $F_L(u, v)$ is the spectrum of the left image and $F_R(u, v)$ is the spectrum of the right image. Let $A_u = [1, e^{-j2\pi u}, \dots, e^{-j2\pi Nu}] \in \mathbb{C}^N$ and $A_v = [1, e^{-j2\pi v}, \dots, e^{-j2\pi Mv}]^T \in \mathbb{C}^M$; the Fourier transform can then be written as $F_L(u, v) = A_u I_L A_v$ and $F_R(u, v) = A_u I_R A_v$, where $I_L$ and $I_R$ are the left and right image matrices.
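The matrix-vector form $F(u,v) = A_u I A_v$ can be checked numerically. The sketch below uses discrete frequency indices and vector lengths chosen so the products conform; since the patent's $A_u, A_v$ use normalized frequencies, this is an interpretation, not a literal transcription:

```python
import numpy as np

def dft_coeff(img, u, v):
    """One 2-D DFT coefficient computed as the product a_u @ img @ a_v,
    mirroring the patent's F(u, v) = A_u I A_v matrix form."""
    M, N = img.shape
    a_u = np.exp(-2j * np.pi * u * np.arange(M) / M)   # length-M row
    a_v = np.exp(-2j * np.pi * v * np.arange(N) / N)   # length-N column
    return a_u @ img @ a_v

img = np.arange(12.0).reshape(3, 4)
# Agrees with the corresponding bin of the full 2-D FFT.
print(np.allclose(dft_coeff(img, 1, 2), np.fft.fft2(img)[1, 2]))  # → True
```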
Further, the specific process of step S2 is as follows:
The image spectra obtained by the Fourier transform are filtered to suppress noise interference and extract high-texture structural information. The filter used is $H(e^{ju}, e^{jv})$ [its explicit expression appears only as an equation image in the source]. The filter parameters can be obtained from a fitting relation [also an equation image in the source] in which $D(e^{ju}, e^{jv})$ is the prescribed ideal amplitude-frequency characteristic. The result of filtering the image spectra can then be expressed as:

$$\tilde{F}_L(u,v) = H(e^{ju}, e^{jv})\,F_L(u,v), \qquad \tilde{F}_R(u,v) = H(e^{ju}, e^{jv})\,F_R(u,v) \tag{2}$$

Introducing the matrix $P$, formula (2) can be represented as:

$$\tilde{F}_L(u,v) = A_u P I_L A_v, \qquad \tilde{F}_R(u,v) = A_u P I_R A_v \tag{3}$$
Further, the specific process of step S3 is as follows:
Cross-correlation is performed on the filtered image spectra of formula (3) to obtain the cross-power spectrum of the left and right images:

$$R(u,v) = \tilde{F}_L(u,v)\,\tilde{F}_R^{*}(u,v) \tag{4}$$

where $R(u,v)$ is the cross-power spectrum and $\tilde{F}_R^{*}(u,v)$ is the conjugate of $\tilde{F}_R(u,v)$. The cross-correlation computation is then transformed with the online recursion technique: with $R(t)$ denoting the cross-correlation result at time $t$, $R(t+1)$ the result at time $t+1$, and $K = A_u P$, a recursive update of $R(t)$ into $R(t+1)$ is obtained [the explicit recursion appears only as an equation image in the source].
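The cross-power spectrum of formula (4) has the property the method relies on: a pure spatial shift between the two views appears as a linear phase ramp. A small check, using an assumed conjugation convention since the patent's formula is an equation image:

```python
import numpy as np

def cross_power_spectrum(FL, FR):
    """R(u, v) = F_L(u, v) * conj(F_R(u, v)); the conjugation convention
    here is an assumption, not taken from the patent."""
    return FL * np.conj(FR)

rng = np.random.default_rng(3)
img = rng.standard_normal((16, 16))
FL = np.fft.fft2(img)
FR = np.fft.fft2(np.roll(img, 3, axis=1))   # "right" view shifted 3 px
R = cross_power_spectrum(FL, FR)
phase = R / np.abs(R)
# Along the shifted axis the phase is the ramp exp(+j*2*pi*3*v/16).
ramp = np.exp(2j * np.pi * 3 * np.arange(16) / 16)
print(np.allclose(phase[0], ramp))  # → True
```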
Further, the specific process of step S4 is as follows:
An inverse Fourier transform is performed on the cross-power spectrum of formula (4); the peak of the inverse transform corresponds to the time delay difference $\tau$ between the left and right images:

$$\tau = \min\{\Phi_u R(u,v)\,\Phi_v\} \tag{5}$$

where $\Phi_u = [1, e^{j2\pi u}, \dots, e^{j2\pi nu}]$ and $\Phi_v = [1, e^{j2\pi v}, \dots, e^{j2\pi mv}]^T$. The time delay estimate is then converted into a displacement estimate:

$$x_0 = k\tau \tag{6}$$

where $k$ is a coefficient obtained by regression of experimental data and $x_0$ is the displacement estimate between the left and right images.
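The coefficient $k$ in formula (6) comes from regressing experimental $(\tau, x_0)$ pairs. A minimal least-squares fit through the origin, using made-up calibration numbers, looks like:

```python
import numpy as np

# Hypothetical calibration data: delay estimates tau against ground-truth
# pixel displacements x0 (values are illustrative, not from the patent).
tau = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x0 = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Least-squares slope through the origin for the model x0 = k * tau.
k = float(tau @ x0 / (tau @ tau))
print(round(k, 3))  # → 2.004
```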
Further, the specific process of step S5 is as follows:
The displacement estimate $x_0$ of formula (6) yields the binocular ranging result through:

$$Z = \frac{fT}{x_0} \tag{7}$$

where $T$ is the horizontal distance (baseline) between the left and right cameras, $f$ is the focal length of the cameras, and $Z$ is the resulting binocular ranging distance.
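Assuming the standard stereo triangulation $Z = fT/x_0$ consistent with the variables defined above (with $f$ in pixels and $T$ in metres), the final step is a one-line computation; the numbers below are illustrative:

```python
def binocular_range(f_pixels, baseline_m, disparity_px):
    """Similar-triangles depth: Z = f * T / x0, with the focal length f
    in pixels, the baseline T in metres, and the disparity x0 in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return f_pixels * baseline_m / disparity_px

# 800 px focal length, 25 cm baseline, 40 px disparity -> 5 m range.
print(binocular_range(800.0, 0.25, 40.0))  # → 5.0
```

Note the hyperbolic relation: range resolution degrades quadratically with distance, since a one-pixel disparity error costs more metres the farther the obstacle is.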
Compared with the prior art, the principle and advantages of this scheme are as follows:
1. Applying power-spectrum cross-correlation time delay estimation to the displacement estimation of images captured by the binocular camera improves robustness and interference resistance, overcoming the defects of binocular ranging on a UAV platform: lack of robustness, susceptibility to interference, complex algorithms, and high computing-resource demand. Frequency-domain computation is also easy to accelerate in hardware, which improves response speed.
2. To address the cross-correlation algorithm's defects of large data storage and heavy consumption of computing resources, an improved recursive algorithm is proposed that reduces computation and resource consumption and increases computing speed, meeting the real-time requirements of embedded platforms.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required for describing the embodiments or the prior art are briefly introduced below. The drawings in the following description are clearly only some embodiments of the present invention; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a flow chart of a binocular ranging method of the present invention;
FIG. 2 is a schematic diagram of an image spectrum in an embodiment of the invention: wherein (a) is an original image, and (b) is a frequency spectrum schematic diagram of the image;
FIG. 3 is a table comparing the test results of the confidence propagation image matching algorithm of the present invention;
fig. 4 is a schematic view of the triangulation principle in binocular ranging of the present invention.
Detailed Description
The invention will be further illustrated with reference to specific examples:
in this embodiment, a binocular vision system based on an unmanned aerial vehicle platform is adopted to perform an online binocular vision ranging method based on cross-correlation time delay estimation as shown in fig. 1, and the specific process is as follows:
s1, the binocular camera acquires left and right views, and fast Fourier transform is carried out to obtain an image frequency spectrum:
Utilize a binocular camera to build the UAV ranging platform: the optical axes of the binocular camera point forward in parallel, offset by a fixed baseline, and the left and right cameras each acquire an image of size $M \times N$ pixels. The left image function is $I_L(x, y)$ and the right image function is $I_R(x, y)$, where $x, y$ are pixel coordinates and there is a displacement $x_0$ between the left and right images, i.e. $I_R(x, y) = I_L(x - x_0, y)$. Performing a Fourier transform on the left and right images gives the image spectra:

$$F_L(u,v) = \sum_{x=0}^{M-1}\sum_{y=0}^{N-1} I_L(x,y)\,e^{-j2\pi\left(\frac{ux}{M}+\frac{vy}{N}\right)}, \qquad F_R(u,v) = \sum_{x=0}^{M-1}\sum_{y=0}^{N-1} I_R(x,y)\,e^{-j2\pi\left(\frac{ux}{M}+\frac{vy}{N}\right)} \tag{1}$$

where $F_L(u, v)$ is the spectrum of the left image and $F_R(u, v)$ is the spectrum of the right image. Let $A_u = [1, e^{-j2\pi u}, \dots, e^{-j2\pi Nu}] \in \mathbb{C}^N$ and $A_v = [1, e^{-j2\pi v}, \dots, e^{-j2\pi Mv}]^T \in \mathbb{C}^M$; the Fourier transform can then be written as $F_L(u, v) = A_u I_L A_v$ and $F_R(u, v) = A_u I_R A_v$, where $I_L$ and $I_R$ are the left and right image matrices.
S2, filtering the image frequency spectrum obtained in the step S1, extracting high texture features, and removing noise:
The image spectra obtained in the step S1 are filtered to suppress noise interference and extract high-texture structural information. The filter used is $H(e^{ju}, e^{jv})$ [its explicit expression appears only as an equation image in the source]. The filter parameters can be obtained from a fitting relation [also an equation image in the source] in which $D(e^{ju}, e^{jv})$ is the prescribed ideal amplitude-frequency characteristic. The result of filtering the image spectra can then be expressed as:

$$\tilde{F}_L(u,v) = H(e^{ju}, e^{jv})\,F_L(u,v), \qquad \tilde{F}_R(u,v) = H(e^{ju}, e^{jv})\,F_R(u,v) \tag{2}$$

Introducing the matrix $P$, formula (2) can be represented as:

$$\tilde{F}_L(u,v) = A_u P I_L A_v, \qquad \tilde{F}_R(u,v) = A_u P I_R A_v \tag{3}$$
S3, performing cross-correlation power spectrum calculation on the left and right image spectrums obtained in the step S2 by using an online recursion technology:
Cross-correlation is performed on the filtered image spectra of formula (3) to obtain the cross-power spectrum of the left and right images:

$$R(u,v) = \tilde{F}_L(u,v)\,\tilde{F}_R^{*}(u,v) \tag{4}$$

where $R(u,v)$ is the cross-power spectrum and $\tilde{F}_R^{*}(u,v)$ is the conjugate of $\tilde{F}_R(u,v)$. The cross-correlation computation is then transformed with the online recursion technique: with $R(t)$ denoting the cross-correlation result at time $t$, $R(t+1)$ the result at time $t+1$, and letting $K = A_u P$, a recursive update of $R(t)$ into $R(t+1)$ is obtained [the explicit recursion appears only as an equation image in the source].
s4, carrying out an inverse Fourier transform on the cross-power spectrum obtained in the step S3 to obtain a time delay estimate and converting it into a displacement estimate:
An inverse Fourier transform is performed on the cross-power spectrum of formula (4); the peak of the inverse transform corresponds to the time delay difference $\tau$ between the left and right images:

$$\tau = \min\{\Phi_u R(u,v)\,\Phi_v\} \tag{5}$$

where $\Phi_u = [1, e^{j2\pi u}, \dots, e^{j2\pi nu}]$ and $\Phi_v = [1, e^{j2\pi v}, \dots, e^{j2\pi mv}]^T$. The time delay estimate is then converted into a displacement estimate:

$$x_0 = k\tau \tag{6}$$

where $k$ is a coefficient obtained by regression of experimental data and $x_0$ is the displacement estimate between the left and right images.
S5, deducing a binocular ranging result according to the displacement difference of the left image and the right image obtained in the step S4:
The displacement estimate $x_0$ of formula (6) yields the binocular ranging result through:

$$Z = \frac{fT}{x_0} \tag{7}$$

where $T$ is the horizontal distance (baseline) between the left and right cameras, $f$ is the focal length of the cameras, and $Z$ is the resulting binocular ranging distance.
To demonstrate the effectiveness of this embodiment, the following simulation verifies binocular vision ranging on an embedded platform:
the embedded operation platform selects a Samsung cotex-a 8-structured S5PV210 processor, a high-performance PowerVR SGX 5403D graphic engine and a 2D graphic engine are built in, and the method can perform relatively fast frequency domain operation, wherein the time consumed by band-pass filtering is 55ms, and the time consumed by power spectrum calculation is 45 ms. 3m, 5m and 7m are selected as reference distances of binocular ranging, under a simple scene similar to the aerial flight of an unmanned aerial vehicle, the average value of 20 measurement results is obtained, a classical confidence coefficient propagation algorithm is compared with the algorithm, and the comparison result is shown in fig. 3.
The experimental results show that the accuracy and speed of the proposed binocular ranging are both higher than those of belief propagation. At 752 × 480 resolution, belief propagation runs at 4.5 frames per second, which is insufficient for the real-time ranging requirements of high-speed UAV flight, while the proposed method runs at 9.2 frames per second and meets the real-time requirement during flight. Even at 376 × 240 resolution, belief propagation reaches only 8.9 frames per second and still cannot meet the real-time requirement during high-speed flight. The invention solves the image matching problem with cross-correlation time delay estimation, improving the robustness of image matching and speeding up binocular ranging.
The above-mentioned embodiments are merely preferred embodiments of the present invention and do not limit its scope of protection; variations made according to the shape and principle of the present invention shall be covered within the scope of protection of the present invention.

Claims (6)

1. An online binocular vision distance measurement method based on cross-correlation time delay estimation is characterized by comprising the following steps:
s1, building an unmanned aerial vehicle binocular ranging platform, acquiring left and right views by the left and right cameras, and performing fast Fourier transform to obtain an image frequency spectrum;
s2, filtering the image frequency spectrum obtained in the step S1, extracting high texture features and removing noise;
s3, performing cross-correlation power spectrum calculation on the left and right image frequency spectrums obtained in the step S2 by using an online recursion technology;
s4, carrying out an inverse Fourier transform on the cross-power spectrum obtained in the step S3 to obtain a time delay estimate and converting it into a displacement estimate;
and S5, deducing a binocular ranging result according to the displacement difference of the left image and the right image obtained in the step S4.
2. The on-line binocular vision ranging method based on cross-correlation time delay estimation according to claim 1, wherein the step S1 is specifically performed as follows:
Utilize a binocular camera to build the UAV ranging platform: the optical axes of the binocular camera point forward in parallel, offset by a fixed baseline, and the left and right cameras each acquire an image of size $M \times N$ pixels. The left image function is $I_L(x, y)$ and the right image function is $I_R(x, y)$, where $x, y$ are pixel coordinates and there is a displacement $x_0$ between the left and right images, i.e. $I_R(x, y) = I_L(x - x_0, y)$. Performing a Fourier transform on the left and right images gives the image spectra:

$$F_L(u,v) = \sum_{x=0}^{M-1}\sum_{y=0}^{N-1} I_L(x,y)\,e^{-j2\pi\left(\frac{ux}{M}+\frac{vy}{N}\right)}, \qquad F_R(u,v) = \sum_{x=0}^{M-1}\sum_{y=0}^{N-1} I_R(x,y)\,e^{-j2\pi\left(\frac{ux}{M}+\frac{vy}{N}\right)} \tag{1}$$

where $F_L(u, v)$ is the spectrum of the left image and $F_R(u, v)$ is the spectrum of the right image. Let $A_u = [1, e^{-j2\pi u}, \dots, e^{-j2\pi Nu}] \in \mathbb{C}^N$ and $A_v = [1, e^{-j2\pi v}, \dots, e^{-j2\pi Mv}]^T \in \mathbb{C}^M$; the Fourier transform can then be written as $F_L(u, v) = A_u I_L A_v$ and $F_R(u, v) = A_u I_R A_v$, where $I_L$ and $I_R$ are the left and right image matrices.
3. The on-line binocular vision ranging method based on cross-correlation delay estimation according to claim 2, wherein the specific process of the step S2 is as follows:
The image spectra obtained in the step S1 are filtered to suppress noise interference and extract high-texture structural information. The filter used is $H(e^{ju}, e^{jv})$ [its explicit expression appears only as an equation image in the source]. The filter parameters can be obtained from a fitting relation [also an equation image in the source] in which $D(e^{ju}, e^{jv})$ is the prescribed ideal amplitude-frequency characteristic. The result of filtering the image spectra can then be expressed as:

$$\tilde{F}_L(u,v) = H(e^{ju}, e^{jv})\,F_L(u,v), \qquad \tilde{F}_R(u,v) = H(e^{ju}, e^{jv})\,F_R(u,v) \tag{2}$$

Introducing the matrix $P$, formula (2) can be represented as:

$$\tilde{F}_L(u,v) = A_u P I_L A_v, \qquad \tilde{F}_R(u,v) = A_u P I_R A_v \tag{3}$$
4. The on-line binocular vision ranging method based on cross-correlation time delay estimation according to claim 3, wherein the specific process of the step S3 is as follows:
Cross-correlation is performed on the filtered image spectra of formula (3) to obtain the cross-power spectrum of the left and right images:

$$R(u,v) = \tilde{F}_L(u,v)\,\tilde{F}_R^{*}(u,v) \tag{4}$$

where $R(u,v)$ is the cross-power spectrum and $\tilde{F}_R^{*}(u,v)$ is the conjugate of $\tilde{F}_R(u,v)$. The cross-correlation computation is then transformed with the online recursion technique: with $R(t)$ denoting the cross-correlation result at time $t$, $R(t+1)$ the result at time $t+1$, and $K = A_u P$, a recursive update of $R(t)$ into $R(t+1)$ is obtained [the explicit recursion appears only as an equation image in the source].
5. The on-line binocular vision ranging method based on cross-correlation time delay estimation according to claim 4, wherein the specific process of the step S4 is as follows:
An inverse Fourier transform is performed on the cross-power spectrum of formula (4); the peak of the inverse transform corresponds to the time delay difference $\tau$ between the left and right images:

$$\tau = \min\{\Phi_u R(u,v)\,\Phi_v\} \tag{5}$$

where $\Phi_u = [1, e^{j2\pi u}, \dots, e^{j2\pi nu}]$ and $\Phi_v = [1, e^{j2\pi v}, \dots, e^{j2\pi mv}]^T$. The time delay estimate is then converted into a displacement estimate:

$$x_0 = k\tau \tag{6}$$

where $k$ is a coefficient obtained by regression of experimental data and $x_0$ is the displacement estimate between the left and right images.
6. The on-line binocular vision ranging method based on cross-correlation time delay estimation according to claim 5, wherein the specific process of the step S5 is as follows:
The displacement estimate $x_0$ of formula (6) yields the binocular ranging result through:

$$Z = \frac{fT}{x_0} \tag{7}$$

where $T$ is the horizontal distance (baseline) between the left and right cameras, $f$ is the focal length of the cameras, and $Z$ is the resulting binocular ranging distance.
CN202010870861.0A (priority and filing date 2020-08-26), published as CN111968170A (pending): Online binocular vision distance measurement method based on cross-correlation time delay estimation

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010870861.0A CN111968170A (en) 2020-08-26 2020-08-26 Online binocular vision distance measurement method based on cross-correlation time delay estimation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010870861.0A CN111968170A (en) 2020-08-26 2020-08-26 Online binocular vision distance measurement method based on cross-correlation time delay estimation

Publications (1)

Publication Number Publication Date
CN111968170A 2020-11-20

Family

ID=73390449

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010870861.0A Pending CN111968170A (en) 2020-08-26 2020-08-26 Online binocular vision distance measurement method based on cross-correlation time delay estimation

Country Status (1)

Country Link
CN (1) CN111968170A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113218560A (en) * 2021-04-19 2021-08-06 中国长江电力股份有限公司 Ultrasonic real-time estimation method for bolt pretightening force

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107564061A (en) * 2017-08-11 2018-01-09 浙江大学 Binocular vision odometer calculation method based on joint optimization of image gradients
WO2018086348A1 (en) * 2016-11-09 2018-05-17 人加智能机器人技术(北京)有限公司 Binocular stereo vision system and depth measurement method
CN110211169A (en) * 2019-06-06 2019-09-06 上海黑塞智能科技有限公司 Reconstructing method based on the relevant narrow baseline parallax of multiple dimensioned super-pixel and phase
CN110895792A (en) * 2019-10-12 2020-03-20 南方科技大学 Image splicing method and device
CN111159888A (en) * 2019-12-28 2020-05-15 上海师范大学 Covariance matrix sparse iteration time delay estimation method based on cross-correlation function
US20200193560A1 (en) * 2018-12-16 2020-06-18 Sadiki Pili Fleming-Mwanyoha System and methods for attaining optimal precision stereoscopic direction and ranging through air and across refractive boundaries using minimum variance sub-pixel registration


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
马秀博;孙熊伟;张德青;王良燕;: "基于机器视觉的对靶喷雾***时延估计方法研究", 农机化研究, no. 06, pages 56 - 60 *


Similar Documents

Publication Publication Date Title
WO2018127007A1 (en) Depth image acquisition method and system
Hambarde et al. Depth estimation from single image and semantic prior
WO2009023044A2 (en) Method and system for fast dense stereoscopic ranging
CN112785636B (en) Multi-scale enhanced monocular depth estimation method
Lin et al. Optimizing ZNCC calculation in binocular stereo matching
WO2023155387A1 (en) Multi-sensor target detection method and apparatus, electronic device and storage medium
CN116449384A (en) Radar inertial tight coupling positioning mapping method based on solid-state laser radar
Peng et al. Infrared small-target detection based on multi-directional multi-scale high-boost response
CN115421158A (en) Self-supervision learning solid-state laser radar three-dimensional semantic mapping method and device
CN113963117A (en) Multi-view three-dimensional reconstruction method and device based on variable convolution depth network
CN116310673A (en) Three-dimensional target detection method based on fusion of point cloud and image features
CN111968170A (en) Online binocular vision distance measurement method based on cross-correlation time delay estimation
CN109188436A (en) Efficient Bistatic SAR echo generation method suitable for any platform track
CN117496312A (en) Three-dimensional multi-target detection method based on multi-mode fusion algorithm
CN112489097A (en) Stereo matching method based on mixed 2D convolution and pseudo 3D convolution
CN117132737A (en) Three-dimensional building model construction method, system and equipment
CN104616304A (en) Self-adapting support weight stereo matching method based on field programmable gate array (FPGA)
CN116704200A (en) Image feature extraction and image noise reduction method and related device
CN116703996A (en) Monocular three-dimensional target detection algorithm based on instance-level self-adaptive depth estimation
CN116167947A (en) Image noise reduction method based on noise level estimation
Zeng et al. Tsfe-net: Two-stream feature extraction networks for active stereo matching
CN116466320A (en) Target detection method and device
CN111815670A (en) Multi-view target tracking method, device and system, electronic terminal and storage medium
Zhou et al. Automatic reconstruction of 3-D building structures for tomoSAR using neural networks
Bae et al. An accurate and cost-effective stereo matching algorithm and processor for real-time embedded multimedia systems

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 2020-11-20