CN111798485B - Event camera optical flow estimation method and system enhanced by IMU - Google Patents


Info

Publication number
CN111798485B
CN111798485B
Authority
CN
China
Prior art keywords
optical flow
imu
event
foreground
motion
Prior art date
Legal status
Active
Application number
CN202010620421.XA
Other languages
Chinese (zh)
Other versions
CN111798485A (en)
Inventor
余磊
付婧祎
杨文
叶琪霖
Current Assignee
Wuhan University WHU
Original Assignee
Wuhan University WHU
Priority date
Filing date
Publication date
Application filed by Wuhan University WHU filed Critical Wuhan University WHU
Priority to CN202010620421.XA
Publication of CN111798485A
Application granted
Publication of CN111798485B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/70 Denoising; Smoothing
    • G06T 5/73 Deblurring; Sharpening
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/269 Analysis of motion using gradient-based methods
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F 17/10 Complex mathematical operations
    • G06F 17/16 Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G06F 17/18 Complex mathematical operations for evaluating statistical data, e.g. average values, frequency distributions, probability functions, regression analysis
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/75 Circuitry for compensating brightness variation in the scene by influencing optical camera components


Abstract

The invention provides an event camera optical flow estimation method and system enhanced by an IMU (Inertial Measurement Unit). First, an EDI (Event-based Double Integral) model is used to reconstruct the blurred brightness image at any instant into a sharp brightness image, and a basic optical flow model is then established by combining it with the constant-brightness assumption. Next, the IMU is added to the basic optical flow model as a constraint, enabling continuous optical flow estimation in any motion-consistent scene. When a mixed-motion scene occurs, the background and the foreground are segmented and processed separately: an IMU constraint is introduced when estimating the background optical flow and a sparsity constraint when estimating the foreground optical flow; the two flows are estimated jointly by alternating iterative updates and finally combined into the continuous optical flow of the whole scene. Applicable scenes include motion-consistent scenes, in which only the background moves or foreground objects move consistently with it, and mixed-motion scenes, in which the background and foreground differ in motion direction and magnitude.

Description

Event camera optical flow estimation method and system enhanced by IMU
Technical Field
The invention belongs to the field of image processing, and particularly relates to a technical scheme for continuous optical flow estimation in a high-dynamic-range and high-speed motion scene.
Background
In the field of computer vision, optical flow estimation has long been a core technical problem and plays an important role in applications such as navigation, motion segmentation, tracking, and image registration. The event camera is a novel bio-inspired sensor: as shown in Fig. 1, its pixels operate independently and asynchronously generate a series of pulses only when the light intensity changes. These pulses are called "events", and each event consists of the spatio-temporal coordinates and the positive or negative polarity of the pixel whose brightness changed. Because the event camera samples events at the rate at which the scene changes, it has several advantages over conventional optical cameras: high temporal resolution, low latency, high dynamic range (HDR), and low power consumption and bandwidth. Owing to these advantages, the event camera captures motion well and can therefore overcome the problems that the limitations of conventional cameras cause in optical flow estimation.
At present, newer event cameras such as the DAVIS (Dynamic and Active-pixel Vision Sensor) are equipped with an Inertial Measurement Unit (IMU) module. The IMU module measures three-axis linear acceleration and angular velocity, is often used to acquire the three-dimensional motion of the camera for self-localization in applications such as SLAM (Simultaneous Localization and Mapping) and navigation, and can be time-synchronized with the event points and the brightness images. Because the event stream output by the event camera is easily affected by noise and other factors, the trajectories of the event points deviate; the camera motion information contained in the IMU data is closely related to these trajectories, and once the camera motion is obtained from the IMU, the motion trajectory of each event point can be inferred in reverse. The IMU can therefore be used to motion-compensate the event points, from which the event-point velocity information, i.e. the optical flow, can be obtained.
However, a small amount of IMU data generally carries too little information, and no suitable technical solution is available at present.
Disclosure of Invention
In order to bring the advantages of the event camera fully to bear on optical flow estimation, and to design a more efficient and widely applicable optical flow estimation method from its characteristics, the invention provides a continuous optical flow estimation scheme that introduces an IMU constraint and a sparsity constraint. The application scenes are motion-consistent scenes, in which only the background moves or foreground objects move consistently with it, and mixed-motion scenes, in which the background and foreground differ in motion direction and magnitude.
The technical solution of the invention provides an event camera optical flow estimation method enhanced by an IMU (Inertial Measurement Unit). First, an EDI (Event-based Double Integral) model is used to reconstruct the blurred brightness image at any instant into a sharp brightness image, and a basic optical flow model is established by combining it with the constant-brightness assumption. Then the IMU is added to the basic optical flow model as a constraint to achieve continuous optical flow estimation in any motion-consistent scene. When a mixed-motion scene occurs, the background and the foreground are segmented and processed separately: an IMU constraint is introduced when estimating the background optical flow and a sparsity constraint when estimating the foreground optical flow; the background and foreground optical flows of the mixed-motion scene are estimated jointly by alternating iterative updates and finally combined into the continuous optical flow of the whole scene.
Moreover, one implementation of establishing the basic optical flow model comprises the following steps:
Step 1.1, denote the i-th blurred brightness image generated within the exposure time T as y[i]; use the EDI model to compensate the blurred brightness image with the event points generated during the image-formation period, and compute the sharp brightness image I(f) at any time f:

I(f) = T·y[i] / E_i(f)

where E_i(f) denotes the double integral of the event points generated within the generation time T of the i-th blurred brightness image;
Step 1.2, based on the optical flow equation under the constant-brightness assumption, the optical flow computation is expressed as

∇I(f)·v = −c·p·δ(f − t_e)·I(f)

where δ is the Dirac function, v is the optical flow to be solved, ∇ denotes taking the spatial gradient, c is the threshold at which the camera fires an event point, p is the polarity of the fired event point, and t_e is the time at which the camera fires the event point;
this is abbreviated as

A·v = b + ε1

where the variables A = ∇I(f)ᵀ and b = −c·p·δ(f − t_e)·I(f), and ε1 denotes the error term of the optical flow estimated by the EDI model.
Moreover, the IMU is added to the basic optical flow model as a constraint to achieve continuous optical flow estimation in any motion-consistent scene; one implementation comprises the following steps:
Step 2.1, during event-camera motion, a series of event points and IMU data are output simultaneously, the IMU data comprising linear acceleration and angular velocity; the event points and the IMU data are aligned in time using the timestamps of the output data, and linear interpolation yields the transformation T_j of an arbitrary event point e_j, where t_j denotes the timestamp of that event point;
Step 2.2, each event point carries coordinate information; denote the pixel coordinate of event point e_j as x_j and the motion-compensated pixel coordinate as x_j'; from the camera projection model π(·) and the depth information Z(x_j) of event point e_j, the motion-compensated event-point pixel coordinate is obtained:

x_j' = π( T_j · Z(x_j)·π⁻¹(x_j) )

Step 2.3, knowing the original pixel coordinate x_j and the time interval Δt, the optical flow is estimated with the IMU and the event points:

v = m + ε2 (9)

where the variable m = (x_j' − x_j)/Δt and ε2 denotes the error term of the optical flow estimated by the IMU motion-compensation model;
Step 2.4, the IMU is introduced into the EDI model as a constraint for optical flow estimation, and the cost function is written by least squares:

F(v) = ||A·v − b||₂² + λ1·||v − m||₂²

where ||·||₂ denotes the L2 norm of a matrix and λ1 is the weight coefficient of the IMU constraint;
Step 2.5, the optical flow estimate v* is obtained by the least-squares solution:

v* = (AᵀA + λ1·E)⁻¹(Aᵀb + λ1·m)

where E denotes the identity matrix.
Moreover, when a mixed-motion scene occurs, the processing comprises the following steps:
Step 3.1, by Robust Principal Component Analysis (Robust PCA), the image matrix can be represented as a background matrix L plus a foreground sparse matrix S, giving the convex optimization problem:

min_{S,L} ||L||* + λ2·||S||₁

where ||·||* denotes the nuclear norm of a matrix, ||·||₁ the L1 norm of a matrix, and λ2 a regularization parameter used to adjust the weight of the sparse matrix;
Step 3.2, a sparsity-term constraint is added to the foreground optical flow v_f and an IMU constraint to the background optical flow v_b, giving:

v = v_b + v_f

and, with reference to the motion-consistent scene, the cost function with the foreground sparsity-term constraint added:

F(v_f, v_b) = ||A·(v_b + v_f) − b||₂² + λ1·||v_b − m||₂² + λ2·||v_f||₁

Step 3.3, the resulting optimization problem min_{v_b,v_f} F(v_f, v_b) is solved by alternating iterative updates to jointly estimate the background and foreground optical flows;
Step 3.4, the alternating updates are repeated until the iteration converges, and the foreground and background optical flows are added to obtain the continuous optical flow of the mixed-motion scene.
Furthermore, the joint estimation of background and foreground optical flows in step 3.3 is implemented as follows:
with the foreground optical flow v_f fixed, the background optical flow v_b is solved by

v_b = (AᵀA + λ1·E)⁻¹[Aᵀ(b − A·v_f) + λ1·m]

after obtaining the background optical flow v_b, v_b is fixed and the foreground optical flow v_f is updated, giving the following optimization problem:

min_{v_f} ||A·v_f − b̃||₂² + λ2·||v_f||₁

where the variable b̃ = b − A·v_b;
the optimization problem is solved iteratively by applying the ISTA method, and continued iterative updates until convergence finally yield the foreground optical flow v_f.
The invention also provides an event camera optical flow estimation system using IMU enhancement, which is used for executing the above event camera optical flow estimation method using IMU enhancement.
The method has the advantages that the background optical flow is compensated with the data output by the camera's own IMU, so the compensation is fast and effective; adding a sparsity constraint to the foreground optical flow avoids the influence of background compensation on the accuracy of the foreground optical flow, yielding a more accurate continuous optical flow estimate. Applicable scenes include motion-consistent scenes, in which only the background moves or foreground objects move consistently with it, and mixed-motion scenes, in which the background and foreground differ in motion direction and magnitude.
Drawings
Fig. 1 is a comparison graph of conventional camera and event camera data.
FIG. 2 is a schematic diagram of background optical flow compensation according to an embodiment of the present invention.
Fig. 3 is a flow chart of an embodiment of the present invention.
Detailed Description
For a clearer understanding of the present invention, its technical solution is described in detail below with reference to the accompanying drawings and embodiments.
In general, a small number of IMU samples carries too little information; in the invention, the IMU data over a period of time are integrated to obtain the trajectory of the event points, and after the event points are transformed, the continuous optical flow over that period can be estimated through an IMU motion-compensation model. However, the IMU optical flow model is limited to scenes in which event points are generated only by camera motion. The IMU is therefore introduced as a constraint, with an appropriate weight, into a specific optical flow model: the IMU constraint improves the accuracy of the optical flow estimation and also keeps the estimate well defined when the camera is still, increasing the robustness of the method. In mixed-motion scenes, however, the foreground and background move differently, and IMU compensation would instead produce a wrong foreground optical flow. The foreground and background therefore need to be separated, so that the advantage of the IMU for background optical flow estimation is preserved and the overall estimate becomes more accurate.
The invention first establishes an EDI (Event-based Double Integral) optical flow model using the DVS (Dynamic Vision Sensor) and APS (Active Pixel Sensor) data of an event camera, estimating continuous optical flow while reconstructing a sharp brightness image. An inertial measurement unit is then introduced, and the IMU is added to the EDI optical flow model as a constraint to achieve continuous optical flow estimation in any motion-consistent scene. Finally, to remove the scene restriction that remains after adding the IMU constraint, the foreground and the background are segmented and processed separately: an IMU constraint is introduced when estimating the background optical flow and a sparsity constraint when estimating the foreground optical flow; the background and foreground optical flows of an arbitrary motion scene can then be estimated by alternating iterative updates and finally combined into the continuous optical flow of the whole scene.
Referring to fig. 3, an embodiment of the present invention provides an event camera optical flow estimation method enhanced by an IMU, including the following steps:
step 1, firstly, reconstructing a fuzzy brightness image at any moment into a clear brightness image by using an EDI (extended display identification) model, and then deducing an optical flow calculation method based on the EDI model by combining an optical flow formula of 'brightness constancy hypothesis', wherein the optical flow calculation method is used as a basic optical flow model used by the invention.
Step 1.1, denote the i-th blurred brightness image generated within the exposure time T as y[i]. Using the EDI model, the blurred brightness image is compensated with the event points generated during its formation period, and the sharp brightness image I(f) at any time f is computed:

I(f) = T·y[i] / E_i(f) (1)

where E_i(f) denotes the double integral of the event points generated within the generation time T of the i-th blurred brightness image:

E_i(f) = ∫_{t_i}^{t_i+T} exp( c·∫_f^t e(τ) dτ ) dt (2)

where t_i is the exposure start time, f and t are any times within the exposure period, c is the threshold at which the camera fires an event point, τ is the integration variable, and e(t) is a function of continuous time t, defined in the invention as:

e(t) = p·δ(t − t_e)

where p is the polarity of the fired event point, t_e is the time at which the camera fires the event point, and δ is the Dirac function.
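For illustration, a minimal Python sketch of the reconstruction in Eqs. (1)-(2) follows. The discretization of the outer integral, the contrast threshold c = 0.2, and the idealized noise-free event stream are assumptions made for this example, not values fixed by the invention.

import numpy as np

def edi_reconstruct(y_blur, events, t_i, T, f, c=0.2, n_steps=200):
    # Sketch of Eqs. (1)-(2): recover a sharp image I(f) from a blurred frame
    # y_blur and the events (t_e, x, y, p) fired during the exposure [t_i, t_i+T].
    H, W = y_blur.shape
    dt = T / n_steps
    ts = t_i + dt * (np.arange(n_steps) + 0.5)   # midpoints of the exposure grid
    evs = sorted(events, key=lambda e: e[0])

    def cum_polarity(t):
        # per-pixel cumulative polarity sum of the events fired up to time t
        acc = np.zeros((H, W))
        for t_e, x, y, p in evs:
            if t_e > t:
                break
            acc[y, x] += p
        return acc

    N_f = cum_polarity(f)
    # outer integral of Eq. (2): E_i(f) = sum over t of exp(c*(N(t) - N(f))) * dt
    E_i = sum(np.exp(c * (cum_polarity(t) - N_f)) for t in ts) * dt
    return T * y_blur / np.maximum(E_i, 1e-9)    # Eq. (1): I(f) = T*y[i] / E_i(f)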
Step 1.2, the optical flow equation based on the constant-brightness assumption can be expressed as:

∇B·v + ∂B/∂t = 0 (3)

where ∇B denotes the spatial gradient of an arbitrary brightness image B, ∂B/∂t denotes its time derivative, and v is the optical flow to be solved. Combining this with the EDI model gives:

∇I(f)·v + ∂I(f)/∂f = 0,  with  ∂I(f)/∂f = −T·y[i]·E_i′(f) / E_i(f)² (4)

where E_i′(f) is the derivative of E_i(f) with respect to time f and can be expressed as:

E_i′(f) = −c·e(f)·E_i(f) (5)

The final expression of the optical flow computation is as follows:

∇I(f)·v = −c·p·δ(f − t_e)·I(f)

It is abbreviated as:

A·v = b + ε1 (6)

where the variables A = ∇I(f)ᵀ and b = −c·p·δ(f − t_e)·I(f), and ε1 denotes the error term of the optical flow estimated by the EDI model.
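The following Python sketch shows how the variables A and b of Eq. (6) could be assembled; stacking the per-pixel rows over the set of pixels firing at time f, so that the system A·v = b becomes solvable for v, is an assumption of this example, since the text above does not fix the aggregation strategy.

import numpy as np

def build_flow_system(I_f, event_mask, polarity, c=0.2):
    # Assemble A = grad I(f)^T and b = -c*p*I(f) at the firing pixels
    # (the Dirac delta in the expression above selects exactly those pixels).
    gy, gx = np.gradient(I_f)                    # spatial gradient of I(f)
    rows_A, rows_b = [], []
    ys, xs = np.nonzero(event_mask)
    for y, x in zip(ys, xs):
        rows_A.append([gx[y, x], gy[y, x]])      # one row of A per event pixel
        rows_b.append(-c * polarity[y, x] * I_f[y, x])
    return np.array(rows_A), np.array(rows_b)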
Step 2, the IMU motion-compensation model is added to the basic optical flow model as a constraint to achieve continuous optical flow estimation in any motion-consistent scene.
Step 2.1, during event-camera motion, a series of event points and IMU data are output simultaneously, the IMU data comprising linear acceleration and angular velocity. Using the timestamps of the output data, the event points and the IMU data are aligned in time. As shown in Fig. 2, on the time axis t the dots represent event points, the squares represent IMU data, and I₁, I₂, I₃, I₄ above them represent the corresponding image-frame sequence. Integrating the IMU data within the time interval separating frames I₂ and I₃ yields the transformation matrix T₂,₃ between the two frames, namely:

T₂,₃ = [ R₂,₃  p₂,₃ ; 0  1 ] (7)

where double integration of the linear acceleration gives the translation increment p₂,₃, and integration of the angular velocity gives the rotation increment R₂,₃. Let t_k^e denote the timestamp of the k-th event point around any time f, and let T_{j,2} and T_{j,3} denote the transformations of an arbitrary event point e_j relative to the two reference timestamps. Linear interpolation between T_{j,2} and T_{j,3} then yields the transformation T_j of the arbitrary event point e_j, where t_j denotes the timestamp of that event point.
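A Python sketch of step 2.1 is given below, using scipy for the rotation algebra. The simple Euler integrator, the omission of gravity and bias compensation, and the slerp/lerp form of the linear interpolation are assumptions of this example rather than details fixed by the invention.

import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def integrate_imu(imu, t0, t1):
    # Eq. (7): integrate IMU samples (t, gyro, accel) over [t0, t1]; the
    # angular velocity integrates to the rotation R and the linear
    # acceleration double-integrates to the translation p.
    R = np.eye(3); p = np.zeros(3); v = np.zeros(3)
    prev_t = t0
    for t, gyro, accel in imu:
        if t < t0 or t > t1:
            continue
        dt = t - prev_t
        R = R @ Rotation.from_rotvec(gyro * dt).as_matrix()
        v += R @ accel * dt          # first integral of acceleration
        p += v * dt                  # second integral -> translation
        prev_t = t
    T = np.eye(4); T[:3, :3] = R; T[:3, 3] = p
    return T

def interp_transform(T_a, T_b, t_a, t_b, t_j):
    # linear interpolation of the transform of event point e_j between two
    # reference timestamps: slerp on the rotation, lerp on the translation
    w = (t_j - t_a) / (t_b - t_a)
    slerp = Slerp([0.0, 1.0], Rotation.from_matrix([T_a[:3, :3], T_b[:3, :3]]))
    T_j = np.eye(4)
    T_j[:3, :3] = slerp([w])[0].as_matrix()
    T_j[:3, 3] = (1 - w) * T_a[:3, 3] + w * T_b[:3, 3]
    return T_j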
Step 2.2, each event point carries coordinate information. Denote the pixel coordinate of event point e_j as x_j and the motion-compensated pixel coordinate as x_j'. The pixel coordinate x_j is back-projected into the world coordinate system through the camera projection model π(·) and the depth information Z(x_j) of event point e_j, giving the back-projected coordinate x_{j1}:

x_{j1} = Z(x_j)·π⁻¹(x_j)

The three-dimensional coordinate x_{j1} is then transformed by the obtained transformation matrix of the event point, the transformed coordinate being denoted x_{j2}:

x_{j2} = T_j·x_{j1}

Finally, the coordinate x_{j2} is projected back onto pixel coordinates through the camera projection model π(·), giving the motion-compensated event-point pixel coordinate:

x_j' = π(x_{j2}) (8)
Step 2.3, knowing the original pixel coordinate x_j and the time interval Δt, the optical flow can be estimated with the IMU and the event points, written as

v = m + ε2 (9)

where the variable m = (x_j' − x_j)/Δt and ε2 denotes the error term of the optical flow estimated by the IMU motion-compensation model;
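Steps 2.2-2.3 can be illustrated with the following Python sketch under a pinhole camera model with intrinsic matrix K; the pinhole form of π(·) and the sign convention of m are assumptions of the example.

import numpy as np

def imu_flow_prior(x_j, Z_j, K, T_j, dt):
    # back-projection x_j1 = Z(x_j) * pi^{-1}(x_j)
    x_h = np.array([x_j[0], x_j[1], 1.0])
    X1 = Z_j * (np.linalg.inv(K) @ x_h)
    # rigid transform x_j2 = T_j * x_j1 (homogeneous coordinates)
    X2 = T_j[:3, :3] @ X1 + T_j[:3, 3]
    # re-projection x_j' = pi(x_j2), Eq. (8)
    x_proj = K @ X2
    x_comp = x_proj[:2] / x_proj[2]
    # Eq. (9) prior: m = (x_j' - x_j) / dt
    return (x_comp - np.asarray(x_j, dtype=float)) / dt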
Step 2.4, the IMU is introduced into the EDI model as a constraint for optical flow estimation, and the cost function is written by least squares:

F(v) = ||A·v − b||₂² + λ1·||v − m||₂² (10)

where ||·||₂ denotes the L2 norm of a matrix and λ1 is the weight coefficient of the IMU constraint; the larger the influence of the IMU, the larger λ1 should be. Experiments give a preferred range of [0, 1] for λ1, and it can be set to zero when the camera is at rest and outputs no IMU data;
Step 2.5, the optical flow estimate v* is obtained by the least-squares solution:

v* = (AᵀA + λ1·E)⁻¹(Aᵀb + λ1·m) (11)

where E denotes the identity matrix.
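A direct Python rendering of Eq. (11) follows; λ1 = 0.5 is only an assumed default inside the [0, 1] range reported above, and A, b, m are the quantities defined in Eqs. (6) and (9).

import numpy as np

def solve_flow_imu(A, b, m, lam1=0.5):
    # v* = (A^T A + lam1*E)^{-1} (A^T b + lam1*m), Eq. (11)
    E = np.eye(A.shape[1])
    return np.linalg.solve(A.T @ A + lam1 * E, A.T @ b + lam1 * m)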
The EDI-model optical flow estimation method and the EDI optical flow method with the IMU constraint introduced are named EDIF and EDIF_IMU, respectively. These two methods are then compared with two optical flow estimation methods in common use for event cameras: optical flow estimated by image contrast maximization, named DVS-CM, and optical flow estimated by local plane fitting on the surface of active events (SAE), named DVS-LP.
The error between the optical flow estimate and the ground truth is measured by the Average Endpoint Error (AEE) and the Average Angular Error (AAE) with standard deviation, defined as:

AEE = (1/N)·Σ_{i=1}^{N} ||v_i − u_i||₂
AAE = (1/N)·Σ_{i=1}^{N} arccos( (v_i·u_i + 1) / ( √(||v_i||² + 1)·√(||u_i||² + 1) ) )

where v_i = (v_{x,i}, v_{y,i}) denotes the i-th optical flow measurement, u_i = (u_{x,i}, u_{y,i}) the corresponding ground-truth optical flow, v_{x,i} and v_{y,i} the components of the optical flow in the x and y directions, and N the total number of optical flow vectors. The errors of the four methods are compared in Table 1; overall, EDIF_IMU gives the best results, with the smallest endpoint error AEE and angular error AAE, and EDIF also outperforms the other two existing methods, which demonstrates the effectiveness of the method of the present invention.
Table 1: comparison of AEE and AAE errors for DVS-CM, DVS-LP, EDIF and EDIF_IMU (values given as a figure in the original document).
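The two error measures can be computed as follows in Python; the (v_x, v_y, 1) lifting used in the angular error is an assumption about the exact AAE variant employed here.

import numpy as np

def flow_errors(v, u):
    # v, u: arrays of shape (N, 2) with estimated and ground-truth flows
    aee = np.mean(np.linalg.norm(v - u, axis=1))                 # AEE
    num = 1.0 + np.sum(v * u, axis=1)
    den = np.sqrt(1.0 + np.sum(v ** 2, axis=1)) * np.sqrt(1.0 + np.sum(u ** 2, axis=1))
    aae = np.mean(np.degrees(np.arccos(np.clip(num / den, -1.0, 1.0))))  # AAE
    return aee, aae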
Step 3, after the IMU constraint is added, the optical flow estimation method described above is still limited to motion-consistent scenes. The background and the foreground are therefore further segmented and processed separately: an IMU constraint is introduced when estimating the background optical flow and a sparsity constraint when estimating the foreground optical flow, so that the background and foreground optical flows of an arbitrary motion scene can be estimated jointly by alternating iterative updates and finally combined into the continuous optical flow of the whole scene.
Step 3.1, by Robust Principal Component Analysis (Robust PCA), the image matrix can be represented as a background matrix L plus a foreground sparse matrix S, which can be written as the convex optimization problem:

min_{S,L} ||L||* + λ2·||S||₁ (12)

where ||·||* denotes the nuclear norm of a matrix, i.e. the sum of its singular values, ||·||₁ denotes the entrywise L1 norm of a matrix, i.e. the sum of the absolute values of its entries, and λ2 is a regularization parameter used to adjust the weight of the sparse matrix; the embodiment obtains, through experiments, a preferred suggested value of λ2 = 0.3.
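The convex problem of Eq. (12) can be solved, for example, by the inexact augmented Lagrange multiplier (IALM) method; the text above only poses the problem, so the solver sketched below and its default parameters are assumptions of this example (lam plays the role of λ2).

import numpy as np

def rpca_ialm(D, lam=0.3, tol=1e-7, max_iter=500):
    # min ||L||_* + lam*||S||_1  s.t.  L + S = D
    norm_D = np.linalg.norm(D)
    mu = 1.25 / np.linalg.norm(D, 2)             # based on the spectral norm of D
    Y = D / max(np.linalg.norm(D, 2), np.abs(D).max() / lam)
    L = np.zeros_like(D); S = np.zeros_like(D)

    def shrink(X, tau):                           # soft thresholding
        return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

    for _ in range(max_iter):
        # singular-value thresholding for the low-rank background part L
        U, s, Vt = np.linalg.svd(D - S + Y / mu, full_matrices=False)
        L = (U * shrink(s, 1.0 / mu)) @ Vt
        # soft thresholding for the sparse foreground part S
        S = shrink(D - L + Y / mu, lam / mu)
        Z = D - L - S
        Y = Y + mu * Z
        if np.linalg.norm(Z) / norm_D < tol:
            break
    return L, S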
Step 3.2, a sparsity-term constraint is added to the foreground optical flow v_f and an IMU constraint to the background optical flow v_b. The optical flow can thus be written as:

v = v_b + v_f (13)

With reference to the motion-consistent scene, the cost function with the foreground sparsity-term constraint added can be obtained:

F(v_f, v_b) = ||A·(v_b + v_f) − b||₂² + λ1·||v_b − m||₂² + λ2·||v_f||₁ (14)
step 3.3, obtaining an optimization problem minvf(vf,vb) Jointly estimating the optical flows of the background and the foreground by adopting an alternative iterative updating method, and if the optical flows of the foreground are fixed, determining the optical flows v of the foregroundfThe background light flow v can be solved by the following formulab
vb=(ATA+λ1E)-1[AT(b-A·vf)+λ1m] (15)
Obtaining a background light stream vbAfter that, v is fixedbThen updating the foreground optical flow vfI.e. the following optimization problem:
Figure BDA0002562857850000082
wherein the variable
Figure BDA0002562857850000083
The variable b ═ b-a · vb
The optimization problem is solved iteratively with ISTA (the iterative shrinkage-thresholding algorithm). The iteration step can be expressed as:

v_f^{k+1} = T_{λ2·t_k}( v_f^k − 2·t_k·Aᵀ(A·v_f^k − b̃) ) (17)

where k indexes the iterations, t_k > 0 denotes the iteration step size, and T_α(·) denotes the soft-threshold operator:

T_α(x) = sign(x)·max(|x| − α, 0) (18)

where x denotes any variable and sign(·) denotes the sign function. The update is iterated until convergence, finally yielding the foreground optical flow v_f.
Step 3.4, the updates alternate: the foreground optical flow v_f is fixed to solve the background optical flow v_b, then the background optical flow v_b is fixed to solve the foreground optical flow v_f, and so on until the iteration converges. Finally, the foreground and background optical flows are added to obtain the continuous optical flow v of the mixed-motion scene.
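The full alternating scheme of steps 3.3-3.4 could look as follows in Python. The outer and inner iteration counts are assumptions of this example, and the fixed ISTA step size is chosen as t_k = 1/(2·||A||₂²), which satisfies the standard ISTA convergence condition for the quadratic term of Eq. (16).

import numpy as np

def shrink(x, tau):
    # soft-threshold operator of Eq. (18)
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def joint_flow(A, b, m, lam1=0.5, lam2=0.3, n_outer=20, n_ista=50):
    n = A.shape[1]
    v_b = np.zeros(n); v_f = np.zeros(n)
    E = np.eye(n)
    t_k = 1.0 / (2.0 * np.linalg.norm(A, 2) ** 2)   # conservative fixed step
    for _ in range(n_outer):
        # fix v_f, solve v_b in closed form, Eq. (15)
        v_b = np.linalg.solve(A.T @ A + lam1 * E,
                              A.T @ (b - A @ v_f) + lam1 * m)
        # fix v_b, run ISTA on the l1-regularised subproblem, Eqs. (16)-(18)
        b_t = b - A @ v_b
        for _ in range(n_ista):
            grad = 2.0 * A.T @ (A @ v_f - b_t)
            v_f = shrink(v_f - t_k * grad, lam2 * t_k)
    return v_b + v_f   # continuous optical flow of the mixed-motion scene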
The continuous optical flow estimation method introducing the sparsity constraint is named EDIMU_Sparse, and the proposed method is tested with DVS-CM as the comparison method. Compared with applying the image-contrast-maximization method to a mixed-motion scene, the continuous optical flow estimated by the proposed EDIMU_Sparse is smoother and its background optical flow more consistent, which demonstrates the effectiveness of the method.
In specific implementation, the above process can be run automatically by means of computer software technology, and a corresponding system device implementing the process also falls within the protection scope of the invention.
It should be understood that the above description is for illustrative purposes only and should not be taken as limiting the scope of the present invention, which is defined by the appended claims.

Claims (5)

1. An event camera optical flow estimation method using IMU enhancement, characterized in that: first, an EDI model is used to reconstruct the blurred brightness image at any instant into a sharp brightness image, and a basic optical flow model is established by combining it with the constant-brightness assumption, where EDI denotes the event-based double integral; then the IMU is added to the basic optical flow model as a constraint to achieve continuous optical flow estimation in any motion-consistent scene; when a mixed-motion scene occurs, the background and the foreground are segmented, an IMU constraint is introduced when estimating the background optical flow, a sparsity constraint is introduced when estimating the foreground optical flow, the background and foreground optical flows of the mixed-motion scene are estimated jointly by alternating iterative updates, and they are finally combined to obtain the continuous optical flow of the whole scene;
when a mixed-motion scene occurs, the processing comprises the following steps:
step 3.1, representing the image matrix as a background matrix L and a foreground sparse matrix S by Robust Principal Component Analysis (Robust PCA), obtaining the convex optimization problem:

min_{S,L} ||L||* + λ2·||S||₁

where ||·||* denotes the nuclear norm of a matrix, ||·||₁ denotes the L1 norm of a matrix, and λ2 is a regularization parameter used to adjust the weight of the sparse matrix;
step 3.2, adding a sparsity-term constraint to the foreground optical flow v_f and an IMU constraint to the background optical flow v_b, obtaining the optical flow:

v = v_b + v_f

and, with reference to the motion-consistent scene, the cost function with the foreground sparsity-term constraint added:

F(v_f, v_b) = ||A·(v_b + v_f) − b||₂² + λ1·||v_b − m||₂² + λ2·||v_f||₁

where λ1 denotes the weight coefficient of the IMU constraint; the pixel coordinate of event point e_j is denoted x_j and the motion-compensated pixel coordinate is denoted x_j'; knowing the original pixel coordinate x_j and the time interval Δt, the variable m = (x_j' − x_j)/Δt;
step 3.3, solving the resulting optimization problem min_{v_b,v_f} F(v_f, v_b) by alternating iterative updates to jointly estimate the background and foreground optical flows;
step 3.4, alternating the iterative updates until the iteration converges, and adding the foreground and background optical flows to obtain the continuous optical flow of the mixed-motion scene.
2. The event camera optical flow estimation method using IMU enhancement as claimed in claim 1, characterized in that: one implementation of establishing the basic optical flow model comprises the following steps:
step 1.1, denoting the i-th blurred brightness image generated within the exposure time T as y[i], using the EDI model to compensate the blurred brightness image with the event points generated during the image-formation period, and computing the sharp brightness image I(f) at any time f:

I(f) = T·y[i] / E_i(f)

where E_i(f) denotes the double integral of the event points generated within the generation time T of the i-th blurred brightness image;
step 1.2, based on the optical flow equation under the constant-brightness assumption, obtaining the optical flow computation expression:

∇I(f)·v = −c·p·δ(f − t_e)·I(f)

where δ is the Dirac function, v is the optical flow to be solved, ∇ denotes taking the spatial gradient, c is the threshold at which the camera fires an event point, p is the polarity of the fired event point, and t_e is the time at which the camera fires the event point;
abbreviated as:

A·v = b + ε1

where the variables A = ∇I(f)ᵀ and b = −c·p·δ(f − t_e)·I(f), and ε1 denotes the error term of the optical flow estimated by the EDI model.
3. The event camera optical flow estimation method using IMU enhancement as claimed in claim 2, characterized in that: the IMU is added to the basic optical flow model as a constraint to achieve continuous optical flow estimation in any motion-consistent scene, one implementation comprising the following steps:
step 2.1, during event-camera motion, simultaneously outputting a series of event points and IMU data, the IMU data comprising linear acceleration and angular velocity, and aligning the event points and the IMU data in time using the timestamps of the output data; obtaining the transformation T_j of an arbitrary event point e_j by linear interpolation, t_j denoting the timestamp of that event point;
step 2.2, each event point carrying coordinate information, denoting the pixel coordinate of event point e_j as x_j and the motion-compensated pixel coordinate as x_j', and obtaining the motion-compensated event-point pixel coordinate from the camera projection model π(·) and the depth information Z(x_j) of event point e_j:

x_j' = π( T_j · Z(x_j)·π⁻¹(x_j) )

step 2.3, knowing the original pixel coordinate x_j and the time interval Δt, estimating the optical flow with the IMU and the event points:

v = m + ε2 (9)

where the variable m = (x_j' − x_j)/Δt and ε2 denotes the error term of the optical flow estimated by the IMU motion-compensation model;
step 2.4, introducing the IMU into the EDI model as a constraint for optical flow estimation and writing the cost function by least squares:

F(v) = ||A·v − b||₂² + λ1·||v − m||₂²

where ||·||₂ denotes the L2 norm of a matrix and λ1 is the weight coefficient of the IMU constraint;
step 2.5, obtaining the optical flow estimate v* by the least-squares solution:

v* = (AᵀA + λ1·E)⁻¹(Aᵀb + λ1·m)

where E denotes the identity matrix.
4. The event camera optical flow estimation method using IMU enhancement as claimed in claim 1, 2 or 3, characterized in that: the joint estimation of background and foreground optical flows in step 3.3 is implemented as follows:
with the foreground optical flow v_f fixed, the background optical flow v_b is solved by:

v_b = (AᵀA + λ1·E)⁻¹[Aᵀ(b − A·v_f) + λ1·m]

after obtaining the background optical flow v_b, v_b is fixed and the foreground optical flow v_f is updated, giving the following optimization problem:

min_{v_f} ||A·v_f − b̃||₂² + λ2·||v_f||₁

where E denotes the identity matrix and the variable b̃ = b − A·v_b;
the optimization problem is solved iteratively by applying the ISTA method, and continued iterative updates until convergence finally yield the foreground optical flow v_f.
5. An event camera optical flow estimation system enhanced with IMU, characterized in that it is configured to perform the event camera optical flow estimation method using IMU enhancement as claimed in any one of claims 1 to 4.
CN202010620421.XA 2020-06-30 2020-06-30 Event camera optical flow estimation method and system enhanced by IMU Active CN111798485B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010620421.XA CN111798485B (en) 2020-06-30 2020-06-30 Event camera optical flow estimation method and system enhanced by IMU

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010620421.XA CN111798485B (en) 2020-06-30 2020-06-30 Event camera optical flow estimation method and system enhanced by IMU

Publications (2)

Publication Number Publication Date
CN111798485A CN111798485A (en) 2020-10-20
CN111798485B true CN111798485B (en) 2022-07-19

Family

ID=72810841

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010620421.XA Active CN111798485B (en) 2020-06-30 2020-06-30 Event camera optical flow estimation method and system enhanced by IMU

Country Status (1)

Country Link
CN (1) CN111798485B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022141418A1 (en) * 2020-12-31 2022-07-07 华为技术有限公司 Image processing method and device
CN114137247B (en) * 2021-11-30 2024-01-19 上海科技大学 Speed sensing method, device, equipment and medium based on event camera
CN114842386B (en) * 2022-05-06 2024-05-17 中国科学技术大学 Event motion segmentation method for progressive iterative optimization of event camera
CN117739996B (en) * 2024-02-21 2024-04-30 西北工业大学 Autonomous positioning method based on event camera inertial tight coupling

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106204477A (en) * 2016-07-06 2016-12-07 天津大学 Video frequency sequence background restoration methods based on online low-rank background modeling
CN107687850A (en) * 2017-07-26 2018-02-13 哈尔滨工业大学深圳研究生院 A kind of unmanned vehicle position and orientation estimation method of view-based access control model and Inertial Measurement Unit
US10600189B1 (en) * 2016-12-06 2020-03-24 Apple Inc. Optical flow techniques for event cameras

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7253258B2 (en) * 2017-05-29 2023-04-06 ユニベアズィテート チューリッヒ Block-matching optical flow and stereo vision for dynamic vision sensors
CN110120098B (en) * 2018-02-05 2023-10-13 浙江商汤科技开发有限公司 Scene scale estimation and augmented reality control method and device and electronic equipment
US11244464B2 (en) * 2018-03-09 2022-02-08 Samsung Electronics Co., Ltd Method and apparatus for performing depth estimation of object

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106204477A (en) * 2016-07-06 2016-12-07 天津大学 Video frequency sequence background restoration methods based on online low-rank background modeling
US10600189B1 (en) * 2016-12-06 2020-03-24 Apple Inc. Optical flow techniques for event cameras
CN107687850A (en) * 2017-07-26 2018-02-13 哈尔滨工业大学深圳研究生院 A kind of unmanned vehicle position and orientation estimation method of view-based access control model and Inertial Measurement Unit

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Asynchronous frameless event-based optical flow; Ryad Benosman et al.; Neural Networks; 2012-12-31; entire document *
An optical flow computation method combining local and global constraints; Zhang Jianming et al.; Computer Engineering and Science; 2005-12-31; Vol. 27, No. 5; entire document *

Also Published As

Publication number Publication date
CN111798485A (en) 2020-10-20

Similar Documents

Publication Publication Date Title
CN111798485B (en) Event camera optical flow estimation method and system enhanced by IMU
Casser et al. Depth prediction without the sensors: Leveraging structure for unsupervised learning from monocular videos
Kim et al. Simultaneous mosaicing and tracking with an event camera
US11238606B2 (en) Method and system for performing simultaneous localization and mapping using convolutional image transformation
WO2020253618A1 (en) Video jitter detection method and device
CN113286194A (en) Video processing method and device, electronic equipment and readable storage medium
CN111899276A (en) SLAM method and system based on binocular event camera
CN112233179B (en) Visual odometer measuring method
CN113269682B (en) Non-uniform motion blur video restoration method combined with interframe information
WO2023071790A1 (en) Pose detection method and apparatus for target object, device, and storage medium
CN113744337A (en) Synchronous positioning and mapping method integrating vision, IMU and sonar
CN115375581A (en) Dynamic visual event stream noise reduction effect evaluation method based on event time-space synchronization
CN111899345B (en) Three-dimensional reconstruction method based on 2D visual image
Li et al. Gyroflow: Gyroscope-guided unsupervised optical flow learning
CN111798484B (en) Continuous dense optical flow estimation method and system based on event camera
CN112131991B (en) Event camera-based data association method
CN110874569B (en) Unmanned aerial vehicle state parameter initialization method based on visual inertia fusion
Xue et al. Event-based non-rigid reconstruction from contours
CN116151320A (en) Visual odometer method and device for resisting dynamic target interference
Tistarelli Computation of coherent optical flow by using multiple constraints
Ren et al. Self-calibration method of gyroscope and camera in video stabilization
CN113256711A (en) Pose estimation method and system of monocular camera
US20240029283A1 (en) Image depth prediction method, electronic device, and non-transitory storage medium
Lobo et al. Bioinspired visuo-vestibular artificial perception system for independent motion segmentation
CN112529936B (en) Monocular sparse optical flow algorithm for outdoor unmanned aerial vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant