CN114913239B - Combined calibration method and device for event camera sensor and RGB camera - Google Patents

Combined calibration method and device for event camera sensor and RGB camera

Info

Publication number
CN114913239B
Authority
CN
China
Prior art keywords: picture, event, camera, edge, transformation matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210357968.4A
Other languages
Chinese (zh)
Other versions
CN114913239A (en)
Inventor
赵开春
尤政
朱佳佳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tsinghua University
Original Assignee
Tsinghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tsinghua University filed Critical Tsinghua University
Priority to CN202210357968.4A
Publication of CN114913239A
Application granted
Publication of CN114913239B
Legal status: Active

Links

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/80: Geometric correction
    • G06T7/00: Image analysis
    • G06T7/10: Segmentation; Edge detection
    • G06T7/13: Edge detection
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Studio Devices (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The invention discloses a method and a device for jointly calibrating an event camera sensor and an RGB camera. The method comprises the following steps: aggregating the events output by the event camera into a first picture according to the timestamps of the RGB camera; extracting edges from the first picture and from a second picture acquired by the RGB camera to obtain a first edge map and a second edge map; extracting straight lines from the first edge map and the second edge map, and calculating the intersection points of the straight lines as virtual corner points; solving, from the virtual corner points, a transformation matrix between adjacent second pictures and a transformation matrix between adjacent first pictures; and calculating the transformation matrix between the event camera and the RGB camera based on these two inter-frame transformation matrices. The invention can calibrate an event camera together with a traditional RGB camera and achieves high calibration accuracy.

Description

Combined calibration method and device for event camera sensor and RGB camera
Technical Field
The invention relates to the technical field of joint calibration of sensors, in particular to a joint calibration method and device for an event camera sensor and an RGB camera.
Background
In recent years, researchers have noticed that biological vision systems outperform traditional digital vision systems in quality, power consumption, response speed and other respects. The event camera, also known as a dynamic vision sensor (DVS), is a biologically inspired silicon-retina vision sensor. It represents a significant shift in the way visual information is acquired, because an event camera samples pixel information according to scene dynamics rather than on a clock that is independent of the observed scene. Its advantages are high temporal resolution and low latency (on the order of microseconds), high dynamic range (140 dB) and low power consumption. The asynchronous nature of the event camera suits target tracking and motion estimation: motion that conventional methods could capture only with a high-speed camera can be monitored, and the output data volume is greatly reduced, which in turn greatly reduces the computational cost of subsequent filtering. The largest hardware difference between a DVS and a conventional camera is that each DVS pixel has its own photosensitive processing unit and independent signal-processing circuitry; whenever the luminosity change at a pixel exceeds a threshold, that pixel alone sends a request to transmit its location and an activation timestamp to the external circuitry. This approach is no longer limited by the notion of a "frame" and removes the frame-rate limitation at its root.
Computer-vision methods based on event cameras have developed rapidly and attracted wide attention. Owing to its operating principle, an event camera can only record a disordered, asynchronous spatio-temporal sequence of visual events rather than traditional synchronous video frames. This makes the event camera incompatible with traditional cameras, so joint calibration is the first problem that must be solved before sensor information can be fused and synchronized.
Disclosure of Invention
The present invention aims to solve at least one of the technical problems in the related art to some extent.
Therefore, one object of the invention is to provide a combined calibration method for an event camera sensor and an RGB camera, which can calibrate an event camera together with a traditional RGB camera and offers high calibration accuracy.
Another object of the present invention is to provide an integrated calibration device for an event camera sensor and an RGB camera.
In order to achieve the above objective, in one aspect, the present invention provides a method for jointly calibrating an event camera sensor and an RGB camera, including:
The events output by the event camera are aggregated into a first picture according to the timestamps of the RGB camera; edges are extracted from the first picture and from a second picture acquired by the RGB camera to obtain a first edge map and a second edge map; straight lines are extracted from the first edge map and the second edge map, and their intersection points are calculated as virtual corner points; from the virtual corner points, a transformation matrix between adjacent second pictures and a transformation matrix between adjacent first pictures are solved; and based on these two transformation matrices, the transformation matrix between the event camera and the RGB camera is calculated.
The combined calibration method for an event camera sensor and an RGB camera according to the invention can calibrate an event camera together with a traditional RGB camera and offers high calibration accuracy.
In addition, the combined calibration method for an event camera sensor and an RGB camera according to the above embodiment of the present invention may further have the following additional technical features:
Further, in an embodiment of the present invention, aggregating the events output by the event camera into the first picture according to the timestamps of the RGB camera includes: performing distortion correction on the asynchronous spatio-temporal event sequence generated by the event camera as it senses object motion in the environment, and aggregating the sequence into a set of visual events to obtain the first picture, where each visual event in the set is a three-dimensional point in its spatio-temporal three-dimensional space.
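For illustration only (the patent does not disclose code), a minimal Python sketch of this aggregation step is given below; it assumes raw events arrive as (x, y, t, polarity) tuples and that the event camera's intrinsic matrix K and distortion coefficients dist are already known. Both names and the array layout are assumptions of this sketch:

```python
import cv2
import numpy as np

def undistorted_event_points(events, K, dist):
    """Distortion-correct raw events and return them as (x, y, t) points."""
    ev = np.asarray(events, dtype=np.float64)         # (N, 4): x, y, t, polarity
    pts = ev[:, None, :2]                             # shape (N, 1, 2) for OpenCV
    und = cv2.undistortPoints(pts, K, dist, P=K)      # back to pixel coordinates
    return np.column_stack([und[:, 0, :], ev[:, 2]])  # (N, 3): each event a 3-D point
```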
Further, in one embodiment of the present invention, an event window is selected over the set of visual events, and the events within the window are projected to the middle instant of the time window according to a preset motion state; the projected events are accumulated, and a preset first contrast is used to obtain the corresponding real motion and the corresponding event frame.
Further, in an embodiment of the present invention, the event frame is filtered to a preset second contrast to obtain the first edge map.
Further, in one embodiment of the present invention, for the second edge map and the first edge map respectively, conversion formulas among a plurality of coordinate systems are combined and the features of edge pixels are exploited so that an edge extraction algorithm clusters the candidate edge pixels, thereby identifying straight-line targets that are stored in a first line set and a second line set, respectively; the coordinate systems include two-dimensional image coordinates, spherical coordinates and Cartesian coordinates.
Further, in an embodiment of the present invention, the lines within the first line set and within the second line set are merged using the distance between lines and the descriptor similarity information of the lines, yielding a third line set and a fourth line set, respectively.
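A greedy merge over Hough-style (rho, theta) line parameters is one plausible reading of this step; the sketch below is an assumption of that kind, with illustrative tolerance values, and the descriptor-similarity test omitted for brevity:

```python
import numpy as np

def merge_lines(lines, rho_tol=10.0, theta_tol=np.deg2rad(5.0)):
    """Keep one representative per cluster of near-duplicate (rho, theta) lines."""
    kept = []
    for rho, theta in lines:
        if not any(abs(rho - r) < rho_tol and abs(theta - t) < theta_tol
                   for r, t in kept):
            kept.append((rho, theta))
    return kept
```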
Further, in an embodiment of the present invention, the intersection points between the straight lines within the third line set and within the fourth line set are calculated to obtain a first virtual corner set and a second virtual corner set.
Further, in an embodiment of the present invention, the Euclidean distances between the points of the first virtual corner set and those of the second virtual corner set are calculated and used as a distance matrix, and a matching algorithm is applied to obtain the pairing relationship between the two corner sets.
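The patent does not name the matching algorithm; the Hungarian algorithm over the Euclidean distance matrix is one natural choice, sketched here with SciPy:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def match_virtual_corners(corners_a, corners_b):
    """Pair an (N, 2) and an (M, 2) array of virtual corner points."""
    cost = cdist(corners_a, corners_b)        # Euclidean distance matrix
    rows, cols = linear_sum_assignment(cost)  # minimum-cost pairing
    return list(zip(rows, cols))
```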
Further, in one embodiment of the invention, for successfully registered matches, the camera intrinsic parameters are calibrated and a hand-eye calibration method is used to calculate the extrinsic parameters of the event camera and the RGB camera.
In order to achieve the above object, another aspect of the present invention provides an integrated calibration device for an event camera sensor and an RGB camera, comprising:
The event aggregation module is used for aggregating the events output by the event camera into a first picture according to the timestamps of the RGB camera; the edge extraction module is used for extracting edges from the first picture and from the second picture acquired by the RGB camera to obtain a first edge map and a second edge map; the line extraction module is used for extracting straight lines from the first edge map and the second edge map and calculating the intersection points of the straight lines as virtual corner points; the matrix transformation module is used for solving, from the virtual corner points, a transformation matrix between adjacent second pictures and a transformation matrix between adjacent first pictures; and the matrix calculation module is used for calculating the transformation matrix between the event camera and the RGB camera based on those two transformation matrices.
The combined calibration device for an event camera sensor and an RGB camera according to the invention can calibrate an event camera together with a traditional RGB camera and offers high calibration accuracy.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a flow chart of a method for joint calibration of an event camera sensor and an RGB camera according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a hand-eye calibration method employed in accordance with an embodiment of the present invention;
FIG. 3 is an overall schematic diagram of an event camera sensor and RGB camera joint calibration according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of a combined calibration device for an event camera sensor and an RGB camera according to an embodiment of the invention.
Detailed Description
It should be noted that, without conflict, the embodiments of the present application and features of the embodiments may be combined with each other. The application will be described in detail below with reference to the drawings in connection with embodiments.
In order that those skilled in the art may better understand the present invention, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
The following describes an event camera sensor and RGB camera joint calibration method and device according to an embodiment of the present invention with reference to the accompanying drawings.
FIG. 1 is a flow chart of a method for joint calibration of an event camera sensor and an RGB camera according to one embodiment of the invention.
As shown in FIG. 1, the method for jointly calibrating an event camera sensor and an RGB camera comprises the following steps:
S1, aggregating the events output by the event camera into a first picture according to the timestamps of the RGB camera;
S2, extracting edges from the first picture and from a second picture acquired by the RGB camera to obtain a first edge map and a second edge map;
S3, extracting straight lines from the first edge map and the second edge map, and calculating the intersection points of the straight lines as virtual corner points;
S4, solving, from the virtual corner points, a transformation matrix between adjacent second pictures and a transformation matrix between adjacent first pictures;
S5, calculating the transformation matrix between the event camera and the RGB camera based on the two inter-frame transformation matrices.
Specifically, the present invention prepares a standard checkerboard and two photographing apparatuses: an event camera with an asynchronous event acquisition function and a conventional RGB camera with a synchronous image acquisition function. The mounting relation of the devices is as follows: the traditional RGB camera and the event camera are rigidly connected through hardware and photograph the checkerboard together.
An example of the overall scheme of the present invention is shown in fig. 3.
A. Performing distortion correction on the asynchronous spatio-temporal event sequence generated by the event camera as it senses object motion in the environment, and then aggregating the sequence into a set of visual events, where each visual event in the set is a three-dimensional point in its spatio-temporal three-dimensional space;
B. Performing edge extraction on the image obtained after distortion correction of the original RGB camera input to obtain an edge map;
C. Selecting an event window from the set obtained in step A, and projecting the events in the window to the middle instant of the time window according to an assumed motion state;
D. Accumulating the projected events and maximizing the contrast to obtain the corresponding real motion and the corresponding event frame;
E. Filtering the event frame obtained in step D to obtain a picture with enhanced contrast, enhanced edge information and smoothed surface noise;
F. For the edge map generated in step B, combining the conversion formulas of two-dimensional coordinates, spherical coordinates and Cartesian coordinates and using the features of the edge pixels, clustering all edge pixels that can form a straight line with an effective edge extraction algorithm, thereby identifying straight-line targets and storing them in line set 1; doing the same for the picture generated in step E and storing the identified straight-line targets in line set 2;
G. Merging the lines within line set 1 and within line set 2 using the distance between lines, the descriptor similarity of the lines and other information to obtain effective line set 1 and effective line set 2;
H. Calculating the intersection points between the straight lines in effective line sets 1 and 2 obtained in step G to obtain virtual corner set 1 and virtual corner set 2;
I. Calculating the Euclidean distances between the points of virtual corner set 1 and virtual corner set 2, using them as a distance matrix, and applying a matching algorithm to obtain the pairing relationship between virtual corner set 1 and virtual corner set 2;
J. For successful registration matches, calibrating the camera intrinsic parameters and adopting a hand-eye calibration method to calculate the extrinsic parameters between the event camera sensor and the RGB camera.
Embodiments of the present invention are further described below with reference to the accompanying drawings.
In this embodiment, a coordinate drawing is selected as the calibration pattern. As shown in fig. 2, the event camera and a conventional RGB camera are fixedly connected to a base; the base is moved and pictures are taken from different angles with both cameras, yielding a group of time-stamped images and events.
The steps for jointly calibrating the event camera sensor and the traditional RGB camera sensor are as follows:
step 1: and the coordinate drawing is placed on the calibration plate, and the camera is smoothly moved to obtain a series of pictures and events.
Step 2: reading an event with the length of 0.8s taking the picture timestamp as the center according to the picture timestamp, and assuming that
x(t)=x(0)+vt
Where x= (x, y) represents the position of the point and v is the speed of the camera movement speed.
An event within a shorter period of time can be inferred to be at a uniform time t ref, as shown by the following equation:
x′k=W(xk,tk;v)=xk-(tk-tref)v
And then accumulate it:
where each pixel x is summed by the events it is warped to this position. Calculation of
Where N p represents the number of event points, h= (H ij),μH=1/Np∑hi,j.
And f (v) is optimized to take the maximum value, and the obtained H (x, v) is correspondingly used as a recovered event frame.
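A compact numpy sketch of this step is given below; it assumes events are stored as an (N, 3) array of (x, y, t), and a simple grid search over candidate velocities stands in for the optimizer, which the patent leaves unspecified:

```python
import numpy as np

def event_frame(events, t_ref, v, shape):
    """Warp events to t_ref with velocity v and accumulate them per pixel."""
    xy = events[:, :2] - (events[:, 2:3] - t_ref) * v   # x'_k = x_k - (t_k - t_ref) v
    xy = np.round(xy).astype(int)
    ok = ((xy[:, 0] >= 0) & (xy[:, 0] < shape[1]) &
          (xy[:, 1] >= 0) & (xy[:, 1] < shape[0]))
    H = np.zeros(shape)
    np.add.at(H, (xy[ok, 1], xy[ok, 0]), 1.0)           # H(x; v)
    return H

def contrast(H):
    return np.mean((H - H.mean()) ** 2)                 # f(v): image variance

def recovered_event_frame(events, t_ref, shape, candidate_vs):
    """Pick the velocity whose warped image maximizes the contrast f(v)."""
    return max((event_frame(events, t_ref, np.asarray(v), shape)
                for v in candidate_vs), key=contrast)
```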
Step 3: and (3) respectively carrying out histogram filtering to enhance contrast ratio on the pictures obtained in the step (2), carrying out Gaussian filtering to reduce plane noise points, and guiding filtering to enhance edge information.
Step 4: and (3) carrying out Hough transformation after carrying out edge extraction on the image obtained in the step (3), extracting all the linear information, and calculating the intersection point of the linear information as a virtual corner point after de-duplication of the linear information.
Step 5: performing hough transformation after edge extraction on the RGB image in the step (2), and calculating corner points as virtual corner points after de-weighting the extracted straight lines.
Step 6: repeating the steps (1) - (5) to obtain a virtual corner set of the RGB image and the event frame at the next moment.
Step 7: adopting nearest neighbor matching to the virtual corner points obtained by the RGB image in the step (5) and the virtual corner points obtained by the RGB image in the step (6) to calculate a transformation matrix A; the virtual corner points obtained by the event frames in the step (5) and the virtual corner points obtained by the event frames in the step (6) are subjected to nearest neighbor matching, and a transformation matrix B is calculated;
Step 8: solving an equation AX=XB, wherein X is an external parameter matrix calibrated by the camera and the event camera.
According to the method for jointly calibrating an event camera sensor and an RGB camera provided by the embodiment of the invention, the events output by the event camera are aggregated into a first picture according to the timestamps of the RGB camera; edges are extracted from the first picture and from a second picture acquired by the RGB camera to obtain a first edge map and a second edge map; straight lines are extracted from the two edge maps and their intersection points are calculated as virtual corner points; from the virtual corner points, transformation matrices between adjacent second pictures and between adjacent first pictures are solved; and based on these, the transformation matrix between the event camera and the RGB camera is calculated. The invention can calibrate an event camera together with a traditional RGB camera and achieves high calibration accuracy.
In order to implement the above embodiments, as shown in fig. 4, a combined calibration device 10 for an event camera sensor and an RGB camera is further provided, where the device 10 includes: an event aggregation module 100, an edge extraction module 200, a line extraction module 300, a matrix transformation module 400 and a matrix calculation module 500.
The event aggregation module 100 is configured to aggregate the events output by the event camera into a first picture according to the timestamps of the RGB camera;
the edge extraction module 200 is configured to perform edge extraction on the first picture and on the second picture acquired by the RGB camera to obtain a first edge map and a second edge map;
the line extraction module 300 is configured to extract straight lines from the first edge map and the second edge map, and to calculate the intersection points of the straight lines as virtual corner points;
the matrix transformation module 400 is configured to solve, from the virtual corner points, a transformation matrix between adjacent second pictures and a transformation matrix between adjacent first pictures;
the matrix calculation module 500 is configured to calculate the transformation matrix between the event camera and the RGB camera based on those two transformation matrices.
According to the combined calibration device for an event camera sensor and an RGB camera provided by the embodiment of the invention, the events output by the event camera are aggregated into a first picture according to the timestamps of the RGB camera; edges are extracted from the first picture and from a second picture acquired by the RGB camera to obtain a first edge map and a second edge map; straight lines are extracted from the two edge maps and their intersection points are calculated as virtual corner points; from the virtual corner points, transformation matrices between adjacent second pictures and between adjacent first pictures are solved; and based on these, the transformation matrix between the event camera and the RGB camera is calculated. The device can calibrate an event camera together with a traditional RGB camera and achieves high calibration accuracy.
It should be noted that the foregoing explanation of the embodiments of the combined calibration method for an event camera sensor and an RGB camera also applies to the combined calibration device of this embodiment and is not repeated here.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In the description of the present invention, the meaning of "plurality" means at least two, for example, two, three, etc., unless specifically defined otherwise.
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic representations of the above terms are not necessarily directed to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
While embodiments of the present invention have been shown and described above, it will be understood that the above embodiments are illustrative and not to be construed as limiting the invention, and that variations, modifications, alternatives and variations may be made to the above embodiments by one of ordinary skill in the art within the scope of the invention.

Claims (10)

1. A combined calibration method for an event camera sensor and an RGB camera, characterized by comprising the following steps:
aggregating the events output by the event camera into a first picture according to the timestamps of the RGB camera;
extracting edges from the first picture and from a second picture acquired by the RGB camera to obtain a first edge map and a second edge map;
extracting straight lines from the first edge map and the second edge map respectively, and calculating the intersection points of the straight lines as virtual corner points;
solving, from the virtual corner points, a transformation matrix between adjacent second pictures and a transformation matrix between adjacent first pictures;
and calculating the transformation matrix between the event camera and the RGB camera based on the transformation matrix between adjacent second pictures and the transformation matrix between adjacent first pictures.
2. The method of claim 1, wherein aggregating the events output by the event camera into a first picture according to the timestamps of the RGB camera comprises:
performing distortion correction on the asynchronous spatio-temporal event sequence generated by the event camera as it senses object motion in the environment, and aggregating the sequence into a set of visual events to obtain the first picture, wherein each visual event in the set is a three-dimensional point in its spatio-temporal three-dimensional space.
3. The method according to claim 2, wherein an event window is selected over the set of visual events, and the events within the event window are projected to the middle instant of the time window according to a preset motion state;
and the projected events are accumulated, with a preset first contrast used to obtain the corresponding real motion and the corresponding event frame.
4. The method according to claim 3, wherein the event frame is filtered to a preset second contrast to obtain the first edge map.
5. The method of claim 4, wherein, for the second edge map and the first edge map respectively, conversion formulas among a plurality of coordinate systems are combined and the features of edge pixels are used so that an edge extraction algorithm clusters the candidate edge pixels, thereby identifying straight-line targets that are stored in a first line set and a second line set, respectively; wherein the plurality of coordinate systems comprises two-dimensional coordinates, spherical coordinates and Cartesian coordinates.
6. The method of claim 5, wherein the lines within the first line set and within the second line set are merged using the distance between lines and the descriptor similarity information of the lines to obtain a third line set and a fourth line set, respectively.
7. The method of claim 6, wherein the intersection points between the straight lines within the third line set and within the fourth line set are calculated to obtain a first virtual corner set and a second virtual corner set.
8. The method of claim 7, wherein the Euclidean distances between the points of the first virtual corner set and those of the second virtual corner set are calculated and used as a distance matrix, and a matching algorithm is applied to obtain the pairing relationship between the first virtual corner set and the second virtual corner set.
9. The method of claim 8, wherein, for successful registration matches, the camera intrinsic parameters are calibrated and a hand-eye calibration method is used to calculate the extrinsic parameters of the event camera and the RGB camera.
10. A combined calibration device for an event camera sensor and an RGB camera, characterized by comprising:
an event aggregation module, configured to aggregate the events output by the event camera into a first picture according to the timestamps of the RGB camera;
an edge extraction module, configured to extract edges from the first picture and from the second picture acquired by the RGB camera to obtain a first edge map and a second edge map;
a line extraction module, configured to extract straight lines from the first edge map and the second edge map respectively, and to calculate the intersection points of the straight lines as virtual corner points;
a matrix transformation module, configured to solve, from the virtual corner points, a transformation matrix between adjacent second pictures and a transformation matrix between adjacent first pictures;
and a matrix calculation module, configured to calculate the transformation matrix between the event camera and the RGB camera based on the transformation matrix between adjacent second pictures and the transformation matrix between adjacent first pictures.
CN202210357968.4A 2022-04-06 2022-04-06 Combined calibration method and device for event camera sensor and RGB camera Active CN114913239B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210357968.4A CN114913239B (en) 2022-04-06 2022-04-06 Combined calibration method and device for event camera sensor and RGB camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210357968.4A CN114913239B (en) 2022-04-06 2022-04-06 Combined calibration method and device for event camera sensor and RGB camera

Publications (2)

Publication Number Publication Date
CN114913239A CN114913239A (en) 2022-08-16
CN114913239B true CN114913239B (en) 2024-06-18

Family

ID=82762944

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210357968.4A Active CN114913239B (en) 2022-04-06 2022-04-06 Combined calibration method and device for event camera sensor and RGB camera

Country Status (1)

Country Link
CN (1) CN114913239B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117911540A (en) * 2024-03-18 2024-04-19 安徽大学 Event camera calibration device and method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111696143A (en) * 2020-06-16 2020-09-22 清华大学 Event data registration method and system
CN112037269A (en) * 2020-08-24 2020-12-04 大连理工大学 Visual moving target tracking method based on multi-domain collaborative feature expression

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10110913B2 (en) * 2016-09-30 2018-10-23 Intel Corporation Motion estimation using hybrid video imaging system
US11303793B2 (en) * 2020-04-13 2022-04-12 Northwestern University System and method for high-resolution, high-speed, and noise-robust imaging
CN113888639B (en) * 2021-10-22 2024-03-26 上海科技大学 Visual odometer positioning method and system based on event camera and depth camera

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111696143A (en) * 2020-06-16 2020-09-22 清华大学 Event data registration method and system
CN112037269A (en) * 2020-08-24 2020-12-04 大连理工大学 Visual moving target tracking method based on multi-domain collaborative feature expression

Also Published As

Publication number Publication date
CN114913239A (en) 2022-08-16

Similar Documents

Publication Publication Date Title
US8139896B1 (en) Tracking moving objects accurately on a wide-angle video
CN107948517B (en) Preview picture blurring processing method, device and equipment
TW202201944A (en) Maintaining fixed sizes for target objects in frames
WO2017045326A1 (en) Photographing processing method for unmanned aerial vehicle
US11272096B2 (en) Photographing method and photographing apparatus for adjusting a field of view for a terminal
US20100149369A1 (en) Main face choosing device, method for controlling same, and image capturing apparatus
KR101524548B1 (en) Apparatus and method for alignment of images
CN104980651A (en) Image processing apparatus and control method
WO2010061956A1 (en) Stereoscopic image processing device, method, recording medium and stereoscopic imaging apparatus
JP2013533672A (en) 3D image processing
CN104584531A (en) Image processing apparatus and image display apparatus
CN112207821B (en) Target searching method of visual robot and robot
WO2022057800A1 (en) Gimbal camera, gimbal camera tracking control method and apparatus, and device
CN110290287A (en) Multi-cam frame synchornization method
US20220342365A1 (en) System and method for holographic communication
CN114913239B (en) Combined calibration method and device for event camera sensor and RGB camera
JP2015508584A (en) Method for 3D reconstruction of scenes that rely on asynchronous sensors
WO2023236508A1 (en) Image stitching method and system based on billion-pixel array camera
WO2021139764A1 (en) Method and device for image processing, electronic device, and storage medium
CN110414558A (en) Characteristic point matching method based on event camera
TW202240462A (en) Methods, apparatuses, electronic devices and computer storage media for image synchronization
CN111385481A (en) Image processing method and device, electronic device and storage medium
EP4009622B1 (en) Method for capturing and processing a digital panoramic image
JP6063680B2 (en) Image generation apparatus, image generation method, imaging apparatus, and imaging method
CN114037757B (en) Binocular camera gesture sensing system based on synchronous images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant