WO2021027323A1 - Hybrid image stabilization method and device based on a bionic eye platform - Google Patents
Hybrid image stabilization method and device based on a bionic eye platform
- Publication number
- WO2021027323A1 (PCT/CN2020/086889)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- image stabilization
- bionic eye
- platform
- transformation
- Prior art date
Links
- 230000006641 stabilisation Effects 0.000 title claims abstract description 70
- 238000011105 stabilization Methods 0.000 title claims abstract description 70
- 238000000034 method Methods 0.000 title claims abstract description 46
- 210000001508 eye Anatomy 0.000 claims abstract description 41
- 238000001914 filtration Methods 0.000 claims abstract description 31
- 239000011159 matrix material Substances 0.000 claims abstract description 30
- 238000012545 processing Methods 0.000 claims abstract description 24
- 238000013178 mathematical model Methods 0.000 claims abstract description 8
- 210000005252 bulbus oculi Anatomy 0.000 claims abstract description 5
- 230000009466 transformation Effects 0.000 claims description 45
- 230000033001 locomotion Effects 0.000 claims description 18
- 238000004422 calculation algorithm Methods 0.000 claims description 14
- 230000008569 process Effects 0.000 claims description 11
- 238000013519 translation Methods 0.000 claims description 10
- 230000002159 abnormal effect Effects 0.000 claims description 6
- 238000004590 computer program Methods 0.000 claims description 6
- 238000001514 detection method Methods 0.000 claims description 6
- 230000000694 effects Effects 0.000 claims description 4
- 238000005516 engineering process Methods 0.000 description 5
- 238000006243 chemical reaction Methods 0.000 description 4
- 230000003287 optical effect Effects 0.000 description 4
- 238000010586 diagram Methods 0.000 description 3
- 230000008859 change Effects 0.000 description 2
- 230000006870 function Effects 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 238000012986 modification Methods 0.000 description 2
- 230000005540 biological transmission Effects 0.000 description 1
- 238000004891 communication Methods 0.000 description 1
- 238000009795 derivation Methods 0.000 description 1
- 238000006073 displacement reaction Methods 0.000 description 1
- 238000002474 experimental method Methods 0.000 description 1
- 230000006872 improvement Effects 0.000 description 1
- 230000008447 perception Effects 0.000 description 1
- 238000013441 quality evaluation Methods 0.000 description 1
- 230000003068 static effect Effects 0.000 description 1
- 230000001629 suppression Effects 0.000 description 1
- 238000011426 transformation method Methods 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Images
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/682—Vibration or motion blur correction
- H04N23/685—Vibration or motion blur correction performed by mechanical compensation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/681—Motion detection
- H04N23/6812—Motion detection based on additional sensors, e.g. acceleration sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/682—Vibration or motion blur correction
Definitions
- the invention relates to the technical field of image processing, in particular to a hybrid image stabilization method and device based on a bionic eye platform.
- the robot vision system provides environment perception functions for the semi-autonomous or even autonomous operation of complex mobile robots.
- Image stabilization is one of the key capabilities of a mobile robot vision system.
- the pose of the robot system changes in the X, Y and Z directions during movement, so the image may be blurred due to displacement or shaking.
- the existing image stabilization technologies can generally be divided into three categories: optical image stabilization, mechanical image stabilization and digital image stabilization.
- the purpose of the present invention is to solve the shortcomings in the prior art, and propose a hybrid image stabilization method based on a bionic eye platform.
- a hybrid image stabilization method based on a bionic eye platform including the following steps:
- the observable disturbance velocity is measured and the motor control quantity is compensated, while the motor of the mechanical image stabilization is controlled at the same time;
- the parameters are low-pass filtered by the Kalman filtering method, and outliers are removed by setting a threshold on the parameters;
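The Kalman low-pass filtering with threshold-based outlier rejection described above can be sketched as follows. This is a minimal illustration assuming a scalar constant-state model; the function name, the noise parameters `q` and `r`, and the threshold value are illustrative and not taken from the patent.

```python
def kalman_smooth(values, q=1e-3, r=0.05, outlier_thresh=0.5):
    """Low-pass a parameter sequence with a scalar constant-state Kalman
    filter; measurements farther than outlier_thresh from the current
    estimate are treated as abnormal and skipped (the prediction is kept)."""
    x, p = values[0], 1.0              # state estimate and its variance
    out = [x]
    for z in values[1:]:
        p_pred = p + q                 # predict: state assumed constant
        if abs(z - x) > outlier_thresh:
            p = p_pred                 # outlier: reject the measurement
        else:
            k = p_pred / (p_pred + r)  # Kalman gain
            x = x + k * (z - x)
            p = (1.0 - k) * p_pred
        out.append(x)
    return out
```

A spurious spike in a transformation-parameter sequence is thereby ignored, while the remaining samples are smoothed toward a low-frequency trajectory.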
- key points are found by Shi-Tomasi corner detection, the motion between adjacent frames is then tracked by the pyramid Lucas-Kanade algorithm, and unmatched key points are eliminated by the RANSAC algorithm;
- the final affine transformation model parameter Hfinal is obtained by using the feature points between the two consecutive frames of images Iestab11 and Iestab12 after matching;
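The robust model fitting in the two steps above (rejecting bad correspondences with RANSAC and estimating an affine model from the surviving feature matches) is usually done with library routines in practice; its core idea can be sketched in pure Python as below. All function names are illustrative, and the minimal-sample affine solver and inlier tolerance are assumptions rather than the patent's exact formulation.

```python
import random

def solve3(A, b):
    """Solve a 3x3 linear system by Cramer's rule; None if singular."""
    def det(m):
        return (m[0][0]*(m[1][1]*m[2][2] - m[1][2]*m[2][1])
              - m[0][1]*(m[1][0]*m[2][2] - m[1][2]*m[2][0])
              + m[0][2]*(m[1][0]*m[2][1] - m[1][1]*m[2][0]))
    d = det(A)
    if abs(d) < 1e-12:
        return None
    sols = []
    for i in range(3):
        Ai = [row[:] for row in A]
        for r in range(3):
            Ai[r][i] = b[r]
        sols.append(det(Ai) / d)
    return sols

def affine_from_3(src, dst):
    """Exact 2D affine [a b tx; c d ty] from three correspondences."""
    A = [[x, y, 1.0] for (x, y) in src]
    row1 = solve3(A, [p[0] for p in dst])   # a, b, tx
    row2 = solve3(A, [p[1] for p in dst])   # c, d, ty
    if row1 is None or row2 is None:
        return None
    return (tuple(row1), tuple(row2))

def ransac_affine(src, dst, iters=200, tol=1.0, seed=0):
    """Fit an affine model to matched points, ignoring outlier matches."""
    rng = random.Random(seed)
    best, best_inliers = None, -1
    n = len(src)
    for _ in range(iters):
        idx = rng.sample(range(n), 3)
        H = affine_from_3([src[i] for i in idx], [dst[i] for i in idx])
        if H is None:                       # degenerate (collinear) sample
            continue
        (a, b, tx), (c, d, ty) = H
        inl = sum(1 for (sx, sy), (dx_, dy_) in zip(src, dst)
                  if (a*sx + b*sy + tx - dx_)**2
                   + (c*sx + d*sy + ty - dy_)**2 <= tol*tol)
        if inl > best_inliers:
            best, best_inliers = H, inl
    return best, best_inliers
```

In a real pipeline the correspondences would come from Shi-Tomasi corners tracked by pyramid Lucas-Kanade; the winning model here plays the role of the final parameter Hfinal.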
- the quaternion interpolation geometric equation is expressed as:
- the transformation parameters of the perspective model are first obtained by ignoring the influence of the camera translation movement.
- the stabilized image Iestab1 is an image obtained after ignoring the influence of translation.
- this application also provides a hybrid image stabilization device based on a bionic eye platform, including:
- the mechanical image stabilization module is used to control the motor of the mechanical image stabilization by measuring the observable disturbance speed and compensating the motor control amount;
- the first acquisition module is used to acquire the rotation information in the three-dimensional space of the platform by using the gyroscope, and perform interpolation synchronization between the acquired IMU information and the video information through quaternion interpolation, and obtain the image and gyroscope data with the same time stamp;
- the second acquisition module is used to use the gyroscope data and the mathematical model of the bionic eye platform to obtain the rotation matrix of the eye camera relative to the world coordinate system, that is, the rotation external parameters of the camera model;
- the first processing module is used to perform low-pass filtering on parameters through the Kalman filtering method, and remove abnormal values from parameter setting thresholds;
- the second processing module is configured to use the perspective transformation matrix before and after filtering to perform image compensation on the original dithered image to obtain the stabilized image Iestab1;
- the third processing module is used to find key points through Shi-Tomasi corner detection, then track the motion between adjacent frames with the pyramid Lucas-Kanade algorithm, and eliminate unmatched key points with the RANSAC algorithm;
- the fourth processing module is used to use the affine transformation model to obtain the final affine transformation model parameter Hfinal by using the feature points between the two consecutive frames of images Iestab11 and Iestab12 after matching;
- the image transformation module is used to perform image transformation on Iestab1 using the obtained matrix parameter Hfinal to obtain the final stabilized image Istab.
- the quaternion interpolation geometric equation is expressed as:
- the first processing module is configured to:
- the transformation parameters of the perspective model are obtained first by ignoring the influence of camera translational motion.
- the stabilized image Iestab1 obtained by the second processing module is an image obtained after ignoring the influence of translation.
- this application also provides a computer device, which includes:
- one or more processors;
- a memory, used to store one or more computer programs;
- when the one or more computer programs are executed by the one or more processors, the one or more processors implement the hybrid image stabilization method based on the bionic eye platform as described above.
- this application also provides a computer-readable storage medium that stores computer code, and when the computer code is executed, the above-mentioned hybrid image stabilization method based on the bionic eye platform is executed.
- the present invention proposes a hybrid image stabilization method based on a bionic eye platform: building on a nine-degree-of-freedom bionic eye vision platform, a real-time image stabilization technology with hybrid mechanical and electronic fast motion compensation is proposed.
- in the mechanical image stabilization control process, the observable disturbance is measured and compensated, which effectively suppresses the influence of the disturbance velocity and improves the disturbance rejection of the system's position servo.
- the second-level electronic image stabilization uses a gyroscope to obtain the rotation information in the three-dimensional space of the platform, and performs multiple filtering with the two-dimensional electronic image stabilization method to obtain the final image stabilization effect of the platform.
- the present invention requires no additional image stabilization hardware on top of the original platform, combines the advantages of mechanical and electronic image stabilization, enables real-time processing of video information, and is suitable for wider adoption.
- FIG. 1 is a schematic flowchart of a hybrid image stabilization method based on a bionic eye platform provided by an embodiment of the application, where the dashed frame is a schematic diagram of the electronic image stabilization process;
- FIG. 2 is a schematic diagram of a DH mathematical model of a bionic eye platform provided by an embodiment of the application;
- FIG. 3 is a schematic diagram of matrix transformation before and after filtering according to an embodiment of the application.
- a hybrid image stabilization method based on a bionic eye platform proposed by the present invention includes the following steps:
- in order to ensure the accuracy of the model transformation parameters, the image information and the gyroscope data must be acquired at the same time.
- the visual information and the gyroscope information of the bionic eye platform are obtained by an FPGA and a MEMS gyroscope respectively, and the two are controlled independently without interfering with each other. Therefore, the angle information obtained by the gyroscope is converted into a quaternion, and quaternion spherical linear interpolation is then used to synchronize the image information with the gyroscope data.
- the quaternion interpolation geometric equation can be written as:
- the azimuth angles collected by the gyroscope are interpolated by slerp to obtain values at the same timestamps as the video information.
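The slerp synchronization step can be sketched as follows, assuming the gyroscope orientations have already been converted to unit quaternions (w, x, y, z); the function names and the linear-search timestamp lookup are illustrative.

```python
import math

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions q0, q1
    (tuples (w, x, y, z)) at fraction t in [0, 1]."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:                       # take the shorter great-circle arc
        q1 = tuple(-c for c in q1)
        dot = -dot
    dot = min(1.0, dot)
    theta = math.acos(dot)
    if theta < 1e-6:                    # nearly identical: fall back to lerp
        return tuple(a + t * (b - a) for a, b in zip(q0, q1))
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))

def sync_to_frame(ts_gyro, quats, t_frame):
    """Interpolate the gyro orientation stream to a frame timestamp."""
    for i in range(len(ts_gyro) - 1):
        if ts_gyro[i] <= t_frame <= ts_gyro[i + 1]:
            t = (t_frame - ts_gyro[i]) / (ts_gyro[i + 1] - ts_gyro[i])
            return slerp(quats[i], quats[i + 1], t)
    raise ValueError("frame timestamp outside gyro range")
```

Each video frame timestamp is bracketed by two gyroscope samples and receives the orientation interpolated along the great circle between them, yielding image and gyroscope data with the same timestamp.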
- the three Euler angles respectively represent the rotation angles around the Z, Y and X axes, from which the current rotation matrix can be obtained:
- the transformation matrix between the gyroscope and the camera is calculated, giving the rotation matrix of the gyroscope coordinate system relative to the camera coordinate system:
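The Z-Y-X rotation composition implied by the two steps above can be written out as follows. This is a sketch: the exact axis convention of the patent's platform model is an assumption, and the gyro-to-camera extrinsic would be applied with one more matrix product.

```python
import math

def matmul(A, B):
    """3x3 matrix product, matrices as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def rot_zyx(yaw, pitch, roll):
    """R = Rz(yaw) * Ry(pitch) * Rx(roll): rotations about Z, Y, X."""
    cz, sz = math.cos(yaw), math.sin(yaw)
    cy, sy = math.cos(pitch), math.sin(pitch)
    cx, sx = math.cos(roll), math.sin(roll)
    Rz = [[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]]
    Ry = [[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]]
    Rx = [[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]]
    return matmul(Rz, matmul(Ry, Rx))
```

Composing this matrix with a fixed gyroscope-to-camera rotation (obtained from the platform's kinematic model) would give the camera's rotational extrinsics relative to the world frame.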
- in the imaging model x = K[R t]X, x is the image coordinate, K and [R t] are the camera's intrinsic and extrinsic parameters, and X is the world coordinate
- the perspective transformation model can be used to realize the coordinate transformation between the original frame and the stabilized frame, eliminating the image jitter caused by platform rotation.
- the original video sequence is I, and the transformation matrix between its adjacent image frames is H
- the video sequence obtained after filtering is I', and the transformation matrix between its adjacent image frames is H'
- the transformation matrix between the original frame and the stabilized frame at the same instant is then obtained from H and H'.
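The compensation transform referenced above is commonly formed, following standard electronic image stabilization practice (an assumption here, not the patent's explicit formula), as the filtered transform composed with the inverse of the original one: H_comp = H_filtered · H_original⁻¹. A pure-Python sketch with 3x3 homogeneous matrices:

```python
def matmul3(A, B):
    """3x3 matrix product, matrices as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def det3(m):
    return (m[0][0] * (m[1][1]*m[2][2] - m[1][2]*m[2][1])
          - m[0][1] * (m[1][0]*m[2][2] - m[1][2]*m[2][0])
          + m[0][2] * (m[1][0]*m[2][1] - m[1][1]*m[2][0]))

def inv3(m):
    """Inverse of a 3x3 matrix via the adjugate (transposed cofactors)."""
    d = det3(m)
    def minor(i, j):
        rows = [r for r in range(3) if r != i]
        cols = [c for c in range(3) if c != j]
        return (m[rows[0]][cols[0]] * m[rows[1]][cols[1]]
              - m[rows[0]][cols[1]] * m[rows[1]][cols[0]])
    return [[((-1) ** (i + j)) * minor(j, i) / d for j in range(3)]
            for i in range(3)]

def stabilizing_warp(H_orig, H_filt):
    """Transform mapping the original jittery frame onto the stable frame."""
    return matmul3(H_filt, inv3(H_orig))
```

Warping each original frame by this matrix (e.g. with a perspective-warp routine) removes the high-frequency component of the inter-frame motion while keeping the filtered, intentional motion.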
- an image stabilization experiment is performed on a nine-degree-of-freedom bionic eye vision platform.
- the motion control part includes nine-axis motion control.
- CANopen communication is used, and a ROS environment is built to facilitate control of the bionic eye platform.
- images are collected through FPGA, and NVIDIA TX2 is used as the control machine.
- the mechanical image stabilization part performs disturbance suppression only on the three neck joints, while the other six degrees of freedom of the eyeballs remain locked.
- ITF is the inter-frame transformation fidelity
- PSNR is the peak signal-to-noise ratio
- MSE is the mean square error
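The three quality metrics just defined can be computed as follows, for grayscale frames stored as nested lists of pixel values. The 255 peak value and the definition of ITF as the mean PSNR between adjacent frames follow common usage and are an assumption rather than the patent's exact formulas.

```python
import math

def mse(a, b):
    """Mean square error between two equally sized grayscale frames."""
    n, s = 0, 0.0
    for ra, rb in zip(a, b):
        for pa, pb in zip(ra, rb):
            s += (pa - pb) ** 2
            n += 1
    return s / n

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB (infinite for identical frames)."""
    e = mse(a, b)
    return float('inf') if e == 0 else 10.0 * math.log10(peak * peak / e)

def itf(frames, peak=255.0):
    """Inter-frame transformation fidelity: mean PSNR of adjacent frames."""
    vals = [psnr(frames[i], frames[i + 1], peak)
            for i in range(len(frames) - 1)]
    return sum(vals) / len(vals)
```

A higher ITF after stabilization indicates that consecutive frames have become more similar, i.e. that residual jitter has been reduced.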
- the algorithm has a good performance on the bionic eye vision platform.
- the video ITF is increased by 5% and 20% respectively, and the frame rate after image stabilization reaches 30 fps.
- the average value of pixel movement in the X and Y directions is within one pixel.
- the present invention is based on a nine-degree-of-freedom bionic eye vision platform and proposes a real-time image stabilization technology with hybrid mechanical and electronic fast motion compensation.
- the observable disturbance is measured and compensated, effectively suppressing the influence of the disturbance velocity and improving the disturbance rejection of the system's position servo.
- the second-level electronic image stabilization uses a gyroscope to obtain the rotation information in the three-dimensional space of the platform, and performs multiple filtering with the two-dimensional electronic image stabilization method to obtain the final image stabilization effect of the platform.
- it requires no additional image stabilization hardware on top of the original platform, and combines the advantages of mechanical and electronic image stabilization to realize real-time processing of video information.
- this application also provides a hybrid image stabilization device based on a bionic eye platform, including:
- the mechanical image stabilization module is used to control the motor of the mechanical image stabilization by measuring the observable disturbance speed and compensating the motor control amount;
- the first acquisition module is used to acquire the rotation information in the three-dimensional space of the platform by using the gyroscope, and perform interpolation synchronization between the acquired IMU information and the video information through quaternion interpolation, and obtain the image and gyroscope data with the same time stamp;
- the second acquisition module is used to use the gyroscope data and the mathematical model of the bionic eye platform to obtain the rotation matrix of the eye camera relative to the world coordinate system, that is, the rotation external parameters of the camera model;
- the first processing module is used to perform low-pass filtering on parameters through the Kalman filtering method, and remove abnormal values from parameter setting thresholds;
- the second processing module is configured to use the perspective transformation matrix before and after filtering to perform image compensation on the original dithered image to obtain the stabilized image Iestab1;
- the third processing module is used to find key points through Shi-Tomasi corner detection, then track the motion between adjacent frames with the pyramid Lucas-Kanade algorithm, and eliminate unmatched key points with the RANSAC algorithm;
- the fourth processing module is used to use the affine transformation model to obtain the final affine transformation model parameter Hfinal by using the feature points between the two consecutive frames of images Iestab11 and Iestab12 after matching;
- the image transformation module is used to perform image transformation on Iestab1 using the obtained matrix parameter Hfinal to obtain the final stabilized image Istab.
- the quaternion interpolation geometric equation is expressed as:
- the first processing module is configured to:
- the transformation parameters of the perspective model are obtained first by ignoring the influence of camera translational motion.
- the stabilized image Iestab1 obtained by the second processing module is an image obtained after ignoring the influence of translation.
- this application also provides a computer device, which includes:
- one or more processors;
- a memory, used to store one or more computer programs;
- when the one or more computer programs are executed by the one or more processors, the one or more processors implement the hybrid image stabilization method based on the bionic eye platform as described above.
- the computer device includes one or more processors (CPU), input/output interfaces, network interfaces, and memory.
- the memory may include non-permanent memory in computer readable media, random access memory (RAM) and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of computer readable media.
- this application also provides a computer-readable storage medium that stores computer code.
- when the computer code is executed, the aforementioned hybrid image stabilization method based on the bionic eye platform is executed.
- Computer-readable media include permanent and non-permanent, removable and non-removable media, and information storage can be realized by any method or technology.
- the information can be computer-readable instructions, data structures, program modules, or other data.
- Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, CD-ROM, digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission media that can be used to store information accessible by computing devices. As defined herein, computer-readable media do not include transitory media such as modulated data signals and carrier waves.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
Claims (10)
- A hybrid image stabilization method based on a bionic eye platform, characterized by comprising the following steps: measuring the observable disturbance velocity and compensating the motor control quantity, while simultaneously controlling the motor of the mechanical image stabilization; using a gyroscope to obtain the rotation information of the platform in three-dimensional space, interpolating and synchronizing the obtained IMU information with the video information by quaternion interpolation, to obtain images and gyroscope data under the same timestamp; using the gyroscope data and the mathematical model of the bionic eye platform to obtain the rotation matrix of the eyeball camera relative to the world coordinate system, i.e. the rotational extrinsic parameters of the camera model; low-pass filtering the parameters by the Kalman filtering method, and removing outliers by setting a threshold on the parameters; using the perspective transformation matrices before and after filtering to compensate the original jittery image to obtain the stabilized image Iestab1; finding key points by Shi-Tomasi corner detection, then tracking the motion between adjacent frames by the pyramid Lucas-Kanade algorithm, and eliminating unmatched key points with the RANSAC algorithm; using an affine transformation model and the feature points between the two matched consecutive frames Iestab11 and Iestab12 to obtain the final affine transformation model parameters Hfinal; and performing image transformation on Iestab1 with the obtained matrix parameters Hfinal to obtain the final stabilized image Istab.
- The hybrid image stabilization method based on a bionic eye platform according to claim 1, characterized in that the quaternion interpolation geometric equation is expressed as:
- The hybrid image stabilization method based on a bionic eye platform according to claim 1, characterized in that, in the low-pass filtering of the parameters by the Kalman filtering method, the transformation parameters of the perspective model are first obtained during the low-pass filtering process by ignoring the influence of the camera's translational motion.
- The hybrid image stabilization method based on a bionic eye platform according to claim 1, characterized in that, in compensating the original jittery image to obtain the stabilized image Iestab1, the stabilized image Iestab1 is the image obtained after ignoring the influence of translation.
- A hybrid image stabilization device based on a bionic eye platform, characterized by comprising: a mechanical image stabilization module, used to measure the observable disturbance velocity and compensate the motor control quantity while simultaneously controlling the motor of the mechanical image stabilization; a first acquisition module, used to obtain the rotation information of the platform in three-dimensional space with a gyroscope, and to interpolate and synchronize the obtained IMU information with the video information by quaternion interpolation, obtaining images and gyroscope data under the same timestamp; a second acquisition module, used to obtain, from the gyroscope data and the mathematical model of the bionic eye platform, the rotation matrix of the eyeball camera relative to the world coordinate system, i.e. the rotational extrinsic parameters of the camera model; a first processing module, used to low-pass filter the parameters by the Kalman filtering method and to remove outliers by setting a threshold on the parameters; a second processing module, used to compensate the original jittery image with the perspective transformation matrices before and after filtering to obtain the stabilized image Iestab1; a third processing module, used to find key points by Shi-Tomasi corner detection, then track the motion between adjacent frames by the pyramid Lucas-Kanade algorithm, and eliminate unmatched key points with the RANSAC algorithm; a fourth processing module, used to obtain the final affine transformation model parameters Hfinal with an affine transformation model, using the feature points between the two matched consecutive frames Iestab11 and Iestab12; and an image transformation module, used to perform image transformation on Iestab1 with the obtained matrix parameters Hfinal to obtain the final stabilized image Istab.
- The hybrid image stabilization device based on a bionic eye platform according to claim 5, characterized in that the quaternion interpolation geometric equation is expressed as:
- The hybrid image stabilization device based on a bionic eye platform according to claim 5, characterized in that the first processing module is configured to: during the low-pass filtering process, first obtain the transformation parameters of the perspective model by ignoring the influence of the camera's translational motion.
- The hybrid image stabilization device based on a bionic eye platform according to claim 5, characterized in that the stabilized image Iestab1 obtained by the second processing module is the image obtained after ignoring the influence of translation.
- A computer device, comprising: one or more processors; and a memory, used to store one or more computer programs; wherein, when the one or more computer programs are executed by the one or more processors, the one or more processors implement the hybrid image stabilization method based on a bionic eye platform according to any one of claims 1-4.
- A computer-readable storage medium storing computer code, wherein, when the computer code is executed, the hybrid image stabilization method based on a bionic eye platform according to any one of claims 1-4 is executed.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910753067.5 | 2019-08-14 | ||
CN201910753067.5A CN110677578A (zh) | 2019-08-14 | 2019-08-14 | 基于仿生眼平台的混合稳像方法和装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021027323A1 true WO2021027323A1 (zh) | 2021-02-18 |
Family
ID=69075351
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/086889 WO2021027323A1 (zh) | 2019-08-14 | 2020-04-25 | 基于仿生眼平台的混合稳像方法和装置 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN110677578A (zh) |
WO (1) | WO2021027323A1 (zh) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113240597A (zh) * | 2021-05-08 | 2021-08-10 | 西北工业大学 | 基于视觉惯性信息融合的三维软件稳像方法 |
CN114089624A (zh) * | 2021-07-20 | 2022-02-25 | 武汉高德红外股份有限公司 | 一种基于实时fft变换的峰值器扰动抑制***和方法 |
CN114567726A (zh) * | 2022-02-25 | 2022-05-31 | 苏州安智汽车零部件有限公司 | 一种类人眼自适应消抖前视摄像头 |
CN114979489A (zh) * | 2022-05-30 | 2022-08-30 | 西安理工大学 | 基于陀螺仪的重型装备生产场景视频监控稳像方法及*** |
CN116208855A (zh) * | 2023-04-28 | 2023-06-02 | 杭州未名信科科技有限公司 | 一种多塔机云台全景图像抖动协调抑制方法和*** |
CN117687346A (zh) * | 2024-02-01 | 2024-03-12 | 中国科学院长春光学精密机械与物理研究所 | 舰载光电经纬仪的空间稳像控制***及控制方法 |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110677578A (zh) * | 2019-08-14 | 2020-01-10 | 北京理工大学 | 基于仿生眼平台的混合稳像方法和装置 |
CN113406646B (zh) * | 2021-06-18 | 2023-03-31 | 北京师范大学 | 基于多方向超声测距及imu进行三维定位的方法和设备 |
CN113359462B (zh) * | 2021-06-25 | 2022-12-20 | 北京理工大学 | 一种基于扰动解耦与补偿的仿生眼稳像***及方法 |
CN113949812A (zh) * | 2021-10-21 | 2022-01-18 | 浙江大立科技股份有限公司 | 一种基于分块kalman运动预测的电子稳像方法 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101729783A (zh) * | 2009-12-22 | 2010-06-09 | 上海大学 | 基于类人眼球前庭动眼反射的双目视觉***在颠簸环境中的图像稳定方法 |
CN102572220A (zh) * | 2012-02-28 | 2012-07-11 | 北京大学 | 3-3-2空间信息转换新模式的仿生复眼运动目标检测 |
CN104135598A (zh) * | 2014-07-09 | 2014-11-05 | 清华大学深圳研究生院 | 一种视频图像稳定方法及装置 |
CN105306785A (zh) * | 2015-10-27 | 2016-02-03 | 武汉工程大学 | 一种基于sift特征匹配和vfc算法的电子稳像方法及*** |
CN109951631A (zh) * | 2017-12-11 | 2019-06-28 | 高途乐公司 | 联合的机械和电子图像稳定 |
CN110677578A (zh) * | 2019-08-14 | 2020-01-10 | 北京理工大学 | 基于仿生眼平台的混合稳像方法和装置 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101316368B (zh) * | 2008-07-18 | 2010-04-07 | 西安电子科技大学 | 基于全局特征点迭代的全景稳像方法 |
CN102148934B (zh) * | 2011-04-02 | 2013-02-06 | 北京理工大学 | 一种多模式实时电子稳像*** |
CN105976330B (zh) * | 2016-04-27 | 2019-04-09 | 大连理工大学 | 一种嵌入式雾天实时视频稳像方法 |
CN108307118B (zh) * | 2018-02-10 | 2020-07-07 | 北京理工大学 | 一种基于惯导参数流形优化的低延时视频稳像方法 |
- 2019-08-14: CN application CN201910753067.5A filed (patent CN110677578A, active, Pending)
- 2020-04-25: WO application PCT/CN2020/086889 filed (WO2021027323A1, active, Application Filing)
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101729783A (zh) * | 2009-12-22 | 2010-06-09 | 上海大学 | 基于类人眼球前庭动眼反射的双目视觉***在颠簸环境中的图像稳定方法 |
CN102572220A (zh) * | 2012-02-28 | 2012-07-11 | 北京大学 | 3-3-2空间信息转换新模式的仿生复眼运动目标检测 |
CN104135598A (zh) * | 2014-07-09 | 2014-11-05 | 清华大学深圳研究生院 | 一种视频图像稳定方法及装置 |
CN105306785A (zh) * | 2015-10-27 | 2016-02-03 | 武汉工程大学 | 一种基于sift特征匹配和vfc算法的电子稳像方法及*** |
CN109951631A (zh) * | 2017-12-11 | 2019-06-28 | 高途乐公司 | 联合的机械和电子图像稳定 |
CN110677578A (zh) * | 2019-08-14 | 2020-01-10 | 北京理工大学 | 基于仿生眼平台的混合稳像方法和装置 |
Non-Patent Citations (1)
Title |
---|
CHEN, XIAOPENG ET AL.: "Hybrid Image Stabilization of Robotic Bionic Eyes", 2018 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND BIOMIMETICS (ROBIO), 15 December 2018 (2018-12-15), pages 808 - 813, XP033529560 * |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113240597A (zh) * | 2021-05-08 | 2021-08-10 | 西北工业大学 | 基于视觉惯性信息融合的三维软件稳像方法 |
CN113240597B (zh) * | 2021-05-08 | 2024-04-26 | 西北工业大学 | 基于视觉惯性信息融合的三维软件稳像方法 |
CN114089624A (zh) * | 2021-07-20 | 2022-02-25 | 武汉高德红外股份有限公司 | 一种基于实时fft变换的峰值器扰动抑制***和方法 |
CN114089624B (zh) * | 2021-07-20 | 2024-05-17 | 武汉高德红外股份有限公司 | 一种基于实时fft变换的峰值器扰动抑制***和方法 |
CN114567726A (zh) * | 2022-02-25 | 2022-05-31 | 苏州安智汽车零部件有限公司 | 一种类人眼自适应消抖前视摄像头 |
CN114979489A (zh) * | 2022-05-30 | 2022-08-30 | 西安理工大学 | 基于陀螺仪的重型装备生产场景视频监控稳像方法及*** |
CN116208855A (zh) * | 2023-04-28 | 2023-06-02 | 杭州未名信科科技有限公司 | 一种多塔机云台全景图像抖动协调抑制方法和*** |
CN116208855B (zh) * | 2023-04-28 | 2023-09-01 | 杭州未名信科科技有限公司 | 一种多塔机云台全景图像抖动协调抑制方法和*** |
CN117687346A (zh) * | 2024-02-01 | 2024-03-12 | 中国科学院长春光学精密机械与物理研究所 | 舰载光电经纬仪的空间稳像控制***及控制方法 |
Also Published As
Publication number | Publication date |
---|---|
CN110677578A (zh) | 2020-01-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021027323A1 (zh) | 基于仿生眼平台的混合稳像方法和装置 | |
CN107255476B (zh) | 一种基于惯性数据和视觉特征的室内定位方法和装置 | |
WO2019084804A1 (zh) | 一种视觉里程计及其实现方法 | |
US11057567B2 (en) | Anti-shake method and apparatus for panoramic video, and portable terminal | |
CN106550174B (zh) | 一种基于单应性矩阵的实时视频稳像方法 | |
TWI479881B (zh) | 藉由結合方位感測器讀數及影像校準估計的3d視訊穩定之系統、方法及電腦程式產品 | |
US10609287B2 (en) | Stabilizing video | |
JP6170395B2 (ja) | 撮像装置およびその制御方法 | |
US20160165140A1 (en) | Method for camera motion estimation and correction | |
CN103841297B (zh) | 一种适用于合成运动摄像载体的电子稳像方法 | |
EP3296952A1 (en) | Method and device for blurring a virtual object in a video | |
CN113721260B (zh) | 一种激光雷达、双目相机和惯导的在线联合标定方法 | |
CN108900775B (zh) | 一种水下机器人实时电子稳像方法 | |
CN111405187A (zh) | 用于监控器材的图像防抖方法、***、设备和存储介质 | |
JP7253621B2 (ja) | パノラマ映像の手ぶれ補正方法及び携帯端末 | |
JP4661514B2 (ja) | 画像処理装置、および、画像処理方法、プログラム、並びに、記録媒体 | |
CN116052046A (zh) | 一种基于目标跟踪的动态场景双目视觉slam方法 | |
CN113240597B (zh) | 基于视觉惯性信息融合的三维软件稳像方法 | |
CN114429191A (zh) | 基于深度学习的电子防抖方法、***及存储介质 | |
CN111712857A (zh) | 图像处理方法、装置、云台和存储介质 | |
JP2016220083A (ja) | ローリングシャッタ回転歪み補正と映像安定化処理方法 | |
CN109462717A (zh) | 电子稳像方法及终端 | |
CN111696158B (zh) | 基于仿射匹配点对的多像机***相对位姿估计方法及装置 | |
Ryu et al. | Video stabilization for robot eye using IMU-aided feature tracker | |
Dai et al. | A tightly-coupled event-inertial odometry using exponential decay and linear preintegrated measurements |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20851460; Country of ref document: EP; Kind code of ref document: A1
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 20851460; Country of ref document: EP; Kind code of ref document: A1
| 32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 31/03/2023)
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 20851460 Country of ref document: EP Kind code of ref document: A1 |