CN113949812A - Electronic image stabilization method based on partitioned Kalman motion prediction - Google Patents

Electronic image stabilization method based on partitioned Kalman motion prediction

Info

Publication number
CN113949812A
Authority
CN
China
Prior art keywords
state
matrix
motion
image
time
Prior art date
Legal status
Pending
Application number
CN202111229346.5A
Other languages
Chinese (zh)
Inventor
王国秀
庞惠民
车宏
Current Assignee
Zhejiang Dali Technology Co ltd
Original Assignee
Zhejiang Dali Technology Co ltd
Application filed by Zhejiang Dali Technology Co ltd
Priority to CN202111229346.5A
Publication of CN113949812A
Legal status: Pending


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/68: Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/681: Motion detection
    • H04N23/6811: Motion detection based on the image signal
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/68: Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/682: Vibration or motion blur correction

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Analysis (AREA)

Abstract

An electronic image stabilization method based on partitioned Kalman motion prediction comprises the following steps: each image of the video stream is divided into blocks; an affine transformation matrix is calculated for each block between adjacent frames; local motion estimation is carried out by Kalman filtering, which removes the irregular jitter in the video stream while retaining the scanning motion of the camera; local motion compensation is then applied; and finally the compensated image is globally optimized to complete the electronic image stabilization. By describing the image block by block, the invention effectively reduces the image distortion caused by motion compensation; the Kalman filtering technique predicts the intentional motion well and removes jitter; and global optimization and image fusion reduce the blank image regions produced after motion compensation.

Description

Electronic image stabilization method based on partitioned Kalman motion prediction
Technical Field
The invention relates to an electronic image stabilization method based on partitioned Kalman motion prediction, and belongs to the technical field of image processing.
Background
When the camera carrier shoots, irregular motions such as vibration and shake inevitably occur, so the observed image sequence is unstable and the video finally output to the display is a shaky, blurred picture; electronic image stabilization technology (EIST) addresses this problem. For example, navigation, surveillance and camera systems in the civil field, as well as surveillance and reconnaissance, navigation and guidance systems in military applications, all suffer from video instability caused by the uncertain jitter of the camera carrier. In addition, in some usage scenarios the working environment of the system is far from ideal and the camera carrier shakes strongly, so the acquired video jitters irregularly and the imaging quality degrades. This harms visual applications that need to acquire information or make judgments from the video, such as subjective observation and computer processing, increases the difficulty of image processing, and creates great difficulty for the subsequent processing in applications such as military reconnaissance, identification and monitoring.
Disclosure of Invention
The technical problem to be solved by the invention is to overcome the defects of the prior art and to provide an electronic image stabilization method based on partitioned Kalman motion prediction. Specifically, each image of the video stream is divided into blocks; an affine transformation matrix is calculated for each block between adjacent frames; local motion estimation is carried out by Kalman filtering, removing the irregular jitter in the video stream while retaining the scanning motion of the camera; local motion compensation is applied; and finally the compensated image is globally optimized to complete the electronic image stabilization.
The purpose of the invention is realized by the following technical scheme:
an electronic image stabilization method based on block Kalman motion prediction comprises the following steps:
dividing each frame of input image into a plurality of sub-blocks;
extracting and matching inter-block feature points of the reference frame and the target frame;
calculating an inter-block affine matrix;
performing motion estimation by using Kalman filtering, and then performing motion compensation;
and carrying out global optimization and image fusion so as to reduce image blank areas generated by motion compensation.
Preferably, each frame image is divided proportionally into a plurality of sub-blocks of size M × N.
Preferably, Harris corner points of each sub-block are detected in the reference frame and the target frame respectively; the two corner point sets are then matched to obtain coordinate correspondences; mismatched points are removed with the RANSAC algorithm to improve matching accuracy; and the coordinates are substituted into an affine motion transformation model and the affine matrix is solved to obtain the local motion vector.
Preferably, a Kalman state equation and an observation equation are set up, the motion state to be predicted is introduced, a motion estimate is obtained after prediction and update, and this estimate is compared with the initial motion state to obtain the motion compensation amount used to compensate the sub-block.
Preferably, the Harris corner detection algorithm moves a local window using differential operators and introduces a Gaussian smoothing factor through Taylor series expansion, which enhances noise robustness; corner points in the inter-frame images are matched using a similarity measure; and mismatched point pairs are removed using the random sample consensus (RANSAC) algorithm.
Preferably, the similarity measure is used to match the corner points in the inter-frame images, with the matching criterion:

C(p, q) = (1 / (2D + 1)²) · Σ_{x=-D..D} Σ_{y=-D..D} [I₁(p_x + x, p_y + y) - μ₁] · [I₂(q_x + x, q_y + y) - μ₂] / (σ₁ · σ₂)

where C(p, q) is the matching value of corner points p and q in the two images, I₁ and I₂ are the two images to be matched, (p_x, p_y) are the corner coordinates in image I₁ and (q_x, q_y) are the corner coordinates in image I₂, μ₁, σ₁ and μ₂, σ₂ are the mean and standard deviation of the pixels in the square region of radius D around the corresponding corner points of the two images, and (x, y) is the pixel offset within that region.
Preferably, the Kalman motion state equation and measurement equation are set as:

x_k = A·x_{k-1} + B·u_k + f_k
Z_k = H·x_k + v_k

where x_k is the system state matrix at time k, Z_k is the observation of the state matrix at time k, A is the state transition matrix, B is the control input matrix, x_{k-1} is the system state matrix at time k-1, u_k is the control input at time k, f_k is the process noise at time k, which has zero mean and covariance matrix Q_k, H is the state observation matrix, and v_k is the measurement noise at time k, which has zero mean and covariance matrix R_k.

The state prediction equations are:

x̂_k⁻ = A·x̂_{k-1} + B·u_k
P_k⁻ = A·P_{k-1}·Aᵀ + Q_k

where x̂_k⁻ is the state prediction at time k, also called the a priori state estimate, P_k⁻ is the predicted state covariance at time k, x̂_{k-1} is the state estimate at time k-1, and P_{k-1} is the state covariance estimate at time k-1.

The state update equations are:

K_k = P_k⁻·Hᵀ·(H·P_k⁻·Hᵀ + R_k)⁻¹
x̂_k = x̂_k⁻ + K_k·(Z_k - H·x̂_k⁻)
P_k = (I - K_k·H)·P_k⁻

where K_k is the optimal Kalman gain at time k, Hᵀ is the transpose of the state observation matrix H, x̂_k is the state estimate at time k, P_k is the updated error covariance estimate at time k, and I is the identity matrix of the same dimension as the state matrix.
Compared with the prior art, the invention has the following beneficial effects:
the method has the advantages that the image distortion caused by motion compensation can be effectively reduced by carrying out block description on the image, subjective motion can be well predicted and jitter can be removed through the Kalman filtering technology, and image blank areas generated after motion compensation are reduced through global optimization and image fusion.
Drawings
FIG. 1 is a flow chart of the method steps of the present invention.
Fig. 2 is a schematic diagram of image blocks of each frame.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
The technical idea of the electronic image stabilization method based on partitioned Kalman motion prediction is as follows: although the irregular motion of the camera carrier causes irregular jitter in the video sequence, the carrier also has intentional (subjective) motion, such as directional scanning and static observation. The intentional motion of the camera is estimated by calculating motion vectors between frames; the unintentional motion, namely the irregular jitter, is removed and the intentional motion is retained, so the video is stabilized. The main process, shown in fig. 1, is: 1) image blocking; 2) inter-block feature point extraction and matching; 3) inter-block affine matrix calculation; 4) local motion estimation by Kalman filtering; 5) motion compensation; 6) global optimization and image fusion to complete the electronic image stabilization.
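Before the step-by-step details, the following is a minimal, runnable sketch of the whole loop, simplified to a single global motion per frame instead of the per-block processing described in this patent; it also substitutes pyramidal Lucas-Kanade tracking and OpenCV's robust similarity estimator for the Harris/NCC matching and full affine model detailed below, purely to keep the sketch short. The function name stabilize and all numeric parameter values are assumptions, not part of the patent.

import cv2
import numpy as np

def stabilize(frames):
    """Return jitter-compensated copies of a list of BGR frames."""
    prev_gray = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY)
    traj = np.zeros(3)          # accumulated (dx, dy, da) trajectory
    smooth = np.zeros(3)        # Kalman-smoothed (intentional) trajectory
    P = np.ones(3)              # per-component state covariance
    Q, R = 4e-3, 0.25           # process / measurement noise (assumed values)
    out = [frames[0]]
    for frame in frames[1:]:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # assumes the scene has enough texture for corner detection and tracking
        p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                     qualityLevel=0.01, minDistance=30,
                                     useHarrisDetector=True)
        p1, st, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, p0, None)
        A, _ = cv2.estimateAffinePartial2D(p0[st == 1], p1[st == 1],
                                           method=cv2.RANSAC)
        if A is None:                               # fall back to identity motion
            A = np.array([[1, 0, 0], [0, 1, 0]], np.float32)
        d = np.array([A[0, 2], A[1, 2], np.arctan2(A[1, 0], A[0, 0])])
        traj += d
        # scalar Kalman predict/update on each trajectory component
        P = P + Q
        K = P / (P + R)
        smooth = smooth + K * (traj - smooth)
        P = (1 - K) * P
        dx, dy, da = d + (smooth - traj)            # keep intentional motion only
        M = np.array([[np.cos(da), -np.sin(da), dx],
                      [np.sin(da),  np.cos(da), dy]])
        out.append(cv2.warpAffine(frame, M, (frame.shape[1], frame.shape[0])))
        prev_gray = gray
    return out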
The method comprises the following steps:
1. Each frame of the input image is divided proportionally into several sub-blocks of size M × N, as shown in fig. 2 (a blocking sketch is given after this list).
2. Harris corner points of each sub-block are detected in the reference frame and the target frame respectively; the two corner point sets are matched to obtain coordinate correspondences; mismatched points are removed with the RANSAC algorithm to improve matching accuracy; and the coordinates are substituted into an affine motion transformation model, solving the affine matrix to obtain the local motion vector.
3. A Kalman state equation and an observation equation are designed, the motion state to be predicted is introduced, a motion estimate is obtained after prediction and update, and this estimate is compared with the initial motion state to obtain the motion compensation amount used to compensate the sub-block.
4. Global optimization and image fusion are carried out to reduce the blank image areas generated by motion compensation.
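A short sketch of step 1, under the assumption that the frame is split into a fixed rows × cols grid of roughly equal sub-blocks; the grid shape and the helper name split_into_blocks are illustrative, not specified by the patent.

import numpy as np

def split_into_blocks(frame, rows=4, cols=4):
    """Map (i, j) grid indices to (y0, y1, x0, x1) sub-block bounds."""
    h, w = frame.shape[:2]
    ys = np.linspace(0, h, rows + 1, dtype=int)
    xs = np.linspace(0, w, cols + 1, dtype=int)
    return {(i, j): (ys[i], ys[i + 1], xs[j], xs[j + 1])
            for i in range(rows) for j in range(cols)}

# Each sub-block frame[y0:y1, x0:x1] is then processed independently in steps 2-3.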
The specific process of step 2 is as follows:
the Harris corner detection algorithm uses a differential operator to move, the condition of pixel gray level change in a window after the window moves any distance along any direction can be calculated through Taylor series expansion, and moreover, a Gaussian smoothing factor is introduced, so that the noise resistance is enhanced. Its autocorrelation function is:
c(u, v) = Σ_{x,y} w(x, y) · [I(x + u, y + v) - I(x, y)]²

where c(u, v) is the gray-level change of the moving window, (u, v) is the window displacement, (x, y) is an image pixel coordinate, w(x, y) is the Gaussian weighting function, and I(x, y) is the gray value of the image at (x, y).
After simplification by Taylor series expansion:

c(u, v) ≈ [u, v] · M · [u, v]ᵀ

where M is the Harris autocorrelation matrix, I_x is the gradient in the horizontal direction and I_y is the gradient in the vertical direction:

M = Σ_{x,y} w(x, y) · [ I_x²  I_x·I_y ; I_x·I_y  I_y² ]
according to a calculation formula of a quadratic term function characteristic value, we can solve the characteristic value of an M matrix, calculate a corner response value R to judge a corner:
R=detM-α(traceM)2
wherein detM ═ λ1λ2To prove the determinant value of the matrix M, traceM ═ λ12Is the trace value of the matrix M, λ1And λ2The matrix M is characterized by a constant value of α, which is usually 0.04-0.06.
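A compact NumPy/OpenCV sketch of the response map described above, using Sobel gradients, a Gaussian-weighted autocorrelation matrix M, and R = det M - α·(trace M)²; the smoothing scale sigma and α = 0.05 are assumed values, with α inside the 0.04 to 0.06 range quoted above.

import cv2
import numpy as np

def harris_response(gray, sigma=1.5, alpha=0.05):
    """Return the Harris corner response map R for a grayscale image."""
    gray = gray.astype(np.float32)
    Ix = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)   # horizontal gradient I_x
    Iy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)   # vertical gradient I_y
    # Gaussian-weighted entries of the autocorrelation matrix M
    Ixx = cv2.GaussianBlur(Ix * Ix, (0, 0), sigma)
    Iyy = cv2.GaussianBlur(Iy * Iy, (0, 0), sigma)
    Ixy = cv2.GaussianBlur(Ix * Iy, (0, 0), sigma)
    det_m = Ixx * Iyy - Ixy * Ixy
    trace_m = Ixx + Iyy
    return det_m - alpha * trace_m ** 2

# Corners are typically taken as local maxima of R above a chosen threshold.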
The corner points in the inter-frame images are then matched using a similarity measure, with the matching criterion:

C(p, q) = (1 / (2D + 1)²) · Σ_{x=-D..D} Σ_{y=-D..D} [I₁(p_x + x, p_y + y) - μ₁] · [I₂(q_x + x, q_y + y) - μ₂] / (σ₁ · σ₂)

where C(p, q) is the matching value of corner points p and q in the two images, I₁ and I₂ are the two images to be matched, (p_x, p_y) are the corner coordinates in image I₁ and (q_x, q_y) are the corner coordinates in image I₂, and μ₁, σ₁ and μ₂, σ₂ are the mean and standard deviation of the pixels in the square region of radius D around the corresponding corner points of the two images.
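A sketch of this patch-correlation matching; the normalization used here (zero-mean normalized correlation over the (2D + 1) × (2D + 1) window) is a reconstruction, and the function name ncc and the 0.8 threshold mentioned in the closing comment are assumptions.

import numpy as np

def ncc(img1, p, img2, q, d=7):
    """Normalized correlation of the (2d+1) x (2d+1) patches around corners p and q."""
    # assumes both corners lie at least d pixels away from the image border
    (px, py), (qx, qy) = p, q
    w1 = img1[py - d:py + d + 1, px - d:px + d + 1].astype(np.float64)
    w2 = img2[qy - d:qy + d + 1, qx - d:qx + d + 1].astype(np.float64)
    w1 -= w1.mean()
    w2 -= w2.mean()
    denom = np.sqrt((w1 ** 2).sum() * (w2 ** 2).sum()) + 1e-12
    return float((w1 * w2).sum() / denom)

# For each corner p of the reference block, keep the target-block corner q with
# the highest score, optionally requiring it to exceed a threshold such as 0.8.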
The mismatched point pairs are then removed using the random sample consensus (RANSAC) algorithm. Classical RANSAC repeatedly tries different parameters of the target space in order to maximize an objective function: in each iteration a random subset of the data set is selected and used to generate a model estimate, the estimate is tested against the remaining points of the data set and given a score, and the highest-scoring estimate is finally taken as the model of the whole data set. A threshold must be set to judge whether a given point is an inlier of the current model.
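A sketch of this outlier-rejection step using OpenCV's built-in RANSAC-based affine estimator; the 3-pixel reprojection threshold and the function name ransac_filter are assumptions.

import cv2
import numpy as np

def ransac_filter(ref_pts, tgt_pts, thresh=3.0):
    """ref_pts, tgt_pts: (n, 2) float32 arrays of matched corner coordinates."""
    A, inliers = cv2.estimateAffine2D(ref_pts, tgt_pts,
                                      method=cv2.RANSAC,
                                      ransacReprojThreshold=thresh)
    mask = inliers.ravel().astype(bool)       # True for points consistent with A
    return A, ref_pts[mask], tgt_pts[mask]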
The affine transformation model is then calculated from the matched point pairs to obtain the local motion vector. The affine transformation of the image is:

x = a·x′ + b·y′ + e
y = c·x′ + d·y′ + f

where (x′, y′) are the pixel coordinates before the affine transformation, (x, y) are the pixel coordinates after the affine transformation, and the six parameters a to f form the affine transformation matrix.
The motion-compensated affine transformation model is:

x_{t+1} = a·x_t + b·y_t + e
y_{t+1} = c·x_t + d·y_t + f

where (x_{t+1}, y_{t+1}) are the feature pixel coordinates of the current frame, (x_t, y_t) are the feature pixel coordinates of the reference frame, and the parameters a to f form the affine transformation matrix between the feature points of the current frame and the reference frame.
From the n feature points obtained by registration, n such affine constraints can be formed, and the 6 parameters are solved by energy minimization:

T_x = Σ_{i=1..n} [x_i - (a·X_i + b·Y_i + e)]²
T_y = Σ_{i=1..n} [y_i - (c·X_i + d·Y_i + f)]²

where x₁, x₂, …, x_n and y₁, y₂, …, y_n are the pixel coordinates of the n feature points of the current frame, X₁, X₂, …, X_n and Y₁, Y₂, …, Y_n are the pixel coordinates of the n feature points of the reference frame, and T_x and T_y are the sums of squared errors in the horizontal and vertical directions.
To minimize T_x and T_y, the partial derivatives with respect to the six parameters are taken and set to zero, which gives the normal equations whose solution is the parameters a to f:

a·ΣX_i² + b·ΣX_i·Y_i + e·ΣX_i = ΣX_i·x_i
a·ΣX_i·Y_i + b·ΣY_i² + e·ΣY_i = ΣY_i·x_i
a·ΣX_i + b·ΣY_i + e·n = Σx_i

and the analogous system in c, d and f with y_i on the right-hand side.
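Equivalently, the six parameters can be obtained with an ordinary least-squares solver, which solves the same normal equations numerically; the function name solve_affine and the variable layout are illustrative.

import numpy as np

def solve_affine(ref_pts, cur_pts):
    """ref_pts: (n, 2) reference-frame points; cur_pts: (n, 2) current-frame points."""
    X, Y = ref_pts[:, 0], ref_pts[:, 1]
    G = np.column_stack([X, Y, np.ones(len(X))])        # design matrix [X Y 1]
    (a, b, e), *_ = np.linalg.lstsq(G, cur_pts[:, 0], rcond=None)
    (c, d, f), *_ = np.linalg.lstsq(G, cur_pts[:, 1], rcond=None)
    return np.array([[a, b, e], [c, d, f]])             # 2 x 3 affine matrix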
the specific process of the third step is as follows:
designing a kalman state equation and an observation equation, introducing a motion state to be predicted, predicting and updating to obtain a motion estimation value, and comparing the motion estimation value with an initial motion state to obtain a motion compensation quantity to compensate the subblock.
Subjective motion is extracted from the motion vector by utilizing Kalman filtering, and the motion vector of the current moment is predicted according to the motion estimation value and the covariance estimation value of the previous moment.
The Kalman motion state equation and measurement equation are established as:

x_k = A·x_{k-1} + B·u_k + f_k
Z_k = H·x_k + v_k

where x_k is the system state matrix at time k, Z_k is the observation (actual measurement) of the state matrix at time k, A is the state transition matrix, B is the control input matrix, x_{k-1} is the system state matrix at time k-1, u_k is the control input at time k, f_k is the process noise at time k, which has zero mean and covariance matrix Q_k, H is the state observation matrix, and v_k is the measurement noise at time k, which has zero mean and covariance matrix R_k.

The state prediction equations are:

x̂_k⁻ = A·x̂_{k-1} + B·u_k
P_k⁻ = A·P_{k-1}·Aᵀ + Q_k

where x̂_k⁻ is the state prediction at time k, also called the a priori state estimate, P_k⁻ is the predicted state covariance at time k, x̂_{k-1} is the state estimate at time k-1, and P_{k-1} is the state covariance estimate at time k-1.

The state update equations are:

K_k = P_k⁻·Hᵀ·(H·P_k⁻·Hᵀ + R_k)⁻¹
x̂_k = x̂_k⁻ + K_k·(Z_k - H·x̂_k⁻)
P_k = (I - K_k·H)·P_k⁻

where K_k is the optimal Kalman gain at time k, Hᵀ is the transpose of the state observation matrix H, x̂_k is the state estimate at time k, P_k is the updated error covariance estimate at time k, and I is the identity matrix of the same dimension as the state matrix.
After the motion estimate x̂_k is obtained, the jitter motion vector is obtained by comparing it with the actual (measured) motion state value, and local motion compensation is completed by compensating for this jitter motion vector.
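A direct sketch of the prediction and update equations above for a single motion component (for example the horizontal translation of one sub-block), assuming a constant-velocity state [position, velocity] and omitting the control term B·u_k; the values chosen for A, H, Q and R are placeholders.

import numpy as np

class Kalman1D:
    def __init__(self):
        self.A = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition matrix
        self.H = np.array([[1.0, 0.0]])               # observe the position only
        self.Q = np.eye(2) * 1e-3                     # process noise covariance Q_k
        self.R = np.array([[0.25]])                   # measurement noise covariance R_k
        self.x = np.zeros((2, 1))                     # state estimate
        self.P = np.eye(2)                            # error covariance estimate

    def step(self, z):
        # prediction: x_k^- = A x_{k-1},  P_k^- = A P_{k-1} A^T + Q
        x_pred = self.A @ self.x
        P_pred = self.A @ self.P @ self.A.T + self.Q
        # update: K_k = P^- H^T (H P^- H^T + R)^-1, then correct with the residual
        K = P_pred @ self.H.T @ np.linalg.inv(self.H @ P_pred @ self.H.T + self.R)
        self.x = x_pred + K @ (np.array([[z]]) - self.H @ x_pred)
        self.P = (np.eye(2) - K @ self.H) @ P_pred
        return float(self.x[0, 0])                    # smoothed (intentional) motion

# Usage: kf = Kalman1D(); for each frame compute smooth = kf.step(z_k) from the
# measured block motion z_k; the difference z_k - smooth is the jitter to compensate.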
Those skilled in the art will appreciate that those matters not described in detail in the present specification are well known in the art.
Although the present invention has been described with reference to preferred embodiments, they are not intended to limit the invention; those skilled in the art can use the methods and technical content disclosed above to make variations and modifications to the invention without departing from its spirit and scope.

Claims (7)

1. An electronic image stabilization method based on block Kalman motion prediction is characterized by comprising the following steps:
dividing each frame of input image into a plurality of sub-blocks;
extracting and matching inter-block feature points of the reference frame and the target frame;
calculating an inter-block affine matrix;
performing motion estimation by using Kalman filtering, and then performing motion compensation;
and carrying out global optimization and image fusion so as to reduce image blank areas generated by motion compensation.
2. The method of claim 1, wherein each frame image is divided proportionally into a plurality of sub-blocks of size M × N.
3. The electronic image stabilization method according to claim 1, wherein Harris corner points of each sub-block are detected in the reference frame and the target frame respectively; the two corner point sets are then matched to obtain coordinate correspondences; mismatched points are removed with the RANSAC algorithm to improve matching accuracy; and the coordinates are substituted into an affine motion transformation model and the affine matrix is solved to obtain the local motion vectors.
4. The electronic image stabilization method according to claim 1, wherein a Kalman state equation and an observation equation are set up, the motion state to be predicted is introduced, a motion estimate is obtained after prediction and update, and this estimate is compared with the initial motion state to obtain the motion compensation amount used to compensate the sub-block.
5. The electronic image stabilization method according to claim 3, wherein the Harris corner detection algorithm moves a local window using differential operators and introduces a Gaussian smoothing factor through Taylor series expansion, which enhances noise robustness; corner points in the inter-frame images are matched using a similarity measure; and mismatched point pairs are removed using the random sample consensus (RANSAC) algorithm.
6. The electronic image stabilization method according to claim 5, wherein the similarity measure is used to match the corner points in the inter-frame images, the matching criterion being:

C(p, q) = (1 / (2D + 1)²) · Σ_{x=-D..D} Σ_{y=-D..D} [I₁(p_x + x, p_y + y) - μ₁] · [I₂(q_x + x, q_y + y) - μ₂] / (σ₁ · σ₂)

where C(p, q) is the matching value of corner points p and q in the two images, I₁ and I₂ are the two images to be matched, (p_x, p_y) are the corner coordinates in image I₁ and (q_x, q_y) are the corner coordinates in image I₂, and μ₁, σ₁ and μ₂, σ₂ are the mean and standard deviation of the pixels in the square region of radius D around the corresponding corner points of the two images.
7. The electronic image stabilization method of claim 1, wherein:
the set kalman motion state equation and the measurement equation are as follows:
xk=Axk-1+Buk+fk
Zk=Hxk+vk
wherein x iskIs the system state matrix at time k, ZkIs the observed quantity of the state matrix at the time k, A is the state transition matrix, B is the control input matrix, xk-1Is the system state matrix at time k-1, ukTo control the input matrix at time k, fkProcess noise at time k, which is coincident with a mean of zero, and a covariance matrix of QkH is a state observation matrix, vkThe measured noise at time k is zero mean and the covariance matrix is Rk
The state prediction equation is:
Figure FDA0003315434210000021
Figure FDA0003315434210000022
wherein,
Figure FDA0003315434210000023
is a state prediction value at the moment k, also called a priori state estimation value,
Figure FDA0003315434210000024
the covariance is estimated for the state at time k,
Figure FDA0003315434210000025
is a predicted value of the state at time k-1, Pk-1The covariance is estimated for the state at time k-1.
The state update equation is:
Figure FDA0003315434210000026
Figure FDA0003315434210000027
Figure FDA0003315434210000028
wherein, KKFor optimal kalman gain at time k, HTIn order to be a transpose of the state observation matrix H,
Figure FDA0003315434210000029
for state estimation at time k, PkAnd I is an identity matrix with the same dimension of the state matrix, and is an error covariance estimated value updated at the moment k.
CN202111229346.5A (filed 2021-10-21, priority date 2021-10-21): Electronic image stabilization method based on partitioned Kalman motion prediction, published as CN113949812A, status Pending

Priority Applications (1)

CN202111229346.5A (published as CN113949812A), priority and filing date 2021-10-21: Electronic image stabilization method based on partitioned Kalman motion prediction


Publications (1)

CN113949812A, published 2022-01-18

Family

ID=79331950

Family Applications (1)

CN202111229346.5A, filed 2021-10-21, status Pending: Electronic image stabilization method based on partitioned Kalman motion prediction

Country Status (1)

Country Link
CN (1) CN113949812A (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103402045A (en) * 2013-08-20 2013-11-20 长沙超创电子科技有限公司 Image de-spin and stabilization method based on subarea matching and affine model
CN104853064A (en) * 2015-04-10 2015-08-19 海视英科光电(苏州)有限公司 Electronic image-stabilizing method based on infrared thermal imager
CN107222662A (en) * 2017-07-12 2017-09-29 中国科学院上海技术物理研究所 A kind of electronic image stabilization method based on improved KLT and Kalman filtering
CN110677578A (en) * 2019-08-14 2020-01-10 北京理工大学 Mixed image stabilization method and device based on bionic eye platform
CN110796010A (en) * 2019-09-29 2020-02-14 湖北工业大学 Video image stabilization method combining optical flow method and Kalman filtering
CN113256679A (en) * 2021-05-13 2021-08-13 湖北工业大学 Electronic image stabilization algorithm based on vehicle-mounted rearview mirror system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CHUNG-MING KUO ET AL.: "An Efficient Motion Estimation Algorithm for Video Coding Using Kalman Filter" *
HAO LI ET AL.: "An Efficient Image Matching Algorithm Based on Adaptive Threshold and RANSAC" *

Similar Documents

Publication Publication Date Title
Gunjal et al. Moving object tracking using kalman filter
CN110796010B (en) Video image stabilizing method combining optical flow method and Kalman filtering
KR100985805B1 (en) Apparatus and method for image stabilization using adaptive Kalman filter
US8290212B2 (en) Super-resolving moving vehicles in an unregistered set of video frames
CN107452015B (en) Target tracking system with re-detection mechanism
CN108198201A (en) A kind of multi-object tracking method, terminal device and storage medium
JP2016508652A (en) Determining object occlusion in image sequences
US20110074927A1 (en) Method for determining ego-motion of moving platform and detection system
CN111383252B (en) Multi-camera target tracking method, system, device and storage medium
CN105374049B (en) Multi-corner point tracking method and device based on sparse optical flow method
CN108629792A (en) Laser eyepiece detection method and device based on background modeling Yu background difference
CN112580683B (en) Multi-sensor data time alignment system and method based on cross correlation
Kim et al. Spatio-temporal weighting in local patches for direct estimation of camera motion in video stabilization
CN110738667A (en) RGB-D SLAM method and system based on dynamic scene
CN116433728A (en) DeepSORT target tracking method for shake blur scene
CN117011381A (en) Real-time surgical instrument pose estimation method and system based on deep learning and stereoscopic vision
CN111160362B (en) FAST feature homogenizing extraction and interframe feature mismatching removal method
Heimbach et al. Improving object tracking accuracy in video sequences subject to noise and occlusion impediments by combining feature tracking with Kalman filtering
CN116883897A (en) Low-resolution target identification method
CN113949812A (en) Electronic image stabilization method based on partitioned Kalman motion prediction
CN115170621A (en) Target tracking method and system under dynamic background based on relevant filtering framework
Gokul et al. Lucas Kanade based Optical Flow for Vehicle Motion Tracking and Velocity Estimation
CN106934818B (en) Hand motion tracking method and system
CN114882070A (en) Binocular vision-based three-dimensional target motion tracking method
KR102173244B1 (en) Video stabilization system based on SURF

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication
Application publication date: 20220118