CN116797497B - Image processing method, device, equipment and storage medium


Info

Publication number
CN116797497B
CN116797497B (application CN202311072199.4A, published as CN202311072199A)
Authority
CN
China
Prior art keywords
image
processed
transformation matrix
determining
frame
Prior art date
Legal status
Active
Application number
CN202311072199.4A
Other languages
Chinese (zh)
Other versions
CN116797497A (en)
Inventor
Name withheld at the inventor's request
Current Assignee
Moore Threads Technology Co Ltd
Original Assignee
Moore Threads Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Moore Threads Technology Co Ltd filed Critical Moore Threads Technology Co Ltd
Priority to CN202311072199.4A priority Critical patent/CN116797497B/en
Publication of CN116797497A publication Critical patent/CN116797497A/en
Application granted granted Critical
Publication of CN116797497B publication Critical patent/CN116797497B/en

Landscapes

  • Image Processing (AREA)

Abstract

The embodiment of the application provides an image processing method, an image processing device, image processing equipment and a storage medium, wherein the method comprises the following steps: acquiring an image to be processed, and executing image stabilizing processing on the image to be processed until an image stabilizing image corresponding to the image to be processed is obtained; the image stabilization process includes: performing de-dithering processing on the image to be processed based on an image transformation matrix to obtain a processed image corresponding to the image to be processed; updating the image transformation matrix to obtain a new image transformation matrix in response to the processed image not meeting the image output condition; and determining the processed image as a stable image in response to the processed image meeting an image output condition.

Description

Image processing method, device, equipment and storage medium
Technical Field
The embodiment of the application relates to the technical field of computers, in particular to an image processing method, an image processing device and a storage medium.
Background
Currently, in video image stabilization processing, a display device (for example, a video camera, a smart phone, etc.) generally uses digital image stabilization and image stabilization combined with a sensor to realize image stabilization, and after obtaining a stabilized image, the stabilized image needs to be cut to eliminate an invalid region in the stabilized image.
However, in the case of a display device with large shake, the related art always has an invalid region after clipping, and even if the invalid region is pixel-filled, the output image has a significant image frame.
Disclosure of Invention
Based on the problems existing in the related art, the embodiment of the application provides an image processing method, an image processing device and a storage medium.
The technical scheme of the embodiment of the application is realized as follows:
the embodiment of the application provides an image processing method, which comprises the following steps:
acquiring an image to be processed, and executing image stabilizing processing on the image to be processed until an image stabilizing image corresponding to the image to be processed is obtained;
the image stabilization process includes:
performing de-dithering processing on the image to be processed based on an image transformation matrix to obtain a processed image corresponding to the image to be processed;
updating the image transformation matrix to obtain a new image transformation matrix in response to the processed image not meeting the image output condition;
and determining the processed image as a stable image in response to the processed image meeting an image output condition.
In some embodiments, the updating the image transformation matrix to obtain a new image transformation matrix in response to the processed image not meeting an image output condition includes: comparing the positions of the image clipping frame and the processed image; and adjusting preset smoothing parameters in the image transformation matrix to obtain the new image transformation matrix in response to at least one region of the image clipping frame being located outside the boundary of the processed image.
In some embodiments, the method further comprises: determining, in the processed image, a reference point for characterizing a position of the image cropping frame; and determining the image clipping frame according to the reference point and the image size of the stable image.
In some embodiments, the performing, based on the image transformation matrix, a de-dithering process on the image to be processed to obtain a processed image corresponding to the image to be processed, including: based on the image transformation matrix, carrying out data transformation on vertex coordinates of a plurality of vertexes of the image to be processed to obtain transformation coordinates corresponding to each vertex; and determining the processed image by taking the point corresponding to each transformation coordinate as a fixed point.
In some embodiments, the determining the processed image as a stabilized image in response to the processed image satisfying an image output condition includes: and determining an image area corresponding to the image clipping frame in the processed image as the stable image in response to the image clipping frame being positioned in the boundary of the processed image.
In some embodiments, the method further comprises: acquiring acceleration data and rotation data of image acquisition equipment when acquiring the image to be processed; determining attitude angle data when acquiring a previous frame of image of the image to be processed according to the acceleration data; determining attitude angle increment data when the image to be processed is acquired according to the rotation data, the attitude angle data and preset smoothing parameters; and determining the image transformation matrix according to the attitude angle data and the attitude angle increment data.
In some embodiments, the updating the image transformation matrix to obtain a new image transformation matrix includes: increasing preset smoothing parameters in the image transformation matrix to obtain first adjustment smoothing parameters; and determining an image transformation matrix corresponding to the first adjustment smoothing parameter as the new image transformation matrix.
In some embodiments, the method further comprises: acquiring a multi-frame image and vertex coordinates of each frame of image before the image to be processed at the acquisition moment; carrying out smoothing treatment on vertex coordinates of an image of a previous frame of the image to be treated according to preset smoothing parameters to obtain vertex smoothing coordinates of the image of the previous frame of the image to be treated; and determining the image transformation matrix according to the vertex coordinates of the image of the previous frame of the image to be processed and the vertex smooth coordinates.
In some embodiments, the updating the image transformation matrix to obtain a new image transformation matrix includes: reducing preset smoothing parameters in the image transformation matrix to obtain second adjusted preset smoothing parameters; and determining an image transformation matrix corresponding to the second adjustment smoothing parameter as the new image transformation matrix.
An embodiment of the present application provides an image processing apparatus including:
the image stabilizing processing module is used for acquiring an image to be processed and executing image stabilizing processing on the image to be processed until an image stabilizing image corresponding to the image to be processed is obtained;
the image stabilizing processing module is further used for performing debouncing processing on the image to be processed based on the image transformation matrix to obtain a processed image corresponding to the image to be processed; updating the image transformation matrix to obtain a new image transformation matrix in response to the processed image not meeting the image output condition; and determining the processed image as a stable image in response to the processed image meeting an image output condition.
The image processing device provided by the embodiment of the application comprises a memory and a processor, wherein the memory stores a computer program capable of running on the processor, and the processor realizes the image processing method provided by the embodiment of the application when executing the program.
The computer readable storage medium provided by the embodiment of the application stores executable instructions thereon, and the executable instructions are used for causing a processor to execute the executable instructions to implement the image processing method provided by the embodiment of the application.
Embodiments of the present application provide a computer program product comprising executable instructions stored in a computer-readable storage medium; the image processing method provided by the embodiment of the application is realized when the processor of the image processing device reads the executable instructions from the computer readable storage medium and executes the executable instructions.
The image processing method, device, equipment and computer readable storage medium provided by the embodiment of the application perform image stabilizing processing on the image to be processed. If the processed image after stabilization meets the image output condition, the stabilized image is output; if it does not meet the image output condition, the image transformation matrix is updated and the image stabilizing processing is performed again with the updated image transformation matrix, so that a stabilized image satisfying the output condition is obtained by adjusting the image transformation matrix. In this way, no invalid pixels exist in the output stabilized image, the edges of the stabilized image do not need to be filled, and the image effect of the stabilized image is improved.
Drawings
FIG. 1 is a schematic view of an image cropping frame according to an embodiment of the present application;
Fig. 2 is a schematic view of an application scenario of an image processing method according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of an alternative image processing method according to an embodiment of the present application;
FIG. 4 is a schematic diagram of image data transformation provided by an embodiment of the present application;
FIG. 5 is a second flowchart of an alternative image processing method according to an embodiment of the present application;
FIG. 6 is a schematic diagram I of a multi-frame image according to an embodiment of the present application;
FIG. 7 is a third flowchart of an alternative image processing method according to an embodiment of the present application;
FIG. 8 is a second schematic diagram of a multi-frame image according to an embodiment of the present application;
FIG. 9 is a schematic view of a clipping window provided by an embodiment of the present application;
fig. 10 is a schematic diagram of a composition structure of an image processing apparatus according to an embodiment of the present application;
fig. 11 is a schematic diagram of a composition structure of an image processing apparatus provided in an embodiment of the present application.
Detailed Description
For more clearly illustrating the objects, technical solutions and advantages of the embodiments of the present application, the embodiments of the present application will be described in detail below with reference to the accompanying drawings. It is to be understood that the following description of the embodiments is intended to illustrate and explain the general principles of the embodiments of the application, and should not be taken as limiting the embodiments of the application. In the description and drawings, the same or similar reference numerals refer to the same or similar parts or components. For purposes of clarity, the drawings are not necessarily drawn to scale and some well-known components and structures may be omitted from the drawings.
In some embodiments, unless otherwise defined, technical or scientific terms used in the embodiments of the application should be given the ordinary meanings as understood by those of ordinary skill in the art to which the embodiments of the application belong. The terms "first," "second," and the like, as used in embodiments of the present application, do not denote any order, quantity, or importance, but rather are used to distinguish one element from another. The terms "a" or "an" do not exclude a plurality. The word "comprising" or "comprises", and the like, means that elements or items preceding the word are included in the element or item listed after the word and equivalents thereof, but does not exclude other elements or items. The terms "connected" or "connected," and the like, are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "upper", "lower", "left", "right", "top" or "bottom" and the like are used only to indicate a relative positional relationship, which may be changed accordingly when the absolute position of the object to be described is changed. When an element such as a layer, film, region or substrate is referred to as being "on" or "under" another element, it can be "directly on" or "under" the other element or intervening elements may be present.
In the related art, digital image stabilization smooths the spatial motion trajectories of the four corner points of the images over multiple groups of adjacent frames to obtain smoothed four-corner coordinates, forms transformation pairs from the original and transformed corner points, calculates an affine or perspective transformation matrix of the image, and finally transforms the image to obtain the stabilized image. Sensor-based image stabilization collects the spatial six-axis attitude angles of the current image frame through a sensor, smooths the attitude angles of adjacent frames to obtain the smoothed attitude angles of the current frame, then obtains an affine or perspective transformation matrix of the image, and finally transforms the image to obtain the stabilized image. However, both digital image stabilization and sensor-based image stabilization require cropping of the transformed image to eliminate the invalid areas outside the image caused by the image transformation.
In the related art, an image is cropped through a fixed cropping window: the center point of the original image is taken as the center point of a new cropping window, a partial area of the original image is selected as the final output image window, and the redundant parts are cut off. This method can eliminate the invalid area of a partial transformation, but when the motion of the image acquisition device is large and stronger anti-shake is needed, the transformation is more drastic and the distortion of the transformed image is larger, so an invalid area is still introduced. As shown in fig. 1, which is a schematic diagram of an image cropping frame provided by the embodiment of the application, image (a) in fig. 1 is an untransformed original image, and the cropping frame lies within the boundary of the image; image (b) is the stabilized image after de-jittering: when the de-jitter transformation is severe, the image is severely distorted, and invalid areas still exist inside the cropping frame.
The related art uses a pixel edge filling method to prevent invalid unknown pixels or black pixels from entering a final output image area, but causes a significant frame phenomenon in the final output image.
Based on the problems existing in the related art, the embodiment of the application provides an image processing method, which obtains the coordinates of the four corner points of the transformed image based on the image transformation matrix, and determines, according to the spatial position relationship between the four corner points of the transformed image and the rectangular corner points of the image cropping frame, whether any corner point of the image cropping frame falls outside the transformed image; if so, the image transformation matrix is adjusted again. In this way, it can be ensured that no invalid pixels exist in the image output by the embodiment of the application.
The image processing method provided by the embodiment of the application can be executed by an electronic device such as an image processing device, where the electronic device may be implemented as various types of terminals such as a notebook computer, a tablet computer, a desktop computer, a set-top box, or a mobile device (for example, a mobile phone, a portable music player, a personal digital assistant, a dedicated messaging device or a portable game device), and may also be implemented as a server. The server may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, content delivery networks (Content Delivery Network, CDN), big data and artificial intelligence platforms.
In the following, an exemplary application when the image processing apparatus is implemented as a server will be described, and a technical method in an embodiment of the present application will be clearly and completely described with reference to the drawings in the embodiment of the present application.
Fig. 2 is an application scenario schematic diagram of an image processing method according to an embodiment of the present application. The image processing system 20 provided in the embodiment of the present application includes an image capturing device 100, a network 200, and a server 300, where the image capturing device 100 may be an image capturing device such as a mobile phone or a camera, the image capturing device 100 may have a display interface 100-1 thereon, and the network 200 may be a wide area network or a local area network, or a combination of both. The server 300 and the image processing apparatus may be physically separate or integrated. When performing image processing, the server 300 may acquire the image to be processed acquired by the image acquisition device 100 through the network 200, acquire the image to be processed, perform image stabilization processing on the image to be processed until an image stabilization image corresponding to the image to be processed is obtained, send the image stabilization image to the image acquisition device 100, and display the image stabilization image on the display interface 100-1 of the image acquisition device 100.
Referring to fig. 3, fig. 3 is an optional flowchart of an image processing method according to an embodiment of the present application, where the image processing method according to the embodiment of the present application may be implemented by step S301:
step 301, an image to be processed is obtained, and image stabilizing processing is performed on the image to be processed until an image stabilizing image corresponding to the image to be processed is obtained.
Here, the image to be processed may be an image in which a certain frame in the video captured by the image capturing apparatus has distortion compared to the original image. When the image acquisition device acquires video or images, the image acquisition device generates jitter in moving, so that part of image frames in acquired video data have distortion compared with original images, and therefore, the image stabilization processing, namely the debouncing processing, is required to be carried out on the images with distortion.
In some embodiments, the image stabilization process in step S301 may be implemented by steps S3011 to S3013:
step S3011, performing de-jittering processing on an image to be processed based on an image transformation matrix to obtain a processed image corresponding to the image to be processed.
In the embodiment of the application, the image transformation matrix can be used for representing the position transformation relation between the angular point coordinates (namely, the vertex coordinates) in the video image of the last frame of the image to be processed in the video and the corresponding angular point coordinates in the image to be processed, and the image to be processed can be subjected to debouncing processing or correcting processing according to the image transformation matrix so as to correct the image. The embodiment of the application can obtain the image transformation matrix corresponding to the image to be processed through digital image stabilization and image stabilization combined with the sensor.
In the embodiment of the application, after the image transformation matrix is determined, the image to be processed can be subjected to debouncing processing through the image transformation matrix, so that the debounced processed image is obtained, and compared with the image to be processed, the processed image corrects image distortion.
In some embodiments, the de-dithering of the image to be processed may be implemented by step S1 and step S2:
and step S1, carrying out data transformation on vertex coordinates of a plurality of vertexes of the image to be processed based on the image transformation matrix to obtain transformation coordinates corresponding to each vertex.
In some embodiments, the coordinates may be determined based on a coordinate system in the current image acquisition scene, for example, a vertex of the image to be processed may be taken as an origin of coordinates, and vertex coordinates of multiple vertices of the image to be processed may be obtained according to the size of the image to be processed. Here, the image to be processed may be a rectangular image, or may be other shapes.
In the embodiment of the application, based on an image transformation matrix, vertex coordinates of a plurality of vertexes of an image to be processed are subjected to data transformation to obtain coordinates after transformation of each vertex, namely coordinates after debouncing. Here, the data transformation may refer to multiplying the coordinates of each vertex by an image transformation matrix to obtain transformed coordinates corresponding to each vertex, so as to implement image transformation.
And S2, determining the processed image by taking the point corresponding to each transformation coordinate as a fixed point.
In the embodiment of the application, the point corresponding to the coordinate after the debouncing of each vertex is taken as a fixed point, each fixed point is the vertex of the image after the debouncing of the image to be processed, and the processed image can be obtained by connecting the fixed points.
Fig. 4 is a schematic diagram of image data transformation provided in an embodiment of the present application. As shown in fig. 4, in the current image acquisition scene, the four vertices of the image 401 to be processed are A, B, C and D, with coordinates A(x0, y0), B(x1, y1), C(x2, y2) and D(x3, y3), respectively. After the vertex coordinates are subjected to data transformation, a processed image 402 is obtained; the vertices of the processed image 402 are a, b, c and d, and their coordinates are a(x0', y0'), b(x1', y1'), c(x2', y2') and d(x3', y3').
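For illustration of steps S1 and S2 and fig. 4, the following sketch applies a 3x3 image transformation matrix to the four vertex coordinates in homogeneous form; the matrix values and the 1920x1080 frame size are assumptions for exposition, not values from the embodiment.

```python
import numpy as np

def transform_vertices(vertices, M):
    """Apply a 3x3 image transformation matrix to 2D vertex coordinates
    (homogeneous multiplication followed by a perspective divide)."""
    out = []
    for x, y in vertices:
        px, py, pw = M @ np.array([x, y, 1.0])
        out.append((px / pw, py / pw))
    return out

# Assumed example: vertices A, B, C, D of a 1920x1080 image and a near-identity matrix.
corners = [(0, 0), (1919, 0), (0, 1079), (1919, 1079)]
M = np.array([[1.0,  0.02,  5.0],
              [-0.01, 1.0, -3.0],
              [0.0,   0.0,  1.0]])
print(transform_vertices(corners, M))   # transformed vertices a, b, c, d
```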
And step S3012, updating the image transformation matrix to obtain a new image transformation matrix in response to the processed image not meeting the image output condition.
In some embodiments, after the image to be processed is subjected to the de-dithering process, the resulting size of the processed image may not meet the size requirement of the output image; for example, the image output device requires a 1080P output image while the processed image has a different size. Therefore, it is necessary to crop the processed image according to the parameters of the image output device (which may be the size of the output image) so that the image can be normally displayed on the image output device. Here, the image output device and the image acquisition device may be the same or different; for example, the image output device and the image acquisition device are both mobile phones, or the image acquisition device is a camera and the image output device is a notebook computer connected to the camera.
Here, the image output condition may be that no invalid pixels exist in the image obtained after cropping the processed image; that is, the processed image not satisfying the image output condition means that invalid pixels exist in the cropped image obtained after cropping the processed image. When the processed image does not meet the image output condition, the image transformation matrix can be updated to obtain a new image transformation matrix, the de-jittering processing is performed on the image to be processed based on the new image transformation matrix, and the processed image corresponding to the image to be processed is obtained again. If the processed image still does not meet the image output condition, the image transformation matrix is updated again to obtain a new image transformation matrix, and the de-jittering processing is performed on the image to be processed based on the new image transformation matrix, and so on until the processed image meets the image output condition and the stabilized image corresponding to the image to be processed is obtained.
According to the embodiment of the application, the image transformation matrix is iteratively adjusted according to whether the de-jittered image meets the image output condition, until the de-jittered image meets the image output condition. Therefore, the embodiment of the application can ensure that the pixels of the stabilized image inside the image cropping window are all valid image pixels, no unnecessary edge pixel filling is needed, artifacts such as false frames are eliminated, and the image effect of the stabilized output is improved.
In some embodiments, updating the image transformation matrix may be an adjustment to preset smoothing parameters in the image transformation matrix in order to reduce or increase the intensity of the debounce of the image to be processed. When the image transformation matrix is determined based on the sensor, the smaller the preset smoothing parameter in the image transformation matrix is, the better the debouncing effect is; therefore, the adjusting the preset smoothing parameter in the image transformation matrix may be to increase the preset smoothing parameter in the image transformation matrix to obtain an adjusted preset smoothing parameter, and obtain a new image transformation matrix based on the adjusted preset smoothing parameter.
In some embodiments, when determining the image transformation matrix based on digital anti-shake, the larger the preset smoothing parameter in the image transformation matrix, the better the de-jitter effect; therefore, adjusting the preset smoothing parameter in the image transformation matrix may be reducing the preset smoothing parameter in the image transformation matrix to obtain the adjusted preset smoothing parameter, and obtaining a new image transformation matrix based on the adjusted preset smoothing parameter.
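A minimal sketch of the parameter-update direction described in the two paragraphs above; the scaling factors 1.25 and 0.8 are arbitrary assumptions, as the embodiment only specifies increasing the parameter for the sensor-based matrix and reducing it for the digital-anti-shake matrix.

```python
def update_smoothing_parameter(K, sensor_based):
    """Weaken the de-jitter strength when the processed image fails the
    image output condition.

    sensor_based: True if the matrix comes from the sensor (smaller K means
    stronger smoothing there, so K is increased); False for digital anti-shake
    (larger K means stronger smoothing there, so K is reduced).
    """
    return K * 1.25 if sensor_based else K * 0.8
```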
Step S3013, determining the processed image as a stable image in response to the processed image satisfying an image output condition.
In the embodiment of the application, the step S3013 and the step S3012 are parallel schemes, and if the image after the de-jittering processing meets the image output condition, that is, no invalid pixels exist in the image after the clipping, the image after the clipping is taken as a stable image; if the image after the debouncing processing does not meet the image output condition, namely invalid pixels exist in the cut image, preset smoothing parameters in the new image transformation matrix can be adjusted again until the image after the debouncing processing meets the image output condition, and the image stabilizing image is obtained.
After the image to be processed is debounced through the image transformation matrix, the image processing method provided by the embodiment of the application determines whether the debounced image meets the image output condition, if not, the image transformation matrix is updated to change the image debouncing intensity, and the image stabilizing image meeting the output condition is obtained through the new image transformation matrix. Therefore, the image stabilizing image output by the embodiment of the application has no invalid pixels, and the edges of the image stabilizing image are not required to be filled, so that the image effect of image stabilizing output is improved, and the efficiency of image stabilizing image output is also improved.
In some embodiments, the preset smoothing parameter characterizes a smoothing intensity when the image to be processed is subjected to the debouncing process, and when the processed image after the debouncing process does not meet the image output condition, that is, when invalid pixels exist in the clipping frame, the smoothing intensity during the debouncing process can be adjusted, that is, the preset smoothing parameter is adjusted.
Fig. 5 is a schematic flow chart of an alternative image processing method according to an embodiment of the present application, and step S3012, namely updating the image transformation matrix to obtain a new image transformation matrix, may be implemented by steps S501 to S502:
and step S501, comparing the position of the image clipping frame with the position of the processed image.
In some embodiments, the image cropping frame is configured to crop the post-processed image after the de-dithering process to obtain a stable image that conforms to the size of the image output device. In some embodiments, determining the image cropping frame may be achieved by step S5011 and step S5012:
step S5011, in the processed image, determining a reference point for characterizing the position of the image cropping frame.
In some embodiments, the reference point is used to characterize the position of the image cropping frame on the processed image, and based on the position of the image cropping frame, the position of the output stabilized image on the processed image may be determined. Here, in order to minimize the occurrence of invalid pixels, the reference point may be the common center point of the processed image, the image to be processed and the image cropping frame; based on this reference point, the position of the image cropping frame on the processed image may be determined.
And S5012, determining the image clipping frame according to the reference point and the image size of the stable image.
In some embodiments, the image size of the stabilized image, i.e., the image size of the image output device, may be the same as the size of the image cropping frame, which may be determined based on the reference point and the image size of the stabilized image.
After the position of the image clipping frame is determined on the processed image, the image clipping frame is compared with the processed image in position, and whether invalid pixels exist in the image clipping frame or not is determined.
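A sketch of determining the image cropping frame from the reference point and the image size of the stabilized image, assuming the reference point is the shared center point and the frame is axis-aligned; the 1280x720 output size is only an example.

```python
def cropping_frame(reference_point, out_width, out_height):
    """Corner coordinates of an axis-aligned image cropping frame of the
    output image size, centred on the reference point."""
    cx, cy = reference_point
    hw, hh = out_width / 2.0, out_height / 2.0
    return [(cx - hw, cy - hh),   # top-left
            (cx + hw, cy - hh),   # top-right
            (cx + hw, cy + hh),   # bottom-right
            (cx - hw, cy + hh)]   # bottom-left

# Example: a 1280x720 cropping frame centred on a 1920x1080 processed image.
print(cropping_frame((960, 540), 1280, 720))
```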
Step S502, adjusting a preset smoothing parameter in the image transformation matrix in response to at least one region of the image clipping frame being located outside the boundary of the processed image, so as to obtain the new image transformation matrix.
In some embodiments, after determining the position of the image cropping frame having the image size on the processed image, if at least one region in the image cropping frame is located outside the boundary of the processed image, it is indicated that there are invalid pixels in the image after cropping the processed image, at this time, the preset smoothing parameters in the image transformation matrix need to be adjusted to weaken the debounce intensity, a new image transformation matrix is obtained, and the image with reduced debounce intensity is obtained through the new image transformation matrix.
In some embodiments, if at least two vertices of the processed image are located inside the image cropping frame, this also indicates that the de-jitter intensity is too high and that cropping may leave a plurality of invalid pixels in the image; therefore, the preset smoothing parameters in the image transformation matrix may also be adjusted to weaken the de-jitter intensity, a new image transformation matrix is obtained, and an image with reduced de-jitter intensity is obtained through the new image transformation matrix.
By comparing the positions of the image cropping frame with the image size and the processed image, the embodiment of the application can determine whether the currently de-jittered processed image meets the image output condition, and re-adjusts the transformation of the image to be processed when the image output condition is not met, thereby avoiding outputting a stabilized image that contains invalid pixels and would leave part of the area of a certain video frame without picture content.
In some embodiments, when the processed image does not meet the image output condition, that is, at least one region in the image clipping frame is located outside the boundary of the processed image, after updating the image transformation matrix to obtain a new image transformation matrix, performing transformation processing on the coordinates of multiple vertices of the image to be processed according to the new image transformation matrix, that is, multiplying the coordinates of each vertex with the new image transformation matrix to obtain coordinates after transformation of each vertex, that is, coordinates after de-dithering, and obtaining the image after re-processing according to the coordinates after de-dithering. If the image cropping frame is located within the boundary of the reprocessed image, it is indicated that there are no invalid pixels after the reprocessed image is cropped by the image cropping frame. At this time, the image area corresponding to the image clipping frame in the image after the re-processing can be determined as the output image stabilization image.
In the embodiment of the application, if the image clipping frame is positioned in the boundary of the processed image, the processed image is described as meeting the image output condition, and at the moment, the image area corresponding to the image clipping frame in the processed image is determined as the stable image. When the image transformation matrix corresponding to the image to be processed is calculated, the image transformation matrix can be obtained based on the anti-shake method of the sensor, and the image transformation matrix can be obtained through the digital anti-shake method.
In some embodiments, obtaining the image transformation matrix based on the sensor may be achieved through steps S01 to S04:
and step S01, acquiring acceleration data and rotation data of the image acquisition equipment when the image to be processed is acquired.
Here, the acceleration data may be measured by an accelerometer of the image acquisition device for characterizing the acceleration of the image acquisition device when acquiring the image to be processed. The rotation data are collected through a gyroscope of the image collection device and are used for representing the rotation angle of the image collection device when the image to be processed is collected.
In some embodiments, the gyroscope may obtain three attitude angles in the inertial measurement unit (IMU) coordinate system, where the coordinate origin of the IMU coordinate system is the common origin of the gyroscope and the accelerometer, and the three xyz axis directions are parallel to the respective axes of the gyroscope and the accelerometer. The heading angle yaw rotates about the Z axis of the IMU by an angle y; the pitch angle pitch rotates about the Y axis of the IMU by an angle p; and the roll angle roll rotates about the X axis of the IMU by an angle r.
In some embodiments, according to three attitude angles, a rotation matrix of three axes of the IMU coordinate system may be obtained, where the rotation matrix of three axes is shown in formulas (1) to (3):
M_x = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos r & -\sin r \\ 0 & \sin r & \cos r \end{pmatrix} (1);
M_y = \begin{pmatrix} \cos p & 0 & \sin p \\ 0 & 1 & 0 \\ -\sin p & 0 & \cos p \end{pmatrix} (2);
M_z = \begin{pmatrix} \cos y & -\sin y & 0 \\ \sin y & \cos y & 0 \\ 0 & 0 & 1 \end{pmatrix} (3);
wherein Mx, my and Mz are rotation matrixes of three axes of an IMU coordinate system respectively; r is the angle of rotation of roll angle row about the X axis of the IMU; the p pitch angle pitch rotates around the Y axis of the IMU; y is the angle of rotation of the heading angle yaw about the Z-axis of the IMU.
And step S02, determining attitude angle data when acquiring the previous frame of image of the image to be processed according to the acceleration data.
In the embodiment of the application, the acceleration data are the 3 values read by the accelerometer when the image to be processed is acquired. The attitude angle data when the previous frame of the image to be processed was acquired, namely the rotation angles of roll, pitch and yaw, are determined based on the acceleration data, the three-axis rotation matrices, and the 3 values read by the accelerometer when it is placed horizontally; the value of the yaw angle can be obtained from the gyroscope parameters.
And S03, determining attitude angle increment data when the image to be processed is acquired according to the rotation data, the attitude angle data and the preset smoothing parameters.
In the embodiment of the application, the attitude angle increment data when the image to be processed is acquired can be obtained from the attitude angle data when the previous frame of the image to be processed was acquired, the rotation data measured by the gyroscope, and the preset smoothing parameter; the attitude angle increment data r', p' and y' are obtained by integrating formula (4) over time.
(4);
Wherein K is a preset smoothing parameter. The smoothing method employed here is a Kalman filter: the angle increment of the current frame equals a weighted combination of the previous frame's smoothed increment and the currently calculated increment. The smaller the K value, the better the angle increment smoothing effect.
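Formula (4) is not reproduced above; the sketch below only illustrates the weighting described in this paragraph (the current increment weighted by K against the previous frame's smoothed increment) and is an assumption about the formula's shape, not a reproduction of it.

```python
def smooth_increment(current_increment, previous_smoothed, K):
    """Kalman-filter-style weighting of an attitude angle increment: a smaller
    preset smoothing parameter K gives more weight to the previous frame's
    smoothed increment, i.e. a stronger smoothing effect (assumed form)."""
    return K * current_increment + (1.0 - K) * previous_smoothed

# Example: smooth the roll increment of the current frame with K = 0.2.
r_prime = smooth_increment(current_increment=0.8, previous_smoothed=0.1, K=0.2)
print(r_prime)
```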
And step S04, determining the image transformation matrix according to the attitude angle data and the attitude angle increment data.
In some embodiments, the image transformation matrix is as shown in equation (5):
(5);
When the image to be processed is de-jittered based on an image transformation matrix obtained from the sensor, and the de-jittered transformed image does not meet the image output condition, the preset smoothing parameter in the image transformation matrix is increased to weaken the de-jitter (i.e. smoothing) strength of the image, so that a first adjustment smoothing parameter is obtained; the image transformation matrix corresponding to the first adjustment smoothing parameter is determined to be the new image transformation matrix, de-jittering processing is performed on the image to be processed again through the new image transformation matrix, and if the de-jittered image meets the image output condition, the image is cropped and output. Here, determining the image transformation matrix corresponding to the first adjustment smoothing parameter as the new image transformation matrix may mean that the attitude angle increment data at the time the image to be processed was acquired are recalculated according to the rotation data, the attitude angle data and the first adjustment smoothing parameter, and the new image transformation matrix is calculated from the recalculated attitude angle increment data and the attitude angle data.
In some embodiments, obtaining the image transformation matrix based on the digital anti-shake method may be achieved through steps S05 to S07:
and S05, acquiring multi-frame images and vertex coordinates of each frame of image before the image to be processed at the acquisition moment.
Fig. 6 is a schematic diagram of a multi-frame image provided by the embodiment of the present application, as shown in fig. 6, four corner coordinates of four frames of images 1, 2, 3, and 4 in the same scene are (A0, B0, C0, D0), (A1, B1, C1, D1), (A2, B2, C2, D2), and (A3, B3, C3, D3), respectively.
And step S06, carrying out smoothing treatment on the vertex coordinates of the image of the previous frame of the image to be treated according to the preset smoothing parameters to obtain the vertex smoothing coordinates of the image of the previous frame of the image to be treated.
In the embodiment of the application, image debouncing processing is performed on a multi-frame image through preset smoothing parameters to obtain vertex smoothing coordinates of an image of a previous frame of the image to be processed, for example, four frames of images before the image to be processed at the acquisition time and vertex coordinates of each frame of images are obtained, wherein the vertex smoothing coordinates (A3 ', B3', C3 ', D3') of the image of the previous frame of the image to be processed after smoothing are calculated through a formula (6).
(6);
Wherein, K is a preset smoothing parameter, and the larger the K value is, the better the smoothing effect (i.e. debounce effect) is.
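Formula (6) is likewise not reproduced above. A plausible sketch, assuming the smoothed corner is a K-weighted blend of the corner's mean position over the preceding frames and its current position (larger K, stronger smoothing); this is an illustrative assumption, not the formula of the embodiment.

```python
import numpy as np

def smooth_corner(history, current, K):
    """Smooth one corner coordinate of the frame preceding the image to be
    processed, using that corner's positions in the earlier frames.

    history: list of (x, y) positions of the same corner in preceding frames.
    current: (x, y) position of the corner in the frame being smoothed.
    K: preset smoothing parameter; larger K gives stronger smoothing (assumed form).
    """
    mean_xy = np.mean(np.asarray(history, dtype=float), axis=0)
    return tuple(K * mean_xy + (1.0 - K) * np.asarray(current, dtype=float))

# Example with a made-up corner trajectory A0, A1, A2 and current corner A3 (fig. 6).
print(smooth_corner([(10, 12), (11, 13), (12, 11)], (30, 8), K=0.7))
```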
And S07, determining the image transformation matrix according to the vertex coordinates of the image of the previous frame of the image to be processed and the vertex smooth coordinates.
In the embodiment of the present application, based on the formula (7), the image transformation matrix may be obtained from the vertex coordinates (A3, B3, C3, D3) and the smoothed vertex smoothing coordinates (A3 ', B3', C3 ', D3').
(7);
The formula (8) is an image transformation matrix, and the value of the image transformation matrix in the formula (8) can be calculated according to the formula (7).
(8);
When the image to be processed is de-jittered based on the digital anti-shake method, and the de-jittered transformed image does not meet the image output condition, the preset smoothing parameter in the image transformation matrix is reduced to weaken the de-jitter (i.e. smoothing) strength of the image, so that a second adjustment smoothing parameter is obtained; the image transformation matrix corresponding to the second adjustment smoothing parameter is determined to be the new image transformation matrix, de-jittering processing is performed on the image to be processed again through the new image transformation matrix, and if the de-jittered image meets the image output condition, the image is cropped and output. Here, determining the image transformation matrix corresponding to the second adjustment smoothing parameter as the new image transformation matrix may mean that the vertex coordinates of the frame preceding the image to be processed are smoothed according to the second adjustment smoothing parameter to obtain the vertex smoothing coordinates of that frame, and the new image transformation matrix is calculated from the vertex coordinates and the vertex smoothing coordinates of the frame preceding the image to be processed.
In the following, an exemplary application of the embodiment of the present application in a practical application scenario will be described.
Based on the problems of the related art, fig. 7 provides a schematic flowchart of an alternative image processing method according to an embodiment of the present application; as shown in fig. 7, the image processing method according to the embodiment of the present application may be implemented by steps S701 to S707:
and step 701, obtaining a six-axis attitude angle through a smooth gyroscope.
In some embodiments, the gyroscope may obtain three attitude angles in the inertial measurement unit (IMU) coordinate system, where the coordinate origin of the IMU coordinate system is the common origin of the gyroscope and the accelerometer, and the three xyz axis directions are parallel to the respective axes of the gyroscope and the accelerometer. The heading angle yaw rotates about the Z axis of the IMU by an angle y; the pitch angle pitch rotates about the Y axis of the IMU by an angle p; and the roll angle roll rotates about the X axis of the IMU by an angle r.
From the three attitude angles, a rotation matrix of three axes of the IMU coordinate system can be obtained, and the rotation matrix of the three axes is shown in formulas (9) to (11):
M_x = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos r & -\sin r \\ 0 & \sin r & \cos r \end{pmatrix} (9);
M_y = \begin{pmatrix} \cos p & 0 & \sin p \\ 0 & 1 & 0 \\ -\sin p & 0 & \cos p \end{pmatrix} (10);
M_z = \begin{pmatrix} \cos y & -\sin y & 0 \\ \sin y & \cos y & 0 \\ 0 & 0 & 1 \end{pmatrix} (11);
In the embodiment of the application, the accelerometer measures the acceleration perceived by the device (i.e., the image acquisition device). At rest, only the gravitational acceleration is sensed, from which the values of the device attitude angles roll and pitch are calculated. When the accelerometer is placed horizontally, i.e. the Z axis points vertically upwards, the Z axis reads a value of 1g (g is the gravitational acceleration) and the X axis and Y axis read 0, which can be noted as (0, 0, g). When the accelerometer is rotated to a certain attitude, the gravitational acceleration produces corresponding components on the 3 axes of the accelerometer; these are essentially the coordinates of the ground-frame vector (0, 0, g) expressed in the accelerometer's own new coordinate system, and the 3 values read by the accelerometer are the new coordinates of the (0, 0, g) vector. This can be expressed by formula (12):
(12);
In some embodiments, solving formula (12) yields the roll and pitch angle values, and the yaw angle can be obtained from the gyroscope parameters.
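As an illustration of solving formula (12) for roll and pitch (the formula itself is not reproduced above), the following uses the common closed-form expressions for a static accelerometer; these expressions are an assumption for exposition and are not quoted from the embodiment.

```python
import math

def roll_pitch_from_accel(ax, ay, az):
    """Estimate roll and pitch (radians) from the gravity components read by a
    static accelerometer. Yaw is unobservable this way and comes from the
    gyroscope instead."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    return roll, pitch

# Horizontal placement reads (0, 0, g), giving roll = pitch = 0.
print(roll_pitch_from_accel(0.0, 0.0, 9.81))
```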
In some embodiments, the angular velocity measured by the gyroscope is the rotation rate about the 3 axes of the IMU coordinate system, and the angular velocity is therefore integrated to obtain the angle. Here, the attitude angles of the IMU at time n are r, p and y, which means that the IMU coordinate system is rotated from the initial position by an angle y about the Z axis, by an angle p about the Y axis, and by an angle r about the X axis to obtain the current attitude; the attitude at the next time (n+1) then needs to be calculated. Let the attitude angles at time n+1 be r_{n+1}, p_{n+1} and y_{n+1}; the pose undergoes 3 rotations. In the embodiment of the application, the attitude at time n+1 is obtained by adding the corresponding attitude angle variation on the basis of time n, where the variation of the attitude angle is obtained by integrating the angular velocity over the time period, as shown in formula (13):
(13);
In some embodiments, using the roll and pitch angle values obtained from the accelerometer, the angle delta required for the n+1 state can be solved, as shown in equation (14):
(14);
The image smoothing of the embodiment of the application filters the accumulated roll, pitch and yaw increments over the preceding and current frames so that the variation amplitudes of the smoothed increments roll', pitch' and yaw' are smaller, as shown in formula (15):
(15);
wherein K is a smoothing parameter (i.e., the preset smoothing parameter). The method adopted by the embodiment of the application is Kalman filtering: the angle increment of the current frame equals a weighted combination of the previous frame's angle increment and the currently calculated one. The smaller the K value, the better the angle increment smoothing effect. The smoothed angular increments thus obtained are roll', pitch' and yaw'.
Step S702, calculating the coordinates of the four corners of the current image by smoothing the coordinates of the four corners of the image of the plurality of frames.
Fig. 8 is a schematic diagram of a multi-frame image provided by an embodiment of the present application, as shown in fig. 8, the multi-frame image may be four-frame images, and four corner coordinates of the four-frame images 5, 6, 7, 8 in the same scene are (E0, F0, G0, H0), (E1, F1, G1, H1), (E2, F2, G2, H2) and (E3, F3, G3, H3), respectively.
The four-corner coordinates (A3 ', B3', C3 ', D3') of the image 4 can be calculated by the formula (16) after the four-corner coordinates (A3, B3, C3, D3) are smoothed.
(16);
Wherein, K is smoothing parameter, and the larger the K value is, the better the smoothing effect is.
And step 703, obtaining a transformation matrix according to the six-axis attitude angle or the angular point coordinate.
In the embodiment of the present application, obtaining the transformation matrix M (i.e., the image transformation matrix) based on the six-axis attitude angle can be achieved by the formula (17):
(17);
in the embodiment of the present application, the obtaining of the transformation matrix M based on the four corner coordinates (A3 ', B3', C3 ', D3') can be achieved by the formula (18):
(18);
wherein the transformation matrix M is as shown in the matrix (19):
(19);
the transformation matrix M can be calculated based on the formula (18).
And step S704, performing matrix transformation on four corner points of the image to be processed according to the transformation matrix to obtain transformation coordinates.
In the embodiment of the present application, the four corner points A(x0, y0), B(x1, y1), C(x2, y2) and D(x3, y3) of the current frame image (i.e., the image to be processed) may be transformed by multiplying the coordinates of each corner point by the transformation matrix to obtain the new coordinates A(x0', y0'), B(x1', y1'), C(x2', y2') and D(x3', y3'), as shown in fig. 9.
Step S705, judging whether the transformation coordinates are outside the four corner points of the clipping window.
The center point of the current frame image is taken as the midpoint of the cropping window (i.e., the image cropping frame), and a fixed cropping window size is set according to the output image size of the device (i.e., the image output device). Fig. 9 is a schematic view of a cropping window according to an embodiment of the present application. As shown in fig. 9, the four corner points of the transformed image frame are A(x0', y0'), B(x1', y1'), C(x2', y2') and D(x3', y3'), where A and B, A and C, D and B, and D and C are connected by straight lines. It is determined whether the four corner points a, b, c, d of the cropping window all lie on the inner side of these 4 connecting lines. If this condition is satisfied, the points a, b, c, d are considered to be inside the quadrilateral image frame formed by A, B, C, D, and the transformation matrix at this moment satisfies the requirement that no invalid point exists in the image corresponding to the cropping window after cropping (i.e. satisfies the image output condition), as shown in image (a) of fig. 9; if this condition is not satisfied, the cropping window is considered to have points outside the quadrilateral image frame formed by A, B, C, D, and the transformation matrix at this moment cannot satisfy the requirement that no invalid point exists in the cropping window after cropping, as shown in image (b) of fig. 9.
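A sketch of the "same side of all four connecting lines" test described in step S705; the ordering of the image corners around the quadrilateral boundary is an assumption, and the cross-product formulation is one common way to perform the test.

```python
def point_in_quad(p, quad):
    """True if point p lies inside (or on an edge of) the convex quadrilateral
    whose vertices are given in order around its boundary."""
    crosses = []
    for i in range(4):
        (x1, y1), (x2, y2) = quad[i], quad[(i + 1) % 4]
        # z component of the cross product of the edge vector and (p - edge start)
        crosses.append((x2 - x1) * (p[1] - y1) - (y2 - y1) * (p[0] - x1))
    return all(c >= 0 for c in crosses) or all(c <= 0 for c in crosses)

def crop_window_inside(crop_corners, image_corners):
    """Image output condition: every corner a, b, c, d of the cropping window
    lies inside the transformed image quadrilateral A, B, C, D."""
    return all(point_in_quad(c, image_corners) for c in crop_corners)

# Example: a 1280x720 window centred in a slightly warped 1920x1080 frame.
image_quad = [(4, 6), (1915, 2), (1921, 1075), (-2, 1081)]      # A, B, C, D in order
crop_quad = [(320, 180), (1600, 180), (1600, 900), (320, 900)]  # a, b, c, d
print(crop_window_inside(crop_quad, image_quad))                # True
```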
In the embodiment of the present application, if the transformation coordinates are not outside the clipping window four-corner points, step S706 is performed, and if the transformation coordinates are outside the clipping window four-corner points, step S707 is performed.
And step S706, clipping the image according to the clipping window to obtain an output image.
In some embodiments, in the case where no invalid point exists in the image corresponding to the cropping window, the image corresponding to the cropping window is taken as the output image.
Step S707, adjusting the smoothing parameter.
In the embodiment of the present application, when an invalid point exists in an image corresponding to a cropping window, the smoothing intensity is weakened, that is, the smoothing parameter is adjusted, the transformation matrix is recalculated, and the steps S701 to S705 are repeated until no invalid point exists in the image corresponding to the cropping window, so as to obtain an output image.
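Pulling steps S701 to S707 together, the following outer loop is a minimal sketch of the sensor-based branch; compute_transform stands in for formulas (9) to (17), the 1.25 step factor and the iteration cap are assumptions, and transform_vertices and crop_window_inside refer to the earlier sketches. Both corner lists are given in order around their boundaries.

```python
def stabilize(frame_corners, crop_corners, compute_transform, K,
              K_step=1.25, max_iterations=20):
    """Iterate steps S701-S707: build the transformation matrix, transform the
    four corners, test the cropping window, and weaken smoothing on failure."""
    M = None
    for _ in range(max_iterations):
        M = compute_transform(K)                              # S701-S703
        transformed = transform_vertices(frame_corners, M)    # S704
        if crop_window_inside(crop_corners, transformed):     # S705
            return M, K                                       # S706: crop and output
        K *= K_step                                           # S707: weaken smoothing
    return M, K                                               # weakest smoothing tried
```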
Based on the above image processing method, fig. 10 is a schematic diagram of a composition structure of an image processing apparatus according to an embodiment of the present application, as shown in fig. 10, the image processing apparatus 10 includes an image stabilizing processing module 101, where the image stabilizing processing module 101 is configured to obtain an image to be processed, and perform image stabilizing processing on the image to be processed until an image stabilizing image corresponding to the image to be processed is obtained; the image stabilizing processing module 101 is further configured to perform de-dithering processing on an image to be processed based on an image transformation matrix, so as to obtain a processed image corresponding to the image to be processed; updating the image transformation matrix to obtain a new image transformation matrix in response to the processed image not meeting the image output condition; and determining the processed image as a stable image in response to the processed image meeting an image output condition.
In some embodiments, the image stabilization processing module 101 is further configured to perform a position comparison on the image cropping frame and the processed image; and adjusting preset smoothing parameters in the image transformation matrix to obtain the new image transformation matrix in response to at least one region of the image clipping frame being located outside the boundary of the processed image.
In some embodiments, the image stabilization processing module 101 is further configured to determine, in the processed image, a reference point for characterizing a position of the image cropping frame; and determining the image clipping frame according to the reference point and the image size of the stable image.
In some embodiments, the image stabilizing processing module 101 is further configured to perform data transformation on vertex coordinates of a plurality of vertices of the image to be processed based on the image transformation matrix, to obtain transformation coordinates corresponding to each vertex; and determining the processed image by taking the point corresponding to each transformation coordinate as a fixed point.
In some embodiments, the image stabilization processing module 101 is further configured to determine, as the image stabilization image, an image area corresponding to the image cropping frame in the processed image in response to the image cropping frame being located within a boundary of the processed image.
In some embodiments, the image processing apparatus further includes a first acquisition module for acquiring acceleration data and rotation data of the image acquisition device when acquiring the image to be processed; the first determining module is used for determining attitude angle data when acquiring the image of the previous frame of the image to be processed according to the acceleration data; the second determining module is used for determining attitude angle increment data when the image to be processed is acquired according to the rotation data, the attitude angle data and the preset smoothing parameters; and the third determining module is used for determining the image transformation matrix according to the attitude angle data and the attitude angle increment data.
In some embodiments, the image stabilizing processing module 101 is further configured to: increase the preset smoothing parameter in the image transformation matrix to obtain a first adjusted smoothing parameter; and determine the image transformation matrix corresponding to the first adjusted smoothing parameter as the new image transformation matrix.
In some embodiments, the image processing apparatus further includes: a second acquisition module, configured to acquire multiple frames of images preceding the image to be processed at the acquisition moment, together with the vertex coordinates of each frame of image; a smoothing processing module, configured to smooth the vertex coordinates of the previous frame of the image to be processed according to the preset smoothing parameter, to obtain vertex smoothing coordinates of the previous frame of the image to be processed; and a fourth determining module, configured to determine the image transformation matrix according to the vertex coordinates and the vertex smoothing coordinates of the previous frame of the image to be processed.
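For this vertex-trajectory path, one assumed realisation smooths the vertex positions of the preceding frames with an exponential moving average governed by the preset smoothing parameter and then fits a perspective transform from the previous frame's raw vertices to their smoothed positions; the EMA form is an illustration, chosen so that reducing the parameter weakens the smoothing, consistent with claim 9.

```python
import numpy as np
import cv2

def transform_from_vertex_smoothing(vertex_history, smoothing):
    """vertex_history: (T, 4, 2) vertex coordinates of the T frames preceding the image
    to be processed. Returns a 3x3 transform mapping the previous frame's raw vertices to
    their smoothed positions (EMA smoothing is an assumption, not the patent's exact scheme)."""
    history = np.asarray(vertex_history, dtype=np.float32)
    smoothed = history[0].copy()
    for frame_vertices in history[1:]:
        # larger smoothing -> smoother trajectory; smaller smoothing -> weaker smoothing
        smoothed = smoothing * smoothed + (1.0 - smoothing) * frame_vertices
    prev_raw = history[-1]
    return cv2.getPerspectiveTransform(prev_raw, smoothed.astype(np.float32))
```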
In some embodiments, the image stabilizing processing module 101 is further configured to: reduce the preset smoothing parameter in the image transformation matrix to obtain a second adjusted smoothing parameter; and determine the image transformation matrix corresponding to the second adjusted smoothing parameter as the new image transformation matrix.
It should be noted that the description of the apparatus in the embodiment of the present application is similar to the description of the method embodiments above and has similar beneficial effects, so a detailed description is omitted here. For technical details not disclosed in this apparatus embodiment, reference may be made to the description of the method embodiments of the present application.
It should be noted that, in the embodiment of the present application, if the above-mentioned image processing method is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a terminal to perform all or part of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a Read Only Memory (ROM), a magnetic disk, an optical disk, or any other medium capable of storing program code. Thus, the embodiments of the present application are not limited to any specific combination of hardware and software.
An embodiment of the present application provides an image processing device. Fig. 11 is a schematic diagram of the composition structure of the image processing device provided in the embodiment of the present application. As shown in fig. 11, the image processing device 110 includes at least a processor 111 and a computer readable storage medium 112 configured to store executable instructions, wherein the processor 111 generally controls the overall operation of the image processing device. The computer readable storage medium 112 is configured to store instructions and applications executable by the processor 111, and may also cache data to be processed or already processed by each module in the processor 111 and the image processing device 110; it may be implemented by a flash memory or a Random Access Memory (RAM).
An embodiment of the present application provides a storage medium in which executable instructions are stored. When executed by a processor, the executable instructions cause the processor to perform the image processing method provided by the embodiments of the present application, for example, the method shown in fig. 3.
In some embodiments, the storage medium may be a computer readable storage medium, such as a Ferroelectric Random Access Memory (FRAM), a Read Only Memory (ROM), a Programmable Read Only Memory (PROM), an Erasable Programmable Read Only Memory (EPROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a flash memory, a magnetic surface memory, an optical disk, or a Compact Disc Read-Only Memory (CD-ROM); it may also be any device including one of, or any combination of, the above memories.
In some embodiments, the executable instructions may be in the form of programs, software modules, scripts, or code, written in any form of programming language (including compiled or interpreted languages, or declarative or procedural languages), and they may be deployed in any form, including as stand-alone programs or as modules, components, subroutines, or other units suitable for use in a computing environment.
As an example, the executable instructions may, but need not, correspond to files in a file system, may be stored as part of a file that holds other programs or data, for example, in one or more scripts in a hypertext markup language (HTML, hyper Text Markup Language) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code). As an example, executable instructions may be deployed to be executed on one computing device or on multiple computing devices located at one site or, alternatively, distributed across multiple sites and interconnected by a communication network.
The foregoing descriptions are merely exemplary embodiments of the present application and are not intended to limit the scope of the present application. Any modification, equivalent replacement, improvement, etc. made within the spirit and scope of the present application is included in the protection scope of the present application.

It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification do not necessarily all refer to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.

It should be understood that, in the various embodiments of the present application, the sequence numbers of the foregoing processes do not imply an order of execution; the order of execution of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application. The embodiment numbers of the present application are merely for description and do not indicate the relative merits of the embodiments.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, or apparatus. Without further limitation, an element defined by the phrase "comprising a/an ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.

In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are only illustrative; for example, the division of the units is only one kind of logical function division, and there may be other divisions in practice, for example: multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed.
The foregoing is merely an embodiment of the present application, but the scope of the present application is not limited thereto. Any person skilled in the art can readily conceive of changes or substitutions within the technical scope disclosed in the present application, and such changes and substitutions are intended to be covered by the protection scope of the present application.

Claims (12)

1. An image processing method, the method comprising:
acquiring an image to be processed, and executing image stabilizing processing on the image to be processed until an image stabilizing image corresponding to the image to be processed is obtained;
the image stabilization process includes:
acquiring an image transformation matrix;
performing de-dithering processing on the image to be processed based on the image transformation matrix to obtain a processed image corresponding to the image to be processed;
updating the image transformation matrix to obtain a new image transformation matrix in response to the processed image not meeting the image output condition; wherein the image output condition represents that the image clipping frame is located within the processed image;
and determining the processed image as a stable image in response to the processed image meeting an image output condition.
2. The method of claim 1, wherein updating the image transformation matrix in response to the processed image not satisfying an image output condition results in a new image transformation matrix, comprising:
comparing the positions of the image clipping frame and the processed image;
and adjusting preset smoothing parameters in the image transformation matrix to obtain the new image transformation matrix in response to at least one region of the image clipping frame being located outside the boundary of the processed image.
3. The method according to claim 2, wherein the method further comprises:
determining, in the processed image, a reference point for characterizing a position of the image cropping frame;
and determining the image clipping frame according to the reference point and the image size of the stable image.
4. The method according to claim 1, wherein the performing, based on the image transformation matrix, the de-dithering process on the image to be processed to obtain a processed image corresponding to the image to be processed includes:
based on the image transformation matrix, carrying out data transformation on vertex coordinates of a plurality of vertexes of the image to be processed to obtain transformation coordinates corresponding to each vertex;
and determining the processed image by taking the point corresponding to each transformation coordinate as a vertex.
5. The method of claim 1, wherein the determining the processed image as a stabilized image in response to the processed image satisfying an image output condition comprises:
and determining an image area corresponding to the image clipping frame in the processed image as the stable image in response to the image clipping frame being positioned within the boundary of the processed image.
6. The method according to any one of claims 1 to 5, further comprising:
acquiring acceleration data and rotation data of image acquisition equipment when acquiring the image to be processed;
determining attitude angle data when acquiring a previous frame of image of the image to be processed according to the acceleration data;
determining attitude angle increment data when the image to be processed is acquired according to the rotation data, the attitude angle data and preset smoothing parameters;
and determining the image transformation matrix according to the attitude angle data and the attitude angle increment data.
7. The method of claim 6, wherein updating the image transformation matrix to obtain a new image transformation matrix comprises:
increasing a preset smoothing parameter in the image transformation matrix to obtain a first adjusted smoothing parameter;
and determining an image transformation matrix corresponding to the first adjusted smoothing parameter as the new image transformation matrix.
8. The method according to any one of claims 1 to 5, further comprising:
acquiring multiple frames of images preceding the image to be processed at the acquisition moment, and vertex coordinates of each frame of image;
carrying out smoothing processing on vertex coordinates of the previous frame of the image to be processed according to a preset smoothing parameter to obtain vertex smoothing coordinates of the previous frame of the image to be processed;
and determining the image transformation matrix according to the vertex coordinates and the vertex smoothing coordinates of the previous frame of the image to be processed.
9. The method of claim 8, wherein updating the image transformation matrix to obtain a new image transformation matrix comprises:
reducing a preset smoothing parameter in the image transformation matrix to obtain a second adjusted smoothing parameter;
and determining an image transformation matrix corresponding to the second adjusted smoothing parameter as the new image transformation matrix.
10. An image processing apparatus, characterized in that the apparatus comprises:
the image stabilizing processing module is used for acquiring an image to be processed and executing image stabilizing processing on the image to be processed until an image stabilizing image corresponding to the image to be processed is obtained;
The image stabilizing processing module is further used for acquiring an image transformation matrix; performing de-dithering processing on the image to be processed based on the image transformation matrix to obtain a processed image corresponding to the image to be processed; updating the image transformation matrix to obtain a new image transformation matrix in response to the processed image not meeting the image output condition, wherein the image output condition represents that the image clipping frame is located within the processed image; and determining the processed image as a stable image in response to the processed image meeting the image output condition.
11. An image processing apparatus comprising a memory and a processor, the memory storing a computer program executable on the processor, the processor implementing the image processing method of any one of claims 1 to 9 when executing the program.
12. A computer-readable storage medium, having stored thereon executable instructions for causing a processor to execute the executable instructions, implementing the image processing method of any one of claims 1 to 9.
CN202311072199.4A 2023-08-24 2023-08-24 Image processing method, device, equipment and storage medium Active CN116797497B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311072199.4A CN116797497B (en) 2023-08-24 2023-08-24 Image processing method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN116797497A CN116797497A (en) 2023-09-22
CN116797497B true CN116797497B (en) 2023-11-14

Family

ID=88048330

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311072199.4A Active CN116797497B (en) 2023-08-24 2023-08-24 Image processing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116797497B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009033479A (en) * 2007-07-27 2009-02-12 Sharp Corp Camera shake correction device
CN107241544A (en) * 2016-03-28 2017-10-10 展讯通信(天津)有限公司 Video image stabilization method, device and camera shooting terminal
CN111462166A (en) * 2020-03-31 2020-07-28 武汉卓目科技有限公司 Video image stabilization method and system based on histogram equalization optical flow method
CN113132612A (en) * 2019-12-31 2021-07-16 华为技术有限公司 Image stabilization processing method, terminal shooting method, medium and system
CN113436113A (en) * 2021-07-22 2021-09-24 黑芝麻智能科技有限公司 Anti-shake image processing method, device, electronic equipment and storage medium
CN114390188A (en) * 2020-10-22 2022-04-22 华为技术有限公司 Image processing method and electronic equipment
CN115209030A (en) * 2021-04-08 2022-10-18 北京字跳网络技术有限公司 Video anti-shake processing method and device, electronic equipment and storage medium
CN115209031A (en) * 2021-04-08 2022-10-18 北京字跳网络技术有限公司 Video anti-shake processing method and device, electronic equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9967461B2 (en) * 2015-10-14 2018-05-08 Google Inc. Stabilizing video using transformation matrices

Also Published As

Publication number Publication date
CN116797497A (en) 2023-09-22

Similar Documents

Publication Publication Date Title
WO2019153671A1 (en) Image super-resolution method and apparatus, and computer readable storage medium
CN107689035B (en) Homography matrix determination method and device based on convolutional neural network
CN107564063B (en) Virtual object display method and device based on convolutional neural network
CN113132612B (en) Image stabilization processing method, terminal shooting method, medium and system
CN113556464B (en) Shooting method and device and electronic equipment
CN103985103A (en) Method and device for generating panoramic picture
CN112686824A (en) Image correction method, image correction device, electronic equipment and computer readable medium
US20230025058A1 (en) Image rectification method and device, and electronic system
CN115701125B (en) Image anti-shake method and electronic equipment
CN113436113A (en) Anti-shake image processing method, device, electronic equipment and storage medium
CN111669499B (en) Video anti-shake method and device and video acquisition equipment
CN111372000B (en) Video anti-shake method and apparatus, electronic device, and computer-readable storage medium
CN113947768A (en) Monocular 3D target detection-based data enhancement method and device
CN106909223B (en) Camera orientation correction method and device based on 3D scene
EP4221181A1 (en) Method for generating rotation direction of gyroscope and computer device
CN111212222A (en) Image processing method, image processing apparatus, electronic apparatus, and storage medium
CN116797497B (en) Image processing method, device, equipment and storage medium
CN114742722A (en) Document correction method, device, electronic equipment and storage medium
CN114401362A (en) Image display method and device and electronic equipment
CN115705651A (en) Video motion estimation method, device, equipment and computer readable storage medium
US11847750B2 (en) Smooth object correction for augmented reality devices
WO2023023960A1 (en) Methods and apparatus for image processing and neural network training
CN113223007A (en) Visual odometer implementation method and device and electronic equipment
CN111656763B (en) Image acquisition control method, image acquisition control device and movable platform
CN111353929A (en) Image processing method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant