CN112689853A - Image processing method, image processing apparatus, photographing device, movable platform and storage medium - Google Patents


Info

Publication number
CN112689853A
Authority
CN
China
Prior art keywords: image, filter, value, determining, scene
Prior art date
Legal status
Pending
Application number
CN202080004975.5A
Other languages
Chinese (zh)
Inventor
郑子翔
韩守谦
梁大奖
Current Assignee
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd
Publication of CN112689853A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/20 - Image enhancement or restoration using local operators
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10016 - Video; Image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

An embodiment of the invention provides an image processing method, an image processing apparatus, a photographing device, a movable platform and a storage medium. The method includes: acquiring multiple frames of images to be processed, where the frames include images shot during lens movement and correspond to a scene, that is, they are obtained by shooting a specific scene conforming to that scene; filtering the frames through a first filter to be analyzed to obtain a filtering result for each frame; evaluating the first filter according to the filtering result for each frame; and determining, according to the evaluation result for the first filter, whether the first filter is suitable for the scene. The method, apparatus, device, platform and storage medium can improve the efficiency of selecting a filter for a scene and improve the shooting effect.

Description

Image processing method, image processing apparatus, photographing device, movable platform and storage medium
Technical Field
Embodiments of the invention relate to the technical field of image processing, and in particular to an image processing method, an image processing apparatus, a photographing device, a movable platform and a storage medium.
Background
With the continuous development of imaging technology, cameras are applied ever more widely. At present, most cameras provide an autofocus function, eliminating the user's manual focusing process and offering convenience to the user.
In actual shooting, a filter can be used to filter the captured images and thereby assist the autofocus function. Filters vary widely in type and parameters. In the prior art, a filter is often selected for a camera by manual experience before the camera leaves the factory; this consumes considerable time, and the selected filter may filter poorly, resulting in a poor shooting effect.
Disclosure of Invention
An embodiment of the invention provides an image processing method and apparatus, a photographing device, a movable platform and a storage medium, to address the low efficiency of selecting a filter for a camera in the prior art.
A first aspect of an embodiment of the present invention provides an image processing method, including:
acquiring multiple frames of images to be processed, where the frames include images shot during lens movement and correspond to a scene, that is, they are obtained by shooting a specific scene conforming to that scene;
filtering the multiple frames of images to be processed through a first filter to be analyzed to obtain a filtering result corresponding to each frame of image;
evaluating the first filter according to a filtering result corresponding to each frame of image;
and determining, according to the evaluation result corresponding to the first filter, whether the first filter is a filter suitable for the scene.
A second aspect of an embodiment of the present invention provides an image processing apparatus, including:
a memory for storing a computer program;
a processor for executing the computer program stored in the memory to implement:
acquiring multiple frames of images to be processed, where the frames include images shot during lens movement and correspond to a scene, that is, they are obtained by shooting a specific scene conforming to that scene;
filtering the multiple frames of images to be processed through a first filter to be analyzed to obtain a filtering result corresponding to each frame of image;
evaluating the first filter according to a filtering result corresponding to each frame of image;
and determining, according to the evaluation result corresponding to the first filter, whether the first filter is a filter suitable for the scene.
A third aspect of embodiments of the present invention provides an image processing apparatus, including:
an acquisition circuit, configured to acquire multiple frames of images to be processed, where the frames include images shot during lens movement and correspond to a scene, that is, they are obtained by shooting a specific scene conforming to that scene;
the filter circuit is used for filtering the plurality of frames of images to be processed through a first filter to be analyzed to obtain a filtering result corresponding to each frame of image;
the evaluation circuit is used for evaluating the first filter according to the filtering result corresponding to each frame of image;
and the determining circuit is used for determining whether the first filter is a filter suitable for the scene according to the evaluation result corresponding to the first filter.
A fourth aspect of the embodiments of the present invention provides a photographing apparatus including the image processing device according to the second aspect.
A fifth aspect of the embodiments of the present invention provides a photographing apparatus including the image processing device according to the third aspect.
A sixth aspect of the embodiments of the present invention provides a movable platform including the shooting apparatus of the fourth aspect.
A seventh aspect of embodiments of the present invention provides a movable platform including the shooting apparatus of the fifth aspect.
An eighth aspect of the present invention provides a computer-readable storage medium, in which program instructions are stored, the program instructions being configured to implement the image processing method according to the first aspect.
The image processing method and apparatus, photographing device, movable platform and storage medium provided by the embodiment of the invention can quickly judge whether the first filter is suitable for the current scene, thereby improving the efficiency of selecting a filter for the scene and improving the shooting effect.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention and not to limit the invention. In the drawings:
FIG. 1 is a schematic flowchart of an image processing method according to an embodiment of the present invention;
FIG. 2 is a schematic flowchart of another image processing method according to an embodiment of the present invention;
FIG. 3 is a schematic flowchart of evaluating a first filter according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a focal value curve formed by value points according to an embodiment of the present invention;
FIG. 5 is a schematic flowchart of determining a score corresponding to a first filter according to a focal value curve, according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a focal value curve according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of another focal value curve according to an embodiment of the present invention;
FIG. 8 is a schematic flowchart of another image processing method according to an embodiment of the present invention;
FIG. 9 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present invention;
FIG. 10 is a schematic structural diagram of another image processing apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention.
An embodiment of the present invention provides an image processing method. The execution subject of the method may be an image processing apparatus in a photographing device. It is understood that the photographing device may be any device with a shooting function, such as a camera, and that the image processing apparatus may be implemented as software, or as a combination of software and hardware.
The image processing method provided by the embodiment of the invention acquires multiple frames of images to be processed corresponding to a scene, filters them through a first filter to be analyzed to obtain a filtering result for each frame, and evaluates the first filter according to those results to determine whether it is a filter suitable for the scene.
The image processing method described above can determine a corresponding filter for a scene by processing captured images. Filters vary in type and parameters, and different filters yield different effects in the final image; therefore different filters can be set for different shooting scenes, so that the photographed scene is better presented to the user and the shooting requirements of each scene are met.
Some embodiments of the invention are described in detail below with reference to the accompanying drawings. The features of the embodiments and examples described below may be combined with each other without conflict between the embodiments.
Fig. 1 is a schematic flowchart of an image processing method according to an embodiment of the present invention. As shown in fig. 1, the image processing method in the present embodiment may include:
Step 101, acquiring multiple frames of images to be processed, where the frames include images shot during lens movement and correspond to a scene, that is, they are obtained by shooting a specific scene conforming to that scene.
The method in the embodiment of the invention can determine a corresponding filter for a scene to be analyzed, which may be any scene type such as indoor or outdoor; the specific scene photographed may be any scene conforming to that type. For example, if the scene to be analyzed is an indoor scene, that is, a filter corresponding to indoor scenes needs to be determined, the photographed scene may be any indoor scene such as a living room, a bedroom or a kitchen.
After the scene to be analyzed is determined, a scene corresponding to the scene may be selected and the selected scene may be photographed by a photographing apparatus. The photographing apparatus may include a lens, a focusing motor, an image sensor, etc., light reflected from a scene is converged on the image sensor after passing through the lens, and the image sensor converts an optical signal into an electrical signal, thereby forming an image.
During the process of shooting a scene, the focusing motor can drive the lens to move, so that the object distance is changed, and the definition of the shot scene in the image is changed continuously. The movable range of the lens can be determined by the stroke of the focusing motor, and the movement of the lens can be performed manually or automatically.
In this embodiment, multiple frames corresponding to the scene shot during lens movement may be acquired as the images to be processed; these may be all the images shot within the movable range of the lens, or a subset of them. The images may be PNG images or images in another format.
And 102, filtering the multiple frames of images to be processed through a first filter to be analyzed to obtain a filtering result corresponding to each frame of image.
The first filter may be an Infinite Impulse Response (IIR) filter or a Finite Impulse Response (FIR) filter. In one embodiment, a filter is selected from a set of filters comprising a plurality of filters as a first filter to be analyzed. The plurality of filters in the set of filters may be of different types or different parameters of the filters.
The first filter is used to capture the effective signal in the image, and subsequent focusing can be completed according to the filtered result. If the first filter is poorly designed, the focusing result may be inaccurate, easily causing focusing failure; the captured image is then blurred, degrading the user experience.
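As an illustration (a sketch of the idea, not the patent's actual implementation), a simple high-pass FIR kernel can play the role of a first filter, with the per-frame "filtering result" reduced to a scalar focus value; the kernel and the synthetic frames below are our own assumptions:

```python
import numpy as np

def focus_value(image, kernel):
    """Filter each row with a 1-D FIR kernel and sum the absolute
    response into a single scalar "filtering result" for the frame."""
    responses = [np.convolve(row, kernel, mode="valid") for row in image]
    return float(np.sum(np.abs(responses)))

# Hypothetical first filter: a simple high-pass (Laplacian-like) kernel.
first_filter = np.array([-1.0, 2.0, -1.0])

# Synthetic frames: a detailed frame carries more high-frequency energy
# than a defocused (flat) one, so its focus value is larger.
sharp_frame = np.tile([0.0, 1.0] * 4, (4, 1))
blurred_frame = np.full((4, 8), 0.5)

assert focus_value(sharp_frame, first_filter) > focus_value(blurred_frame, first_filter)
```

A frame dominated by uniform regions produces almost no high-pass response, which is exactly the property the later evaluation steps exploit.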
And 103, evaluating the first filter according to the filtering result corresponding to each frame of image.
After a multi-frame image to be processed is acquired, the multi-frame image can be respectively filtered through a first filter, and then the first filter is evaluated according to a result obtained after filtering.
There are several ways to evaluate the first filter based on the filtering result. Optionally, the first filter may be scored according to the filtering result: the better the result matches expectations, the higher the score. Alternatively, whether the first filter is qualified can be judged from the filtering result: if a preset condition is met, the first filter is considered qualified; otherwise it is considered unqualified.
The specific evaluation strategy for the first filter, for example how to score it or judge whether it is qualified, may be set according to actual needs. It is understood that different cameras may have different shooting indices or requirements, which can be referenced when determining the evaluation strategy. In addition, the evaluation strategies for different scenes may be the same or different; this embodiment does not limit this.
In an alternative embodiment, the sharpness of the image may be determined according to the filtering result of each frame of image, and the first filter may be evaluated according to the sharpness.
For example, if the final presentation of the image is of concern, the sharpness of each frame can be determined from the filtering result, the sharpest frame can be found, and whether its sharpness meets the requirement can be judged. If the sharpest frame does not meet the requirement, the first filter is considered unqualified; otherwise it is considered qualified.
In one embodiment, if the first filter to be analyzed does not meet the requirement, its filter parameters are adjusted; the adjusted first filter then filters the multiple frames to obtain a new filtering result for each image, and the new results are evaluated to determine whether the adjusted filter is applicable to the scene. In another embodiment, if the first filter does not meet the requirement, another filter is selected from a filter set containing multiple filters; the frames are filtered by that filter to obtain new results, which are evaluated to determine whether that filter is applicable to the scene.
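The select-another-filter loop described above can be sketched as follows; the toy "filters", frames and pass criterion are all hypothetical stand-ins:

```python
def find_suitable_filter(filter_set, frames, passes):
    """Try each candidate filter in turn; return the first one whose
    filtering results pass the evaluation, or None if the set fails."""
    for candidate in filter_set:
        results = [candidate(frame) for frame in frames]
        if passes(results):
            return candidate
    return None  # no filter in the set suits the scene

# Toy stand-ins: each "filter" scales a frame's scalar signal by a gain,
# and the hypothetical evaluation requires a peak response above 4.0.
filters = [lambda f, g=g: f * g for g in (0.1, 0.5, 2.0)]
frames = [1.0, 2.0, 3.0]
chosen = find_suitable_filter(filters, frames, lambda r: max(r) > 4.0)

assert chosen is not None and chosen(3.0) == 6.0
```

The same loop covers the parameter-adjustment variant: regenerating the candidate with new parameters on each iteration instead of drawing from a fixed set.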
If the focusing speed is of concern, the sharpness of each frame can be determined from the filtering result, and whether the sharpest frame can be selected quickly can be judged from the sharpness change between frames. If not, the first filter is considered unqualified; otherwise it is considered qualified.
Whether the sharpest image can be selected quickly can be judged according to the focusing method in use. For example, if hill climbing is used for focusing in actual shooting, the desired sharpness trend is a monotonic rise to a single peak followed by a monotonic fall. If this condition is satisfied, the sharpest image can be selected quickly. If another focusing method is used in actual shooting, a corresponding criterion can be chosen to judge whether the sharpest image can be selected quickly, completing the evaluation of the first filter.
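The hill-climb criterion (sharpness rising monotonically to a single peak, then falling) can be checked mechanically. A minimal sketch, where the noise tolerance parameter is our own addition rather than anything from the patent:

```python
def is_unimodal(values, tol=0.0):
    """True if the sequence rises monotonically to a single peak and
    then falls monotonically, within a noise tolerance `tol`."""
    peak = max(range(len(values)), key=values.__getitem__)
    rising = all(values[i + 1] >= values[i] - tol for i in range(peak))
    falling = all(values[i + 1] <= values[i] + tol
                  for i in range(peak, len(values) - 1))
    return rising and falling

assert is_unimodal([1, 3, 7, 9, 6, 2])        # clean peak: hill climb works
assert not is_unimodal([1, 5, 2, 8, 3, 1])    # oscillation: filter fails
```

A filter whose per-frame sharpness sequence fails this check would leave the hill-climbing search prone to stopping at a false peak.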
Thus, the sharpness obtained by filtering the images through the first filter enables the evaluation of the first filter. The sharpness may be determined from the filtering result in any manner; this embodiment does not limit it.
In another alternative embodiment, the smoothness of each filtered frame may be determined from the filtering result obtained with the first filter, and whether the requirement is met may be judged from that smoothness.
In some scenes, the smoother the filtered image the better, with the edges of objects less sharp; in other scenes, the sharper the filtered image the better, with the edges of objects less blurred. The first filter may be evaluated against the smoothing requirement of the scene to be analyzed.
Specifically, after the multiple frames to be processed are filtered by the first filter, the smoothness of each filtered image may be determined from the filtering result; for example, the smoothness may be determined by comparing adjacent pixels. If the smoothness of each frame meets the preset smoothing requirement, the first filter is considered qualified; otherwise it is considered unqualified.
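The adjacent-pixel comparison might, for instance, be a mean absolute difference. This is one of many possible smoothness metrics, and the threshold below is purely hypothetical:

```python
import numpy as np

def smoothness_score(image):
    """Mean absolute difference between horizontally adjacent pixels;
    smaller values indicate a smoother filtered image."""
    return float(np.mean(np.abs(np.diff(image, axis=1))))

flat = np.full((4, 8), 0.5)                 # perfectly smooth image
checker = np.tile([0.0, 1.0] * 4, (4, 1))   # sharp edges everywhere

SMOOTHNESS_LIMIT = 0.2  # hypothetical preset smoothing requirement
assert smoothness_score(flat) <= SMOOTHNESS_LIMIT     # qualified
assert smoothness_score(checker) > SMOOTHNESS_LIMIT   # unqualified
```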
And step 104, determining whether the first filter is a filter suitable for the scene according to the evaluation result corresponding to the first filter.
As the above analysis shows, the evaluation result can be expressed in various ways, such as a specific score, or qualified versus unqualified. The correspondence between the evaluation result and applicability can be set according to actual needs.
In an optional embodiment, the evaluation result may be qualified or unqualified: if the first filter is evaluated as qualified, it is determined to be a filter applicable to the scene; if unqualified, it is determined not to be applicable to the scene.
In another alternative embodiment, the evaluation result may include a score for the first filter. If the score exceeds a certain score threshold (e.g., score a), then the first filter is determined to be a filter applicable to the scene, otherwise the first filter is determined to be a filter not applicable to the scene.
In practical application, when shooting in a certain scene, captured images can be acquired continuously while the lens moves, each image can be filtered by a filter, and whether the in-focus position has been reached can be judged from the frequency content of the filtering result. However, different scenes are sensitive in different frequency bands when in focus, and this difference strongly affects whether the in-focus position can be found, so the choice of filter is very important for the actual shooting effect.
In this embodiment, the method described above may be used to evaluate the first filter and determine whether it is applicable to the scene to be analyzed. If the first filter suits the scene, it meets the shooting requirement of that scene, and the correspondence between the first filter and the scene may be stored in the photographing device. When a user shoots in that scene with the device, the stored first filter is used to filter the captured images, so that the filtering effect meets the scene's shooting requirement and the shooting effect is improved.
Alternatively, the above method may be applied during actual shooting. Specifically, the filter can be evaluated with the methods described in steps 101 to 104 while the user shoots. Further, after step 104, if the first filter is applicable to the scene, focusing may be performed according to the filtering result for each frame to obtain a sharp image.
The focusing process may specifically refer to determining the in-focus position according to the filtering results. During lens movement, an object in the captured images usually goes from blurred to sharp to blurred again; the lens position at which the object is sharpest can be regarded as the in-focus position. There are many ways to determine the in-focus position from the filtering results, and this embodiment does not limit them. The above method may be applied in contrast-detection autofocus (CDAF) schemes, as well as in other focusing schemes.
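In a CDAF-style scheme, the simplest way to pick the in-focus position is the peak of the per-position focus values. A minimal sketch, with an illustrative lens sweep:

```python
def in_focus_position(lens_positions, focus_values):
    """Contrast-detection autofocus: return the lens position whose
    filtered image produced the highest focus value."""
    best = max(range(len(focus_values)), key=focus_values.__getitem__)
    return lens_positions[best]

# Illustrative sweep: focus values go blur -> sharp -> blur as the
# lens moves through its range.
positions = [0, 10, 20, 30, 40, 50]
values = [0.2, 0.5, 0.9, 1.4, 0.8, 0.3]
assert in_focus_position(positions, values) == 30
```

Real implementations typically interpolate around the peak rather than taking the raw maximum, but the argmax captures the core idea.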
The image processing method provided in this embodiment acquires multiple frames of images to be processed (images shot during lens movement, corresponding to a scene and obtained by shooting a specific scene conforming to that scene), filters them through a first filter to be analyzed to obtain a filtering result for each frame, evaluates the first filter according to those results, and determines from the evaluation result whether the first filter is suitable for the scene. It can therefore quickly determine whether the first filter suits the current scene, improving both the efficiency of selecting a filter for the scene and the shooting effect.
The scheme provided by the embodiment of the invention can judge whether a given filter is suitable for the current scene; on that basis, a suitable filter can be selected from a plurality of filters.
Specifically, when a filter is selected for a scene, a plurality of filters may be designed in advance and the one suitable for the scene selected from them. It will be appreciated that there are many types of filters and that their parameters can be adjusted; different filters can be designed by changing the type and/or parameters, and the filter suitable for the current scene can then be selected from those designed. The process of selecting a filter for a scene can therefore be understood as a process of tuning filter parameters.
In some schemes, the selection of a filter for a scene depends on empirical debugging by equipment manufacturers, and there is no complete evaluation method for tuning filters for different scenes to optimize their parameters. Selecting filters by experience can take a lot of time, and the finally selected filter may not work well.
In the embodiment of the invention, the performance of different filters can be evaluated from images of the current scene. This effectively reduces the time spent tuning filters; the finally selected filter can effectively improve focusing accuracy, greatly increase the robustness of the focusing algorithm, and shorten the filter-tuning cycle.
Fig. 2 is a schematic flowchart of another image processing method according to an embodiment of the present invention. Fig. 2 illustrates an implementation method for selecting a filter suitable for a current scene from a plurality of filters by taking two filters as an example. As shown in fig. 2, the image processing method may include:
Step 201, acquiring multiple frames of images to be processed, where the frames include images shot during lens movement and correspond to a scene, that is, they are obtained by shooting a specific scene conforming to that scene.
Step 202, filtering the multiple frames of images to be processed through a first filter to be analyzed, and evaluating the first filter according to a filtering result corresponding to the first filter.
Specifically, the multiple frames of images to be processed may be filtered through the first filter to obtain a filtering result corresponding to each frame of image, and the first filter is evaluated according to the filtering result corresponding to each frame of image.
For a specific implementation method for evaluating the first filter, reference may be made to the foregoing embodiments, and details are not described here.
And 203, filtering the multiple frames of images to be processed through a second filter to be analyzed, and evaluating the second filter according to a filtering result corresponding to the second filter.
Wherein the second filter may be any filter different from the first filter. It will be appreciated that different filters may be obtained by varying the type and/or parameters of the filter.
Optionally, the types of filters may include, but are not limited to: Butterworth filters, Chebyshev filters, Bessel filters, elliptic filters, etc.
The parameter of the filter may be a frequency response coefficient of the filter, and the parameter may specifically include, but is not limited to: passband cutoff frequency, stopband cutoff frequency, passband attenuation, stopband attenuation, low cutoff frequency, high cutoff frequency, passband width, stopband width, and the like.
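To illustrate how the type and parameter choices above generate distinct candidate filters, SciPy's filter-design routines (our choice of library, not anything specified by the patent) derive each type from order, ripple, and cutoff parameters; all of the numeric values below are illustrative:

```python
from scipy import signal

# Each (type, parameter) combination yields a distinct candidate filter,
# returned as transfer-function coefficients (b, a). The order 4,
# ripple figures, and 0.3 normalized cutoff are illustrative values.
candidates = {
    "butterworth": signal.butter(4, 0.3, btype="high"),
    "chebyshev-1": signal.cheby1(4, 1.0, 0.3, btype="high"),       # 1 dB passband ripple
    "elliptic":    signal.ellip(4, 1.0, 40.0, 0.3, btype="high"),  # 40 dB stopband attenuation
    "bessel":      signal.bessel(4, 0.3, btype="high"),
}

for name, (b, a) in candidates.items():
    # Order-4 designs give 5 numerator and 5 denominator coefficients.
    assert len(b) == 5 and len(a) == 5
```

Sweeping the order, cutoff, or ripple values expands this dictionary into the kind of filter set from which a scene-appropriate candidate is chosen.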
Different types and/or parameters of filters may be considered different filters. In step 203, a second filter, different from the first filter, may be evaluated. Specifically, the multiple frames of images to be processed may be filtered by the second filter to obtain a filtering result corresponding to each frame of image, and the second filter is evaluated according to the filtering result of the second filter on each frame of image.
The specific implementation method for evaluating the second filter is similar to the method for evaluating the first filter, and is not described herein again.
It is understood that the order of the steps given in the embodiment of the present invention is only an example, and the order of the steps 202 and 203 may be adjusted according to actual needs, for example, the second filter may be evaluated first and then the first filter may be evaluated, or the first filter and the second filter may be evaluated at the same time.
Step 204, selecting a filter suitable for the scene from the first filter and the second filter according to the evaluation result corresponding to the first filter and the evaluation result corresponding to the second filter.
Determining whether the first filter is a filter applicable to the scene according to the evaluation result corresponding to the first filter may be implemented by step 204. It is to be understood that if a first filter is selected from the first filter and the second filter as a filter applicable to the scene, it indicates that the first filter is a filter applicable to the scene; if the second filter is selected as the filter applicable to the scene, then the first filter may be deemed not to be the filter applicable to the scene.
According to the evaluation results corresponding to the first filter and the second filter, the one with the better evaluation result can be selected as the filter suitable for the scene.
For example, if the evaluation result is a score, a filter with a higher score may be selected from the first filter and the second filter as a filter suitable for the scene.
If the evaluation result is expressed as qualified or unqualified, it may be determined which of the first filter and the second filter is qualified, and the qualified filter is selected as the filter suitable for the scene.
Optionally, if the scores of the first filter and the second filter are the same, or both the first filter and the second filter are qualified, another strategy, such as selecting a filter with lower implementation cost as the filter suitable for the scene, may also be combined.
Fig. 2 illustrates, taking two filters as an example, an implementation of selecting a filter suitable for the scene from a plurality of filters to be analyzed. In practical applications, the number of filters to be analyzed is not limited to two. When a filter suitable for the scene needs to be selected from more than two filters, each filter may be used to filter the images to be processed to obtain a filtering result corresponding to each frame of image; each filter is then evaluated according to its per-frame filtering results, and finally a filter suitable for the scene is selected from the plurality of filters according to their evaluation results.
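The generalized selection loop described above can be sketched as follows. This is a minimal illustration, not the embodiment's actual scoring: `select_filter`, the toy "filters", and the energy-based evaluator are all hypothetical stand-ins.

```python
import numpy as np

def select_filter(images, filters, evaluate):
    """Score every candidate filter on the same image sequence and
    return the one with the highest score (hypothetical helper)."""
    best_filter, best_score = None, -np.inf
    for filt in filters:
        # Filter each frame, then evaluate the per-frame results.
        results = [filt(img) for img in images]
        score = evaluate(results)
        if score > best_score:
            best_filter, best_score = filt, score
    return best_filter, best_score

# Toy demo: two "filters" (identity vs. horizontal gradient) scored by
# total filtered energy on a synthetic frame with a sharp edge.
frame = np.zeros((8, 8)); frame[:, 4:] = 1.0   # sharp vertical edge
identity = lambda img: img
gradient = lambda img: np.abs(np.diff(img, axis=1))
total_energy = lambda results: float(sum(r.sum() for r in results))
best, score = select_filter([frame], [identity, gradient], total_energy)
```

In the patent's setting, `filters` would hold the first and second filters, `images` the frames captured during lens movement, and `evaluate` the focus-value-curve scoring described in the following embodiments.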
Alternatively, a Butterworth filter may be selected as the filter to be analyzed. The frequency response of a Butterworth filter is flat in the passband, and the filter is simple and easy to implement. Parameters of the Butterworth filter may include passband gain, passband width, low cutoff frequency, and the like.
When selecting a filter for a scene, multiple Butterworth filters may be designed in advance with different specific parameters, for example, the cutoff frequency of the first filter is f1 and that of the second filter is f2. Then, scenery conforming to the scene is photographed to obtain images to be processed, the images are filtered by the designed Butterworth filters respectively, each filter is evaluated according to its filtering result, and the filter suitable for the scene is selected from among them according to the evaluation results.
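Designing two Butterworth candidates that differ only in cutoff frequency might look like the following sketch. It assumes `scipy.signal` is available; the cutoffs f1, f2 and the sampling rate are illustrative values, not from the patent. A high-pass response is used here because focus measures typically emphasize high spatial frequencies.

```python
import numpy as np
from scipy.signal import butter, lfilter

def make_butterworth(cutoff_hz, fs_hz, order=2):
    """Design a low-order high-pass Butterworth filter and return a
    callable that filters a 1-D signal (illustrative parameters)."""
    b, a = butter(order, cutoff_hz, btype="highpass", fs=fs_hz)
    return lambda x: lfilter(b, a, x)

# Two candidate filters differing only in cutoff frequency (f1, f2).
f1, f2 = 5.0, 15.0                       # Hz; assumed sampling rate 100 Hz
first_filter = make_butterworth(f1, fs_hz=100.0)
second_filter = make_butterworth(f2, fs_hz=100.0)

# A row of pixel intensities with a sharp edge: the high-pass output
# concentrates its energy at the edge position.
row = np.r_[np.zeros(50), np.ones(50)]
out = first_filter(row)
```

Each candidate would then be applied to the captured frames and scored as in the later embodiments, with the best-scoring design kept for the scene.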
In the image processing method provided by this embodiment, after the images to be processed corresponding to the scene to be analyzed are acquired, they may be filtered by the first filter and the second filter, and the two filters may be evaluated according to their respective filtering results, so that a filter suitable for the scene is selected from them. In this way, whether each of a plurality of filters is suitable for the scene can be analyzed, and a corresponding filter can be selected for the scene from the plurality of filters, which effectively improves the efficiency and accuracy of selecting a filter for the scene.
Fig. 3 is a schematic flowchart of evaluating a first filter according to an embodiment of the present invention. On the basis of the technical solution provided by the above embodiment, optionally, the score corresponding to the first filter may be determined according to the filtering result corresponding to each frame of image. As shown in fig. 3, determining the score corresponding to the first filter according to the filtering result corresponding to each frame of image may include:
Step 301, determining a focus value corresponding to each frame of image according to the filtering result corresponding to each frame of image.
In one embodiment, the gray-scale values of the pixels in the image are Fourier transformed to obtain the frequency components (e.g., the amplitudes of the frequencies) of the image's gray-scale values. The frequency components are then normalized. A frequency range corresponding to the region of interest (i.e., the in-focus region) is determined among the normalized frequency components, and that frequency range is digitally filtered accordingly. After filtering, the components within the frequency range are accumulated, and the accumulated value is the focus value. Note that the frequency components indicate the energy of the image.
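The frequency-domain procedure just described can be sketched as below. This is a minimal interpretation of the text, not the patent's exact implementation: the band limits (fractions of the sampling frequency) and the radial band-pass mask are illustrative assumptions.

```python
import numpy as np

def fft_focus_value(gray, band=(0.1, 0.5)):
    """Focus value per the description above: 2-D FFT of the grayscale
    image, normalize the magnitude spectrum, keep only an assumed
    frequency band of interest, and accumulate what remains."""
    spec = np.abs(np.fft.fft2(gray))
    spec /= spec.sum()                        # normalized frequency components
    fy = np.abs(np.fft.fftfreq(gray.shape[0]))
    fx = np.abs(np.fft.fftfreq(gray.shape[1]))
    radius = np.sqrt(fy[:, None] ** 2 + fx[None, :] ** 2)
    mask = (radius >= band[0]) & (radius <= band[1])   # digital band-pass
    return float(spec[mask].sum())            # accumulated value = focus value

# A sharp edge carries far more in-band energy than a flat patch.
sharp = np.zeros((16, 16)); sharp[:, 8:] = 1.0
flat = np.full((16, 16), 0.5)
```

A flat (fully defocused) patch has essentially all of its energy at DC, so its in-band accumulation is near zero, while the sharp edge yields a clearly positive focus value.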
The focus value may be used to represent a degree of sharpness of the image, where the degree of sharpness may refer to an overall degree of sharpness of the image or a degree of sharpness of a partial region in the image. When a scene is shot, a plurality of frames of images shot in the moving process of a lens can be acquired, and for a single frame of image, the larger the focal value of the image is, the clearer the image is.
Optionally, the filtering result corresponding to each frame of image may include a signal value corresponding to each pixel point in at least part of pixel points of the image after filtering processing. Correspondingly, determining the focal value corresponding to each frame of image according to the filtering result corresponding to each frame of image may include: and accumulating the signal values corresponding to at least part of pixel points in the image aiming at each frame of image to obtain a focal value corresponding to the image. And at least part of the pixel points are pixel points of the interested region. In one embodiment, the corresponding signal values are frequency values corresponding to the at least some pixel points.
In an alternative embodiment, the filtering result may include a signal value corresponding to each pixel point in all pixel points of the image. The signal value corresponding to the pixel point is a value corresponding to the pixel point after filtering, and the value can be used for representing gradient information corresponding to the pixel point.
The signal values corresponding to all the pixel points of the filtered image are accumulated to obtain the focal value corresponding to the image, thereby reflecting the overall sharpness of the image.
In another alternative embodiment, the filtering result may include a signal value corresponding to each pixel point in a subset of the pixel points of the image. The signal values corresponding to these partial pixel points are accumulated to obtain the focal value corresponding to the image.
The partial pixel points may be a plurality of pixel points uniformly distributed in the image; the overall sharpness of the image can still be reflected through these points while the amount of computation is reduced, which speeds up image processing and thus improves focusing efficiency.
Alternatively, the partial pixel points may include pixel points in a region of interest of a user in the image. The region of interest may be determined by a variety of methods. For example, when an image is taken, a user may click on an area in the picture to be focused, and the area of interest may be determined by identifying the position clicked by the user, or semantic analysis may be performed on the image, and the area of interest may be determined by the semantic analysis, for example, the area where a human face is located in a self-portrait mode may be used as the area of interest.
The pixel points in the region of interest can be used as the pixel points for calculating the focal value: the signal values corresponding to the pixel points in the region of interest are accumulated to obtain the focal value corresponding to the image. Calculating the focal value from the signal values of the pixels in the region of interest makes the focal value reflect the sharpness of that region, improving focusing accuracy.
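The spatial-domain accumulation over a region of interest can be sketched as follows. The horizontal-gradient filter is a hypothetical stand-in for the first filter, and the `(row_slice, col_slice)` ROI convention is an assumption for illustration.

```python
import numpy as np

def roi_focus_value(gray, roi=None):
    """Accumulate filtered signal values, optionally restricted to a
    region of interest given as a (row_slice, col_slice) pair."""
    # Per-pixel signal values representing gradient information.
    filtered = np.abs(np.diff(gray.astype(float), axis=1))
    if roi is not None:
        filtered = filtered[roi]              # keep only ROI pixels
    return float(filtered.sum())              # focal value = accumulated signal

sharp = np.zeros((10, 10)); sharp[:, 5:] = 1.0
# Whole-image focal value vs. an ROI that misses the edge entirely.
whole = roi_focus_value(sharp)
corner = roi_focus_value(sharp, roi=(slice(0, 5), slice(0, 4)))
```

Note how the ROI choice changes the focal value: the edge contributes to `whole` but not to `corner`, which is why an ROI covering the user's area of interest makes the focal value track that area's sharpness.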
In addition to the above method of calculating the focus value by accumulating signal values, the focus value may also be calculated by other methods. For example, the sharpness of the image may be evaluated by an evaluation function, and the output value of the evaluation function may be used as the focus value of the image. The specific form of the evaluation function may be set according to actual needs, which is not limited in this embodiment.
Step 302, determining a score corresponding to the first filter according to the focal value corresponding to each frame of image.
Optionally, determining a score corresponding to the first filter according to the focus value corresponding to each frame of image may include: determining a focal value curve according to the focal value corresponding to each frame of image; and determining the score corresponding to the first filter according to the focal value curve.
The focus value curve may be a curve formed by connecting focus values corresponding to respective frames of images. Specifically, determining the focal value curve according to the focal value corresponding to each frame image may include: for each frame of image, determining a numerical point corresponding to the image in a focal value curve, wherein the abscissa of the numerical point is the sequence number of the image, and the ordinate of the numerical point is the focal value corresponding to the image; and generating the focal value curve according to the determined plurality of numerical value points. Wherein the sequence number of the images can be determined according to the shooting sequence.
Fig. 4 is a schematic diagram of a focal value curve formed by numerical points according to an embodiment of the present invention. As shown in fig. 4, the abscissa runs from 1 to 41, indicating that 41 images were taken for a scene; a focal value can be calculated for each frame, and the ordinate is the focal value corresponding to the image, thereby forming the focal value curve. Since the lens positions corresponding to the respective frames differ, serial numbers 1 to 41 in fig. 4 also represent the lens positions. That is, the abscissa corresponds to lens positions 1 to 41, which correspond to frames 1 to 41, respectively.
There are many possible methods of determining the score of the first filter from the focus value curve. Alternatively, the scoring strategy may be determined according to the actual focusing strategy. For example, if accurate focusing requires the slopes on both sides of the position with the largest focal value to be sufficiently steep, then when evaluating the first filter, the slopes on both sides of the highest value in the focal value curve may be used as a reference basis for scoring, so that the finally selected filter satisfies the focusing requirement.
According to the image processing method provided by this embodiment, the focus value corresponding to each frame of image can be determined from the filtering result corresponding to that frame, a focus value curve is determined from the per-frame focus values, and the score corresponding to the first filter is determined from the curve. Because the focus value curve intuitively reflects the change trend of image sharpness, the first filter can be evaluated quickly and accurately, further improving the efficiency and accuracy of selecting a filter for the scene.
On the basis of the technical solution provided in the foregoing embodiment, optionally, determining the score corresponding to the first filter according to the focal value curve may include: determining a score corresponding to the first filter according to the focal value curve by at least one of: curve contrast, first ratio, second ratio, maximum opening size, curve monotonicity. Wherein, the first ratio is the ratio of numerical value points with focal values higher than a first threshold value; the second ratio is the ratio of numerical points with focal values lower than a second threshold value. The following description will be made by taking fig. 5 as an example.
Fig. 5 is a schematic flowchart of a process of determining a score corresponding to a first filter according to a focus value curve according to an embodiment of the present invention. In the scheme shown in fig. 5, the score corresponding to the first filter is determined by five dimensions, namely curve contrast, first ratio, second ratio, maximum opening size and curve monotonicity.
Specifically, scores corresponding to the curve contrast, the first ratio, the second ratio, the maximum value opening size, and the curve monotonicity can be determined through the focal value curve, and the final score of the first filter is further determined according to the obtained five scores.
As shown in fig. 5, determining the score corresponding to the first filter according to the focus value curve may include:
Step 501, determining N numerical points with the largest focal value and M numerical points with the smallest focal value in the focal value curve.
Wherein M and N are both positive integers.
Step 502, determining a score of the curve contrast corresponding to the first filter according to a ratio of the mean of the focal values corresponding to the N number of value points to the mean of the focal values corresponding to the M number of value points.
In this embodiment, scoring the curve contrast of the focus value curve corresponding to the first filter may be implemented through steps 501 to 502.
The curve contrast may refer to a ratio of a mean value of the maximum N values to a mean value of the minimum M values in the curve, and the ratio may reflect the quality of the focal value curve to a certain extent.
Optionally, the score of the curve contrast corresponding to the first filter may be positively correlated with the ratio between the mean of the N largest focal values and the mean of the M smallest focal values in the focal value curve.
The positive correlation relationship means that when the variable x increases, the variable y also increases, that is, the variation directions of the two variables are the same, and when one variable x changes from large to small/from small to large, the other variable y also changes from large to small/from small to large, then the variable x and the variable y can be regarded as a positive correlation relationship.
In this embodiment, the ratio between the maximum average of N focal values and the minimum average of M focal values and the corresponding score of the curve contrast may be in a positive correlation relationship, that is, the larger the ratio is, the higher the score of the curve contrast may be, and the smaller the ratio is, the lower the score of the curve contrast may be, for example, the relationship between the ratio and the score may be a positive proportional function.
In the actual shooting process, the focal value of the image near the in-focus position is larger, the focal value of the image at the out-of-focus position is smaller, the larger the focal value difference between in-focus and out-of-focus is, the better the corresponding shooting effect is, and the score of the curve contrast corresponding to the first filter is higher under the condition; if the difference in focus values between in-focus and out-of-focus is small, indicating that the first filter is filtering poorly for the current scene, then the corresponding score may be low.
In this embodiment, the values of M and N may be set according to actual needs. Optionally, M may be equal to a value obtained by rounding a frame number of the image to be processed multiplied by a first coefficient, and N may be equal to a value obtained by rounding a frame number of the image to be processed multiplied by a second coefficient, where both the first coefficient and the second coefficient are greater than 0, and rounding may be any rounding method such as rounding up, rounding down, and the like.
In some embodiments, the first coefficient may be greater than the second coefficient, because among the frames captured during lens movement there are more out-of-focus images and fewer images close to the in-focus state; setting M greater than N thus effectively reflects the contrast between the in-focus and out-of-focus states.
For example, if there are 41 frames of images, N may be 4 and M may be 8; the 4 largest focal values and the 8 smallest focal values are found from the focal value curve, and the curve contrast is scored according to them.
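The contrast computation of steps 501 to 502 can be sketched as below, using the N = 4, M = 8 values from the 41-frame example; the synthetic triangular curve is illustrative.

```python
import numpy as np

def curve_contrast(focus_values, n=4, m=8):
    """Ratio of the mean of the N largest focal values to the mean of
    the M smallest, per steps 501-502 (N, M from the example above)."""
    fv = np.sort(np.asarray(focus_values, dtype=float))
    return fv[-n:].mean() / fv[:m].mean()

# 41-frame toy curve: rises to a clear peak, then falls back.
frames = np.concatenate([np.linspace(1, 10, 21), np.linspace(10, 1, 20)])
ratio = curve_contrast(frames)
```

A curve with a clear in-focus peak gives a large ratio (and thus a high contrast score under the positive correlation described above), while a flat curve gives a ratio of 1.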
Step 503, determining the highest value of the focal value curve, and counting the number of numerical value points of which the focal value is greater than a first threshold value, where the first threshold value is a product of the highest value and a first proportional coefficient.
Wherein the maximum value may be the maximum focal value in the focal value curve, the focal value larger than the first threshold value may be a focal value close to the maximum value in value, and the first scaling factor may be set according to actual needs, for example, may be 92%, and then, the number of focal values between 92% and 100% of the maximum value may be counted.
Step 504, determining a score of the first ratio corresponding to the first filter according to a ratio of the number of the numerical points of which the focal value is greater than the first threshold to the number of all the numerical points in the focal value curve.
In this embodiment, the scoring of the first ratio of the focus value curve corresponding to the first filter may be implemented through steps 503 to 504. And the first ratio is the ratio of numerical points with focal values higher than a first threshold value.
Optionally, the score of the first ratio corresponding to the first filter may be negatively correlated with the ratio of the number of numerical points whose focal value is greater than the first threshold to the number of all numerical points in the focal value curve.
The negative correlation relationship means that when the variable x increases, the variable y decreases, that is, the changing directions of the two variables are opposite, when one variable x changes from large to small/from small to large, the other variable y changes from small to large/from large to small, and then the variable x and the variable y can be regarded as the negative correlation relationship.
In this embodiment, a score of the ratio of the number of the numerical points with the focal value larger than the first threshold to the number of all the numerical points in the focal value curve is in a negative correlation with the first ratio, that is, the larger the ratio of the number of the numerical points with the focal value larger than the first threshold to the number of all the numerical points in the focal value curve is, the lower the corresponding score may be, the smaller the ratio is, the higher the corresponding score may be, for example, the relationship between the ratio and the corresponding score may be an inverse proportional function.
In the actual shooting process, the position corresponding to the highest value can be regarded as the in-focus position. The more focal values there are close to the highest value, the more difficult it is to quickly and accurately find the in-focus position during focusing. Therefore, the score of the first ratio may be negatively correlated with the ratio of numerical points whose focal values are greater than the first threshold, so that the filter selected for the current scene better assists focusing and improves the shooting effect.
Step 505, determining the lowest value of the focal value curve, and counting the number of numerical value points of which the focal value is smaller than a second threshold value, wherein the second threshold value is the product of the lowest value and a second proportional coefficient.
Wherein the lowest value may be a minimum focal value in the focal value curve, the focal value smaller than the second threshold may be a focal value numerically close to the lowest value, and the second scaling factor may be set according to actual needs, for example, may be 120%, and then the number of focal values between 100% and 120% of the lowest value may be counted.
Step 506, determining a score of a second ratio corresponding to the first filter according to a ratio of the number of the numerical points of which the focal value is smaller than the second threshold to the number of all the numerical points in the focal value curve.
In this embodiment, the scoring of the second ratio of the focal value curve corresponding to the first filter may be implemented through steps 505 to 506. The second ratio is the ratio of numerical points with focal values lower than the second threshold.
Optionally, a ratio of the number of the numerical points with the focal value smaller than the second threshold to the number of all the numerical points in the focal value curve may be in a positive correlation with the score of the second ratio.
In the actual shooting process, the more images have focal values close to the lowest value, the smaller the focal values of the out-of-focus images are, and the better the filtering effect of the first filter on such images is. Therefore, the ratio of numerical points whose focal values are smaller than the second threshold may be positively correlated with the score of the second ratio, which can further improve the shooting effect.
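The two threshold-based ratios of steps 503 to 506 can be sketched together. The 92% and 120% coefficients are the illustrative values given in the text; the toy curve is an assumption.

```python
import numpy as np

def near_peak_ratio(fv, k1=0.92):
    """First ratio (steps 503-504): fraction of points with focal
    value above k1 * maximum (k1 = 92% in the example above)."""
    fv = np.asarray(fv, dtype=float)
    return float((fv > k1 * fv.max()).sum()) / fv.size

def near_floor_ratio(fv, k2=1.20):
    """Second ratio (steps 505-506): fraction of points with focal
    value below k2 * minimum (k2 = 120% in the example above)."""
    fv = np.asarray(fv, dtype=float)
    return float((fv < k2 * fv.min()).sum()) / fv.size

# A desirable peaky curve: one point near the peak, many near the floor.
fv = [1, 1, 1, 1, 2, 10, 2, 1, 1, 1, 1]
```

For a curve like this, the first ratio is small (good, low value scores high under the negative correlation) and the second ratio is large (good, scores high under the positive correlation).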
Step 507, determining the curvature corresponding to the focal value curve at the highest value.
Step 508, determining the score of the maximum value opening size corresponding to the first filter according to the curvature.
In this embodiment, the scoring of the size of the highest value opening of the focus value curve corresponding to the first filter may be implemented through steps 507 to 508.
Specifically, the opening size of the focal value curve at the highest value can be represented by the curvature, optionally, a curve formed by connecting a numerical point where the highest value is located and a plurality of numerical points nearby can be fitted by a quadratic equation, and the curvature of the curve at the highest value can be determined according to the characteristics of the quadratic equation.
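The quadratic-fit approach just described can be sketched as follows. The window of two neighbors on each side of the peak and the standard curvature formula are illustrative choices, not mandated by the text.

```python
import numpy as np

def peak_curvature(focus_values, half_window=2):
    """Fit a quadratic through the peak point and its neighbours and
    return the curvature at the peak.  For y = a*x^2 + b*x + c the
    curvature is |y''| / (1 + y'^2)^(3/2) = |2a| / (1 + (2a*x+b)^2)^(3/2)."""
    fv = np.asarray(focus_values, dtype=float)
    p = int(fv.argmax())
    lo, hi = max(0, p - half_window), min(fv.size, p + half_window + 1)
    x = np.arange(lo, hi, dtype=float)
    a, b, c = np.polyfit(x, fv[lo:hi], 2)     # quadratic least-squares fit
    slope = 2 * a * p + b
    return abs(2 * a) / (1 + slope ** 2) ** 1.5

sharp_peak = [0, 1, 9, 1, 0]      # narrow opening at the top
broad_peak = [0, 4, 5, 4, 0]      # wider opening
```

Consistent with the text, the narrower (sharper) peak yields the larger curvature and would therefore receive the higher opening-size score.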
It can be understood that the larger the curvature of the focal value curve at the highest value, the smaller the opening at the highest value; a smaller opening means a sharper peak, which is beneficial to quickly and accurately realizing focusing. Thus, the curvature may be positively correlated with the score of the maximum value opening size.
Step 509, counting the number of monotone increasing intervals and/or the number of monotone decreasing intervals in the focal value curve.
Wherein, if two adjacent intervals are both monotone increasing intervals or both monotone decreasing intervals, the two intervals are combined into one interval.
Step 510, determining the curve monotonicity score corresponding to the first filter according to the number of the monotone increasing intervals and/or the number of the monotone decreasing intervals.
In this embodiment, the scoring of the curve monotonicity of the focus value curve corresponding to the first filter may be implemented through steps 509 to 510.
An ideal focal value curve should increase monotonically to the left of the maximum and decrease monotonically to the right of it. The more the focal value curve fluctuates, the more difficult accurate focusing becomes; therefore, the number of monotone increasing and/or decreasing intervals may be negatively correlated with the curve monotonicity score, so that focusing can be realized quickly and accurately.
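Counting monotone runs (steps 509 to 510), with same-direction adjacent intervals merged as the text requires, can be sketched as:

```python
import numpy as np

def count_monotone_runs(focus_values):
    """Count maximal monotone increasing and decreasing intervals in
    the curve; consecutive steps in the same direction fall into one
    interval, so adjacent same-direction runs are merged by construction.
    An ideal curve has exactly one of each."""
    diffs = np.sign(np.diff(np.asarray(focus_values, dtype=float)))
    diffs = diffs[diffs != 0]                 # ignore flat steps
    inc = dec = 0
    prev = 0
    for d in diffs:
        if d != prev:                         # a new run starts here
            if d > 0:
                inc += 1
            else:
                dec += 1
            prev = d
    return inc, dec

ideal = [1, 3, 7, 9, 6, 2]        # one rise, one fall
wavy  = [1, 4, 2, 6, 3, 8, 2]     # alternating rises and falls
```

Under the negative correlation described above, `ideal` (totals 1 + 1 = 2 intervals) would score higher on monotonicity than `wavy` (3 + 3 = 6 intervals).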
It should be understood that the execution sequence of the above steps in this embodiment is not limited to the sequence defined by the above sequence numbers, for example, the monotonicity of the curve may be evaluated first and then the size of the maximum value opening is evaluated, or the monotonicity of the curve and the size of the maximum value opening may be evaluated simultaneously, and those skilled in the art may perform any configuration according to specific application requirements and design requirements, and details are not described herein again.
Step 511, determining the score of the first filter according to the scores corresponding to the curve contrast, the first ratio, the second ratio, the maximum value opening size, and the curve monotonicity.
Optionally, in five dimensions of the curve contrast, the first ratio, the second ratio, the maximum opening size, and the curve monotonicity, a weight value may be assigned to each dimension, and scores corresponding to the five dimensions are weighted and summed to obtain a score of the first filter.
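The weighted summation over the five dimensions can be sketched as below; the dimension names, scores, and equal default weights are all illustrative assumptions.

```python
def weighted_filter_score(scores, weights=None):
    """Combine per-dimension scores (curve contrast, first ratio,
    second ratio, opening size, curve monotonicity) by weighted sum."""
    if weights is None:
        weights = {k: 0.2 for k in scores}    # equal weights by default
    return sum(weights[k] * v for k, v in scores.items())

# Hypothetical per-dimension scores for the first filter.
scores = {"contrast": 8.0, "first_ratio": 6.0, "second_ratio": 7.0,
          "opening": 9.0, "monotonicity": 5.0}
total = weighted_filter_score(scores)
```

Adjusting the weight dictionary corresponds to emphasizing some dimensions over others, or to the partial-dimension evaluation mentioned below.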
The specific implementation principle of evaluating the first filter through five dimensions of curve contrast, the first ratio, the second ratio, the maximum value opening size and the curve monotonicity is given above, and in other alternative embodiments, only a part of the dimensions may be selected to evaluate the first filter.
In practical applications, when the first filter is evaluated, the first filter may be scored by using a focus value curve and using the above method, so as to determine whether the first filter is a filter suitable for a current scene. Similarly, the method may also be used to evaluate other filters, for example, a plurality of filters may be preset, each filter is scored by the above method, and the filter with the highest score is selected as the filter suitable for the current scene.
Fig. 6 is a schematic diagram of a focal value curve according to an embodiment of the present invention. The definitions of the abscissa and ordinate in fig. 6 and 4 are similar. For brevity, no further description is provided. As shown in fig. 6, the focus value curve has a large fluctuation, and when the focus value curve is applied to an actual focusing scene, focusing is likely to stay at a fluctuated part, i.e., a peak part between numbers 10 to 20 in the figure. In the focal value curve shown in fig. 6, the focal values of two points at the highest point of the curve are relatively close to each other, and the determination of the in-focus position is greatly affected. Therefore, the score of the focus curve shown in fig. 6 may be lower.
FIG. 7 is a diagram of another focal value curve according to an embodiment of the present invention. The definitions of the abscissa and ordinate in fig. 7 and 4 are similar. For brevity, no further description is provided. As shown in fig. 7, the fluctuation of the focal value curve is less, the highest value is more obvious, and is better than the focal value curve shown in fig. 6, and the filter corresponding to fig. 7 can be selected as the filter suitable for the current scene from the filters corresponding to fig. 6 and 7.
The method provided by this embodiment for determining the score corresponding to the first filter from the focal value curve can score the filter based on five dimensions: curve contrast, the first ratio, the second ratio, the maximum value opening size, and curve monotonicity. This allows the first filter to be evaluated more thoroughly, so that the filter screened for the current scene better meets the scene's shooting requirements. After filtering with the screened filter, the change trend of image sharpness is better and the resulting focal value curve better meets the requirements of focusing, so that focusing is realized quickly and accurately and the shooting effect of the image is improved.
On the basis of the technical solution provided by the above embodiment, optionally, for a scene, multiple scenes conforming to the scene may be shot, so as to improve the adaptability of the filter to the scene.
Fig. 8 is a flowchart illustrating another image processing method according to an embodiment of the present invention. As shown in fig. 8, the image processing method may include:
Step 801, acquiring multiple frames of images to be processed, where the multiple frames are images shot in a lens moving process, correspond to the scene, and are obtained by shooting a first scene conforming to the scene.
Step 802, filtering the multiple frames of images to be processed through a first filter to be analyzed to obtain a filtering result corresponding to each frame of image.
In this embodiment, the specific implementation principle and process of step 801 to step 802 may refer to the foregoing embodiments, and are not described herein again.
Step 803, acquiring a plurality of frames of second images, wherein the plurality of frames of second images are obtained by shooting a second scene conforming to the scene.
Step 804, filtering the plurality of frames of second images through the first filter to obtain a filtering result corresponding to each frame of second image.
Alternatively, there may be a plurality of scenes that fit the scene. Assuming the scene is a human face scene, different faces may be regarded as different scenes fitting the scene; for example, the first scene may be the face of user A, and the second scene may be the face of user B.
In the embodiment, the first scenery is shot, and a plurality of frames of images shot in the lens moving process can be obtained and recorded as images to be processed; and shooting the second scene to obtain a plurality of frames of images shot in the lens moving process, and recording the images as second images. The image to be processed and the second image may be respectively subjected to filtering processing by the first filter.
Step 805: evaluating the first filter according to the filtering results corresponding to the images to be processed and the filtering results corresponding to the second images.
Step 806: determining whether the first filter is a filter applicable to the scene according to the evaluation result corresponding to the first filter.
Specifically, the first filter may be evaluated according to the filtering results corresponding to the images to be processed to obtain a first score, and according to the filtering results corresponding to the second images to obtain a second score; the first score and the second score are added to obtain a total score corresponding to the first filter, and whether the first filter is applicable to the scene is determined according to this total score.
The above uses two sceneries as an example of determining, from the images of multiple sceneries, whether the first filter is applicable to the scene. In practice, the number of sceneries used to evaluate a filter is not limited to two: more than two sceneries conforming to the scene may be selected, each scenery photographed, the first filter scored according to the captured images, the total score of the first filter determined from the per-scenery scores, and whether the first filter is applicable to the scene determined from that total score.
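As a sketch, the multi-scenery total score described above might look like this in Python; `score_filter` and the data layout are illustrative names, not from the patent text:

```python
def evaluate_filter_over_sceneries(score_filter, image_sequences):
    """Total score of one filter for a scene: the per-scenery scores are
    summed, generalizing the two-scenery case described above.

    `score_filter` is any callable that scores one multi-frame image
    sequence; `image_sequences` holds one captured sequence per scenery
    conforming to the scene (e.g. faces of different users).
    """
    return sum(score_filter(sequence) for sequence in image_sequences)
```

Any per-sequence scoring function (such as the focal value curve scores described later) can be plugged in as `score_filter`.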
Combining the solutions provided by the above embodiments yields an alternative scheme for configuring filters for the photographing device that involves multiple scenes and multiple filters. Specifically, a plurality of shooting scenes, such as an indoor scene and an outdoor scene, may be defined first. For each scene, a plurality of sceneries conforming to the scene are photographed; for example, for the indoor scene, the interiors of several houses are photographed separately, yielding images corresponding to the plurality of sceneries, where the images corresponding to each scenery may include multiple frames captured during lens movement.
In addition, a plurality of filters may be designed. Each filter is applied to the images corresponding to the plurality of sceneries of a scene and scored according to the filtering results, yielding the filter's total score over those sceneries. After all filters are evaluated in this way, the filter with the highest total score is selected as the filter applicable to the scene. Performing these steps for each scene yields a filter corresponding to every scene.
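The per-scene filter selection can be sketched as follows; the nested-dictionary layout is a hypothetical representation of the scores, not a structure given in the text:

```python
def build_filter_table(scores_per_scene):
    """For each scene, pick the filter with the highest total score over
    that scene's sceneries.

    `scores_per_scene` maps scene name -> {filter name -> total score}.
    """
    return {
        scene: max(filter_scores, key=filter_scores.get)
        for scene, filter_scores in scores_per_scene.items()
    }
```

During actual shooting the table is then consulted with the detected scene to pick the filter to apply.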
During actual shooting, the filter corresponding to the current shooting scene can then be invoked to filter the captured images, so that the filtering results meet the requirements of that scene.
In another optional scheme for configuring filters for the photographing device, during actual shooting a plurality of pre-designed filters are used to filter the currently captured images, the filters are evaluated according to the filtering results, a filter applicable to the current scene is thereby determined among them, and subsequent focusing is performed according to the filtering results of that filter.
In the image processing method provided by this embodiment, a plurality of sceneries conforming to the scene to be analyzed are photographed, the first filter is comprehensively evaluated according to the images corresponding to these sceneries, and whether it is applicable to the scene is determined accordingly, so that the selected filter adapts better to the current scene.
On the basis of the technical solutions provided by the above embodiments, the scene corresponding to each image may be selected manually, or may be determined from the images to be processed.
Optionally, the scene may be any one of the following: a normal light scene, a strong light scene, a weak light scene, a point light source scene, a character scene, and the like.
In an optional implementation, determining the scene corresponding to the image according to the image to be processed may include: determining ambient brightness information according to the brightness information of the image to be processed and the exposure parameters used when it was captured; and determining the scene corresponding to the image to be processed according to the ambient brightness information.
The exposure parameters may include the light-passing (exposure) time used when the image was captured, among others. During image capture, light strikes the image sensor and forms the image. With the other exposure parameters unchanged, the sensor response per unit time characterizes the ambient brightness, different responses representing different ambient brightness levels; the ambient brightness information can therefore be determined from the brightness information of the captured image together with the light-passing time.
For example, the brightness information of an image may be the average of the brightness values of its pixels, and the ambient brightness information may be taken as the image brightness divided by the light-passing time: for constant image brightness, a longer light-passing time implies lower ambient brightness and a shorter light-passing time implies higher ambient brightness. Detecting ambient brightness from the image brightness and exposure parameters is fast and accurate, and effectively distinguishes normal light, strong light, and weak light scenes.
If the ambient brightness information is greater than a first ambient brightness threshold, the current scene may be regarded as a strong light scene; if it is less than a second ambient brightness threshold, as a weak light scene; and if it is less than the first threshold and greater than the second, as a normal light scene.
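A minimal sketch of this brightness-based detection; the threshold values used in the test are placeholders, not values from the patent:

```python
def ambient_brightness(mean_image_brightness, light_passing_time):
    """Ambient brightness estimated as the mean image brightness divided by
    the light-passing (exposure) time, other exposure parameters fixed."""
    return mean_image_brightness / light_passing_time

def classify_brightness_scene(ambient, first_threshold, second_threshold):
    """Compare the ambient brightness against the first (upper) and second
    (lower) thresholds, as described above."""
    if ambient > first_threshold:
        return "strong light"
    if ambient < second_threshold:
        return "weak light"
    return "normal light"
```
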
In another optional implementation, determining the scene corresponding to the image according to the image to be processed may include: detecting the brightness information corresponding to the foreground object and the brightness information corresponding to the background objects in the image to be processed; and determining the scene corresponding to the image to be processed according to the difference or ratio between the two.
The foreground object in an image may be the object closest to the photographing device, and the background objects may be the objects other than the foreground object. The distance between an object and the photographing device can be determined from the variation trend of the sharpness of the image region where the object is located.
If the difference or ratio between the brightness information of the foreground object and that of the background is large, the foreground object is far brighter than the background, and the current scene can be determined to be a point light source scene.
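A sketch of the point-light-source test using the brightness ratio; the ratio threshold of 8.0 is an assumed illustrative value, not from the text:

```python
def is_point_light_scene(foreground_brightness, background_brightness,
                         ratio_threshold=8.0):
    """The scene qualifies as a point light source scene when the foreground
    brightness is far larger than the background brightness."""
    if background_brightness <= 0:
        # Degenerate case: any lit foreground against a dark background.
        return foreground_brightness > 0
    return foreground_brightness / background_brightness > ratio_threshold
```

The difference-based variant mentioned in the text would simply compare `foreground_brightness - background_brightness` against a threshold instead.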
In yet another optional implementation, determining the scene corresponding to the image according to the image to be processed may include: determining the scene corresponding to the image to be processed by detecting whether a preset object exists in it. For example, if a human face exists in the image, the current scene may be regarded as a human face scene.
These schemes enable fast and accurate scene detection, and in practice they may be combined. For example, whether a preset object exists in the image may be detected first; if so, the current scene is determined to be the scene corresponding to that object. If not, the brightness information of the foreground object and of the background may be detected to decide whether the current scene is a point light source scene; if it is not, the ambient brightness information is used to decide whether the current scene is a strong light scene, a weak light scene, or a normal light scene.
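The cascaded detection order described above can be sketched as follows; all inputs (object flag, brightness values, thresholds) are assumed to be computed elsewhere, and the 8.0 brightness ratio is a placeholder:

```python
def detect_scene(has_preset_object, fg_brightness, bg_brightness,
                 ambient, first_threshold, second_threshold):
    """Cascade: preset object first, then point light source via the
    foreground/background brightness ratio, then ambient brightness."""
    if has_preset_object:
        return "preset object"
    if bg_brightness > 0 and fg_brightness / bg_brightness > 8.0:
        return "point light source"
    if ambient > first_threshold:
        return "strong light"
    if ambient < second_threshold:
        return "weak light"
    return "normal light"
```
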
For each of the plurality of scenes, the filters to be analyzed can be evaluated rigorously and scientifically with the methods provided by the foregoing embodiments, and a filter suitable for that scene determined. During actual shooting, the current scene can be detected from the captured images and the corresponding filter selected, which effectively improves focusing efficiency and accuracy and thereby the shooting result.
Fig. 9 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present invention. The image processing apparatus may execute the image processing method corresponding to fig. 1 to 8, and as shown in fig. 9, the image processing apparatus may include:
a memory 11 for storing a computer program;
a processor 12 for executing the computer program stored in the memory to implement:
acquiring multiple frames of images to be processed, wherein the multiple frames of images to be processed include multiple frames of images shot during lens movement, are the multiple frames of images corresponding to a scene, and are obtained by shooting a scenery conforming to the scene;
filtering the multiple frames of images to be processed through a first filter to be analyzed to obtain a filtering result corresponding to each frame of image;
evaluating the first filter according to a filtering result corresponding to each frame of image;
and determining whether the first filter is a filter suitable for the scene or not according to the evaluation result corresponding to the first filter.
Optionally, the image processing apparatus may further include a communication interface 13 for communicating with other devices or a communication network.
In one implementation, the processor 12 is further configured to:
and if the first filter is a filter suitable for the scene, focusing the filtering result corresponding to each frame of image.
In one implementation, the processor 12 is further configured to:
filtering the plurality of frames of images to be processed through a second filter to be analyzed;
and evaluating the second filter according to the filtering result corresponding to the second filter.
In an implementation manner, when determining whether the first filter is a filter applicable to the scene according to the evaluation result corresponding to the first filter, the processor 12 is specifically configured to:
and selecting a filter suitable for the scene from the first filter and the second filter according to the evaluation result corresponding to the first filter and the evaluation result corresponding to the second filter.
In an implementation manner, when evaluating the first filter according to the filtering result corresponding to each frame of image, the processor 12 is specifically configured to:
and determining the corresponding score of the first filter according to the filtering result corresponding to each frame of image.
In an implementation manner, when determining the score corresponding to the first filter according to the filtering result corresponding to each frame of image, the processor 12 is specifically configured to:
determining a focal value corresponding to each frame of image according to a filtering result corresponding to each frame of image;
and determining the corresponding score of the first filter according to the focal value corresponding to each frame of image.
In an implementable manner, the filtering result corresponding to each frame of image comprises a signal value corresponding to each pixel point in at least part of pixel points of the image after filtering processing;
when determining the focal value corresponding to each frame of image according to the filtering result corresponding to each frame of image, the processor 12 is specifically configured to:
and accumulating the signal values corresponding to at least part of pixel points in the image aiming at each frame of image to obtain a focal value corresponding to the image.
In an implementation manner, when determining the score corresponding to the first filter according to the focal value corresponding to each frame of image, the processor 12 is specifically configured to:
determining a focal value curve according to the focal value corresponding to each frame of image;
and determining the score corresponding to the first filter according to the focal value curve.
In an implementation manner, when determining the focal value curve according to the focal value corresponding to each frame of image, the processor 12 is specifically configured to:
for each frame of image, determining a numerical point corresponding to the image in a focal value curve, wherein the abscissa of the numerical point is the sequence number of the image, and the ordinate of the numerical point is the focal value corresponding to the image;
and generating the focal value curve according to the determined plurality of numerical value points.
In an implementation manner, when determining the score corresponding to the first filter according to the focal value curve, the processor 12 is specifically configured to:
determining a score corresponding to the first filter according to the focal value curve by at least one of:
curve contrast, a first ratio, a second ratio, the opening size at the highest value, and curve monotonicity;
wherein the first ratio is the proportion of numerical points whose focal values are higher than a first threshold, and the second ratio is the proportion of numerical points whose focal values are lower than a second threshold.
In an implementable manner, when determining the score corresponding to the first filter by curve contrast, the processor 12 is specifically configured to:
determining N numerical value points with the largest focal value and M numerical value points with the smallest focal value in the focal value curve;
determining the score of the curve contrast corresponding to the first filter according to the ratio of the mean value of the focal values corresponding to the N numerical points to the mean value of the focal values corresponding to the M numerical points;
wherein M and N are positive integers.
In an implementation manner, M is equal to the number of frames of the images to be processed multiplied by a first coefficient and rounded, and N is equal to the number of frames multiplied by a second coefficient and rounded, where both coefficients are greater than 0.
In an implementation manner, the ratio of the mean focal value of the N numerical points to the mean focal value of the M numerical points is positively correlated with the curve contrast score.
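A sketch of the curve contrast under these definitions; the default coefficient values and the use of the raw ratio directly as the score are assumptions:

```python
def curve_contrast(per_frame_focal_values, first_coeff=0.25, second_coeff=0.25):
    """Mean of the N largest focal values divided by the mean of the M
    smallest, where M and N are the frame count multiplied by the two
    coefficients and rounded (at least 1)."""
    frame_count = len(per_frame_focal_values)
    m = max(1, round(frame_count * first_coeff))   # M smallest points
    n = max(1, round(frame_count * second_coeff))  # N largest points
    ordered = sorted(per_frame_focal_values)
    low_mean = sum(ordered[:m]) / m
    high_mean = sum(ordered[-n:]) / n
    return high_mean / low_mean  # positively correlated with the score
```

A sharply peaked curve yields a large ratio; a flat curve yields a ratio near 1.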
In an implementation manner, when determining the score corresponding to the first filter by using the first ratio, the processor 12 is specifically configured to:
determining the highest value of the focal value curve, and counting the number of numerical value points with focal values larger than a first threshold value, wherein the first threshold value is the product of the highest value and a first proportional coefficient;
and determining the score of the first ratio corresponding to the first filter according to the ratio of the number of the numerical points of which the focal value is greater than the first threshold value to the number of all the numerical points in the focal value curve.
In an implementation manner, the proportion of numerical points whose focal value is greater than the first threshold, among all numerical points in the focal value curve, is negatively correlated with the first ratio score.
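A sketch of the first ratio score; the default scale coefficient of 0.8 and the `1 - proportion` mapping (one simple way to realize the stated negative correlation) are assumptions:

```python
def first_ratio_score(per_frame_focal_values, first_scale=0.8):
    """Proportion of curve points whose focal value exceeds
    (highest value * first scale coefficient), mapped to a score that
    decreases as the proportion grows."""
    threshold = max(per_frame_focal_values) * first_scale
    above = sum(1 for fv in per_frame_focal_values if fv > threshold)
    return 1.0 - above / len(per_frame_focal_values)
```

Intuitively, a curve where only a few points sit near the peak (a narrow peak) scores high.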
In an implementation manner, when determining the score corresponding to the first filter through the second ratio, the processor 12 is specifically configured to:
determining the lowest value of the focal value curve, and counting the number of numerical value points of which the focal value is smaller than a second threshold value, wherein the second threshold value is the product of the lowest value and a second proportional coefficient;
and determining the score of the second ratio corresponding to the first filter according to the ratio of the number of the numerical points of which the focal value is smaller than the second threshold value to the number of all the numerical points in the focal value curve.
In an implementation manner, the proportion of numerical points whose focal value is smaller than the second threshold, among all numerical points in the focal value curve, is positively correlated with the second ratio score.
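A sketch of the second ratio score; the default scale coefficient of 1.2 and using the proportion directly as the score (realizing the stated positive correlation) are assumptions:

```python
def second_ratio_score(per_frame_focal_values, second_scale=1.2):
    """Proportion of curve points whose focal value is below
    (lowest value * second scale coefficient); used directly as the score."""
    threshold = min(per_frame_focal_values) * second_scale
    below = sum(1 for fv in per_frame_focal_values if fv < threshold)
    return below / len(per_frame_focal_values)
```

A curve with a low, flat base away from the peak scores high on this component.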
In an implementable manner, when determining the score corresponding to the first filter by the highest value opening size, the processor 12 is specifically configured to:
determining the curvature corresponding to the focal value curve at the highest value;
and determining the score of the maximum value opening size corresponding to the first filter according to the curvature.
In an implementation manner, the curvature is positively correlated with the score of the highest value opening size.
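The text does not fix how the curvature at the highest value is computed; one common discrete proxy is the magnitude of the second difference at the peak, sketched here (the endpoint fallback of 0.0 is an arbitrary choice):

```python
def peak_sharpness(per_frame_focal_values):
    """Discrete second-difference magnitude at the curve's highest point.
    A larger value means a narrower opening at the peak and, per the text,
    a higher score."""
    values = per_frame_focal_values
    peak = values.index(max(values))
    if peak == 0 or peak == len(values) - 1:
        return 0.0  # peak at an endpoint: curvature undefined here
    return abs(values[peak - 1] - 2 * values[peak] + values[peak + 1])
```
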
In an implementable manner, when determining the score corresponding to the first filter through curve monotonicity, the processor 12 is specifically configured to:
counting the number of monotone increasing intervals and/or the number of monotone decreasing intervals in the focal value curve;
and determining the monotonicity score of the curve corresponding to the first filter according to the number of the monotonous increasing intervals and/or the number of the monotonous decreasing intervals.
In an implementation manner, the number of monotonically increasing intervals and/or monotonically decreasing intervals is negatively correlated with the curve monotonicity score.
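A sketch of counting the monotone intervals; how the count maps to a score is left open by the text:

```python
def count_monotone_intervals(per_frame_focal_values):
    """Count the maximal monotonically increasing and decreasing runs of
    the curve. An ideal single-peak curve has 2 (one rise, one fall); more
    intervals indicate oscillation, which lowers the monotonicity score."""
    intervals = 0
    direction = 0  # +1 while rising, -1 while falling
    for prev, cur in zip(per_frame_focal_values, per_frame_focal_values[1:]):
        step = (cur > prev) - (cur < prev)  # sign of the change, 0 if flat
        if step != 0 and step != direction:
            intervals += 1
            direction = step
    return intervals
```
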
In one implementation, the scene is any one of the following: a normal light scene, a strong light scene, a weak light scene, a point light source scene, or a character scene.
In one implementation, the processor 12 is further configured to:
acquiring a plurality of frames of second images, wherein the plurality of frames of second images are obtained by shooting a second scenery conforming to the scene;
and filtering the plurality of frames of second images through the first filter to obtain a filtering result corresponding to each frame of second image.
In an implementation manner, when the first filter is evaluated according to the filtering result corresponding to each frame of image, the processor 12 is specifically configured to:
and evaluating the first filter according to the filtering result corresponding to the image to be processed and the filtering result corresponding to the second image.
In one implementation, the processor 12 is further configured to:
and determining a scene corresponding to the image according to the image to be processed.
In an implementation manner, when determining a scene corresponding to the image according to the image to be processed, the processor 12 is specifically configured to:
determining environment brightness information according to the brightness information corresponding to the image to be processed and exposure parameters when the image to be processed is shot;
and determining a scene corresponding to the image to be processed according to the environment brightness information.
In an implementation manner, when determining a scene corresponding to the image according to the image to be processed, the processor 12 is specifically configured to:
detecting brightness information corresponding to a foreground object and brightness information corresponding to background objects in the image to be processed;
and determining the scene corresponding to the image to be processed according to the difference or ratio between the brightness information corresponding to the foreground object and that corresponding to the background objects.
In an implementation manner, when determining a scene corresponding to the image according to the image to be processed, the processor 12 is specifically configured to:
and determining a scene corresponding to the image to be processed by detecting whether a preset object exists in the image to be processed.
The image processing apparatus shown in fig. 9 can execute the method of the embodiment shown in fig. 1-8, and the related description of the embodiment shown in fig. 1-8 can be referred to for the part not described in detail in this embodiment. The implementation process and technical effect of the technical solution refer to the descriptions in the embodiments shown in fig. 1 to 8, and are not described herein again.
Fig. 10 is a schematic structural diagram of another image processing apparatus according to an embodiment of the present invention. The image processing apparatus may execute the image processing method corresponding to fig. 1 to 8, and as shown in fig. 10, the image processing apparatus may include:
the acquiring circuit 21 is configured to acquire multiple frames of images to be processed, where the multiple frames of images to be processed include multiple frames of images shot during lens movement, are the multiple frames of images corresponding to a scene, and are obtained by shooting a scenery conforming to the scene;
the filter circuit 22 is configured to filter the multiple frames of images to be processed through a first filter to be analyzed, so as to obtain a filtering result corresponding to each frame of image;
the evaluation circuit 23 is configured to evaluate the first filter according to a filtering result corresponding to each frame of image;
and the determining circuit 24 is configured to determine whether the first filter is a filter applicable to the scene according to the evaluation result corresponding to the first filter.
In one implementation, the determination circuit 24 is further configured to:
and if the first filter is a filter suitable for the scene, focusing the filtering result corresponding to each frame of image.
In one implementation, the filter circuit 22 is further configured to:
filtering the plurality of frames of images to be processed through a second filter to be analyzed;
and evaluating the second filter according to the filtering result corresponding to the second filter.
In an implementation manner, when determining whether the first filter is a filter applicable to the scene according to the evaluation result corresponding to the first filter, the determining circuit 24 is specifically configured to:
and selecting a filter suitable for the scene from the first filter and the second filter according to the evaluation result corresponding to the first filter and the evaluation result corresponding to the second filter.
In an implementation manner, when evaluating the first filter according to the filtering result corresponding to each frame of image, the evaluation circuit 23 is specifically configured to:
and determining the corresponding score of the first filter according to the filtering result corresponding to each frame of image.
In an implementable manner, when determining the score corresponding to the first filter according to the filtering result corresponding to each frame of image, the evaluation circuit 23 is specifically configured to:
determining a focal value corresponding to each frame of image according to a filtering result corresponding to each frame of image;
and determining the corresponding score of the first filter according to the focal value corresponding to each frame of image.
In an implementable manner, the filtering result corresponding to each frame of image comprises a signal value corresponding to each pixel point in at least part of pixel points of the image after filtering processing;
when the focal value corresponding to each frame of image is determined according to the filtering result corresponding to each frame of image, the evaluation circuit 23 is specifically configured to:
and accumulating the signal values corresponding to at least part of pixel points in the image aiming at each frame of image to obtain a focal value corresponding to the image.
In an implementable manner, when determining the score corresponding to the first filter according to the focal value corresponding to each frame of image, the evaluation circuit 23 is specifically configured to:
determining a focal value curve according to the focal value corresponding to each frame of image;
and determining the score corresponding to the first filter according to the focal value curve.
In an implementation manner, when determining the focal value curve according to the focal value corresponding to each frame of image, the evaluation circuit 23 is specifically configured to:
for each frame of image, determining a numerical point corresponding to the image in a focal value curve, wherein the abscissa of the numerical point is the sequence number of the image, and the ordinate of the numerical point is the focal value corresponding to the image;
and generating the focal value curve according to the determined plurality of numerical value points.
In an implementable manner, when determining the score corresponding to the first filter according to the focal value curve, the evaluation circuit 23 is specifically configured to:
determining a score corresponding to the first filter according to the focal value curve by at least one of:
curve contrast, a first ratio, a second ratio, the opening size at the highest value, and curve monotonicity;
wherein the first ratio is the proportion of numerical points whose focal values are higher than a first threshold, and the second ratio is the proportion of numerical points whose focal values are lower than a second threshold.
In an implementable manner, when determining the corresponding score of the first filter by curve contrast, the evaluation circuit 23 is specifically configured to:
determining N numerical value points with the largest focal value and M numerical value points with the smallest focal value in the focal value curve;
determining the score of the curve contrast corresponding to the first filter according to the ratio of the mean value of the focal values corresponding to the N numerical points to the mean value of the focal values corresponding to the M numerical points;
wherein M and N are positive integers.
In an implementation manner, M is equal to the number of frames of the images to be processed multiplied by a first coefficient and rounded, and N is equal to the number of frames multiplied by a second coefficient and rounded, where both coefficients are greater than 0.
In an implementation manner, the ratio of the mean focal value of the N numerical points to the mean focal value of the M numerical points is positively correlated with the curve contrast score.
In an implementable manner, when determining the corresponding score of the first filter by the first ratio, the evaluation circuit 23 is specifically configured to:
determining the highest value of the focal value curve, and counting the number of numerical value points with focal values larger than a first threshold value, wherein the first threshold value is the product of the highest value and a first proportional coefficient;
and determining the score of the first ratio corresponding to the first filter according to the ratio of the number of the numerical points of which the focal value is greater than the first threshold value to the number of all the numerical points in the focal value curve.
In an implementation manner, the proportion of numerical points whose focal value is greater than the first threshold, among all numerical points in the focal value curve, is negatively correlated with the first ratio score.
In an implementable manner, when determining the corresponding score of the first filter by the second ratio, the evaluation circuit 23 is specifically configured to:
determining the lowest value of the focal value curve, and counting the number of numerical value points of which the focal value is smaller than a second threshold value, wherein the second threshold value is the product of the lowest value and a second proportional coefficient;
and determining the score of the second ratio corresponding to the first filter according to the ratio of the number of the numerical points of which the focal value is smaller than the second threshold value to the number of all the numerical points in the focal value curve.
In an implementation manner, the proportion of numerical points whose focal value is smaller than the second threshold, among all numerical points in the focal value curve, is positively correlated with the second ratio score.
In an implementable manner, when determining the corresponding score of the first filter by the maximum aperture size, the evaluation circuit 23 is specifically configured to:
determining the curvature corresponding to the focal value curve at the highest value;
and determining the score of the maximum value opening size corresponding to the first filter according to the curvature.
In an implementation manner, the curvature is positively correlated with the score of the highest value opening size.
In an implementable manner, when determining the score corresponding to the first filter by curve monotonicity, the evaluation circuit 23 is specifically configured to:
counting the number of monotone increasing intervals and/or the number of monotone decreasing intervals in the focal value curve;
and determining the monotonicity score of the curve corresponding to the first filter according to the number of the monotonous increasing intervals and/or the number of the monotonous decreasing intervals.
In an implementation manner, the number of monotonically increasing intervals and/or monotonically decreasing intervals is negatively correlated with the curve monotonicity score.
In one implementation, the scene is any one of the following: a normal light scene, a strong light scene, a weak light scene, a point light source scene, or a character scene.
In an implementation manner, the filter circuit 22 is further configured to:
acquiring a plurality of frames of second images, wherein the plurality of frames of second images are obtained by shooting a second scene conforming to the scene;
and filtering the plurality of frames of second images through the first filter to obtain a filtering result corresponding to each frame of second image.
In an implementation manner, when the first filter is evaluated according to the filtering result corresponding to each frame of image, the evaluation circuit 23 is specifically configured to:
and evaluating the first filter according to the filtering result corresponding to the image to be processed and the filtering result corresponding to the second image.
In an implementation manner, the evaluation circuit 23 is further configured to:
determining the scene corresponding to the image to be processed.
In an implementation manner, when determining the scene corresponding to the image to be processed, the evaluation circuit 23 is specifically configured to:
determining ambient brightness information according to the brightness information corresponding to the image to be processed and the exposure parameters used when the image to be processed was shot;
and determining the scene corresponding to the image to be processed according to the ambient brightness information.
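A hypothetical sketch of this brightness-based classification: normalize the image's mean brightness by the exposure parameters to estimate ambient brightness, then bucket the estimate. The normalization formula, the parameter names (`mean_luma`, `exposure_time_s`, `gain`), and the numeric thresholds are all assumptions; the description states only that ambient brightness is derived from image brightness plus exposure parameters:

```python
def classify_scene_by_brightness(mean_luma, exposure_time_s, gain):
    """Hypothetical ambient-brightness scene classifier. A bright image
    captured with a very short exposure implies a bright environment;
    a dim image captured with a long exposure implies a dark one."""
    ambient = mean_luma / (exposure_time_s * gain)   # assumed normalisation
    if ambient > 5000.0:
        return "strong light scene"
    if ambient < 200.0:
        return "low light scene"
    return "normal brightness scene"
```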
In an implementation manner, when determining the scene corresponding to the image to be processed, the evaluation circuit 23 is specifically configured to:
detecting brightness information corresponding to a foreground object and brightness information corresponding to a background object in the image to be processed;
and determining the scene corresponding to the image to be processed according to the difference or ratio between the brightness information corresponding to the foreground object and the brightness information corresponding to the background object.
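The foreground/background comparison might, for instance, flag a point-light-source scene when the foreground is much brighter than the background. The ratio threshold and the choice of scene label are assumptions for illustration only:

```python
def detect_point_light_scene(fg_luma, bg_luma, ratio_threshold=8.0):
    """Hypothetical check: a very bright foreground object against a
    dark background suggests a point-light-source scene."""
    if bg_luma <= 0:
        return fg_luma > 0       # any lit foreground on a black background
    return (fg_luma / bg_luma) >= ratio_threshold
```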
In an implementation manner, when determining the scene corresponding to the image to be processed, the evaluation circuit 23 is specifically configured to:
determining the scene corresponding to the image to be processed by detecting whether a preset object exists in the image to be processed.
The image processing apparatus shown in fig. 10 may perform the methods of the embodiments shown in fig. 1 to 8. It will be appreciated that the methods of these embodiments may also be implemented by hardware circuitry. For example, the calculation of the focal value from the filtering result may be implemented by an accumulator; the calculation of the filter score may be implemented by an arithmetic unit corresponding to the scoring method; the judgment of whether the score meets the requirement may be implemented by a comparator; and the focusing process may be performed by outputting a step signal to the focus motor.
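As a software analogue of the accumulator described above, shown purely for illustration: each row is filtered with a small high-pass kernel and the absolute responses are accumulated into a single focal value. The kernel is an assumed example, not the filter of this disclosure:

```python
import numpy as np

def focal_value(image, kernel=(-1.0, 2.0, -1.0)):
    """Accumulate absolute high-pass responses over all rows into one
    focal value. Sharp (in-focus) images contain more high-frequency
    energy and therefore accumulate a larger value."""
    img = np.asarray(image, dtype=float)
    k = np.asarray(kernel)
    total = 0.0
    for row in img:
        total += np.abs(np.convolve(row, k, mode="valid")).sum()
    return total
```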
For parts of this embodiment not described in detail, reference may be made to the description of the embodiment shown in fig. 1 to 8. The implementation process and technical effect of the technical solution refer to the descriptions in the embodiments shown in fig. 1 to 8, and are not described herein again.
An embodiment of the present invention further provides a photographing device, which includes the image processing apparatus described in any one of the above embodiments.
Optionally, the photographing device may further include a lens, a focus motor, and an image sensor. The focus motor is used to drive the lens to move so as to change the object distance or the image distance. The image sensor is used to convert the optical signal passing through the lens into an electrical signal to form an image.
The photographing device may be a mobile phone, an action camera, a professional camera, an infrared camera, or the like. For the structure, function, execution process, and technical effect of each component in the photographing device, reference may be made to the descriptions in the foregoing embodiments, and details are not repeated here.
An embodiment of the present invention further provides a movable platform, which includes the above photographing device. The movable platform may be an unmanned aerial vehicle, an unmanned vehicle, a gimbal, or the like. The movable platform may also include a body and a power system. The photographing device and the power system are mounted on the body, and the power system is used to provide power for the movable platform.
The structure, function, execution process and technical effect of each component in the movable platform can be referred to the description in the foregoing embodiments, and are not described herein again.
In addition, an embodiment of the present invention further provides a computer-readable storage medium storing program instructions, where the program instructions are used to implement the image processing method according to any one of the above embodiments.
The technical solutions and technical features in the above embodiments may be used alone or in combination provided that they do not conflict, and all such implementations are to be regarded as equivalent embodiments within the scope of the present invention as long as they do not go beyond what would be recognized by those skilled in the art.
In the embodiments provided in the present invention, it should be understood that the disclosed related devices and methods can be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer processor to execute all or part of the steps of the methods according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above description is only an embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes performed by the present specification and drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (66)

1. An image processing method, comprising:
acquiring multiple frames of images to be processed, wherein the multiple frames of images to be processed comprise multiple frames of images shot in the lens moving process, the multiple frames of images to be processed are multiple frames of images corresponding to a scene, and the multiple frames of images corresponding to the scene are multiple frames of images obtained by shooting a scene conforming to the scene;
filtering the multiple frames of images to be processed through a first filter to be analyzed to obtain a filtering result corresponding to each frame of image;
evaluating the first filter according to a filtering result corresponding to each frame of image;
and determining whether the first filter is a filter suitable for the scene or not according to the evaluation result corresponding to the first filter.
2. The method of claim 1, further comprising:
if the first filter is a filter suitable for the scene, performing focusing according to the filtering result corresponding to each frame of image.
3. The method of claim 1, further comprising:
filtering the plurality of frames of images to be processed through a second filter to be analyzed;
and evaluating the second filter according to the filtering result corresponding to the second filter.
4. The method according to claim 3, wherein determining whether the first filter is a filter applicable to the scene according to the evaluation result corresponding to the first filter comprises:
and selecting a filter suitable for the scene from the first filter and the second filter according to the evaluation result corresponding to the first filter and the evaluation result corresponding to the second filter.
5. The method of claim 1, wherein evaluating the first filter according to the filtering result corresponding to each frame of image comprises:
and determining the corresponding score of the first filter according to the filtering result corresponding to each frame of image.
6. The method of claim 5, wherein determining the score corresponding to the first filter according to the filtering result corresponding to each frame of image comprises:
determining a focal value corresponding to each frame of image according to a filtering result corresponding to each frame of image;
and determining the corresponding score of the first filter according to the focal value corresponding to each frame of image.
7. The method of claim 6, wherein the filtering result corresponding to each frame of image comprises a signal value corresponding to each pixel in at least a part of pixels of the image after filtering;
determining a focal value corresponding to each frame of image according to a filtering result corresponding to each frame of image, including:
and accumulating the signal values corresponding to at least part of pixel points in the image aiming at each frame of image to obtain a focal value corresponding to the image.
8. The method of claim 6, wherein determining the score corresponding to the first filter according to the focus value corresponding to each frame of the image comprises:
determining a focal value curve according to the focal value corresponding to each frame of image;
and determining the score corresponding to the first filter according to the focal value curve.
9. The method of claim 8, wherein determining a focus value curve based on the focus value corresponding to each frame of image comprises:
for each frame of image, determining a numerical point corresponding to the image in a focal value curve, wherein the abscissa of the numerical point is the sequence number of the image, and the ordinate of the numerical point is the focal value corresponding to the image;
and generating the focal value curve according to the determined plurality of numerical value points.
10. The method of claim 9, wherein determining a score corresponding to the first filter from the focal value profile comprises:
determining a score corresponding to the first filter according to the focal value curve by at least one of the following:
curve contrast, a first ratio, a second ratio, a highest-value opening size, and curve monotonicity;
wherein the first ratio is the proportion of numerical points whose focal values are higher than a first threshold, and the second ratio is the proportion of numerical points whose focal values are lower than a second threshold.
11. The method of claim 10, wherein determining the score corresponding to the first filter by curve contrast comprises:
determining N numerical value points with the largest focal value and M numerical value points with the smallest focal value in the focal value curve;
determining the score of the curve contrast corresponding to the first filter according to the ratio of the mean value of the focal values corresponding to the N numerical points to the mean value of the focal values corresponding to the M numerical points;
wherein M and N are positive integers.
12. The method of claim 11, wherein M is equal to the number of frames of the images to be processed multiplied by a first coefficient and rounded, and N is equal to the number of frames of the images to be processed multiplied by a second coefficient and rounded, wherein the first coefficient and the second coefficient are both greater than 0.
13. The method of claim 11, wherein a ratio of a mean of the focal values corresponding to the N number of points to a mean of the focal values corresponding to the M number of points is positively correlated with the score of the curve contrast.
14. The method of claim 10, wherein determining the score corresponding to the first filter by a first ratio comprises:
determining the highest value of the focal value curve, and counting the number of numerical value points with focal values larger than a first threshold value, wherein the first threshold value is the product of the highest value and a first proportional coefficient;
and determining the score of the first ratio corresponding to the first filter according to the ratio of the number of the numerical points of which the focal value is greater than the first threshold value to the number of all the numerical points in the focal value curve.
15. The method of claim 14, wherein the ratio of the number of numerical points whose focal values are greater than the first threshold to the number of all numerical points in the focal value curve is inversely related to the score of the first ratio.
16. The method of claim 10, wherein determining the score corresponding to the first filter by a second ratio comprises:
determining the lowest value of the focal value curve, and counting the number of numerical value points of which the focal value is smaller than a second threshold value, wherein the second threshold value is the product of the lowest value and a second proportional coefficient;
and determining the score of the second ratio corresponding to the first filter according to the ratio of the number of the numerical points of which the focal value is smaller than the second threshold value to the number of all the numerical points in the focal value curve.
17. The method of claim 16, wherein the ratio of the number of numerical points whose focal values are smaller than the second threshold to the number of all numerical points in the focal value curve is positively correlated with the score of the second ratio.
18. The method of claim 10, wherein determining the score corresponding to the first filter by the highest-value opening size comprises:
determining the curvature of the focal value curve at its highest value;
and determining the score of the highest-value opening size corresponding to the first filter according to the curvature.
19. The method of claim 18, wherein the curvature is positively correlated with the score of the highest-value opening size.
20. The method of claim 10, wherein determining the score corresponding to the first filter by curve monotonicity comprises:
counting the number of monotonically increasing intervals and/or the number of monotonically decreasing intervals in the focal value curve;
and determining the score of curve monotonicity corresponding to the first filter according to the number of the monotonically increasing intervals and/or the number of the monotonically decreasing intervals.
21. The method according to claim 20, wherein the number of monotonically increasing intervals and/or monotonically decreasing intervals is inversely related to the score of the curve monotonicity.
22. The method of claim 1, wherein the scene is any one of the following: a normal-brightness scene, a strong-light scene, a low-light scene, a point-light-source scene, or a portrait scene.
23. The method of claim 22, further comprising:
acquiring a plurality of frames of second images, wherein the plurality of frames of second images are obtained by shooting a second scene conforming to the scene;
and filtering the plurality of frames of second images through the first filter to obtain a filtering result corresponding to each frame of second image.
24. The method of claim 23, wherein evaluating the first filter according to the filtering result corresponding to each frame of image comprises:
and evaluating the first filter according to the filtering result corresponding to the image to be processed and the filtering result corresponding to the second image.
25. The method of claim 1, further comprising:
determining a scene corresponding to the image to be processed.
26. The method of claim 25, wherein determining the scene corresponding to the image according to the image to be processed comprises:
determining ambient brightness information according to the brightness information corresponding to the image to be processed and the exposure parameters used when the image to be processed was shot;
and determining the scene corresponding to the image to be processed according to the ambient brightness information.
27. The method of claim 25, wherein determining the scene corresponding to the image according to the image to be processed comprises:
detecting brightness information corresponding to a foreground object and brightness information corresponding to a background object in the image to be processed;
and determining the scene corresponding to the image to be processed according to the difference or ratio between the brightness information corresponding to the foreground object and the brightness information corresponding to the background object.
28. The method of claim 25, wherein determining the scene corresponding to the image according to the image to be processed comprises:
and determining a scene corresponding to the image to be processed by detecting whether a preset object exists in the image to be processed.
29. An image processing apparatus characterized by comprising:
a memory for storing a computer program;
a processor for executing the computer program stored in the memory to implement:
acquiring multiple frames of images to be processed, wherein the multiple frames of images to be processed comprise multiple frames of images shot in the lens moving process, the multiple frames of images to be processed are multiple frames of images corresponding to a scene, and the multiple frames of images corresponding to the scene are multiple frames of images obtained by shooting a scene conforming to the scene;
filtering the multiple frames of images to be processed through a first filter to be analyzed to obtain a filtering result corresponding to each frame of image;
evaluating the first filter according to a filtering result corresponding to each frame of image;
and determining whether the first filter is a filter suitable for the scene or not according to the evaluation result corresponding to the first filter.
30. The apparatus of claim 29, wherein the processor is further configured to:
if the first filter is a filter suitable for the scene, performing focusing according to the filtering result corresponding to each frame of image.
31. The apparatus of claim 29, wherein the processor is further configured to:
filtering the plurality of frames of images to be processed through a second filter to be analyzed;
and evaluating the second filter according to the filtering result corresponding to the second filter.
32. The apparatus of claim 31, wherein when determining whether the first filter is a filter applicable to the scene according to the evaluation result corresponding to the first filter, the processor is specifically configured to:
and selecting a filter suitable for the scene from the first filter and the second filter according to the evaluation result corresponding to the first filter and the evaluation result corresponding to the second filter.
33. The apparatus according to claim 29, wherein the processor is specifically configured to, when evaluating the first filter according to the filtering result corresponding to each frame image:
and determining the corresponding score of the first filter according to the filtering result corresponding to each frame of image.
34. The apparatus according to claim 33, wherein when determining the score corresponding to the first filter according to the filtering result corresponding to each frame image, the processor is specifically configured to:
determining a focal value corresponding to each frame of image according to a filtering result corresponding to each frame of image;
and determining the corresponding score of the first filter according to the focal value corresponding to each frame of image.
35. The apparatus according to claim 34, wherein the filtering result corresponding to each frame of image comprises a signal value corresponding to each pixel in at least a part of pixels of the image after filtering;
when the focal value corresponding to each frame image is determined according to the filtering result corresponding to each frame image, the processor is specifically configured to:
and accumulating the signal values corresponding to at least part of pixel points in the image aiming at each frame of image to obtain a focal value corresponding to the image.
36. The apparatus of claim 34, wherein when determining the score corresponding to the first filter according to the focal value corresponding to each frame image, the processor is specifically configured to:
determining a focal value curve according to the focal value corresponding to each frame of image;
and determining the score corresponding to the first filter according to the focal value curve.
37. The apparatus of claim 36, wherein when determining the focus value curve based on the focus values corresponding to the respective frame images, the processor is specifically configured to:
for each frame of image, determining a numerical point corresponding to the image in a focal value curve, wherein the abscissa of the numerical point is the sequence number of the image, and the ordinate of the numerical point is the focal value corresponding to the image;
and generating the focal value curve according to the determined plurality of numerical value points.
38. The apparatus of claim 37, wherein in determining the score corresponding to the first filter from the focal value profile, the processor is specifically configured to:
determining a score corresponding to the first filter according to the focal value curve by at least one of the following:
curve contrast, a first ratio, a second ratio, a highest-value opening size, and curve monotonicity;
wherein the first ratio is the proportion of numerical points whose focal values are higher than a first threshold, and the second ratio is the proportion of numerical points whose focal values are lower than a second threshold.
39. The apparatus of claim 38, wherein in determining the score corresponding to the first filter by curve contrast, the processor is specifically configured to:
determining N numerical value points with the largest focal value and M numerical value points with the smallest focal value in the focal value curve;
determining the score of the curve contrast corresponding to the first filter according to the ratio of the mean value of the focal values corresponding to the N numerical points to the mean value of the focal values corresponding to the M numerical points;
wherein M and N are positive integers.
40. The apparatus of claim 39, wherein M is equal to the number of frames of the images to be processed multiplied by a first coefficient and rounded, and N is equal to the number of frames of the images to be processed multiplied by a second coefficient and rounded, wherein the first coefficient and the second coefficient are both greater than 0.
41. The apparatus of claim 39, wherein a ratio of a mean of the focal values corresponding to the N number of points to a mean of the focal values corresponding to the M number of points is positively correlated with the score of the curve contrast.
42. The apparatus of claim 38, wherein in determining the score corresponding to the first filter by the first ratio, the processor is specifically configured to:
determining the highest value of the focal value curve, and counting the number of numerical value points with focal values larger than a first threshold value, wherein the first threshold value is the product of the highest value and a first proportional coefficient;
and determining the score of the first ratio corresponding to the first filter according to the ratio of the number of the numerical points of which the focal value is greater than the first threshold value to the number of all the numerical points in the focal value curve.
43. The apparatus of claim 42, wherein the ratio of the number of numerical points whose focal values are greater than the first threshold to the number of all numerical points in the focal value curve is inversely related to the score of the first ratio.
44. The apparatus of claim 38, wherein in determining the score corresponding to the first filter by the second ratio, the processor is specifically configured to:
determining the lowest value of the focal value curve, and counting the number of numerical value points of which the focal value is smaller than a second threshold value, wherein the second threshold value is the product of the lowest value and a second proportional coefficient;
and determining the score of the second ratio corresponding to the first filter according to the ratio of the number of the numerical points of which the focal value is smaller than the second threshold value to the number of all the numerical points in the focal value curve.
45. The apparatus of claim 44, wherein the ratio of the number of numerical points whose focal values are smaller than the second threshold to the number of all numerical points in the focal value curve is positively correlated with the score of the second ratio.
46. The apparatus of claim 38, wherein in determining the score corresponding to the first filter by the highest-value opening size, the processor is specifically configured to:
determining the curvature of the focal value curve at its highest value;
and determining the score of the highest-value opening size corresponding to the first filter according to the curvature.
47. The apparatus of claim 46, wherein the curvature is positively correlated with the score of the highest-value opening size.
48. The apparatus of claim 38, wherein in determining the score corresponding to the first filter by curve monotonicity, the processor is specifically configured to:
counting the number of monotonically increasing intervals and/or the number of monotonically decreasing intervals in the focal value curve;
and determining the score of curve monotonicity corresponding to the first filter according to the number of the monotonically increasing intervals and/or the number of the monotonically decreasing intervals.
49. The apparatus according to claim 48, wherein the number of monotonically increasing intervals and/or monotonically decreasing intervals is inversely related to the score of the curve monotonicity.
50. The apparatus of claim 29, wherein the scene is any one of the following: a normal-brightness scene, a strong-light scene, a low-light scene, a point-light-source scene, or a portrait scene.
51. The apparatus of claim 50, wherein the processor is further configured to:
acquiring a plurality of frames of second images, wherein the plurality of frames of second images are obtained by shooting a second scene conforming to the scene;
and filtering the plurality of frames of second images through the first filter to obtain a filtering result corresponding to each frame of second image.
52. The apparatus according to claim 51, wherein when evaluating the first filter according to the filtering result corresponding to each frame image, the processor is specifically configured to:
and evaluating the first filter according to the filtering result corresponding to the image to be processed and the filtering result corresponding to the second image.
53. The apparatus of claim 29, wherein the processor is further configured to:
determining a scene corresponding to the image to be processed.
54. The apparatus according to claim 53, wherein when determining a scene corresponding to the image according to the image to be processed, the processor is specifically configured to:
determining ambient brightness information according to the brightness information corresponding to the image to be processed and the exposure parameters used when the image to be processed was shot;
and determining the scene corresponding to the image to be processed according to the ambient brightness information.
55. The apparatus according to claim 53, wherein when determining a scene corresponding to the image according to the image to be processed, the processor is specifically configured to:
detecting brightness information corresponding to a foreground object and brightness information corresponding to a background object in the image to be processed;
and determining the scene corresponding to the image to be processed according to the difference or ratio between the brightness information corresponding to the foreground object and the brightness information corresponding to the background object.
56. The apparatus according to claim 53, wherein when determining a scene corresponding to the image according to the image to be processed, the processor is specifically configured to:
and determining a scene corresponding to the image to be processed by detecting whether a preset object exists in the image to be processed.
57. An image processing apparatus characterized by comprising:
the device comprises an acquisition circuit, a processing circuit and a processing circuit, wherein the acquisition circuit is used for acquiring multiple frames of images to be processed, the multiple frames of images to be processed comprise multiple frames of images shot in the moving process of a lens, the multiple frames of images to be processed are multiple frames of images corresponding to a scene, and the multiple frames of images corresponding to the scene are multiple frames of images obtained by shooting a scene conforming to the scene;
the filter circuit is used for filtering the plurality of frames of images to be processed through a first filter to be analyzed to obtain a filtering result corresponding to each frame of image;
the evaluation circuit is used for evaluating the first filter according to the filtering result corresponding to each frame of image;
and the determining circuit is used for determining whether the first filter is a filter suitable for the scene according to the evaluation result corresponding to the first filter.
58. The apparatus according to claim 57, wherein, when evaluating the first filter according to the filtering result corresponding to each frame of image, the evaluation circuit is specifically configured to:
and determining the corresponding score of the first filter according to the filtering result corresponding to each frame of image.
59. The apparatus according to claim 58, wherein when determining the score corresponding to the first filter according to the filtering result corresponding to each frame of image, the evaluation circuit is specifically configured to:
determining a focal value corresponding to each frame of image according to a filtering result corresponding to each frame of image;
and determining the corresponding score of the first filter according to the focal value corresponding to each frame of image.
60. The apparatus according to claim 59, wherein, when determining the score corresponding to the first filter according to the focal value corresponding to each frame of image, the evaluation circuit is specifically configured to:
determining a focal value curve according to the focal value corresponding to each frame of image;
and determining the score corresponding to the first filter according to the focal value curve.
61. The apparatus according to claim 60, wherein, when determining the score corresponding to the first filter according to the focal value curve, the evaluation circuit is specifically configured to:
determining the score corresponding to the first filter according to at least one of the following properties of the focal value curve:
the contrast of the curve, a first ratio, a second ratio, the size of the maximum opening of the curve, and the monotonicity of the curve;
wherein the first ratio is the proportion of data points whose focal values are higher than a first threshold, and the second ratio is the proportion of data points whose focal values are lower than a second threshold.
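A sketch of the evaluation pipeline of claims 58–61: compute a focal value per frame from the filter response, then score the filter from the resulting focal-value curve. The focus measure (sum of absolute responses), the relative thresholds, and the way the criteria combine into one score are all assumptions of this sketch; the maximum-opening and monotonicity criteria are omitted for brevity:

```python
import numpy as np

def focal_value(image, kernel):
    """Focal value of one frame: total energy of the filter response
    (sum of absolute values), a common contrast-based focus measure."""
    kh, kw = kernel.shape
    windows = np.lib.stride_tricks.sliding_window_view(image, (kh, kw))
    response = np.einsum('ijkl,kl->ij', windows, kernel)  # valid 2-D correlation
    return float(np.abs(response).sum())

def score_curve(focal_values, hi_frac=0.8, lo_frac=0.2):
    """Score a filter from its focal-value curve using the curve's
    contrast, the first ratio (share of high-value points) and the
    second ratio (share of low-value points)."""
    fv = np.asarray(focal_values, dtype=float)
    span = fv.max() - fv.min()
    contrast = fv.max() / max(fv.mean(), 1e-9)   # "contrast" of the curve
    t_hi = fv.min() + hi_frac * span             # first threshold (relative; assumed)
    t_lo = fv.min() + lo_frac * span             # second threshold (relative; assumed)
    first_ratio = float(np.mean(fv > t_hi))
    second_ratio = float(np.mean(fv < t_lo))
    # A filter suited to the scene should produce a sharp, narrow peak as the
    # lens moves: few points near the maximum, many points near the baseline.
    return contrast * (1.0 - first_ratio) * second_ratio
```

Under this scoring, a sharply peaked focal-value curve outscores a nearly flat one, matching the intuition that a good focusing filter yields a pronounced in-focus peak across the lens sweep.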
62. A photographing apparatus, characterized by comprising: the image processing device of any one of claims 29-56.
63. A photographing apparatus, characterized by comprising: the image processing device of any one of claims 57-61.
64. A movable platform, comprising: the photographing apparatus of claim 62.
65. A movable platform, comprising: the photographing apparatus of claim 63.
66. A computer-readable storage medium, characterized in that program instructions for implementing the image processing method according to any one of claims 1 to 28 are stored in the computer-readable storage medium.
CN202080004975.5A 2020-04-28 2020-04-28 Image processing method, image processing apparatus, photographing device, movable platform and storage medium Pending CN112689853A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/087530 WO2021217427A1 (en) 2020-04-28 2020-04-28 Image processing method and apparatus, photographing device, movable platform, and storage medium

Publications (1)

Publication Number Publication Date
CN112689853A true CN112689853A (en) 2021-04-20

Family

ID=75457690

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080004975.5A Pending CN112689853A (en) 2020-04-28 2020-04-28 Image processing method, image processing apparatus, photographing device, movable platform and storage medium

Country Status (2)

Country Link
CN (1) CN112689853A (en)
WO (1) WO2021217427A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113395454A (en) * 2021-07-06 2021-09-14 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Anti-shake method and device for image shooting, terminal and readable storage medium

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114827461A (en) * 2022-04-15 2022-07-29 Vivo Mobile Communication (Hangzhou) Co., Ltd. Shooting focusing method and device and electronic equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1829288A (en) * 2005-03-01 2006-09-06 Ricoh Co., Ltd. Image pick-up device and image pick-up method
JP2008109230A (en) * 2006-10-23 2008-05-08 Kyocera Corp Imaging apparatus and image processing method
CN101310205A (en) * 2006-01-17 2008-11-19 Sony Corporation Focus control device and imaging device
JP2010020081A (en) * 2008-07-10 2010-01-28 Ricoh Co Ltd Image capturing apparatus and method, and control program and storage medium
CN105593737A (en) * 2013-10-02 2016-05-18 Olympus Corporation Focal point detection device and focal point detection method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007102061A (en) * 2005-10-07 2007-04-19 Olympus Corp. Imaging apparatus

Also Published As

Publication number Publication date
WO2021217427A1 (en) 2021-11-04

Similar Documents

Publication Publication Date Title
US10326927B2 (en) Distance information producing apparatus, image capturing apparatus, distance information producing method and storage medium storing distance information producing program
US8233078B2 (en) Auto focus speed enhancement using object recognition and resolution
US8023000B2 (en) Image pickup apparatus, image processing apparatus, image pickup method, and image processing method
KR101412752B1 (en) Apparatus and method for digital auto-focus
KR101399012B1 (en) apparatus and method for restoring image
CN105578048B (en) A kind of quick focusing method and device, mobile terminal
CN103685861A (en) System and method for utilizing enhanced scene detection in a depth estimation procedure
JP4957134B2 (en) Distance measuring device
WO2012132486A1 (en) Imaging device, imaging method, program, and program storage medium
JP5635844B2 (en) Focus adjustment apparatus and imaging apparatus
WO2010088079A2 (en) Automatic focusing apparatus and method for digital images using automatic filter switching
CN105635565A (en) Shooting method and equipment
US20120307009A1 (en) Method and apparatus for generating image with shallow depth of field
TW201103316A (en) Two-dimensional polynomial model for depth estimation based on two-picture matching
CN110324536B (en) Image change automatic sensing focusing method for microscope camera
JP2008271241A (en) Imaging apparatus, image processing apparatus, imaging method, and image processing method
KR100897768B1 (en) An automatic focusing method and apparatus for using the same
CN108156369B (en) Image processing method and device
WO2009012364A1 (en) Device and method for estimating if an image is blurred
WO2007086378A1 (en) Best-focus detector
CN112689853A (en) Image processing method, image processing apparatus, photographing device, movable platform and storage medium
JP2013042371A (en) Image pickup device and distance information acquisition method
JP5144724B2 (en) Imaging apparatus, image processing apparatus, imaging method, and image processing method
KR20170101532A (en) Method for image fusion, Computer program for the same, and Recording medium storing computer program for the same
JP4255186B2 (en) Focusing device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination