CN113129400A - Image processing method, image processing device, electronic equipment and readable storage medium - Google Patents

Image processing method, image processing device, electronic equipment and readable storage medium


Publication number
CN113129400A
Authority
CN
China
Prior art keywords: pixel point, black, color, view, white
Prior art date
Legal status
Granted
Application number
CN202110286693.5A
Other languages
Chinese (zh)
Other versions
CN113129400B (en)
Inventor
倪攀
张华琪
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202110286693.5A
Publication of CN113129400A
Application granted
Publication of CN113129400B
Legal status: Active


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/001 Texturing; Colouring; Generation of texture or colour

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Color Television Image Signal Generators (AREA)

Abstract

The application discloses an image processing method, an image processing device, electronic equipment and a readable storage medium, belonging to the field of communication technologies, and can solve the problem that the coloring result of a black-and-white image has a color difference from a color image. The method comprises the following steps: acquiring a binocular image pair for a target shooting object; calculating a first candidate color value of each pixel point in the second black-and-white view based on the color values of the pixel points in the first color view; calculating a second candidate color value of a shielding pixel point in the second black-and-white view based on the first candidate color value of the target pixel point in the second black-and-white view; and coloring the second black-and-white view according to the second candidate color value of the shielding pixel point and the first candidate color values of the other pixel points in the second black-and-white view to obtain the target image. The method and the device are suitable for scenes in which the electronic equipment needs to optimize an image after shooting it.

Description

Image processing method, image processing device, electronic equipment and readable storage medium
Technical Field
The present application belongs to the field of communication technologies, and in particular, to an image processing method, an image processing apparatus, an electronic device, and a readable storage medium.
Background
Generally, a color camera of an electronic device first acquires a Bayer-array image through a color filter array, then demosaics the Bayer-array image, and finally obtains a color image. The color filter array is an array of photoelectric sensors in the color camera covered by filters of three colors arranged alternately. Because of this filtering structure, the photoelectric sensors cannot use all of the incident light for imaging, so the brightness of the color image acquired by the electronic device is low. In contrast to a color camera, a black-and-white camera does not require filters when acquiring a black-and-white image, and therefore uses almost all of the incident light for imaging. That is, a black-and-white image has a richer brightness gradation and richer texture details than a color image.
In the related art, for a group of related black-and-white and color image pairs acquired by a black-and-white/color binocular camera system, feature matching can be performed on the image pair based on a black-and-white image coloring algorithm, and the colors are then migrated from the color image to the black-and-white image according to the matching result. The black-and-white image and the color image are thus fused, and a color image with a higher signal-to-noise ratio and richer details is obtained through image fusion.
However, when the black-and-white image is colored in the above manner, the texture of the black-and-white image is not completely identical to that of the color image and the two views differ visually, so the feature-matching result between the black-and-white image and the color image is inaccurate, and the coloring result of the black-and-white image may therefore show a color difference from the color image.
Disclosure of Invention
An object of the embodiments of the present application is to provide an image processing method, an image processing apparatus, an electronic device, and a readable storage medium, which can solve the problem that a coloring result of a black-and-white image has a color difference from a color image.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides an image processing method. The method comprises the following steps:
acquiring a binocular image pair for a target shooting object, wherein the binocular image pair includes: a first color view and a second black-and-white view;
calculating a first candidate color value of each pixel point in a second black-and-white view based on the color values of the pixel points in the first color view;
calculating a second candidate color value of a shielding pixel point in the second black-and-white view based on the first candidate color value of the target pixel point in the second black-and-white view; the shielding pixel point is a pixel point in the second black-and-white view that has a parallax loss with respect to the first color view (i.e., an occluded pixel point); the target pixel point is a pixel point whose brightness value matches that of the shielding pixel point;
and coloring the second black-and-white view according to the second candidate color value of the shielding pixel point and the first candidate color values of other pixel points except the shielding pixel point in the second black-and-white view to obtain the target image.
In a second aspect, an embodiment of the present application provides an image processing apparatus. The device includes: the device comprises an acquisition unit, a first calculation unit, a second calculation unit and a coloring unit;
the acquisition unit is used for acquiring a binocular image pair aiming at a target shooting object; the binocular image pair includes: a first color view and a second black and white view;
the first calculating unit is configured to calculate a first candidate color value of each pixel point in the second monochrome view based on a color value of the pixel point in the first color view;
the second calculating unit is configured to calculate a second candidate color value of a blocking pixel in the second black-and-white view based on the first candidate color value of the target pixel in the second black-and-white view; the shielding pixel points are pixel points in the second black-and-white view, which have parallax loss with the first color view; the target pixel point is a pixel point matched with the brightness value of the shielding pixel point;
the coloring unit is used for coloring the second black-and-white view according to the second candidate color value of the shielding pixel point and the first candidate color values of other pixel points except the shielding pixel point in the second black-and-white view to obtain the target image.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the method as provided in the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium on which a program or instructions are stored, which when executed by a processor implement the steps of the method as provided in the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method as provided in the first aspect.
In a sixth aspect, the present application provides a computer program product stored in a non-volatile storage medium, the program product being executed by at least one processor to implement the method as provided in the first aspect.
In the embodiment of the application, a binocular image pair (including a first color view and a second black-and-white view) for a target shooting object is first acquired; then a first candidate color value of each pixel point in the second black-and-white view is calculated based on the color values of the pixel points in the first color view; a second candidate color value of each shielding pixel point in the second black-and-white view is calculated based on the first candidate color value of its target pixel point; and finally the second black-and-white view is colored according to the second candidate color values of the shielding pixel points and the first candidate color values of the other pixel points in the second black-and-white view, so as to obtain the target image. By taking the second black-and-white view as the base image and the first color view as the guide image, the target image not only keeps the texture details of the second black-and-white view but also obtains the color appearance of the first color view. Meanwhile, no parallax optimization needs to be performed on the first color view and the second black-and-white view before the second black-and-white view is colored, so the loss of texture details caused by parallax optimization is avoided and a target image with higher definition and a better appearance is obtained. Furthermore, because the first candidate color value of a shielding pixel point may deviate considerably from its actual color value, its second candidate color value is recalculated from the corresponding target pixel points in the second black-and-white view, so that the second candidate color value is closer to the actual color of the target shooting object, and the target image colored according to the first and second candidate color values is more consistent with the actual colors of the target shooting object.
Drawings
Fig. 1 is a schematic diagram of an image processing method according to an embodiment of the present application;
fig. 2 is a schematic diagram of a method for calculating a first candidate color value according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram of another method for calculating a first candidate color value according to an embodiment of the present disclosure;
fig. 4 is a schematic diagram of a method for calculating a second candidate color value according to an embodiment of the present disclosure;
fig. 5 is a schematic diagram of a method for determining a target pixel point according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
fig. 7 is a hardware schematic diagram of an electronic device according to an embodiment of the present disclosure;
fig. 8 is a second hardware schematic diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second" and the like in the description and claims of the present application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances, such that the embodiments of the application may be practiced in sequences other than those illustrated or described herein. The terms "first", "second" and the like are generally used in a generic sense and do not limit the number of objects; for example, the first object can be one or more than one. In addition, "and/or" in the specification and claims means at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the preceding and succeeding objects.
A black-and-white image has richer brightness details and texture details than a color image. For a group of related black-and-white and color image pairs of a shooting object acquired simultaneously by a black-and-white/color binocular camera system, the black-and-white image and the color image can be fused by means of feature matching to obtain a fused image with richer colors and details. However, as described above, because the two views are not identical in texture, the matching result is inaccurate, and the coloring result of the black-and-white image may show a color difference from the color image.
To solve the above problem, an embodiment of the present application provides an image processing method as shown in fig. 1. The method may include steps 101 to 104 described below. The method is exemplified below by taking the execution subject as an image processing apparatus.
Step 101: the image processing apparatus acquires a binocular image pair for a target photographic subject.
In an embodiment of the present application, the binocular image pair includes a first color view and a second black-and-white view. In one acquisition process for a target shooting object, the black-and-white camera and the color camera simultaneously shoot the target shooting object captured by the image acquisition lens, obtaining a first color view and a second black-and-white view of the target shooting object respectively.
It can be understood that, in one acquisition process, the first color view and the second black-and-white view may be shot at exactly the same time, or with a slight time difference on the order of nanoseconds, microseconds, or milliseconds. Comparing the first color view and the second black-and-white view, the captured content of the target photographic object is substantially the same, so the first color view can be used as a guide view of the second black-and-white view to provide a coloring basis.
It should be noted that, because the black-and-white camera and the color camera are fixedly installed at preset positions in the image capture device, a specific angular relationship exists between the black-and-white camera and the color camera with respect to the shooting object. Therefore, during shooting, the first color view and the second black-and-white view are captured by the color camera and the black-and-white camera at the parallax angle corresponding to this angular relationship. That is, one shooting process yields a pair of images consisting of a first color view and a second black-and-white view, obtained by photographing the same photographic subject at a parallax angle.
In an embodiment of the application, the first color view may comprise at least one of: YUV images, color infrared images.
Step 102: the image processing device calculates a first candidate color value of each pixel point in the second black-and-white view based on the color values of the pixel points in the first color view.
In the embodiment of the application, the first color view is used as a guide view to guide the calculation of the first candidate color value of each pixel point in the second black-and-white view, where the first candidate color value represents the color value of the pixel point in the colored second black-and-white view.
Illustratively, for each pixel point in the second black-and-white view, a corresponding pixel point in the first color view is matched in a block-matching manner; a color value for each pixel point in the second black-and-white view is acquired from its corresponding pixel point in the first color view; and the first candidate color value is then calculated from the acquired color values according to a preset algorithm. The preset algorithm includes at least one of: a weighted average algorithm, a mean algorithm, and a random selection algorithm.
Optionally, in this embodiment of the present application, before step 102, the image processing method provided in this embodiment of the present application may further include: denoising each color channel of the first color view according to a DCT frequency-domain denoising algorithm, and updating the first color view. The denoising improves the color detail of the first color view.
Illustratively, the first color view is denoted as C ∈ R^(H×W×3), where R denotes the real numbers, H is the pixel height, W is the pixel width, and 3 denotes the three color channels. The color values in each color channel are extracted and denoised separately, the denoised color values are recombined according to the color channels to which they belong to generate a new first color view, and the first color view that has not been denoised is updated to this new first color view.
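As a rough illustration of this optional denoising step, the sketch below applies an independent DCT-domain hard threshold to each of the three color channels. The use of Python/NumPy, the scipy.fft routines, and the threshold value are illustrative assumptions; the embodiment does not specify a particular DCT denoising rule.

```python
import numpy as np
from scipy.fft import dctn, idctn

def denoise_color_view(color_view: np.ndarray, threshold: float = 10.0) -> np.ndarray:
    """Denoise each channel of an H x W x 3 color view in the DCT domain.

    A hard threshold on DCT coefficients stands in for the DCT frequency-domain
    denoising described in the text; the threshold value is illustrative only.
    """
    denoised = np.empty_like(color_view, dtype=np.float64)
    for c in range(color_view.shape[2]):          # process the three channels separately
        coeffs = dctn(color_view[:, :, c].astype(np.float64), norm='ortho')
        coeffs[np.abs(coeffs) < threshold] = 0.0  # suppress small (noise-dominated) coefficients
        denoised[:, :, c] = idctn(coeffs, norm='ortho')
    return denoised
```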
Step 103: and the image processing device calculates a second candidate color value of the shielding pixel point in the second black-and-white view based on the first candidate color value of the target pixel point in the second black-and-white view.
In the embodiment of the present application, the shielding pixel is a pixel in the second black-and-white view that has a parallax loss with the first color view; the target pixel point is a pixel point matched with the brightness value of the shielding pixel point.
Illustratively, in the process of matching each pixel point in the second black-and-white view to a corresponding pixel point position in the first color view, if the confidence of the matching result is low, the pixel point in the second black-and-white view involved in that low-confidence correspondence is determined to be a shielding pixel point.
Illustratively, on the assumption that adjacent pixel points with similar brightness have similar colors, the shielding pixel point is used as a seed point, and among the pixel points whose distance from the seed point lies within a preset distance range, a pixel point whose brightness is similar to that of the shielding pixel point is selected as the target pixel point.
In the embodiment of the present application, similar to the first candidate color value, the second candidate color value is used to characterize a color value of a blocking pixel point in the colored second black-and-white view. For example, the image processing apparatus may calculate the second candidate color value of the target pixel point according to a preset algorithm according to the color value corresponding to the target pixel point. The preset algorithm includes at least one of: a weighted average algorithm, a mean algorithm, and a random selection algorithm.
In an example, the image processing apparatus may obtain a color value corresponding to the target pixel point according to that each pixel point in the second black-and-white view matches a corresponding pixel point position in the first color view.
In another example, the image processing apparatus may determine the first candidate color value of the target pixel point in the second black-and-white view as the color value corresponding to the target pixel point.
Step 104: and the image processing device colors the second black-and-white view according to the second candidate color value of the shielding pixel point and the first candidate color values of other pixel points except the shielding pixel point in the second black-and-white view to obtain the target image.
In the embodiment of the application, if the pixel point in the second black-and-white view has the corresponding second candidate color value, the second candidate color value is adopted, and if the pixel point in the second black-and-white view does not have the corresponding second candidate color value, the first candidate color value is adopted, and the second black-and-white view is colored on the basis, so that the target image is obtained.
Exemplarily, the second black-and-white view is colored by assigning the first candidate color value or the second candidate color value to each pixel point in the second black-and-white view according to a preset color mode.
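A minimal sketch of this per-pixel selection, assuming the candidate color values are stored as H × W × 2 arrays of U/V components and that an H × W boolean mask marks the shielding pixel points (all names and shapes are illustrative assumptions):

```python
import numpy as np

def colorize(first_candidate: np.ndarray,      # H x W x 2, U/V values for every pixel point
             second_candidate: np.ndarray,     # H x W x 2, U/V values for shielding pixel points
             occlusion_mask: np.ndarray) -> np.ndarray:  # H x W bool, True for shielding pixels
    """Per-pixel selection of the color value used to color the second black-and-white view:
    use the second candidate value where one exists, the first candidate value otherwise."""
    return np.where(occlusion_mask[..., None], second_candidate, first_candidate)
```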
In the image processing method provided by the embodiment of the application, a binocular image pair (including a first color view and a second black-and-white view) for a target shooting object is first acquired; then a first candidate color value of each pixel point in the second black-and-white view is calculated based on the color values of the pixel points in the first color view; a second candidate color value of each shielding pixel point in the second black-and-white view is calculated based on the first candidate color value of its target pixel point; and finally the second black-and-white view is colored according to the second candidate color values of the shielding pixel points and the first candidate color values of the other pixel points in the second black-and-white view, so as to obtain the target image. By taking the second black-and-white view as the base image and the first color view as the guide image, the target image not only keeps the texture details of the second black-and-white view but also obtains the color appearance of the first color view. Meanwhile, no parallax optimization needs to be performed on the first color view and the second black-and-white view before the second black-and-white view is colored, so the loss of texture details caused by parallax optimization is avoided and a target image with higher definition and a better appearance is obtained. Furthermore, because the first candidate color value of a shielding pixel point may deviate considerably from its actual color value, its second candidate color value is recalculated from the corresponding target pixel points in the second black-and-white view, so that the second candidate color value is closer to the actual color of the target shooting object, and the target image colored according to the first and second candidate color values is more consistent with the actual colors of the target shooting object.
Optionally, in this embodiment of the application, the image processing apparatus may obtain, in a block matching manner, a color value corresponding to each pixel point in the second black-and-white view, and then calculate the first candidate color value according to the obtained color value and a preset algorithm.
Illustratively, as shown in fig. 2, the step 102 may include the following steps 201 to 204:
step 201: and the image processing device performs block processing on the second black-and-white view to obtain N overlapped first image blocks.
It should be noted that the image processing apparatus performs the overlapping block processing on the second black-and-white view according to the resolution, the image block size and the overlapping ratio of the second black-and-white view, so as to obtain N first image blocks. The first image blocks adjacent to each other in the second black-and-white view include partially identical pixel points, that is, the different first image blocks are overlapped with each other. It can be understood that overlapping means that different first image blocks may include the same pixel points.
Exemplarily, an arbitrary position in the second black-and-white view is taken as a first center point, and the first image block corresponding to the first center point is obtained according to a preset image block size (e.g., S × S). Then, keeping the height value of the first center point, first image blocks are selected sequentially along the horizontal direction, with the center width values differing by S − a from one block to the next. Next, for each second center point (the center of a first image block in the same row as the first center point), first image blocks are selected sequentially along the vertical direction, with the center height values differing by S − a from one block to the next. It should be noted that adjacent first image blocks overlap by a pixels.
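A sketch of this overlapping blocking step, assuming a block size of S × S and an overlap of a pixels between adjacent blocks (i.e. a stride of S − a, with a < S); the helper name and the handling of image borders are illustrative assumptions:

```python
import numpy as np

def extract_overlapping_blocks(view: np.ndarray, S: int, a: int):
    """Split an H x W view into S x S blocks whose horizontal/vertical neighbours
    overlap by a pixels (stride S - a). Returns the blocks and their top-left corners."""
    H, W = view.shape[:2]
    stride = S - a                           # assumes a < S
    blocks, positions = [], []
    for top in range(0, max(H - S, 0) + 1, stride):
        for left in range(0, max(W - S, 0) + 1, stride):
            blocks.append(view[top:top + S, left:left + S])
            positions.append((top, left))
    return blocks, positions
```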
Step 202: the image processing means determines a color image block corresponding to each first image block in the first color view.
In this embodiment of the present application, since there is only one luminance component in the second black-and-white view, in order to facilitate calculation of the first candidate color value of each pixel point in the second black-and-white view, the first color view adopts the YUV color mode, and the first candidate color value is represented by the U and V components of the YUV color mode.
For any one of the N first image blocks, candidate color image blocks are selected in the first color view according to a neighborhood window, the Euclidean distance between the luminance values of each candidate color image block and the first image block is calculated, and the color image block with the minimum Euclidean distance is determined as the color image block corresponding to that first image block. In this manner, a color image block corresponding to each first image block is determined.
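A sketch of this matching step, assuming for illustration that the neighborhood window is a horizontal search range along the camera baseline and that matching is performed on the luminance (Y) channel of the first color view; the search radius is an arbitrary illustrative value:

```python
import numpy as np

def match_color_block(bw_block: np.ndarray, color_luma: np.ndarray,
                      top: int, left: int, S: int, search: int = 32):
    """Find, inside a horizontal search window, the S x S luminance block of the first
    color view closest (minimum Euclidean distance) to one black-and-white image block."""
    H, W = color_luma.shape
    best_pos, best_dist = None, np.inf
    for dx in range(-search, search + 1):          # search along the assumed baseline direction
        l = left + dx
        if l < 0 or l + S > W:
            continue
        cand = color_luma[top:top + S, l:l + S]
        dist = np.linalg.norm(cand.astype(np.float64) - bw_block.astype(np.float64))
        if dist < best_dist:
            best_dist, best_pos = dist, (top, l)
    return best_pos, best_dist
```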
Step 203: and the image processing device respectively acquires the third candidate color values of the pixel points in the corresponding first image blocks based on the color values of the pixel points in each color image block.
In the embodiment of the present application, for any first image block among the N first image blocks and the color image block corresponding to it, the third candidate color values of the pixel points in that first image block are obtained according to the correspondence between the first image block and the color image block.
It can be understood that, because the first image blocks overlap one another, a pixel point in the second black-and-white view may correspond to one or more third candidate color values.
Illustratively, assuming that color values of pixels in a color image block are (132,164,205; 83,47, 59; 108,31,27,45), then the third candidate color value of the first image block corresponding to the color image block is (132,164,205; 83,47, 59; 108,31,27, 45).
Step 204: and the image processing device determines the first candidate color value of each pixel point in the second black-and-white view based on the third candidate color value of the pixel point in each first image block.
In the embodiment of the application, the single first candidate color value of each pixel point in the second black-and-white view is determined from its one or more third candidate color values through a weighted average algorithm.
For example, assuming the color mode of the first color view is the YUV mode, the third candidate color values may be recorded in the three-dimensional tensors U ∈ R^(H×W×K) and V ∈ R^(H×W×K), where K is the number of third candidate color values corresponding to the pixel point at height H and width W in the second black-and-white view.
Therefore, the first candidate color value of each pixel point in the second black-and-white view is determined through block matching and a weighted average algorithm, and first candidate color values with a better color appearance can be obtained without performing parallax optimization on the first color view and the second black-and-white view.
Further optionally, as shown in fig. 3, in this embodiment of the application, the step 102 may include steps 301 to 303:
step 301: and the image processing device determines K color image blocks matched with the sampling pixel points aiming at the sampling pixel points in the second black-and-white view.
Step 302: and the image processing device determines K third candidate color values corresponding to the sampling pixel points.
Step 303: and the image processing device carries out weighted average on the K third candidate color values to obtain the first candidate color value of the sampling pixel point.
In the embodiment of the present application, each of the K third candidate color values corresponds to one of the K color image blocks. It should be noted that, on the basis of the block partitioning of the second black-and-white view in steps 201 and 202 shown in fig. 2, each third candidate color value is obtained from its corresponding color image block, and the first candidate color value of each sampling pixel point in the second black-and-white view is then determined from these third candidate color values.
Illustratively, for a sampling pixel point in the second black-and-white view, the color image blocks matched with it are determined as follows. The sampling pixel point can fall into S² different color image blocks (assuming an image block size of S × S), and M color image blocks are extracted from these S² blocks by a preset sampling strategy. Based on the just-noticeable-difference (JND) threshold theory, the number T of pixel points in a color image block whose luminance values match that of the sampling pixel point is calculated; if T is greater than a preset threshold, the sampling pixel point is matched with that color image block. Among the M color image blocks, K color image blocks are matched with the sampling pixel point; the closer the ratio of K to M is to 1, the higher the confidence of the third candidate colors obtained through block matching.
It will be appreciated that the preset sampling strategy needs to keep the sampling rate as low as possible (i.e. M as small as possible) while keeping the ratio of K to M as close to 1 as possible (i.e. the confidence as high as possible), so that the third candidate colors are as dense as possible.
In this way, by determining K third candidate color values through the preset sampling strategy, the determination speed of the third candidate colors of the first color view and the second black-and-white view can be improved.
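A sketch of the sampling-and-matching test described above, where for simplicity the JND threshold is modeled as a fixed luminance tolerance and the M blocks are drawn uniformly at random; the actual sampling strategy, JND model, and threshold values are not fixed by the embodiment and are illustrative assumptions here:

```python
import numpy as np

def count_matched_blocks(pixel_luma: float, candidate_blocks, m: int,
                         jnd: float = 3.0, t_threshold: int = 8, rng=None):
    """From the candidate color blocks covering one sampling pixel point, draw M blocks
    and count how many of them (K) contain more than t_threshold pixels whose luminance
    lies within a JND-like tolerance of the sampling pixel's luminance."""
    rng = rng or np.random.default_rng(0)
    idx = rng.choice(len(candidate_blocks), size=min(m, len(candidate_blocks)), replace=False)
    k = 0
    for i in idx:
        block = candidate_blocks[i].astype(np.float64)
        t = int(np.sum(np.abs(block - pixel_luma) <= jnd))   # matched pixels in this block
        if t > t_threshold:
            k += 1
    return k, len(idx)   # confidence of the third candidate colors grows as K / M approaches 1
```

A pixel for which k is zero has no matched color image block at all, which is the condition later used to mark shielding pixel points.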
In the embodiment of the application, each sampling pixel point corresponds to a plurality of third candidate color values; for a single sampling pixel point, a weighted average is used to balance the proportions of the plurality of third candidate color values, and the first candidate color value of each pixel point in the second black-and-white view is calculated accordingly, so that the first candidate color values are more robust.
For example, the image processing device determines a first weight value of each pixel point according to a luminance value set corresponding to each pixel point in the second black-and-white view, and determines a first candidate color value of each pixel point in the second black-and-white view according to the first weight value and the third candidate color value of each pixel point. Wherein, the corresponding brightness value set of any pixel point in the second black and white view includes: the brightness value of any pixel point in the second black-and-white view, and the brightness value of a color pixel point matched with any pixel point in each color image block corresponding to each first image block to which any pixel point belongs.
In the embodiment of the present application, the first weight value of each pixel point is determined from the deviation between the two luminance values in its luminance value set, normalized over the candidate matches of that pixel point.
Illustratively, the brightness value of a pixel point in the second black-and-white view is denoted M(H, W), the brightness value of its k-th corresponding pixel point in the first color view is denoted Y(H, W, k), ε is a fault-tolerance parameter, and k ranges over the third candidate color values corresponding to the pixel point at height H and width W in the second black-and-white view. ε is a small value that prevents the denominator from becoming 0 during the calculation.
Further, the image processing apparatus may calculate the luminance difference coefficient between any pixel point in the second black-and-white view and its corresponding pixel point in the first color view based on formula 1 below, and then calculate the first weight value based on formula 2 below. Formula 1 and formula 2 are:
d(H, W, k) = (M(H, W) − Y(H, W, k))²    (formula 1)

w(H, W, k) = [1 / (d(H, W, k) + ε)] / Σ_k [1 / (d(H, W, k) + ε)]    (formula 2)
it can be understood that the image processing apparatus may calculate the channel color values of the color channels of each pixel point respectively according to the first weight value. For example, the third candidate color value may be represented using the U, V component in the YUV color mode.
For example, on the basis of the above example, the image processing apparatus may calculate the color value of the U channel based on formula 3 below and the color value of the V channel based on formula 4 below, and then combine the brightness value of each pixel point in the second black-and-white view with the U-channel and V-channel color values calculated by formulas 3 and 4. Formula 3 and formula 4 are:
U′(H, W) = Σ_k w(H, W, k) · U(H, W, k)    (formula 3)

V′(H, W) = Σ_k w(H, W, k) · V(H, W, k)    (formula 4)
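A sketch of the candidate-color fusion expressed by formulas 1 to 4 above, assuming the inverse-difference weighting reading of those formulas; the array shapes and the value of ε are illustrative:

```python
import numpy as np

def fuse_candidate_colors(M: np.ndarray, Y: np.ndarray, U: np.ndarray, V: np.ndarray,
                          eps: float = 1e-6):
    """Fuse the K candidate U/V values of every pixel point with weights that decay with
    the luminance difference between the black-and-white pixel M(H, W) and its matched
    color pixels Y(H, W, k). Shapes: M is H x W, the other arrays are H x W x K."""
    diff = (Y - M[..., None]) ** 2                 # formula 1: luminance difference coefficient
    inv = 1.0 / (diff + eps)                       # eps keeps the denominator away from zero
    w = inv / np.sum(inv, axis=-1, keepdims=True)  # formula 2: first weight values
    u = np.sum(w * U, axis=-1)                     # formula 3: weighted U channel
    v = np.sum(w * V, axis=-1)                     # formula 4: weighted V channel
    return u, v
```

The inverse-difference weighting gives candidate colors whose matched luminance agrees with the black-and-white pixel a larger share, which is consistent with the robustness argument made for the weighted average above.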
Optionally, in this embodiment of the application, the shielding pixel points are screened out based on the sampling strategy and the just-noticeable-difference (JND) threshold theory, and the second candidate color values of the shielding pixel points are recalculated.
Illustratively, as shown in fig. 4, the step 103 may include the following steps 401 to 404:
step 401: and the image processing device determines the shielding pixel point based on the brightness value of each pixel point in the second black-and-white view.
In this embodiment of the application, the image processing apparatus may determine, according to the parallax angle between the first color view and the second black-and-white view, a partially-shielded region in the second black-and-white image, where a pixel point in the partially-shielded region is a shielded pixel point.
In this embodiment of the application, on the basis of determining the first candidate colors as shown in fig. 3, when the image processing apparatus determines, according to the sampling strategy, the K color image blocks matched with a sampling pixel point, the number T of pixel points in each of the M color image blocks whose luminance values match that of the sampling pixel point is calculated based on the preset JND threshold theory. If T is not greater than the preset threshold, the sampling pixel point is not matched with that color image block; and if the sampling pixel point has no matched color image block at all, the sampling pixel point is a shielding pixel point.
Step 402: and the image processing device carries out blocking processing on the second black-and-white view to obtain X second image blocks which do not overlap with each other.
In the embodiment of the present application, each second image block at least includes one blocking pixel point. The second image blocks are not overlapped with each other. It can be understood that the non-overlapping means that different second image blocks do not include the same pixel points.
Step 403: and the image processing device searches a target pixel point corresponding to the shielding pixel point in the second black-and-white view based on the brightness value of each pixel point in the second image block.
In the embodiment of the application, the brightness value of the target pixel point is similar to that of the shielding pixel point, so that the target pixel point corresponding to the shielding pixel point is searched through the brightness value.
Step 404: and the image processing device calculates a second candidate color value of the shielding pixel point in the second black-and-white view according to the first candidate color value and the second weight value corresponding to the target pixel point.
In the embodiment of the application, when a shielding pixel point corresponds to a plurality of target pixel points, the first candidate color values of those target pixel points are weighted-averaged using the second weight values, and the second candidate color value of the shielding pixel point is thereby calculated.
Exemplarily, similar to the method for calculating the first weight values, a second weight value is calculated for each target pixel point corresponding to the shielding pixel point, and a weighted average of the first candidate colors of the target pixel points is taken according to the second weight values to obtain the second candidate color value of the shielding pixel point.
It can be understood that, for each shielding pixel point, the second weight value of the target pixel point corresponding to the shielding pixel point needs to be calculated respectively.
Therefore, based on the non-overlapping block matching results, whether a pixel point in the second black-and-white view is a shielding pixel point can be judged directly, without consistency detection. Meanwhile, the second candidate color values of the shielding pixel points are calculated from surrounding pixel points with similar brightness, so denser coloring color values are obtained and the color-overflow problem in the coloring process can be effectively reduced.
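A sketch of the recoloring of a single shielding pixel point (step 404), assuming the second weight values are computed from luminance deviations in the same way as the first weight values; all names and shapes are illustrative:

```python
import numpy as np

def recolor_occluded_pixel(occ_luma: float, target_luma: np.ndarray,
                           target_uv: np.ndarray, eps: float = 1e-6):
    """Compute the second candidate U/V value of one shielding pixel point from the first
    candidate values of its target pixel points (same-level, non-occluded neighbours).
    Shapes: target_luma is (T,), target_uv is (T, 2)."""
    diff = (target_luma - occ_luma) ** 2
    w = 1.0 / (diff + eps)
    w = w / np.sum(w)                                 # second weight values, one per target pixel
    return np.sum(w[:, None] * target_uv, axis=0)     # weighted-average U/V value
```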
Further optionally, as shown in fig. 5, in this embodiment of the application, the step 403 may include the following steps 501 to 503:
step 501: the image processing device sorts the brightness values of all the pixel points in each second image block, and calculates the brightness difference between any adjacent pixel points according to the sorting.
Step 502: and if the brightness difference value between any adjacent pixel points is greater than a preset brightness gradient threshold value, the image processing device divides any adjacent pixel point into different brightness levels.
Step 503: and the image processing device determines a target pixel point according to the brightness level of each pixel point in the second black-and-white view.
The target pixel point does not belong to the shielding pixel point, and the brightness levels of the target pixel point and the shielding pixel point are the same.
In the embodiment of the application, in the process of determining the target pixel point corresponding to the shielding pixel point, the target pixel point which is near the shielding pixel point and belongs to the same brightness level as the shielding pixel point can be searched in a neighborhood window mode by taking the shielding pixel point as a center.
Therefore, by dividing the brightness values of the pixel points within the same second image block into local brightness levels, the target pixel points of a shielding pixel point are determined. This densifies the initially calculated first candidate color values and prevents pixel points in larger shielded regions of the second black-and-white view from being left without a reliable color because no corresponding pixel points exist for them in the first color view.
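A sketch of the local brightness-level division and target-pixel search of steps 501 to 503, with an illustrative gradient threshold; the occlusion mask and position arguments are assumptions made for the example:

```python
import numpy as np

def luminance_levels(block_luma: np.ndarray, grad_threshold: float = 8.0) -> np.ndarray:
    """Assign every pixel of a second image block to a local brightness level by sorting
    the luminance values and starting a new level wherever the gap between adjacent
    sorted values exceeds the gradient threshold. Returns an array of level ids."""
    flat = block_luma.ravel().astype(np.float64)
    order = np.argsort(flat)
    levels = np.zeros(flat.size, dtype=np.int64)
    level = 0
    for i in range(1, flat.size):
        if flat[order[i]] - flat[order[i - 1]] > grad_threshold:
            level += 1                      # brightness jump: open a new level
        levels[order[i]] = level
    return levels.reshape(block_luma.shape)

def find_target_pixels(levels: np.ndarray, occlusion_mask: np.ndarray, occ_pos):
    """Target pixel points: non-occluded pixels of the block sharing the shielding pixel's level."""
    same_level = levels == levels[occ_pos]
    return np.argwhere(same_level & ~occlusion_mask)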
It should be noted that, in the image processing method provided in the embodiment of the present application, the execution subject may be an image processing apparatus, or a control module in the image processing apparatus for executing the image processing method. The image processing apparatus provided in the embodiment of the present application is described with an example in which an image processing apparatus executes an image processing method. However, in practical applications, the main body of the image processing method may also be other devices or apparatuses that can execute the image processing method, and this is not limited in this embodiment of the present application.
As shown in fig. 6, an embodiment of the present application provides an image processing apparatus. The image processing apparatus includes: an acquisition unit 61, a first calculation unit 62, a second calculation unit 63, a coloring unit 64;
an acquisition unit 61 configured to acquire a binocular image pair for a target photographic subject; the binocular image pair includes: a first color view and a second black and white view;
the first calculating unit 62 is configured to calculate a first candidate color value of each pixel point in the second black-and-white view based on a color value of the pixel point in the first color view;
the second calculating unit 63 is configured to calculate a second candidate color value of a blocking pixel in the second black-and-white view based on the first candidate color value of the target pixel in the second black-and-white view; the shielding pixel points are pixel points in the second black-and-white view, which have parallax loss with the first color view; the target pixel point is a pixel point matched with the brightness value of the shielding pixel point;
and the coloring unit 64 is configured to color the second black-and-white view according to the second candidate color value of the shielding pixel and the first candidate color values of other pixels in the second black-and-white view except the shielding pixel, so as to obtain the target image.
Optionally, the first calculating unit 62 is specifically configured to:
partitioning the second black-and-white view to obtain N overlapped first image blocks;
determining a color image block corresponding to each first image block in the first color view;
respectively acquiring third candidate color values of the pixel points in the corresponding first image blocks based on the color values of the pixel points in each color image block;
and determining the first candidate color value of each pixel point in the second black-and-white view based on the third candidate color value of the pixel point in each first image block.
Optionally, the first calculating unit 62 is specifically configured to:
determining K color image blocks matched with the sampling pixel points aiming at the sampling pixel points in the second black-and-white view;
determining K third candidate color values corresponding to the sampling pixel points, wherein each third candidate color value corresponds to one of the K color image blocks;
and carrying out weighted average on the K third candidate color values to obtain the first candidate color value of the sampling pixel point.
Optionally, the second calculating unit 63 is specifically configured to:
determining a shielding pixel point based on the brightness value of each pixel point in the second black-and-white view;
partitioning the second black-and-white view to obtain X second image blocks which are not overlapped with each other, wherein each second image block at least comprises a shielding pixel point;
searching a target pixel point corresponding to the shielding pixel point in the second black-and-white view based on the brightness value of each pixel point in the second image block;
and calculating a second candidate color value of the shielding pixel point in the second black-and-white view according to the first candidate color value and the second weight value corresponding to the target pixel point.
Optionally, the second calculating unit 63 is specifically configured to:
sorting the brightness values of all pixel points in each second image block, and calculating the brightness difference between any adjacent pixel points according to the sorting;
if the brightness difference value between any adjacent pixel points is larger than a preset brightness gradient threshold, dividing any adjacent pixel point into different brightness levels;
determining a target pixel point according to the brightness level of each pixel point in the second black-and-white view;
the target pixel point does not belong to the shielding pixel point, and the brightness levels of the target pixel point and the shielding pixel point are the same.
In the image processing apparatus provided by the embodiment of the application, a binocular image pair (including a first color view and a second black-and-white view) for a target shooting object is first acquired; then a first candidate color value of each pixel point in the second black-and-white view is calculated based on the color values of the pixel points in the first color view; a second candidate color value of each shielding pixel point in the second black-and-white view is calculated based on the first candidate color value of its target pixel point; and finally the second black-and-white view is colored according to the second candidate color values of the shielding pixel points and the first candidate color values of the other pixel points in the second black-and-white view, so as to obtain the target image. By taking the second black-and-white view as the base image and the first color view as the guide image, the target image not only keeps the texture details of the second black-and-white view but also obtains the color appearance of the first color view. Meanwhile, no parallax optimization needs to be performed on the first color view and the second black-and-white view before the second black-and-white view is colored, so the loss of texture details caused by parallax optimization is avoided and a target image with higher definition and a better appearance is obtained. Furthermore, because the first candidate color value of a shielding pixel point may deviate considerably from its actual color value, its second candidate color value is recalculated from the corresponding target pixel points in the second black-and-white view, so that the second candidate color value is closer to the actual color of the target shooting object, and the target image colored according to the first and second candidate color values is more consistent with the actual colors of the target shooting object.
The image processing apparatus in the embodiment of the present application may be an apparatus, or may be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The image processing apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android (Android) operating system, an IOS operating system, or other possible operating systems, which is not specifically limited in the embodiments of the present application.
The image processing apparatus provided in the embodiment of the present application can implement each process implemented by the foregoing method embodiment, and is not described here again to avoid repetition.
The beneficial effects of the various implementation manners in this embodiment may specifically refer to the beneficial effects of the corresponding implementation manners in the above method embodiments, and are not described herein again to avoid repetition.
Optionally, as shown in fig. 7, an electronic device 700 is further provided in this embodiment of the present application, and includes a processor 701, a memory 702, and a program or an instruction stored in the memory 702 and executable on the processor 701, where the program or the instruction is executed by the processor 701 to implement each process of the above-mentioned embodiment of the image processing method, and can achieve the same technical effect, and no further description is provided here to avoid repetition.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic device and the non-mobile electronic device described above.
Fig. 8 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 800 includes, but is not limited to: a radio frequency unit 801, a network module 802, an audio output unit 803, an input unit 804, a sensor 805, a display unit 806, a user input unit 807, an interface unit 808, a memory 809, and a processor 810.
Those skilled in the art will appreciate that the electronic device 800 may further comprise a power source (e.g., a battery) for supplying power to the various components, and the power source may be logically connected to the processor 810 via a power management system, so as to manage charging, discharging, and power consumption management functions via the power management system. The electronic device structure shown in fig. 8 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is omitted here.
The processor 810 is configured to obtain a binocular image pair for a target photographic object; the binocular image pair includes: a first color view and a second black and white view; calculating a first candidate color value of each pixel point in a second black-and-white view based on the color values of the pixel points in the first color view; calculating a second candidate color value of a shielding pixel point in the second black-and-white view based on the first candidate color value of the target pixel point in the second black-and-white view; the shielding pixel points are pixel points in the second black-and-white view, which have parallax loss with the first color view; the target pixel point is a pixel point matched with the brightness value of the shielding pixel point; and coloring the second black-and-white view according to the second candidate color value of the shielding pixel point and the first candidate color values of other pixel points except the shielding pixel point in the second black-and-white view to obtain the target image.
The processor 810 is specifically configured to perform blocking processing on the second black-and-white view to obtain N mutually overlapped first image blocks; determining a color image block corresponding to each first image block in the first color view; respectively acquiring third candidate color values of the pixel points in the corresponding first image blocks based on the color values of the pixel points in each color image block; and determining the first candidate color value of each pixel point in the second black-and-white view based on the third candidate color value of the pixel point in each first image block.
The processor 810 is specifically configured to determine, for a sampling pixel point in the second black-and-white view, K color image blocks matched with the sampling pixel point; determining K third candidate color values corresponding to the sampling pixel points, wherein each third candidate color value corresponds to one of the K color image blocks; and carrying out weighted average on the K third candidate color values to obtain the first candidate color value of the sampling pixel point.
The processor 810 is specifically configured to determine a blocking pixel point based on a brightness value of each pixel point in the second black-and-white view; partitioning the second black-and-white view to obtain X second image blocks which are not overlapped with each other, wherein each second image block at least comprises a shielding pixel point; searching a target pixel point corresponding to the shielding pixel point in the second black-and-white view based on the brightness value of each pixel point in the second image block; and calculating a second candidate color value of the shielding pixel point in the second black-and-white view according to the first candidate color value and the second weight value corresponding to the target pixel point.
The processor 810 is further configured to sort the luminance values of the pixels in each second image block, and calculate a luminance difference between any adjacent pixels according to the sorting; if the brightness difference value between any adjacent pixel points is larger than a preset brightness gradient threshold, dividing any adjacent pixel point into different brightness levels; determining a target pixel point according to the brightness level of each pixel point in the second black-and-white view; the target pixel point does not belong to the shielding pixel point, and the brightness levels of the target pixel point and the shielding pixel point are the same.
In the electronic device provided by the embodiment of the application, a binocular image pair (including a first color view and a second black-and-white view) for a target shooting object is first acquired; then a first candidate color value of each pixel point in the second black-and-white view is calculated based on the color values of the pixel points in the first color view; a second candidate color value of each shielding pixel point in the second black-and-white view is calculated based on the first candidate color value of its target pixel point; and finally the second black-and-white view is colored according to the second candidate color values of the shielding pixel points and the first candidate color values of the other pixel points in the second black-and-white view, so as to obtain the target image. By taking the second black-and-white view as the base image and the first color view as the guide image, the target image not only keeps the texture details of the second black-and-white view but also obtains the color appearance of the first color view. Meanwhile, no parallax optimization needs to be performed on the first color view and the second black-and-white view before the second black-and-white view is colored, so the loss of texture details caused by parallax optimization is avoided and a target image with higher definition and a better appearance is obtained. Furthermore, because the first candidate color value of a shielding pixel point may deviate considerably from its actual color value, its second candidate color value is recalculated from the corresponding target pixel points in the second black-and-white view, so that the second candidate color value is closer to the actual color of the target shooting object, and the target image colored according to the first and second candidate color values is more consistent with the actual colors of the target shooting object.
For the beneficial effects of the various implementations in this embodiment, reference may be made to the beneficial effects of the corresponding implementations in the above method embodiments; to avoid repetition, details are not described here again.
It should be understood that, in the embodiment of the present application, the input unit 804 may include a graphics processing unit (GPU) 8041 and a microphone 8042, and the graphics processing unit 8041 processes image data of a still picture or a video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The display unit 806 may include a display panel 8061, and the display panel 8061 may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 807 includes a touch panel 8071 and other input devices 8072. The touch panel 8071, also referred to as a touch screen, may include two parts: a touch detection device and a touch controller. The other input devices 8072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described in detail here. The memory 809 may be used to store software programs and various data, including but not limited to application programs and an operating system. The processor 810 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interfaces, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 810.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, each process of the above image processing method embodiment is implemented and the same technical effect can be achieved; to avoid repetition, details are not described here again.
The processor is the processor in the electronic device of the above embodiment. The readable storage medium includes a computer-readable storage medium, such as a computer read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above image processing method embodiment and achieve the same technical effect; to avoid repetition, details are not described here again.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-on-chip, a system chip, or a chip system, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatuses of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may also include performing the functions in a substantially simultaneous manner or in a reverse order depending on the functions involved; for example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes several instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the methods of the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. An image processing method, characterized in that the method comprises:
acquiring a binocular image pair aiming at a target shooting object; the binocular image pair includes: a first color view and a second black and white view;
calculating a first candidate color value of each pixel point in the second black-and-white view based on the color values of the pixel points in the first color view;
calculating a second candidate color value of a shielding pixel point in the second black-and-white view based on the first candidate color value of the target pixel point in the second black-and-white view; the shielding pixel points are pixel points which have parallax loss with the first color view in the second black-and-white view; the target pixel point is a pixel point matched with the brightness value of the shielding pixel point;
and coloring the second black-and-white view according to the second candidate color value of the shielding pixel point and the first candidate color values of other pixel points except the shielding pixel point in the second black-and-white view to obtain a target image.
2. The method of claim 1, wherein said calculating a first candidate color value of each pixel point in the second black-and-white view based on the color values of the pixel points in the first color view comprises:
partitioning the second black-and-white view to obtain N overlapped first image blocks;
determining a color image block corresponding to each first image block in the first color view;
respectively acquiring third candidate color values of the pixel points in the first image block corresponding to the color values of the pixel points in each color image block;
and determining the first candidate color value of each pixel point in the second black-and-white view based on the third candidate color value of the pixel point in each first image block.
3. The method of claim 2, wherein said calculating a first candidate color value of each pixel point in the second black-and-white view based on the color values of the pixel points in the first color view comprises:
determining K color image blocks matched with the sampling pixel points aiming at the sampling pixel points in the second black-and-white view;
determining K third candidate color values corresponding to the sampling pixel points, wherein each third candidate color value corresponds to one of the K color image blocks;
and carrying out weighted average on the K third candidate color values to obtain the first candidate color value of the sampling pixel point.
4. The method of claim 1, wherein said calculating a second candidate color value of a shielding pixel point in the second black-and-white view based on the first candidate color value of the target pixel point in the second black-and-white view comprises:
determining the shielding pixel point based on the brightness value of each pixel point in the second black-and-white view;
partitioning the second black-and-white view to obtain X non-overlapping second image blocks, wherein each second image block at least comprises one shielding pixel point;
searching the target pixel point corresponding to the shielding pixel point in the second black-and-white view based on the brightness value of each pixel point in the second image block;
and calculating a second candidate color value of the shielding pixel point in the second black-and-white view according to the first candidate color value and the second weight value corresponding to the target pixel point.
5. The method according to claim 4, wherein said searching for the target pixel point corresponding to the shielding pixel point in the second black-and-white view based on the brightness value of each pixel point in the second image block comprises:
sorting the brightness values of all pixel points in each second image block, and calculating the brightness difference between any adjacent pixel points according to the sorting;
if the brightness difference value between any adjacent pixel points is larger than a preset brightness gradient threshold, dividing any adjacent pixel point into different brightness levels;
determining a target pixel point according to the brightness level of each pixel point in the second black-and-white view;
the target pixel point does not belong to the shielding pixel point, and the brightness levels of the target pixel point and the shielding pixel point are the same.
6. An image processing apparatus, characterized in that the apparatus comprises: the device comprises an acquisition unit, a first calculation unit, a second calculation unit and a coloring unit;
the acquisition unit is used for acquiring a binocular image pair aiming at a target shooting object; the binocular image pair includes: a first color view and a second black and white view;
the first calculating unit is used for calculating a first candidate color value of each pixel point in the second black-and-white view based on the color values of the pixel points in the first color view;
the second calculating unit is used for calculating a second candidate color value of a shielding pixel point in the second black-and-white view based on the first candidate color value of the target pixel point in the second black-and-white view; the shielding pixel points are pixel points which have parallax loss with the first color view in the second black-and-white view; the target pixel point is a pixel point matched with the brightness value of the shielding pixel point;
and the coloring unit is used for coloring the second black-and-white view according to the second candidate color value of the shielding pixel point and the first candidate color values of other pixel points except the shielding pixel point in the second black-and-white view to obtain a target image.
7. The apparatus according to claim 6, wherein the first computing unit is specifically configured to:
partitioning the second black-and-white view to obtain N overlapped first image blocks;
determining a color image block corresponding to each first image block in the first color view;
respectively acquiring third candidate color values of the pixel points in the first image block corresponding to the color values of the pixel points in each color image block;
and determining the first candidate color value of each pixel point in the second black-and-white view based on the third candidate color value of the pixel point in each first image block.
8. The apparatus according to claim 7, wherein the first computing unit is specifically configured to:
determining K color image blocks matched with the sampling pixel points aiming at the sampling pixel points in the second black-and-white view;
determining K third candidate color values corresponding to the sampling pixel points, wherein each third candidate color value corresponds to one of the K color image blocks;
and carrying out weighted average on the K third candidate color values to obtain the first candidate color value of the sampling pixel point.
9. The apparatus according to claim 6, wherein the second computing unit is specifically configured to:
determining the shielding pixel point based on the brightness value of each pixel point in the second black-and-white view;
partitioning the second black-and-white view to obtain X non-overlapping second image blocks, wherein each second image block at least comprises one shielding pixel point;
searching the target pixel point corresponding to the shielding pixel point in the second black-and-white view based on the brightness value of each pixel point in the second image block;
and calculating a second candidate color value of the shielding pixel point in the second black-and-white view according to the first candidate color value and the second weight value corresponding to the target pixel point.
10. The apparatus according to claim 9, wherein the second computing unit is specifically configured to:
sorting the brightness values of all pixel points in each second image block, and calculating the brightness difference between any adjacent pixel points according to the sorting;
if the brightness difference value between any adjacent pixel points is larger than a preset brightness gradient threshold, dividing any adjacent pixel point into different brightness levels;
determining a target pixel point according to the brightness level of each pixel point in the second black-and-white view;
the target pixel point does not belong to the shielding pixel point, and the brightness levels of the target pixel point and the shielding pixel point are the same.
CN202110286693.5A 2021-03-17 2021-03-17 Image processing method, image processing device, electronic equipment and readable storage medium Active CN113129400B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110286693.5A CN113129400B (en) 2021-03-17 2021-03-17 Image processing method, image processing device, electronic equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110286693.5A CN113129400B (en) 2021-03-17 2021-03-17 Image processing method, image processing device, electronic equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN113129400A true CN113129400A (en) 2021-07-16
CN113129400B CN113129400B (en) 2023-02-24

Family

ID=76773349

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110286693.5A Active CN113129400B (en) 2021-03-17 2021-03-17 Image processing method, image processing device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN113129400B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107038695A (en) * 2017-04-20 2017-08-11 厦门美图之家科技有限公司 A kind of image interfusion method and mobile device
GB201712354D0 (en) * 2016-08-08 2017-09-13 Google Inc Monochrome-color mapping using a monochromatic imager and a color map sensor
CN107534735A (en) * 2016-03-09 2018-01-02 华为技术有限公司 Image processing method, device and the terminal of terminal
CN108062765A (en) * 2017-12-19 2018-05-22 上海兴芯微电子科技有限公司 Binocular image processing method, imaging device and electronic equipment
CN110580684A (en) * 2018-06-10 2019-12-17 长沙市军英电子科技有限公司 image enhancement method based on black-white-color binocular camera
CN111354058A (en) * 2020-02-04 2020-06-30 北京邮电大学 Image coloring method and device, image acquisition equipment and readable storage medium
US20200342630A1 (en) * 2019-04-23 2020-10-29 L'oreal Machine image colour extraction and machine image construction using an extracted colour

Also Published As

Publication number Publication date
CN113129400B (en) 2023-02-24

Similar Documents

Publication Publication Date Title
CN108769634B (en) Image processing method, image processing device and terminal equipment
CN111835982B (en) Image acquisition method, image acquisition device, electronic device, and storage medium
CN110475063A (en) Image-pickup method and device and storage medium
CN113132695B (en) Lens shading correction method and device and electronic equipment
WO2023046112A1 (en) Document image enhancement method and apparatus, and electronic device
CN112308797A (en) Corner detection method and device, electronic equipment and readable storage medium
CN112037160A (en) Image processing method, device and equipment
CN115546043B (en) Video processing method and related equipment thereof
CN114390181A (en) Shooting method and device and electronic equipment
CN113052923B (en) Tone mapping method, tone mapping apparatus, electronic device, and storage medium
CN113989387A (en) Camera shooting parameter adjusting method and device and electronic equipment
CN113870100A (en) Image processing method and electronic device
CN113628259A (en) Image registration processing method and device
CN111901519B (en) Screen light supplement method and device and electronic equipment
CN109191398A (en) Image processing method, device, computer readable storage medium and electronic equipment
CN113129400B (en) Image processing method, image processing device, electronic equipment and readable storage medium
CN112419218A (en) Image processing method and device and electronic equipment
CN111885371A (en) Image occlusion detection method and device, electronic equipment and computer readable medium
CN115439386A (en) Image fusion method and device, electronic equipment and storage medium
CN105893578A (en) Method and device for selecting photos
CN110089103B (en) Demosaicing method and device
CN114119701A (en) Image processing method and device
CN111654623B (en) Photographing method and device and electronic equipment
CN114565777A (en) Data processing method and device
CN113989706A (en) Image processing method and device, server, electronic device and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant