WO2022007787A1 - Image processing method and apparatus, device and medium - Google Patents

Image processing method and apparatus, device and medium

Info

Publication number
WO2022007787A1
Authority
WO
WIPO (PCT)
Prior art keywords: light source, source area, pixel, image, area
Application number
PCT/CN2021/104709
Other languages
English (en)
Chinese (zh)
Inventor
华路延
Original Assignee
广州虎牙科技有限公司
Priority claimed from CN202010647193.5A (external priority, published as CN113920299A)
Priority claimed from CN202011043175.2A (external priority, published as CN112153303B)
Application filed by 广州虎牙科技有限公司
Publication of WO2022007787A1


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene

Definitions

  • the embodiments of the present application relate to the technical field of image processing, for example, to an image processing method, apparatus, device, and medium.
  • the brightness enhancement solution in the related art can only improve the overall brightness of the image, and cannot adjust the light source display effect of the image or video.
  • Embodiments of the present application provide an image processing method, apparatus, device, and medium, so as to adjust the display effect of a light source in a video or image, and improve the degree of freedom and diversity of the display effect of the video or image.
  • an embodiment of the present application provides an image processing method, including: acquiring a light source model of a target image; and processing the light source area of the target image according to the light source model.
  • acquiring the light source model of the target image includes: acquiring the target image, and determining a light source area in the target image; and constructing a light source model in the light source area.
  • processing the light source area of the target image according to the light source model includes: determining an opacity parameter of the light source area according to the light source model; and obtaining the display texture in the light source area according to the original image texture in the light source area, the preselected light source texture in the light source area, and the opacity parameter of the light source area.
  • acquiring the light source model of the target image includes: acquiring a light source model of an optical image, where the light source model represents a light source shape and a light source position in the optical image.
  • processing the light source area of the target image according to the light source model includes: obtaining light source information to be added according to the light source model and a preset mapping curve, where the light source information to be added represents the pixel added value of each pixel point in the light source area of the optical image, and the light source area is the image area determined by the light source shape and the light source position; and updating the light source area according to the light source information to be added.
  • an embodiment of the present application further provides an image processing device, the device includes: a light source model acquisition module, configured to acquire a light source model of an optical image; and a light source area processing module, configured to process the light source area of the optical image according to the light source model.
  • the light source model acquisition module includes: a light source area determination unit, configured to acquire a target image and determine a light source area in the target image; and a light source model construction unit, configured to construct a light source model in the light source area.
  • the light source area processing module includes: an opacity parameter determination unit, configured to determine an opacity parameter of the light source area according to the light source model; and a light source area adjustment unit, configured to obtain the display texture in the light source area according to the original image texture in the light source area, the preselected light source texture in the light source area, and the opacity parameter of the light source area.
  • the light source model acquisition module is configured to acquire a light source model of an optical image, where the light source model represents the light source shape and light source position in the optical image.
  • the light source area processing module includes: an information processing unit, configured to obtain light source information to be added according to the light source model and a preset mapping curve, where the light source information to be added represents the pixel added value of each pixel point in the light source area, and the light source area is the image area determined by the light source shape and the light source position; and an image updating unit, configured to update the pixel value of each pixel point in the light source area according to the light source information to be added, to obtain the target processed image.
  • an embodiment of the present application further provides a computer device, the computer device includes: one or more processors; and a memory configured to store one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the image processing method described in any embodiment.
  • an embodiment of the present application further provides a computer-readable storage medium storing a computer program, and when the program is executed by a processor, the image processing method described in any of the embodiments is implemented.
  • FIG. 1 is a flowchart of an image processing method provided by an embodiment of the present application.
  • FIG. 2 is a flowchart of another image processing method provided by an embodiment of the present application.
  • FIG. 3 is an example diagram of identification of a light source area provided by an embodiment of the present application.
  • FIG. 4 is a flowchart of another image processing method provided by an embodiment of the present application.
  • FIG. 5 is an exemplary diagram of a light source model provided by an embodiment of the present application.
  • FIG. 6 is an example diagram of a calculation parameter of an opacity parameter provided by an embodiment of the present application.
  • FIG. 7 is a flowchart of another image processing method provided by an embodiment of the present application.
  • FIG. 8 is a flowchart of another image processing method provided by an embodiment of the present application.
  • FIG. 9A is a schematic diagram of an optical image before processing by the image processing method provided by an embodiment of the present application.
  • FIG. 9B is a schematic diagram of a target processed image obtained after processing the optical image in FIG. 9A by using the image processing method provided by the embodiment of the present application.
  • FIG. 10 is a flowchart of a light source model for acquiring an optical image provided by an embodiment of the present application
  • FIG. 11 is a schematic diagram of the division of an optical image provided by an embodiment of the present application.
  • FIG. 12 is a flowchart of another image processing method provided by an embodiment of the present application.
  • FIG. 13 is a schematic diagram of obtaining transparency degree information according to an embodiment of the present application.
  • FIG. 14 is a flowchart of updating the pixel value of each pixel in the light source area according to the light source information to be added to obtain a target processed image, according to an embodiment of the present application.
  • FIG. 15 is a schematic structural diagram of an image processing apparatus provided by an embodiment of the present application.
  • FIG. 16 is a schematic structural diagram of another image processing apparatus provided by an embodiment of the present application.
  • FIG. 17 is a schematic structural diagram of another image processing apparatus provided by an embodiment of the present application.
  • FIG. 18 is a schematic structural diagram of a computer device provided by an embodiment of the present application.
  • FIG. 19 is a schematic structural diagram of an image processing device according to an embodiment of the present application.
  • FIG. 1 is a flowchart of an image processing method provided by an embodiment of the present application.
  • the method may be performed by the image processing apparatus provided in any embodiment of the present application, and the apparatus may be composed of hardware and/or software, and may generally be integrated in computer equipment, such as an intelligent mobile terminal.
  • the image processing method provided in this embodiment includes the following steps.
  • the light source model is a model for simulating the shape of the illumination range of the light source in the target image.
  • the light source area may be determined according to a light source model, or the light source area in the target image may be determined by detecting the brightness value of each pixel point in the target image: for example, the brightness value of each pixel in the target image is obtained, the pixel points whose brightness value is greater than a set brightness threshold are screened out, and the maximum connected area formed by these pixel points is used as the light source area.
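  • The screening just described can be sketched as follows in Python (an illustrative sketch, not part of the original disclosure; the brightness threshold, the BT.601 luma weights, and the use of scipy.ndimage are assumptions):

```python
import numpy as np
from scipy import ndimage

def detect_light_source_area(image_rgb: np.ndarray, threshold: float = 220.0) -> np.ndarray:
    """Return a boolean mask of the maximum connected bright area."""
    # Brightness of each pixel (BT.601 luma is one possible choice).
    brightness = (0.299 * image_rgb[..., 0]
                  + 0.587 * image_rgb[..., 1]
                  + 0.114 * image_rgb[..., 2])
    # Screen out the pixels whose brightness value is greater than the set threshold.
    bright = brightness > threshold
    # Use the maximum connected area formed by these pixels as the light source area.
    labels, num = ndimage.label(bright)
    if num == 0:
        return np.zeros(image_rgb.shape[:2], dtype=bool)
    sizes = ndimage.sum(bright, labels, index=range(1, num + 1))
    return labels == (int(np.argmax(sizes)) + 1)
```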
  • the image processing method provided by this embodiment obtains the light source model of the image and uses the light source model to process the light source area of the image, so as to adjust the light source display effect of the light source area in the image, thereby improving the light source display effect.
  • multimedia live broadcast has attracted people's attention because of its novel form and rich content.
  • In order to improve the live video effect, multimedia live broadcast software usually has a video retouching function. If the multimedia live broadcast software can provide rich lighting display effects, the live video effect will be improved.
  • the image re-illumination method driven by big data can achieve the effect of re-illuminating the outdoor input image at a specified time.
  • this method is complicated to implement and has a long image processing cycle; it is only suitable for processing images, not video (such as real-time processing of live video), and it offers only a single relighting effect, which is not suitable for complex live scenes.
  • FIG. 2 is a flowchart of an image processing method provided by an embodiment of the present application. This embodiment can be applied to the case of adjusting the display effect of the light source in the image or video.
  • the method can be executed by the image processing apparatus provided in any embodiment of the present application, and the apparatus can be composed of hardware and/or software, and can generally be integrated in computer equipment, for example, an intelligent mobile terminal.
  • the image processing method provided by this embodiment includes the following steps.
  • the target image refers to an image that needs to be adjusted for the display effect of the light source, and the target image may be an image or a video image frame in a video.
  • the video image frame obtained in real time by the live video software is used as the target image.
  • the light source area refers to the area where the light is displayed in the target image when the light source illuminates the object.
  • the light source area in the target image may be determined by detecting the brightness value of each pixel point in all the pixel points in the target image. For example, the brightness value of each pixel in the target image is obtained, and the pixels whose brightness value is greater than the set brightness threshold are screened out, and the largest connected area formed by these pixels is used as the light source area.
  • determining the light source area in the target image includes: determining the light source area in the target image by using a light source detection model obtained by pre-training.
  • the training method of the light source detection model is as follows: an image sample set is obtained, the image sample set includes a large number of image samples, and each image sample is marked with a light source area, for example, by a light source detection frame; the image sample set is then used to train a target detection framework to obtain the light source detection model.
  • Inputting the target image into the light source detection model can output the information of the light source area of the target image, for example, outputting the coordinate information of the light source target frame used to identify the light source area.
  • In FIG. 3, the block 21 identifies the target image, the block 22 identifies the light source target frame, and the area within the block 22 includes the light source area of the target image.
  • the light source model refers to a model for simulating the shape of the illumination range of the light source, for example, a circular light source model, or an elliptical light source model.
  • the light source model is constructed with the center point of the light source area as the center.
  • a light source model 23 is constructed at the center point in the light source area 22 , and the shape parameters of the light source model 23 can be determined according to the display range of the light source in the light source area 22 .
  • The opacity parameter is used to indicate how opaque the pixels in the area are. If the opacity parameter of a pixel is 0%, the pixel is completely transparent (that is, invisible); an opacity parameter of 100% means a completely opaque pixel (that is, the original pixel is displayed). A pixel's opacity parameter between 0% and 100% allows the pixel to show through the background as though through glass (translucency).
  • the opacity parameter can be expressed as a percentage or a real number from 0 to 1.
  • the opacity parameter of each pixel in all the pixels in the light source area is determined, that is, the transparency of the light source of each pixel in the light source area in the actual scene is simulated, and then the effect of light illuminating the object can be simulated.
  • set the opacity parameter of the pixels outside the light source model in the light source area to 0% or 0, and set the opacity parameter of each pixel within the light source model according to the pixel position of that pixel in the light source area.
  • the original image texture refers to the original texture of the target image.
  • Pre-selected light source textures refer to pre-selected light source textures used to provide light sources of different colors to simulate different light source effects within the light source area of the target image.
  • the preselected light source texture in the light source area is implemented by filling with different pixel values. Assuming that the light source effect of normal light is simulated in the light source area, the pixel red, green and blue (RGB) values of all pixels of the preselected light source texture are (1,1,1).
  • the light source area is the superimposed area of the original image texture and the preselected light source texture, and the opacity parameter of the original image texture is 100% or 1, that is, it is completely opaque (the original pixels are displayed).
  • the opacity parameter of the preselected light source texture is consistent with the opacity parameter of the light source area determined by the light source model.
  • the preselected light source texture is adjusted using the matching opacity parameter, and the adjustment result is superimposed with the original image texture, so as to obtain the display texture in the light source area.
  • the process of texture overlay may result in the overflow of the overlay result of pixel values, resulting in the inability to display the correct texture.
  • the display texture in the light source area is obtained according to the original image texture in the light source area, the preselected light source texture in the light source area, and the opacity parameter of the light source area, including: superimposing the pixel values of the original image texture in the light source area and the preselected light source texture in the light source area, and correcting the superposition result to obtain the superimposed texture in the light source area; adjusting the superimposed texture using the opacity parameter of the light source area to obtain a pending texture; and compensating the pending texture using the original image texture in the light source area to obtain the display texture in the light source area.
  • Assume the pixel RGB value of a pixel in the original image texture is D (D ≤ 1), and the pixel RGB value of the pixel in the preselected light source texture is S (S ≤ 1); the pixel RGB value of the pixel in the superimposed texture is T, obtained by superimposing D and S and correcting the result so that T ≤ 1.
  • The opacity parameter A of the pixel is used to adjust the pixel RGB value of the pixel in the superimposed texture, obtaining the pixel RGB value A*T of the pixel in the pending texture. Since the opacity parameter of the original image texture included in the superimposed texture is 100% or 1, the original pixel value should be displayed, so the pixel RGB value D of the pixel in the original image texture is used to compensate the pixel RGB value A*T of the pixel in the pending texture, obtaining the pixel RGB value F of the pixel in the display texture of the light source area.
  • The pixel RGB value D of the pixel in the original image texture can be adjusted using (1-A), and the result (1-A)*D compensates the pixel RGB value A*T of the pixel in the pending texture, that is, F = A*T + (1-A)*D.
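  • The blend F = A*T + (1-A)*D can be written compactly; the sketch below assumes textures normalized to [0, 1] held in NumPy arrays (an illustration, not the patent's GPU implementation):

```python
import numpy as np

def display_texture(D: np.ndarray, S: np.ndarray, A: np.ndarray) -> np.ndarray:
    """D: original texture (H, W, 3); S: preselected light source texture
    (H, W, 3); A: per-pixel opacity parameter (H, W). All values in [0, 1]."""
    # Superimpose and correct the overflow so the superimposed texture T <= 1.
    T = np.clip(D + S, 0.0, 1.0)
    # Pending texture A*T, compensated by the original texture (1-A)*D.
    return A[..., None] * T + (1.0 - A[..., None]) * D
```

  • For the normal-light example above, S is filled with (1, 1, 1) for every pixel of the preselected light source texture.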
  • the opacity parameter A of the pixel point is the opacity parameter of the pixel point in the light source area determined according to the light source model.
  • pixels at different positions in the light source area have different opacity parameters.
  • S130 includes: obtaining, through a graphics processing unit (GPU), the display texture in the light source area according to the original image texture in the light source area, the preselected light source texture in the light source area, and the opacity parameter of the light source area.
  • With the GPU, multiple pixels in the light source area can be processed at the same time, which improves the rate of calculating the pixel RGB value F in the display texture for the multiple pixel points in the light source area.
  • a processed image corresponding to the target image can be obtained, in which the display effect of the light source in the light source area has been adjusted.
  • a light source model is constructed in the light source area, which can simulate the illumination range of light sources with different shapes; the opacity parameter of the light source area is determined according to the light source model, which can simulate the effect of light illuminating the object; and the display texture in the light source area is obtained according to the original image texture, the preselected light source texture, and the opacity parameter of the light source area.
  • as the preselected light source texture and the opacity parameter vary, the display texture differs, which realizes adjustment of the light source display effect in the image, greatly enriches the application scenes, and improves the degree of freedom and diversity of the image display effect.
  • the above technical solution is simple to implement and has a short processing period, and is suitable for application in online real-time video such as live video.
  • FIG. 4 is a flowchart of another image processing method provided by an embodiment of the present application. This embodiment is described on the basis of the above-mentioned embodiment, wherein constructing a light source model in the light source area includes: detecting the first length of the light source area in the horizontal direction and the second length in the vertical direction; when the first length is equal to the second length, constructing, in the light source area, a circular light source model with the center point of the light source area as the center and the first length as the diameter; and when the first length is not equal to the second length, constructing, in the light source area, an elliptical light source model with the center point of the light source area as the center, the longer of the first length and the second length as the major axis, and the shorter as the minor axis.
  • the image processing method provided in this embodiment includes the following steps.
  • the first length of the light source area in the horizontal direction and the second length in the vertical direction are detected, that is, the area span of the light source area in the horizontal direction and the vertical direction is detected, and then the shape of the light source model can be determined.
  • the first length of the light source region in the horizontal direction and the second length in the vertical direction of the light source region are determined by detecting the luminance values of a plurality of pixel points.
  • the length of the line segment AB is the first length of the light source region in the horizontal direction
  • the length of the line segment CD is the second length of the light source region in the vertical direction.
  • a circular light source model can be constructed in the light source area, and the diameter of the light source model is the first length (or the second length).
  • Set the center of the light source model (that is, the light source focus) at the center point of the light source area, such as point O shown in FIG. 5. The circular light source model can be expressed as (x - x1)^2 + (y - y1)^2 = (e/2)^2, where (x1, y1) are the coordinates of point O, and e is the diameter (the first length or the second length) of the light source model.
  • an elliptical light source model can be constructed in the light source area, the axis length of the light source model in the horizontal direction is the first length, and the axis length in the vertical direction is the second length , the length of the long axis is the longer of the first length and the second length, and the length of the short axis is the shorter of the first length and the second length.
  • Set the center of the light source model (that is, the light source focus) at the center point of the light source area, such as point O shown in FIG. 5. The light source model 23 is the constructed elliptical light source model, and its expression is (x - x1)^2 / (c/2)^2 + (y - y1)^2 / (d/2)^2 = 1, where (x1, y1) are the coordinates of point O, c is the length of the line segment AB (that is, the first length), and d is the length of the line segment CD (that is, the second length).
  • the center of the light source model may also be set at the center point of the light source target frame.
  • the first length and the second length are the range lengths of the light source area in the horizontal direction and the vertical direction in the light source target frame, respectively.
  • multiple light source models can be constructed to suit different image scenarios.
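  • As a sketch of the construction above (illustrative only; the dictionary layout is an assumption), the model parameters can be derived from the detected spans:

```python
def build_light_source_model(center: tuple, first_length: float, second_length: float) -> dict:
    """center: point O (x1, y1); first/second length: horizontal/vertical spans."""
    if first_length == second_length:
        # Circle: (x - x1)^2 + (y - y1)^2 = (e / 2)^2
        return {"shape": "circle", "center": center, "diameter": first_length}
    # Ellipse: (x - x1)^2 / (c / 2)^2 + (y - y1)^2 / (d / 2)^2 = 1,
    # with the longer span as the major axis and the shorter as the minor axis.
    return {"shape": "ellipse", "center": center,
            "major_axis": max(first_length, second_length),
            "minor_axis": min(first_length, second_length)}
```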
  • the opacity parameter of each pixel in the light source area is determined according to the light source model.
  • the opacity parameter of each pixel is related to the positional relationship between each pixel and the light source model.
  • determining the opacity parameter of the light source area according to the light source model includes: setting the opacity parameter of the pixels outside the light source model in the light source area to a first target constant value, where the first target constant value is used to indicate full transparency; and setting the opacity parameter of each pixel within the light source model according to the distance between that pixel and the center pixel of the light source model, where the smaller the distance between a pixel and the center pixel, the smaller the value of the opacity parameter of that pixel.
  • the pixels are divided into two types: those outside the light source model and those inside the light source model.
  • the opacity parameter of the pixels outside the light source model is set to the first target constant value representing full transparency, such as 0 or 0%, that is, the pixel values of these pixels are completely transparent;
  • the opacity parameter is related to the coordinate position of the pixel point.
  • the closer a pixel point is to the center of the light source model, the smaller the value of its opacity parameter, that is, the brighter the light source is displayed; the opacity parameter of a pixel point within the light source model lies between 0 and 1, or between 0% and 100%.
  • the opacity parameter of a pixel point within the light source model in the light source area can be calculated from the ratios a/(a+a') and b/(b+b'), where A is the opacity parameter of the pixel point, a is the distance from the pixel point to the long axis or horizontal diameter of the light source model, b is the distance from the pixel point to the short axis or vertical diameter of the light source model, a' is the distance from the pixel point along the short axis direction or the vertical diameter direction to the boundary of the light source model, and b' is the distance from the pixel point along the long axis direction or the horizontal diameter direction to the boundary of the light source model.
  • This embodiment provides a simplified calculation method of the opacity parameter.
  • the ratio of the distance between the pixel point and the horizontal diameter of the light source model, and the ratio of the distance between the pixel point and the vertical diameter of the light source model, are calculated, and the value of the opacity parameter of the pixel point is obtained from these ratios to simulate the transparency of the pixel point.
  • the light source target frame 22 is used to identify the light source area, and an elliptical light source model 23 is constructed in the light source target frame 22 (ie, the light source area).
  • the opacity parameter of any pixel point M outside the light source model 23 is set to 0 or 0%
  • the opacity parameter AN of any pixel point N in the light source model 23 is calculated from the following distances:
  • NP1 is the distance from the pixel point N to the long axis of the light source model (that is, a)
  • NQ1 is the distance from the pixel point N to the short axis of the light source model (that is, b)
  • NP2 is the distance from the pixel point N along the short axis direction to the boundary of the light source model (that is, a'), and NQ2 is the distance from the pixel point N along the long axis direction to the boundary of the light source model (that is, b').
  • When the light source model is circular, NP1 is the distance from the pixel point N to the horizontal diameter of the light source model, NQ1 is the distance from the pixel point N to the vertical diameter of the light source model, NP2 is the distance from the pixel point N along the vertical diameter direction to the boundary of the light source model, and NQ2 is the distance from the pixel point N along the horizontal diameter direction to the boundary of the light source model.
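  • Since the exact combination of the two ratios is not reproduced in this text, the sketch below assumes A = max(a/(a+a'), b/(b+b')), which is 0 at the model center and approaches 1 toward the boundary; the choice of max() is an assumption made for illustration:

```python
import math

def opacity_parameter(dx: float, dy: float, rx: float, ry: float) -> float:
    """(dx, dy): offset of the pixel from center O; rx, ry: semi-axes
    (half of the first and second lengths)."""
    if (dx / rx) ** 2 + (dy / ry) ** 2 >= 1.0:
        return 0.0  # outside the light source model: fully transparent
    a = abs(dy)  # distance to the long axis / horizontal diameter
    b = abs(dx)  # distance to the short axis / vertical diameter
    a_span = ry * math.sqrt(1.0 - (dx / rx) ** 2)  # a + a'
    b_span = rx * math.sqrt(1.0 - (dy / ry) ** 2)  # b + b'
    return max(a / a_span if a_span else 1.0,
               b / b_span if b_span else 1.0)
```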
  • the method further includes: setting both the opacity parameter of the original image texture outside the light source area and the opacity parameter of the display texture in the light source area to a second target constant value, where the second target constant value is used to represent opacity; and generating a target processed image corresponding to the target image according to the original image texture outside the light source area, the opacity parameter of the original image texture outside the light source area, the display texture in the light source area, and the opacity parameter of the display texture in the light source area.
  • the target image is identified in the RGBA color space, R represents red (Red), G represents green (Green), B represents blue (Blue), and A represents an opacity parameter.
  • Since the area outside the light source area in the target image does not need to be adjusted for the light source display effect, it still displays the RGB texture of the original image.
  • the opacity parameter of the original image RGB texture outside the light source area and the opacity parameter of the display RGB texture in the light source area can be set to the second target constant value representing opacity, such as 1 or 100%.
  • the display effect texture obtained by combining the original image RGB texture outside the light source area and the display RGB texture in the light source area can be rendered by the pre-created GPU rendering pipeline; after the rendering is completed, the target processed image is generated for display.
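  • A minimal sketch of assembling the final frame (an illustration; the mask-based composition here stands in for the GPU rendering pipeline):

```python
import numpy as np

def compose_target_image(original_rgb: np.ndarray,
                         display_rgb: np.ndarray,
                         light_mask: np.ndarray) -> np.ndarray:
    """Keep the original RGB texture outside the light source area and the
    display texture inside it; the opacity of the result is 1 everywhere."""
    rgba = np.concatenate([original_rgb,
                           np.ones_like(original_rgb[..., :1])], axis=-1)
    rgba[light_mask, :3] = display_rgb[light_mask]
    return rgba
```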
  • the adjustment of the display effect of the light source in the video or image is realized, the degree of freedom and diversity of the display effect of the video or image is improved, and it is suitable for scenes with various light sources changing.
  • the above technical solution can also be combined with the method of the GPU rendering pipeline, so that the change of the display effect of the light source is suitable for real-time video processing.
  • FIG. 7 is a flowchart of another image processing method provided by an embodiment of the present application. This embodiment provides an implementation manner on the basis of the above-mentioned embodiment.
  • the image processing method provided in this embodiment includes the following steps.
  • Video image frames are identified in the RGBA color space.
  • S320 Input the target image into a pre-trained light source detection model to obtain coordinate information of the light source target frame in the target image.
  • the light source focus of the light source model is determined, that is, the center point (x1, y1) of the light source target frame, and the light source model is constructed based on the light source focus (x1, y1).
  • S340: Determine the opacity parameter A of each pixel point in the light source target frame according to the light source model, where a is the distance from the pixel point to the long axis or horizontal diameter of the light source model, b is the distance from the pixel point to the short axis or vertical diameter of the light source model, a' is the distance from the pixel point along the short axis direction or the vertical diameter direction to the boundary of the light source model, and b' is the distance from the pixel point along the long axis direction or the horizontal diameter direction to the boundary of the light source model.
  • the opacity parameters of multiple pixels in the light source target frame determined in this step will participate in the processing of pixels in the light source area.
  • the preselected light source texture can display light source effects of different colors by filling in different pixel RGB values, thereby simulating different light source effects; the pixel RGB values corresponding to different light source effects can be filled in according to actual needs.
  • the light source area (or the area within the light source target frame) is the superimposed area of the original image texture and the preselected light source texture, and the pixel RGB values need to be processed.
  • the opacity parameters of multiple pixels in the light source area are determined by S340; the areas outside the light source area do not need to be processed and can display the original image texture, so the opacity parameter of these areas is set to 1.
  • the calculation rate of the superimposed texture in the light source area and the display texture in the light source area is improved.
  • FIG. 8 is a flowchart of another image processing method provided by an embodiment of the present application.
  • the image processing method may include the following steps.
  • the light source model characterizes the light source shape and light source location in the optical image.
  • the shape of the light source can be a square, a circle, an ellipse, a semi-circle, etc.
  • the "non-black pixels" in the optical image for example, the pixels whose RGB value (ie, the pixel RGB value) is not 0
  • the composed image area is fitted to obtain the light source shape.
  • the position of the light source can be represented by the pixel coordinates of each of the plurality of pixels in the optical image corresponding to the shape of the light source; the optical image can also be divided into grids, and the coordinate position corresponding to at least one grid containing "non-black pixels" is used as the light source position.
  • the light source information to be added represents the pixel added value of each pixel point in the light source area, and the light source area is an image area determined by the shape of the light source and the position of the light source.
  • the pixel added value may be determined according to the light source intensity and the RGB value to be added of the pixel point; the light source intensity may determine the brightness information of each pixel point in the target image, and the RGB value to be added of the pixel point may determine the color information of each pixel point in the target image.
  • a light source is simulated, and the light source information to be added corresponding to the simulated light source is added to the light source area, so that the image display information corresponding to the simulated light source is added to the optical image; this is conducive to enriching the optical display information of the image and providing more visual presentations.
  • Since the pixels of the optical image outside the light source area are all "black pixels" (for example, pixels whose RGB values are all 0), there is no light source at the positions of the pixels outside the light source area.
  • the image processing method is beneficial to enrich the optical display information of the image and provide more visual display effects.
  • FIG. 9A is a schematic diagram of an optical image before processing by the image processing method provided by the embodiment of the present application.
  • FIG. 9B is a schematic diagram of a target processed image obtained after processing the optical image in FIG. 9A by using the image processing method provided by the embodiment of the present application.
  • comparing the optical image and the target processed image, the two differ greatly in the image area with the light source (the light source area): the light source information to be added has been added to the light source area of the optical image, so that the target image has richer optical display information. That is to say, by using the image processing method provided by the embodiments of the present application, more visual display effects are provided for the image.
  • FIG. 10 is a flowchart of a light source model for acquiring an optical image provided by an embodiment of the present application.
  • S31 acquiring a light source model of the optical image, which may include the following steps.
  • the optical image may be meshed according to M*N to obtain a meshed image with M*N meshes, where M and N are both positive integers greater than or equal to 2.
  • FIG. 11 is a schematic diagram of dividing an optical image according to an embodiment of the present application. The optical image is divided into grids to obtain a grid image with 6*7 grids shown in FIG. 11 .
  • S312 Acquire at least one grid having a light source in the grid image.
  • the leg area of the "dog" shown in Fig. 11 is an image area with a light source (that is, the blank space outlined in Fig. 11), then the blank space in the grid image is regarded as a The above grid with light sources.
  • the position of the light source is the box coordinate position corresponding to the at least one grid having a light source in the grid image. If there is only one grid with a light source in the grid image, the box coordinate position of that grid is used as the light source position. If there are multiple grids with light sources in the grid image, the box coordinate positions corresponding to the multiple grids are taken as the light source positions, covering two possible situations: in one case, if the multiple grids are continuously distributed, the multiple grids are integrated to obtain one light source position; in the other case, if the multiple grids are discretely distributed, the light source positions in the discretely distributed grid areas are obtained separately.
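  • A sketch of this grid-based localization (illustrative; the 6*7 mesh and the "non-black" test follow the example above):

```python
import numpy as np

def grids_with_light(image_rgb: np.ndarray, m: int = 6, n: int = 7) -> list:
    """Return the (row, col) boxes of grids that contain non-black pixels."""
    h, w = image_rgb.shape[:2]
    lit = []
    for i in range(m):
        for j in range(n):
            cell = image_rgb[i * h // m:(i + 1) * h // m,
                             j * w // n:(j + 1) * w // n]
            if np.any(cell > 0):  # grid holds non-black pixels, i.e. a light source
                lit.append((i, j))
    # Contiguous boxes would be integrated into one light source position;
    # discretely distributed boxes yield several positions.
    return lit
```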
  • the area enclosed by the multiple grids may be a square area, but the shape of the light source may be a circle, a trapezoid, an ellipse, etc. in the square area.
  • the virtual light source corresponding to the optical image may be a point light source.
  • the point light source can be enhanced, and the pixel values (light source information to be added) can then be added to the light source area of the optical image, so as to improve the darker part of the optical image, which is beneficial to the observation and recognition of target objects in the image.
  • FIG. 12 is a flowchart of another image processing method provided by the embodiment of the present application.
  • S32 obtaining information about the light source to be added according to the light source model and the preset mapping curve, which may include the following steps.
  • S321 Match the transparency level information with a preset mapping curve to obtain the brightness information to be added of the light source area.
  • the transparency degree information represents the brightness information of the light source area
  • the brightness information to be added represents the brightness addition value of each pixel in the light source area.
  • the transparency degree information may be obtained according to the intensity of the light source corresponding to the optical image, and the light source intensity may be obtained as follows: obtain the length of the first line segment between any pixel point in the light source area and the center point of the light source area, and the length of the second line segment between the center point and the edge point on the extension line of the first line segment, where the edge point and the pixel point are located on the same side of the center point; divide the length of the first line segment by the length of the second line segment to obtain the light source intensity corresponding to the pixel point.
  • In this way, the light source intensity of each pixel in the light source area (that is, the transparency of each pixel) is obtained, and thus the above-mentioned transparency degree information is obtained.
  • the color information to be added represents the color added value of each pixel in the light source area.
  • the light source change requirement may be identified according to different optical images, or may be set by the user through an operation instruction, for example, different RGB pixel values may be added to each pixel in the light source area.
  • the brightness and color of each pixel in the light source area of the optical image are adjusted using the light source information to be added to obtain the target processing image.
  • the visual data of the target processed image will be clearer and brighter, making the target object recognition in the image more accurate.
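  • The matching step can be sketched as follows (the power-style mapping curve and the unit color value are assumptions made for illustration):

```python
import numpy as np

def pixel_added_values(transparency: np.ndarray,
                       color_to_add=(1.0, 1.0, 1.0),
                       curve=lambda t: np.power(t, 2.2)) -> np.ndarray:
    """transparency: per-pixel transparency degree information in [0, 1].
    Returns the pixel added value (H, W, 3) for the light source area."""
    brightness_to_add = curve(transparency)  # brightness added value per pixel
    return brightness_to_add[..., None] * np.asarray(color_to_add)
```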
  • FIG. 13 is a schematic diagram of obtaining transparency degree information provided by an embodiment of the present application.
  • the light source area shown in FIG. 13 is a rectangular area, and the above-mentioned transparency degree information can be obtained in the following manner.
  • the first edge point is any pixel point on the boundary of the light source area.
  • The first distance is the distance between the area center point O and the first edge point b, and the second distance is the distance between the area center point O and the first area point a, where the first area point a is any pixel on the line segment formed by the center point O and the first edge point b; that is, the first distance is Ob and the second distance is Oa.
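  • For the rectangular area of FIG. 13, the ratio Oa/Ob can be computed per pixel as below (an illustrative sketch; the rectangle is described by its center and half extents):

```python
import math

def light_source_intensity(px: float, py: float, cx: float, cy: float,
                           half_w: float, half_h: float) -> float:
    """Oa / Ob for area point a = (px, py) with center O = (cx, cy)."""
    dx, dy = px - cx, py - cy
    oa = math.hypot(dx, dy)  # first line segment length Oa
    if oa == 0.0:
        return 0.0
    # Scale factor that pushes (dx, dy) onto the rectangle boundary (edge point b).
    t = min(half_w / abs(dx) if dx else math.inf,
            half_h / abs(dy) if dy else math.inf)
    return oa / (t * oa)  # Oa / Ob, i.e. 1 / t
```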
  • FIG. 14 is a flowchart of updating the pixel value of each pixel in the light source area according to the light source information to be added to obtain a target processing image provided by an embodiment of the present application.
  • the above S33, updating the pixel value of each pixel in the light source area according to the light source information to be added to obtain the target processed image, may include the following steps.
  • the first pixel is any pixel in the light source area.
  • the first pixel point may be the first area point a shown in FIG. 13 .
  • S332 Determine the first pixel addition value of the first pixel point according to the light source information to be added.
  • the first pixel addition value includes a first luminance addition value and a first color addition value.
  • the first color addition value may be (1, 1, 1); for each pixel in the light source area, setting different color addition values for each pixel can simulate different light source effects.
  • the first luminance addition value may be represented by the intensity of the light source.
  • Assume the first pixel value is V, the first color addition value is G, and the intermediate pixel value obtained by superimposing them is H.
  • The above-mentioned intermediate pixel value H, first color addition value G, and first pixel value V are all less than or equal to 1.
  • The first target pixel value I is obtained by performing gamma correction on the intermediate pixel value H, for example I = H^γ.
  • Each pixel can have the same γ value, and some pixels in the light source area can also have different γ values.
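  • Under the reconstruction above (the intensity weighting and the shared γ value are assumptions), the update of one pixel can be sketched as:

```python
import numpy as np

def update_pixel(V: np.ndarray, G: np.ndarray,
                 intensity: float, gamma: float = 2.2) -> np.ndarray:
    """V: first pixel value; G: first color addition value; all in [0, 1]."""
    H = np.clip(V + intensity * G, 0.0, 1.0)  # intermediate pixel value H <= 1
    return np.power(H, gamma)                 # first target pixel value I
```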
  • the light source information corresponding to the optical image can be adjusted, the optical display information of the image can be enriched, and more visual display effects can be provided.
  • the user can also manually set the newly added light source.
  • the coordinates, shape and color of the simulated light source can be set directly through an operation command, so as to add the set light source to the optical image, and the user can then view the target image.
  • adding a red light source to an optical image means adding a red color channel value to each pixel in the optical image to obtain the target processing image.
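  • A short sketch of the red light source example (the added amount 0.2 is an assumption):

```python
import numpy as np

def add_red_light(image_rgb: np.ndarray, amount: float = 0.2) -> np.ndarray:
    """image_rgb: float image in [0, 1]; only the red channel receives an added value."""
    out = image_rgb.astype(float).copy()
    out[..., 0] = np.clip(out[..., 0] + amount, 0.0, 1.0)  # red channel only
    return out
```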
  • FIG. 15 is a schematic structural diagram of an image processing apparatus provided by an embodiment of the present application. This embodiment may be applicable to the case of adjusting the display effect of the light source in the image or video, and the apparatus may be implemented by means of software and/or hardware, and may generally be integrated in computer equipment. As shown in FIG. 15 , the device has a light source model acquisition module 410 and a light source region processing module 420 .
  • the light source model obtaining module 410 is configured to obtain the light source model of the target image; the light source region processing module 420 is configured to process the light source region of the target image according to the light source model.
  • FIG. 16 is a schematic structural diagram of another image processing apparatus provided by an embodiment of the present application.
  • the light source model acquisition module 410 includes: a light source area determination unit 411 and a light source model construction unit 412
  • the light source area processing module 420 includes an opacity parameter determination unit 421 and a light source area adjustment unit 422 .
  • the light source area determination unit 411 is configured to acquire a target image and determine the light source area in the target image; the light source model construction unit 412 is configured to construct a light source model in the light source area; the opacity parameter determination unit 421 is configured to determine the opacity parameter of the light source area according to the light source model; and the light source area adjustment unit 422 is configured to obtain the display texture in the light source area according to the original image texture in the light source area, the preselected light source texture in the light source area, and the opacity parameter of the light source area.
  • a light source model is constructed in the light source area, which can simulate the illumination range of light sources with different shapes; the opacity parameter of the light source area is determined according to the light source model, which can simulate the effect of light illuminating the object; and the display texture in the light source area is obtained according to the original image texture, the preselected light source texture, and the opacity parameter of the light source area.
  • as the preselected light source texture and the opacity parameter vary, the display texture differs, which realizes adjustment of the light source display effect in the image, greatly enriches the application scenes, and improves the degree of freedom and diversity of the image display effect.
  • the above technical solution is simple to implement and has a short processing period, and is suitable for application in online real-time video such as live video.
  • the light source area adjustment unit 422 is configured to superimpose the pixel values of the original image texture in the light source area and the preselected light source texture in the light source area, and correct the superposition result to obtain the superimposed texture in the light source area; adjust the superimposed texture using the opacity parameter of the light source area to obtain a pending texture; and compensate the pending texture using the original image texture in the light source area to obtain the display texture in the light source area.
  • the light source area determining unit 411 is configured to acquire a target image, and determine the light source area in the target image through a pre-trained light source detection model.
  • the light source model construction unit 412 is configured to detect the first length and the second length of the light source area in the horizontal direction and the vertical direction, respectively; when the first length is equal to the second length, construct, in the light source area, a circular light source model with the center point of the light source area as the center and the first length as the diameter; and when the first length is not equal to the second length, construct, in the light source area, an elliptical light source model with the center point of the light source area as the center, the longer of the first length and the second length as the major axis, and the shorter of the first length and the second length as the minor axis.
  • the opacity parameter determination unit 421 is configured to set the opacity parameters of the pixels outside the light source model in the light source area to a first target constant value, where the first target constant value is used to represent full transparency; and set the opacity parameter of each pixel within the light source model according to the distance between that pixel and the center pixel of the light source model, where the smaller the distance between a pixel in the light source model and the central pixel, the smaller the value of the opacity parameter of that pixel.
  • the opacity parameter determination unit 421 is configured to set the opacity parameter of each pixel within the light source model according to the distance between that pixel and the center pixel of the light source model in the following manner: calculate the opacity parameter of each pixel within the light source model in the light source area from the distances a, b, a' and b', where A is the opacity parameter of the pixel, a is the distance from the pixel to the long axis or horizontal diameter of the light source model, b is the distance from the pixel to the short axis or vertical diameter of the light source model, a' is the distance from the pixel along the short axis direction or the vertical diameter direction to the boundary of the light source model, and b' is the distance from the pixel along the long axis direction or the horizontal diameter direction to the boundary of the light source model.
  • the light source area adjustment unit 422 is configured to obtain through the GPU, according to the original image texture in the light source area, the preselected light source texture in the light source area, and the opacity parameter of the light source area Display texture within the light source area.
  • the above-mentioned device further includes: a target processed image generation module, configured to, after the light source area adjustment unit obtains the display texture in the light source area according to the original image texture in the light source area, the preselected light source texture in the light source area, and the opacity parameter of the light source area, set the opacity parameter of the original image texture outside the light source area and the opacity parameter of the display texture in the light source area to a second target constant value, where the second target constant value is used to represent opacity; and generate the target processed image corresponding to the target image according to the original image texture outside the light source area, the opacity parameter of the original image texture outside the light source area, the display texture in the light source area, and the opacity parameter of the display texture in the light source area.
  • FIG. 17 is a schematic structural diagram of another image processing apparatus provided by an embodiment of the present application.
  • the light source area processing module 420 may include: an information processing unit 423 and an image updating unit 424 .
  • the light source model acquisition module 410 is configured to acquire a light source model of an optical image.
  • the light source model characterizes the light source shape and light source location in the optical image.
  • the information processing unit 423 is configured to obtain the information of the light source to be added according to the light source model and the preset mapping curve.
  • the light source information to be added represents the pixel added value of each pixel point in all the pixel points in the light source area of the optical image, and the light source area is an image area determined by the shape of the light source and the position of the light source.
  • the image updating unit 424 is configured to update the pixel value of each pixel in the light source area according to the light source information to be added, so as to obtain the target processing image.
  • the information processing unit 423 is configured to match the transparency degree information with the preset mapping curve to obtain the brightness information to be added in the light source area, where the transparency degree information represents the brightness information of the light source area, and the brightness information to be added represents the brightness added value of each pixel in the light source area; obtain, in response to a light source change requirement, the color information to be added in the light source area, where the color information to be added represents the color added value of each pixel in the light source area; and obtain the pixel added value of each pixel in the light source area of the optical image according to the brightness information to be added and the color information to be added.
  • the light source model acquisition module 410, the information processing unit 423 and the image updating unit 424 may cooperate to implement the image processing method of Embodiment 1 or Embodiment 5 and possible sub-steps of the method.
  • the image processing apparatus provided by the embodiment of the present application can execute the image processing method provided by any embodiment of the present application, and has functional modules corresponding to the execution method.
  • FIG. 18 is a schematic structural diagram of a computer device provided by an embodiment of the present application.
  • the computer device includes a processor 50, a memory 51, an input device 52 and an output device 53; the number of processors 50 in the computer device can be one or more, and one processor 50 is taken as an example in FIG. 18 ;
  • the processor 50, the memory 51, the input device 52 and the output device 53 in the computer equipment may be connected by a bus or in other ways. In FIG. 18, the connection by a bus is taken as an example.
  • the memory 51 can be used to store software programs, computer-executable programs, and modules, such as program instructions/modules corresponding to the image processing method in the embodiments of the present application (for example, the modules of the image processing apparatus shown in FIG. 15).
  • the processor 50 executes various functional applications and data processing of the computer device by running the software programs, instructions and modules stored in the memory 51 , ie, implements the above-mentioned image processing method.
  • the memory 51 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of computer equipment, and the like.
  • the memory 51 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
  • memory 51 may include memory located remotely from processor 50, which may be connected to a computer device through a network. Examples of such networks include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
  • the input device 52 may be configured to receive input numerical or character information and to generate key signal input related to user settings and function control of the computer device.
  • the output device 53 may include a display device such as a display screen.
  • FIG. 19 is a schematic structural diagram of an image processing device provided by an embodiment of the present application.
  • the image processing device 10 includes a memory 11 , a processor 12 and a communication interface 13 .
  • the memory 11 , the processor 12 and the communication interface 13 are directly or indirectly electrically connected to each other to realize data transmission or interaction. For example, these elements may be electrically connected to each other through one or more communication buses or signal lines.
  • the memory 11 may be configured to store software programs and modules, such as program instructions/modules corresponding to the image processing methods provided in the embodiments of the present application, and the processor 12 executes various functions by executing the software programs and modules stored in the memory 11. applications and data processing.
  • the communication interface 13 can be used for signaling or data communication with other node devices.
  • the image processing apparatus 10 may have a plurality of communication interfaces 13 in the present application.
• the memory 11 can be, but is not limited to, random access memory (Random Access Memory, RAM), read-only memory (Read-Only Memory, ROM), programmable read-only memory (Programmable Read-Only Memory, PROM), erasable programmable read-only memory (Erasable Programmable Read-Only Memory, EPROM), electrically erasable programmable read-only memory (Electrically Erasable Programmable Read-Only Memory, EEPROM), etc.
  • the processor 12 may be an integrated circuit chip with signal processing capability.
• the processor may be a general-purpose processor, including a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), etc.; it may also be a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc.
  • the image processing device 10 may also implement a display function through a GPU, a display screen, and an application processor.
  • the GPU is a microprocessor for image processing, which connects the display screen and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 12 may include one or more GPUs that execute program instructions to generate or alter display information.
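Because each output pixel depends only on the corresponding input pixels, the per-pixel math described above is exactly the kind of work a GPU parallelizes across fragments. The following is a hedged illustration using NumPy vectorization as a stand-in for the shader arithmetic; the alpha-blend formula is an assumption for illustration, not quoted from the text.

```python
import numpy as np


def blend_light_source(base: np.ndarray, texture: np.ndarray,
                       opacity: np.ndarray) -> np.ndarray:
    """Per-pixel alpha blend of a light source texture over a base image.

    base, texture: HxWx3 uint8 arrays; opacity: HxW floats in [0, 1].
    Every output pixel is computed independently of its neighbors, which
    is why a GPU fragment shader can evaluate all pixels in parallel.
    """
    a = opacity[..., None]  # broadcast HxW -> HxWx1 over the color channels
    out = (1.0 - a) * base.astype(np.float32) + a * texture.astype(np.float32)
    return np.clip(out, 0, 255).astype(np.uint8)
```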
• the above-mentioned image processing device 10 can be, but is not limited to, a mobile phone, a tablet computer, a wearable device, a vehicle-mounted device, an Augmented Reality (AR)/Virtual Reality (VR) device, a notebook computer, an ultra-mobile personal computer (Ultra-Mobile Personal Computer, UMPC), a netbook, a personal digital assistant (Personal Digital Assistant, PDA) or another terminal; the embodiments of the present application do not impose any restriction on the specific type of the image processing device.
  • the structures illustrated in the embodiments of the present application do not constitute a limitation on the image processing apparatus 10 .
• the image processing apparatus 10 may include more or fewer components than shown, or combine some components, or split some components, or arrange the components differently.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • Embodiments of the present application also provide a computer-readable storage medium storing a computer program, where the computer program is used to execute an image processing method when executed by a computer processor.
• the method includes: acquiring a light source model of an optical image; and processing the light source area of the optical image according to the light source model.
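Read end to end, the two steps named above admit a very small reference sketch. The following is a minimal, assumption-laden illustration: the file names, helper names, threshold and gain are hypothetical, and bright-pixel thresholding merely stands in for the unspecified model-acquisition step.

```python
import numpy as np
from PIL import Image


def acquire_light_source_model(img: np.ndarray, threshold: float = 230.0) -> np.ndarray:
    # Assumption: a binary bright-pixel mask stands in for the light source
    # model; real implementations may encode light source shape and position.
    return (img.astype(np.float32).mean(axis=2) >= threshold).astype(np.float32)


def process_light_source_area(img: np.ndarray, model: np.ndarray,
                              gain: float = 48.0) -> np.ndarray:
    # Brighten only the pixels inside the light source area, then clamp.
    out = img.astype(np.float32) + gain * model[..., None]
    return np.clip(out, 0, 255).astype(np.uint8)


if __name__ == "__main__":
    optical = np.asarray(Image.open("input.png").convert("RGB"))  # hypothetical file
    result = process_light_source_area(optical, acquire_light_source_model(optical))
    Image.fromarray(result).save("output.png")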
  • the computer-readable storage medium storing the computer program provided by the embodiment of the present application is not limited to the above method operations, and can also perform related operations in the image processing method provided by any embodiment of the present application.
• the present application can be implemented by means of software plus general-purpose hardware, and can also be implemented by hardware. Based on this understanding, the technical solution of the present application can be embodied in the form of a software product, and the computer software product can be stored in a computer-readable storage medium, such as a floppy disk, a ROM, a RAM, a flash memory (FLASH), a hard disk or an optical disc of a computer, and includes a plurality of instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods of the various embodiments of the present application.
• the multiple units and modules included above are only divided according to functional logic, but the division is not limited to the one described, as long as the corresponding functions can be realized; the names of the multiple functional units are only for the convenience of distinguishing them from each other, and are not intended to limit the protection scope of the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Generation (AREA)

Abstract

According to various embodiments, the present invention discloses an image processing method and apparatus, a device and a medium. The method comprises the steps of: obtaining a light source model of a target image; and processing a light source area of the target image according to the light source model.
PCT/CN2021/104709 2020-07-07 2021-07-06 Image processing method and apparatus, device and medium WO2022007787A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN202010647193.5A CN113920299A (zh) 2020-07-07 2020-07-07 Image processing method, apparatus, device and medium
CN202010647193.5 2020-07-07
CN202011043175.2A CN112153303B (zh) 2020-09-28 2020-09-28 Visual data processing method and apparatus, image processing device and storage medium
CN202011043175.2 2020-09-28

Publications (1)

Publication Number Publication Date
WO2022007787A1 true WO2022007787A1 (fr) 2022-01-13

Family

ID=79553617

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/104709 2020-07-07 2021-07-06 Image processing method and apparatus, device and medium WO2022007787A1 (fr)

Country Status (1)

Country Link
WO (1) WO2022007787A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106464816A (zh) * 2014-06-18 2017-02-22 Canon Inc. Image processing device and image processing method thereof
CN107197171A (zh) * 2017-06-22 2017-09-22 Southwest University Digital photographing processing method with an intelligent software light source added
CN109345602A (zh) * 2018-09-28 2019-02-15 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image processing method and apparatus, storage medium, and electronic device
US20200204775A1 * 2018-12-19 2020-06-25 Canon Kabushiki Kaisha Image processing apparatus, image capturing apparatus, image processing method, and storage medium
CN112153303A (zh) * 2020-09-28 2020-12-29 Guangzhou Huya Technology Co., Ltd. Visual data processing method and apparatus, image processing device and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114119853A (zh) * 2022-01-26 2022-03-01 Tencent Technology (Shenzhen) Co., Ltd. Image rendering method, apparatus, device and medium
WO2023142607A1 (fr) 2022-01-26 2023-08-03 Tencent Technology (Shenzhen) Co., Ltd. Image rendering method and apparatus, device and medium

Similar Documents

Publication Publication Date Title
CN102254340B (zh) GPU acceleration-based ambient occlusion image rendering method and ***
CN110490896B (zh) Video frame image processing method and apparatus
US10719920B2 (en) Environment map generation and hole filling
CN111145135B (zh) Image disturbance removal processing method, apparatus, device and storage medium
US20200302579A1 (en) Environment map generation and hole filling
CN112153303B (zh) Visual data processing method and apparatus, image processing device and storage medium
CN112802170B (zh) Illumination image generation method, apparatus, device and medium
EP4261784A1 (fr) Artificial intelligence-based image processing method and apparatus, electronic device, computer-readable storage medium and computer program product
CN104157005A (zh) Image-based HDR illumination rendering method
CN114638950A (zh) Method and device for drawing a virtual object shadow
WO2022007787A1 (fr) Image processing method and apparatus, device and medium
CN110177287A (zh) Image processing and live streaming method, apparatus, device and storage medium
CA3199390A1 (fr) Systems and methods for rendering virtual objects using estimation of editable lighting parameters
CN109427089B (zh) 基于环境光照条件的混合现实对象呈现
TWI808321B (zh) Object transparency changing method applied to picture display, and physical object projector
US10424236B2 (en) Method, apparatus and system for displaying an image having a curved surface display effect on a flat display panel
CN115526976A (zh) Virtual scene rendering method and apparatus, storage medium and electronic device
TWI678927B (zh) Method for dynamically adjusting image clarity and image processing device thereof
CN113920299A (zh) Image processing method, apparatus, device and medium
WO2022132153A1 (fr) Gating of contextual attention and convolutional features
US20230316640A1 (en) Image processing apparatus, image processing method, and storage medium
CN114782616B (zh) Model processing method, apparatus, storage medium and electronic device
US20230410406A1 (en) Computer-readable non-transitory storage medium having image processing program stored therein, image processing apparatus, image processing system, and image processing method
US20230317023A1 (en) Local dimming for artificial reality systems
CN115484504A (zh) Image display method, apparatus, electronic device and storage medium

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21838902

Country of ref document: EP

Kind code of ref document: A1