CN111343472B - Image processing effect adjusting method, device, equipment and medium - Google Patents


Info

Publication number
CN111343472B
CN111343472B
Authority
CN
China
Prior art keywords
parameter
image
effect image
effect
client
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010108542.6A
Other languages
Chinese (zh)
Other versions
CN111343472A (en)
Inventor
郑俊明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202010108542.6A
Publication of CN111343472A
Application granted
Publication of CN111343472B
Legal status: Active (current)
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21 Server components or server architectures
    • H04N21/218 Source of audio or video content, e.g. local disk arrays
    • H04N21/2187 Live feed
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/77 Retouching; Inpainting; Scratch removal
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60 Network streaming of media packets
    • H04L65/75 Media network packet handling
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/238 Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
    • H04N21/2387 Stream processing in response to a playback request from an end-user, e.g. for trick-play
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402 Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60 Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N21/65 Transmission of management data between client and server
    • H04N21/658 Transmission by the client directed to the server
    • H04N21/6587 Control parameters, e.g. trick play commands, viewpoint selection

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

The method for adjusting the image processing effect provided in the present embodiment may be used in Computer Vision (CV), and includes: the client processes the first effect image through a first operation parameter in the image processing operation to obtain a second effect image; the client sends the second effect image to the server; the server generates an adjustment instruction according to the second effect image; the client receives the adjustment instruction sent by the server and adjusts the first operation parameter according to the adjustment instruction to obtain a second operation parameter; and the client processes the first effect image through the second operation parameter to obtain a third effect image. The application also discloses a corresponding apparatus, device and medium. By acquiring the effect images of the intermediate links in the image processing process, the influence of the operation parameters of each link on the image processing operation can be determined accurately, which improves both the accuracy and the efficiency of adjusting the image processing effect.

Description

Image processing effect adjusting method, device, equipment and medium
Technical Field
The present invention relates to the field of electronic technologies, and in particular, to a method, an apparatus, a device, and a medium for adjusting an image processing effect.
Background
Computer Vision (CV) is a science that studies how to make machines "see"; more specifically, it uses cameras and computers in place of human eyes to perform machine vision tasks such as recognition, tracking and measurement on a target, and further performs graphic processing so that the result becomes an image more suitable for human observation or for transmission to an instrument for detection. Computer vision techniques typically include image processing, image recognition, image semantic understanding, image retrieval, OCR, video processing, video semantic understanding, video content/behavior recognition, three-dimensional object reconstruction, 3D techniques, virtual reality, augmented reality, and simultaneous localization and mapping, as well as common biometric recognition techniques such as face recognition and fingerprint recognition.
CV technology can be used for real-time video processing. For example, during a live video broadcast, the live portrait pictures are beautified in real time: before the broadcast starts, the user can adjust the rendering parameters of the beautification effect, such as whitening, skin grinding and face thinning, and during the broadcast the CV technology renders the video in real time according to the beautification effect preset by the user, so that the video is beautified in real time.
In the prior art, when a problem occurs in the live video picture, it can only be investigated through the currently displayed picture. Because the live picture is obtained through a plurality of processing links, it is impossible to locate which processing link caused the problem, which affects the efficiency of adjusting the image processing.
Disclosure of Invention
In view of the above, the technical solution provided by the present invention is as follows:
a method of adjusting an image processing effect, comprising:
processing the first effect image through a first operation parameter in image processing operation to obtain a second effect image, wherein the first operation parameter is an operation parameter of one of at least two processing links in the image processing process;
transmitting the second effect image to a server;
receiving an adjustment instruction sent by the server, wherein the adjustment instruction is an instruction generated by the server according to the second effect image, and the adjustment instruction comprises a second operation parameter of the image processing operation;
adjusting the first operation parameter according to the adjustment instruction to obtain the second operation parameter;
and processing the first effect image through the second operation parameter to obtain a third effect image.
An apparatus for adjusting an image processing effect, comprising:
the first processing unit is used for processing the first effect image through a first operation parameter in the image processing operation to obtain a second effect image, wherein the first operation parameter is an operation parameter of one of at least two processing links in the image processing process;
a sending unit, configured to send the second effect image obtained by the processing of the first processing unit to a server;
a receiving unit, configured to receive an adjustment instruction sent by the server, where the adjustment instruction is an instruction generated by the server according to the second effect image sent by the sending unit, and the adjustment instruction includes a second operation parameter of the image processing operation;
the adjusting unit is used for adjusting the first operation parameter according to the adjusting instruction received by the receiving unit to obtain the second operation parameter;
and the second processing unit is used for processing the first effect image through the second operation parameter adjusted by the adjusting unit, so as to obtain a third effect image.
Optionally, the adjusting unit is further configured to:
acquiring a first parameter value of the first operation parameter;
adjusting the first parameter value according to the adjusting instruction to obtain a second parameter value;
and acquiring the second operation parameter according to the second parameter value, wherein the second parameter value is the parameter value of the second operation parameter.
Optionally, the apparatus further comprises an acquisition unit for:
acquiring an acquisition request sent by the server;
the first processing unit is further configured to:
and acquiring the second effect image according to the acquisition request.
Optionally, the sending unit is further configured to:
sending the second effect image to the server in real time after the second effect image is acquired;
or,
packaging and sending the second effect image and the images output by the other processing links to the server after all the processing links in the image processing process have been executed.
A method of adjusting an image processing effect, comprising:
acquiring a second effect image sent by a client, wherein the second effect image is an image obtained by the client after processing the first effect image through a first operation parameter, and the first operation parameter is an operation parameter of one of at least two processing links in the image processing process of the client;
generating an adjustment instruction according to the second effect image, wherein the adjustment instruction comprises a second operation parameter of the image processing operation;
and sending the adjusting instruction to the client so that the client processes the first effect image according to the second operation parameter to obtain a third effect image.
An apparatus for adjusting an image processing effect, comprising:
the system comprises an acquisition unit, a processing unit and a processing unit, wherein the acquisition unit is used for acquiring a second effect image sent by a client, the second effect image is an image obtained by the client after processing the first effect image through a first operation parameter, and the first operation parameter is an operation parameter of one of at least two processing links in the image processing process of the client;
a generation unit configured to generate an adjustment instruction according to the second effect image acquired by the acquisition unit, the adjustment instruction including a second operation parameter of the image processing operation;
and the sending unit is used for sending the adjusting instruction generated by the generating unit to the client so that the client can process the first effect image according to the second operation parameter to obtain a third effect image.
Optionally, the generating unit is further configured to:
acquiring a second display parameter of the second effect image, wherein the second display parameter is used for representing the display effect of the second effect image;
acquiring a target difference value of the second display parameter deviating from a target display parameter, wherein the target display parameter is a preset parameter;
and acquiring the adjusting instruction according to the target difference value.
Optionally, the sending unit is further configured to:
and sending an acquisition request to the client, wherein the acquisition request is used for requesting the client to send the image output by the intermediate operation in the image processing process.
Optionally, the apparatus further comprises a determining unit, configured to:
the server determines a target client from at least two clients, wherein the target client is one of the at least two clients;
the transmitting unit is further configured to:
and sending the acquisition request to the target client.
A computer device, comprising: an interaction device, an input/output (I/O) interface, a processor, and a memory, wherein the memory stores program instructions; the interaction device is configured to acquire an operation instruction input by a user; and the processor is configured to execute the program instructions stored in the memory to perform any one of the methods described above.
A computer-readable storage medium comprising instructions which, when run on a computer device, cause the computer device to perform any one of the methods described above.
The method for adjusting the image processing effect provided in the embodiment includes: the client processes the first effect image through a first operation parameter in image processing operation to obtain a second effect image, wherein the first operation parameter is an operation parameter of one of at least two processing links in the image processing process; the client sends the second effect image to the server; the server generates an adjusting instruction according to the second effect image; the client receives an adjusting instruction sent by the server, and adjusts the first operation parameter according to the adjusting instruction to obtain a second operation parameter; and the client processes the first effect image through the second operation parameter to obtain a third effect image. The effect image of the middle link in the image processing process is acquired, so that the influence of the operation parameters of each link on the image processing operation can be accurately known, the accuracy of the adjustment of the image processing effect is improved, and the efficiency of the adjustment of the image processing effect is improved.
The method for adjusting the image processing effect provided in the embodiment includes: acquiring a second effect image sent by a client, wherein the second effect image is an image obtained by the client after processing the first effect image through a first operation parameter, and the first operation parameter is an operation parameter of one of at least two processing links in the image processing process of the client; generating an adjustment instruction according to the second effect image, wherein the adjustment instruction comprises a second operation parameter of the image processing operation; and sending an adjustment instruction to the client so that the client processes the first effect image according to the second operation parameter to obtain a third effect image. The effect image of the middle link in the image processing process is acquired, so that the influence of the operation parameters of each link on the image processing operation can be accurately known, the accuracy of the adjustment of the image processing effect is improved, and the efficiency of the adjustment of the image processing effect is improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required to be used in the embodiments or the description of the prior art will be briefly described below, and it is obvious that the drawings in the following description are only embodiments of the present invention, and that other drawings can be obtained according to the provided drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of an embodiment of a method for adjusting an image processing effect according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of another embodiment of a method for adjusting an image processing effect according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of another embodiment of a method for adjusting an image processing effect according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of another embodiment of a method for adjusting an image processing effect according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of another embodiment of a method for adjusting an image processing effect according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of a specific implementation manner of a method for adjusting an image processing effect according to an embodiment of the present disclosure;
FIG. 7 is a schematic diagram of another embodiment of a method for adjusting an image processing effect according to an embodiment of the present disclosure;
FIG. 8a is a view of a usage scenario of a method for adjusting an image processing effect according to an embodiment of the present disclosure;
FIG. 8b is a view of a prior art use scenario of an image processing effect;
FIG. 9 is a schematic diagram of another embodiment of a method for adjusting an image processing effect according to an embodiment of the present disclosure;
FIG. 10 is a schematic diagram of another embodiment of a method for adjusting an image processing effect according to the embodiments of the present application;
FIG. 11 is a schematic diagram of a computer device according to an embodiment of the present application;
FIG. 12 is a schematic diagram of an apparatus for adjusting an image processing effect according to an embodiment of the present application;
FIG. 13 is a schematic diagram of an apparatus for adjusting an image processing effect according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims of this application and in the above-described figures, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments described herein may be implemented in other sequences than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Computer Vision (CV) is a science that studies how to make machines "see"; more specifically, it uses cameras and computers in place of human eyes to perform machine vision tasks such as recognition, tracking and measurement on a target, and further performs graphic processing so that the result becomes an image more suitable for human observation or for transmission to an instrument for detection. Computer vision techniques typically include image processing, image recognition, image semantic understanding, image retrieval, OCR, video processing, video semantic understanding, video content/behavior recognition, three-dimensional object reconstruction, 3D techniques, virtual reality, augmented reality, and simultaneous localization and mapping, as well as common biometric recognition techniques such as face recognition and fingerprint recognition.
CV technology can be used for real-time video processing. For example, during a live video broadcast, the live portrait pictures are beautified in real time: before the broadcast starts, the user can adjust the rendering parameters of the beautification effect, such as whitening, skin grinding and face thinning, and during the broadcast the CV technology renders the video in real time according to the beautification effect preset by the user, so that the video is beautified in real time.
At present, if a problem occurs in the live video picture, it can only be examined through the final display picture. Because the live picture is obtained through a plurality of processing links, it is impossible to locate which specific processing link has the problem, which affects the efficiency of adjusting the image processing.
In order to solve the above problems, the embodiments of the present application provide a method for adjusting an image processing effect, which can grasp a processing condition of each processing link by acquiring an image output by an intermediate link in an image processing process, so as to accurately adjust the image processing effect.
It should be noted that the method provided by the embodiments of the present application may be applied to various image processing scenes, for example, rendering a beautification effect on a video, adding animated special effects to a video, or other video rendering, where the video may be a real-time live video or a recorded video. Optionally, the method provided in the embodiments of the present application may also be applied to still-image processing; this is not limited by the embodiments of the present application. For ease of understanding, the specific implementations of the embodiments of the present application mainly take the rendering of a beautification effect in a video as an example, which does not constitute a limitation on the usage scenarios of the present application.
The following describes in detail a method for adjusting an image processing effect provided in an embodiment of the present application with reference to the accompanying drawings.
Referring to fig. 1, as shown in fig. 1, an embodiment of a method for adjusting an image processing effect according to an embodiment of the present application includes the following steps.
101. The client processes the first effect image through a first operation parameter in the image processing operation to obtain a second effect image.
In this embodiment, the first operation parameter is an operation parameter of one of at least two processing links in the image processing process, and the image processing operation may be a video processing operation or a picture processing operation. For an image processing operation that includes a plurality of processing links, this step obtains the image output after each link is processed. For example, the beautification operation in video processing includes three steps of whitening, skin grinding and face thinning, and each step corresponds to its own operation parameter. After an original image (i.e. a first effect image) is obtained, the original image is rendered through the whitening operation parameter (the first operation parameter) to obtain a whitening effect image (a second effect image); the whitening effect image is then input into the next link for the skin grinding operation in the same way. For ease of understanding, the embodiment of the present application is described with reference to only the whitening step, and step 101 is used to obtain the whitening effect image.
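It should be noted that the embodiments of the present application do not provide source code for such a pipeline. The following is only a minimal sketch, under the assumption of a client written in Kotlin, of how a multi-link image processing operation can be organized so that each link holds its own operation parameter and outputs an intermediate effect image; the class names (EffectImage, ProcessingLink, ImagePipeline) are illustrative assumptions and do not limit the implementation.

```kotlin
// Illustrative sketch only: names and types are assumptions, not the claimed implementation.
data class EffectImage(val pixels: ByteArray, val width: Int, val height: Int)

// One processing link (e.g. whitening) holding its own operation parameter.
class ProcessingLink(val name: String, var operationParameter: Float) {
    // Processes the input effect image with the current operation parameter and
    // returns the intermediate effect image output by this link.
    fun process(input: EffectImage): EffectImage {
        // A real link would run a rendering pass here (e.g. an OpenGL shader);
        // the sketch simply forwards the image unchanged.
        return input.copy()
    }
}

// An image processing operation with at least two links: the output of one link
// (the "second effect image" of step 101) becomes the input of the next.
class ImagePipeline(private val links: List<ProcessingLink>) {
    fun run(original: EffectImage): List<Pair<String, EffectImage>> {
        var current = original
        return links.map { link ->
            current = link.process(current)
            link.name to current   // intermediate effect image captured per link
        }
    }
}

fun main() {
    val pipeline = ImagePipeline(
        listOf(
            ProcessingLink("skin grinding", 0.5f),
            ProcessingLink("whitening", 0.5f),
            ProcessingLink("face thinning", 0.5f),
        )
    )
    val original = EffectImage(ByteArray(4), width = 1, height = 1)
    pipeline.run(original).forEach { (name, _) -> println("link '$name' produced an effect image") }
}
```

In such a structure, the second effect image of any link is simply the value returned by that link's process call, which is the image captured in step 101.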
Optionally, the second effect image may be a video clip or one or more video screenshots; preferably, the second effect image is a screenshot of the whitened video picture.
102. The client sends the second effect image to the server.
In this embodiment, the second effect image is an image of an intermediate processing step of the client, and in this embodiment, the client sends the second effect image to the server, so that the server can know the processing effect of the first operation parameter on the image.
103. The server generates an adjustment instruction according to the second effect image.
In this embodiment, the adjustment instruction includes a second operation parameter of the image processing operation. After evaluating the second effect image, the server can determine the processing effect of the first operation parameter on the image. When that processing effect is not ideal, the server generates the adjustment instruction, and the second operation parameter contained in it is the parameter obtained after adjusting the first operation parameter. For example, if the skin color of the person in the first whitening effect image, obtained by processing the original image with the first whitening parameter, is over-white, an adjustment instruction is generated to adjust the first whitening parameter to a second whitening parameter.
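As an illustrative assumption only, the adjustment instruction may be modeled as a small message carrying the identifier of the link to adjust and the second operation parameter; the field names, the whitening example and the correction rule below are not prescribed by the embodiments of the present application.

```kotlin
// Illustrative assumption: an adjustment instruction carries the link to adjust and
// the second operation parameter obtained after adjusting the first one.
data class AdjustmentInstruction(
    val linkName: String,                // e.g. "whitening"
    val secondOperationParameter: Float, // parameter the client should switch to
)

// Server-side decision for the whitening example: if the measured brightness of the
// second effect image overshoots the target, the parameter is reduced accordingly.
// The proportional correction and the 0..255 brightness scale are assumptions.
fun evaluateWhitening(
    firstWhiteningParameter: Float,
    measuredBrightness: Float,
    targetBrightness: Float,
): AdjustmentInstruction? {
    if (measuredBrightness <= targetBrightness) return null  // effect acceptable, no instruction
    val second = firstWhiteningParameter + (targetBrightness - measuredBrightness) / 255f
    return AdjustmentInstruction("whitening", second.coerceIn(0f, 1f))
}
```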
104. The server sends an adjustment instruction to the client.
In this embodiment, the server sends the generated adjustment instruction to the client, so that the client can execute the adjustment instruction generated by the server, and adjustment of the server on the work of the client is achieved.
105. And the client adjusts the first operation parameter according to the adjustment instruction to obtain a second operation parameter.
In this embodiment, the adjustment instruction includes the second operation parameter and an operation command for switching the first operation parameter to the second operation parameter. Accordingly, the client adjusts the first operation parameter used for image processing in the first link to obtain the second operation parameter, thereby updating the processing operation of that link.
106. And the client processes the first effect image through the second operation parameter to obtain a third effect image.
In this embodiment, the first operation parameter of the first link is adjusted to the second operation parameter, and the first effect image is then reprocessed through the second operation parameter to obtain the third effect image. Because the operation parameter has been adjusted, the display effect of the third effect image is better than that of the second effect image. For example, the second whitening parameter generated after the server's adjustment is used to perform the whitening rendering operation on the original image, and the resulting second whitening effect image is better than the first whitening effect image, so that the image processing effect of the whitening link is adjusted accurately.
The method for adjusting the image processing effect provided in the embodiment includes: the client processes the first effect image through a first operation parameter in image processing operation to obtain a second effect image, wherein the first operation parameter is an operation parameter of one of at least two processing links in the image processing process; the client sends the second effect image to the server; the server generates an adjusting instruction according to the second effect image; the client receives an adjusting instruction sent by the server, and adjusts the first operation parameter according to the adjusting instruction to obtain a second operation parameter; and the client processes the first effect image through the second operation parameter to obtain a third effect image. The effect image of the middle link in the image processing process is acquired, so that the influence of the operation parameters of each link on the image processing operation can be accurately known, the accuracy of the adjustment of the image processing effect is improved, and the efficiency of the adjustment of the image processing effect is improved.
It should be noted that, since the method for adjusting the image processing effect provided by the embodiments of the present application occupies certain computing resources and signal transmission resources, it may remain disabled while the image processing effect is displayed normally and be started only when needed. For this purpose, the embodiments of the present application provide the following specific implementation.
Referring to fig. 2, as shown in fig. 2, an embodiment of a method for adjusting an image processing effect provided in an embodiment of the present application includes the following steps.
201. The server sends an acquisition request to the client.
In this embodiment, the acquisition request is used to request the client to send the images output by intermediate operations in the image processing process. Optionally, the acquisition request may be generated by the server based on a request message sent by the client, generated periodically by the server according to a preset time, or generated when the server detects that a final image is abnormal, where the final image is the image finally generated by the client after all image processing links have been executed. The embodiments of the present application are not limited in this respect.
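A minimal sketch of such an acquisition request, under the assumption that it records the target client and which of the three trigger conditions produced it, is given below; the type and field names are illustrative only.

```kotlin
// Illustrative assumption: the acquisition request records the target client and
// which of the three trigger conditions produced it.
enum class AcquisitionTrigger { CLIENT_REQUEST, PERIODIC, FINAL_IMAGE_ANOMALY }

data class AcquisitionRequest(
    val targetClientId: String,      // e.g. the live room ID of the client
    val trigger: AcquisitionTrigger,
)

// Example: a request generated on the server's own preset schedule.
fun buildPeriodicRequest(clientId: String): AcquisitionRequest =
    AcquisitionRequest(clientId, AcquisitionTrigger.PERIODIC)
```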
202. And the client acquires the second effect image according to the acquisition request.
In this embodiment, the second effect image is an image obtained by processing the first effect image with the first operation parameter in the image processing operation. In the ordinary working process, the client sends the output of the first operation parameter directly to the next link for the image processing of that link. When receiving the acquisition request, the client takes a screenshot of, or records a segment of, the image output by the first operation parameter to obtain the second effect image. For other details, refer to step 101 above, which is not repeated here.
The subsequent steps 203 to 207 can be referred to the above steps 102 to 106, and will not be described here again.
In this embodiment, the client starts to execute the method for adjusting the image processing effect provided by the embodiments of the present application only upon the acquisition request sent by the server, so that the timing of invoking the method during the image processing process can be controlled, which improves the flexibility of the method for adjusting the image processing effect provided by the embodiments of the present application.
It should be noted that, because the adjustment instruction generated by the server adjusts the operation parameters of each link, the client needs to adjust the operation parameters according to the adjustment instruction after receiving the adjustment instruction, and this is described in detail by taking one link in the image processing process as an example.
Referring to fig. 3, as shown in fig. 3, an embodiment three of a method for adjusting an image processing effect provided in an embodiment of the present application includes the following steps.
Steps 301 to 303 can be referred to above in steps 201 to 203, and will not be described here again.
304. The server obtains a second display parameter of the second effect image.
In this embodiment, the second display parameter is used to represent the display effect of the second effect image, for example, if the second effect image is a whitened image, the second display parameter is used to represent the brightness value of the whitened image. Therefore, the server can judge the rendering effect of the second effect image according to the second display parameter.
Alternatively, the specific implementation manner of the server to obtain the second display parameter according to the second effect image may be implemented by using an image processing model provided by the AI, where the image processing model may be any model capable of obtaining the image display parameter in the prior art, and the embodiment of the present application is not limited.
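Since the embodiments of the present application do not fix a concrete model for obtaining the display parameter (an AI image processing model may be used, as noted above), the following stand-in, which simply averages the luminance of the RGBA pixels of the effect image and uses the result as the second display parameter (a brightness value), is given purely as an assumed example.

```kotlin
// Assumed stand-in for the display-parameter model: the second display parameter is
// taken as the average luma (brightness value) of the RGBA pixels of the effect image.
fun brightnessDisplayParameter(rgba: ByteArray): Float {
    var sum = 0.0
    var count = 0
    var i = 0
    while (i + 3 < rgba.size) {
        val r = rgba[i].toInt() and 0xFF
        val g = rgba[i + 1].toInt() and 0xFF
        val b = rgba[i + 2].toInt() and 0xFF
        sum += 0.299 * r + 0.587 * g + 0.114 * b  // BT.601 luma weights
        count++
        i += 4
    }
    return if (count == 0) 0f else (sum / count).toFloat()
}
```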
305. The server obtains a target difference value of the second display parameter deviating from the target display parameter.
In this embodiment, the target display parameter is a preset parameter, and by making a difference between the second display parameter and the target display parameter, a difference value of the second display parameter deviating from the target display parameter can be obtained, and the difference value is obtained as a target difference value.
Alternatively, the target display parameters may be acquired specifically in the following ways.
1. An image processing model is preset in the server; the model can derive an optimal display parameter according to factors such as the environment and lighting in the second effect image, and the server sets this display parameter as the target display parameter.
2. A correspondence between the client's shooting conditions and target display parameters is preset in the server. Before the method starts to be executed, the client sends its shooting condition parameters to the server, and the server determines the target display parameters corresponding to those shooting condition parameters (a minimal sketch of such a correspondence is given after this list). Optionally, an image processing model is preset in the client, so that the photographed image can be identified and the shooting condition parameters of the client can be generated. Alternatively, the shooting condition parameters are input manually by the user: for example, the current shooting environment is indoor and the lighting is warm-colored; the client provides such options through its operation interface, the user makes a selection through the human-computer interaction interface of the client, and the client generates the shooting condition parameters according to the user's selection.
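The sketch below illustrates, under assumed condition keys and assumed target brightness values, how the preset correspondence of way 2 and the target difference of step 305 might be represented; it is not the actual table used by the server.

```kotlin
// Assumed representation of way 2: a preset correspondence between the client's
// shooting condition parameters and target display parameters (brightness targets).
// The condition keys and the numeric targets are illustrative assumptions.
data class ShootingConditions(val environment: String, val lighting: String)

val targetDisplayParameterTable: Map<ShootingConditions, Float> = mapOf(
    ShootingConditions("indoor", "warm") to 150f,
    ShootingConditions("indoor", "cool") to 160f,
    ShootingConditions("outdoor", "daylight") to 170f,
)

fun targetDisplayParameter(conditions: ShootingConditions, fallback: Float = 160f): Float =
    targetDisplayParameterTable[conditions] ?: fallback

// Target difference of step 305: the signed amount by which the second display
// parameter must change to reach the target (the sign convention is an assumption).
fun targetDifference(secondDisplayParameter: Float, target: Float): Float =
    target - secondDisplayParameter
```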
306. And the server acquires the adjusting instruction according to the target difference value.
In this embodiment, the second display parameter is the starting point of the adjustment in the adjustment instruction, and the target difference is the amplitude of the adjustment relative to the second display parameter, so that the client can make a corresponding adjustment based on the adjustment instruction.
307. The server sends an adjustment instruction to the client.
In this embodiment, the step 104 is referred to above, and will not be described herein.
308. The client obtains a first parameter value of the first operating parameter.
In this embodiment, the first parameter value is the parameter value of the first operation parameter used for processing the display effect of the first effect image.
309. And the client adjusts the first parameter value according to the adjusting instruction to obtain a second parameter value.
310. And the client acquires a second operation parameter according to the second parameter value.
In this embodiment, the second parameter value is a parameter value of a second operation parameter, that is, the first parameter value is added to the target difference value in the adjustment instruction, so as to obtain the second parameter value, thereby obtaining the second operation parameter.
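As a sketch of steps 308 to 310, and only as an assumption about how the arithmetic might look, the client may derive the second parameter value by adding the target difference carried in the adjustment instruction to the first parameter value:

```kotlin
// Sketch of steps 308 to 310: the client adds the target difference carried in the
// adjustment instruction to the first parameter value to obtain the second one.
// The sketch assumes the difference is already expressed on the parameter's own scale,
// and the clamping range 0..1 is likewise an assumption.
fun adjustOperationParameter(firstParameterValue: Float, targetDifference: Float): Float =
    (firstParameterValue + targetDifference).coerceIn(0f, 1f)
```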
Optionally, the first operation parameter may also be replaced with the second operation parameter in another way: the adjustment instruction sent by the server directly carries the second operation parameter, and the client replaces the first operation parameter with the second operation parameter after receiving the adjustment instruction. In this case, what the server derives from the target display parameter is the second operation parameter itself.
311. The client processes the first effect image through the second operation parameter to obtain a third effect image.
In this embodiment, the step can refer to the step 106, and will not be described herein.
In this embodiment, the server obtains the adjustment instruction according to the second effect image and sends the adjustment instruction to the client, and after receiving the adjustment instruction, the client adjusts the first operation parameter according to the adjustment instruction to obtain the second operation parameter, thereby realizing the update of the image processing operation.
It should be noted that, taking a live video scene as an example, more than one client is usually connected to the server at the same time in actual operation, and the server needs to provide services for multiple clients. Therefore, the server needs to distinguish for which client the method for adjusting the image processing effect provided by the embodiments of the present application should currently be invoked.
Referring to fig. 4, as shown in fig. 4, an embodiment four of the method for adjusting an image processing effect provided in the embodiment of the present application includes the following steps.
401. The server determines a target client from the at least two clients.
In this embodiment, the target client is one of the at least two clients. Optionally, the server may identify the target client through the client's ID; for example, an administrator of the server may manually input the ID of a client whose image processing effect is to be adjusted, or the server may select the target client according to a preset rule, so as to ensure that the image processing effect of that client reaches an ideal state. The embodiments of the present application are not limited in this respect.
402. The server sends an acquisition request to the target client.
In this embodiment, after determining the target client, the server transmits the acquisition request only to the target client, and does not transmit the acquisition request to other clients.
403. And the client acquires the second effect image according to the acquisition request.
In this embodiment, the step 202 is referred to above, and will not be described herein.
After the client acquires the second effect image, it may send the second effect image to the server immediately to implement real-time adjustment, in which case step 404 described below is executed; or, after all the processing links have been executed, all the intermediate pictures may be packaged and sent to the server, in which case step 405 described below is executed.
404. And the client side acquires the second effect image and then sends the second effect image to the server in real time.
In this embodiment, the client sends the second effect image to the server in real time as soon as it is acquired, so that the server can adjust the image processing effect in real time. This synchronous sending has higher processing efficiency, but has the disadvantage of occupying more computing and communication resources.
405. After all processing links in the image processing process are executed, the client packages and sends the second effect image and images output by other processing links to the server.
In this embodiment, after all the processing links have been executed, the screenshots produced after each link is processed are packaged and sent to the server, and the server adjusts the operation parameters of each link in a unified manner, which realizes asynchronous sending.
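The two sending modes can be contrasted with a short sketch; it reuses the EffectImage and ProcessingLink types assumed in the earlier pipeline sketch, and the Uploader interface is likewise an assumption standing in for whatever transport the client actually uses.

```kotlin
// The Uploader interface stands in for the client's actual transport to the server.
interface Uploader {
    fun upload(linkName: String, image: EffectImage)
}

// Step 404: synchronous mode, each intermediate effect image is sent as soon as it is produced.
fun runWithRealtimeUpload(
    links: List<ProcessingLink>, original: EffectImage, uploader: Uploader,
): EffectImage {
    var current = original
    for (link in links) {
        current = link.process(current)
        uploader.upload(link.name, current)  // sent immediately after this link
    }
    return current
}

// Step 405: asynchronous mode, intermediate images are collected and sent packaged at the end.
fun runWithPackagedUpload(
    links: List<ProcessingLink>, original: EffectImage, uploader: Uploader,
): EffectImage {
    var current = original
    val intermediates = mutableListOf<Pair<String, EffectImage>>()
    for (link in links) {
        current = link.process(current)
        intermediates += link.name to current
    }
    intermediates.forEach { (name, image) -> uploader.upload(name, image) }  // one packaged send
    return current
}
```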
The subsequent steps 406 to 409 can be referred to the above steps 103 to 106, and will not be described here again.
In this embodiment, for the second effect image, by setting two optional embodiments of synchronous transmission and asynchronous transmission, the method for adjusting the image processing effect provided in this embodiment of the present application has better universality, and is applicable under different network conditions and computing conditions.
The following describes in detail the method for adjusting the image processing effect provided by the embodiment of the application in combination with a real-time beauty working scene in the live video process.
Referring to fig. 5, as shown in fig. 5, a fifth embodiment of the method for adjusting an image processing effect provided in the embodiments of the present application includes the following steps.
501. The server determines a target client from the at least two clients.
In this embodiment, the server may also be connected to an administrator client, which runs an operation interface as shown in FIG. 6. The administrator may input a room ID in the room ID input window 601 in FIG. 6, where the room ID is the ID of a client, and the administrator client sends the client ID to the server, so that the server obtains the target client.
Alternatively, the user (e.g., a network anchor) may initiate a request to the server from the client side, and the server regards the client that initiated the request as the target client.
502. The server sends an acquisition request to the target client.
In this embodiment, the step 402 is referred to above, and will not be described herein.
503. The client processes the first effect image through a first operation parameter in the image processing operation to obtain a second effect image.
In this embodiment, the specific implementation of this step may refer to step 101, which is not repeated here. As shown in FIG. 7, the image processing process of real-time beautification during a live video broadcast includes five stages: camera 701, skin grinding 702, whitening 703, large eye 704 and thin face 705, and the final processing result is encoded (706) so that the final video effect is displayed on the client. After each of the five stages is completed, an intermediate image is output: the camera 701 outputs an original image 7011; the original image 7011 is input into the skin grinding stage 702, which outputs a skin grinding picture 7021; the skin grinding picture 7021 is input into the whitening stage 703, which outputs a whitening picture 7031; the whitening picture 7031 is input into the large eye stage 704, which outputs a large eye image 7041; and the large eye image 7041 is input into the thin face stage 705, which outputs a thin face picture 7051. Therefore, the image in the final encoding 706 is the image that has undergone the skin grinding, whitening, large eye and thin face processing.
The first operation parameter may be the operation parameter of any one of the stages 701 to 705; the first effect image is the input image of the corresponding stage, and the second effect image is its output image. Taking the whitening stage 703 as an example, the first effect image is the skin grinding picture 7021 and the second effect image is the whitening picture 7031.
504. And the client acquires the second effect image according to the acquisition request.
In this embodiment, when the client receives the acquisition request, the intermediate pictures 7011 to 7051 described above are captured through a screenshot operation and stored in the local picture storage 707 of the client.
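A possible client-side realization of this capture, assuming a simple file-per-link layout for the local picture storage 707, is sketched below; the directory structure and file naming are illustrative only, and EffectImage is the assumed type from the earlier sketch.

```kotlin
import java.io.File

// Assumed realization of the capture into the local picture storage 707: when an
// acquisition request is active, each stage's output is written to one file per link.
class IntermediateCapture(private val storageDir: File) {
    @Volatile
    var acquisitionActive: Boolean = false

    fun maybeCapture(linkName: String, image: EffectImage) {
        if (!acquisitionActive) return                               // normal operation: nothing stored
        storageDir.mkdirs()
        File(storageDir, "$linkName.rgba").writeBytes(image.pixels)  // raw intermediate picture
    }
}
```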
It should be noted that the way of obtaining the second effect image provided in the present application has an additional advantage. As shown in FIG. 8a, since the image output by each intermediate stage is captured directly in the scheme of the present application, the original image of each stage can be obtained. In the prior art, by contrast, the live window is captured directly after all rendering is completed, as shown in FIG. 8b, and several UI controls 801b appear in the capture, which occlude the image and affect the subsequent evaluation of the image processing effect.
505. The client sends the second effect image to the server.
In this embodiment, the client may send the second effect image to the server in the synchronous manner described in step 404 or in the asynchronous manner described in step 405, which is not limited in the embodiments of the present application.
506. The server obtains a second display parameter of the second effect image.
In this embodiment, the specific implementation of the server obtaining the second display parameter may refer to step 304, which is not described herein.
As shown in FIG. 8a, taking the operation corresponding to the whitening 703 as an example, the first effect image (the skin grinding picture 7021) is the image shown as 801 in FIG. 8a, and the second effect image (the whitening picture 7031) is the image shown as 802 in FIG. 8a. Compared with 801, 802 has undergone the whitening operation, and the brightness value of the skin of the photographed person is higher, thereby achieving the whitening effect. At this point, the server obtains the display parameter of 802 in the manner described in step 304.
507. The server obtains a target difference value of the second display parameter deviating from the target display parameter.
In this embodiment, the specific implementation manner of this step may refer to step 305, which is not described herein.
As shown in FIG. 8a, the server obtains the difference value of 801 deviating from the target effect, i.e. the target difference value, in the manner described in step 305.
508. And the server acquires the adjusting instruction according to the target difference value.
In this embodiment, the server may obtain the adjustment instruction in the manner described in step 306; alternatively, after learning the target difference information sent by the server, the administrator client may set the operation parameter input through the adjustment column 602 of the interface shown in FIG. 6 as the second operation parameter in the adjustment instruction.
509. The server sends an adjustment instruction to the client.
In this embodiment, the step 104 is referred to above, and will not be described herein.
510. And the client adjusts the first operation parameter according to the adjustment instruction to obtain a second operation parameter.
In this embodiment, the specific implementation manner of step 510 may be as described in steps 308 to 309, which will not be repeated here.
Taking the operation corresponding to the whitening 703 in FIG. 7 as an example, the whitening 703 originally processes the first effect image (the skin grinding picture 7021) with the first operation parameter; according to the adjustment instruction, the client now processes the first effect image (the skin grinding picture 7021) with the second operation parameter instead.
511. And the client processes the first effect image through the second operation parameter to obtain a third effect image.
In this embodiment, taking the operation corresponding to the whitening 703 in FIG. 7 as an example, the whitening 703 processes the first effect image (the skin grinding picture 7021) with the adjusted second operation parameter to obtain a third effect image, whose whitening effect is better than that of the second effect image because of the adjustment.
512. The client transmits the third effect image to the content distribution network.
In this embodiment, the content delivery network (Content Delivery Network, CDN) is a network that can be read by both the server and the administrator client, so that the administrator client and the server can obtain the third effect image through the CDN, so as to understand the final adjustment effect of the method.
In this embodiment, only the whitening step 703 of the beautification operation in FIG. 7 is taken as an example to explain the method for adjusting the image processing effect provided in the embodiments of the present application. The image shown as 803 in FIG. 8a is a thin face image, i.e. an image obtained by further processing based on the whitening image, and its image processing effect is adjusted in exactly the same way as in the whitening step. It can be understood that steps 701 to 705 are all handled in the same manner, so that the processing result of each intermediate step in the image processing can be obtained and each link in the image processing process can be controlled accurately. Meanwhile, when an administrator client is provided, the administrator can remotely control the rendering of the beautification effect in each client through the server. For example, when a network anchor finds that the beautification effect is not ideal, the anchor can contact the administrator, and the administrator can then remotely adjust the beautification effect of the client where the anchor is located through the administrator client.
It should be noted that, in the image processing stages described above, the step of performing image processing on the image may be any image processing step in the prior art, which is not described in detail in the embodiments of the present application. For ease of understanding, the embodiments of the present application provide an example implementation below, which does not constitute a limitation on the image processing scheme of the present application.
An optional mode of image processing in the embodiments of the present application is described in detail below with reference to the accompanying drawings.
Referring to fig. 9, as shown in fig. 9, the image processing in the method for adjusting the image processing effect provided in the embodiment of the present application includes the following steps.
901. And acquiring an image to be processed.
In this embodiment, the image to be processed may be an original image obtained by the camera and not subjected to any rendering operation.
902. And obtaining vertex data and texture data of the image to be processed through a PBO protocol.
In this embodiment, the PBO protocol refers to the Pixel Buffer Object, i.e. the pixel buffer provided in OpenGL, which can be used for efficient transfer of pixel data on the GPU. The vertex data obtained through the PBO protocol is used to reflect the contour information of the person in the image to be processed, and the texture data is used to reflect texture information such as the skin color of the person in the image to be processed.
903. Image processing is performed on the vertex data and the texture data.
In this embodiment, the vertex data may be processed by a vertex shader and then rasterized together with the texture data; optionally, the vertex shader may be an OpenGL vertex shader, after which a fragment shader is executed to perform the subsequent coloring processing.
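The embodiments of the present application do not specify shader source; purely as an assumed example, the fragment shader below (embedded as a Kotlin string constant) shows how a whitening operation parameter could enter the coloring step as a uniform. The shader source and the uniform names are assumptions, not the patented implementation.

```kotlin
// Assumed example only: a minimal GLSL ES fragment shader in which the whitening
// operation parameter enters the coloring step as a uniform.
const val WHITEN_FRAGMENT_SHADER = """
    precision mediump float;
    varying vec2 vTexCoord;
    uniform sampler2D uTexture;     // texture data of the image being processed
    uniform float uWhitenStrength;  // operation parameter of the whitening link
    void main() {
        vec4 color = texture2D(uTexture, vTexCoord);
        // blend the sampled color towards white in proportion to the operation parameter
        gl_FragColor = vec4(mix(color.rgb, vec3(1.0), uWhitenStrength), color.a);
    }
"""
```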
904. And exporting the processed image.
In this embodiment, the processed image is first stored in the frame buffer (Frame Buffer) and then exported as the processed image through the PBO protocol.
The execution logic of steps 901 to 904 described above may be seen in FIG. 10, where DMA (Direct Memory Access) is used to transfer pixel data on the graphics card quickly without consuming CPU clock cycles.
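As one possible realization of the export step, assuming an Android client with a current OpenGL ES 3.0 context (API level 24 or higher, where the PBO offset overload of glReadPixels is available), the frame rendered into the currently bound framebuffer can be read back through a pixel buffer object as sketched below; this is only an assumed sketch of steps 903 and 904, not the only way to implement them.

```kotlin
import android.opengl.GLES30
import java.nio.ByteBuffer

// Assumed realization of steps 903 to 904 on an Android client with a current
// OpenGL ES 3.0 context; the processed frame is expected to be in the currently
// bound framebuffer before readFrame() is called.
class PboReader(private val width: Int, private val height: Int) {
    private val pbo = IntArray(1)

    fun init() {
        GLES30.glGenBuffers(1, pbo, 0)
        GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, pbo[0])
        // Reserve space for one RGBA frame; GL_STREAM_READ hints a GPU-to-CPU readback.
        GLES30.glBufferData(GLES30.GL_PIXEL_PACK_BUFFER, width * height * 4, null, GLES30.GL_STREAM_READ)
        GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, 0)
    }

    // With a PBO bound to GL_PIXEL_PACK_BUFFER, glReadPixels starts a transfer the driver
    // can perform via DMA; mapping the buffer afterwards exposes the exported pixels.
    fun readFrame(): ByteArray {
        GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, pbo[0])
        GLES30.glReadPixels(0, 0, width, height, GLES30.GL_RGBA, GLES30.GL_UNSIGNED_BYTE, 0)
        val mapped = GLES30.glMapBufferRange(
            GLES30.GL_PIXEL_PACK_BUFFER, 0, width * height * 4, GLES30.GL_MAP_READ_BIT
        ) as ByteBuffer
        val pixels = ByteArray(width * height * 4)
        mapped.get(pixels)
        GLES30.glUnmapBuffer(GLES30.GL_PIXEL_PACK_BUFFER)
        GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, 0)
        return pixels
    }
}
```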
In the image processing provided in this embodiment, the vertex data and the texture data of an image are acquired respectively, coloring processing is performed, and the processed image is output after the fragment shader has been executed, thereby realizing the processing of the image and achieving effects such as beautification.
The method for adjusting the image processing effect provided in the embodiment includes: the client processes the first effect image through a first operation parameter in image processing operation to obtain a second effect image, wherein the first operation parameter is an operation parameter of one of at least two processing links in the image processing process; the client sends the second effect image to the server; the server generates an adjusting instruction according to the second effect image; the client receives an adjusting instruction sent by the server, and adjusts the first operation parameter according to the adjusting instruction to obtain a second operation parameter; and the client processes the first effect image through the second operation parameter to obtain a third effect image. The effect image of the middle link in the image processing process is acquired, so that the influence of the operation parameters of each link on the image processing operation can be accurately known, the accuracy of the adjustment of the image processing effect is improved, and the efficiency of the adjustment of the image processing effect is improved.
The above description has been made on the solution provided in the embodiments of the present application. It will be appreciated that the computer device, in order to carry out the functions described above, comprises corresponding hardware structures and/or software modules that perform the respective functions. Those of skill in the art will readily appreciate that the various illustrative modules and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The method may be implemented by one entity device, or may be implemented by a plurality of entity devices together, or may be a logic functional module in one entity device, which is not specifically limited in the embodiment of the present application.
For example, the methods described above may all be implemented by the computer device in fig. 11. Fig. 11 is a schematic hardware structure of a computer device according to an embodiment of the present application. The computer device includes at least one processor 1101, a communication line 1102, a memory 1103, and at least one communication interface 1104.
The processor 1101 may be a general-purpose central processing unit (central processing unit, CPU), a microprocessor, an application specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of programs in accordance with aspects of the present application.
Communication line 1102 may include a pathway to transfer information between the aforementioned components.
Communication interface 1104 uses any transceiver-like device for communicating with other devices or communication networks, such as ethernet, radio access network (radio access network, RAN), wireless local area network (wireless local area networks, WLAN), etc.
The memory 1103 may be, but is not limited to, a read-only memory (read-only memory, ROM) or other type of static storage device that can store static information and instructions, a random access memory (random access memory, RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (compact disc read-only memory, CD-ROM) or other optical disk storage, an optical disc storage (including a compact disc, a laser disc, an optical disc, a digital versatile disc, a Blu-ray disc, etc.), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be separate and coupled to the processor via the communication line 1102. The memory may also be integrated with the processor.
The memory 1103 is used for storing computer-executable instructions for executing the embodiments of the present application, and the processor 1101 controls the execution. The processor 1101 is configured to execute computer-executable instructions stored in the memory 1103, thereby implementing the method provided in the above-described embodiment of the present application.
Alternatively, the computer-executable instructions in the embodiments of the present application may be referred to as application program codes, which are not specifically limited in the embodiments of the present application.
In a particular implementation, the processor 1101 may include one or more CPUs, such as CPU0 and CPU1 of FIG. 11, as an embodiment.
In a particular implementation, as one embodiment, a computer device may include multiple processors, such as processor 1101 and processor 1107 in FIG. 11. Each of these processors may be a single-core (single-CPU) processor or may be a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
In a specific implementation, the computer device may also include an output device 1105 and an input device 1106, as one embodiment. The output device 1105 communicates with the processor 1101 and may display information in a variety of ways. For example, the output device 1105 may be a liquid crystal display (liquid crystal display, LCD), a light emitting diode (light emitting diode, LED) display device, a Cathode Ray Tube (CRT) display device, or a projector (projector), or the like. The input device 1106 is in communication with the processor 1101 and may receive input from a user in a variety of ways. For example, the input device 1106 may be a mouse, a keyboard, a touch screen device, a sensing device, or the like.
The computer device may be a general purpose device or a special purpose device. In particular implementations, the computer device may be a desktop, laptop, web server, palmtop (personal digital assistant, PDA), mobile handset, tablet, wireless terminal device, embedded device, or device having a similar structure as in fig. 11. Embodiments of the present application are not limited in the type of computer device.
The embodiment of the application may divide the functional units of the storage device according to the above method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated in one processing unit. The integrated units may be implemented in hardware or in software functional units. It should be noted that, in the embodiment of the present application, the division of the units is schematic, which is merely a logic function division, and other division manners may be implemented in actual practice.
For example, in the case where the respective functional units are divided in an integrated manner, fig. 12 shows a schematic diagram of an adjusting apparatus of an image processing effect.
As shown in fig. 12, the apparatus for adjusting an image processing effect provided in the embodiment of the present application includes:
a first processing unit 1201, where the first processing unit 1201 is configured to process a first effect image through a first operation parameter in an image processing operation to obtain a second effect image, where the first operation parameter is an operation parameter of one of at least two processing links in an image processing process;
a transmitting unit 1202, where the transmitting unit 1202 is configured to transmit the second effect image obtained by processing by the first processing unit 1201 to a server;
a receiving unit 1203, where the receiving unit 1203 is configured to receive an adjustment instruction sent by the server, where the adjustment instruction is an instruction generated by the server according to the second effect image sent by the sending unit 1202, and the adjustment instruction includes a second operation parameter of the image processing operation;
an adjusting unit 1204, where the adjusting unit 1204 is configured to adjust the first operating parameter according to the adjusting instruction received by the receiving unit 1203, to obtain a second operating parameter;
a second processing unit 1205, where the second processing unit 1205 is configured to process the first effect image through the second operation parameter adjusted by the adjusting unit 1204 to obtain a third effect image.
Optionally, the adjusting unit 1204 is further configured to:
acquiring a first parameter value of a first operating parameter;
adjusting the first parameter value according to the adjusting instruction to obtain a second parameter value;
and acquiring a second operation parameter according to the second parameter value, wherein the second parameter value is the parameter value of the second operation parameter.
Optionally, the apparatus further comprises an acquisition unit 1206, the acquisition unit 1206 being configured to:
acquiring an acquisition request sent by the server;
the first processing unit 1201 is further configured to:
and acquiring the second effect image according to the acquisition request.
Optionally, the sending unit 1202 is further configured to:
after the second effect image is acquired, the second effect image is sent to the server in real time;
or,
after all the processing links in the image processing process have been executed, packaging and sending the second effect image and the images output by other processing links to the server.
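The two reporting modes of the sending unit 1202 can be organized as in the following sketch, where EffectImage and sendToServer are placeholders introduced only for illustration and are not part of this embodiment.

```cpp
#include <iostream>
#include <vector>

struct EffectImage { int link = 0; /* pixel data omitted in this sketch */ };

// Placeholder transport; the real sending unit would serialize and upload the images.
void sendToServer(const std::vector<EffectImage>& images) {
    std::cout << "sending " << images.size() << " effect image(s) to the server\n";
}

// Mode 1: send each intermediate effect image in real time, as soon as it is acquired.
void reportInRealTime(const EffectImage& img) { sendToServer({img}); }

// Mode 2: buffer the output of every processing link and send the images packaged
// after all processing links in the image processing process have been executed.
struct PackagedReporter {
    std::vector<EffectImage> outputs;
    void onLinkFinished(const EffectImage& img) { outputs.push_back(img); }
    void onAllLinksFinished() { sendToServer(outputs); outputs.clear(); }
};
```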
As shown in fig. 13, an apparatus for adjusting an image processing effect provided in an embodiment of the present application includes:
an obtaining unit 1301, where the obtaining unit 1301 is configured to obtain a second effect image sent by a client, where the second effect image is an image obtained by the client after processing a first effect image with a first operation parameter, where the first operation parameter is an operation parameter of one of at least two processing links in an image processing process of the client;
A generating unit 1302, where the generating unit 1302 is configured to generate an adjustment instruction according to the second effect image acquired by the acquiring unit 1301, where the adjustment instruction includes a second operation parameter of the image processing operation;
and a transmitting unit 1303, where the transmitting unit 1303 is configured to transmit the adjustment instruction generated by the generating unit 1302 to the client, so that the client processes the first effect image according to the second operation parameter to obtain a third effect image.
Optionally, the generating unit 1302 is further configured to:
acquiring a second display parameter of the second effect image, wherein the second display parameter is used for representing the display effect of the second effect image;
acquiring a target difference value of the second display parameter deviating from a target display parameter, wherein the target display parameter is a preset parameter;
and acquiring the adjusting instruction according to the target difference value.
Optionally, the sending unit 1303 is further configured to:
and sending an acquisition request to the client, wherein the acquisition request is used for requesting the client to send the image output by the intermediate operation in the image processing process.
Optionally, the method further comprises a determining unit 1304, the determining unit 1304 being configured to:
the server determines a target client from at least two clients, wherein the target client is one of the at least two clients;
The transmitting unit 1303 is further configured to:
and sending the acquisition request to the target client.
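A brief sketch of how the determining unit 1304 and the sending unit 1303 might cooperate is given below; the client identifier type, the trivial "first client" selection policy, and sendAcquisitionRequest are assumptions made only for illustration.

```cpp
#include <iostream>
#include <string>
#include <vector>

// Placeholder network call; the real sending unit 1303 would use the actual transport.
void sendAcquisitionRequest(const std::string& clientId) {
    std::cout << "acquisition request sent to client " << clientId << "\n";
}

struct AdjustingServer {
    std::vector<std::string> clients;   // at least two connected clients

    // Determining unit 1304: choose one target client from the connected clients
    // (the selection policy is not limited; the first one is used here for brevity).
    std::string determineTargetClient() const { return clients.front(); }

    // Sending unit 1303: ask the target client for the image output by an
    // intermediate operation in its image processing process.
    void requestIntermediateImage() const { sendAcquisitionRequest(determineTargetClient()); }
};
```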
Further, embodiments of the present invention also provide a computer storage medium including instructions that, when executed on a computer device, cause the computer device to perform the above-described method.
The detailed description of the program stored in the computer storage medium according to the embodiments of the present application may refer to the above embodiments, and will not be repeated herein.
In the present specification, each embodiment is described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and identical or similar parts between the embodiments may be referred to one another. Since the device disclosed in the embodiments corresponds to the method disclosed in the embodiments, its description is relatively brief, and for relevant details reference may be made to the description of the method.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative elements and steps are described above generally in terms of functionality in order to clearly illustrate the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. The software modules may reside in random access memory (RAM), memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (13)

1. A method of adjusting an image processing effect, comprising:
processing the first effect image through a first operation parameter in image processing operation to obtain a second effect image, wherein the first operation parameter is an operation parameter of one of at least two processing links in the image processing process;
Transmitting the second effect image to a server;
receiving an adjustment instruction sent by the server, wherein the adjustment instruction is an instruction generated by the server according to the second effect image, and the adjustment instruction comprises a second operation parameter of the image processing operation; the adjustment instruction is generated by the server according to a target difference value, wherein the target difference value is a difference value, obtained by the server, by which a second display parameter deviates from a target display parameter, and the target display parameter is a preset parameter; the second display parameter is acquired by the server according to the second effect image, and the second display parameter is used for representing the display effect of the second effect image;
adjusting the first operation parameter according to the adjustment instruction to obtain the second operation parameter;
and processing the first effect image through the second operation parameter to obtain a third effect image.
2. The method of claim 1, wherein adjusting the first operating parameter according to the adjustment command results in the second operating parameter, comprising:
acquiring a first parameter value of the first operating parameter;
Adjusting the first parameter value according to the adjusting instruction to obtain a second parameter value;
and acquiring the second operation parameter according to the second parameter value, wherein the second parameter value is the parameter value of the second operation parameter.
3. The method of claim 1, wherein prior to processing the first effect image by the first operating parameter in the image processing operation to obtain the second effect image, further comprising:
acquiring an acquisition request sent by the server;
the processing the first effect image by the first operation parameter in the image processing operation to obtain a second effect image includes:
and acquiring the second effect image according to the acquisition request.
4. The method of claim 1, wherein the sending the second effect image to a server comprises:
after the second effect image is acquired, the second effect image is sent to the server in real time;
or,
after all processing links in the image processing process have been executed, packaging and transmitting the second effect image and the images output by other processing links to the server.
5. A method of adjusting an image processing effect, comprising:
Acquiring a second effect image sent by a client, wherein the second effect image is an image obtained after the client processes a first effect image through a first operation parameter, and the first operation parameter is an operation parameter of one of at least two processing links in the image processing process of the client;
generating an adjustment instruction according to the second effect image, wherein the adjustment instruction comprises a second operation parameter of the image processing operation;
sending the adjustment instruction to the client so that the client processes the first effect image according to the second operation parameter to obtain a third effect image;
the generating an adjustment instruction according to the second effect image includes: acquiring a second display parameter of the second effect image, wherein the second display parameter is used for representing the display effect of the second effect image; obtaining a target difference value of the second display parameter deviating from a target display parameter, wherein the target display parameter is a preset parameter; and acquiring the adjusting instruction according to the target difference value.
6. The method of claim 5, wherein prior to the capturing the second effect image sent by the client, further comprising:
And sending an acquisition request to the client, wherein the acquisition request is used for requesting the client to send an image output by an intermediate operation in the image processing process.
7. The method of claim 6, wherein prior to the capturing the second effect image sent by the client, further comprising:
determining, by the server, a target client from at least two clients, wherein the target client is one of the at least two clients;
the sending the acquisition request to the client includes:
and sending the acquisition request to the target client.
8. An image processing effect adjusting apparatus, comprising:
the first processing unit is used for processing the first effect image through a first operation parameter in image processing operation to obtain a second effect image, wherein the first operation parameter is an operation parameter of one of at least two processing links in the image processing process;
the sending unit is used for sending the second effect image obtained by processing of the first processing unit to a server;
the receiving unit is used for receiving an adjusting instruction sent by the server, wherein the adjusting instruction is an instruction generated by the server according to the second effect image sent by the sending unit, and the adjusting instruction comprises a second operation parameter of the image processing operation; the adjusting instruction is generated by the server according to a target difference value, wherein the target difference value is a difference value, obtained by the server, by which a second display parameter deviates from a target display parameter, and the target display parameter is a preset parameter; the second display parameter is acquired by the server according to the second effect image, and the second display parameter is used for representing the display effect of the second effect image;
The adjusting unit is used for adjusting the first operation parameter according to the adjusting instruction received by the receiving unit to obtain the second operation parameter;
and the second processing unit is used for processing the first effect image through the second operation parameter regulated by the regulating unit so as to obtain a third effect image.
9. The apparatus of claim 8, wherein the adjustment unit is further configured to:
acquiring a first parameter value of the first operating parameter;
adjusting the first parameter value according to the adjusting instruction to obtain a second parameter value;
and acquiring the second operation parameter according to the second parameter value, wherein the second parameter value is the parameter value of the second operation parameter.
10. An image processing effect adjusting apparatus, comprising:
the system comprises an acquisition unit, a processing unit and a processing unit, wherein the acquisition unit is used for acquiring a second effect image sent by a client, the second effect image is an image obtained by the client after processing the first effect image through a first operation parameter, and the first operation parameter is an operation parameter of one of at least two processing links in the image processing process of the client;
A generation unit configured to generate an adjustment instruction according to the second effect image acquired by the acquisition unit, where the adjustment instruction includes a second operation parameter of the image processing operation;
a sending unit, configured to send the adjustment instruction generated by the generating unit to the client, so that the client processes the first effect image according to the second operation parameter to obtain a third effect image;
the generating unit is further configured to: acquiring a second display parameter of the second effect image, wherein the second display parameter is used for representing the display effect of the second effect image; obtaining a target difference value of the second display parameter deviating from a target display parameter, wherein the target display parameter is a preset parameter; and acquiring the adjusting instruction according to the target difference value.
11. The apparatus of claim 10, wherein the transmitting unit is further configured to:
and sending an acquisition request to the client, wherein the acquisition request is used for requesting the client to send an image output by an intermediate operation in the image processing process.
12. A computer device, the computer device comprising: an interaction device, an input/output (I/O) interface, a processor, and a memory, the memory having program instructions stored therein;
The interaction device is used for acquiring an operation instruction input by a user;
the processor is configured to execute program instructions stored in the memory and to perform the method of any one of claims 1-4 or 5-7.
13. A computer readable storage medium comprising instructions which, when run on a computer device, cause the computer device to perform the method of any of claims 1-4 or 5-7.
CN202010108542.6A 2020-02-21 2020-02-21 Image processing effect adjusting method, device, equipment and medium Active CN111343472B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010108542.6A CN111343472B (en) 2020-02-21 2020-02-21 Image processing effect adjusting method, device, equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010108542.6A CN111343472B (en) 2020-02-21 2020-02-21 Image processing effect adjusting method, device, equipment and medium

Publications (2)

Publication Number Publication Date
CN111343472A CN111343472A (en) 2020-06-26
CN111343472B true CN111343472B (en) 2023-05-26

Family

ID=71187461

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010108542.6A Active CN111343472B (en) 2020-02-21 2020-02-21 Image processing effect adjusting method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN111343472B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113918442A (en) * 2020-07-10 2022-01-11 北京字节跳动网络技术有限公司 Image special effect parameter processing method, equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106230841A (en) * 2016-08-04 2016-12-14 深圳响巢看看信息技术有限公司 A kind of video U.S. face and the method for plug-flow in real time in network direct broadcasting based on terminal
CN106657793A (en) * 2017-01-11 2017-05-10 维沃移动通信有限公司 Image processing method and mobile terminal
CN107798652A (en) * 2017-10-31 2018-03-13 广东欧珀移动通信有限公司 Image processing method, device, readable storage medium storing program for executing and electronic equipment
CN109087239A (en) * 2018-07-25 2018-12-25 腾讯科技(深圳)有限公司 A kind of face image processing process, device and storage medium
WO2019000777A1 (en) * 2017-06-27 2019-01-03 五邑大学 Internet-based face beautification system
CN110177287A (en) * 2019-06-11 2019-08-27 广州虎牙科技有限公司 A kind of image procossing and live broadcasting method, device, equipment and storage medium
CN110312169A (en) * 2019-07-30 2019-10-08 腾讯科技(深圳)有限公司 Video data handling procedure, device, terminal and server

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110870293B (en) * 2018-07-02 2021-05-14 深圳市大疆创新科技有限公司 Video shooting processing method and device and video shooting processing system

Also Published As

Publication number Publication date
CN111343472A (en) 2020-06-26

Similar Documents

Publication Publication Date Title
US20180276899A1 (en) Method, apparatus, and system for generating an ar application and rendering an ar instance
US11538211B2 (en) Puppeteering remote avatar by facial expressions
WO2021244172A1 (en) Image processing method and image synthesis method, image processing apparatus and image synthesis apparatus, and storage medium
CN111739141B (en) 3D cloud rendering method for light terminal
CN103650001A (en) Moving image distribution server, moving image playback device, control method, program, and recording medium
CN108389241A (en) The methods, devices and systems of textures are generated in scene of game
CN111246272A (en) Method and device for displaying video cover picture
CN110177287A (en) A kind of image procossing and live broadcasting method, device, equipment and storage medium
CN114840349B (en) Distributed task scheduling method of AI intelligent camera and AI intelligent camera system
CN111340865B (en) Method and apparatus for generating image
CN111343472B (en) Image processing effect adjusting method, device, equipment and medium
CN113206993A (en) Method for adjusting display screen and display device
CN112470464B (en) In-field subcode timing in a field sequential display
CN112419456B (en) Special effect picture generation method and device
JP6637855B2 (en) Data processing device, data processing method, and computer program
CN113938597B (en) Face recognition method, device, computer equipment and storage medium
CN111857336B (en) Head-mounted device, rendering method thereof, and storage medium
US20200066234A1 (en) VR Drawing Method, Device, and System
CN111314627B (en) Method and apparatus for processing video frames
CN112488977B (en) Image processing method and device, electronic equipment and storage medium
CN113434551B (en) Data processing method, device, equipment and computer storage medium
CN106061054B (en) A kind of information processing method and electronic equipment
CN113240577B (en) Image generation method and device, electronic equipment and storage medium
KR20220169202A (en) Client terminal and method to communicate with game server
Hou et al. A Digitized You in My Eye: A Perceptually Driven Spatial Communication Prototype for XR

Legal Events

Date Code Title Description
PB01 Publication
REG Reference to a national code (Ref country code: HK; Ref legal event code: DE; Ref document number: 40024757; Country of ref document: HK)
SE01 Entry into force of request for substantive examination
GR01 Patent grant