CN111861965B - Image backlight detection method, image backlight detection device and terminal equipment - Google Patents

Image backlight detection method, image backlight detection device and terminal equipment

Info

Publication number
CN111861965B
CN111861965B CN201910276705.9A
Authority
CN
China
Prior art keywords
image
target object
area
backlight
bounding box
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910276705.9A
Other languages
Chinese (zh)
Other versions
CN111861965A (en)
Inventor
凌健
边思雨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
TCL Technology Group Co Ltd
Original Assignee
TCL Technology Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by TCL Technology Group Co Ltd filed Critical TCL Technology Group Co Ltd
Priority to CN201910276705.9A
Publication of CN111861965A
Application granted
Publication of CN111861965B
Legal status: Active

Links

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/0002: Inspection of images, e.g. flaw detection
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70: Circuitry for compensating brightness variation in the scene
    • H04N 23/71: Circuitry for evaluating the brightness variation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10024: Color image

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The application is applicable to the technical field of image processing and provides an image backlight detection method, an image backlight detection device, a terminal device and a computer-readable storage medium, comprising the following steps: obtaining the area where a target object is located in an image to be detected and the average brightness value of that area, wherein a bounding box is used to mark the extent of the area where the target object is located; enlarging the bounding box and obtaining the average brightness value of the area within the enlarged bounding box; calculating the ratio of the average brightness value of the area where the target object is located to the average brightness value of the area within the enlarged bounding box; and judging whether the ratio is smaller than a preset brightness threshold, and if the ratio is smaller than the brightness threshold, marking that the target object in the image to be detected is backlit. The application can rapidly and effectively detect whether an image is backlit.

Description

Image backlight detection method, image backlight detection device and terminal equipment
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image backlight detection method, an image backlight detection device, a terminal device, and a computer readable storage medium.
Background
When shooting with a mobile phone, a camera or another device in daily life, if the background is too bright and the target object too dark, the quality of the captured image is poor, so quickly and effectively detecting whether an image is backlit is important. In the prior art, when detecting whether an image is backlit, the bright areas and dark areas of the whole image are usually extracted and the judgment is made from them; however, when the image content is complex, the detection result is often inaccurate and the computation takes a long time.
Disclosure of Invention
In view of the above, embodiments of the present application provide an image backlight detection method, an image backlight detection apparatus, a terminal device, and a computer readable storage medium, so as to quickly and effectively detect whether an image is backlit.
A first aspect of an embodiment of the present application provides an image backlight detection method, including:
obtaining an area where a target object is located in an image to be detected and an average brightness value of the area where the target object is located, wherein a bounding box is used to mark the extent of the area where the target object is located;
enlarging the bounding box and acquiring an average brightness value of the area within the enlarged bounding box;
calculating the ratio of the average brightness value of the area where the target object is located to the average brightness value of the area within the enlarged bounding box;
and judging whether the ratio is smaller than a preset brightness threshold, and if the ratio is smaller than the brightness threshold, marking that the target object in the image to be detected is backlit.
A second aspect of an embodiment of the present application provides an image backlight detection apparatus including:
a first brightness acquisition module, configured to obtain an area where a target object is located in an image to be detected and an average brightness value of the area where the target object is located, wherein a bounding box is used to mark the extent of the area where the target object is located;
a second brightness acquisition module, configured to enlarge the bounding box and acquire an average brightness value of the area within the enlarged bounding box;
a ratio calculating module, configured to calculate the ratio of the average brightness value of the area where the target object is located to the average brightness value of the area within the enlarged bounding box;
and a backlight determining module, configured to judge whether the ratio is smaller than a preset brightness threshold, and if the ratio is smaller than the brightness threshold, mark that the target object in the image to be detected is backlit.
A third aspect of an embodiment of the present application provides a terminal device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the image backlight detection method according to the first aspect when executing the computer program.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the image backlight detection method according to the first aspect described above.
A fifth aspect of the application provides a computer program product comprising a computer program which, when executed by one or more processors, implements the steps of the image backlight detection method according to the first aspect described above.
From the above, the scheme of the application obtains the area where the target object is located in the image to be detected, marked by a bounding box, and the average brightness value of that area; enlarges the bounding box and obtains the average brightness value of the area within the enlarged bounding box; calculates the ratio of the two average brightness values; judges whether the ratio is smaller than a preset brightness threshold; and, when the ratio is smaller than the brightness threshold, marks that the target object in the image to be detected is backlit. Because the backlight degree is computed only for the detected target object rather than for bright and dark areas over the whole image, the calculation time and cost are reduced, so that whether the image is backlit can be detected rapidly and effectively.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic diagram of an implementation flow of an image backlight detection method according to an embodiment of the present application;
FIG. 2 is an exemplary diagram of marking a target object in an image to be detected using a bounding box;
Fig. 3 is a schematic implementation flow chart of an image backlight detection method according to a second embodiment of the present application;
fig. 4 is a schematic diagram of an image backlight detection device according to a third embodiment of the present application;
Fig. 5 is a schematic diagram of a terminal device according to a fourth embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
In particular implementations, the terminal devices described in the embodiments of the present application include, but are not limited to, portable devices such as mobile phones, laptop computers, or tablet computers having a touch-sensitive surface (e.g., a touch-screen display and/or a touch pad). It should also be appreciated that in some embodiments, the device is not a portable communication device but a desktop computer having a touch-sensitive surface (e.g., a touch-screen display and/or a touch pad).
In the following discussion, a terminal device including a display and a touch-sensitive surface is described. However, it should be understood that the terminal device may include one or more other physical user interface devices such as a physical keyboard, mouse, and/or joystick.
The terminal device supports various applications, such as one or more of the following: drawing applications, presentation applications, word processing applications, website creation applications, disk burning applications, spreadsheet applications, gaming applications, telephony applications, video conferencing applications, email applications, instant messaging applications, workout support applications, photo management applications, digital camera applications, digital video camera applications, web browsing applications, digital music player applications, and/or digital video player applications.
Various applications that may be executed on the terminal device may use at least one common physical user interface device such as a touch sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the terminal may be adjusted and/or changed between applications and/or within the corresponding applications. In this way, the common physical architecture (e.g., touch-sensitive surface) of the terminal may support various applications with user interfaces that are intuitive and transparent to the user.
It should be understood that the sequence numbers of the steps in this embodiment do not imply an execution order; the execution order of the processes should be determined by their functions and internal logic, and should not limit the implementation of the embodiments of the present application in any way.
In order to illustrate the technical scheme of the application, the following description is made by specific examples.
Referring to fig. 1, a schematic implementation flow chart of an image backlight detection method according to an embodiment of the present application is shown, where the image backlight detection method is applied to a terminal device, and as shown in the figure, the image backlight detection method may include the following steps:
Step S101, obtaining the area where a target object is located in an image to be detected and the average brightness value of the area where the target object is located, wherein a bounding box is used to mark the extent of the area where the target object is located.
In the embodiment of the application, the area where the target object is located in the image to be detected can be acquired first, and the extent of that area can be marked with a bounding box. For example, if the probability of detecting the target object as a person is 0.99, as shown in fig. 2, it may be determined that the target object in fig. 2 is a person; box A is the box marking the extent of the area where the person is located, imageA is the image of the area within box A, and imageB is the image of the area within box B. The image to be detected refers to the image for which it is to be determined whether it is in a backlight state. Optionally, the area where the target object is located may refer to the area containing only the target object, or to the area within the bounding box, which is not limited here. Note that the area within the bounding box contains not only the target object but also a small amount of non-target content, as shown by box A in fig. 2.
In the embodiment of the application, when obtaining the average brightness value of the area where the target object is located, the brightness value of each pixel in the area and the total number of pixels can be obtained, the brightness values of all pixels in the area are accumulated, and the accumulated value is divided by the total number of pixels to obtain the average brightness value of the area. Since the image to be detected is usually an RGB three-channel image, the average brightness value of the area where the target object is located may also be calculated according to the formula Y1 = 0.299×R1 + 0.587×G1 + 0.114×B1, where Y1 is the average brightness value of the area where the target object is located; R1 is the average brightness value of the area on the R channel, taken as the average of the brightness values of all pixels of the area on the R channel; G1 is the average brightness value of the area on the G channel, taken as the average of the brightness values of all pixels of the area on the G channel; and B1 is the average brightness value of the area on the B channel, taken as the average of the brightness values of all pixels of the area on the B channel.
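Purely as an illustration of the computation above (the patent text prescribes no code), a minimal NumPy sketch follows; the function name and the H×W×3 array layout are assumptions. Because the formula is linear, weighting the per-channel means gives the same result as averaging per-pixel luminances, so both computations described above coincide.

```python
import numpy as np

def average_luminance(region_rgb: np.ndarray) -> float:
    """Average brightness of an RGB region (H x W x 3, values in 0-255).

    Implements Y = 0.299*R + 0.587*G + 0.114*B applied to the per-channel
    mean brightness, which by linearity equals the mean per-pixel luminance.
    """
    r_mean, g_mean, b_mean = region_rgb.reshape(-1, 3).mean(axis=0)
    return float(0.299 * r_mean + 0.587 * g_mean + 0.114 * b_mean)
```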
Optionally, the bounding box is a marking box.
In the embodiment of the application, the marking box is a bounding box used to mark the extent of the area where the target object is located in the image to be detected. The area where the target object is located, marked with the marking box, can be obtained through a trained deep learning model: the image to be detected is input into the trained deep learning model for a single forward propagation, and an image marked with the marking box is output, as shown in fig. 2. The deep learning model includes, but is not limited to, a convolutional neural network.
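As a hedged sketch only: the patent does not prescribe a specific model or library, and any off-the-shelf detector that returns scored boxes fits this step. The torchvision Faster R-CNN below, the helper name, and the selection of the single highest-scoring box are all assumptions.

```python
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

# Assumption: any trained detection model would do; the patent only
# requires one forward pass that yields a box around the target object.
model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

def detect_target(image_chw: torch.Tensor):
    """One forward propagation; image_chw is a float [3, H, W] tensor in [0, 1].

    Returns the highest-scoring box as (x1, y1, x2, y2), or None if nothing
    is detected.
    """
    with torch.no_grad():
        pred = model([image_chw])[0]
    if pred["scores"].numel() == 0:
        return None
    best = pred["scores"].argmax()
    return tuple(pred["boxes"][best].tolist())
```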
In the embodiment of the application, a plurality of sample images can be acquired first, and a target box enclosing the area where the target object is located is manually marked in each sample image. The sample images are input into a deep learning model, which identifies the target object in each sample image and a bounding box enclosing the area where the target object is located. For each sample image, the mapping relation between the predicted bounding box and the manually marked target box is obtained, and the deep learning model is adjusted according to this mapping relation so that the bounding boxes it outputs come closer to the target boxes, thereby completing the training of the deep learning model.
Step S102, enlarging the bounding box and obtaining the average brightness value of the area within the enlarged bounding box.
In the embodiment of the application, the bounding box can be enlarged by a factor of N. The specific steps can be as follows: first obtain the four-dimensional vector (w1, h1, x, y) of the bounding box, where (x, y) is the coordinate of the center point of the bounding box, w1 is the width of the bounding box and h1 is its height; then enlarge both w1 and h1 by a factor of N, i.e. the four-dimensional vector of the enlarged bounding box is (w1×N, h1×N, x, y), where × denotes multiplication and N is a number greater than 1, such as 1.5 or 2. Box B in fig. 2 is an enlarged bounding box. It should be noted that if the enlarged bounding box exceeds the extent of the image to be detected, the overlapping area of the enlarged bounding box and the image to be detected is taken as the area within the enlarged bounding box.
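A minimal sketch of this enlargement, under the same assumptions as the earlier snippets (the function name and the corner-format return value are illustrative): the box is scaled about its fixed center and, per the note above, clipped to the image so that an out-of-range box falls back to the overlap region.

```python
def enlarge_box(box, n, img_w, img_h):
    """Scale a center-format box (w1, h1, x, y) by n > 1 and clip to the image."""
    w1, h1, x, y = box
    w2, h2 = w1 * n, h1 * n            # enlarge width and height by N times
    left   = max(0.0, x - w2 / 2)      # clip each edge to the image bounds,
    top    = max(0.0, y - h2 / 2)      # keeping the overlap region only
    right  = min(float(img_w), x + w2 / 2)
    bottom = min(float(img_h), y + h2 / 2)
    return left, top, right, bottom    # corner format, ready for cropping
```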
In the embodiment of the application, when obtaining the average brightness value of the area within the enlarged bounding box, the brightness value of each pixel in that area and the total number of pixels can be obtained, the brightness values of all pixels in the area are accumulated, and the accumulated value is divided by the total number of pixels to obtain the average brightness value of the area within the enlarged bounding box. Since the image to be detected is typically an RGB three-channel image, the average brightness value of the area within the enlarged bounding box may also be calculated according to the formula Y2 = 0.299×R2 + 0.587×G2 + 0.114×B2, where Y2 is the average brightness value of the area within the enlarged bounding box; R2 is the average brightness value of that area on the R channel, taken as the average of the brightness values of all its pixels on the R channel; G2 is the average brightness value of that area on the G channel, taken as the average of the brightness values of all its pixels on the G channel; and B2 is the average brightness value of that area on the B channel, taken as the average of the brightness values of all its pixels on the B channel.
Step S103, calculating the ratio of the average brightness value of the area where the target object is located to the average brightness value of the area within the enlarged bounding box.
In the embodiment of the present application, the ratio Y1/Y2 of the average brightness value Y1 of the area where the target object is located to the average brightness value Y2 of the area within the enlarged bounding box may be used as the backlight degree of the image to be detected, and whether the target object in the image to be detected is backlit is determined according to this backlight degree.
Step S104, judging whether the ratio is smaller than a preset brightness threshold, and if the ratio is smaller than the brightness threshold, marking that the target object in the image to be detected is backlit.
In the embodiment of the present application, the ratio calculated in step S103 may be compared with the brightness threshold to determine whether the target object in the image to be detected is backlit: if the ratio is smaller than the brightness threshold, it may be determined that the target object in the image to be detected is backlit; if the ratio is greater than or equal to the brightness threshold, it may be determined that the target object is not backlit. The brightness threshold may be preset according to sample images, or set by the user according to actual requirements, which is not limited here.
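Tying steps S101 to S104 together, a sketch under the same assumptions as the snippets above (it reuses the hypothetical average_luminance and enlarge_box helpers; the default values of n and threshold are placeholders, not values from the patent):

```python
def is_backlit(image_rgb, box, n=2.0, threshold=0.5):
    """Return True if the boxed target is backlit (illustrative sketch).

    image_rgb: H x W x 3 array; box: (w1, h1, x, y) in center format.
    """
    img_h, img_w = image_rgb.shape[:2]
    # S101: average brightness of the area in the (un-enlarged) bounding box.
    w1, h1, x, y = box
    x1, y1 = int(max(0, x - w1 / 2)), int(max(0, y - h1 / 2))
    x2, y2 = int(min(img_w, x + w1 / 2)), int(min(img_h, y + h1 / 2))
    y1_avg = average_luminance(image_rgb[y1:y2, x1:x2])
    # S102: average brightness of the enlarged (clipped) bounding box.
    l, t, r, b = enlarge_box(box, n, img_w, img_h)
    y2_avg = average_luminance(image_rgb[int(t):int(b), int(l):int(r)])
    # S103 + S104: backlight degree compared against the brightness threshold.
    return (y1_avg / max(y2_avg, 1e-6)) < threshold
```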
Optionally, after step S103, an embodiment of the present application further includes:
obtaining the brightness threshold according to a plurality of backlit sample images.
In the embodiment of the application, a plurality of sample images known to be backlit can be selected. For each sample image, the average brightness value of the area where the target object is located is obtained and recorded as the first average brightness value; the area where the target object is located is marked with a bounding box, the bounding box is enlarged, and the average brightness value of the area within the enlarged bounding box is obtained and recorded as the second average brightness value; the ratio of the first average brightness value to the second average brightness value is then calculated. Since each sample image yields one ratio, the plurality of sample images yield a plurality of ratios, and the average of these ratios can be used as the brightness threshold.
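A sketch of this calibration, again reusing the hypothetical helpers from the earlier snippets (the sample container format is an assumption):

```python
def calibrate_threshold(samples, n=2.0):
    """Mean luminance ratio over known backlit samples = brightness threshold.

    samples: iterable of (image_rgb, box) pairs, each known to be backlit;
    box is (w1, h1, x, y) in center format.
    """
    ratios = []
    for image_rgb, box in samples:
        img_h, img_w = image_rgb.shape[:2]
        w1, h1, x, y = box
        x1, y1 = int(max(0, x - w1 / 2)), int(max(0, y - h1 / 2))
        x2, y2 = int(min(img_w, x + w1 / 2)), int(min(img_h, y + h1 / 2))
        first = average_luminance(image_rgb[y1:y2, x1:x2])   # first average
        l, t, r, b = enlarge_box(box, n, img_w, img_h)
        second = average_luminance(image_rgb[int(t):int(b), int(l):int(r)])
        ratios.append(first / max(second, 1e-6))             # one ratio per image
    return sum(ratios) / len(ratios)
```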
According to the embodiment of the application, the target object in the image to be detected is detected, the backlight degree of the image is calculated on the basis of the detected target object, and whether the target object in the image is backlit is judged according to the backlight degree. This avoids detecting bright and dark areas over the whole image to be detected, shortens the calculation time and reduces the calculation cost, so that whether the image is backlit can be detected rapidly and effectively.
Referring to fig. 3, a schematic implementation flow chart of an image backlight detection method according to a second embodiment of the present application is shown, where the image backlight detection method is applied to a terminal device, and as shown in the figure, the image backlight detection method may include the following steps:
Step S301, obtaining the area where a target object is located in an image to be detected and the average brightness value of the area where the target object is located, wherein the image to be detected is a preview picture in a camera, and a bounding box is used to mark the extent of the area where the target object is located.
In the embodiment of the present application, after the terminal device starts the camera function, a preview picture of the photographed scene is typically displayed on the screen of the terminal device, and this preview picture may be used as the image to be detected to determine whether the target object in the preview picture is backlit.
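As a hedged illustration of using preview frames as the image to be detected (an OpenCV capture loop; all names are assumptions, and box_provider could be built on the detect_target sketch above, converted to center format):

```python
import cv2

def monitor_preview(camera_index=0, box_provider=None, threshold=0.5):
    """Check each preview frame for backlighting (illustrative sketch).

    box_provider: callable returning a center-format box (w1, h1, x, y)
    for a frame, or None if no target object is found.
    """
    cap = cv2.VideoCapture(camera_index)
    try:
        while True:
            ok, frame_bgr = cap.read()
            if not ok:
                break
            frame_rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
            box = box_provider(frame_rgb)
            if box is not None and is_backlit(frame_rgb, box, threshold=threshold):
                print("Backlit: adjust shooting angle or apply compensation")
    finally:
        cap.release()
```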
Step S302, enlarging the bounding box and obtaining the average brightness value of the area within the enlarged bounding box.
The step is the same as step S102, and the detailed description of step S102 will be omitted herein.
Step S303, calculating the ratio of the average brightness value of the area where the target object is located to the average brightness value of the area in the enlarged bounding box.
The step is the same as step S103, and specific reference may be made to the description related to step S103, which is not repeated here.
Step S304, judging whether the ratio is smaller than a preset brightness threshold, and if the ratio is smaller than the brightness threshold, marking that the target object in the image to be detected is backlit.
The step is the same as step S104, and the detailed description of step S104 will be omitted herein.
Step S305, prompting the user to adjust the shooting angle of the camera or performing backlight compensation processing.
In the embodiment of the application, when the target object in the preview picture is marked as backlit (that is, the preview picture is in a backlight scene), the user can be prompted to adjust the shooting angle of the camera, so that by adjusting the shooting angle the target object changes from backlit to non-backlit; or backlight compensation processing can be performed to enhance the brightness of the area where the target object is located. The backlight compensation processing includes, but is not limited to, increasing the exposure value of the camera: correspondences between different ratios and different exposure values may be preset, and when it is determined that the target object in the image to be detected is backlit, the exposure value corresponding to the ratio calculated in step S303 is found from the preset correspondences, and the current exposure value of the camera is adjusted to that exposure value.
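A sketch of such a preset correspondence; every value in this table is an assumption chosen for illustration, since the patent leaves the concrete mapping open:

```python
# Assumed ratio-to-exposure mapping: ratio upper bound -> exposure compensation.
EXPOSURE_TABLE = [
    (0.20, 2.0),   # ratio below 0.20: strongly backlit, raise exposure by 2 EV
    (0.35, 1.5),
    (0.50, 1.0),
]

def exposure_for_ratio(ratio: float) -> float:
    """Look up the exposure compensation for a computed backlight ratio."""
    for upper_bound, exposure_ev in EXPOSURE_TABLE:
        if ratio < upper_bound:
            return exposure_ev
    return 0.0  # ratio at or above the threshold: no compensation needed
```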
According to the embodiment of the application, on the basis of the first embodiment, the image to be detected is a preview picture in the camera, and the user is prompted to adjust the shooting angle of the camera or backlight compensation processing is performed, so that the backlight phenomenon can be mitigated during shooting and an image with better picture quality can be captured.
Referring to fig. 4, a schematic diagram of an image backlight detection device according to a third embodiment of the present application is shown, for convenience of explanation, only a portion related to the embodiment of the present application is shown.
The image backlight detection device includes:
a first brightness acquisition module 41, configured to obtain the area where a target object is located in an image to be detected and the average brightness value of the area where the target object is located, wherein a bounding box is used to mark the extent of the area where the target object is located;
a second brightness acquisition module 42, configured to enlarge the bounding box and obtain the average brightness value of the area within the enlarged bounding box;
a ratio calculating module 43, configured to calculate the ratio of the average brightness value of the area where the target object is located to the average brightness value of the area within the enlarged bounding box;
and a backlight determining module 44, configured to judge whether the ratio is smaller than a preset brightness threshold, and if the ratio is smaller than the brightness threshold, mark that the target object in the image to be detected is backlit.
Optionally, the image to be detected is a preview picture in a camera.
Optionally, the image backlight detection device further includes:
and the processing module 45 is used for prompting a user to adjust the shooting angle of the camera or execute backlight compensation processing.
Optionally, the image backlight detection device further includes:
a threshold value acquisition module 46 is configured to acquire the brightness threshold value according to a plurality of sample images in backlight.
Optionally, the bounding box is a marking box.
Optionally, the second brightness acquiring module 42 includes:
an acquisition unit, configured to acquire the four-dimensional vector (w1, h1, x, y) of the bounding box, wherein (x, y) represents the coordinates of the center point of the bounding box, w1 represents the width of the bounding box, and h1 represents the height of the bounding box;
and an enlargement unit, configured to enlarge the width w1 and the height h1 of the bounding box by a factor of N, wherein the four-dimensional vector of the enlarged bounding box is (w1×N, h1×N, x, y), and N is a number greater than 1.
Optionally, the second brightness obtaining module 42 further includes:
and a determining unit, configured to determine, if the enlarged bounding box exceeds the extent of the image to be detected, that the overlapping area of the enlarged bounding box and the image to be detected is the area within the enlarged bounding box.
The device provided in this embodiment of the present application may be applied to the foregoing first and second method embodiments; for details, refer to the descriptions of the first and second method embodiments, which are not repeated here.
Fig. 5 is a schematic diagram of a terminal device according to a fourth embodiment of the present application. As shown in fig. 5, the terminal device 5 of this embodiment includes: a processor 50, a memory 51 and a computer program 52 stored in the memory 51 and executable on the processor 50. The processor 50, when executing the computer program 52, implements the steps in the image backlight detection method embodiments described above, such as steps S101 to S104 shown in fig. 1. Alternatively, the processor 50, when executing the computer program 52, implements the functions of the modules/units in the apparatus embodiments described above, such as the functions of the modules 41 to 46 shown in fig. 4.
By way of example, the computer program 52 may be partitioned into one or more modules/units that are stored in the memory 51 and executed by the processor 50 to complete the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions for describing the execution of the computer program 52 in the terminal device 5. For example, the computer program 52 may be divided into a first brightness acquisition module, a second brightness acquisition module, a ratio calculation module, a backlight determination module, a processing module, and a threshold acquisition module, each of which specifically functions as follows:
the first brightness acquisition module is configured to obtain the area where a target object is located in an image to be detected and the average brightness value of the area where the target object is located, wherein a bounding box is used to mark the extent of the area where the target object is located;
the second brightness acquisition module is configured to enlarge the bounding box and obtain the average brightness value of the area within the enlarged bounding box;
the ratio calculating module is configured to calculate the ratio of the average brightness value of the area where the target object is located to the average brightness value of the area within the enlarged bounding box;
and the backlight determining module is configured to determine that the target object in the image to be detected is backlit if the ratio is smaller than the brightness threshold.
Optionally, the image to be detected is a preview screen in the camera.
Optionally, the processing module is configured to prompt a user to adjust a shooting angle of the camera or perform backlight compensation processing.
Optionally, the threshold acquisition module is configured to obtain the brightness threshold according to a plurality of backlit sample images.
Optionally, the bounding box is a marking box.
Optionally, the second brightness acquisition module includes:
an acquisition unit, configured to acquire the four-dimensional vector (w1, h1, x, y) of the bounding box, wherein (x, y) represents the coordinates of the center point of the bounding box, w1 represents the width of the bounding box, and h1 represents the height of the bounding box;
and an enlargement unit, configured to enlarge the width w1 and the height h1 of the bounding box by a factor of N, wherein the four-dimensional vector of the enlarged bounding box is (w1×N, h1×N, x, y), and N is a number greater than 1.
Optionally, the second brightness acquisition module further includes:
and a determining unit, configured to determine, if the enlarged bounding box exceeds the extent of the image to be detected, that the overlapping area of the enlarged bounding box and the image to be detected is the area within the enlarged bounding box.
The terminal device 5 may be a mobile phone, a notebook computer, a palmtop computer, a camera, etc. The terminal device may include, but is not limited to, the processor 50 and the memory 51. It will be appreciated by those skilled in the art that fig. 5 is merely an example of the terminal device 5 and does not constitute a limitation of the terminal device 5, which may include more or fewer components than illustrated, combine certain components, or have different components; for example, the terminal device may further include input-output devices, network access devices, buses, etc.
The processor 50 may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 51 may be an internal storage unit of the terminal device 5, such as a hard disk or memory of the terminal device 5. The memory 51 may also be an external storage device of the terminal device 5, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card provided on the terminal device 5. Further, the memory 51 may include both an internal storage unit and an external storage device of the terminal device 5. The memory 51 is used to store the computer program as well as other programs and data required by the terminal device. The memory 51 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, the specific names of the functional units and modules are only for distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts that are not described or detailed in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other manners. For example, the apparatus/terminal device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on this understanding, the present application may implement all or part of the flow of the methods of the above embodiments by instructing related hardware through a computer program, which may be stored in a computer-readable storage medium; when executed by a processor, the computer program implements the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content contained in the computer-readable medium may be adjusted as required by legislation and patent practice in the relevant jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, the computer-readable medium does not include electrical carrier signals and telecommunications signals.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (9)

1. An image backlight detection method, characterized in that the image backlight detection method comprises:
obtaining an area where a target object is located in an image to be detected and an average brightness value of the area where the target object is located, wherein a bounding box is used for marking the extent of the area where the target object is located;
enlarging the bounding box and acquiring an average brightness value of the area within the enlarged bounding box;
calculating the ratio of the average brightness value of the area where the target object is located to the average brightness value of the area within the enlarged bounding box;
judging whether the ratio is smaller than a preset brightness threshold, and if the ratio is smaller than the brightness threshold, marking that the target object in the image to be detected is backlit;
wherein the brightness threshold is obtained based on the following method:
acquiring the brightness threshold according to the ratio of a first average brightness value of the area, marked by the bounding box, where the target object is located in a plurality of sample images to a second average brightness value of the area within the enlarged bounding box.
2. The method of claim 1, wherein the image to be detected is a preview image in a camera.
3. The image backlight detection method according to claim 2, further comprising, after marking that the target object in the image to be detected is backlit:
prompting a user to adjust the shooting angle of the camera or executing backlight compensation processing.
4. The image backlight detection method of claim 1, wherein the bounding box is a marking box.
5. The image backlight detection method of claim 1, wherein the enlarging the bounding box comprises:
acquiring the four-dimensional vector (w1, h1, x, y) of the bounding box, wherein (x, y) represents the coordinates of the center point of the bounding box, w1 represents the width of the bounding box, and h1 represents the height of the bounding box;
and enlarging the width w1 and the height h1 of the bounding box by a factor of N, wherein the four-dimensional vector of the enlarged bounding box is (w1×N, h1×N, x, y), and N is a number greater than 1.
6. The image backlight detection method according to claim 1, further comprising:
if the enlarged bounding box exceeds the extent of the image to be detected, determining that the overlapping area of the enlarged bounding box and the image to be detected is the area within the enlarged bounding box.
7. An image backlight detection apparatus, characterized by comprising:
a first brightness acquisition module, configured to obtain an area where a target object is located in an image to be detected and an average brightness value of the area where the target object is located, wherein a bounding box is used for marking the extent of the area where the target object is located;
a second brightness acquisition module, configured to enlarge the bounding box and acquire an average brightness value of the area within the enlarged bounding box;
a ratio calculating module, configured to calculate the ratio of the average brightness value of the area where the target object is located to the average brightness value of the area within the enlarged bounding box;
and a backlight determining module, configured to judge whether the ratio is smaller than a preset brightness threshold, and if the ratio is smaller than the brightness threshold, mark that the target object in the image to be detected is backlit;
wherein the brightness threshold is obtained based on the following method:
acquiring the brightness threshold according to the ratio of a first average brightness value of the area, marked by the bounding box, where the target object is located in a plurality of sample images to a second average brightness value of the area within the enlarged bounding box.
8. A terminal device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the image backlight detection method according to any one of claims 1 to 6 when executing the computer program.
9. A computer-readable storage medium storing a computer program, wherein the computer program when executed by a processor implements the steps of the image backlight detection method according to any one of claims 1 to 6.
CN201910276705.9A 2019-04-08 2019-04-08 Image backlight detection method, image backlight detection device and terminal equipment Active CN111861965B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910276705.9A CN111861965B (en) 2019-04-08 2019-04-08 Image backlight detection method, image backlight detection device and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910276705.9A CN111861965B (en) 2019-04-08 2019-04-08 Image backlight detection method, image backlight detection device and terminal equipment

Publications (2)

Publication Number Publication Date
CN111861965A CN111861965A (en) 2020-10-30
CN111861965B true CN111861965B (en) 2024-06-18

Family

ID=72951889

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910276705.9A Active CN111861965B (en) 2019-04-08 2019-04-08 Image backlight detection method, image backlight detection device and terminal equipment

Country Status (1)

Country Link
CN (1) CN111861965B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112949423B (en) * 2021-02-07 2024-05-24 深圳市优必选科技股份有限公司 Object recognition method, object recognition device and robot
CN114007019B (en) * 2021-12-31 2022-06-17 杭州魔点科技有限公司 Method and system for predicting exposure based on image brightness in backlight scene

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010102426A (en) * 2008-10-22 2010-05-06 Mitsubishi Electric Corp Image processing apparatus and image processing method
JP2015114465A (en) * 2013-12-11 2015-06-22 キヤノン株式会社 Imaging device, control method therefor, and control program

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05176220A (en) * 1991-12-25 1993-07-13 Matsushita Electric Ind Co Ltd Automatic exposure controller
CN102013006B (en) * 2009-09-07 2013-03-13 泉州市铁通电子设备有限公司 Method for automatically detecting and identifying face on the basis of backlight environment
CN106973236B (en) * 2017-05-24 2020-09-15 湖南盘子女人坊文化科技股份有限公司 Shooting control method and device
CN107360361B (en) * 2017-06-14 2020-07-31 中科创达软件科技(深圳)有限公司 Method and device for shooting people in backlight mode
CN108734676B (en) * 2018-05-21 2020-12-01 Oppo广东移动通信有限公司 Image processing method and device, electronic equipment and computer readable storage medium

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010102426A (en) * 2008-10-22 2010-05-06 Mitsubishi Electric Corp Image processing apparatus and image processing method
JP2015114465A (en) * 2013-12-11 2015-06-22 キヤノン株式会社 Imaging device, control method therefor, and control program

Also Published As

Publication number Publication date
CN111861965A (en) 2020-10-30

Similar Documents

Publication Publication Date Title
CN107197169B (en) high dynamic range image shooting method and mobile terminal
CN111654594B (en) Image capturing method, image capturing apparatus, mobile terminal, and storage medium
CN106161967B (en) Backlight scene panoramic shooting method and mobile terminal
CN112102164B (en) Image processing method, device, terminal and storage medium
CN108961157B (en) Picture processing method, picture processing device and terminal equipment
CN109215037B (en) Target image segmentation method and device and terminal equipment
CN110100251A (en) For handling the equipment, method and graphic user interface of document
CN108961183B (en) Image processing method, terminal device and computer-readable storage medium
CN108961267B (en) Picture processing method, picture processing device and terminal equipment
CN109285126B (en) Image processing method and device, electronic equipment and storage medium
CN112071267B (en) Brightness adjusting method, brightness adjusting device, terminal equipment and storage medium
WO2015196715A1 (en) Image retargeting method and device and terminal
CN111861965B (en) Image backlight detection method, image backlight detection device and terminal equipment
CN111368587A (en) Scene detection method and device, terminal equipment and computer readable storage medium
CN111654637B (en) Focusing method, focusing device and terminal equipment
CN110618852B (en) View processing method, view processing device and terminal equipment
CN108932703B (en) Picture processing method, picture processing device and terminal equipment
US10438377B2 (en) Method and device for processing a page
CN107360361B (en) Method and device for shooting people in backlight mode
CN111597009A (en) Application program display method and device and terminal equipment
CN108763491B (en) Picture processing method and device and terminal equipment
CN109559707B (en) Gamma value processing method and device of display panel and display equipment
CN108932704B (en) Picture processing method, picture processing device and terminal equipment
CN108898169B (en) Picture processing method, picture processing device and terminal equipment
CN113391779B (en) Parameter adjusting method, device and equipment for paper-like screen

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Country or region after: China

Address after: 516006 TCL science and technology building, No. 17, Huifeng Third Road, Zhongkai high tech Zone, Huizhou City, Guangdong Province

Applicant after: TCL Technology Group Co.,Ltd.

Address before: 516006 Guangdong province Huizhou Zhongkai hi tech Development Zone No. nineteen District

Applicant before: TCL Corp.

Country or region before: China

GR01 Patent grant