WO2020015148A1 - A color spot detection method and electronic device - Google Patents


Info

Publication number
WO2020015148A1
Authority
WO
WIPO (PCT)
Prior art keywords
channel
value
image
stain
subcutaneous
Prior art date
Application number
PCT/CN2018/106236
Other languages
English (en)
French (fr)
Inventor
郭知智
卢恒惠
郜文美
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to US17/260,855 (US11989885B2)
Priority to EP18927127.3A (EP3813012B1)
Priority to CN201880077836.8A (CN111417982B)
Publication of WO2020015148A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/514 Depth or shape recovery from specularities
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/56 Extraction of image or video features relating to colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 Feature extraction; Face representation
    • G06V40/171 Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30088 Skin; Dermal
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/03 Recognition of patterns in medical or anatomical images

Definitions

  • the present application relates to the field of image processing technology, and in particular, to a method for detecting speckles and an electronic device.
  • the severity of facial stains can directly reflect people's skin age and skin health, and is also an important factor in people's choice of cosmetics and skin care products.
  • An application on a mobile terminal can be used to analyze the color spot problem in a photo of a person's face, mark the spot outline or approximate position in the picture, and give a rating of the severity of the color spots.
  • These applications usually target the spot detection area in a face photo: the difference between the filtered spot detection area and the grayscale spot detection area is used to obtain the spot area, and the spot area ratio is used to obtain the color spot score.
  • the detection methods used by these applications can only detect skin surface stains, but not subcutaneous stains.
  • the detection results of the stains are inaccurate, resulting in inaccurate quantification of the severity of the stains.
  • the embodiments of the present application provide a stain detection method and an electronic device, which are used to solve the problem of inaccurate stain detection results in the prior art.
  • A color spot detection method provided by an embodiment of the present application can be applied to an electronic device. The method includes: acquiring an image to be detected, and converting the image to be detected into the Lab color space to obtain a Lab image.
  • The speckle features in the Lab image are extracted to obtain a speckle feature image.
  • The speckle feature image includes a skin surface speckle feature and a subcutaneous speckle feature. The skin surface spots and the subcutaneous spots are determined in the speckle feature image.
  • By extracting the speckle features of the image to be detected in the Lab color space, not only the skin surface speckles but also the subcutaneous speckles in the image to be detected can be detected, and both are used to determine the color spot condition of the human face. Therefore, compared with the prior art, which can only quantify spots based on detected skin surface spots, this helps to improve the accuracy of spot detection.
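The conversion to the Lab color space described above can be sketched in plain NumPy. This is a minimal sRGB-to-Lab implementation assuming input values in [0, 1] and a D65 white point; in practice a library routine such as OpenCV's cvtColor would typically be used:

```python
import numpy as np

def srgb_to_lab(img):
    """Convert an sRGB image (H, W, 3), values in [0, 1], to CIE Lab."""
    # Inverse sRGB gamma (linearize)
    lin = np.where(img <= 0.04045, img / 12.92, ((img + 0.055) / 1.055) ** 2.4)
    # Linear RGB -> XYZ (sRGB matrix, D65 white point)
    m = np.array([[0.4124564, 0.3575761, 0.1804375],
                  [0.2126729, 0.7151522, 0.0721750],
                  [0.0193339, 0.1191920, 0.9503041]])
    xyz = lin @ m.T
    # Normalize by the D65 reference white
    xyz /= np.array([0.95047, 1.0, 1.08883])
    # XYZ -> Lab
    eps = 216 / 24389
    kappa = 24389 / 27
    f = np.where(xyz > eps, np.cbrt(xyz), (kappa * xyz + 16) / 116)
    L = 116 * f[..., 1] - 16
    a = 500 * (f[..., 0] - f[..., 1])
    b = 200 * (f[..., 1] - f[..., 2])
    return np.stack([L, a, b], axis=-1)

# A pure white pixel maps to L ~ 100, a ~ 0, b ~ 0
white = np.ones((1, 1, 3))
lab_white = srgb_to_lab(white)
```

The L channel carries lightness while a and b carry color opponency, which is why the later steps treat the three channels separately.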
  • In a possible design, the detailed feature components of the L channel, the a channel, and the b channel in the Lab image may be separately extracted.
  • An L-channel difference between the L channel and its extracted detailed feature component, an a-channel difference between the a channel and its extracted detailed feature component, and a b-channel difference between the b channel and its extracted detailed feature component are then determined.
  • The speckle feature image is obtained based on the L-channel difference, the a-channel difference, and the b-channel difference, where the L channel of the speckle feature image is the L-channel difference, the a channel is the a-channel difference, and the b channel is the b-channel difference.
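This per-channel extraction can be sketched as follows. A naive single-channel bilateral filter (the smoothing the patent names the "detailed feature component", per the bilateral-filtering design later in the text) is applied to each Lab channel, and the per-channel difference forms the speckle feature image; the window radius and sigma values here are illustrative assumptions:

```python
import numpy as np

def bilateral_filter(ch, radius=2, sigma_s=2.0, sigma_r=0.1):
    """Naive O(n * k^2) bilateral filter on a single channel in [0, 1]."""
    h, w = ch.shape
    pad = np.pad(ch, radius, mode="edge")
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(ys**2 + xs**2) / (2 * sigma_s**2))  # spatial kernel
    out = np.empty_like(ch)
    for i in range(h):
        for j in range(w):
            win = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # Range kernel: down-weight pixels with dissimilar values
            range_w = np.exp(-((win - ch[i, j])**2) / (2 * sigma_r**2))
            wgt = spatial * range_w
            out[i, j] = (wgt * win).sum() / wgt.sum()
    return out

rng = np.random.default_rng(0)
lab = rng.random((8, 8, 3))  # stand-in Lab image, one filter per channel
detail = np.stack([bilateral_filter(lab[..., c]) for c in range(3)], axis=-1)
feature = lab - detail  # speckle feature image: per-channel difference
```

Subtracting the edge-preserving smoothing leaves the small local deviations (candidate speckles) in each channel.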
  • a spot region in the spot characteristic image may be determined. For each of the color spot areas, a b-channel average value of each pixel point in each of the color spot areas is determined. It is determined that a spot region with a b-channel average value greater than a first threshold is a skin surface spot, and a spot region with a b-channel average value less than or equal to the first threshold is a subcutaneous stain.
  • the above-mentioned design can accurately detect the skin surface pigmentation and the subcutaneous pigmentation.
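The surface-versus-subcutaneous split described above reduces to comparing each detected region's mean b-channel value against the first threshold. A minimal sketch (region masks and threshold values are illustrative assumptions):

```python
import numpy as np

def classify_spots(b_channel, region_masks, first_threshold):
    """Split detected spot regions into skin-surface vs. subcutaneous spots
    by comparing each region's mean b-channel value to a threshold."""
    surface, subcutaneous = [], []
    for mask in region_masks:
        b_mean = b_channel[mask].mean()
        (surface if b_mean > first_threshold else subcutaneous).append(mask)
    return surface, subcutaneous

# Toy example: two single-pixel regions on a synthetic b channel
b = np.zeros((4, 4))
b[0, 0] = 10.0   # high-b (yellowish) region -> skin surface spot
b[3, 3] = -5.0   # low-b region -> subcutaneous spot
m1 = np.zeros((4, 4), bool); m1[0, 0] = True
m2 = np.zeros((4, 4), bool); m2[3, 3] = True
surface, subcutaneous = classify_spots(b, [m1, m2], first_threshold=0.0)
```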
  • In a possible design, when determining a spot region, a first pixel point within a detection frame may be determined, where the length of the detection frame is smaller than the length of the speckle feature image, the width of the detection frame is smaller than the width of the speckle feature image, and the detection frame moves within the speckle feature image by a preset step; the pixel value of the first pixel point satisfies the following formula:
  • r1 ≥ a1 + T1 × b1
  • where r1 is the pixel value of the first pixel point, a1 is the average pixel value of the pixels in the detection frame, T1 is a preset value, and b1 is the variance of the pixel values of the pixels in the detection frame.
  • A second pixel point in the speckle feature image is also determined, whose pixel value satisfies the following formula:
  • r2 ≥ a2 + T2 × b2
  • where r2 is the pixel value of the second pixel point, a2 is the average pixel value of the pixels in the speckle feature image, T2 is a preset value, and b2 is the variance of the pixel values of the pixels in the speckle feature image.
  • The first pixel point and the second pixel point are determined as pigment points; the pigment points are subjected to a dilation operation, and the dilated pigment points are subjected to an erosion operation to obtain a color spot area.
  • pigment points can be detected more accurately, thereby improving the accuracy of spot detection.
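The candidate-pixel selection and the dilation-then-erosion step (morphological closing) can be sketched in NumPy. The thresholding form r > mean + T × variance is an assumption inferred from the quantities r, a, T, and b defined above, and the window size, step, and T values are illustrative:

```python
import numpy as np

def detect_pigment_points(feat, win=4, step=2, t1=1.5, t2=1.5):
    """Flag candidate pigment pixels both locally (inside a sliding
    detection frame smaller than the image) and globally, using the
    assumed form: value > mean + T * variance."""
    h, w = feat.shape
    local = np.zeros((h, w), bool)
    for i in range(0, h - win + 1, step):
        for j in range(0, w - win + 1, step):
            frame = feat[i:i + win, j:j + win]
            a1, b1 = frame.mean(), frame.var()
            local[i:i + win, j:j + win] |= frame > a1 + t1 * b1
    a2, b2 = feat.mean(), feat.var()
    return local | (feat > a2 + t2 * b2)  # union of both pixel sets

def dilate(m):
    """3x3 binary dilation."""
    p = np.pad(m, 1)
    out = np.zeros_like(m)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out |= p[1 + dy:1 + dy + m.shape[0], 1 + dx:1 + dx + m.shape[1]]
    return out

def erode(m):
    """3x3 binary erosion (dual of dilation)."""
    return ~dilate(~m)

feat = np.zeros((8, 8))
feat[3:5, 3:5] = 1.0           # a bright 2x2 "pigment" patch
spots = detect_pigment_points(feat)
closed = erode(dilate(spots))  # dilation then erosion = morphological closing
```

The closing step merges nearby pigment points into contiguous spot areas before region analysis.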
  • In a possible design, a spot region whose area is smaller than a second threshold and/or larger than a third threshold may be removed, where the second threshold is smaller than the third threshold; and/or a spot region whose ratio of area to perimeter is smaller than a fourth threshold may be removed. Because the area of a color spot on a human face generally falls within a certain range, and the shape of a color spot is generally round, removing spot regions that are too small, too large, or of low circularity improves the accuracy of spot detection.
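The region-pruning rules above can be sketched directly. The regions here are hypothetical, with area and perimeter assumed already measured; the thresholds are illustrative:

```python
def filter_regions(regions, min_area, max_area, min_ratio):
    """Drop candidate spot regions whose area is implausibly small or
    large, or whose area-to-perimeter ratio is too low (far from round)."""
    kept = []
    for r in regions:
        if r["area"] < min_area or r["area"] > max_area:
            continue  # second/third threshold: implausible size
        if r["area"] / r["perimeter"] < min_ratio:
            continue  # fourth threshold: low circularity
        kept.append(r)
    return kept

# Hypothetical candidate regions
regions = [
    {"name": "tiny_noise",  "area": 2,    "perimeter": 6},
    {"name": "round_spot",  "area": 78,   "perimeter": 31},  # ~r=5 circle
    {"name": "thin_streak", "area": 40,   "perimeter": 82},  # low circularity
    {"name": "huge_patch",  "area": 5000, "perimeter": 260},
]
kept = filter_regions(regions, min_area=10, max_area=1000, min_ratio=1.0)
```

For a disk of radius r the area-to-perimeter ratio is r/2, so elongated streaks of the same area score much lower, which is why the ratio works as a circularity proxy.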
  • In a possible design, a first feature set may also be determined, and the score of the skin surface stain is quantified based on the first feature set. The first feature set includes at least one of the following features: a uniform value, the number of skin surface stains, the area of the skin surface stains, and the contrast value of the skin surface stains. The uniform value is used to characterize the pigment uniformity of the speckle feature image, and the contrast value of the skin surface stain is used to characterize the color contrast of the skin surface stain.
  • Similarly, a second feature set may be determined, and the score of the subcutaneous stain is quantified based on the second feature set. The second feature set includes at least one of the following features: the uniform value, the number of subcutaneous stains, the area of the subcutaneous stains, and the contrast value of the subcutaneous stains; the contrast value of the subcutaneous stain is used to characterize the color contrast of the subcutaneous stain.
  • A comprehensive score for spot detection is determined based on the score of the skin surface spot and the score of the subcutaneous spot. The comprehensive score is displayed, or the score of the skin surface stain, the score of the subcutaneous stain, and the comprehensive score are displayed together. In the above design, the spot score obtained by combining the score of the skin surface spot and the score of the subcutaneous spot has better accuracy.
  • In a possible design, the score of the skin surface stain can be determined by a preset formula in which:
  • H 1 is the score of the skin surface stain
  • A is the uniform value
  • B 1 is the number of the skin surface color spots
  • C 1 is the sum of the contrast values of all the skin surface color spots
  • D 1 is the sum of the areas of all the skin surface color spots
  • E is the area of the speckle feature image
  • w 1 , w 2 , w 3 are preset parameters
  • The subcutaneous stain score can be determined by a preset formula in which:
  • H 2 is the score of the subcutaneous stain
  • B 2 is the number of the subcutaneous stains
  • C 2 is the sum of the contrast values of all the subcutaneous stains
  • D 2 is the sum of the areas of all the subcutaneous stains
  • w 3 and w 4 are preset parameters
  • The comprehensive score can be determined by the following formula:
  • H = y 1 × H 1 + y 2 × H 2
  • where H is the comprehensive score, and y 1 and y 2 are preset parameters.
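Reading the combination as a weighted sum of the two scores, the final step is a one-liner; the weight values here are illustrative assumptions:

```python
def composite_score(h1, h2, y1=0.5, y2=0.5):
    """Weighted combination of the skin-surface-spot score h1 and the
    subcutaneous-spot score h2; y1 and y2 are preset weights."""
    return y1 * h1 + y2 * h2

H = composite_score(80.0, 60.0)  # equal weights -> 70.0
```

In practice y1 and y2 would be tuned (e.g. on dermatologist-rated photos) to reflect how much each spot type should influence the overall rating.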
  • In a possible design, the uniform value in the first feature set and the second feature set is determined in the following manner: the speckle feature image is divided into several overlapping rectangular areas, and the standard deviation of the pixel values of the pixels in each rectangular area is determined. The mean of the standard deviations over all the rectangular areas is determined to obtain the uniform value.
  • the uniformity of the skin pigment can be determined more accurately.
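The uniform value computation above can be sketched as follows; the rectangle size and overlap step are illustrative assumptions:

```python
import numpy as np

def uniform_value(feat, rect=4, step=2):
    """Mean of per-rectangle pixel standard deviations over overlapping
    rectangular areas of the speckle feature image (step < rect gives
    the overlap)."""
    h, w = feat.shape
    stds = []
    for i in range(0, h - rect + 1, step):
        for j in range(0, w - rect + 1, step):
            stds.append(feat[i:i + rect, j:j + rect].std())
    return float(np.mean(stds))

flat = np.full((8, 8), 0.3)                        # perfectly uniform pigment
noisy = np.zeros((8, 8)); noisy[::2, ::2] = 1.0    # blotchy pigment
u_flat, u_noisy = uniform_value(flat), uniform_value(noisy)
```

A perfectly uniform patch yields 0, and blotchier pigment yields a larger value, so the uniform value grows with pigment non-uniformity.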
  • In a possible design, the contrast value of the skin surface stain in the first feature set may be determined in the following manner: determining a first pixel value average of the pixels in each skin surface stain, and a second pixel value average of the pixels in the speckle feature image. The ratio of the first pixel value average to the second pixel value average is determined to obtain the contrast value of the skin surface stain.
  • The contrast value of the subcutaneous stain in the second feature set may be determined by determining a third pixel value average of the pixels in each subcutaneous stain and the second pixel value average.
  • The ratio of the third pixel value average to the second pixel value average is determined to obtain the contrast value of the subcutaneous stain.
  • Through the above design, the contrast values of the skin surface stain and the subcutaneous stain can be obtained more accurately.
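The contrast value is the same computation for both spot types, differing only in which region mask is used. A minimal sketch on synthetic data:

```python
import numpy as np

def contrast_value(spot_mask, feat):
    """Ratio of the mean pixel value inside one spot region to the mean
    pixel value of the whole speckle feature image."""
    return feat[spot_mask].mean() / feat.mean()

feat = np.full((4, 4), 2.0)
feat[1, 1] = 10.0                       # one strongly pigmented spot pixel
mask = np.zeros((4, 4), bool); mask[1, 1] = True
c = contrast_value(mask, feat)          # spot mean 10 vs image mean 2.5
```

A value near 1 means the spot barely stands out from the surrounding skin, while larger deviations indicate stronger color contrast.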
  • In a possible design, before conversion to the Lab color space, the image to be detected may be converted into a grayscale image, and pixels whose value is greater than a fifth threshold are removed from the grayscale image. Because the gray values of the highlight areas of the face caused by reflection are significantly larger than those of normal skin areas, removing these highlight areas reduces their effect on spot detection and further improves the accuracy of spot detection.
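The highlight-removal pre-processing can be sketched as follows, using the common luma weights for grayscale conversion; the threshold value is an illustrative assumption:

```python
import numpy as np

def remove_highlights(rgb, fifth_threshold):
    """Convert to grayscale and mask out pixels brighter than the
    threshold (specular highlights on the skin). Returns a boolean mask
    of pixels kept for spot detection."""
    gray = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    return gray <= fifth_threshold

img = np.full((2, 2, 3), 0.5)
img[0, 0] = 1.0                              # a specular highlight pixel
keep = remove_highlights(img, fifth_threshold=0.9)
```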
  • an embodiment of the present application further provides an electronic device, including: a memory, for storing a computer program.
  • a processing module, for invoking the computer program stored in the memory to execute: acquiring an image to be detected; converting the image to be detected into the Lab color space to obtain a Lab image; extracting the speckle features in the Lab image to obtain a speckle feature image, the speckle feature image including a skin surface speckle feature and a subcutaneous speckle feature; and determining the skin surface spots and the subcutaneous spots in the speckle feature image.
  • In a possible design, when extracting the speckle features in the Lab image to obtain the speckle feature image, the processing module is specifically configured to: extract the detailed feature components of the L channel, the a channel, and the b channel in the Lab image, respectively; determine an L-channel difference between the L channel and its extracted detailed feature component, an a-channel difference between the a channel and its extracted detailed feature component, and a b-channel difference between the b channel and its extracted detailed feature component; and obtain the speckle feature image based on the L-channel difference, the a-channel difference, and the b-channel difference, where the L channel of the speckle feature image is the L-channel difference, the a channel is the a-channel difference, and the b channel is the b-channel difference.
  • In a possible design, when extracting the detailed feature components of the L channel, the a channel, and the b channel in the Lab image, the processing module is specifically configured to: perform bilateral filtering on the L channel, the a channel, and the b channel in the Lab image, respectively, to obtain the detailed feature component of the L channel, the detailed feature component of the a channel, and the detailed feature component of the b channel.
  • the processing module when determining a skin surface spot and a subcutaneous spot in the spot characteristic image, is specifically configured to: determine a spot area in the spot characteristic image; For each of the stained areas, determine the b-channel average of each pixel in each of the stained areas; determine that the area with a b-channel average greater than the first threshold is a skin surface stain, and the b-channel average is less than or equal to The spot area of the first threshold is a subcutaneous spot.
  • In a possible design, when determining a spot region in the speckle feature image, the processing module is specifically configured to: determine a first pixel point within a detection frame, where the length of the detection frame is smaller than the length of the speckle feature image, the width of the detection frame is smaller than the width of the speckle feature image, and the detection frame moves within the speckle feature image by a preset step; the pixel value of the first pixel point satisfies the following formula:
  • r1 ≥ a1 + T1 × b1
  • where r1 is the pixel value of the first pixel point, a1 is the average pixel value of the pixels in the detection frame, T1 is a preset value, and b1 is the variance of the pixel values of the pixels in the detection frame.
  • A second pixel point in the speckle feature image is also determined, whose pixel value satisfies the following formula:
  • r2 ≥ a2 + T2 × b2
  • where r2 is the pixel value of the second pixel point, a2 is the average pixel value of the pixels in the speckle feature image, T2 is a preset value, and b2 is the variance of the pixel values of the pixels in the speckle feature image.
  • The first pixel point and the second pixel point are determined as pigment points.
  • The pigment points are subjected to a dilation operation, and the dilated pigment points are subjected to an erosion operation to obtain a color spot area.
  • In a possible design, the processing module is further configured to: after determining the spot regions in the speckle feature image, remove a spot region whose area is smaller than a second threshold and/or larger than a third threshold, where the second threshold is smaller than the third threshold; and/or remove a spot region whose ratio of area to perimeter is smaller than a fourth threshold.
  • In a possible design, the processing module is further configured to determine a first feature set and quantify a score of the skin surface stain based on the first feature set, where the first feature set includes at least one of the following features: a uniform value, the number of skin surface stains, the area of the skin surface stains, and a contrast value of the skin surface stains. The uniform value is used to characterize the pigment uniformity of the speckle feature image, and the contrast value of the skin surface stain is used to characterize the color contrast of the skin surface stain.
  • Similarly, a second feature set may be determined, and the score of the subcutaneous stain is quantified based on the second feature set. The second feature set includes at least one of the following features: the uniform value, the number of subcutaneous stains, the area of the subcutaneous stains, and the contrast value of the subcutaneous stains; the contrast value of the subcutaneous stain is used to characterize the color contrast of the subcutaneous stain.
  • a comprehensive score for spot detection is determined based on the score of the skin surface spot and the score of the subcutaneous spot. The comprehensive score is displayed, or the score of the skin surface stain, the score of the subcutaneous stain, and the comprehensive score are displayed.
  • In a possible design, the processing module determines the score of the skin surface stain by a preset formula in which:
  • H 1 is the score of the skin surface stain
  • A is the uniform value
  • B 1 is the number of the skin surface color spots
  • C 1 is the sum of the contrast values of all the skin surface color spots
  • D 1 is the sum of the areas of all the skin surface color spots
  • E is the area of the speckle feature image
  • w 1 , w 2 , w 3 are preset parameters.
  • The processing module determines the subcutaneous stain score by a preset formula in which:
  • H 2 is the score of the subcutaneous stain
  • B 2 is the number of the subcutaneous stains
  • C 2 is the sum of the contrast values of all the subcutaneous stains
  • D 2 is the sum of the areas of all the subcutaneous stains
  • w 3 and w 4 are preset parameters.
  • The processing module determines the comprehensive score by the following formula:
  • H = y 1 × H 1 + y 2 × H 2
  • where H is the comprehensive score, and y 1 and y 2 are preset parameters.
  • In a possible design, the processing module is further configured to determine the uniform value in the first feature set and the second feature set in the following manner: divide the speckle feature image into several overlapping rectangular areas; determine the standard deviation of the pixel values of the pixels in each rectangular area; and determine the mean of the standard deviations over all the rectangular areas to obtain the uniform value.
  • In a possible design, the processing module is further configured to determine the contrast value of the skin surface stain in the first feature set in the following manner: determine a first pixel value average of the pixels in each skin surface stain and a second pixel value average of the pixels in the speckle feature image; and determine a ratio of the first pixel value average to the second pixel value average to obtain the contrast value of the skin surface stain.
  • The processing module is further configured to determine the contrast value of the subcutaneous stain in the second feature set in the following manner: determine a third pixel value average of the pixels in each subcutaneous stain and the second pixel value average; and determine a ratio of the third pixel value average to the second pixel value average to obtain the contrast value of the subcutaneous stain.
  • In a possible design, the processing module is further configured to: before converting the image to be detected into the Lab color space, convert the image to be detected into a grayscale image, and remove from the grayscale image pixels whose value is greater than a fifth threshold.
  • An embodiment of the present application provides a computer storage medium storing program instructions that, when run on an electronic device, cause the electronic device to execute the method of the first aspect of the embodiments of the present application and any possible design thereof.
  • a computer program product provided in an embodiment of the present application, when the computer program product is run on an electronic device, causes the electronic device to implement the first aspect of the embodiment of the present application and any of its possible design methods.
  • An embodiment of the present application provides a chip, which is coupled to a memory in an electronic device and executes the method of the first aspect of the embodiments of the present application and any possible design thereof.
  • "Coupled" in the embodiments of the present application means that two components are combined with each other directly or indirectly.
  • FIG. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a user interface according to an embodiment of the present application.
  • FIG. 3 is a schematic flowchart of a color spot detection method according to an embodiment of the present application.
  • FIG. 4A is a schematic diagram of a preview user interface according to an embodiment of the present application.
  • FIG. 4B is a schematic flowchart of a color spot detection method according to an embodiment of the present application.
  • FIG. 5 is a schematic diagram of an L channel, an a channel, and a b channel of a Lab image according to an embodiment of the present application;
  • FIG. 6 is a schematic diagram of a speckle feature image provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of a detailed feature component of an L channel, a detailed feature component of an a channel, and a detailed feature component of a b channel provided in an embodiment of the present application;
  • FIG. 8 is a schematic diagram of an L-channel difference, an a-channel difference, and a b-channel difference provided in an embodiment of the present application;
  • FIG. 9 is a schematic diagram of a skin surface stain and a subcutaneous stain according to an embodiment of the present application.
  • FIG. 10 is a schematic diagram of a user interface for displaying a color spot score according to an embodiment of the present application.
  • FIG. 11 is a schematic diagram of another user interface for displaying a color spot score according to an embodiment of the present application.
  • FIG. 12 is a schematic diagram of a user interface for displaying a color spot detection result according to an embodiment of the present application.
  • FIG. 13 is a schematic structural diagram of another electronic device according to an embodiment of the present application.
  • the electronic device may be a portable electronic device including functions such as a personal digital assistant and / or a music player, such as a mobile phone, a tablet computer, a wearable device (such as a smart watch) with a wireless communication function, Vehicle equipment, etc.
  • Portable electronic devices include, but are not limited to, portable electronic devices running various operating systems.
  • the above-mentioned portable electronic device may also be a laptop computer or the like having a touch-sensitive surface (for example, a touch panel). It should also be understood that, in other embodiments of the present application, the above electronic device may also be a desktop computer having a touch-sensitive surface (such as a touch panel).
  • FIG. 1 is a schematic structural diagram of an electronic device 100.
  • The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 2, a wireless communication module 160, and the like.
  • the sensor module 180 includes an ambient light sensor 180L.
  • the sensor module 180 may further include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, Bone conduction sensor 180M and so on.
  • the electronic device 100 in this embodiment of the present application may further include an antenna 1, a mobile communication module 150, and a subscriber identification module (SIM) card interface 195.
  • the processor 110 may include one or more processing units.
  • The processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), and/or a neural-network processing unit (NPU).
  • different processing units may be independent devices or integrated in one or more processors.
  • the processor 110 may further include a memory for storing instructions and data.
  • the memory in the processor 110 may be a cache memory.
  • The memory may store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory, which avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving system efficiency.
  • the processor 110 may further include one or more interfaces.
  • the interface may be a universal serial bus (USB) interface 130.
  • The interface may also be an inter-integrated circuit (I2C) interface, an inter-IC sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, or a subscriber identity module (SIM) interface.
  • the USB interface 130 is an interface that complies with the USB standard specification.
  • the USB interface 130 may include a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transfer data between the electronic device 100 and a peripheral device. It can also be used to connect headphones and play audio through headphones. This interface can also be used to connect other electronic devices, such as AR devices.
  • the charging management module 140 is configured to receive a charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive the charging input of the wired charger through the USB interface 130.
  • the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. While the charge management module 140 is charging the battery 142, the power management module 141 can also provide power to the electronic device.
  • the power management module 141 is used to connect the battery 142, the charge management module 140 and the processor 110.
  • the power management module 141 receives inputs from the battery 142 and / or the charge management module 140 and supplies power to the processor 110, the internal memory 121, the external memory, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor battery capacity, battery cycle times, battery health status (leakage, impedance) and other parameters.
  • the power management module 141 may also be disposed in the processor 110.
  • the power management module 141 and the charge management module 140 may be provided in the same device.
  • the wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
  • the antenna 1 and the antenna 2 are used for transmitting and receiving electromagnetic wave signals.
  • Each antenna in the electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be multiplexed to improve antenna utilization.
  • antenna 1 can be multiplexed into a diversity antenna for a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide a wireless communication solution including 2G / 3G / 4G / 5G and the like applied on the electronic device 100.
  • the mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like.
  • the mobile communication module 150 may receive an electromagnetic wave via the antenna 1, filter and amplify the received electromagnetic wave, and transmit it to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into an electromagnetic wave for radiation through the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is configured to modulate a low-frequency baseband signal to be transmitted into a high-frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is processed by the baseband processor and then passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the display screen 194.
  • the modem processor may be a separate device.
  • the modem processor may be independent of the processor 110 and disposed in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 may provide wireless communication solutions applied to the electronic device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR).
  • the wireless communication module 160 may be one or more devices that integrate at least one communication processing module.
  • the wireless communication module 160 receives the electromagnetic wave signal via the antenna 2, frequency-modulates and filters the electromagnetic wave signal, and sends the processed signal to the processor 110.
  • the wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency-modulate it, amplify it, and convert it into electromagnetic wave radiation through the antenna 2.
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
  • the electronic device 100 implements a display function through a GPU, a display screen 194, and an application processor.
  • the GPU is a microprocessor for image processing and is connected to the display 194 and an application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos, and the like.
  • the display screen 194 includes a display panel.
  • the display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light-emitting diodes (QLED), and the like.
  • the electronic device 100 may include one or N display screens 194, where N is a positive integer greater than one.
  • the electronic device 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, and an application processor.
  • the ISP processes the data fed back by the camera 193. For example, when taking a picture, the shutter is opened, light is transmitted to the photosensitive element of the camera through the lens, and the optical signal is converted into an electrical signal; the photosensitive element of the camera passes the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye. The ISP can also optimize the noise, brightness, and skin tone of the image, as well as parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.
  • the camera 193 is used to capture still images or videos.
  • An object generates an optical image through a lens and projects it onto a photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then passes the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs digital image signals to the DSP for processing.
  • DSP converts digital image signals into image signals in standard RGB, YUV and other formats.
  • the electronic device 100 may include one or N cameras 193, where N is a positive integer greater than 1.
  • a digital signal processor is used to process digital signals. In addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy and the like.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and so on.
  • the NPU is a neural-network (NN) computing processor.
  • the NPU can quickly process input information and continuously learn by itself.
  • the NPU can realize applications such as intelligent cognition of the electronic device 100, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 may be used to connect an external memory card (for example, a Micro SD card) to achieve the expansion of the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, save files such as music and videos on an external memory card.
  • the internal memory 121 may be used to store computer executable program code, where the executable program code includes instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area may store an operating system, at least one application required by a function (such as a sound playback function, an image playback function, etc.) and the like.
  • the storage data area may store data (such as audio data, phonebook, etc.) created during the use of the electronic device 100.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS).
  • the electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, and an application processor. For example, music playback, recording, etc.
  • the audio module 170 is configured to convert digital audio information into an analog audio signal and output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module 170 may also be used to encode and decode audio signals.
  • the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
  • the speaker 170A, also called a "horn", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music or listen to a hands-free call through the speaker 170A.
  • the receiver 170B, also referred to as the "handset", is used to convert audio electrical signals into sound signals.
  • when the electronic device 100 answers a call or a voice message, the receiver 170B can be held close to the human ear to answer the voice.
  • the microphone 170C, also called a "mike" or "sound transmitter", is used to convert sound signals into electrical signals.
  • the user can make a sound through the mouth near the microphone 170C, and input a sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C.
  • the electronic device 100 may be provided with two microphones 170C, which, in addition to collecting sound signals, may also implement a noise reduction function.
  • the electronic device 100 may further be provided with three, four, or more microphones 170C to implement sound signal collection, noise reduction, and also identify the source of the sound, to implement a directional recording function, and the like.
  • the headset interface 170D is used to connect a wired headset.
  • the headphone interface 170D may be a USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense a pressure signal, and can convert the pressure signal into an electrical signal.
  • the pressure sensor 180A may be disposed on the display screen 194.
  • the capacitive pressure sensor may be at least two parallel plates having a conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance.
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position based on the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but different touch operation intensities may correspond to different operation instructions. For example, when a touch operation with a touch operation intensity lower than the first pressure threshold is applied to the short message application icon, an instruction for viewing the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold is applied to the short message application icon, an instruction for creating a short message is executed.
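The threshold-based dispatch described above can be sketched in a few lines. This is an illustrative sketch only: the threshold value and the instruction names below are assumptions for demonstration, not values taken from this application.

```python
# Illustrative sketch of intensity-based touch dispatch; the threshold
# value and instruction names are assumptions, not from this application.
FIRST_PRESSURE_THRESHOLD = 0.5  # assumed normalized touch intensity

def dispatch_short_message_touch(intensity: float) -> str:
    """Map a touch on the short-message icon to an instruction by intensity."""
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_short_message"    # light press: view the message
    return "create_short_message"      # firm press: create a new message
```

The same touch position thus yields different operation instructions purely as a function of the detected pressure intensity.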
  • the gyro sensor 180B may be used to determine a movement posture of the electronic device 100.
  • in some embodiments, the angular velocity of the electronic device 100 around three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyro sensor 180B detects the angle at which the electronic device 100 shakes, calculates, according to the angle, the distance that the lens module needs to compensate for, and allows the lens to cancel the shake of the electronic device 100 through reverse movement, thereby achieving image stabilization.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the barometric pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude through the air pressure value measured by the air pressure sensor 180C, and assists in positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 can detect the opening and closing of the flip leather case by using the magnetic sensor 180D.
  • the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D. Further, according to the opened and closed state of the holster or the opened and closed state of the flip cover, characteristics such as automatic unlocking of the flip cover are set.
  • the acceleration sensor 180E can detect the magnitude of acceleration of the electronic device 100 in various directions (generally three axes).
  • the magnitude and direction of gravity can be detected when the electronic device 100 is stationary. It can also be used to recognize the posture of electronic devices, and is used in applications such as switching between horizontal and vertical screens, and pedometers.
  • the electronic device 100 can measure the distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 may use the distance sensor 180F to measure the distance to achieve fast focusing.
  • the proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device 100 emits infrared light through a light emitting diode.
  • the electronic device 100 uses a photodiode to detect infrared light reflected from a nearby object. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near it.
  • the electronic device 100 may use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • Ambient light sensor 180L can also be used to automatically adjust white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 may use the collected fingerprint characteristics to realize fingerprint unlocking, access application lock, fingerprint photographing, fingerprint answering an incoming call, and the like.
  • the temperature sensor 180J is used to detect the temperature.
  • the electronic device 100 executes a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection.
  • the electronic device 100 when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid the abnormal shutdown of the electronic device 100 caused by the low temperature.
  • when the temperature is lower than still another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by the low temperature.
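The temperature processing strategy above amounts to comparing the reported temperature against thresholds and selecting protective actions. A minimal sketch follows; the threshold values and action names are illustrative assumptions, since the application does not specify them.

```python
# Hypothetical thermal-policy sketch; threshold values and action names
# are illustrative assumptions, not taken from this application.
HIGH_TEMP_C = 45.0   # assumed over-temperature threshold
LOW_TEMP_C = 0.0     # assumed low-temperature threshold

def thermal_actions(temp_c: float) -> list:
    """Return the protective actions for a reported temperature."""
    actions = []
    if temp_c > HIGH_TEMP_C:
        # Reduce performance of the processor near the hot sensor.
        actions.append("reduce_nearby_processor_performance")
    if temp_c < LOW_TEMP_C:
        # Heat the battery and boost its output voltage at low temperature.
        actions.append("heat_battery")
        actions.append("boost_battery_output_voltage")
    return actions
```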
  • the touch sensor 180K is also called “touch panel”.
  • the touch sensor 180K may be disposed on the display screen 194; the touch sensor 180K and the display screen 194 form a touch screen, also referred to as a "touchscreen".
  • the touch sensor 180K is used to detect a touch operation acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • a visual output related to the touch operation may be provided through the display screen 194.
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100, which is different from the position of the display screen 194.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can acquire a vibration signal of a human voice oscillating bone mass.
  • Bone conduction sensor 180M can also contact the human pulse and receive blood pressure beating signals.
  • the bone conduction sensor 180M may also be disposed in the earphone and combined into a bone conduction earphone.
  • the audio module 170 may analyze a voice signal based on the vibration signal of the oscillating bone mass of the vocal part obtained by the bone conduction sensor 180M to implement a voice function.
  • the application processor may analyze the heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M to implement a heart rate detection function.
  • the keys 190 may include a power-on key, a volume key, and the like.
  • the key 190 may be a mechanical key. It can also be a touch button.
  • the electronic device 100 may receive a key input, and generate a key signal input related to user settings and function control of the electronic device 100.
  • the motor 191 may generate a vibration alert.
  • the motor 191 can be used for vibration alert for incoming calls, and can also be used for touch vibration feedback.
  • the touch operation applied to different applications can correspond to different vibration feedback effects.
  • Acting on touch operations in different areas of the display screen 194, the motor 191 can also correspond to different vibration feedback effects.
  • Different application scenarios (such as time reminders, receiving information, alarm clocks, games, etc.) can also correspond to different vibration feedback effects.
  • Touch vibration feedback effect can also support customization.
  • the indicator 192 may be an indicator light, which may be used to indicate a charging state, a power change, and may also be used to indicate a message, missed call, notification, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be brought into contact with and separated from the electronic device 100 by inserting it into or removing it from the SIM card interface 195.
  • the electronic device 100 may support one or N SIM card interfaces, and N is a positive integer greater than 1.
  • SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card, etc. Multiple SIM cards can be inserted into the same SIM card interface 195 at the same time. The types of the multiple cards may be the same or different.
  • the SIM card interface 195 may also be compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through a SIM card to implement functions such as calling and data communication.
  • the electronic device 100 uses an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 100.
  • the electronic device 100 may include more or fewer parts than shown, or some parts may be combined, or some parts may be split, or different parts may be arranged.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the electronic device 100 is taken as an example to describe the embodiment of the present application in detail.
  • the applications supported by the electronic device in the embodiments of the present application may include photographing applications, such as a camera.
  • the applications supported by the electronic device may also include various other applications, such as: graphics, games, phones, video players, music players, photo management, browsers, calendars, clocks, and so on.
  • the applications supported by the electronic device in the embodiments of the present application may further include applications for skin detection.
  • the application for skin detection detects the characteristics of the user's facial skin (such as facial skin wrinkles, pores, blackheads, stains, red areas, etc.) through a captured face image, and can provide the user with a detection result report.
  • the detection result report may include, but is not limited to, a score for each feature of the facial skin, a comprehensive analysis of the facial skin, and the like, and may further display the user's face picture with the corresponding problems marked based on the detection result of each feature on the face image, such as blackheads marked in the nose area, wrinkles marked in the forehead area, and stains marked in the cheek area.
  • the detection result report can be presented to the user through a user interface.
  • the detection result report can be shown in the user interface 200 shown in FIG. 2, including a comprehensive score, a skin age, and scores for pores, blackheads, fine lines, stains, and red zones.
  • the user interface 200 may further include a virtual button 201, a virtual button 202, a virtual button 203, a virtual button 204, and a virtual button 205.
  • taking the virtual button 201 as an example, the electronic device 100, in response to an operation on the virtual button 201, displays specific care suggestions for the pores on the display screen 194.
  • for the functions of the virtual button 202, the virtual button 203, the virtual button 204, and the virtual button 205, refer to the function of the virtual button 201; details are not described herein again.
  • the user skin detection solution in the embodiment of the present application may integrate, in the processor 110, a shooting condition detection module, an image quality detection module, a region of interest (ROI) detection module, a skin feature detection module, a result analysis module, and the like.
  • in some embodiments, a shooting condition detection module, an image quality detection module, a region of interest (ROI) detection module, a skin feature detection module, a result analysis module, and the like may be integrated on an application processor in the processor 110.
  • in some embodiments, an artificial intelligence (AI) chip is integrated in the processor 110, and the shooting condition detection module, the image quality detection module, the region of interest (ROI) detection module, the skin feature detection module, the result analysis module, and the like are integrated on the AI chip to achieve user skin detection.
  • the shooting condition detection module can detect the current shooting conditions to guide users to shoot under the required shooting conditions to ensure that the captured images meet the requirements, thereby ensuring the accuracy of skin detection based on the images.
  • the required shooting conditions include: sufficient ambient lighting, a suitable distance between the face and the electronic device (for example, about 25 cm), a straight face, eyes closed, no glasses, no bangs on the forehead, accurate focus, no obvious jitter, and the like.
  • when detection by the shooting condition detection module succeeds, the processor 110 will start the intelligent fill light. For example, the shooting condition detection module determines that detection succeeds when the current shooting conditions meet the requirements. Specifically, in the embodiment of the present application, the electronic device may use different fill light modes (such as a flash mode and a flashlight mode) to fill light on the user's face to meet the requirements for detecting different facial skin features. After the user's face is filled with light, the processor 110 can control the camera 193 to photograph the user's face to obtain a face image of the user's face.
  • the image quality detection module can detect the quality of the face image to ensure that the captured image meets the requirements of different facial skin feature detection.
  • the ROI detection module can determine the ROI to be detected from the face image after the image quality detection module detects that the quality of the image meets the requirements. For example, a blackhead ROI is a small area on the nose.
  • the skin feature detection module can detect facial skin features in the determined ROI, for example, detecting wrinkles, pores, blackheads, stains, red areas, and oil output in the skin.
  • the result analysis module can analyze the detection results of the facial skin features detected by the skin feature detection module, and give a score and a ranking of each detection item for each skin feature.
  • the processor 110 may further integrate an image pre-processing module.
  • the image pre-processing module can compress and crop the captured face image, so that the ROI detection module and skin feature detection module can perform subsequent processing.
  • in some embodiments, the processor 110 may also display the detection report (including a face image on which the detection result of each feature is marked in the corresponding area, such as blackheads marked in the nose area, wrinkles marked in the forehead area, and stains marked in the cheek area, as well as the score of each detection item) on the display screen 194 for the user to view, improving the user experience.
  • an embodiment of the present application provides a color spot detection method for detecting the color spot of a human face.
  • the skin surface spots and subcutaneous spots in the face image can be detected based on the Lab color model, so that the skin surface spots and subcutaneous spots can be combined to determine the spot condition of the human face.
  • FIG. 3 is a schematic flowchart of a color spot detection method provided by an embodiment of the present application.
  • the method can be applied to the electronic device 100 and includes the following steps:
  • the display screen 194 of the electronic device 100 may display a photo preview interface in response to a user operation.
  • the display screen 194 displays a main interface including a camera icon.
  • the main interface is the user interface 400 shown in FIG. 4A.
  • the user interface 400 includes a camera icon 401.
  • the user interface 400 may further include a mail icon, a short message icon, a gallery icon, a WeChat icon, and the like.
  • the electronic device 100 may display a photo preview interface on the display screen 194 in response to a user operation on the camera icon 401.
  • the photo preview interface may be the user interface 410 shown in FIG. 4A.
  • the user interface 410 includes a preview area 411.
  • the preview area 411 displays an image collected by a camera of the electronic device 100.
  • when the electronic device 100 includes both a front camera and a rear camera, the image displayed in the photo preview interface is the image collected by the front camera.
  • the electronic device 100 may not include a camera, and the electronic device 100 may be connected to other electronic devices including a camera, and acquire images through the camera of an external device.
  • the electronic device 100 can detect the current shooting conditions by using the shooting condition detection function to guide the user to shoot under the required shooting conditions to ensure that the captured image meets the requirements.
  • when the shooting condition detection function determines that the current shooting conditions meet the shooting requirements, the user's face can be filled with light by the intelligent fill light function to meet the requirements of different facial skin feature detections. After the user's face is filled with light, the user's face can be photographed by the camera 193 to obtain a face image of the user's face. Then, the image quality detection function can be used to detect the quality of the face image to ensure that the captured image meets the requirements of different facial skin feature detections.
  • the ROI can be determined from the face image through the ROI detection function, and the ROI is the image to be detected.
  • the ROI detection function can obtain a ROI image to be detected after performing key point detection through face detection technology and facial feature point positioning technology.
  • for example, the ROI is the area of the cheeks on both sides and the nose, as shown by the dashed box in FIG. 4B.
  • after step S301 is performed and before step S302 is performed, a flare area in the image to be detected may be removed.
  • a possible implementation is to convert the image to be detected into a grayscale image and remove pixels whose pixel value in the grayscale image is greater than a fifth threshold.
  • because the flare area of the face is caused by reflection, the gray values of its pixels in the grayscale image are significantly larger than those of a normal skin area, and the flare area exists in flakes. Therefore, the flare area in the image to be detected can be removed by removing the pixels whose gray value in the grayscale image of the image to be detected is greater than the set threshold.
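The flare-removal step can be sketched as a grayscale conversion followed by thresholding. The luma weights below are the standard BT.601 grayscale conversion; the threshold value is an assumed stand-in for the "fifth threshold", which the application does not specify.

```python
import numpy as np

def flare_mask(rgb: np.ndarray, threshold: float = 230.0) -> np.ndarray:
    """Return a boolean mask that is True for pixels to keep, i.e. pixels
    whose grayscale value does not exceed the threshold. `rgb` is an
    H x W x 3 array with values in [0, 255]; 230.0 is an assumed threshold."""
    # Standard BT.601 luma weights for RGB -> grayscale conversion.
    gray = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    return gray <= threshold
```

Pixels where the mask is False (over-bright, flake-like flare regions) would then be excluded from subsequent spot detection.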
  • the pixel values in the image to be detected may also be normalized.
  • the normalized pixel values of the pixel points in the image to be detected may be in the range of [0,255].
  • the normalized pixel values of the pixel points in the image to be detected may also be in other ranges.
  • the embodiment of the present application does not specifically limit the range of the pixel values of the normalized pixel points.
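One common way to realize this normalization is a min-max mapping into [0, 255]; the application does not specify the exact mapping, so the sketch below is an assumption.

```python
import numpy as np

def normalize_to_0_255(img: np.ndarray) -> np.ndarray:
    """Min-max normalize pixel values into the range [0, 255] (one possible
    normalization; the application leaves the mapping unspecified)."""
    img = img.astype(np.float64)
    lo, hi = img.min(), img.max()
    if hi == lo:                       # flat image: map everything to 0
        return np.zeros(img.shape, dtype=np.uint8)
    scaled = (img - lo) * 255.0 / (hi - lo)
    return scaled.astype(np.uint8)
```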
  • the Lab color space is composed of three channels: L, a, and b.
  • L represents luminosity
  • a represents a range from magenta to green
  • b represents a range from yellow to blue.
  • Each pixel in the Lab image of the image to be detected has three parameters: L value, a value, and b value.
  • the L values of all pixels in the Lab image of the image to be detected constitute the L channel, where L(x, y) represents the L value of the pixel in the x-th row and y-th column.
  • the a values of all pixels in the Lab image of the image to be detected constitute the a channel, where a(x, y) represents the a value of the pixel in the x-th row and y-th column.
  • the b values of all pixels in the Lab image of the image to be detected constitute the b channel, where b(x, y) represents the b value of the pixel in the x-th row and y-th column.
  • W is the length of the Lab image
  • H is the width of the Lab image.
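The conversion to the Lab color space can be sketched with a standard sRGB-to-CIELAB transform. The patent does not specify which conversion the device uses, so the D65 white point and sRGB gamma below are conventional assumptions, standing in for library routines such as OpenCV's `cvtColor(..., COLOR_RGB2LAB)`.

```python
import numpy as np

def rgb_to_lab(rgb):
    """Convert an sRGB image (floats in [0, 1]) to CIELAB (D65 white point)."""
    # 1. undo the sRGB gamma to get linear RGB
    linear = np.where(rgb > 0.04045, ((rgb + 0.055) / 1.055) ** 2.4, rgb / 12.92)
    # 2. linear RGB -> XYZ (sRGB/D65 matrix)
    m = np.array([[0.4124564, 0.3575761, 0.1804375],
                  [0.2126729, 0.7151522, 0.0721750],
                  [0.0193339, 0.1191920, 0.9503041]])
    xyz = linear @ m.T
    # 3. normalize by the D65 reference white
    xyz = xyz / np.array([0.95047, 1.0, 1.08883])
    # 4. XYZ -> Lab
    f = np.where(xyz > 0.008856, np.cbrt(xyz), 7.787 * xyz + 16.0 / 116.0)
    L = 116.0 * f[..., 1] - 16.0
    a = 500.0 * (f[..., 0] - f[..., 1])
    b = 200.0 * (f[..., 1] - f[..., 2])
    return np.stack([L, a, b], axis=-1)

lab_white = rgb_to_lab(np.ones((1, 1, 3)))
print(lab_white[0, 0, 0])  # ~100 for pure white, with a and b ~0
```

Splitting the result along the last axis then yields the L, a, and b channels described above.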
  • S303 Extract the stain features in the Lab image to obtain a stain feature image, where the stain feature image includes a skin surface stain feature and a subcutaneous stain feature.
  • the speckle feature image can be shown in FIG. 6.
  • step S303 may be implemented through steps A1 to A3:
  • A1 Extract the detailed feature components of the L channel, a channel, and b channel in the Lab image.
  • the detailed characteristic components of the L channel, a channel, and b channel can be shown in FIG. 7.
  • the L channel, a channel, and b channel in the Lab image are processed to extract the detailed features of each channel, so as to obtain the detailed feature components of the L channel, a channel, and b channel.
  • bilateral filtering processing may be performed for the L channel, the a channel, and the b channel in the Lab image to obtain the detailed feature component of the L channel, the detailed feature component of the a channel, and the detailed feature component of the b channel.
  • the embodiments of this application do not specifically limit the manner of extracting the detail feature components of the L channel, a channel, and b channel in the Lab image.
  • A2 Determine an L-channel difference between the L channel and the extracted detail feature component of the L channel, an a-channel difference between the a channel and the extracted detail feature component of the a channel, and a b-channel difference between the b channel and the extracted detail feature component of the b channel.
  • the L-channel difference, a-channel difference, and b-channel difference can be shown in FIG. 8.
  • A3 Obtain the speckle feature image based on the L-channel difference, the a-channel difference, and the b-channel difference, where the L channel of the speckle feature image is the L-channel difference, the a channel of the speckle feature image is the a-channel difference, and the b channel of the speckle feature image is the b-channel difference.
  • the L-channel difference, the a-channel difference, and the b-channel difference are combined to obtain the speckle feature image, in which the pixel located in the x-th row and y-th column has an L value equal to L(x, y) − L′(x, y), an a value equal to a(x, y) − a′(x, y), and a b value equal to b(x, y) − b′(x, y).
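Steps A1 and A2 can be sketched on a single channel as follows. The patent names bilateral filtering but gives no parameters, so the window radius and the spatial/range sigmas below are illustrative assumptions; note that the text calls the bilateral output the "detail feature component" and then subtracts it from the channel, so the combined difference is the high-frequency residual.

```python
import numpy as np

def bilateral(channel, radius=2, sigma_s=2.0, sigma_r=10.0):
    """Naive bilateral filter of one Lab channel (parameters assumed)."""
    H, W = channel.shape
    pad = np.pad(channel, radius, mode='edge')
    out = np.empty_like(channel, dtype=np.float64)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))
    for i in range(H):
        for j in range(W):
            win = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # range weight: penalize pixels whose value differs from the center
            w = spatial * np.exp(-(win - channel[i, j])**2 / (2 * sigma_r**2))
            out[i, j] = (w * win).sum() / w.sum()
    return out

def channel_difference(channel):
    """Per-channel difference of step A2: channel minus its filtered version."""
    return channel - bilateral(channel)

# a perfectly flat channel carries no detail, so the difference is ~0
flat = np.full((6, 6), 50.0)
print(float(np.abs(channel_difference(flat)).max()))
```

Applying `channel_difference` to each of L, a, and b and stacking the results yields the speckle feature image of step A3.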
  • a stain area in the stain feature image may be determined first. Then, for each stain area, the b-channel mean of the pixels in that area may be determined. A stain area whose b-channel mean is greater than a first threshold is determined to be a skin surface stain, and a stain area whose b-channel mean is less than or equal to the first threshold is a subcutaneous stain.
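The surface-versus-subcutaneous split above can be sketched as follows, given the b channel and per-region boolean masks; the value of the "first threshold" is an assumed placeholder.

```python
import numpy as np

def classify_spots(b_channel, region_masks, first_threshold):
    """Split detected stain regions into skin surface vs subcutaneous stains
    by the mean b value of the pixels inside each region."""
    surface, subcutaneous = [], []
    for mask in region_masks:
        if b_channel[mask].mean() > first_threshold:
            surface.append(mask)
        else:
            subcutaneous.append(mask)
    return surface, subcutaneous

b = np.array([[5.0, 5.0, -2.0],
              [5.0, 0.0, -2.0]])
m1 = np.zeros_like(b, dtype=bool); m1[:, 0] = True   # mean b = 5  -> surface
m2 = np.zeros_like(b, dtype=bool); m2[:, 2] = True   # mean b = -2 -> subcutaneous
surface, subcutaneous = classify_spots(b, [m1, m2], first_threshold=0.0)
print(len(surface), len(subcutaneous))  # 1 1
```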
  • when marking the stains on the face image, skin surface stains and subcutaneous stains can be displayed in different ways. For example, skin surface stains are marked with a solid-line frame and subcutaneous stains with a dotted-line frame, as shown in Figure 9; alternatively, skin surface stains are marked in red and subcutaneous stains in blue, and so on.
  • by extracting the stain features of the image to be detected in the Lab color space, not only the skin surface stains but also the subcutaneous stains in the image to be detected can be detected.
  • both skin surface stains and subcutaneous stains are then used to determine the stain condition of the face. Therefore, compared with the prior art, which can only assess stains based on detected skin surface stains, this helps to improve the accuracy of stain detection.
  • a detection frame of size n × n is set, and the detection frame is moved within the stain feature image with a preset step, where n is a positive integer not greater than W and not greater than H.
  • the detection frame starts to slide from the upper left corner of the speckle feature image to the lower right with a preset step.
  • at each position, the pixel value mean and pixel value variance of the pixels in the detection frame are calculated, and the first pixels are determined as those whose pixel value satisfies: r1 < (a1 − T1 × b1), where:
  • r1 is the pixel value of the first pixel point
  • a1 is the average pixel value of the pixel points in the detection frame
  • the T1 is a preset value
  • b1 is the pixel value variance of the pixel points in the detection frame.
  • similarly, the second pixels in the stain feature image are determined as those whose pixel value satisfies: r2 < (a2 − T2 × b2), where r2 is the pixel value of the second pixel point
  • a2 is the average pixel value of the pixel point in the color spot characteristic image
  • T2 is a preset value
  • b2 is the pixel value variance of the pixels in the stain feature image.
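The combined local and global pixel test can be sketched as follows. Assumptions: b1/b2 are taken literally as the variance (the source says "variance", though a standard deviation may be intended), T1 = T2 = 0.5 are placeholder values, and a pixel is flagged if it satisfies the local test at any frame position.

```python
import numpy as np

def detect_spot_pixels(img, n=3, T1=0.5, T2=0.5):
    """Flag stain pixels with the two tests from the text:
    locally, r1 < a1 - T1*b1 inside an n x n sliding frame, and
    globally, r2 < a2 - T2*b2 over the whole feature image."""
    H, W = img.shape
    local = np.zeros((H, W), dtype=bool)
    for i in range(0, H - n + 1):
        for j in range(0, W - n + 1):
            win = img[i:i + n, j:j + n]
            a1, b1 = win.mean(), win.var()
            local[i:i + n, j:j + n] |= win < (a1 - T1 * b1)
    a2, b2 = img.mean(), img.var()
    global_mask = img < (a2 - T2 * b2)
    return local | global_mask

img = np.ones((5, 5))
img[2, 2] = 0.0                     # one dark (stain-like) pixel
mask = detect_spot_pixels(img)
print(int(mask.sum()), bool(mask[2, 2]))  # 1 True
```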
  • the first pixels and the second pixels are determined as stain pixels; a dilation operation is performed on the stain pixels, and an erosion operation is then performed on the dilated result, to obtain the stain areas.
  • after the stain areas are determined, one or more of the following three types of stain areas may be removed: stain areas with an area smaller than a second threshold, stain areas with an area greater than a third threshold, and stain areas whose ratio of area to perimeter is smaller than a fourth threshold, where the second threshold is smaller than the third threshold.
  • the area of a stain on a human face generally falls within a certain range, and stains are generally roughly circular in shape, so removing stain areas that are too small, too large, or of low circularity improves the accuracy of stain detection.
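The morphology and region-filtering steps above can be sketched as follows. The 3×3 structuring element and all thresholds (t2, t3, t4) are assumed placeholders; the source only names second/third/fourth thresholds.

```python
import numpy as np

def dilate(mask):
    """3x3 binary dilation."""
    H, W = mask.shape
    p = np.pad(mask, 1)
    out = np.zeros_like(mask)
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            out |= p[dy:dy + H, dx:dx + W]
    return out

def erode(mask):
    """3x3 binary erosion (border padded True for simplicity)."""
    H, W = mask.shape
    p = np.pad(mask, 1, constant_values=True)
    out = np.ones_like(mask)
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            out &= p[dy:dy + H, dx:dx + W]
    return out

def close_spots(mask):
    """Dilation followed by erosion, merging nearby stain pixels into regions."""
    return erode(dilate(mask))

def keep_region(area, perimeter, t2=3, t3=200, t4=0.5):
    """Drop regions too small, too large, or with a low area/perimeter ratio."""
    return t2 <= area <= t3 and perimeter > 0 and area / perimeter >= t4

spots = np.zeros((5, 5), dtype=bool)
spots[2, 1] = spots[2, 3] = True          # two stain pixels with a gap
closed = close_spots(spots)
print(bool(closed[2, 2]), bool(closed[0, 0]))        # True False
print(keep_region(25, 20), keep_region(1, 4))        # True False
```

Connected-component labeling of `closed` (for example with `scipy.ndimage.label`) would then give the per-region areas and perimeters that `keep_region` filters on.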
  • the skin surface stains and the subcutaneous stains can be scored separately, and the overall stain score is then determined by combining the skin surface stain score and the subcutaneous stain score.
  • a comprehensive stain score may be displayed on a user interface presented by the display screen 194, for example, as shown in FIG. 10.
  • the score of the skin surface stains and the score of the subcutaneous stains may also be displayed at the same time, for example, as shown in FIG. 11.
  • in response to a user operation, the electronic device 100 may display the score of the skin surface stains and the score of the subcutaneous stains on the user interface presented on the display screen 194, and may also display the level corresponding to the comprehensive stain score, such as average, good, or excellent.
  • for example, when the electronic device 100 detects that the user selects the display area showing the comprehensive stain score in the user interface presented by the display screen 194, it triggers the display screen 194 to display the score of the skin surface stains, the score of the subcutaneous stains, and the level corresponding to the comprehensive score, as shown in Figure 11.
  • the score of the skin surface stains can be quantified by determining a first feature set, where the first feature set includes one or more of the following features: a uniformity value used to characterize the pigment uniformity of the stain feature image, the number of skin surface stains, the area of the skin surface stains, and a contrast value of the skin surface stains used to characterize their color contrast. The score of the skin surface stains is then quantified based on the first feature set.
  • the contrast value of a skin surface stain can be determined as follows: determine the first pixel value mean of the pixels within each skin surface stain and the second pixel value mean of the pixels in the stain feature image, then determine the ratio of the first pixel value mean to the second pixel value mean to obtain the contrast value of the skin surface stain.
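The contrast-value computation just described reduces to a ratio of two means:

```python
import numpy as np

def contrast_value(spot_mask, feature_img):
    """Contrast value of one stain: mean pixel value inside the stain
    (first pixel value mean) divided by the mean pixel value of the
    whole stain feature image (second pixel value mean)."""
    return feature_img[spot_mask].mean() / feature_img.mean()

feature = np.array([[2.0, 2.0],
                    [2.0, 10.0]])
mask = np.array([[False, False],
                 [False, True]])
print(contrast_value(mask, feature))  # 2.5
```

The subcutaneous contrast value described later uses the same ratio, with the third pixel value mean (inside a subcutaneous stain) in the numerator.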
  • the uniformity value can be determined as follows: divide the stain feature image into several mutually overlapping rectangular areas, determine the standard deviation of the pixel values within each rectangular area, and take the mean of the standard deviations over all rectangular areas to obtain the uniformity value.
  • two adjacent rectangular areas may overlap by, for example, 50% or 30%. Of course, the overlap between adjacent rectangular areas can also take other values; this embodiment of the present application does not specifically limit it.
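The uniformity value can be sketched as below; the rectangle (block) size is an assumed parameter, and the 50% overlap is one of the options the text mentions.

```python
import numpy as np

def uniformity_value(feature_img, block=4, overlap=0.5):
    """Mean of the per-rectangle pixel standard deviations over a grid of
    rectangles that overlap by `overlap` of the block size."""
    step = max(1, int(block * (1 - overlap)))
    stds = []
    for i in range(0, feature_img.shape[0] - block + 1, step):
        for j in range(0, feature_img.shape[1] - block + 1, step):
            stds.append(feature_img[i:i + block, j:j + block].std())
    return float(np.mean(stds))

flat = np.full((8, 8), 7.0)
print(uniformity_value(flat))  # 0.0 (perfectly uniform pigment)
```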
  • the score of the skin surface pigmentation can be determined by the following formula:
  • H 1 is the score of the skin surface stain
  • the A is the uniform value
  • the B 1 is the number of the skin surface stains
  • the C 1 is the sum of the contrast values of all the skin surface stains
  • the D 1 is the sum of the areas of all the skin surface stains
  • the E is the area of the characteristic image of the color spot
  • the w 1 , the w 2 , the w 3 are preset parameters.
  • the score of the subcutaneous stains can be quantified as follows: a second feature set is determined, where the second feature set includes at least one of the following features: the uniformity value, the number of subcutaneous stains, the area of the subcutaneous stains, and a contrast value of the subcutaneous stains used to characterize their color contrast. The score of the subcutaneous stains is then quantified based on the second feature set.
  • the contrast value of a subcutaneous stain can be determined by determining the third pixel value mean of the pixels within each subcutaneous stain and the second pixel value mean, then taking the ratio of the third pixel value mean to the second pixel value mean to obtain the contrast value of the subcutaneous stain.
  • the subcutaneous stain score can be determined by the following formula:
  • H 2 is the score of the subcutaneous pigmentation
  • B 2 is the number of the subcutaneous pigmentation
  • C 2 is the sum of the contrast values of all the subcutaneous pigmentation
  • D 2 is the sum of the areas of all the subcutaneous stains
  • w 3 and w 4 are preset parameters.
  • the comprehensive score of the stain can be determined by the following formula:
  • H = y 1 × H 1 + y 2 × H 2
  • H is the comprehensive score
  • y 1 and y 2 are preset parameters.
  • a stain detection result report may be provided to the user.
  • the stain detection result report may include, but is not limited to, the comprehensive stain score, the subcutaneous stain score, the skin surface stain score, skincare recommendations, a results graph, and so on.
  • the display screen 194 can display the user's face picture on a presented user interface, with the skin surface stains and the subcutaneous stains marked on the face image in different display modes.
  • the color spot detection result report may be as shown in the user interface 1200 shown in FIG. 12.
  • the user interface 1200 may be triggered when the electronic device 100 detects that the user clicks the virtual button 204 shown in FIG. 2.
  • alternatively, the electronic device 100 may detect that the user clicks the display area in the display screen 194 where “90 spots” is located, to trigger the display of the user interface 1200.
  • the user interface 1200 can also trigger presentation in other ways. The embodiment of the present application does not specifically limit the manner of triggering the presentation of the user interface 1200 here.
  • the method provided by the embodiment of the present application is described from the perspective of the electronic device as the execution subject.
  • the electronic device may include a hardware structure and / or a software module, and implement the foregoing functions in the form of a hardware structure, a software module, or a hardware structure and a software module. Whether one of the above functions is executed by a hardware structure, a software module, or a hardware structure plus a software module depends on the specific application of the technical solution and design constraints.
  • FIG. 13 illustrates an electronic device 1300 provided by the present application.
  • the electronic device 1300 includes at least one processor 1310 and memory 1320, and may further include a display screen 1330 and a camera 1340.
  • the processor 1310 is coupled to the memory 1320, the display screen 1330, and the camera 1340.
  • the coupling in the embodiments of the present application is an indirect coupling or communication connection between devices, units, or modules, which may be electrical, mechanical, or other forms. Used for information exchange between devices, units or modules.
  • the memory 1320 is configured to store program instructions.
  • the display screen 1330 is used to display a photo preview interface, and the photo preview interface includes an image collected by the camera 1340.
  • the display screen 1330 can also be used to display the user interfaces involved in the above embodiments, such as the user interface shown in FIG. 2, the interface shown in FIG. 4A, the user interface shown in FIG. 10, the user interface shown in FIG. 11, The user interface shown in FIG. 12 and the like.
  • the processor 1310 is configured to call program instructions stored in the memory 1320, so that the electronic device 1300 executes steps in the stain detection method shown in FIG. 3.
  • the electronic device 1300 may be used to implement the color spot detection method shown in FIG. 3 in the embodiment of the present application.
  • Computer-readable media includes computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another.
  • a storage media may be any available media that can be accessed by a computer.
  • computer-readable media may include RAM, ROM, electrically erasable programmable read-only memory (EEPROM), compact disc read-only memory (CD-ROM) or other optical disk storage, magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium.
  • disk and disc, as used here, include compact discs (CDs), laser discs, optical discs, digital video discs (DVDs), floppy disks, and Blu-ray discs, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the protection scope of computer-readable media.


Abstract

A stain detection method and an electronic device, used to solve the problem in the prior art that stain detection results are inaccurate. The method includes: obtaining an image to be detected, and converting the image to be detected into the Lab color space to obtain a Lab image; extracting stain features in the Lab image to obtain a stain feature image, where the stain feature image includes skin surface stain features and subcutaneous stain features; and determining skin surface stains and subcutaneous stains in the stain feature image.

Description

A stain detection method and electronic device
This application claims priority to Chinese Patent Application No. 201810776283.7, filed with the Chinese Patent Office on July 16, 2018 and entitled "Method and Apparatus for Detecting Stains", which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of image processing technologies, and in particular, to a stain detection method and an electronic device.
背景技术
面部色斑的严重程度可以直接反映人们肌肤年龄、皮肤健康状况,也是人们选择化妆品、护肤品的重要因素。
目前,可以通过移动终端上的应用程序来分析人脸照片中存在的色斑问题,在图中标出色斑轮廓或示意位置,给出色斑严重程度的分级。这些应用程序通常针对人脸照片中的色斑检测区域,利用滤波后的色斑检测区域与灰度化的色斑检测区域分割图做差值得到色斑区域,利用色斑面积占比得到色斑得分。
但是这些应用程序所采用的检测方法仅可以检测出皮表色斑,而不能检测出皮下色斑,色斑检测结果不准确,从而导致色斑严重程度的量化分级结果不准确。
Summary
Embodiments of this application provide a stain detection method and an electronic device, to solve the problem in the prior art that stain detection results are inaccurate.
According to a first aspect, an embodiment of this application provides a stain detection method that can be applied to an electronic device. The method includes: obtaining an image to be detected, and converting the image to be detected into the Lab color space to obtain a Lab image; extracting stain features in the Lab image to obtain a stain feature image, where the stain feature image includes skin surface stain features and subcutaneous stain features; and determining skin surface stains and subcutaneous stains in the stain feature image. In the embodiments of this application, by extracting the stain features of the image to be detected in the Lab color space, not only the skin surface stains but also the subcutaneous stains in the image to be detected can be detected, so that the stain condition of the face can be determined by combining both. Compared with the prior art, which can only assess stains based on detected skin surface stains, this helps to improve the accuracy of stain detection.
在一种可能的设计中,在提取所述Lab图像中的色斑特征,得到色斑特征图像时,可以分别提取所述Lab图像中L通道、a通道、b通道的细节特征分量,并确定所述L通道与提取的所述L通道的细节特征分量之间的L通道差值,以及所述a通道与提取的所述a通道的细节特征分量之间的a通道差值,所述b通道与提取的所述b通道的细节特征分量之间的b通道差值。基于所述L通道差值、所述a通道差值、所述b通道差值得到所述色斑特征图像,其中,所述色斑特征图像中L通道为所述L通道差值,所述色斑特征图像中a通道为所述a通道差值,所述色斑特征图像中b通道为所述b通道差值。通过上述设计中可以较为准确的提取所述Lab图像中的色斑特征,从而提高色斑检测的准确性。
在一种可能的设计中,在分别提取所述Lab图像中L通道、a通道、b通道的细节特 征分量时,分别针对所述Lab图像中L通道、a通道、b通道进行双边滤波处理,得到L通道的细节特征分量、a通道的细节特征分量、b通道的细节特征分量。上述设计中,通过双边滤波处理可以过滤掉L通道、a通道、b通道的平滑特征,从而可以得到L通道、a通道、b通道的细节特征。
在一种可能的设计中,在确定所述色斑特征图像中的皮表色斑和皮下色斑时,可以确定所述色斑特征图像中的色斑区域。并针对每个所述色斑区域,确定所述每个色斑区域中各个像素点的b通道均值。确定b通道均值大于第一阈值的色斑区域为皮表色斑,b通道均值小于或等于所述第一阈值的色斑区域为皮下色斑。由于色斑像素点在Lab空间中的b通道的像素值与正常皮肤像素点在b通道的像素值有较大区分,因此通过上述设计可以比较准确的检测出皮表色斑和皮下色斑。
在一种可能的设计中,在确定所述色斑特征图像中的色斑区域时,可以确定检测框内的第一像素点,其中,所述检测框的长度小于所述色斑特征图像的长度,且所述检测框的宽度小于所述色斑特征图像的宽度,所述检测框以预设步长在所述色斑特征图像内移动;所述第一像素点的像素值满足如下公式:
r1<(a1-T1×b1);
其中,r1为所述第一像素点的像素值,a1为所述检测框内像素点的像素值均值,所述T1为预设值,所述b1为所述检测框内像素点的像素值方差。
确定所述色斑特征图像内的第二像素点,所述第二像素点的像素值满足如下公式:
r2<(a2-T2×b2);
其中,r2为所述第二像素点的像素值,a2为所述色斑特征图像内像素点的像素值均值,所述T2为预设值,所述b2为所述色斑特征图像内像素点的像素值方差。
将所述第一像素点以及所述第二像素点确定为色斑点;将所述色斑点进行膨胀操作,并将经过膨胀操作的色斑点进行腐蚀操作,得到色斑区域。上述设计中,通过结合局部检测和全局检测,可以较为准确的检测出色素点,从而可以提高色斑检测的准确性。
在一种可能的设计中,在确定所述色斑特征图像中的色斑区域之后,可以去除面积小于第二阈值和/或面积大于第三阈值的色斑区域,所述第二阈值小于所述第三阈值;和/或,去除面积与周长的比值小于第四阈值的色斑区域。由于人脸中色斑的面积一般在一定范围内,并且,色斑的形状一般近似圆形,因此上述设计中,通过去除色斑区域中面积过小、面积过大、圆形度较低的区域,可以提高色斑检测的准确性。
在一种可能的设计中,还可以确定第一特征集,并基于所述第一特征集量化所述皮表色斑的得分,所述第一特征集包括如下特征中的至少一个:均匀值、所述皮表色斑的数量、所述皮表色斑的色斑面积、所述皮表色斑的对比值,所述均匀值用于表征所述色斑特征图像的色素均匀性,所述皮表色斑的对比值用于表征所述皮表色斑的颜色对比度。确定第二特征集,并基于所述第二特征集量化所述皮下色斑的得分,所述第二特征集包括如下特征中的至少一个:所述均匀值、所述皮下色斑的数量、所述皮下色斑的色斑面积、所述皮下色斑的对比值,所述皮下色斑的对比值用于表征所述皮下色斑的颜色对比度。基于所述皮表色斑的得分以及所述皮下色斑的得分确定色斑检测的综合得分。显示所述综合得分,或者,显示所述皮表色斑的得分、所述皮下色斑的得分以及所述综合得分。上述设计中,通过结合皮表色斑的得分以及皮下色斑的得分得到的色斑得分具有较好的准确性。
在一种可能的设计中,可以通过如下公式确定所述皮表色斑的得分:
Figure PCTCN2018106236-appb-000001
其中,所述H 1为所述皮表色斑的得分,所述A为所述均匀值,所述B 1为所述皮表色斑的数量,所述C 1为所有所述皮表色斑的对比值之和,所述D 1为所有所述皮表色斑的面积之和,所述E为所述色斑特征图像的面积,所述w 1、所述w 2、所述w 3均为预设参数;
可以通过如下公式确定所述皮下色斑得分:
Figure PCTCN2018106236-appb-000002
其中,所述H 2为所述皮下色斑的得分,所述B 2为所述皮下色斑的数量,所述C 2为所有所述皮下色斑的对比值之和,所述D 2为所有所述皮下色斑的面积之和,所述w 3、所述w 4均为预设参数;
可以通过如下公式确定所述综合得分:
H=y 1×H 1+y 2×H 1
其中,所述H为所述综合得分,所述y 1、所述y 2均为预设参数。上述设计中,通过综合皮肤的均匀性、色斑的对比值、色斑的数量和面积可以得到一个比较准确的得分。
在一种可能的设计中,所述第一特征集、所述第二特征集中的所述均匀值通过以下方式确定:将所述色斑特征图像划分成若干个相互重叠的矩形区域,并确定每个所述矩形区域内像素点的像素值标准差。确定所有所述矩形区域的像素值标准差的均值,得到所述均匀值。上述设计中,通过分别计算各个矩形区域的像素值标准差,然后计算所有矩形区域的像素值标准差均值,从而可以较为准确的确定皮肤色素的均匀性。
在一种可能的设计中,所述第一特征集中所述皮表色斑的对比值可以通过以下方式确定:确定每个所述皮表色斑内像素点的第一像素值均值,以及所述色斑特征图像内像素点的第二像素值均值。确定所述第一像素值均值与所述第二像素值均值的比值,得到所述皮表色斑的对比值。所述第二特征集中所述皮下色斑的对比值可以通过以下方式确定:确定每个所述皮下色斑内像素点的第三像素值均值,以及所述第二像素值均值。确定所述第三像素值均值与所述第二像素值均值的比值,得到所述皮下色斑的对比值。上述设计中,通过分别将皮表色斑的颜色深度、皮下色斑的颜色深度分别与正常皮肤的颜色深度对比,可以较为准确的得到皮表色斑的对比值以及皮下色斑的对比值。
在一种可能的设计中,将所述待检测图像转换到Lab颜色空间之前,可以将所述待检测图像转换为灰度图像。去除所述灰度图像中像素值大于第五阈值的像素点。由于反光造成的人脸光斑区域在灰度图像中的像素灰度值明显大于正常皮肤区域的像素灰度值,因此上述设计可以去除的光斑区域,从而可以减少光斑区域对色斑检测造成的影响,进而可以提高色斑检测的准确性。
第二方面,本申请实施例还提供了一种电子设备,包括:存储器,用于存储计算机程序。处理模块,用于调用所述存储模块存储的计算机程序,执行:获取待检测图像;将所述待检测图像转换到Lab颜色空间,得到Lab图像;提取所述Lab图像中的色斑特征,得到色斑特征图像,所述色斑特征图像包括皮表色斑特征以及皮下色斑特征;确定所述色斑特征图像中的皮表色斑和皮下色斑。
在一种可能的设计中,所述处理模块,在提取所述Lab图像中的色斑特征,得到色斑 特征图像时,具体用于:分别提取所述Lab图像中L通道、a通道、b通道的细节特征分量;确定所述L通道与提取的所述L通道的细节特征分量之间的L通道差值,以及所述a通道与提取的所述a通道的细节特征分量之间的a通道差值,所述b通道与提取的所述b通道的细节特征分量之间的b通道差值;基于所述L通道差值、所述a通道差值、所述b通道差值得到所述色斑特征图像,其中,所述色斑特征图像中L通道为所述L通道差值,所述色斑特征图像中a通道为所述a通道差值,所述色斑特征图像中b通道为所述b通道差值。
在一种可能的设计中,所述处理模块,在分别提取所述Lab图像中L通道、a通道、b通道的细节特征分量时,具体用于:分别针对所述Lab图像中L通道、a通道、b通道进行双边滤波处理,得到L通道的细节特征分量、a通道的细节特征分量、b通道的细节特征分量。
在一种可能的设计中,所述处理模块,在确定所述色斑特征图像中的皮表色斑和皮下色斑时,具体用于:确定所述色斑特征图像中的色斑区域;针对每个所述色斑区域,确定所述每个色斑区域中各个像素点的b通道均值;确定b通道均值大于第一阈值的色斑区域为皮表色斑,b通道均值小于或等于所述第一阈值的色斑区域为皮下色斑。
在一种可能的设计中,所述处理模块,在确定所述色斑特征图像中的色斑区域时,具体用于:确定检测框内的第一像素点,其中,所述检测框的长度小于所述色斑特征图像的长度,且所述检测框的宽度小于所述色斑特征图像的宽度,所述检测框以预设步长在所述色斑特征图像内移动;所述第一像素点的像素值满足如下公式:
r1<(a1-T1×b1);
其中,r1为所述第一像素点的像素值,a1为所述检测框内像素点的像素值均值,所述T1为预设值,所述b1为所述检测框内像素点的像素值方差。
确定所述色斑特征图像内的第二像素点,所述第二像素点的像素值满足如下公式:
r2<(a2-T2×b2);
其中,r2为所述第二像素点的像素值,a2为所述色斑特征图像内像素点的像素值均值,所述T2为预设值,所述b2为所述色斑特征图像内像素点的像素值方差。
将所述第一像素点以及所述第二像素点确定为色斑点。将所述色斑点进行膨胀操作,并将经过膨胀操作的色斑点进行腐蚀操作,得到色斑区域。
在一种可能的设计中,所述处理模块,还用于;在确定所述色斑特征图像中的色斑区域之后,去除面积小于第二阈值和/或面积大于第三阈值的色斑区域,所述第二阈值小于所述第三阈值;和/或去除面积与周长的比值小于第四阈值的色斑区域。
在一种可能的设计中,所述处理模块,还用于:确定第一特征集,并基于所述第一特征集量化所述皮表色斑的得分,所述第一特征集包括如下特征中的至少一个:均匀值、所述皮表色斑的数量、所述皮表色斑的色斑面积、所述皮表色斑的对比值,所述均匀值用于表征所述色斑特征图像的色素均匀性,所述皮表色斑的对比值用于表征所述皮表色斑的颜色对比度。确定第二特征集,并基于所述第二特征集量化所述皮下色斑的得分,所述第二特征集包括如下特征中的至少一个:所述均匀值、所述皮下色斑的数量、所述皮下色斑的色斑面积、所述皮下色斑的对比值,所述皮下色斑的对比值用于表征所述皮下色斑的颜色对比度。基于所述皮表色斑的得分以及所述皮下色斑的得分确定色斑检测的综合得分。显示所述综合得分,或者,显示所述皮表色斑的得分、所述皮下色斑的得分以及所述综合得 分。
在一种可能的设计中,所述处理模块通过如下公式确定所述皮表色斑的得分:
Figure PCTCN2018106236-appb-000003
其中,所述H 1为所述皮表色斑的得分,所述A为所述均匀值,所述B 1为所述皮表色斑的数量,所述C 1为所有所述皮表色斑的对比值之和,所述D 1为所有所述皮表色斑的面积之和,所述E为所述色斑特征图像的面积,所述w 1、所述w 2、所述w 3均为预设参数。
所述处理模块通过如下公式确定所述皮下色斑得分:
Figure PCTCN2018106236-appb-000004
其中,所述H 2为所述皮下色斑的得分,所述B 2为所述皮下色斑的数量,所述C 2为所有所述皮下色斑的对比值之和,所述D 2为所有所述皮下色斑的面积之和,所述w 3、所述w 4均为预设参数。
所述处理模块通过如下公式确定所述综合得分:
H=y 1×H 1+y 2×H 1
其中,所述H为所述综合得分,所述y 1、所述y 2均为预设参数。
在一种可能的设计中,所述处理模块,还用于通过以下方式确定所述第一特征集、所述第二特征集中的所述均匀值:将所述色斑特征图像划分成若干个相互重叠的矩形区域;确定每个所述矩形区域内像素点的像素值标准差;确定所有所述矩形区域的像素值标准差的均值,得到所述均匀值。
在一种可能的设计中,所述处理模块,还用于通过以下方式确定所述第一特征集中所述皮表色斑的对比值:确定每个所述皮表色斑内像素点的第一像素值均值,以及所述色斑特征图像内像素点的第二像素值均值;确定所述第一像素值均值与所述第二像素值均值的比值,得到所述皮表色斑的对比值;所述处理模块,还用于通过以下方式确定所述第二特征集中所述皮下色斑的对比值:确定每个所述皮下色斑内像素点的第三像素值均值,以及所述第二像素值均值;确定所述第三像素值均值与所述第二像素值均值的比值,得到所述皮下色斑的对比值。
在一种可能的设计中,所述处理模块,还用于在将所述待检测图像转换到Lab颜色空间之前,将所述待检测图像转换为灰度图像;去除所述灰度图像中像素值大于第五阈值的像素点。
第三方面,本申请实施例提供的一种计算机存储介质,该计算机存储介质存储有程序指令,当程序指令在电子设备上运行时,使得电子设备执行本申请实施例第一方面及其任一可能的设计的方法。
第四方面,本申请实施例提供的一种计算机程序产品,当计算机程序产品在电子设备上运行时,使得电子设备本申请实施例第一方面及其任一可能的设计的方法。
第五方面,本申请实施例提供的一种芯片,所述芯片与电子设备中的存储器耦合,执行本申请实施例第一方面及其任一可能的设计的方法。
另外,第二方面至第五方面所带来的技术效果可参见上述第一方面的描述,此处不再赘述。
It should be noted that, in the embodiments of this application, "coupling" means that two components are directly or indirectly combined with each other.
Brief Description of the Drawings
FIG. 1 is a schematic structural diagram of an electronic device according to an embodiment of this application;
FIG. 2 is a schematic diagram of a user interface according to an embodiment of this application;
FIG. 3 is a schematic flowchart of a stain detection method according to an embodiment of this application;
FIG. 4A is a schematic diagram of a preview user interface according to an embodiment of this application;
FIG. 4B is a schematic flowchart of a stain detection method according to an embodiment of this application;
FIG. 5 is a schematic diagram of the L channel, a channel, and b channel of a Lab image according to an embodiment of this application;
FIG. 6 is a schematic diagram of a stain feature image according to an embodiment of this application;
FIG. 7 is a schematic diagram of the detail feature components of the L channel, a channel, and b channel according to an embodiment of this application;
FIG. 8 is a schematic diagram of the L-channel difference, a-channel difference, and b-channel difference according to an embodiment of this application;
FIG. 9 is a schematic diagram of skin surface stains and subcutaneous stains according to an embodiment of this application;
FIG. 10 is a schematic diagram of a user interface displaying stain scores according to an embodiment of this application;
FIG. 11 is a schematic diagram of another user interface displaying stain scores according to an embodiment of this application;
FIG. 12 is a schematic diagram of a user interface displaying a stain detection result according to an embodiment of this application;
FIG. 13 is a schematic structural diagram of another electronic device according to an embodiment of this application.
Detailed Description
The embodiments disclosed in this application can be applied to an electronic device. In some embodiments of this application, the electronic device may be a portable electronic device that also contains functions such as a personal digital assistant and/or a music player, for example, a mobile phone, a tablet computer, a wearable device with wireless communication capability (such as a smart watch), or a vehicle-mounted device. Exemplary embodiments of the portable electronic device include, but are not limited to, portable electronic devices running
Figure PCTCN2018106236-appb-000005
or other operating systems. The portable electronic device may also be, for example, a laptop computer with a touch-sensitive surface (such as a touch panel). It should also be understood that, in some other embodiments of this application, the electronic device may instead be a desktop computer with a touch-sensitive surface (such as a touch panel).
FIG. 1 is a schematic structural diagram of an electronic device 100.
电子设备100可以包括处理器110、外部存储器接口120、内部存储器121、通用串行总线(universal serial bus,USB)接口130、充电管理模块140、电源管理模块141、电池142、天线2、无线通信模块160、音频模块170、扬声器170A、受话器170B、麦克风170C、耳机接口170D、传感器模块180、按键190、马达191、指示器192、摄像头193、以及显示屏194等。其中传感器模块180包括环境光传感器180L。此外,传感器模块180还可以包括压力传感器180A、陀螺仪传感器180B、气压传感器180C、磁传感器180D、加速度传感器180E、距离传感器180F、接近光传感器180G、指纹传感器180H、温度传感器180J、触摸传感器180K、骨传导传感器180M等。在另一些实施例中,本申请实施例中的电子设备100还可以包括天线1、移动通信模块150、以及用户标识模块(subscriber identification module,SIM)卡接口195等。
处理器110可以包括一个或多个处理单元。例如:处理器110可以包括应用处理器 (application processor,AP)、调制解调处理器、图形处理器(graphics processing unit,GPU)、图像信号处理器(image signal processor,ISP)、控制器、存储器、视频编解码器、数字信号处理器(digital signal processor,DSP)、基带处理器、和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
在一些实施例中,处理器110中还可以设置存储器,用于存储指令和数据。示例的,处理器110中的存储器可以为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了***的效率。
在另一些实施例中,处理器110还可以包括一个或多个接口。例如,接口可以为通用串行总线(universal serial bus,USB)接口130。又例如,接口还可以为集成电路(inter-integrated circuit,I2C)接口、集成电路内置音频(inter-integrated circuit sound,I2S)接口、脉冲编码调制(pulse code modulation,PCM)接口、通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口、移动产业处理器接口(mobile industry processor interface,MIPI)、通用输入输出(general-purpose input/output,GPIO)接口、用户标识模块(subscriber identity module,SIM)接口等。可以理解的是,本申请实施例可以通过接口连接电子设备100的不同模块,从而使得电子设备100能够实现不同的功能。例如拍照、处理等。需要说明的是,本申请实施例对电子设备100中接口的连接方式不作限定。
其中,USB接口130是符合USB标准规范的接口。例如,USB接口130可以包括Mini USB接口、Micro USB接口、USB Type C接口等。USB接口130可以用于连接充电器为电子设备100充电,也可以用于电子设备100与***设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备,例如AR设备等。
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。在一些有线充电的实施例中,充电管理模块140可以通过USB接口130接收有线充电器的充电输入。在一些无线充电的实施例中,充电管理模块140可以通过电子设备100的无线充电线圈接收无线充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为电子设备供电。
电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110、内部存储器121、外部存储器、显示屏194、摄像头193和无线通信模块160等供电。电源管理模块141还可以用于监测电池容量,电池循环次数,电池健康状态(漏电、阻抗)等参数。在其他一些实施例中,电源管理模块141也可以设置于处理器110中。在另一些实施例中,电源管理模块141和充电管理模块140也可以设置于同一个器件中。
电子设备100的无线通信功能可以通过天线1、天线2、移动通信模块150、无线通信模块160、调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器、开关、功率放大器、低噪声放大 器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波、放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A、受话器170B等)输出声音信号,或通过显示屏194显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块150或其他功能模块设置在同一个器件中。
无线通信模块160可以提供应用在电子设备100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络)、蓝牙(bluetooth,BT)、全球导航卫星***(global navigation satellite system,GNSS)、调频(frequency modulation,FM)、近距离无线通信技术(near field communication,NFC)、红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波信号,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,电子设备100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得电子设备100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯***(global system for mobile communications,GSM)、通用分组无线服务(general packet radio service,GPRS)、码分多址接入(code division multiple access,CDMA)、宽带码分多址(wideband code division multiple access,WCDMA)、时分码分多址(time-division code division multiple access,TD-SCDMA)、长期演进(long term evolution,LTE)、BT、GNSS、WLAN、NFC、FM、和/或IR技术等。所述GNSS可以包括全球卫星定位***(global positioning system,GPS)、全球导航卫星***(global navigation satellite system,GLONASS)、北斗卫星导航***(beidou navigation satellite system,BDS)、准天顶卫星***(quasi-zenith satellite system,QZSS)和/或星基增强***(satellite based augmentation systems,SBAS)。
电子设备100通过GPU、显示屏194、以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像、视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD)、有机发光二极管(organic light-emitting diode,OLED)、有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode的,AMOLED)、柔性发光二极管(flex light-emitting diode,FLED)、Miniled、MicroLed、Micro-oLed、量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施 例中,电子设备100可以包括1个或N个显示屏194,N为大于1的正整数。
电子设备100可以通过ISP、摄像头193、视频编解码器、GPU、显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点、亮度、肤色进行算法优化。ISP还可以对拍摄场景的曝光、色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备100可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备100可以支持一种或多种视频编解码器。这样,电子设备100可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1、MPEG2、MPEG3、MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备100的智能认知等应用,例如:图像识别、人脸识别、语音识别、文本理解等。
外部存储器接口120可以用于连接外部存储卡(例如,Micro SD卡),实现扩展电子设备100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐、视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行电子设备100的各种功能应用以及数据处理。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作***,至少一个功能所需的应用程序(比如声音播放功能、图像播放功能等)等。存储数据区可存储电子设备100使用过程中所创建的数据(比如音频数据、电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、通用闪存存储器(universal flash storage,UFS)等。
电子设备100可以通过音频模块170、扬声器170A、受话器170B、麦克风170C、耳机接口170D以及应用处理器等实现音频功能。例如音乐播放、录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。电子设备100可以通过扬声器170A收听音乐、或收听免提通话。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当电子设备100接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风170C发声,将声音信号输入到麦克风170C。电子设备100可以设置至少一个麦克风170C。在另一些实施例中,电子设备100可以设置两个麦克风170C,除了采集声音信号,还可以实现降噪功能。在另一些实施例中,电子设备100还可以设置三个、四个或更多麦克风170C,实现声音信号采集、降噪、还可以识别声音来源,实现定向录音功能等。
耳机接口170D用于连接有线耳机。耳机接口170D可以是USB接口130,也可以是3.5mm的开放移动电子设备平台(open mobile terminal platform,OMTP)标准接口、美国蜂窝电信工业协会(cellular telecommunications industry association of the USA,CTIA)标准接口等。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。压力传感器180A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。电容式压力传感器可以是包括至少两个具有导电材料的平行板。当有力作用于压力传感器180A,电极之间的电容改变。电子设备100根据电容的变化确定压力的强度。当有触摸操作作用于显示屏194,电子设备100根据压力传感器180A检测所述触摸操作强度。电子设备100也可以根据压力传感器180A的检测信号计算触摸的位置。在一些实施例中,作用于相同触摸位置,但不同触摸操作强度的触摸操作,可以对应不同的操作指令。例如:当有触摸操作强度小于第一压力阈值的触摸操作作用于短消息应用图标时,执行查看短消息的指令。当有触摸操作强度大于或等于第一压力阈值的触摸操作作用于短消息应用图标时,执行新建短消息的指令。
陀螺仪传感器180B可以用于确定电子设备100的运动姿态。在一些实施例中,可以通过陀螺仪传感器180B确定电子设备100围绕三个轴(即,x,y和z轴)的角速度。陀螺仪传感器180B可以用于拍摄防抖。示例性的,当按下快门,陀螺仪传感器180B检测电子设备100抖动的角度,根据角度计算出镜头模组需要补偿的距离,让镜头通过反向运动抵消电子设备100的抖动,实现防抖。陀螺仪传感器180B还可以用于导航,体感游戏场景。
气压传感器180C用于测量气压。在一些实施例中,电子设备100通过气压传感器180C测得的气压值计算海拔高度,辅助定位和导航。
磁传感器180D包括霍尔传感器。电子设备100可以利用磁传感器180D检测翻盖皮套的开合。在一些实施例中,当电子设备100是翻盖机时,电子设备100可以根据磁传感器180D检测翻盖的开合。进而根据检测到的皮套的开合状态或翻盖的开合状态,设置翻盖自动解锁等特性。
加速度传感器180E可检测电子设备100在各个方向上(一般为三轴)加速度的大小。当电子设备100静止时可检测出重力的大小及方向。还可以用于识别电子设备姿态,应用于横竖屏切换,计步器等应用。
距离传感器180F,用于测量距离。电子设备100可以通过红外或激光测量距离。在一些实施例中,拍摄场景,电子设备100可以利用距离传感器180F测距以实现快速对焦。
接近光传感器180G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。发光二极管可以是红外发光二极管。电子设备100通过发光二极管向外发射红外光。电子设备100使用光电二极管检测来自附近物体的红外反射光。当检测到充分的反射光时,可以确定电子设备100附近有物体。当检测到不充分的反射光时,电子设备100可以确定电子设备100附近没有物体。电子设备100可以利用接近光传感器180G检测用户手持电子设备100贴近耳朵通话,以便自动熄灭屏幕达到省电的目的。接近光传感器180G也可用于皮套模式,口袋模式自动解锁与锁屏。
环境光传感器180L用于感知环境光亮度。电子设备100可以根据感知的环境光亮度自适应调节显示屏194亮度。环境光传感器180L也可用于拍照时自动调节白平衡。环境光传感器180L还可以与接近光传感器180G配合,检测电子设备100是否在口袋里,以防误触。
指纹传感器180H用于采集指纹。电子设备100可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。
温度传感器180J用于检测温度。在一些实施例中,电子设备100利用温度传感器180J检测的温度,执行温度处理策略。例如,当温度传感器180J上报的温度超过阈值,电子设备100执行降低位于温度传感器180J附近的处理器的性能,以便降低功耗实施热保护。在另一些实施例中,当温度低于另一阈值时,电子设备100对电池142加热,以避免低温导致电子设备100异常关机。在其他一些实施例中,当温度低于又一阈值时,电子设备100对电池142的输出电压执行升压,以避免低温导致的异常关机。
触摸传感器180K,也称“触控面板”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于电子设备100的表面,与显示屏194所处的位置不同。
骨传导传感器180M可以获取振动信号。在一些实施例中,骨传导传感器180M可以获取人体声部振动骨块的振动信号。骨传导传感器180M也可以接触人体脉搏,接收血压跳动信号。在一些实施例中,骨传导传感器180M也可以设置于耳机中,结合成骨传导耳机。音频模块170可以基于所述骨传导传感器180M获取的声部振动骨块的振动信号,解析出语音信号,实现语音功能。应用处理器可以基于所述骨传导传感器180M获取的血压跳动信号解析心率信息,实现心率检测功能。
按键190可以包括开机键、音量键等。按键190可以是机械按键。也可以是触摸式按键。电子设备100可以接收按键输入,产生与电子设备100的用户设置以及功能控制有关的键信号输入。
马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如拍照、音频播放等)的触摸操作,可以对应不同的振动反馈效果。作用于显示屏194不同区域的触摸操作,马达191也可对应不同的振动反馈效果。不同的应用场景(例如:时间提醒、接收信息、闹钟、游戏等)也可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。
指示器192可以是指示灯,可以用于指示充电状态、电量变化,也可以用于指示消息、未接来电、通知等。
SIM卡接口195用于连接SIM卡。SIM卡可以通过***SIM卡接口195,或从SIM卡接口195拔出,实现和电子设备100的接触和分离。电子设备100可以支持1个或N个SIM卡接口,N为大于1的正整数。SIM卡接口195可以支持Nano SIM卡,Micro SIM卡,SIM卡等。同一个SIM卡接口195可以同时***多张卡。所述多张卡的类型可以相同,也可以不同。SIM卡接口195也可以兼容不同类型的SIM卡。SIM卡接口195也可以兼容外部存储卡。电子设备100通过SIM卡和网络交互,实现通话以及数据通信等功能。在一些实施例中,电子设备100采用eSIM,即:嵌入式SIM卡。eSIM卡可以嵌在电子设备100中,不能和电子设备100分离。
可以理解的是,本申请实施例示意的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件、软件或软件和硬件的组合实现。
下面以电子设备100为例对本申请实施例进行详细说明。
另外,应理解,本申请实施例中电子设备支持的应用程序可以包括拍照类的应用,例如相机。此外,电子设备支持的应用程序还可以包括其他多种应用,例如:绘图、游戏、电话、视频播放器、音乐播放器、照片管理、浏览器、日历、时钟等。
本申请实施例中的电子设备支持的应用又可以包括用于皮肤检测的应用。其中，用于皮肤检测的应用通过分析拍摄的人脸图像来检测用户面部皮肤的特征（例如面部皮肤的皱纹、毛孔、黑头、色斑、红区等），并可以为用户提供检测结果报告。例如，检测结果报告可以但不限于包括针对面部皮肤上各个特征的打分、对面部皮肤的综合分析等，还可以展示用户的人脸图片，并根据对各个特征的检测结果在人脸图像上分别标示出相应的问题，比如在鼻头区域标示有黑头，在额头区域标示有皱纹，在脸颊区域标示有色斑等等。可以理解的是，检测结果报告可以通过用户界面呈现给用户，例如，检测结果报告可以如图2所示的用户界面200，包括综合得分、肤龄、以及毛孔、黑头、细纹、色斑以及红区的得分。在另一些实施例中，用户界面200上还可以包括虚拟按钮201、虚拟按钮202、虚拟按钮203、虚拟按钮204和虚拟按钮205，其中以虚拟按钮201为例，电子设备100响应于对虚拟按钮201的操作，在显示屏194上显示针对毛孔的具体护理建议。虚拟按钮202、虚拟按钮203、虚拟按钮204和虚拟按钮205的功能可参见虚拟按钮201的功能，在此不再赘述。
为了使得电子设备对用户面部皮肤的检测更加准确,示例的,本申请实施例的用户皮肤检测方案,可以在处理器110中集成拍摄条件检测模块、图像质量检测模块、感兴趣区域(region of interest,ROI)检测模块、皮肤特征检测模块、结果分析模块等。在一些实施例中,可以在处理器110中的应用处理器上集成拍摄条件检测模块、图像质量检测模块、感兴趣区域(region of interest,ROI)检测模块、皮肤特征检测模块、结果分析模块等等。在另一些实施例中,在处理器110中集成人工智能(artificial intelligence,AI)芯片,在AI芯片上集成拍摄条件检测模块、图像质量检测模块、感兴趣区域(region of interest,ROI)检测模块、皮肤特征检测模块、结果分析模块等,来实现用户皮肤检测。
其中，拍摄条件检测模块可以实现对当前拍摄条件进行检测，以指导用户在要求的拍摄条件下进行拍摄，确保拍摄图像满足要求，从而保证基于图像对皮肤检测的准确性。例如，要求的拍摄条件包括：环境光照充足、人脸与电子设备之间的距离合适（例如25cm左右）、面部端正、睁眼闭眼、不佩戴眼镜、前额尽量无刘海遮挡、对焦准确、无明显抖动等。
当拍摄条件检测模块检测成功后,处理器110会启动智能补光。例如,当拍摄条件检测模块在当前拍摄条件满足要求时,确定检测成功。具体的,本申请实施例中电子设备可以采用不同的补光模式(例如闪光灯模式,手电筒模式)对用户的面部进行补光,以满足不同面部皮肤特征检测的要求。在对用户的面部补光后,处理器110就可以控制摄像头193对用户面部进行拍摄得到用户面部的人脸图像。
图像质量检测模块可以对人脸图像的质量进行检测,以确保拍摄的图像满足不同面部皮肤特征检测的要求。
ROI检测模块可以在图像质量检测模块检测到图像的质量满足要求后,从人脸图像中确定待检测的ROI,例如黑头的ROI是鼻头上的一小块区域。
皮肤特征检测模块可以分别对已经确定出的ROI中的面部皮肤特征进行检测,例如检测皮肤中的皱纹、毛孔、黑头、色斑、红区、出油程度等。
结果分析模块可以对皮肤特征检测模块检测得到的面部皮肤特征的检测结果进行分析,并针对各个皮肤特征给出各个检测项的打分、打分排序等。
另外,在一些实施例中,处理器110中还可以集成图像预处理模块。其中,图像预处理模块可以对拍摄到的人脸图像进行压缩、剪裁等,以便ROI检测模块、皮肤特征检测模块等进行后续处理。
为了输出人脸图像分析结果,或输出各个检测项的打分等,处理器110还可以将检测得到的检测报告(包含各个特征的检测结果在人脸图像上的区域,比如在鼻头区域标示有黑头,在额头区域标示有皱纹,在脸颊区域标示有色斑等等,各个检测项的打分等)显示在显示屏194上,供用户进行查看,提高用户体验。
其中,为了达到对用户面部皮肤特征检测更加精确的目的,本申请实施例提供了一种色斑检测方法,用以检测人脸的色斑情况。由于本申请实施例中,可以基于Lab色彩模型检测出人脸图像中的皮表色斑和皮下色斑,从而可以综合皮表色斑和皮下色斑来确定人脸的色斑情况,因而与现有技术中只能根据检测出的皮表色斑来检测色斑情况相比,有助于提高检测色斑的准确性。
如图3所示,为本申请实施例提供的色斑检测方法的流程示意图,该方法可以应用于电子设备100中,包括以下步骤:
S301,获取待检测图像。
其中,电子设备100的显示屏194可以响应于用户的操作,来显示拍照预览界面。显示屏194显示主界面,主界面包括相机图标。例如,主界面如图4A所示的用户界面400。用户界面400包括相机图标401。此外,用户界面400上还可以包括邮件图标、短消息图标、图库图标、微信图标等。示例的,电子设备100可以响应于用户对相机图标401的操作,在显示屏194显示拍照预览界面,在这种情况下,拍照预览界面可以为如图4A所示的用户界面410。用户界面410包括预览区域411。其中预览区域411中显示电子设备100的摄像头采集的图像。
需要说明的是，电子设备100在既包括前置摄像头又包括后置摄像头的情况下，若电子设备100启动前置摄像头，未启动后置摄像头，则拍照预览界面中显示的图像为前置摄像头采集的图像；若电子设备100启动后置摄像头，未启动前置摄像头，则拍照预览界面中显示的图像为后置摄像头采集的图像。此外，还需要说明的是，电子设备100上也可以不包括摄像头，电子设备100可以与其它包括摄像头的电子设备连接，通过外接设备的摄像头采集图像。
在显示屏194显示拍照预览界面时，电子设备100可以通过拍摄条件检测功能对当前拍摄条件进行检测，以指导用户在要求的拍摄条件下进行拍摄，确保拍摄图像满足要求。拍摄条件检测功能在当前拍摄条件满足拍摄要求的情况下，可以通过智能补光功能对用户的面部进行补光，以满足不同面部皮肤特征检测的要求。在对用户的面部补光后，可以通过摄像头193对用户面部进行拍摄得到用户面部的人脸图像。然后可以通过图像质量检测功能对人脸图像的质量进行检测，以确保拍摄的图像满足不同面部皮肤特征检测的要求。在确定人脸图像满足要求之后，可以通过ROI检测功能从人脸图像中确定ROI，该ROI即为待检测图像。其中，ROI检测功能可以通过人脸检测技术和人脸特征点定位技术进行关键点检测后，获取待检测的ROI图像。ROI为两侧脸颊及鼻子区域，如图4B所示的虚线框范围。
在执行步骤S301之后,在执行步骤S302之前,可以去除待检测图像中的光斑区域。
一种可能的实现方式为,将待检测图像转换为灰度图像。去除所述灰度图像中像素值大于第五阈值的像素点。由于反光造成的人脸光斑区域在灰度图像中的像素灰度值明显大于正常皮肤区域的像素灰度值,且成片状存在,因此通过去除待检测图像的灰度图像中像素值大于预设阈值的像素点的方式可以去除待检测图像中的光斑区域。
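上述通过灰度阈值去除光斑的思路可以用如下Python代码示意。其中输入假定为RGB三元组组成的二维列表，灰度化采用常见的BT.601加权公式，阈值230仅为示意性假设值，并非本申请限定的第五阈值：

```python
def to_gray(rgb_img):
    # 按常见的BT.601加权公式将RGB图像转换为灰度图像
    return [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row]
            for row in rgb_img]

def remove_highlight(rgb_img, threshold=230):
    # 去除灰度值大于阈值的像素点（以None标记被剔除的光斑像素）
    gray = to_gray(rgb_img)
    return [[None if gray[i][j] > threshold else rgb_img[i][j]
             for j in range(len(rgb_img[0]))]
            for i in range(len(rgb_img))]
```

实际实现中通常以掩膜标记被剔除的像素，而非将其置为None。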
可选的,在去除待检测图像中的光斑区域之后,还可以将待检测图像内的像素值进行归一化处理。示例性的,待检测图像内像素点经过归一化后像素值可以在[0,255]范围内。当然,待检测图像内像素点经过归一化后像素值也可以在其他范围内,本申请实施例在这里不对像素点经过归一化后像素值的范围进行具体限定。
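最小-最大线性拉伸是将像素值归一化到[0,255]的一种常见做法，以下Python代码仅为示意（针对单通道像素值列表，具体归一化方式并非本申请限定）：

```python
def normalize_to_255(values):
    # 将像素值线性拉伸到[0, 255]区间（最小-最大归一化）
    lo, hi = min(values), max(values)
    if hi == lo:
        # 所有像素值相同时的退化处理
        return [0.0 for _ in values]
    return [(v - lo) * 255.0 / (hi - lo) for v in values]
```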
S302,将所述待检测图像转换到Lab颜色空间,得到Lab图像。
Lab颜色空间是由L、a、b这三个通道组成。L表示亮度（luminosity），a表示从洋红色至绿色的范围，b表示从黄色至蓝色的范围。待检测图像的Lab图像中每个像素点有L值、a值、b值三个参数。如图5所示，待检测图像的Lab图像中所有像素点的L值组成L通道，其中，L(x,y)表示第x行第y列的像素点的L值。同理，待检测图像的Lab图像中所有像素点的a值组成a通道，其中，a(x,y)表示第x行第y列的像素点的a值。待检测图像的Lab图像中所有像素点的b值组成b通道，其中，b(x,y)表示第x行第y列的像素点的b值。其中，W为Lab图像的长度，H为Lab图像的宽度。
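按照上述定义，Lab图像可以看作逐像素的(L,a,b)三元组矩阵；以下Python代码示意按通道拆分的组织方式（RGB到Lab颜色空间的具体转换可借助现有图像处理库完成，此处不展开）：

```python
def split_lab_channels(lab_img):
    # lab_img[x][y] = (L, a, b)，将其拆分为L通道、a通道、b通道
    L = [[p[0] for p in row] for row in lab_img]
    a = [[p[1] for p in row] for row in lab_img]
    b = [[p[2] for p in row] for row in lab_img]
    return L, a, b
```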
由于色斑像素点在Lab空间中的b通道的像素值与正常皮肤像素点在b通道的像素值有较大区分,因此将待检测图像转换到Lab颜色空间后提取色斑特征可以提高色斑检测的准确性。
S303,提取所述Lab图像中的色斑特征,得到色斑特征图像,所述色斑特征图像包括皮表色斑特征以及皮下色斑特征。色斑特征图像可以如图6所示。
一种实施方式中,步骤S303可以通过步骤A1至A3实现:
A1,分别提取Lab图像中L通道、a通道、b通道的细节特征分量。L通道、a通道、b通道的细节特征分量可以如图7所示。分别针对Lab图像中L通道、a通道、b通道进行处理以提取各通道的细节特征,从而得到L通道、a通道、b通道的细节特征分量。
示例性的，可以分别针对Lab图像中L通道、a通道、b通道进行双边滤波处理，从而得到L通道的细节特征分量、a通道的细节特征分量、b通道的细节特征分量。当然，也可以通过其他方式分别提取Lab图像中L通道、a通道、b通道的细节特征分量，如分别过滤掉L通道、a通道、b通道中梯度比较小的像素点，保留L通道、a通道、b通道中梯度比较大的像素点作为Lab图像中L通道、a通道、b通道的细节特征分量等等，本申请实施例在这里不对提取Lab图像中L通道、a通道、b通道的细节特征分量的方式进行具体限定。
A2,确定所述L通道与提取的所述L通道的细节特征分量之间的L通道差值,以及所述a通道与提取的所述a通道的细节特征分量之间的a通道差值,所述b通道与提取的所述b通道的细节特征分量之间的b通道差值。L通道差值、a通道差值、b通道差值可以如图8所示。
A3，基于所述L通道差值、所述a通道差值、所述b通道差值得到所述色斑特征图像，其中，所述色斑特征图像中L通道为所述L通道差值，所述色斑特征图像中a通道为所述a通道差值，所述色斑特征图像中b通道为所述b通道差值。即，将所述L通道差值、所述a通道差值、所述b通道差值进行合并，得到色斑特征图像，其中，色斑特征图像中位于第x行第y列的像素点的L值等于L(x,y)-L′(x,y)，a值等于a(x,y)-a′(x,y)，b值等于b(x,y)-b′(x,y)，L′(x,y)、a′(x,y)、b′(x,y)分别为各通道细节特征分量中第x行第y列像素点的取值。
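步骤A1至A3的流程可以用如下Python代码示意。需要说明的是，这里用简单的均值滤波代替双边滤波来得到滤波分量，仅作结构性示意，并非本申请限定的滤波方式；滤波半径radius为假设参数：

```python
def box_blur(channel, radius=1):
    # 用简单均值滤波对单个通道平滑（示意；实际可替换为双边滤波）
    H, W = len(channel), len(channel[0])
    out = []
    for i in range(H):
        row = []
        for j in range(W):
            vals = [channel[x][y]
                    for x in range(max(0, i - radius), min(H, i + radius + 1))
                    for y in range(max(0, j - radius), min(W, j + radius + 1))]
            row.append(sum(vals) / len(vals))
        out.append(row)
    return out

def channel_difference(channel, component):
    # 步骤A2：逐像素求通道与提取的分量之差，得到通道差值
    return [[channel[i][j] - component[i][j] for j in range(len(channel[0]))]
            for i in range(len(channel))]
```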
S304,确定所述色斑特征图像中的皮表色斑和皮下色斑。示例性的,可以先确定所述色斑特征图像中的色斑区域。之后可以针对每个所述色斑区域,确定所述每个色斑区域中各个像素点的b通道均值。可以确定b通道均值大于第一阈值的色斑区域为皮表色斑,b通道均值小于或等于所述第一阈值的色斑区域为皮下色斑。在人脸图像上标示色斑时可以采用不同的显示方式分别显示皮表色斑和皮下色斑,如用实线框标记标示皮表色斑,用虚线框标记标示皮下色斑,如图9所示,或者,用红色标记标示皮表色斑,用蓝色标记标示皮下色斑等等。
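上述按b通道均值划分皮表色斑与皮下色斑的判断，可以用如下Python代码示意（色斑区域以像素坐标列表表示，threshold对应上文的第一阈值，具体取值为调用方假设）：

```python
def classify_spots(regions, b_channel, threshold):
    # regions: 每个色斑区域为(行, 列)坐标列表
    surface, subcutaneous = [], []
    for region in regions:
        # 计算该色斑区域内各像素点的b通道均值
        mean_b = sum(b_channel[i][j] for (i, j) in region) / len(region)
        if mean_b > threshold:
            surface.append(region)       # b通道均值大于第一阈值：皮表色斑
        else:
            subcutaneous.append(region)  # 小于或等于第一阈值：皮下色斑
    return surface, subcutaneous
```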
本申请实施例中通过在Lab颜色空间内提取待检测图像的色斑特征,不仅可以检测出待检测图像中的皮表色斑,还可以检测出待检测图像中的皮下色斑,从而可以综合皮表色斑和皮下色斑来确定人脸的色斑情况,因而与现有技术中只能根据检测出的皮表色斑来检测色斑情况相比,有助于提高检测色斑的准确性。
一种实现方式中,确定所述色斑特征图像中的色斑区域时,可以通过如下步骤B1至B4实现:
B1,设置尺寸为n*n的检测框,该检测框以预设步长在色斑特征图像内移动,其中,n为不大于W且不大于H的正整数。例如,检测框从色斑特征图像的左上角开始以预设步长向右下滑动,每次滑动过程中计算检测框内像素点的像素值均值及像素值方差,筛选出满足如下公式的第一像素点:
r1<(a1-T1×b1);
其中,r1为所述第一像素点的像素值,a1为检测框内像素点的像素值均值,所述T1为预设值,所述b1为所述检测框内像素点的像素值方差。
B2,计算色斑特征图像中像素点的像素值均值及像素值方差,筛选满足如下公式的第二像素点:
r2<(a2-T2×b2);
其中,r2为所述第二像素点的像素值,a2为所述色斑特征图像内像素点的像素值均值,所述T2为预设值,所述b2为所述色斑特征图像内像素点的像素值方差。
B3,将第一像素点以及第二像素点确定为色斑点。
B4,将色斑点进行膨胀操作,并将经过膨胀操作的色斑点进行腐蚀操作,得到色斑区域。
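步骤B1至B3可以用如下Python代码示意。其中n、步长与T1、T2均为示意性假设值；b1、b2按上文取像素值方差。步骤B4的膨胀与腐蚀属于通用形态学操作，此处从略：

```python
def spot_pixels(feature_img, n=2, step=1, T1=1.0, T2=1.0):
    # 步骤B1：n*n检测框以step步长在色斑特征图像内滑动，
    #         框内满足 r1 < a1 - T1*b1 的像素点为第一像素点
    # 步骤B2：全图满足 r2 < a2 - T2*b2 的像素点为第二像素点
    H, W = len(feature_img), len(feature_img[0])

    def mean_var(vals):
        # 像素值均值与像素值方差
        m = sum(vals) / len(vals)
        return m, sum((v - m) ** 2 for v in vals) / len(vals)

    first = set()
    for i in range(0, H - n + 1, step):
        for j in range(0, W - n + 1, step):
            box = [feature_img[x][y]
                   for x in range(i, i + n) for y in range(j, j + n)]
            a1, b1 = mean_var(box)
            for x in range(i, i + n):
                for y in range(j, j + n):
                    if feature_img[x][y] < a1 - T1 * b1:
                        first.add((x, y))

    a2, b2 = mean_var([v for row in feature_img for v in row])
    second = {(x, y) for x in range(H) for y in range(W)
              if feature_img[x][y] < a2 - T2 * b2}

    # 步骤B3：第一像素点与第二像素点共同构成色斑点
    return first | second
```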
可选的,确定所述色斑特征图像中的色斑区域之后,可以去除如下三类色斑区域中的一类或多类:面积小于第二阈值的色斑区域、面积大于第三阈值的色斑区域、面积与周长的比值小于第四阈值的色斑区域,其中,所述第二阈值小于所述第三阈值。人脸中色斑的面积一般在一定范围内,并且,色斑的形状一般近似圆形,因此通过去除色斑区域中面积过小、面积过大、圆形度较低的区域,可以提高色斑检测的准确性。
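上述按面积与圆形度（面积与周长的比值）筛选色斑区域的规则，可以用如下Python代码示意。区域以（面积，周长）二元组表示，各阈值均为示意性假设值：

```python
def filter_regions(regions, min_area, max_area, min_ratio):
    # 去除面积小于第二阈值、大于第三阈值、或面积/周长比值小于第四阈值的区域
    kept = []
    for area, perimeter in regions:
        if area < min_area or area > max_area:
            continue  # 面积过小或过大
        if area / perimeter < min_ratio:
            continue  # 圆形度过低（面积与周长的比值过小）
        kept.append((area, perimeter))
    return kept
```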
在一种可能的实施方式中,在确定所述色斑特征图像中的皮表色斑和皮下色斑后,可以分别针对皮表色斑和皮下色斑量化得分,然后结合皮表色斑的得分和皮下色斑的得分确定色斑的综合得分。可以在显示屏194所呈现的用户界面上显示色斑的综合得分,例如,如图2所示。在显示屏194所呈现的用户界面上显示色斑的综合得分时,还可以同时显示皮表色斑的得分、皮下色斑的得分,例如,如图10所示。或者,电子设备100也可以响应用户的操作在显示屏194所呈现的用户界面上显示皮表色斑的得分、皮下色斑的得分,还可以响应用户的操作在显示屏194所呈现的用户界面上标示出该色斑的综合得分对应的级别,如一般、良好、优秀等。例如,电子设备100检测到用户选中显示屏194所呈现的用户界面中显示色斑的综合得分的显示区域,触发显示屏194显示皮表色斑的得分、皮下色斑的得分、色斑的综合得分对应的级别,如图11所示。
一种实现方式中,可以通过如下方式量化皮表色斑的得分:确定第一特征集,第一特征集包括如下特征中的一个或多个:用于表征所述色斑特征图像的色素均匀性的均匀值、皮表色斑的数量、皮表色斑的色斑面积、皮表色斑的对比值,皮表色斑的对比值用于表征所述皮表色斑的颜色对比度。之后基于所述第一特征集量化所述皮表色斑的得分。
其中,皮表色斑的对比值可以通过以下方式确定:确定每个所述皮表色斑内像素点的第一像素值均值,以及所述色斑特征图像内像素点的第二像素值均值;确定所述第一像素值均值与所述第二像素值均值的比值,得到所述皮表色斑的对比值。
均匀值可以通过以下方式确定：将所述色斑特征图像划分成若干个相互重叠的矩形区域，并确定每个所述矩形区域内像素点的像素值标准差。确定所有所述矩形区域的像素值标准差的均值，得到所述均匀值。相邻两个矩形区域可以重叠50%，也可以重叠30%，当然，相邻两个矩形区域重叠的面积也可以为其他比例，本申请实施例在这里对相邻两个矩形区域重叠的面积大小不做具体限定。
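上述均匀值的计算（相互重叠的矩形区域内像素值标准差的均值）可以用如下Python代码示意。矩形块大小block与步长step为假设参数，step小于block即表示相邻区域相互重叠：

```python
def uniformity_value(img, block=2, step=1):
    # 将图像划分为相互重叠的矩形区域，求每个区域内像素值标准差，
    # 再取所有标准差的均值作为均匀值
    H, W = len(img), len(img[0])
    stds = []
    for i in range(0, H - block + 1, step):
        for j in range(0, W - block + 1, step):
            vals = [img[x][y] for x in range(i, i + block)
                    for y in range(j, j + block)]
            m = sum(vals) / len(vals)
            stds.append((sum((v - m) ** 2 for v in vals) / len(vals)) ** 0.5)
    return sum(stds) / len(stds)
```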
示例性的,所述皮表色斑的得分可以通过如下公式确定:
Figure PCTCN2018106236-appb-000006
其中，所述H1为所述皮表色斑的得分，所述A为所述均匀值，所述B1为所述皮表色斑的数量，所述C1为所有所述皮表色斑的对比值之和，所述D1为所有所述皮表色斑的面积之和，所述E为所述色斑特征图像的面积，所述w1、所述w2、所述w3均为预设参数。
一种实现方式中,可以通过如下方式量化皮下色斑的得分:确定第二特征集,第二特征集包括如下特征中的至少一个:均匀值、皮下色斑的数量、皮下色斑的色斑面积、皮下色斑的对比值,皮下色斑的对比值用于表征所述皮下色斑的颜色对比度。之后基于所述第二特征集量化所述皮下色斑的得分。
其中,皮下色斑的对比值可以通过如下方式确定:确定每个所述皮下色斑内像素点的第三像素值均值,以及所述第二像素值均值。确定所述第三像素值均值与所述第二像素值均值的比值,得到所述皮下色斑的对比值。
示例性的,所述皮下色斑得分可以通过如下公式确定:
Figure PCTCN2018106236-appb-000007
其中，所述H2为所述皮下色斑的得分，所述B2为所述皮下色斑的数量，所述C2为所有所述皮下色斑的对比值之和，所述D2为所有所述皮下色斑的面积之和，所述w3、所述w4均为预设参数。
示例性的,色斑的综合得分可以通过如下公式确定:
H=y1×H1+y2×H2
其中，所述H为所述综合得分，所述y1、所述y2均为预设参数。
在通过本申请实施例提供的色斑检测方法检测色斑后，还可以向用户提供色斑检测结果报告，色斑检测结果报告可以但不限于包括色斑的综合得分、皮下色斑的得分、皮表色斑的得分、护肤建议、结果图等等。其中，可以在显示屏194呈现的用户界面上展示用户的人脸图片，并在人脸图片上采用不同的显示方式分别标示皮表色斑和皮下色斑。色斑检测结果报告可以如图12所示的用户界面1200。该用户界面1200可以是电子设备100在检测到用户点击图2所示的虚拟按钮204时触发呈现的，或者，也可以是检测到用户点击用户界面中色斑得分所在的显示区域触发呈现的，以图2所示的用户界面200为例，电子设备100检测到用户点击显示屏194中“色斑90分”所在的显示区域触发显示用户界面1200。当然，用户界面1200也可以通过其他方式触发呈现，本申请实施例在这里对触发用户界面1200呈现的方式不做具体限定。
上述涉及的各个实施例可以相互结合使用,也可以单独使用。
上述本申请提供的实施例中,从电子设备作为执行主体的角度对本申请实施例提供的方法进行了介绍。为了实现上述本申请实施例提供的方法中的各功能,电子设备可以包括硬件结构和/或软件模块,以硬件结构、软件模块、或硬件结构加软件模块的形式来实现上述各功能。上述各功能中的某个功能以硬件结构、软件模块、还是硬件结构加软件模块的方式来执行,取决于技术方案的特定应用和设计约束条件。
基于相同的构思,图13所示为本申请提供的一种电子设备1300。示例的,电子设备1300包括至少一个处理器1310、存储器1320,还可以包括显示屏1330和摄像头1340。其中,处理器1310与存储器1320、显示屏1330和摄像头1340耦合,本申请实施例中的耦合是装置、单元或模块之间的间接耦合或通信连接,可以是电性,机械或其它的形式,用于装置、单元或模块之间的信息交互。
具体的,存储器1320用于存储程序指令。
显示屏1330用于显示拍照预览界面,拍照预览界面包括摄像头1340采集的图像。显示屏1330还可以用于显示上述实施例中所涉及的用户界面,如图2所示的用户界面、图4A所示的界面、图10所示的用户界面、图11所示的用户界面、图12所示的用户界面等等。
处理器1310用于调用存储器1320中存储的程序指令,使得电子设备1300执行图3 所示的色斑检测方法中的步骤。
应理解,该电子设备1300可以用于实现本申请实施例的如图3所示的色斑检测方法,相关特征可以参照上文,此处不再赘述。
所属领域的技术人员可以清楚地了解到，本申请实施例可以用硬件实现，或软件实现，或固件实现，或它们的组合方式来实现。当使用软件实现时，可以将上述功能存储在计算机可读介质中或作为计算机可读介质上的一个或多个指令或代码进行传输。计算机可读介质包括计算机存储介质和通信介质，其中通信介质包括便于从一个地方向另一个地方传送计算机程序的任何介质。存储介质可以是计算机能够存取的任何可用介质。以此为例但不限于：计算机可读介质可以包括RAM、ROM、电可擦可编程只读存储器(electrically erasable programmable read only memory,EEPROM)、只读光盘(compact disc read-Only memory,CD-ROM)或其他光盘存储、磁盘存储介质或者其他磁存储设备、或者能够用于携带或存储具有指令或数据结构形式的期望的程序代码并能够由计算机存取的任何其他介质。此外，任何连接可以适当地称为计算机可读介质。例如，如果软件是使用同轴电缆、光纤光缆、双绞线、数字用户线(digital subscriber line,DSL)或者诸如红外线、无线电和微波之类的无线技术从网站、服务器或者其他远程源传输的，那么同轴电缆、光纤光缆、双绞线、DSL或者诸如红外线、无线和微波之类的无线技术包括在所属介质的定义中。如本申请实施例所使用的，盘(disk)和碟(disc)包括压缩光碟(compact disc,CD)、激光碟、光碟、数字通用光碟(digital video disc,DVD)、软盘和蓝光光碟，其中盘通常磁性地复制数据，而碟则用激光来光学地复制数据。上面的组合也应当包括在计算机可读介质的保护范围之内。
总之,以上所述仅为本申请的实施例而已,并非用于限定本申请的保护范围。凡根据本申请的揭露,所作的任何修改、等同替换、改进等,均应包含在本申请的保护范围之内。

Claims (25)

  1. 一种色斑检测方法,其特征在于,包括:
    获取待检测图像;
    将所述待检测图像转换到Lab颜色空间,得到Lab图像;
    提取所述Lab图像中的色斑特征,得到色斑特征图像,所述色斑特征图像包括皮表色斑特征以及皮下色斑特征;
    确定所述色斑特征图像中的皮表色斑和皮下色斑。
  2. 如权利要求1所述的方法,其特征在于,提取所述Lab图像中的色斑特征,得到色斑特征图像,包括:
    分别提取所述Lab图像中L通道、a通道、b通道的细节特征分量;
    确定所述L通道与提取的所述L通道的细节特征分量之间的L通道差值,以及所述a通道与提取的所述a通道的细节特征分量之间的a通道差值,所述b通道与提取的所述b通道的细节特征分量之间的b通道差值;
    基于所述L通道差值、所述a通道差值、所述b通道差值得到所述色斑特征图像,其中,所述色斑特征图像中L通道为所述L通道差值,所述色斑特征图像中a通道为所述a通道差值,所述色斑特征图像中b通道为所述b通道差值。
  3. 如权利要求2所述的方法,其特征在于,分别提取所述Lab图像中L通道、a通道、b通道的细节特征分量,包括:
    分别针对所述Lab图像中L通道、a通道、b通道进行双边滤波处理,得到L通道的细节特征分量、a通道的细节特征分量、b通道的细节特征分量。
  4. 如权利要求1至3任一项所述的方法,其特征在于,确定所述色斑特征图像中的皮表色斑和皮下色斑,包括:
    确定所述色斑特征图像中的色斑区域;
    针对每个所述色斑区域,确定所述每个色斑区域中各个像素点的b通道均值;
    确定b通道均值大于第一阈值的色斑区域为皮表色斑,b通道均值小于或等于所述第一阈值的色斑区域为皮下色斑。
  5. 如权利要求4所述的方法,其特征在于,确定所述色斑特征图像中的色斑区域,包括:
    确定检测框内的第一像素点,其中,所述检测框的长度小于所述色斑特征图像的长度,且所述检测框的宽度小于所述色斑特征图像的宽度,所述检测框以预设步长在所述色斑特征图像内移动;所述第一像素点的像素值满足如下公式:
    r1<(a1-T1×b1);
    其中,r1为所述第一像素点的像素值,a1为所述检测框内像素点的像素值均值,所述T1为预设值,所述b1为所述检测框内像素点的像素值方差;
    确定所述色斑特征图像内的第二像素点,所述第二像素点的像素值满足如下公式:
    r2<(a2-T2×b2);
    其中,r2为所述第二像素点的像素值,a2为所述色斑特征图像内像素点的像素值均值,所述T2为预设值,所述b2为所述色斑特征图像内像素点的像素值方差;
    将所述第一像素点以及所述第二像素点确定为色斑点;
    将所述色斑点进行膨胀操作,并将经过膨胀操作的色斑点进行腐蚀操作,得到色斑区域。
  6. 如权利要求4或5所述的方法,其特征在于,在确定所述色斑特征图像中的色斑区域之后,还包括:
    去除面积小于第二阈值和/或面积大于第三阈值的色斑区域,所述第二阈值小于所述第三阈值;和/或
    去除面积与周长的比值小于第四阈值的色斑区域。
  7. 如权利要求1至6任一项所述的方法,其特征在于,所述方法还包括:
    确定第一特征集,并基于所述第一特征集量化所述皮表色斑的得分,所述第一特征集包括如下特征中的至少一个:均匀值、所述皮表色斑的数量、所述皮表色斑的色斑面积、所述皮表色斑的对比值,所述均匀值用于表征所述色斑特征图像的色素均匀性,所述皮表色斑的对比值用于表征所述皮表色斑的颜色对比度;
    确定第二特征集,并基于所述第二特征集量化所述皮下色斑的得分,所述第二特征集包括如下特征中的至少一个:所述均匀值、所述皮下色斑的数量、所述皮下色斑的色斑面积、所述皮下色斑的对比值,所述皮下色斑的对比值用于表征所述皮下色斑的颜色对比度;
    基于所述皮表色斑的得分以及所述皮下色斑的得分确定色斑检测的综合得分;
    显示所述综合得分,或者,显示所述皮表色斑的得分、所述皮下色斑的得分以及所述综合得分。
  8. 如权利要求7所述的方法,其特征在于,通过如下公式确定所述皮表色斑的得分:
    Figure PCTCN2018106236-appb-100001
    其中，所述H1为所述皮表色斑的得分，所述A为所述均匀值，所述B1为所述皮表色斑的数量，所述C1为所有所述皮表色斑的对比值之和，所述D1为所有所述皮表色斑的面积之和，所述E为所述色斑特征图像的面积，所述w1、所述w2、所述w3均为预设参数；
    通过如下公式确定所述皮下色斑得分:
    Figure PCTCN2018106236-appb-100002
    其中，所述H2为所述皮下色斑的得分，所述B2为所述皮下色斑的数量，所述C2为所有所述皮下色斑的对比值之和，所述D2为所有所述皮下色斑的面积之和，所述w3、所述w4均为预设参数；
    通过如下公式确定所述综合得分:
    H=y1×H1+y2×H2
    其中，所述H为所述综合得分，所述y1、所述y2均为预设参数。
  9. 如权利要求7或8所述的方法，其特征在于，所述第一特征集、所述第二特征集中的所述均匀值通过以下方式确定：
    将所述色斑特征图像划分成若干个相互重叠的矩形区域;
    确定每个所述矩形区域内像素点的像素值标准差;
    确定所有所述矩形区域的像素值标准差的均值,得到所述均匀值。
  10. 如权利要求7至9任一项所述的方法,其特征在于,所述第一特征集中所述皮表色斑的对比值通过以下方式确定:
    确定每个所述皮表色斑内像素点的第一像素值均值,以及所述色斑特征图像内像素点的第二像素值均值;
    确定所述第一像素值均值与所述第二像素值均值的比值,得到所述皮表色斑的对比值;
    所述第二特征集中所述皮下色斑的对比值通过以下方式确定:
    确定每个所述皮下色斑内像素点的第三像素值均值,以及所述第二像素值均值;
    确定所述第三像素值均值与所述第二像素值均值的比值,得到所述皮下色斑的对比值。
  11. 如权利要求1至10任一项所述的方法,其特征在于,将所述待检测图像转换到Lab颜色空间之前,还包括:
    将所述待检测图像转换为灰度图像;
    去除所述灰度图像中像素值大于第五阈值的像素点。
  12. 一种电子设备,其特征在于,包括:
    存储器,用于存储计算机程序;
    处理模块,用于调用所述存储模块存储的计算机程序,执行:
    获取待检测图像;
    将所述待检测图像转换到Lab颜色空间,得到Lab图像;
    提取所述Lab图像中的色斑特征,得到色斑特征图像,所述色斑特征图像包括皮表色斑特征以及皮下色斑特征;
    确定所述色斑特征图像中的皮表色斑和皮下色斑。
  13. 如权利要求12所述的电子设备,其特征在于,所述处理模块,在提取所述Lab图像中的色斑特征,得到色斑特征图像时,具体用于:
    分别提取所述Lab图像中L通道、a通道、b通道的细节特征分量;
    确定所述L通道与提取的所述L通道的细节特征分量之间的L通道差值,以及所述a通道与提取的所述a通道的细节特征分量之间的a通道差值,所述b通道与提取的所述b通道的细节特征分量之间的b通道差值;
    基于所述L通道差值、所述a通道差值、所述b通道差值得到所述色斑特征图像,其中,所述色斑特征图像中L通道为所述L通道差值,所述色斑特征图像中a通道为所述a通道差值,所述色斑特征图像中b通道为所述b通道差值。
  14. 如权利要求13所述的电子设备，其特征在于，所述处理模块，在分别提取所述Lab图像中L通道、a通道、b通道的细节特征分量时，具体用于：
    分别针对所述Lab图像中L通道、a通道、b通道进行双边滤波处理,得到L通道的细节特征分量、a通道的细节特征分量、b通道的细节特征分量。
  15. 如权利要求12至14任一项所述的电子设备,其特征在于,所述处理模块,在确定所述色斑特征图像中的皮表色斑和皮下色斑时,具体用于:
    确定所述色斑特征图像中的色斑区域;
    针对每个所述色斑区域,确定所述每个色斑区域中各个像素点的b通道均值;
    确定b通道均值大于第一阈值的色斑区域为皮表色斑,b通道均值小于或等于所述第一阈值的色斑区域为皮下色斑。
  16. 如权利要求15所述的电子设备,其特征在于,所述处理模块,在确定所述色斑特征图像中的色斑区域时,具体用于:
    确定检测框内的第一像素点,其中,所述检测框的长度小于所述色斑特征图像的长度,且所述检测框的宽度小于所述色斑特征图像的宽度,所述检测框以预设步长在所述色斑特征图像内移动;所述第一像素点的像素值满足如下公式:
    r1<(a1-T1×b1);
    其中,r1为所述第一像素点的像素值,a1为所述检测框内像素点的像素值均值,所述T1为预设值,所述b1为所述检测框内像素点的像素值方差;
    确定所述色斑特征图像内的第二像素点,所述第二像素点的像素值满足如下公式:
    r2<(a2-T2×b2);
    其中,r2为所述第二像素点的像素值,a2为所述色斑特征图像内像素点的像素值均值,所述T2为预设值,所述b2为所述色斑特征图像内像素点的像素值方差;
    将所述第一像素点以及所述第二像素点确定为色斑点;
    将所述色斑点进行膨胀操作,并将经过膨胀操作的色斑点进行腐蚀操作,得到色斑区域。
  17. 如权利要求15或16所述的电子设备，其特征在于，所述处理模块，还用于：
    在确定所述色斑特征图像中的色斑区域之后,去除面积小于第二阈值和/或面积大于第三阈值的色斑区域,所述第二阈值小于所述第三阈值;和/或
    去除面积与周长的比值小于第四阈值的色斑区域。
  18. 如权利要求12至17任一项所述的电子设备,其特征在于,所述处理模块,还用于:
    确定第一特征集,并基于所述第一特征集量化所述皮表色斑的得分,所述第一特征集包括如下特征中的至少一个:均匀值、所述皮表色斑的数量、所述皮表色斑的色斑面积、所述皮表色斑的对比值,所述均匀值用于表征所述色斑特征图像的色素均匀性,所述皮表色斑的对比值用于表征所述皮表色斑的颜色对比度;
    确定第二特征集，并基于所述第二特征集量化所述皮下色斑的得分，所述第二特征集包括如下特征中的至少一个：所述均匀值、所述皮下色斑的数量、所述皮下色斑的色斑面积、所述皮下色斑的对比值，所述皮下色斑的对比值用于表征所述皮下色斑的颜色对比度；
    基于所述皮表色斑的得分以及所述皮下色斑的得分确定色斑检测的综合得分;
    显示所述综合得分,或者,显示所述皮表色斑的得分、所述皮下色斑的得分以及所述综合得分。
  19. 如权利要求18所述的电子设备,其特征在于,所述处理模块通过如下公式确定所述皮表色斑的得分:
    Figure PCTCN2018106236-appb-100003
    其中，所述H1为所述皮表色斑的得分，所述A为所述均匀值，所述B1为所述皮表色斑的数量，所述C1为所有所述皮表色斑的对比值之和，所述D1为所有所述皮表色斑的面积之和，所述E为所述色斑特征图像的面积，所述w1、所述w2、所述w3均为预设参数；
    所述处理模块通过如下公式确定所述皮下色斑得分:
    Figure PCTCN2018106236-appb-100004
    其中，所述H2为所述皮下色斑的得分，所述B2为所述皮下色斑的数量，所述C2为所有所述皮下色斑的对比值之和，所述D2为所有所述皮下色斑的面积之和，所述w3、所述w4均为预设参数；
    所述处理模块通过如下公式确定所述综合得分:
    H=y1×H1+y2×H2
    其中，所述H为所述综合得分，所述y1、所述y2均为预设参数。
  20. 如权利要求18或19所述的电子设备,其特征在于,所述处理模块,还用于通过以下方式确定所述第一特征集、所述第二特征集中的所述均匀值:
    将所述色斑特征图像划分成若干个相互重叠的矩形区域;
    确定每个所述矩形区域内像素点的像素值标准差;
    确定所有所述矩形区域的像素值标准差的均值,得到所述均匀值。
  21. 如权利要求18至19任一项所述的电子设备,其特征在于,所述处理模块,还用于通过以下方式确定所述第一特征集中所述皮表色斑的对比值:
    确定每个所述皮表色斑内像素点的第一像素值均值,以及所述色斑特征图像内像素点的第二像素值均值;
    确定所述第一像素值均值与所述第二像素值均值的比值,得到所述皮表色斑的对比值;
    所述处理模块,还用于通过以下方式确定所述第二特征集中所述皮下色斑的对比值:
    确定每个所述皮下色斑内像素点的第三像素值均值,以及所述第二像素值均值;
    确定所述第三像素值均值与所述第二像素值均值的比值,得到所述皮下色斑的对比值。
  22. 如权利要求12至21任一项所述的电子设备，其特征在于，所述处理模块，还用于：
    将所述待检测图像转换到Lab颜色空间之前,将所述待检测图像转换为灰度图像;
    去除所述灰度图像中像素值大于第五阈值的像素点。
  23. 一种计算机存储介质,其特征在于,所述计算机存储介质存储有程序指令,当所述程序指令在电子设备上运行时,使得所述电子设备执行权利要求1至11任一所述的方法。
  24. 一种计算机程序产品，其特征在于，当所述计算机程序产品在电子设备上运行时，使得所述电子设备执行权利要求1至11任一所述的方法。
  25. 一种芯片，其特征在于，所述芯片与电子设备中的存储器耦合，使得所述电子设备执行权利要求1至11任一所述的方法。
PCT/CN2018/106236 2018-07-16 2018-09-18 一种色斑检测方法及电子设备 WO2020015148A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US17/260,855 US11989885B2 (en) 2018-07-16 2018-09-18 Speckle detection method and electronic device
EP18927127.3A EP3813012B1 (en) 2018-07-16 2018-09-18 Skin spot detection method and electronic device
CN201880077836.8A CN111417982B (zh) 2018-07-16 2018-09-18 一种色斑检测方法及电子设备

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810776283 2018-07-16
CN201810776283.7 2018-07-16

Publications (1)

Publication Number Publication Date
WO2020015148A1 true WO2020015148A1 (zh) 2020-01-23

Family

ID=69163858

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/106236 WO2020015148A1 (zh) 2018-07-16 2018-09-18 一种色斑检测方法及电子设备

Country Status (4)

Country Link
US (1) US11989885B2 (zh)
EP (1) EP3813012B1 (zh)
CN (1) CN111417982B (zh)
WO (1) WO2020015148A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111738984A (zh) * 2020-05-29 2020-10-02 北京工商大学 基于分水岭和种子填充的皮肤图像斑点评估方法及***
CN112147016A (zh) * 2020-09-30 2020-12-29 广西玉柴机器股份有限公司 金属材料硬度测量的图像分析方法

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016080266A1 (ja) * 2014-11-19 2016-05-26 株式会社資生堂 シミ評価装置、シミ評価方法、及びプログラム
CN105787929A (zh) * 2016-02-15 2016-07-20 天津大学 基于斑点检测的皮肤疹点提取方法
CN106388781A (zh) * 2016-09-29 2017-02-15 深圳可思美科技有限公司 一种皮肤肤色及其色素沉淀情况的检测方法
CN106529429A (zh) * 2016-10-27 2017-03-22 中国计量大学 一种基于图像识别的面部皮肤分析***
CN108269290A (zh) * 2018-01-19 2018-07-10 厦门美图之家科技有限公司 皮肤肤色识别方法及装置

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8290257B2 (en) 2007-03-02 2012-10-16 The Procter & Gamble Company Method and apparatus for simulation of facial skin aging and de-aging
JP4688954B2 (ja) * 2007-04-18 2011-05-25 国立大学法人 東京大学 特徴量選択方法、特徴量選択装置、画像分類方法、画像分類装置、コンピュータプログラム、及び記録媒体
EP3104873B1 (en) * 2014-02-13 2019-09-04 Technische Universität München Fgf-8 for use in treating diseases or disorders of energy homeostasis


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3813012A4

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111738984A (zh) * 2020-05-29 2020-10-02 北京工商大学 基于分水岭和种子填充的皮肤图像斑点评估方法及***
CN111738984B (zh) * 2020-05-29 2023-08-18 北京工商大学 基于分水岭和种子填充的皮肤图像斑点评估方法及***
CN112147016A (zh) * 2020-09-30 2020-12-29 广西玉柴机器股份有限公司 金属材料硬度测量的图像分析方法

Also Published As

Publication number Publication date
CN111417982B (zh) 2022-06-21
US11989885B2 (en) 2024-05-21
CN111417982A (zh) 2020-07-14
US20210264597A1 (en) 2021-08-26
EP3813012A1 (en) 2021-04-28
EP3813012B1 (en) 2022-09-14
EP3813012A4 (en) 2021-08-11

Similar Documents

Publication Publication Date Title
WO2020134877A1 (zh) 一种皮肤检测方法及电子设备
KR102548317B1 (ko) 색소 검출 방법 및 전자 장치
JP7067697B2 (ja) 肌の検出方法及び電子デバイス
CN111543049B (zh) 一种拍照方法及电子设备
WO2020015149A1 (zh) 一种皱纹检测方法及电子设备
WO2020015148A1 (zh) 一种色斑检测方法及电子设备
CN111557007B (zh) 一种检测眼睛睁闭状态的方法及电子设备
CN111542856B (zh) 一种皮肤检测方法和电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18927127

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018927127

Country of ref document: EP

Effective date: 20210119