CN111259696A - Method and apparatus for displaying image - Google Patents

Method and apparatus for displaying image

Info

Publication number
CN111259696A
Authority
CN
China
Prior art keywords
color
face
image
information
face region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811458458.6A
Other languages
Chinese (zh)
Other versions
CN111259696B (en)
Inventor
朱祥祥
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201811458458.6A priority Critical patent/CN111259696B/en
Publication of CN111259696A publication Critical patent/CN111259696A/en
Application granted granted Critical
Publication of CN111259696B publication Critical patent/CN111259696B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/77Retouching; Inpainting; Scratch removal
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/56Extraction of image or video features relating to colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person
    • G06T2207/30201Face

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Image Processing (AREA)

Abstract

The embodiment of the application discloses a method and a device for displaying an image. One embodiment of the method comprises: dividing the acquired face image to be processed into at least one face area image; for the face area image in the at least one face area image, acquiring and displaying face feature information of the face area image, wherein the face feature information is used for identifying the name of the face area image and color information corresponding to the face area image; and acquiring a color replacement image of the face region image corresponding to the face feature information in response to detecting a first selection signal of the face feature information in at least one piece of face feature information corresponding to the at least one face region image. The embodiment improves the efficiency of obtaining information by the user.

Description

Method and apparatus for displaying image
Technical Field
The embodiment of the application relates to the technical field of computers, in particular to a method and a device for displaying images.
Background
Makeup can protect the facial skin, adjust the facial color and improve the visual effect of the face. Generally, a user may try out cosmetics in multiple possible colors according to his or her facial characteristics until finding cosmetics that suit him or her.
Disclosure of Invention
The embodiment of the application provides a method and a device for displaying an image.
In a first aspect, an embodiment of the present application provides a method for displaying an image, where the method includes: dividing the acquired face image to be processed into at least one face area image; for the face area image in the at least one face area image, acquiring and displaying face feature information of the face area image, wherein the face feature information is used for identifying the name of the face area image and color information corresponding to the face area image; and acquiring a color replacement image of the face region image corresponding to the face feature information in response to detecting a first selection signal of the face feature information in at least one piece of face feature information corresponding to the at least one face region image.
In some embodiments, the acquiring and displaying the face feature information of the face region image includes: and detecting color values of a plurality of pixel points on the face region image, and taking the mean value of the plurality of color values as the color information of the face region image.
In some embodiments, the obtaining a color replacement image of a face region image corresponding to the face feature information includes: displaying at least one piece of color display information corresponding to the face feature information, wherein the color display information is used for indicating that a specified color is displayed on a face area image corresponding to the face feature information; and in response to the detection of a second selection signal corresponding to the color display information in the at least one piece of color display information, rendering the face area image corresponding to the face feature information through the specified color corresponding to the color display information to obtain a color replacement image.
In some embodiments, the above method further comprises: and in response to the detection of the grading signal, acquiring current color information of the face region image in the at least one face region image, importing the at least one piece of current color information into a grading model to obtain a color matching score, wherein the grading model is used for determining the color matching score according to the matching relationship among the color information of the face region image.
In some embodiments, the above method further comprises: in response to detecting the third query signal corresponding to the color replacement image, displaying item information corresponding to color display information of the color replacement image, the item information including at least one of: item name, item volume, item color.
In a second aspect, an embodiment of the present application provides an apparatus for displaying an image, the apparatus including: the face region image acquisition unit is configured to divide the acquired face image to be processed into at least one face region image; a face feature information obtaining unit configured to obtain and display, for a face region image in the at least one face region image, face feature information of the face region image, where the face feature information is used to identify a name of the face region image and color information corresponding to the face region image; and the color replacement image display unit is used for responding to a first selection signal of face characteristic information in at least one piece of face characteristic information corresponding to the at least one face area image and acquiring a color replacement image of the face area image corresponding to the face characteristic information.
In some embodiments, the face feature information acquiring unit includes: and the face characteristic information acquisition subunit is configured to detect color values of a plurality of pixel points on the face region image, and take the average value of the color values as the color information of the face region image.
In some embodiments, the color-replacement-image display unit includes: a color display information display subunit configured to display at least one piece of color display information corresponding to the face feature information, the color display information being used to indicate that a specified color is displayed on the face region image corresponding to the face feature information; and the color replacement image display subunit, in response to detecting a second selection signal corresponding to the color display information in the at least one piece of color display information, is configured to render the face region image corresponding to the face feature information by using the specified color corresponding to the color display information, so as to obtain a color replacement image.
In some embodiments, the above apparatus further comprises: and the scoring unit is used for responding to the detection of the scoring signal and is configured to acquire the current color information of the face region image in the at least one face region image and import the at least one piece of current color information into a scoring model to obtain a color matching score, and the scoring model is used for determining the color matching score according to the matching relation between the color information of the face region image.
In some embodiments, the above apparatus further comprises: an item information display unit configured to display item information corresponding to color display information of the color replacement image in response to detection of the third query signal corresponding to the color replacement image, the item information including at least one of: item name, item volume, item color.
In a third aspect, an embodiment of the present application provides a server, including: one or more processors; a memory having one or more programs stored thereon, which when executed by the one or more processors, cause the one or more processors to perform the method for displaying an image of the first aspect.
In a fourth aspect, the present application provides a computer-readable medium, on which a computer program is stored, where the computer program is executed by a processor to implement the method for displaying an image according to the first aspect.
According to the method and the device for displaying the image, the acquired face image to be processed is divided into at least one face area image; then, acquiring and displaying face feature information of the face region image, wherein the face feature information is used for identifying the name of the face region image and color information corresponding to the face region image; and finally, responding to a first selection signal of face characteristic information in at least one piece of face characteristic information corresponding to the at least one face area image, and displaying a color replacement image of the face area image corresponding to the face characteristic information. According to the technical scheme, the color replacement image is displayed on the face region image, so that the effect of the face region image after color replacement can be conveniently and effectively displayed, and the efficiency of obtaining information by a user is improved.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which one embodiment of the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of a method for displaying an image according to the present application;
FIG. 3 is a schematic illustration of an application scenario of a method for displaying an image according to the present application;
FIG. 4 is a flow diagram of yet another embodiment of a method for displaying an image according to the present application;
FIG. 5 is a schematic diagram of an embodiment of an apparatus for displaying an image according to the present application;
FIG. 6 is a schematic block diagram of a computer system suitable for use in implementing a server according to embodiments of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 illustrates an exemplary system architecture 100 to which the method for displaying an image or the apparatus for displaying an image of the embodiments of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104 to receive or send messages or the like. The terminal devices 101, 102, 103 may have various image processing applications installed thereon, such as an image capture application, a light detection application, an exposure control application, an image brightness adjustment application, an image editing application, an image sending application, and the like.
The terminal apparatuses 101, 102, and 103 may be hardware or software. When the terminal devices 101, 102, 103 are hardware, they may be various electronic devices having a display screen and supporting image display, including but not limited to smart phones, tablet computers, laptop portable computers, desktop computers, and the like. When the terminal apparatuses 101, 102, 103 are software, they can be installed in the electronic apparatuses listed above. It may be implemented as a plurality of software or software modules (for example, for providing distributed services), or as a single software or software module, which is not specifically limited herein.
The server 105 may be a server that provides various services, such as a server that processes a face image to be processed sent from the terminal apparatuses 101, 102, 103. The server can analyze and process the received data such as the face image to be processed and the like, and display the color replacement image according to the detected selection signal. After that, the server 105 may transmit the color replacement image to the terminal devices 101, 102, 103.
It should be noted that the method for displaying an image provided in the embodiment of the present application is generally performed by the server 105, and accordingly, the apparatus for displaying an image is generally disposed in the server 105.
The server may be hardware or software. When the server is hardware, it may be implemented as a distributed server cluster formed by multiple servers, or may be implemented as a single server. When the server is software, it may be implemented as a plurality of software or software modules (for example, to provide distributed services), or may be implemented as a single software or software module, and is not limited specifically herein.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a method for displaying an image in accordance with the present application is shown. The method for displaying an image includes the steps of:
step 201, dividing the acquired face image to be processed into at least one face area image.
In the present embodiment, an execution subject of the method for displaying an image (e.g., the terminal device 101, 102, 103 or the server 105 shown in fig. 1) may receive a face image to be processed, by a wired or wireless connection, from a terminal with which the user performs image acquisition. The face image to be processed is preferably acquired under a normal illumination environment rather than a dim or overly bright environment, so that it reflects the real face condition of the corresponding user as truly as possible. It should be noted that the wireless connection may include, but is not limited to, a 3G/4G connection, a WiFi connection, a Bluetooth connection, a WiMAX connection, a Zigbee connection, a UWB (Ultra Wideband) connection, and other wireless connections now known or developed in the future.
In practice, selecting cosmetics is often a cumbersome process. The user needs to pick, from among many cosmetics, the ones that may suit him or her, and each attempt requires a complex and time-consuming process of applying and removing makeup. This makes selecting cosmetics inefficient for the user.
After the execution subject acquires the face image to be processed, at least one face region image can be divided from it. The face image to be processed generally includes content other than the human face (for example, a jacket, hair, a hat, etc.). The execution subject can first determine the face within the face image to be processed. Then, through a face recognition method or the like, the execution subject can further determine the image positions corresponding to face features such as the nose, the eyebrows and the mouth on the face. Finally, the execution subject divides the face image to be processed into at least one face region image according to the image positions of the nose, the eyebrows, the mouth and the like. Each face region image in the at least one face region image contains the image of one of the above features, such as the nose, an eyebrow or the mouth. For example, a face region image may include only the image corresponding to the left eye.
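For example, the division according to the image positions of the face features may be sketched in Python roughly as follows. This is only a minimal illustration under the assumption that landmark points grouped by feature name are already available from some external detector; the function name, the landmarks dictionary interface and the margin parameter are assumptions of the sketch, not part of the embodiment described above.

    import numpy as np

    def split_into_face_regions(face_image, landmarks, margin=10):
        """Divide a face image into one sub-image per facial feature (nose, eyebrow, mouth, ...).

        face_image: H x W x 3 array (RGB).
        landmarks:  dict mapping a feature name to a list of (x, y) landmark points,
                    assumed to come from any external landmark detector.
        """
        regions = {}
        h, w = face_image.shape[:2]
        for name, points in landmarks.items():
            xs = [x for x, _ in points]
            ys = [y for _, y in points]
            # Padded bounding box around the feature's landmark points.
            x0, x1 = max(min(xs) - margin, 0), min(max(xs) + margin, w)
            y0, y1 = max(min(ys) - margin, 0), min(max(ys) + margin, h)
            regions[name] = face_image[y0:y1, x0:x1]
        return regions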
Step 202, for the face area image in the at least one face area image, obtaining and displaying face feature information of the face area image.
After the face region images are obtained, the execution subject can determine the face feature in each face region image through face recognition or the like, and thereby obtain the name corresponding to the face region image. Then, the color in the face region image is detected to obtain the color information of the face region image. The name and the color information of the face region image are combined to form the face feature information. That is, the face feature information may be used to identify the name of the face region image and the color information corresponding to the face region image. The execution subject can then display the face feature information so that the user can perform further operations according to it.
In some optional implementation manners of this embodiment, the acquiring and displaying the face feature information of the face region image may include: detecting color values of a plurality of pixel points on the face region image, and taking the mean value of the plurality of color values as the color information of the face region image.
The face region image may be a planar image with a certain area. In order to obtain the color information of the region accurately, the execution subject can uniformly set a plurality of pixel points in the face region image as color reference points. The execution subject may then detect the color value of each color reference point (for example, an RGB value) in various ways, calculate the mean of the color values of the color reference points, and take that mean as the color information of the face region image. The color information may be represented by RGB (Red, Green, Blue) values or the like. For example, if the face feature of a certain face region image is the nose, the name of the face feature may be "nose", and the color information may be "light brown" (RGB values may be: 255, 125, 64). The face feature information may then be: {nose; light brown, RGB: 255, 125, 64}. It should be noted that the selection of color reference points may differ for different face region images. For example, when the face feature in the face region image is the nose, the image region corresponding to the nostrils is not suitable as a selection region for color reference points.
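As a rough illustration of the averaging described above, the following Python sketch samples color reference points on a uniform grid and averages their RGB values; the grid size and the function name are assumptions, and the region is assumed to be an H x W x 3 RGB array.

    import numpy as np

    def region_color_info(region, grid=16):
        """Average the RGB values of uniformly placed color reference points in the region."""
        h, w = region.shape[:2]
        ys = np.linspace(0, h - 1, grid, dtype=int)
        xs = np.linspace(0, w - 1, grid, dtype=int)
        gy, gx = np.meshgrid(ys, xs, indexing="ij")
        samples = region[gy, gx].reshape(-1, 3).astype(np.float64)  # grid*grid reference points
        mean_rgb = samples.mean(axis=0)
        return tuple(int(round(c)) for c in mean_rgb)  # e.g. (255, 125, 64) -> "light brown"

In practice, a mask could additionally exclude areas such as the nostrils from the reference points, as noted above.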
Step 203, in response to detecting a first selection signal of face feature information in at least one piece of face feature information corresponding to the at least one face region image, acquiring a color replacement image of the face region image corresponding to the face feature information.
Each face region image has corresponding face feature information; that is, there are as many pieces of face feature information as there are face region images. After the face feature information is displayed, the user can select a piece of face feature information according to his or her needs. The execution subject can then detect the first selection signal for that face feature information. The first selection signal may be a signal detected when the user clicks the position of the screen where the face feature information is displayed. At this time, the execution subject may display the color replacement image of the face region image corresponding to the face feature information.
In some optional implementation manners of this embodiment, the obtaining of the color replacement image of the face region image corresponding to the face feature information may include the following steps:
the first step is to display at least one piece of color display information corresponding to the face feature information.
In order to provide the user with a plurality of selectable colors, at least one piece of color display information corresponding to the face feature information may be displayed after the first selection signal is detected. The color display information may be used to indicate that a specified color is displayed on the face region image corresponding to the face feature information. The specified color may be the color of an existing item corresponding to the face feature. The color display information may be an image presenting the specified color together with a text description, where the text description explains the specified color. For example, if the face feature is the mouth, the color display information may include an image of a lipstick color and a name describing that lipstick color.
And secondly, in response to the detection of a second selection signal corresponding to the color display information in the at least one piece of color display information, rendering the face area image corresponding to the face feature information by using the specified color corresponding to the color display information to obtain a color replacement image.
After the at least one piece of color display information is displayed, the user can select one piece of it by ticking, clicking or the like. At this time, the execution subject can detect the second selection signal corresponding to the selected piece of color display information. The execution subject can then render the face region image corresponding to the face feature information with the specified color corresponding to that color display information, to obtain a color replacement image. Continuing the example above, the execution subject displays a plurality of pieces of color display information corresponding to the mouth, and the user selects one of them. The execution subject may then render the face region image of the mouth with the specified color corresponding to the color display information selected by the user, thereby obtaining a color replacement image of the mouth.
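One possible way to render the specified color onto the face region image is a simple alpha blend, sketched below in Python; the blending weight, the optional mask and the function name are assumptions of the sketch rather than the rendering method of the embodiment.

    import numpy as np

    def render_color_replacement(region, specified_rgb, mask=None, alpha=0.5):
        """Blend the specified color over the face region image to get a color replacement image.

        mask: optional H x W boolean array restricting the blend (e.g. to the lip pixels only).
        """
        color_layer = np.empty_like(region, dtype=np.float64)
        color_layer[:, :] = specified_rgb                 # broadcast the RGB triple over the region
        blended = alpha * color_layer + (1.0 - alpha) * region.astype(np.float64)
        blended = blended.round().astype(np.uint8)
        if mask is None:
            return blended
        return np.where(mask[:, :, None], blended, region)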
In some optional implementation manners of this embodiment, the method of this embodiment may further include: in response to detecting a scoring signal, acquiring current color information of the face region images in the at least one face region image, and importing the at least one piece of current color information into a scoring model to obtain a color matching score.
The application may also display a score button. When the user wants to know the effect of the current colors of the face region images, he or she can check the color matching effect of the face regions at that moment by clicking the designated score button or the like. The execution subject may import the current color information of each face region image into the scoring model to obtain a color matching score. The scoring model can be used to determine the color matching score according to the matching relationship between the color information of the face region images. The execution subject may use a variety of existing scoring models to derive the color matching score. Alternatively, the execution subject may obtain sample color information of a plurality of face region images and sample color matching scores corresponding to that sample color information, and input the sample color information and the sample color matching scores into an intelligent model to train the scoring model. The details are not repeated herein.
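Since no particular scoring model is prescribed, the following Python sketch is only a toy stand-in under assumed data shapes: a least-squares regressor fitted from sample color information (the concatenated per-region RGB values) and sample color matching scores, which then scores the current color information.

    import numpy as np

    class ColorMatchScorer:
        """Toy scoring model: linear least squares from concatenated per-region RGB values
        (e.g. nose + eyebrow + mouth -> 9 features) to a color matching score in [0, 100]."""

        def fit(self, sample_colors, sample_scores):
            X = np.hstack([np.asarray(sample_colors, dtype=np.float64) / 255.0,
                           np.ones((len(sample_colors), 1))])       # add a bias column
            self.w, *_ = np.linalg.lstsq(X, np.asarray(sample_scores, dtype=np.float64), rcond=None)
            return self

        def score(self, current_colors):
            x = np.append(np.asarray(current_colors, dtype=np.float64) / 255.0, 1.0)
            return float(np.clip(x @ self.w, 0.0, 100.0))

A call such as ColorMatchScorer().fit(sample_colors, sample_scores).score(current_colors) would then produce the color matching score returned to the user.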
Further, the execution subject may also obtain recommended color information according to the color information of the face region images. Specifically, when determining the color matching score, the execution subject may take the color information of one face region image as a basis, query the color information of the other face region images, and then give recommended color information for the other face region images.
With continued reference to fig. 3, fig. 3 is a schematic diagram of an application scenario of the method for displaying an image according to the present embodiment. In the application scenario of fig. 3, the user captures a face image of himself or herself through the terminal device 102 and sends it to the server 105 through the network 104. The server 105 processes the face image to be processed to obtain at least one corresponding face region image. Then, the server 105 acquires the face feature information of each face region image: {nose; light brown, RGB: 255, 125, 64}; {eyebrow; black, RGB: 0, 5, 0}; and so on. When the user selects the face feature information corresponding to the nose ({nose; light brown, RGB: 255, 125, 64}), the execution subject acquires the color replacement image of the face region image corresponding to the nose.
The method provided by the embodiment of the application comprises the steps of firstly dividing an acquired face image to be processed into at least one face area image; then, acquiring and displaying face feature information of the face region image, wherein the face feature information is used for identifying the name of the face region image and color information corresponding to the face region image; and finally, responding to a first selection signal of face characteristic information in at least one piece of face characteristic information corresponding to the at least one face area image, and displaying a color replacement image of the face area image corresponding to the face characteristic information. According to the technical scheme, the color replacement image is displayed on the face region image, so that the effect of the face region image after color replacement can be conveniently and effectively displayed, and the efficiency of obtaining information by a user is improved.
With further reference to FIG. 4, a flow 400 of yet another embodiment of a method for displaying an image is shown. The flow 400 of the method for displaying an image comprises the steps of:
step 401, dividing the acquired face image to be processed into at least one face region image.
In the present embodiment, an execution subject of the method for displaying an image (e.g., the terminal device 101, 102, 103 or the server 105 shown in fig. 1) may receive a face image to be processed from a terminal with which a user performs image acquisition by a wired connection manner or a wireless connection manner. The content of step 401 is the same as that of step 201, and is not described in detail here.
Step 402, for the face area image in the at least one face area image, obtaining and displaying face feature information of the face area image.
The content of step 402 is the same as that of step 202, and is not described in detail here.
Step 403, in response to detecting a first selection signal of face feature information in at least one piece of face feature information corresponding to the at least one face region image, acquiring a color replacement image of the face region image corresponding to the face feature information.
The content of step 403 is the same as that of step 203, and is not described in detail here.
In response to detecting the third query signal corresponding to the color replacement image, item information corresponding to the color display information of the color replacement image is displayed, step 404.
After displaying the color replacement image, the execution subject may display a confirmation button. When the user is satisfied with the current color information or the overall display effect of each face region image, a third query signal can be sent by clicking the confirmation button or the like. At this time, the execution subject can display the item information corresponding to the color display information of the color replacement image. Specifically, the execution subject can query the color display information of the color replacement image, query the specified color corresponding to that color display information, and further query the item information corresponding to the specified color. The item information includes at least one of: item name, item volume, item color.
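The query chain from the color replacement image to the item information can be illustrated with a simple lookup keyed on the specified color, as in the Python sketch below; the catalogue entries and names are invented placeholders, not real item data.

    # Hypothetical catalogue: specified color of the color display information -> item information.
    ITEM_CATALOGUE = {
        "coral red":   {"item_name": "Lipstick No. 12",  "item_volume": "3.5 g", "item_color": "coral red"},
        "light brown": {"item_name": "Foundation No. 3", "item_volume": "30 ml", "item_color": "light brown"},
    }

    def item_info_for(specified_color):
        """Return the item information corresponding to a color replacement image's specified color."""
        return ITEM_CATALOGUE.get(specified_color)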
With further reference to fig. 5, as an implementation of the methods shown in the above figures, the present application provides an embodiment of an apparatus for displaying an image, which corresponds to the method embodiment shown in fig. 2, and which is particularly applicable in various electronic devices.
As shown in fig. 5, the apparatus 500 for displaying an image of the present embodiment may include: a face region image acquisition unit 501, a face feature information acquisition unit 502, and a color replacement image display unit 503. The face region image acquiring unit 501 is configured to divide the acquired face image to be processed into at least one face region image; a face feature information obtaining unit 502, configured to obtain and display, for a face region image in the at least one face region image, face feature information of the face region image, where the face feature information is used to identify a name of the face region image and color information corresponding to the face region image; the color replacement image display unit 503 is configured to acquire a color replacement image of a face region image corresponding to the face feature information in response to detecting a first selection signal of the face feature information in at least one piece of face feature information corresponding to the at least one face region image.
In some optional implementations of this embodiment, the face feature information obtaining unit 502 may include: and a face feature information obtaining subunit (not shown in the figure) configured to detect color values of a plurality of pixel points on the face region image, and use an average value of the plurality of color values as color information of the face region image.
In some optional implementations of this embodiment, the color replacement image display unit 503 may include: a color display information display sub-unit (not shown in the figure) and a color replacement image display sub-unit (not shown in the figure). Wherein, the color display information display subunit is configured to display at least one piece of color display information corresponding to the face feature information, the color display information being used for indicating that a specified color is displayed on the face region image corresponding to the face feature information; and the color replacement image display subunit, in response to detecting a second selection signal corresponding to the color display information in the at least one piece of color display information, is configured to render the face region image corresponding to the face feature information by using the specified color corresponding to the color display information, so as to obtain a color replacement image.
In some optional implementations of the present embodiment, the apparatus 500 for displaying an image may further include: a scoring unit (not shown in the figure), configured to, in response to detecting the scoring signal, acquire current color information of the face region images in the at least one face region image and import the at least one piece of current color information into a scoring model to obtain a color matching score, the scoring model being used to determine the color matching score according to the matching relationship between the color information of the face region images.
In some optional implementations of the present embodiment, the apparatus 500 for displaying an image may further include: an item information display unit (not shown in the figure), configured to, in response to detecting the third query signal corresponding to the color replacement image, display item information corresponding to the color display information of the color replacement image, the item information including at least one of: item name, item volume, item color.
The present embodiment further provides a server, including: one or more processors; a memory having one or more programs stored thereon, which when executed by the one or more processors, cause the one or more processors to perform the above-described method for displaying an image.
The present embodiment also provides a computer-readable medium, on which a computer program is stored which, when being executed by a processor, carries out the above-mentioned method for displaying an image.
Referring now to FIG. 6, a block diagram of a computer system 600 suitable for use in implementing a server (e.g., server 105 of FIG. 1) of an embodiment of the present application is shown. The server shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 6, the computer system 600 includes a Central Processing Unit (CPU)601 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM)602 or a program loaded from a storage section 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the system 600 are also stored. The CPU 601, ROM 602, and RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
The following components are connected to the I/O interface 605: an input portion 606 including a keyboard, a mouse, and the like; an output portion 607 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN card, a modem, or the like. The communication section 609 performs communication processing via a network such as the internet. The driver 610 is also connected to the I/O interface 605 as needed. A removable medium 611 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 610 as necessary, so that a computer program read out therefrom is mounted in the storage section 608 as necessary.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 609, and/or installed from the removable medium 611. The computer program performs the above-described functions defined in the method of the present application when executed by a Central Processing Unit (CPU) 601.
It should be noted that the computer readable medium mentioned above in the present application may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor, and may be described as: a processor includes a face region image acquisition unit, a face feature information acquisition unit, and a color replacement image display unit. Here, the names of these units do not constitute a limitation to the unit itself in some cases, and for example, the color-alternative-image display unit may also be described as a "unit that displays a color alternative image corresponding to the face region image".
As another aspect, the present application also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments; or may be present separately and not assembled into the device. The computer readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: dividing the acquired face image to be processed into at least one face area image; for the face area image in the at least one face area image, acquiring and displaying face feature information of the face area image, wherein the face feature information is used for identifying the name of the face area image and color information corresponding to the face area image; and acquiring a color replacement image of the face region image corresponding to the face feature information in response to detecting a first selection signal of the face feature information in at least one piece of face feature information corresponding to the at least one face region image.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (12)

1. A method for displaying an image, comprising:
dividing the acquired face image to be processed into at least one face area image;
for the face area image in the at least one face area image, acquiring and displaying face feature information of the face area image, wherein the face feature information is used for identifying the name of the face area image and color information corresponding to the face area image;
and in response to detecting a first selection signal of face characteristic information in at least one piece of face characteristic information corresponding to the at least one face area image, acquiring a color replacement image of the face area image corresponding to the face characteristic information.
2. The method of claim 1, wherein the acquiring and displaying the face feature information of the face region image comprises:
and detecting color values of a plurality of pixel points on the face region image, and taking the mean value of the plurality of color values as the color information of the face region image.
3. The method of claim 1, wherein the obtaining of the color replacement image of the face region image corresponding to the face feature information comprises:
displaying at least one piece of color display information corresponding to the face feature information, wherein the color display information is used for indicating that a specified color is displayed on a face region image corresponding to the face feature information;
and in response to the detection of a second selection signal corresponding to the color display information in the at least one piece of color display information, rendering the face area image corresponding to the face feature information through the specified color corresponding to the color display information to obtain a color replacement image.
4. The method of claim 1, wherein the method further comprises:
and in response to the detection of the grading signal, acquiring current color information of the face region image in the at least one face region image, importing the at least one piece of current color information into a grading model to obtain a color matching score, wherein the grading model is used for determining the color matching score according to the matching relationship between the color information of the face region image.
5. The method of any of claims 1 to 4, wherein the method further comprises:
in response to detecting the third query signal corresponding to the color replacement image, displaying item information corresponding to color display information of the color replacement image, the item information including at least one of: item name, item volume, item color.
6. An apparatus for displaying an image, comprising:
the face region image acquisition unit is configured to divide the acquired face image to be processed into at least one face region image;
a face feature information obtaining unit configured to obtain and display, for a face region image in the at least one face region image, face feature information of the face region image, where the face feature information is used to identify a name of the face region image and color information corresponding to the face region image;
and the color replacement image display unit is used for responding to a first selection signal of face characteristic information in at least one piece of face characteristic information corresponding to the at least one face area image and acquiring a color replacement image of the face area image corresponding to the face characteristic information.
7. The apparatus according to claim 6, wherein the face feature information acquisition unit includes:
and the face characteristic information acquisition subunit is configured to detect color values of a plurality of pixel points on the face region image, and take the average value of the color values as the color information of the face region image.
8. The apparatus of claim 6, wherein the color replacement image display unit comprises:
a color display information display subunit configured to display at least one piece of color display information corresponding to the face feature information, the color display information being used to indicate that a specified color is displayed on the face region image corresponding to the face feature information;
and the color replacement image display subunit is used for responding to the detection of a second selection signal corresponding to the color display information in the at least one piece of color display information and rendering the face area image corresponding to the face feature information by the specified color corresponding to the color display information to obtain a color replacement image.
9. The apparatus of claim 6, wherein the apparatus further comprises:
and the scoring unit is used for responding to the detection of the scoring signal and is configured to acquire the current color information of the face region image in the at least one face region image and import the at least one piece of current color information into a scoring model to obtain a color matching score, and the scoring model is used for determining the color matching score according to the matching relation between the color information of the face region image.
10. The apparatus of any of claims 6 to 9, wherein the apparatus further comprises:
an item information display unit configured to display item information corresponding to color display information of the color replacement image in response to detection of a third query signal corresponding to the color replacement image, the item information including at least one of: item name, item volume, item color.
11. A server, comprising:
one or more processors;
a memory having one or more programs stored thereon,
the one or more programs, when executed by the one or more processors, cause the one or more processors to perform the method of any of claims 1-5.
12. A computer-readable medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 5.
CN201811458458.6A 2018-11-30 2018-11-30 Method and device for displaying image Active CN111259696B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811458458.6A CN111259696B (en) 2018-11-30 2018-11-30 Method and device for displaying image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811458458.6A CN111259696B (en) 2018-11-30 2018-11-30 Method and device for displaying image

Publications (2)

Publication Number Publication Date
CN111259696A true CN111259696A (en) 2020-06-09
CN111259696B CN111259696B (en) 2023-08-29

Family

ID=70953648

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811458458.6A Active CN111259696B (en) 2018-11-30 2018-11-30 Method and device for displaying image

Country Status (1)

Country Link
CN (1) CN111259696B (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150248775A1 (en) * 2012-10-03 2015-09-03 Holition Limited Image processing
CN104380339A (en) * 2013-04-08 2015-02-25 松下电器(美国)知识产权公司 Image processing device, image processing method, and program, capable of virtual reproduction of makeup application state
US20150254500A1 (en) * 2013-08-30 2015-09-10 Panasonic Intellectual Property Management Co., Ltd. Makeup supporting device, makeup supporting method, and non-transitory computer-readable recording medium
KR101720016B1 (en) * 2015-10-16 2017-03-27 광운대학교 산학협력단 A clothing fitting system with a mirror display by using salient points and the method thereof
US20180005448A1 (en) * 2016-06-30 2018-01-04 Fittingbox Method of hiding an object in an image or video and associated augmented reality process
CN106682632A (en) * 2016-12-30 2017-05-17 百度在线网络技术(北京)有限公司 Method and device for processing face images
CN108804975A (en) * 2017-04-27 2018-11-13 丽宝大数据股份有限公司 Lip gloss guidance device and method
CN107220960A (en) * 2017-05-27 2017-09-29 无限极(中国)有限公司 One kind examination cosmetic method, system and equipment
CN108053365A (en) * 2017-12-29 2018-05-18 百度在线网络技术(北京)有限公司 For generating the method and apparatus of information

Also Published As

Publication number Publication date
CN111259696B (en) 2023-08-29

Similar Documents

Publication Publication Date Title
CN107909065B (en) Method and device for detecting face occlusion
CN107622240B (en) Face detection method and device
CN107679466B (en) Information output method and device
CN107729929B (en) Method and device for acquiring information
WO2019100282A1 (en) Face skin color recognition method, device and intelligent terminal
CN108280413B (en) Face recognition method and device
CN109344752B (en) Method and apparatus for processing mouth image
CN109472264B (en) Method and apparatus for generating an object detection model
US20160104303A1 (en) Image-based color palette generation
WO2020056901A1 (en) Method and device for processing image
CN110298850B (en) Segmentation method and device for fundus image
CN109101309B (en) Method and apparatus for updating user interface
CN108268200A (en) Image processing method and device, electronic equipment, computer program and storage medium
CN108038473B (en) Method and apparatus for outputting information
CN108462832A (en) Method and device for obtaining image
CN109241930B (en) Method and apparatus for processing eyebrow image
CN107247657B (en) Method, device and system for displaying webpage coordinate click rate
CN108197563B (en) Method and device for acquiring information
CN109523564B (en) Method and apparatus for processing image
WO2023207381A1 (en) Image processing method and apparatus, and electronic device and storage medium
CN111259696B (en) Method and device for displaying image
CN109961060B (en) Method and apparatus for generating crowd density information
US20200065631A1 (en) Produce Assessment System
CN108121969B (en) Method and apparatus for processing image
US20210217074A1 (en) Systems and methods for providing a style recommendation

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant