WO2018032674A1 - Color gamut mapping method and apparatus - Google Patents

Color gamut mapping method and apparatus Download PDF

Info

Publication number
WO2018032674A1
WO2018032674A1 PCT/CN2016/110861
Authority
WO
WIPO (PCT)
Prior art keywords
layer
color gamut
gamut
display
mapping
Prior art date
Application number
PCT/CN2016/110861
Other languages
English (en)
French (fr)
Inventor
李国盛
杨冬东
杨晓星
Original Assignee
北京小米移动软件有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京小米移动软件有限公司 filed Critical 北京小米移动软件有限公司
Priority to RU2017121651A priority Critical patent/RU2671763C1/ru
Priority to JP2017527795A priority patent/JP6564859B2/ja
Priority to KR1020187031499A priority patent/KR102189189B1/ko
Publication of WO2018032674A1 publication Critical patent/WO2018032674A1/zh

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46Colour picture communication systems
    • H04N1/56Processing of colour picture signals
    • H04N1/60Colour correction or control
    • H04N1/6058Reduction of colour to a range of reproducible colours, e.g. to ink-reproducible colour gamut
    • H04N1/6063Reduction of colour to a range of reproducible colours, e.g. to ink-reproducible colour gamut dependent on the contents of the image to be reproduced
    • H04N1/6069Reduction of colour to a range of reproducible colours, e.g. to ink-reproducible colour gamut dependent on the contents of the image to be reproduced spatially varying within the image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/002D [Two Dimensional] image generation
    • G06T11/001Texturing; Colouring; Generation of texture or colour
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/02Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
    • G09G5/022Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed using memory planes
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour 
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/02Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
    • G09G5/026Control of mixing and/or overlay of colours in general
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F13/00Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F13/10Program control for peripheral devices
    • G06F13/12Program control for peripheral devices using hardware independent of the central processor, e.g. channel or peripheral processor
    • G06F13/124Program control for peripheral devices using hardware independent of the central processor, e.g. channel or peripheral processor where hardware is a sequential transfer control unit, e.g. microprocessor, peripheral processor or state-machine
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/24Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20016Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; Pyramid transform
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/02Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
    • G09G5/04Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed using circuits for interfacing with colour displays

Definitions

  • the present invention relates to the field of display technologies, and in particular, to a color gamut mapping method and apparatus.
  • Color gamut refers to a method of encoding colors, and also refers to the sum of the colors that a display system can produce, for example sRGB (standard Red Green Blue) color gamut encoding and NTSC (National Television Standards Committee) color gamut encoding.
  • In the related art, when the user wishes to display more accurate colors, the terminal maps the display image to the sRGB color gamut for display; when the user wishes to display more vivid colors, the terminal maps the display image to the NTSC color gamut for display.
  • the present invention provides a gamut mapping method and apparatus.
  • the technical solution is as follows:
  • a gamut mapping method is provided, the method comprising: obtaining a color gamut type tag of a layer, the color gamut type tag being a tag added when the layer is generated; determining the color gamut corresponding to the layer according to the color gamut type tag; mapping the layer to the corresponding color gamut; and superimposing at least one gamut-mapped layer to form a display image and outputting it.
  • optionally, mapping each layer to its corresponding color gamut includes: obtaining the superposition order of the layers; marking the effective display area of the layer according to the superposition order; and mapping the effective display area in the layer to the corresponding color gamut.
  • optionally, superimposing at least one gamut-mapped layer to form a display image and outputting it includes: superimposing the layers according to the superposition order to obtain the display image, and outputting the display image to a display screen for display.
  • optionally, obtaining the color gamut type tag of the layer includes: obtaining a layer generated by an application and the color gamut type tag corresponding to the layer, the color gamut type tag being a tag added by the application when generating the layer.
  • optionally, the method further includes: detecting whether the open condition of the automatic gamut mapping function is satisfied; and if the open condition is satisfied, performing the step of obtaining the color gamut type tag of the layer.
  • a gamut mapping apparatus comprising:
  • the obtaining module is configured to obtain a color gamut type label of the layer, and the color gamut type label is a label added when the layer is generated;
  • An identification module configured to determine a color gamut corresponding to the layer according to the color gamut type label
  • mapping module configured to map a layer to a corresponding color gamut
  • the display module is configured to superimpose at least one gamut mapped layer to form a display image and output.
  • the mapping module includes:
  • an obtaining submodule configured to obtain the superposition order of the layers;
  • a marking submodule configured to mark the effective display area of the layer according to the superposition order;
  • a mapping submodule configured to map the effective display area in the layer to the corresponding color gamut.
  • the display module is configured to superimpose the layers according to the superimposed order to obtain a display image; and output the display image to the display screen for display.
  • the obtaining module is configured to acquire a layer generated by the application and a gamut type label corresponding to the layer, where the gamut type label is a label added by the application when generating the layer.
  • the apparatus further includes:
  • the detecting module is configured to detect whether an open condition of the automatic mapping color gamut function is satisfied
  • the obtaining module is configured to perform the step of acquiring a color gamut type label of the layer when the open condition is satisfied.
  • a gamut mapping apparatus comprising:
  • a memory for storing processor executable instructions
  • processor is configured to:
  • the at least one gamut mapped layer is superimposed to form a display image and output.
  • the color gamut type label of the layer is obtained, the color gamut corresponding to the layer is determined according to the color gamut type label, the layer is mapped to the corresponding color gamut, and the at least one gamut mapped layer is superimposed to form a display image and output;
  • the invention solves the problem that some layers in the display image have a poor display effect when every frame of the display image in the terminal is displayed using the same color gamut; different layers in each display image in the terminal are mapped to different color gamuts according to their color gamut types, so that each layer has a better display effect. For example, natural layers get more accurate colors and UI layers get more vivid colors, thereby improving the display effect of the terminal as a whole.
  • FIG. 1 is a flowchart of a gamut mapping method according to an exemplary embodiment
  • FIG. 2 is a schematic structural diagram of a display image according to an exemplary embodiment
  • FIG. 3 is a flowchart of a gamut mapping method according to another exemplary embodiment
  • FIG. 4 is a flowchart of a gamut mapping method according to another exemplary embodiment
  • FIG. 5 is a block diagram of a color gamut mapping apparatus according to an exemplary embodiment
  • FIG. 6 is a block diagram of a color gamut mapping apparatus, according to another exemplary embodiment.
  • the lock screen interface of the smartphone includes: a status bar layer, a wallpaper layer, and a desktop icon layer.
  • the status bar layer and the desktop icon layer belong to the user interface (UI) layer, and the UI layer is a manually designed layer.
  • Unreasonable gamut mapping will reduce the aesthetics of the UI layer.
  • the present invention provides the following exemplary embodiments.
  • FIG. 1 is a flowchart of a gamut mapping method, according to an exemplary embodiment. This embodiment is exemplified by applying the method to a terminal having image processing capability. The method includes:
  • step 102 a color gamut type tag of the layer is obtained, and the color gamut type tag is a tag added when the layer is generated.
  • step 104 the color gamut corresponding to the layer is determined according to the color gamut type label.
  • step 106 the layer is mapped to the corresponding color gamut.
  • step 108 the at least one gamut mapped layer is superimposed to form a display image and output.
  • In summary, in the gamut mapping method provided in this embodiment, the color gamut type tag of a layer is obtained, the color gamut corresponding to the layer is determined according to the color gamut type tag, the layer is mapped to the corresponding color gamut, and at least one gamut-mapped layer is superimposed to form a display image that is output. This solves the problem that some layers in the display image have a poor display effect when every frame of the display image in the terminal is displayed using the same color gamut; different layers in each display image are mapped to different color gamuts according to their color gamut types, so that each layer has a better display effect. For example, natural layers get more accurate colors and UI layers get more vivid colors, thereby improving the display effect of the terminal as a whole.
  • FIG. 2 is a block diagram showing a frame display image 10 according to an exemplary embodiment.
  • the display image 10 is a mobile home page.
  • the mobile home page includes three layers: a status bar layer 12, a desktop icon layer 14, and a wallpaper layer 16.
  • the status bar layer 12 is located at the uppermost layer
  • the desktop icon layer 14 is located at the middle layer
  • the wallpaper layer 16 is located at the lowermost layer.
  • when an upper layer is an opaque layer, the upper layer has the ability to cover the layer below it.
  • the order of superposition between layers is determined by the z-order value of the layer.
  • Z-order refers to the hierarchical relationship between layers (also called display objects).
  • the layer corresponding to the higher z-order value is placed on top of the layer corresponding to the lower z-order value.
  • the display image 10 is synthesized from the above three layers.
  • the source of each layer may be the same or different.
  • from a software perspective, the sources of the layers include desktop applications, status applications, wallpaper applications, third-party applications, and other applications; the layers from the applications are composited into the final display image by the image compositing program in the operating system. Optionally, in the Android system the SurfaceFlinger layer is responsible for compositing the layers.
  • from a hardware perspective, the sources of the layers include at least one of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and a video decoding chip. These layers are composited in the AP (Application Processor) and then output to the display screen for display.
  • the layer types of each layer include: natural layers and UI layers.
  • a natural layer is a layer produced from a naturally existing object, or a layer that simulates a naturally existing object.
  • Common natural layers include: layers captured by the camera, layer frames obtained by decoding the video, and layers in the simulated world rendered in real time according to the game rendering engine.
  • a UI layer is a layer used for human-computer interaction. Typically, UI layers are designed by hand. Different color gamuts are suitable for different layer types. For example, natural layers are suitable for NTSC color gamut, and UI layers are suitable for sRGB color gamut.
  • FIG. 3 is a flowchart of a gamut mapping method, according to an exemplary embodiment. This embodiment is exemplified by applying the method to a terminal. The method includes:
  • step 301 it is detected whether an open condition of the automatic mapping color gamut function is satisfied
  • the open conditions of the automatic gamut mapping function include, but are not limited to, at least one of the following: a specified application is started (for example, a player program is started), or the setting item of the automatic gamut mapping function has been set to enabled by the user.
  • if the open condition is met, step 302 is entered; if the open condition is not met, no processing is performed, and the color gamut of the screen itself, that is, the color gamut currently used by the operating system, is used.
  • step 302 if the open condition is met, the layer generated by the application and the gamut type label corresponding to the layer are acquired, and the gamut type label is a label added by the application when generating the layer.
  • Each application generates a layer during normal operation.
  • the desktop application generates an icon layer
  • the wallpaper application generates a wallpaper layer
  • the status bar application generates a status bar layer.
  • each application also generates a gamut type tag corresponding to the layer according to the layer content of the layer.
  • For example, the icon layer includes the icons of multiple applications; since the icons are manually designed UI icons, the desktop application generates the gamut type tag "Tag1" corresponding to the icon layer. As another example, the wallpaper layer is a photograph of a natural landscape, so the wallpaper application generates the gamut type tag "Tag2" corresponding to the wallpaper layer. As yet another example, the status bar layer is a manually designed UI layer, so the status bar application generates the gamut type tag "Tag1" corresponding to the status bar layer.
  • An image compositor or image compositing program in the terminal acquires the layer generated by each application and the gamut type tag corresponding to the layer.
  • step 303 the color gamut corresponding to the layer is determined according to the color gamut type label.
  • step 304 the layers are mapped to corresponding color gamuts.
  • the terminal maps each layer to its respective color gamut.
  • for example, if the color gamut corresponding to the layer is the sRGB color gamut, the terminal maps the layer to the sRGB color gamut; if the color gamut corresponding to the layer is the NTSC color gamut, the terminal maps the layer to the NTSC color gamut.
  • step 305 the stacking order of the layers is obtained
  • the terminal also obtains the z-order value of each layer, and determines the stacking order of each layer according to the z-order value of each layer. Typically, the layer corresponding to the higher z-order value is placed on top of the layer corresponding to the lower z-order value.
  • step 306 the layers are superimposed according to the superimposed order to obtain a display image
  • the terminal superimposes each layer according to the superimposing order to obtain a frame display image.
  • Each display image is obtained by superimposing at least one layer.
  • step 307 the display image is output to the display screen for display.
  • the terminal outputs the display image to the display for display.
  • In summary, in the gamut mapping method provided in this embodiment, the color gamut type tag of a layer is obtained, the color gamut corresponding to the layer is determined according to the color gamut type tag, the layer is mapped to the corresponding color gamut, and at least one gamut-mapped layer is superimposed to form a display image that is output. This solves the problem that some layers in the display image have a poor display effect when every frame of the display image in the terminal is displayed using the same color gamut; different layers in each display image are mapped to different color gamuts according to their color gamut types, so that each layer has a better display effect. For example, natural layers get more accurate colors and UI layers get more vivid colors, thereby improving the display effect of the terminal as a whole.
  • one layer may all be a valid display area, or one layer may include a valid display area and an invalid display area.
  • the effective display area is the area that will eventually appear in the display image
  • the invalid display area is the area that will not appear in the display image.
  • FIG. 4 is a flowchart of a gamut mapping method, according to an exemplary embodiment. This embodiment is exemplified by applying the method to a terminal. The method includes:
  • step 401 it is detected whether the open condition of the automatic mapping color gamut function is satisfied;
  • the open conditions of the automatic gamut mapping function include, but are not limited to, at least one of the following: a specified application is started (for example, a player program is started), or the setting item of the automatic gamut mapping function has been set to enabled by the user.
  • if the open condition is met, step 402 is entered; if the open condition is not met, no processing is performed, and the color gamut of the screen itself, that is, the color gamut currently used by the operating system, is used.
  • step 402 if the open condition is met, the layer generated by the application and the gamut type label corresponding to the layer are acquired, and the gamut type label is a label added by the application when generating the layer.
  • Each application generates a layer during normal operation.
  • the desktop application generates an icon layer
  • the wallpaper application generates a wallpaper layer
  • the status bar application generates a status bar layer.
  • each application also generates a gamut type tag corresponding to the layer according to the layer content of the layer.
  • For example, the icon layer includes the icons of multiple applications; since the icons are manually designed UI icons, the desktop application generates the gamut type tag "Tag1" corresponding to the icon layer. As another example, the wallpaper layer is a photograph of a natural landscape, so the wallpaper application generates the gamut type tag "Tag2" corresponding to the wallpaper layer. As yet another example, the status bar layer is a manually designed UI layer, so the status bar application generates the gamut type tag "Tag1" corresponding to the status bar layer.
  • An image synthesizer or an image compositing program in the terminal acquires a layer generated by each application and a gamut type tag corresponding to the layer.
  • step 403 the color gamut corresponding to the layer is determined according to the color gamut type label.
  • step 404 the stacking order of the layers is obtained
  • the terminal also obtains the z-order value of each layer, and determines the stacking order of each layer according to the z-order value of each layer. Typically, the layer corresponding to the higher z-order value is placed on top of the layer corresponding to the lower z-order value.
  • step 405 the effective display area of the layer is marked according to the superimposition order
  • the effective display area of each layer refers to an area in the current layer that is not blocked by the layer located in the upper layer.
  • the effective display area of each layer refers to the area displayed in the display image.
  • Each layer is superimposed according to the z-order value.
  • when an upper-layer pixel in an upper layer is an opaque pixel, the lower-layer pixel at the same position in a lower layer is blocked and will not be displayed in the final display image.
  • the terminal marks the effective display area of each layer according to the superimposed order of the respective layers.
  • optionally, when the layer is the uppermost layer, the entire area of the layer is marked as the effective display area; when the layer is not the uppermost layer, at least one upper layer located above the layer is obtained, the pixel positions in the layer that correspond to opaque pixels in the upper layers are marked to obtain a first pixel set, and a second pixel set in the layer other than the first pixel set is determined as the effective display area.
  • step 406 the effective display area in the layer is mapped to the corresponding color gamut.
  • the terminal maps the effective display area of each layer to its corresponding color gamut. For example, if the color gamut corresponding to the layer is the sRGB color gamut, the terminal maps the effective display area of the layer to the sRGB color gamut; if the color gamut corresponding to the layer is the NTSC color gamut, the terminal maps the effective display area of the layer to the NTSC color gamut.
  • it should be noted that step 404 and step 405 are optional steps; when the effective display area of each layer is not determined, the entire display area of the layer is directly mapped to the color gamut corresponding to the layer in step 406.
  • step 407 the layers are superimposed according to the superimposed order to obtain a display image
  • the terminal superimposes each layer according to the superimposing order to obtain a frame display image.
  • Each display image is obtained by superimposing at least one layer.
  • step 408 the display image is output to the display for display.
  • the terminal outputs the display image to the display for display.
  • In summary, in the gamut mapping method provided in this embodiment, the color gamut type tag of a layer is obtained, the color gamut corresponding to the layer is determined according to the color gamut type tag, the layer is mapped to the corresponding color gamut, and at least one gamut-mapped layer is superimposed to form a display image that is output. This solves the problem that some layers in the display image have a poor display effect when every frame of the display image in the terminal is displayed using the same color gamut; different layers in each display image are mapped to different color gamuts according to their color gamut types, so that each layer has a better display effect. For example, natural layers get more accurate colors and UI layers get more vivid colors, thereby improving the display effect of the terminal as a whole.
  • the gamut mapping method provided in this embodiment can also reduce the calculation amount of the terminal and improve the calculation speed of the terminal in the gamut mapping by mapping only the effective display area of each layer.
  • optionally, when performing gamut mapping, the terminal only needs to perform gamut mapping on the effective display area in each layer.
  • FIG. 5 is a block diagram showing the structure of a gamut mapping apparatus according to an exemplary embodiment.
  • the gamut mapping device can be implemented as a whole or a part of a terminal having image processing capability by a dedicated hardware circuit, and/or a combination of hardware and software.
  • the device includes:
  • the obtaining module 520 is configured to acquire a color gamut type label of the layer, where the color gamut type label is a label added when the layer is generated;
  • the identification module 540 is configured to determine a color gamut corresponding to the layer according to the gamut type label
  • Mapping module 560 configured to map a layer to a corresponding color gamut
  • the display module 580 is configured to superimpose at least one gamut mapped layer to form a display image and output the image.
  • the mapping module 560 includes:
  • an obtaining submodule configured to obtain the superposition order of the layers;
  • a marking submodule configured to mark the effective display area of the layer according to the superposition order;
  • a mapping submodule configured to map the effective display area in the layer to the corresponding color gamut.
  • the display module 580 is configured to superimpose the layers according to the superimposed order to obtain a display image; and output the display image to the display screen for display.
  • the obtaining module 520 is configured to acquire a layer generated by the application and a gamut type label corresponding to the layer, where the gamut type label is a label added by the application when generating the layer.
  • the apparatus further includes:
  • the detecting module 510 is configured to detect whether an open condition of the automatic mapping color gamut function is satisfied
  • the obtaining module 520 is configured to perform the step of acquiring the color gamut type label of the layer when the open condition is satisfied.
  • In summary, the gamut mapping device provided in this embodiment obtains the color gamut type tag of a layer, determines the color gamut corresponding to the layer according to the color gamut type tag, maps the layer to the corresponding color gamut, and superimposes at least one gamut-mapped layer to form a display image that is output. This solves the problem that some layers in the display image have a poor display effect when every frame of the display image in the terminal is displayed using the same color gamut; different layers in each display image are mapped to different color gamuts according to their color gamut types, so that each layer has a better display effect. For example, natural layers get more accurate colors and UI layers get more vivid colors, thereby improving the display effect of the terminal as a whole.
  • the gamut mapping apparatus provided in this embodiment can also reduce the calculation amount of the terminal and improve the calculation speed of the terminal in the gamut mapping by mapping only the effective display area of each layer.
  • optionally, when performing gamut mapping, the terminal only needs to perform gamut mapping on the effective display area in each layer.
  • An exemplary embodiment of the present invention provides a gamut mapping apparatus that can implement the gamut mapping method provided by the present invention. The gamut mapping apparatus includes a processor and a memory for storing processor-executable instructions, wherein the processor is configured to: obtain a color gamut type tag of a layer, the color gamut type tag being a tag added when the layer is generated; determine the color gamut corresponding to the layer according to the color gamut type tag; map the layer to the corresponding color gamut; and superimpose at least one gamut-mapped layer to form a display image and output it.
  • FIG. 6 is a block diagram of a color gamut mapping apparatus, according to an exemplary embodiment.
  • device 600 can be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a gaming console, a tablet device, a medical device, a fitness device, a personal digital assistant, and the like.
  • apparatus 600 can include one or more of the following components: processing component 602, memory 604, power component 606, multimedia component 608, audio component 610, input/output (I/O) interface 612, sensor component 614, and Communication component 616.
  • Processing component 602 typically controls the overall operation of device 600, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations.
  • Processing component 602 can include one or more processors 618 to execute instructions to perform all or part of the steps described above.
  • processing component 602 can include one or more modules to facilitate interaction between component 602 and other components.
  • processing component 602 can include a multimedia module to facilitate interaction between multimedia component 608 and processing component 602.
  • Memory 604 is configured to store various types of data to support operation at device 600. Examples of such data include instructions for any application or method operating on device 600, contact data, phone book data, messages, pictures, videos, and the like.
  • the memory 604 can be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read only memory (EEPROM), erasable Programmable Read Only Memory (EPROM), Programmable Read Only Memory (PROM), Read Only Memory (ROM), Magnetic Memory, Flash Memory, Disk or Optical Disk.
  • Power component 606 provides power to various components of device 600.
  • Power component 606 can include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for device 600.
  • the multimedia component 608 includes a screen between the device 600 and the user that provides an output interface.
  • the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen can be implemented as a touch screen to receive input signals from the user.
  • the touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor can sense not only the boundaries of the touch or sliding action, but also the duration and pressure associated with the touch or slide operation.
  • the multimedia component 608 includes a front camera and/or a rear camera. When the device 600 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front and rear camera can be a fixed optical lens system or have focal length and optical zoom capabilities.
  • the audio component 610 is configured to output and/or input an audio signal.
  • audio component 610 includes a microphone (MIC) that is configured to receive an external audio signal when device 600 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode.
  • the received audio signal may be further stored in memory 604 or transmitted via communication component 616.
  • audio component 610 also includes a speaker for outputting an audio signal.
  • the I/O interface 612 provides an interface between the processing component 602 and the peripheral interface module, which may be a keyboard, a click wheel, a button, or the like. These buttons may include, but are not limited to, a home button, a volume button, a start button, and a lock button.
  • Sensor assembly 614 includes one or more sensors for providing device 600 with a status assessment of various aspects.
  • For example, sensor component 614 can detect the open/closed state of device 600 and the relative positioning of components, such as the display and keypad of device 600; sensor component 614 can also detect a change in the position of device 600 or of a component of device 600, the presence or absence of user contact with device 600, the orientation or acceleration/deceleration of device 600, and temperature changes of device 600.
  • Sensor assembly 614 can include a proximity sensor configured to detect the presence of nearby objects without any physical contact.
  • Sensor assembly 614 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications.
  • the sensor component 614 can also include an acceleration sensor, a gyro sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
  • Communication component 616 is configured to facilitate wired or wireless communication between device 600 and other devices.
  • the device 600 can access a wireless network based on a communication standard, such as Wi-Fi, 2G or 3G, or a combination thereof.
  • communication component 616 receives broadcast signals or broadcast associated information from an external broadcast management system via a broadcast channel.
  • communication component 616 also includes a near field communication (NFC) module to facilitate short range communication.
  • the NFC module can be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
  • device 600 may be implemented by one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the gamut mapping method described above.
  • non-transitory computer readable storage medium comprising instructions, such as a memory 604 comprising instructions executable by processor 618 of apparatus 600 to perform the gamut mapping method described above.
  • the non-transitory computer readable storage medium can be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Nonlinear Science (AREA)
  • Computer Hardware Design (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Optics & Photonics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)
  • Video Image Reproduction Devices For Color Tv Systems (AREA)
  • Processing Of Color Television Signals (AREA)
  • Image Generation (AREA)
  • Color Image Communication Systems (AREA)

Abstract

A color gamut mapping method and apparatus, belonging to the field of display technologies. The method comprises: obtaining a color gamut type tag of a layer (102); determining the color gamut corresponding to the layer according to the color gamut type tag (104); for each layer, mapping the layer to the corresponding color gamut using the gamut mapping manner corresponding to its color gamut type (106); and superimposing at least one gamut-mapped layer to form a display image and outputting it (108). This solves the problem that some layers have a poor display effect when every frame of the display image in the terminal is displayed using the same color gamut. For the different layers in each display image in the terminal, different gamut mapping manners are used according to their different color gamut types, so that natural layers get more accurate colors and UI layers get more vivid colors, thereby improving the display effect of the terminal as a whole.

Description

Color gamut mapping method and apparatus
This application is based on and claims priority to Chinese Patent Application No. 201610677001.9, filed on August 16, 2016, the entire contents of which are incorporated herein by reference.
Technical Field
The present invention relates to the field of display technologies, and in particular to a color gamut mapping method and apparatus.
Background
Color gamut refers to a method of encoding colors, and also refers to the sum of the colors that a display system can produce, for example sRGB (standard Red Green Blue) color gamut encoding and NTSC (National Television Standards Committee) color gamut encoding.
In the related art there is a color gamut mapping method in which, when the user wishes to display more accurate colors, the terminal maps the display image to the sRGB color gamut for display, and when the user wishes to display more vivid colors, the terminal maps the display image to the NTSC color gamut for display.
Summary
In order to solve the problem that some layers have a poor display effect when every frame of the display image in the terminal is displayed using the same color gamut, the present invention provides a color gamut mapping method and apparatus. The technical solutions are as follows:
According to a first aspect of the present invention, a color gamut mapping method is provided, the method comprising:
obtaining a color gamut type tag of a layer, the color gamut type tag being a tag added when the layer is generated;
determining the color gamut corresponding to the layer according to the color gamut type tag;
mapping the layer to the corresponding color gamut;
superimposing at least one gamut-mapped layer to form a display image and outputting it.
In an optional embodiment, mapping each layer to its corresponding color gamut comprises:
obtaining the superposition order of the layers;
marking the effective display area of the layer according to the superposition order;
mapping the effective display area in the layer to the corresponding color gamut.
In an optional embodiment, superimposing at least one gamut-mapped layer to form a display image and outputting it comprises:
superimposing the layers according to the superposition order to obtain the display image;
outputting the display image to a display screen for display.
In an optional embodiment, obtaining the color gamut type tag of the layer, the color gamut type tag being a tag added when the layer is generated, comprises:
obtaining a layer generated by an application and the color gamut type tag corresponding to the layer, the color gamut type tag being a tag added by the application when generating the layer.
In an optional embodiment, the method further comprises:
detecting whether the enabling condition of the automatic gamut mapping function is satisfied;
if the enabling condition is satisfied, performing the step of obtaining the color gamut type tag of the layer.
According to a second aspect of the present invention, a color gamut mapping apparatus is provided, the apparatus comprising:
an obtaining module configured to obtain a color gamut type tag of a layer, the color gamut type tag being a tag added when the layer is generated;
an identification module configured to determine the color gamut corresponding to the layer according to the color gamut type tag;
a mapping module configured to map the layer to the corresponding color gamut;
a display module configured to superimpose at least one gamut-mapped layer to form a display image and output it.
In an optional embodiment, the mapping module comprises:
an obtaining submodule configured to obtain the superposition order of the layers;
a marking submodule configured to mark the effective display area of the layer according to the superposition order;
a mapping submodule configured to map the effective display area in the layer to the corresponding color gamut.
In an optional embodiment, the display module is configured to superimpose the layers according to the superposition order to obtain the display image, and to output the display image to a display screen for display.
In an optional embodiment, the obtaining module is configured to obtain a layer generated by an application and the color gamut type tag corresponding to the layer, the color gamut type tag being a tag added by the application when generating the layer.
In an optional embodiment, the apparatus further comprises:
a detection module configured to detect whether the enabling condition of the automatic gamut mapping function is satisfied;
the obtaining module being configured to perform the step of obtaining the color gamut type tag of the layer when the enabling condition is satisfied.
According to a third aspect of the present invention, a color gamut mapping apparatus is provided, the apparatus comprising:
a processor;
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
obtain a color gamut type tag of a layer, the color gamut type tag being a tag added when the layer is generated;
determine the color gamut corresponding to the layer according to the color gamut type tag;
map the layer to the corresponding color gamut;
superimpose at least one gamut-mapped layer to form a display image and output it.
The technical solutions provided by the embodiments of the present invention may include the following beneficial effects:
By obtaining the color gamut type tag of a layer, determining the color gamut corresponding to the layer according to the color gamut type tag, mapping the layer to the corresponding color gamut, and superimposing at least one gamut-mapped layer to form a display image that is output, the problem that some layers in the display image have a poor display effect when every frame of the display image in the terminal is displayed using the same color gamut is solved; the different layers in each display image in the terminal are mapped to different color gamuts according to their different color gamut types, so that each layer has a better display effect. For example, natural layers get more accurate colors and UI layers get more vivid colors, thereby improving the display effect of the terminal as a whole.
It should be understood that the above general description and the following detailed description are merely exemplary and do not limit the present invention.
Brief Description of the Drawings
The drawings herein are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present invention, and together with the specification serve to explain the principles of the present invention.
FIG. 1 is a flowchart of a color gamut mapping method according to an exemplary embodiment;
FIG. 2 is a schematic structural diagram of a display image according to an exemplary embodiment;
FIG. 3 is a flowchart of a color gamut mapping method according to another exemplary embodiment;
FIG. 4 is a flowchart of a color gamut mapping method according to another exemplary embodiment;
FIG. 5 is a block diagram of a color gamut mapping apparatus according to an exemplary embodiment;
FIG. 6 is a block diagram of a color gamut mapping apparatus according to another exemplary embodiment.
Detailed Description
Exemplary embodiments will be described in detail here, examples of which are illustrated in the accompanying drawings. When the following description refers to the drawings, the same numerals in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present invention. Rather, they are merely examples of apparatuses and methods consistent with some aspects of the present invention as detailed in the appended claims.
In the related art, the same gamut mapping method is usually applied to every frame of the display image shown on the terminal. A frame of display image is usually composed of multiple superimposed layers, but not every layer is suited to the same color gamut. For example, the lock screen interface of a smartphone includes a status bar layer, a wallpaper layer, and a desktop icon layer. The status bar layer and the desktop icon layer belong to user interface (UI) layers, which are manually designed layers whose original display effect is already good; applying an unreasonable gamut mapping to a UI layer will instead reduce its aesthetics. To this end, the present invention provides the following exemplary embodiments.
FIG. 1 is a flowchart of a color gamut mapping method according to an exemplary embodiment. This embodiment is illustrated by applying the method to a terminal having image processing capability. The method includes:
In step 102, a color gamut type tag of a layer is obtained, the color gamut type tag being a tag added when the layer is generated.
In step 104, the color gamut corresponding to the layer is determined according to the color gamut type tag.
In step 106, the layer is mapped to the corresponding color gamut.
In step 108, at least one gamut-mapped layer is superimposed to form a display image, which is output.
In summary, in the gamut mapping method provided in this embodiment, the color gamut type tag of a layer is obtained, the color gamut corresponding to the layer is determined according to the color gamut type tag, the layer is mapped to the corresponding color gamut, and at least one gamut-mapped layer is superimposed to form a display image that is output. This solves the problem that some layers in the display image have a poor display effect when every frame of the display image in the terminal is displayed using the same color gamut; the different layers in each display image in the terminal are mapped to different color gamuts according to their different color gamut types, so that each layer has a better display effect. For example, natural layers get more accurate colors and UI layers get more vivid colors, thereby improving the display effect of the terminal as a whole.
FIG. 2 is a schematic structural diagram of a frame of display image 10 according to an illustrative embodiment. The display image 10 is a mobile phone home page. The home page includes three layers: a status bar layer 12, a desktop icon layer 14, and a wallpaper layer 16. The status bar layer 12 is the uppermost layer, the desktop icon layer 14 is the middle layer, and the wallpaper layer 16 is the lowermost layer. When an upper layer is an opaque layer, the upper layer has the ability to cover the layer below it. The superposition order of the layers is determined by their z-order values. Z-order refers to the hierarchical relationship between layers (also called display objects). Usually, the layer corresponding to a higher z-order value is placed above the layer corresponding to a lower z-order value.
In the terminal, the display image 10 is synthesized from the above three layers. The source of each layer may be the same or different.
Optionally, from a software perspective, the sources of the layers include various applications such as desktop applications, status applications, wallpaper applications, and third-party applications; the layers from the applications are composited into the final display image by the image compositing program in the operating system. Optionally, in the Android system, the SurfaceFlinger layer is responsible for compositing the layers.
Optionally, from a hardware perspective, the sources of the layers include at least one of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and a video decoding chip. These layers are composited in the AP (Application Processor) and then output to the display screen for display.
Optionally, the layer types include natural layers and UI layers. A natural layer is a layer produced from a naturally existing object, or a layer that simulates a naturally existing object. Common natural layers include layers captured by a camera, layer frames obtained by decoding a video, and layers of a simulated world rendered in real time by a game rendering engine. A UI layer is a layer used for human-computer interaction; typically, UI layers are designed by hand. Different layer types are suited to different color gamuts; for example, natural layers are suited to the NTSC color gamut, while UI layers are suited to the sRGB color gamut.
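For illustration only (this sketch is not part of the original disclosure; the enum, the function name, and the fallback branch are assumptions of this presentation), the determination of a target color gamut from a color gamut type tag, as performed in steps 104, 303, and 403 below, could look as follows in Kotlin. The tag values follow the examples given in the text ("Tag1" for UI layers, "Tag2" for natural layers).

```kotlin
// Illustrative sketch: not part of the patent or of any Android API.
enum class TargetGamut { SRGB, NTSC }

fun gamutForTag(gamutTypeTag: String): TargetGamut = when (gamutTypeTag) {
    "Tag1" -> TargetGamut.SRGB   // manually designed UI layers
    "Tag2" -> TargetGamut.NTSC   // natural layers: photos, decoded video frames, game renders
    else   -> TargetGamut.SRGB   // assumed fallback for unknown tags
}
```

A compositor could call gamutForTag once per layer, before mapping that layer's pixels.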
FIG. 3 is a flowchart of a color gamut mapping method according to an exemplary embodiment. This embodiment is illustrated by applying the method to a terminal. The method includes:
In step 301, it is detected whether the enabling condition of the automatic gamut mapping function is satisfied;
Optionally, the enabling condition of the automatic gamut mapping function includes, but is not limited to, at least one of the following conditions:
First, a specified application is started; for example, a player program is started.
Second, the setting item of the automatic gamut mapping function is set to enabled by the user.
If the enabling condition is satisfied, step 302 is entered; if the enabling condition is not satisfied, no processing is performed and the color gamut of the screen itself, that is, the color gamut currently used by the operating system, is used.
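As an illustrative sketch of the check in steps 301 and 401 (the configuration fields and the notion of a "currently launched application" are assumptions made for the example, not part of the disclosure):

```kotlin
// Minimal sketch of the enabling-condition check for the automatic gamut mapping function.
data class GamutMappingConfig(
    val autoMappingEnabledByUser: Boolean,   // settings item for the automatic gamut mapping function
    val specifiedApps: Set<String>           // applications whose launch enables the function
)

fun shouldAutoMapGamut(config: GamutMappingConfig, launchedApp: String?): Boolean =
    config.autoMappingEnabledByUser ||
        (launchedApp != null && launchedApp in config.specifiedApps)
```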
In step 302, if the enabling condition is satisfied, the layer generated by an application and the color gamut type tag corresponding to the layer are obtained, the color gamut type tag being a tag added by the application when generating the layer.
During normal operation of the terminal, each application generates layers. For example, the desktop application generates an icon layer, the wallpaper application generates a wallpaper layer, and the status bar application generates a status bar layer.
Optionally, each application also generates, according to the layer content of the layer, a color gamut type tag corresponding to the layer.
For example, the icon layer includes the icons of multiple applications; since the icons are manually designed UI icons, the desktop application generates the color gamut type tag "Tag1" corresponding to the icon layer. As another example, the wallpaper layer is a photograph of a natural landscape, so the wallpaper application generates the color gamut type tag "Tag2" corresponding to the wallpaper layer. As yet another example, the status bar layer is a manually designed UI layer, so the status bar application generates the color gamut type tag "Tag1" corresponding to the status bar layer.
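A hypothetical producer-side sketch of how an application could attach a color gamut type tag to the layer it generates is given below; the Layer class, the z-order values, and the helper functions are assumptions for illustration and do not correspond to real Android or SurfaceFlinger types.

```kotlin
// Hypothetical layer model used by the sketches in this section.
data class Layer(
    val name: String,
    val zOrder: Int,            // higher z-order is drawn above lower z-order
    val gamutTypeTag: String,   // "Tag1" = UI layer, "Tag2" = natural layer
    val pixels: IntArray,       // ARGB pixels, row-major, assumed to match the display resolution
    val width: Int,
    val height: Int
)

// Each application tags the layer it produces according to its content.
fun statusBarLayer(pixels: IntArray, w: Int, h: Int) =
    Layer("status_bar", zOrder = 2, gamutTypeTag = "Tag1", pixels = pixels, width = w, height = h)

fun wallpaperLayer(pixels: IntArray, w: Int, h: Int) =
    Layer("wallpaper", zOrder = 0, gamutTypeTag = "Tag2", pixels = pixels, width = w, height = h)
```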
The image compositor or image compositing program in the terminal obtains the layer generated by each application and the color gamut type tag corresponding to the layer.
In step 303, the color gamut corresponding to the layer is determined according to the color gamut type tag.
Illustratively, when the color gamut type tag of the icon layer is "Tag1", it is determined that the color gamut corresponding to the icon layer is the sRGB color gamut; when the color gamut type tag of the wallpaper layer is "Tag2", it is determined that the color gamut corresponding to the wallpaper layer is the NTSC color gamut.
In step 304, the layer is mapped to the corresponding color gamut.
Illustratively, the terminal maps each layer to its corresponding color gamut.
For example, if the color gamut corresponding to a layer is the sRGB color gamut, the terminal maps the layer to the sRGB color gamut; if the color gamut corresponding to a layer is the NTSC color gamut, the terminal maps the layer to the NTSC color gamut.
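Purely as an illustration of what mapping a layer to a color gamut can amount to (the disclosure does not prescribe a particular conversion), the sketch below applies a per-pixel 3x3 matrix to the RGB channels. The matrix argument is a placeholder that a real implementation would derive from the primaries of the source and target gamuts (for example sRGB to NTSC), and gamma linearization is omitted for brevity.

```kotlin
// Clamp a channel value to [0, 1] and scale to an 8-bit integer.
private fun toByte(x: Double): Int = (x.coerceIn(0.0, 1.0) * 255.0 + 0.5).toInt()

// Apply an illustrative 3x3 gamut conversion matrix to every ARGB pixel of a layer.
fun mapPixels(pixels: IntArray, m: Array<DoubleArray>): IntArray =
    IntArray(pixels.size) { i ->
        val p = pixels[i]
        val a = (p ushr 24) and 0xFF
        val r = ((p ushr 16) and 0xFF) / 255.0
        val g = ((p ushr 8) and 0xFF) / 255.0
        val b = (p and 0xFF) / 255.0
        val r2 = toByte(m[0][0] * r + m[0][1] * g + m[0][2] * b)
        val g2 = toByte(m[1][0] * r + m[1][1] * g + m[1][2] * b)
        val b2 = toByte(m[2][0] * r + m[2][1] * g + m[2][2] * b)
        (a shl 24) or (r2 shl 16) or (g2 shl 8) or b2
    }
```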
In step 305, the superposition order of the layers is obtained;
The terminal also obtains the z-order value of each layer and determines the superposition order of the layers according to the z-order values. Usually, the layer corresponding to a higher z-order value is placed above the layer corresponding to a lower z-order value.
In step 306, the layers are superimposed according to the superposition order to obtain a display image;
The terminal superimposes the layers according to the superposition order to obtain a frame of display image. Each display image is obtained by superimposing at least one layer.
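A minimal compositing sketch for steps 305 and 306, reusing the hypothetical Layer class from the earlier sketch; it assumes full-screen layers and omits partial alpha blending (a pixel either covers what is below it or lets it show through):

```kotlin
// Sort layers by z-order and composite from bottom to top into one frame buffer.
fun compose(layers: List<Layer>, width: Int, height: Int): IntArray {
    val out = IntArray(width * height)                    // starts as transparent black
    for (layer in layers.sortedBy { it.zOrder }) {        // bottom-most layer first
        for (i in layer.pixels.indices) {
            val p = layer.pixels[i]
            if (((p ushr 24) and 0xFF) != 0) out[i] = p   // non-transparent pixel covers what is below
        }
    }
    return out
}
```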
In step 307, the display image is output to the display screen for display.
The terminal outputs the display image to the display screen for display.
In summary, in the gamut mapping method provided in this embodiment, the color gamut type tag of a layer is obtained, the color gamut corresponding to the layer is determined according to the color gamut type tag, the layer is mapped to the corresponding color gamut, and at least one gamut-mapped layer is superimposed to form a display image that is output. This solves the problem that some layers in the display image have a poor display effect when every frame of the display image in the terminal is displayed using the same color gamut; the different layers in each display image in the terminal are mapped to different color gamuts according to their different color gamut types, so that each layer has a better display effect. For example, natural layers get more accurate colors and UI layers get more vivid colors, thereby improving the display effect of the terminal as a whole.
Since non-transparent pixels in an upper layer cover the pixels of the layers below it, some areas of a lower layer will in fact never be displayed to the user. In other words, a layer may consist entirely of an effective display area, or a layer may include an effective display area and an invalid display area. The effective display area is the area that will eventually appear in the display image, and the invalid display area is the area that will not appear in the display image. To reduce the amount of computation, the present invention further provides the following embodiment.
FIG. 4 is a flowchart of a color gamut mapping method according to an exemplary embodiment. This embodiment is illustrated by applying the method to a terminal. The method includes:
In step 401, it is detected whether the enabling condition of the automatic gamut mapping function is satisfied;
Optionally, the enabling condition of the automatic gamut mapping function includes, but is not limited to, at least one of the following conditions:
First, a specified application is started; for example, a player program is started.
Second, the setting item of the automatic gamut mapping function is set to enabled by the user.
If the enabling condition is satisfied, step 402 is entered; if the enabling condition is not satisfied, no processing is performed and the color gamut of the screen itself, that is, the color gamut currently used by the operating system, is used.
In step 402, if the enabling condition is satisfied, the layer generated by an application and the color gamut type tag corresponding to the layer are obtained, the color gamut type tag being a tag added by the application when generating the layer.
During normal operation of the terminal, each application generates layers. For example, the desktop application generates an icon layer, the wallpaper application generates a wallpaper layer, and the status bar application generates a status bar layer.
Optionally, each application also generates, according to the layer content of the layer, a color gamut type tag corresponding to the layer.
For example, the icon layer includes the icons of multiple applications; since the icons are manually designed UI icons, the desktop application generates the color gamut type tag "Tag1" corresponding to the icon layer. As another example, the wallpaper layer is a photograph of a natural landscape, so the wallpaper application generates the color gamut type tag "Tag2" corresponding to the wallpaper layer. As yet another example, the status bar layer is a manually designed UI layer, so the status bar application generates the color gamut type tag "Tag1" corresponding to the status bar layer.
The image compositor or image compositing program in the terminal obtains the layer generated by each application and the color gamut type tag corresponding to the layer.
In step 403, the color gamut corresponding to the layer is determined according to the color gamut type tag.
Illustratively, when the color gamut type tag of the icon layer is "Tag1", it is determined that the color gamut corresponding to the icon layer is the sRGB color gamut; when the color gamut type tag of the wallpaper layer is "Tag2", it is determined that the color gamut corresponding to the wallpaper layer is the NTSC color gamut.
In step 404, the superposition order of the layers is obtained;
The terminal also obtains the z-order value of each layer and determines the superposition order of the layers according to the z-order values. Usually, the layer corresponding to a higher z-order value is placed above the layer corresponding to a lower z-order value.
In step 405, the effective display area of each layer is marked according to the superposition order;
Optionally, the effective display area of each layer refers to the area of the current layer that is not occluded by the layers above it. In other words, the effective display area of each layer refers to the area that is displayed in the display image.
The layers are superimposed according to their z-order values; when an upper-layer pixel in an upper layer is an opaque pixel, the lower-layer pixel at the same position in a lower layer is occluded and will not appear in the final display image.
The terminal marks the effective display area of each layer according to the superposition order of the layers. Optionally, when a layer is the uppermost layer, the entire area of the layer is marked as the effective display area; when a layer is not the uppermost layer, at least one upper layer located above the layer is obtained, the pixel positions in the layer corresponding to the opaque pixels of the upper layers are marked to obtain a first pixel set, and the second pixel set in the layer other than the first pixel set is determined as the effective display area.
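The marking in step 405 can be sketched as follows, again reusing the hypothetical Layer class above and assuming all layers share the display resolution. Positions already covered by an opaque pixel of a higher layer correspond to the first pixel set and are excluded; the remaining positions form the second pixel set, i.e. the effective display area.

```kotlin
// Walk the layers from top to bottom, accumulating the opaque coverage of higher layers.
fun effectiveAreas(layers: List<Layer>, width: Int, height: Int): Map<String, BooleanArray> {
    val covered = BooleanArray(width * height)                // opaque pixels of the layers above
    val result = mutableMapOf<String, BooleanArray>()
    for (layer in layers.sortedByDescending { it.zOrder }) {  // top-most layer first
        // Effective area = positions not yet covered; the uppermost layer is fully effective.
        result[layer.name] = BooleanArray(width * height) { i -> !covered[i] }
        for (i in layer.pixels.indices) {
            if (((layer.pixels[i] ushr 24) and 0xFF) == 0xFF) covered[i] = true
        }
    }
    return result
}
```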
In step 406, the effective display area in the layer is mapped to the corresponding color gamut.
Illustratively, the terminal maps the effective display area of each layer to its corresponding color gamut. For example, if the color gamut corresponding to a layer is the sRGB color gamut, the terminal maps the effective display area of the layer to the sRGB color gamut; if the color gamut corresponding to a layer is the NTSC color gamut, the terminal maps the effective display area of the layer to the NTSC color gamut.
It should be noted that step 404 and step 405 are optional steps; when the effective display area of each layer is not determined, the entire display area of the layer is directly mapped to the color gamut corresponding to the layer in step 406.
In step 407, the layers are superimposed according to the superposition order to obtain a display image;
The terminal superimposes the layers according to the superposition order to obtain a frame of display image. Each display image is obtained by superimposing at least one layer.
In step 408, the display image is output to the display screen for display.
The terminal outputs the display image to the display screen for display.
It should be noted that the specific examples of color gamut type tags and color gamuts mentioned in the embodiments of the present invention are only intended to facilitate understanding of the embodiments, and do not constitute a limitation on the color gamut type tags or color gamuts.
In summary, in the gamut mapping method provided in this embodiment, the color gamut type tag of a layer is obtained, the color gamut corresponding to the layer is determined according to the color gamut type tag, the layer is mapped to the corresponding color gamut, and at least one gamut-mapped layer is superimposed to form a display image that is output. This solves the problem that some layers in the display image have a poor display effect when every frame of the display image in the terminal is displayed using the same color gamut; the different layers in each display image in the terminal are mapped to different color gamuts according to their different color gamut types, so that each layer has a better display effect. For example, natural layers get more accurate colors and UI layers get more vivid colors, thereby improving the display effect of the terminal as a whole.
The gamut mapping method provided in this embodiment also reduces the amount of computation of the terminal and increases the computation speed of the terminal during gamut mapping by mapping only the effective display area of each layer. Optionally, during gamut mapping, the terminal only needs to perform gamut mapping on the effective display area in each layer.
The following are apparatus embodiments of the present invention, which can be used to perform the method embodiments of the present invention. For details not disclosed in the apparatus embodiments, please refer to the method embodiments of the present invention.
FIG. 5 is a structural block diagram of a color gamut mapping apparatus according to an exemplary embodiment. The gamut mapping apparatus can be implemented, by a dedicated hardware circuit and/or a combination of hardware and software, as all or part of a terminal having image processing capability. The apparatus includes:
an obtaining module 520 configured to obtain a color gamut type tag of a layer, the color gamut type tag being a tag added when the layer is generated;
an identification module 540 configured to determine the color gamut corresponding to the layer according to the color gamut type tag;
a mapping module 560 configured to map the layer to the corresponding color gamut;
a display module 580 configured to superimpose at least one gamut-mapped layer to form a display image and output it.
In an optional embodiment, the mapping module 560 includes:
an obtaining submodule configured to obtain the superposition order of the layers;
a marking submodule configured to mark the effective display area of the layer according to the superposition order;
a mapping submodule configured to map the effective display area in the layer to the corresponding color gamut.
In an optional embodiment, the display module 580 is configured to superimpose the layers according to the superposition order to obtain the display image, and to output the display image to a display screen for display.
In an optional embodiment, the obtaining module 520 is configured to obtain a layer generated by an application and the color gamut type tag corresponding to the layer, the color gamut type tag being a tag added by the application when generating the layer.
In an optional embodiment, the apparatus further includes:
a detection module 510 configured to detect whether the enabling condition of the automatic gamut mapping function is satisfied;
the obtaining module 520 being configured to perform the step of obtaining the color gamut type tag of the layer when the enabling condition is satisfied.
In summary, the gamut mapping apparatus provided in this embodiment obtains the color gamut type tag of a layer, determines the color gamut corresponding to the layer according to the color gamut type tag, maps the layer to the corresponding color gamut, and superimposes at least one gamut-mapped layer to form a display image that is output. This solves the problem that some layers in the display image have a poor display effect when every frame of the display image in the terminal is displayed using the same color gamut; the different layers in each display image in the terminal are mapped to different color gamuts according to their different color gamut types, so that each layer has a better display effect. For example, natural layers get more accurate colors and UI layers get more vivid colors, thereby improving the display effect of the terminal as a whole.
The gamut mapping apparatus provided in this embodiment also reduces the amount of computation of the terminal and increases the computation speed of the terminal during gamut mapping by mapping only the effective display area of each layer. Optionally, during gamut mapping, the terminal only needs to perform gamut mapping on the effective display area in each layer.
With regard to the apparatus in the above embodiment, the specific manner in which each module performs its operations has been described in detail in the embodiments relating to the method, and will not be elaborated here.
An exemplary embodiment of the present invention provides a color gamut mapping apparatus that can implement the gamut mapping method provided by the present invention. The gamut mapping apparatus includes a processor and a memory for storing instructions executable by the processor, wherein the processor is configured to:
obtain a color gamut type tag of a layer, the color gamut type tag being a tag added when the layer is generated;
determine the color gamut corresponding to the layer according to the color gamut type tag;
map the layer to the corresponding color gamut;
superimpose at least one gamut-mapped layer to form a display image and output it.
FIG. 6 is a block diagram of a color gamut mapping apparatus according to an exemplary embodiment. For example, the apparatus 600 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or the like.
Referring to FIG. 6, the apparatus 600 may include one or more of the following components: a processing component 602, a memory 604, a power component 606, a multimedia component 608, an audio component 610, an input/output (I/O) interface 612, a sensor component 614, and a communication component 616.
The processing component 602 generally controls the overall operation of the apparatus 600, such as operations associated with display, telephone calls, data communication, camera operations, and recording operations. The processing component 602 may include one or more processors 618 to execute instructions to complete all or part of the steps of the above method. In addition, the processing component 602 may include one or more modules to facilitate interaction between the processing component 602 and other components. For example, the processing component 602 may include a multimedia module to facilitate interaction between the multimedia component 608 and the processing component 602.
The memory 604 is configured to store various types of data to support the operation of the apparatus 600. Examples of such data include instructions for any application or method operating on the apparatus 600, contact data, phone book data, messages, pictures, videos, and the like. The memory 604 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disk.
The power component 606 provides power to the various components of the apparatus 600. The power component 606 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 600.
The multimedia component 608 includes a screen that provides an output interface between the apparatus 600 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or swipe action but also detect the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 608 includes a front camera and/or a rear camera. When the apparatus 600 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capability.
The audio component 610 is configured to output and/or input audio signals. For example, the audio component 610 includes a microphone (MIC) configured to receive external audio signals when the apparatus 600 is in an operation mode such as a call mode, a recording mode, or a voice recognition mode. The received audio signals may be further stored in the memory 604 or transmitted via the communication component 616. In some embodiments, the audio component 610 also includes a speaker for outputting audio signals.
The I/O interface 612 provides an interface between the processing component 602 and peripheral interface modules, which may be keyboards, click wheels, buttons, and the like. These buttons may include, but are not limited to, a home button, volume buttons, a start button, and a lock button.
The sensor component 614 includes one or more sensors for providing status assessments of various aspects of the apparatus 600. For example, the sensor component 614 may detect the open/closed state of the apparatus 600 and the relative positioning of components such as the display and keypad of the apparatus 600; the sensor component 614 may also detect a change in the position of the apparatus 600 or of one of its components, the presence or absence of user contact with the apparatus 600, the orientation or acceleration/deceleration of the apparatus 600, and temperature changes of the apparatus 600. The sensor component 614 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 614 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 614 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 616 is configured to facilitate wired or wireless communication between the apparatus 600 and other devices. The apparatus 600 may access a wireless network based on a communication standard, such as Wi-Fi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 616 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 616 also includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 600 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the gamut mapping method described above.
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium including instructions, such as the memory 604 including instructions, which can be executed by the processor 618 of the apparatus 600 to complete the gamut mapping method described above. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
Those skilled in the art will readily conceive of other embodiments of the present invention after considering the specification and practicing the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the present invention that follow its general principles and include common knowledge or customary technical means in the art not disclosed herein. The specification and embodiments are to be regarded as exemplary only, with the true scope and spirit of the present invention being indicated by the following claims.
It should be understood that the present invention is not limited to the precise structures described above and shown in the accompanying drawings, and that various modifications and changes can be made without departing from its scope. The scope of the present invention is limited only by the appended claims.

Claims (11)

  1. A color gamut mapping method, characterized in that the method comprises:
    obtaining a color gamut type tag of a layer, the color gamut type tag being a tag added when the layer is generated;
    determining the color gamut corresponding to the layer according to the color gamut type tag;
    mapping the layer to the corresponding color gamut;
    superimposing at least one gamut-mapped layer to form a display image and outputting it.
  2. The method according to claim 1, characterized in that mapping the layer to the corresponding color gamut comprises:
    obtaining the superposition order of the layers;
    marking the effective display area of the layer according to the superposition order;
    mapping the effective display area in the layer to the corresponding color gamut.
  3. The method according to claim 2, characterized in that superimposing at least one gamut-mapped layer to form a display image and outputting it comprises:
    superimposing the layers according to the superposition order to obtain the display image;
    outputting the display image to a display screen for display.
  4. The method according to any one of claims 1 to 3, characterized in that obtaining the color gamut type tag of the layer, the color gamut type tag being a tag added when the layer is generated, comprises:
    obtaining the layer generated by an application and the color gamut type tag corresponding to the layer, the color gamut type tag being a tag added by the application when generating the layer.
  5. The method according to any one of claims 1 to 3, characterized in that the method further comprises:
    detecting whether the enabling condition of the automatic gamut mapping function is satisfied;
    if the enabling condition is satisfied, performing the step of obtaining the color gamut type tag of the layer.
  6. A color gamut mapping apparatus, characterized in that the apparatus comprises:
    an obtaining module configured to obtain a color gamut type tag of a layer, the color gamut type tag being a tag added when the layer is generated;
    an identification module configured to determine the color gamut corresponding to the layer according to the color gamut type tag;
    a mapping module configured to map the layer to the corresponding color gamut;
    a display module configured to superimpose at least one gamut-mapped layer to form a display image and output it.
  7. The apparatus according to claim 6, characterized in that the mapping module comprises:
    an obtaining submodule configured to obtain the superposition order of the layers;
    a marking submodule configured to mark the effective display area of the layer according to the superposition order;
    a mapping submodule configured to map the effective display area in the layer to the corresponding color gamut.
  8. The apparatus according to claim 7, characterized in that the display module is configured to superimpose the layers according to the superposition order to obtain the display image, and to output the display image to a display screen for display.
  9. The apparatus according to any one of claims 6 to 8, characterized in that the obtaining module is configured to obtain the layer generated by an application and the color gamut type tag corresponding to the layer, the color gamut type tag being a tag added by the application when generating the layer.
  10. The apparatus according to any one of claims 6 to 8, characterized in that the apparatus further comprises:
    a detection module configured to detect whether the enabling condition of the automatic gamut mapping function is satisfied;
    the obtaining module being configured to perform the step of obtaining the color gamut type tag of the layer when the enabling condition is satisfied.
  11. A color gamut mapping apparatus, characterized in that the apparatus comprises:
    a processor;
    a memory for storing instructions executable by the processor;
    wherein the processor is configured to:
    obtain a color gamut type tag of a layer, the color gamut type tag being a tag added when the layer is generated;
    determine the color gamut corresponding to the layer according to the color gamut type tag;
    map the layer to the corresponding color gamut;
    superimpose at least one gamut-mapped layer to form a display image and output it.
PCT/CN2016/110861 2016-08-16 2016-12-19 色域映射方法及装置 WO2018032674A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
RU2017121651A RU2671763C1 (ru) 2016-08-16 2016-12-19 Способ и устройство для отображения на цветовое пространство
JP2017527795A JP6564859B2 (ja) 2016-08-16 2016-12-19 色域マッピング方法および装置
KR1020187031499A KR102189189B1 (ko) 2016-08-16 2016-12-19 색 영역 매핑 방법 및 장치

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610677001.9A CN107767838B (zh) 2016-08-16 2016-08-16 色域映射方法及装置
CN201610677001.9 2016-08-16

Publications (1)

Publication Number Publication Date
WO2018032674A1 true WO2018032674A1 (zh) 2018-02-22

Family

ID=59631661

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/110861 WO2018032674A1 (zh) 2016-08-16 2016-12-19 色域映射方法及装置

Country Status (7)

Country Link
US (1) US10325569B2 (zh)
EP (1) EP3285474B1 (zh)
JP (1) JP6564859B2 (zh)
KR (1) KR102189189B1 (zh)
CN (1) CN107767838B (zh)
RU (1) RU2671763C1 (zh)
WO (1) WO2018032674A1 (zh)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019199061A1 (ko) 2018-04-10 2019-10-17 주식회사 엘지화학 장식 부재
CN110378974B (zh) * 2019-07-17 2021-09-14 Oppo广东移动通信有限公司 图片处理方法、装置、移动终端以及存储介质
CN114067739B (zh) * 2020-07-31 2024-02-06 北京小米移动软件有限公司 色域映射方法及装置、电子设备及存储介质
CN114866752B (zh) * 2022-06-01 2023-10-27 青岛海信激光显示股份有限公司 激光投影显示方法、三色激光投影设备及可读性存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102156999A (zh) * 2010-02-11 2011-08-17 腾讯科技(深圳)有限公司 一种用户界面的生成方法和装置
EP2887634A1 (en) * 2013-12-23 2015-06-24 Thomson Licensing Method of mapping source colors from a source color gamut into a target color gamut
CN105118026A (zh) * 2015-07-28 2015-12-02 小米科技有限责任公司 色域模式切换方法及装置
CN105141806A (zh) * 2015-07-28 2015-12-09 小米科技有限责任公司 图像文件的显示方法及装置
CN105261326A (zh) * 2015-10-09 2016-01-20 惠州Tcl移动通信有限公司 调整显示色域的显示设备及其调整显示色域的方法

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4531356B2 (ja) * 2003-06-26 2010-08-25 株式会社メガチップス 画像表示装置および画像表示切替方法
US7646391B1 (en) * 2004-04-30 2010-01-12 Apple Inc. Systems and methods for color managing content elements on a display device
KR100814079B1 (ko) * 2007-05-28 2008-03-14 주식회사 모비더스 Html 파일을 플래시 이미지로 변환하는 파일 변환 장치및 그 변환 방법
US20100011914A1 (en) * 2008-07-16 2010-01-21 Chang Ming-Chi Hand tool
KR101502598B1 (ko) * 2008-11-12 2015-03-16 삼성전자주식회사 깊이감 인지 향상을 위한 영상 처리 장치 및 방법
TWI580275B (zh) 2011-04-15 2017-04-21 杜比實驗室特許公司 高動態範圍影像的編碼、解碼及表示
RU2616158C2 (ru) * 2011-04-28 2017-04-12 Конинклейке Филипс Н.В. Устройства и способы для кодирования и декодирования hdr-изображений
EP3073742A4 (en) * 2013-11-21 2017-06-28 LG Electronics Inc. Signal transceiving apparatus and signal transceiving method
EP3092806A4 (en) * 2014-01-07 2017-08-23 Nokia Technologies Oy Method and apparatus for video coding and decoding
US10257964B2 (en) 2014-07-28 2019-04-09 International Business Machines Corporation Reducing condensation risk within liquid cooled computers

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102156999A (zh) * 2010-02-11 2011-08-17 腾讯科技(深圳)有限公司 一种用户界面的生成方法和装置
EP2887634A1 (en) * 2013-12-23 2015-06-24 Thomson Licensing Method of mapping source colors from a source color gamut into a target color gamut
CN105118026A (zh) * 2015-07-28 2015-12-02 小米科技有限责任公司 色域模式切换方法及装置
CN105141806A (zh) * 2015-07-28 2015-12-09 小米科技有限责任公司 图像文件的显示方法及装置
CN105261326A (zh) * 2015-10-09 2016-01-20 惠州Tcl移动通信有限公司 调整显示色域的显示设备及其调整显示色域的方法

Also Published As

Publication number Publication date
CN107767838B (zh) 2020-06-02
EP3285474B1 (en) 2021-05-05
JP6564859B2 (ja) 2019-08-21
KR20180132095A (ko) 2018-12-11
KR102189189B1 (ko) 2020-12-09
RU2671763C1 (ru) 2018-11-06
CN107767838A (zh) 2018-03-06
US20180052337A1 (en) 2018-02-22
EP3285474A1 (en) 2018-02-21
JP2018537870A (ja) 2018-12-20
US10325569B2 (en) 2019-06-18

Similar Documents

Publication Publication Date Title
US11315336B2 (en) Method and device for editing virtual scene, and non-transitory computer-readable storage medium
EP3672262A1 (en) Operation method, device, apparatus and storage medium of playing video
US20170032725A1 (en) Method, device, and computer-readable medium for setting color gamut mode
WO2017000485A1 (zh) 信息展示方法及装置
CN109191549B (zh) 显示动画的方法及装置
US10629167B2 (en) Display apparatus and control method thereof
WO2016192325A1 (zh) 视频文件的标识处理方法及装置
CN106339224B (zh) 可读性增强方法及装置
WO2018032674A1 (zh) 色域映射方法及装置
WO2017016172A1 (zh) 图标的角标显示方法及装置
WO2016090831A1 (zh) 页面显示方法及装置、电子设备
WO2021013147A1 (zh) 视频处理方法、装置、终端及存储介质
CN104035674B (zh) 图片显示方法和装置
WO2023284632A1 (zh) 图像展示方法、装置及电子设备
CN112817675A (zh) 界面显示的处理方法、装置、电子设备及存储介质
CN107566878B (zh) 直播中显示图片的方法及装置
US10951816B2 (en) Method and apparatus for processing image, electronic device and storage medium
US10204403B2 (en) Method, device and medium for enhancing saturation
CN104536713B (zh) 显示图像中的字符的方法及装置
CN106557294A (zh) 色彩调整方法及装置
CN106775548B (zh) 页面处理方法及装置
US10827156B2 (en) Light filling method and apparatus for photographing environment, and computer-readable storage medium
WO2018036526A1 (zh) 显示方法及装置
WO2021147976A1 (zh) 图像处理方法、装置、电子设备及存储介质
CN106371714B (zh) 信息显示方法及装置

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2017527795

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2017121651

Country of ref document: RU

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 20187031499

Country of ref document: KR

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16913426

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16913426

Country of ref document: EP

Kind code of ref document: A1