WO2017071559A1 - Image processing apparatus and method

Image processing apparatus and method

Info

Publication number
WO2017071559A1
WO2017071559A1 · PCT/CN2016/103238 · CN2016103238W
Authority
WO
WIPO (PCT)
Prior art keywords
image
foreground
background
processed
color transfer
Prior art date
Application number
PCT/CN2016/103238
Other languages
English (en)
French (fr)
Inventor
Dai Xiangdong (戴向东)
Original Assignee
Nubia Technology Co., Ltd. (努比亚技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nubia Technology Co., Ltd. (努比亚技术有限公司)
Publication of WO2017071559A1 publication Critical patent/WO2017071559A1/zh

Links

Images

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04N23/84 Camera processing pipelines; Components thereof for processing colour signals
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/272 Means for inserting a foreground image in a background image, i.e. inlay, outlay
    • H04N9/00 Details of colour television systems
    • H04N9/64 Circuits for processing colour signals
    • H04N9/646 Circuits for processing colour signals for image enhancement, e.g. vertical detail restoration, cross-colour elimination, contour correction, chrominance trapping filters

Definitions

  • the present application relates to, but is not limited to, the field of image processing technology, and more particularly to an image processing apparatus and method.
  • Color transfer is a method of automatically adding or changing image colors: given an original image and a reference image, the low-order statistical features of the original image are linearly transformed toward those of the reference image, so that the color information of the reference image is transferred to the original image, changing the colors of the original image to have color characteristics similar to those of the reference image.
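The low-order statistical color transfer described above can be sketched as per-channel mean and standard-deviation matching. Below is a minimal NumPy illustration; all names are illustrative and, unlike the Reinhard-style method, it matches statistics directly per channel rather than first converting to a decorrelated lαβ color space:

```python
import numpy as np

def color_transfer(source: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Match the per-channel mean and standard deviation of `source`
    to those of `reference` (low-order statistical color transfer).
    Both inputs are (H, W, 3) images with values in [0, 255]."""
    out = np.empty(source.shape, dtype=np.float64)
    for c in range(3):
        src = source[..., c].astype(np.float64)
        ref = reference[..., c].astype(np.float64)
        src_std = src.std() or 1.0  # guard against flat channels
        # shift to zero mean, rescale to the reference std, re-center on the reference mean
        out[..., c] = (src - src.mean()) / src_std * ref.std() + ref.mean()
    return np.clip(out, 0.0, 255.0)
```

For better perceptual results the same matching would typically be done after converting both images to a decorrelated space such as lαβ or CIELAB, then converting back.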
  • The present application provides an image processing apparatus and method for performing more flexible color transfer on images.
  • An embodiment of the present invention provides an image processing apparatus, where the image processing apparatus includes:
  • a dividing module configured to: obtain depth information of an image to be processed, and divide the image to be processed into a foreground image and a background image according to the obtained depth information;
  • a transform module configured to: acquire a reference image for color transfer, and perform color transfer on the foreground image and/or the background image according to the acquired reference image, so that the foreground image and the background image have different color characteristics;
  • a synthesis module configured to: fuse the foreground image and the background image into a result image when the color transfer is completed.
  • the image processing apparatus further includes:
  • a shooting module configured to: when a shooting instruction is detected, photograph the scene to be shot and acquire depth information of the scene to be shot;
  • an association module configured to: associate the depth information of the scene to be shot with the captured image, and use the captured image as the image to be processed.
  • the dividing module is configured to: search locally or in the cloud for depth information pre-associated with the image to be processed, and when such pre-associated depth information is found, use it as the depth information of the image to be processed.
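The division into a foreground image and a background image according to depth information can be sketched as a simple threshold on the depth map. The threshold rule below is an illustrative assumption; the disclosure does not fix a particular segmentation criterion:

```python
import numpy as np

def split_by_depth(image: np.ndarray, depth: np.ndarray, threshold: float):
    """Split an (H, W, 3) image into foreground and background layers using a
    per-pixel (H, W) depth map; pixels nearer than `threshold` are foreground.
    Returns (foreground, background, mask) with non-member pixels zeroed."""
    mask = depth < threshold  # True where the pixel is close to the camera
    foreground = np.where(mask[..., None], image, 0)
    background = np.where(mask[..., None], 0, image)
    return foreground, background, mask
```

In practice the threshold could be chosen automatically, e.g. as the median depth, or replaced by any other depth-based segmentation rule.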
  • the transforming module is configured to adopt color transfer based on low-order statistical information, or adopt color transfer based on high-order statistical information, or adopt cluster-based regional color transfer.
  • the image processing apparatus further includes a feathering module configured to: perform feathering processing on the segmentation edges of the foreground image and the background image when the color transfer is completed;
  • the synthesis module is further configured to fuse the foreground image and the background image into a resulting image upon completion of the feathering process.
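The feathering of the segmentation edge followed by fusion can be sketched as softening the binary mask into an alpha matte and alpha-blending the two layers. The box blur and radius below are illustrative choices (a Gaussian blur would serve equally well), not the patented implementation:

```python
import numpy as np

def feather_and_fuse(foreground, background, mask, radius=2):
    """Soften the hard segmentation `mask` with a naive box blur and
    alpha-blend the (H, W, 3) foreground and background layers."""
    alpha = mask.astype(np.float64)
    k = 2 * radius + 1
    padded = np.pad(alpha, radius, mode="edge")
    blurred = np.zeros_like(alpha)
    # sum the (2*radius+1)^2 neighborhood of each pixel by shifting the padded mask
    for dy in range(k):
        for dx in range(k):
            blurred += padded[dy:dy + alpha.shape[0], dx:dx + alpha.shape[1]]
    alpha = (blurred / (k * k))[..., None]
    return alpha * foreground + (1.0 - alpha) * background
```

Away from the segmentation edge the result equals the pure foreground or background; across the edge it transitions smoothly, which hides the seam between the independently color-transferred regions.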
  • the transform module is further configured to: display a selection interface of reference images, for the user to select, based on the selection interface, a reference image for color transfer of the foreground image and/or the background image; and when a selection instruction triggered by the user based on the selection interface is received, acquire the reference image corresponding to the selection instruction.
  • the transform module is further configured to: perform color transfer on the foreground image and the background image respectively, using the reference image corresponding to each of the foreground image and the background image.
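Using a separate reference image for each of the foreground and background can be sketched as running statistical matching twice, once per masked region. The helper names below are hypothetical and the per-channel mean/std matching is one possible choice of transfer method:

```python
import numpy as np

def transfer_in_mask(image, mask, reference):
    """Match per-channel mean/std of the pixels selected by `mask`
    to the statistics of `reference`; other pixels are left untouched."""
    out = image.astype(np.float64).copy()
    for c in range(3):
        region = out[..., c][mask]
        ref = reference[..., c].astype(np.float64)
        std = region.std() or 1.0  # guard against flat regions
        out[..., c][mask] = (region - region.mean()) / std * ref.std() + ref.mean()
    return out

def per_region_transfer(image, mask, ref_foreground, ref_background):
    """Color-transfer the foreground (mask True) and background (mask False)
    regions independently, each against its own reference image."""
    out = transfer_in_mask(image, mask, ref_foreground)
    return transfer_in_mask(out, ~mask, ref_background)
```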
  • the image processing apparatus further includes:
  • a recording module configured to: when the dividing module divides the image to be processed into a foreground image and a background image, record connection information between the foreground image and the background image;
  • the synthesis module is configured to: when the color transfer is completed, fuse the foreground image and the background image into a result image according to the connection information between the foreground image and the background image recorded by the recording module.
  • the image processing apparatus further includes:
  • a display module configured to: display the result image.
  • the display module is further configured to: when displaying the result image, display a prompt information display interface for the user to confirm, based on the prompt information display interface, whether to store the result image;
  • when confirmation information input by the user based on the prompt information display interface is received, the result image is stored in a storage area pointed to by a preset storage path.
  • An embodiment of the present invention further provides an image processing method, where the image processing method includes:
  • acquiring depth information of an image to be processed, and dividing the image to be processed into a foreground image and a background image according to the acquired depth information; acquiring a reference image for color transfer, and performing color transfer on the foreground image and/or the background image according to the acquired reference image, so that the foreground image and the background image have different color characteristics; and when the color transfer is completed, fusing the foreground image and the background image into a result image.
  • before the step of acquiring the depth information of the image to be processed and dividing the image to be processed into the foreground image and the background image according to the acquired depth information, the method further includes:
  • the shooting scene is photographed, and the depth information of the scene to be photographed is acquired;
  • the depth information of the scene to be photographed is associated with the photographed image, and the photographed image is taken as the image to be processed.
  • the acquiring the depth information of the image to be processed includes: searching locally or in the cloud for depth information pre-associated with the image to be processed, and when such pre-associated depth information is found, using it as the depth information of the image to be processed.
  • the step of acquiring a reference image for color transfer and performing color transfer on the foreground image and/or the background image according to the acquired reference image is performed based on low-order statistical information, or based on high-order statistical information, or using cluster-based regional color transfer.
  • before the step of fusing the foreground image and the background image into a result image, the method further includes: performing feathering processing on the segmentation edges of the foreground image and the background image;
  • when the feathering processing is completed, the step of fusing the foreground image and the background image into a result image is performed.
  • the acquiring the reference image for color transfer includes: displaying a selection interface of reference images, for the user to select, based on the selection interface, a reference image for color transfer of the foreground image and/or the background image; and when a selection instruction triggered by the user based on the selection interface is received, acquiring the reference image corresponding to the selection instruction.
  • the performing color transfer on the foreground image and/or the background image according to the acquired reference image includes:
  • performing color transfer on the foreground image and the background image respectively, using the reference image corresponding to each of the foreground image and the background image.
  • the method further includes: recording connection information between the foreground image and the background image;
  • the fusing the foreground image and the background image into a result image includes: fusing the foreground image and the background image into a result image according to the connection information between the foreground image and the background image.
  • the method further includes:
  • the result image is displayed.
  • after the displaying the result image, the method further includes: displaying a prompt information display interface for the user to confirm, based on the prompt information display interface, whether to store the result image;
  • when confirmation information input by the user based on the prompt information display interface is received, the result image is stored in a storage area pointed to by a preset storage path.
  • Embodiments of the present invention further provide a computer-readable storage medium storing computer-executable instructions that, when executed by a processor, implement the image processing method described above.
  • In the embodiments, when performing color transfer on an image to be processed, the mobile terminal first divides the image to be processed into a foreground image and a background image based on the depth information of the image to be processed, and then performs color transfer on the foreground image and/or the background image, so that the two regions can be processed flexibly and independently.
  • FIG. 1 is a schematic structural diagram of hardware of a mobile terminal that implements an embodiment of the present invention
  • FIG. 2 is a schematic diagram of a wireless communication device of the mobile terminal shown in FIG. 1;
  • FIG. 3 is a schematic diagram of functional modules of a first embodiment of an image processing apparatus according to the present invention.
  • FIG. 4 is a diagram showing an example of an image to be processed in the first embodiment of the image processing apparatus of the present invention
  • Figure 5 is a diagram showing an example of a foreground image in the first embodiment of the image processing apparatus of the present invention.
  • Figure 6 is a diagram showing an example of a background image in the first embodiment of the image processing apparatus of the present invention.
  • FIG. 7 is a diagram showing an example of another image to be processed in the first embodiment of the image processing apparatus of the present invention.
  • Figure 8 is a diagram showing an example of a reference image for color transfer in the first embodiment of the image processing apparatus of the present invention.
  • FIG. 9 is a diagram showing an example of a result image of completion of color transfer of another image to be processed in the first embodiment of the image processing apparatus of the present invention.
  • FIG. 10 is a schematic flowchart diagram of a first embodiment of an image processing method according to the present invention.
  • the mobile terminal can be implemented in a variety of forms.
  • The terminals described herein may include mobile terminals such as mobile phones, smart phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), and navigation devices, as well as fixed terminals such as digital TVs and desktop computers.
  • In the following description, it is assumed that the terminal is a mobile terminal.
  • However, those skilled in the art will understand that, except for components intended specifically for mobile purposes, the configurations according to the embodiments of the present invention can also be applied to fixed-type terminals.
  • FIG. 1 is a schematic structural diagram of hardware of a mobile terminal that implements an embodiment of the present invention.
  • The mobile terminal 100 may include a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like.
  • Figure 1 illustrates a mobile terminal having various components, but it should be understood that not all illustrated components are required to be implemented. More or fewer components can be implemented instead. The elements of the mobile terminal will be described in detail below.
  • Wireless communication unit 110 typically includes one or more components that permit radio communication between mobile terminal 100 and a wireless communication device or network.
  • the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.
  • the broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast management server via a broadcast channel.
  • the broadcast channel can include a satellite channel and/or a terrestrial channel.
  • The broadcast management server may be a server that generates and transmits a broadcast signal and/or broadcast associated information, or a server that receives a previously generated broadcast signal and/or broadcast associated information and transmits it to the terminal.
  • the broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like.
  • the broadcast signal may further include a broadcast signal combined with a TV or radio broadcast signal.
  • the broadcast associated information may also be provided via a mobile communication network, and in this case, the broadcast associated information may be received by the mobile communication module 112.
  • the broadcast signal may exist in various forms, for example, it may exist in the form of Digital Multimedia Broadcasting (DMB) Electronic Program Guide (EPG), Digital Video Broadcasting Handheld (DVB-H) Electronic Service Guide (ESG), and the like.
  • the broadcast receiving module 111 can receive a signal broadcast by using a plurality of types of broadcast apparatuses.
  • The broadcast receiving module 111 can receive digital broadcasts by using digital broadcasting devices such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcasting-handheld (DVB-H), the forward link media (MediaFLO®) data broadcasting device, and integrated services digital broadcasting-terrestrial (ISDB-T).
  • The broadcast receiving module 111 can be configured to be suitable for various broadcast devices that provide broadcast signals, in addition to the above-described digital broadcast devices.
  • the broadcast signal and/or broadcast associated information received via the broadcast receiving module 111 may be stored in the memory 160 (or other type of storage medium).
  • the mobile communication module 112 transmits the radio signals to and/or receives radio signals from at least one of a base station (e.g., an access point, a Node B, etc.), an external terminal, and a server.
  • Such radio signals may include voice call signals, video call signals, or multiple types of data transmitted and/or received in accordance with text and/or multimedia messages.
  • the wireless internet module 113 supports wireless internet access of the mobile terminal.
  • the module can be internally or externally coupled to the terminal.
  • The wireless internet access technologies supported by the module may include WLAN (wireless LAN, Wi-Fi), WiBro (wireless broadband), WiMAX (worldwide interoperability for microwave access), HSDPA (high-speed downlink packet access), and the like.
  • the short range communication module 114 is configured to support short range communication.
  • Some examples of short-range communication technology include Bluetooth™, radio frequency identification (RFID), Infrared Data Association (IrDA), ultra-wideband (UWB), ZigBee™, and the like.
  • the location information module 115 is configured to check or acquire location information of the mobile terminal.
  • A typical example of the location information module is the GPS (Global Positioning System) module.
  • The GPS module 115 calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information, thereby accurately calculating three-dimensional current position information in terms of longitude, latitude, and altitude.
  • Currently, the method for calculating position and time information uses three satellites and corrects errors in the calculated position and time information by using a further satellite.
  • the GPS module 115 is capable of calculating speed information by continuously calculating current position information in real time.
  • the A/V input unit 120 is arranged to receive an audio or video signal.
  • The A/V input unit 120 may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capture device in a video capturing mode or an image capturing mode.
  • the processed image frame can be displayed on the display unit 151.
  • the image frames processed by the camera 121 may be stored in the memory 160 (or other storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 may be provided according to the configuration of the mobile terminal.
  • the microphone 122 can receive sound (audio data) via a microphone in an operation mode of a telephone call mode, a recording mode, a voice recognition mode, and the like, and can process such sound as audio data.
  • In the case of the telephone call mode, the processed audio (voice) data can be converted into a format that can be transmitted to the mobile communication base station via the mobile communication module 112, and output.
  • The microphone 122 can implement a noise cancellation (or suppression) algorithm to cancel (or suppress) noise or interference generated while receiving and transmitting the audio signal.
  • the user input unit 130 may generate key input data according to a command input by the user to control the operation of the mobile terminal.
  • The user input unit 130 allows the user to input various types of information, and may include a keyboard, a dome switch, a touch pad (e.g., a touch-sensitive component that detects changes in resistance, pressure, capacitance, and so on caused by contact), a scroll wheel, a rocker, and the like.
  • In particular, when the touch pad is superposed on the display unit 151 in the form of a layer, a touch screen can be formed.
  • The sensing unit 140 detects the current state of the mobile terminal 100 (e.g., its open or closed state), the location of the mobile terminal 100, the presence or absence of user contact (i.e., touch input) with the mobile terminal 100, the orientation of the mobile terminal 100, and its acceleration or deceleration, and generates commands or signals for controlling the operation of the mobile terminal 100.
  • the sensing unit 140 can sense whether the slide type phone is turned on or off.
  • the sensing unit 140 can detect whether the power supply unit 190 provides power or whether the interface unit 170 is coupled to an external device.
  • Sensing unit 140 may include proximity sensor 141 which will be described below in connection with a touch screen.
  • The interface unit 170 serves as an interface through which at least one external device can be connected to the mobile terminal 100.
  • The external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like.
  • The identification module may store various information for verifying the user of the mobile terminal 100 and may include a user identification module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like.
  • the device having the identification module may take the form of a smart card, and thus the identification device may be connected to the mobile terminal 100 via a port or other connection device.
  • The interface unit 170 may be configured to receive input (e.g., data information, power, and so on) from an external device and transmit the received input to one or more components within the mobile terminal 100, or may be configured to transfer data between the mobile terminal and an external device.
  • When the mobile terminal 100 is connected to an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100, or as a path through which various command signals input from the cradle are transmitted to the mobile terminal.
  • Various command signals or power input from the cradle can serve as signals for identifying whether the mobile terminal is accurately mounted on the cradle.
  • Output unit 150 is configured to provide an output signal (eg, an audio signal, a video signal, an alarm signal, a vibration signal, etc.) in a visual, audio, and/or tactile manner.
  • the output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and the like.
  • the display unit 151 can display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 can display a user interface (UI) or a graphical user interface (GUI) related to a call or other communication (eg, text messaging, multimedia file download, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display unit 151 may display a captured image and/or a received image, a UI or GUI showing a video or image and related functions, and the like.
  • the display unit 151 can function as an input device and an output device.
  • the display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor LCD (TFT-LCD), an organic light emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like.
  • Some of these displays may be configured to be transparent to allow the user to view the outside therethrough; these may be referred to as transparent displays, a typical example of which is the TOLED (transparent organic light-emitting diode) display.
  • the mobile terminal 100 may include two or more display units (or other display devices), for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown) .
  • the touch screen can be set to detect touch input pressure as well as touch input position and touch input area.
  • The audio output module 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound when the mobile terminal is in a call signal receiving mode, a call mode, a recording mode, a voice recognition mode, a broadcast receiving mode, and the like.
  • the audio output module 152 can provide audio output (eg, call signal reception sound, message reception sound, etc.) associated with a particular function performed by the mobile terminal 100.
  • The audio output module 152 may include a speaker, a buzzer, and the like.
  • the alarm unit 153 can provide an output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and the like. In addition to audio or video output, the alert unit 153 can provide an output in a different manner to notify of the occurrence of an event. For example, the alarm unit 153 can provide an output in the form of vibrations, and when a call, message, or some other incoming communication is received, the alarm unit 153 can provide a tactile output (ie, vibration) to notify the user of it. By providing such a tactile output, the user is able to recognize the occurrence of an event even when the user's mobile phone is in the user's pocket. The alarm unit 153 can also provide an output of the notification event occurrence via the display unit 151 or the audio output module 152.
  • the memory 160 may store a software program or the like for processing and control operations performed by the controller 180, or may temporarily store data (for example, a phone book, a message, a still image, a video, etc.) that has been output or is to be output. Moreover, the memory 160 may store data regarding vibration and audio signals of various manners that are output when a touch is applied to the touch screen.
  • The memory 160 may include at least one type of storage medium, including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like.
  • the mobile terminal 100 can cooperate with a network storage device that performs a storage function of the memory 160 through a network connection.
  • the controller 180 typically controls the overall operation of the mobile terminal. For example, the controller 180 performs the control and processing associated with voice calls, data communications, video calls, and the like.
  • the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data, which may be constructed within the controller 180 or may be configured to be separate from the controller 180.
  • the controller 180 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
  • The power supply unit 190 receives external power or internal power under the control of the controller 180 and provides the appropriate power required to operate each element and component.
  • the embodiments described herein can be implemented in a computer readable medium using, for example, computer software, hardware, or any combination thereof.
  • The embodiments described herein may be implemented by using at least one of: application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, and electronic units designed to perform the functions described herein; in some cases, such embodiments may be implemented in the controller 180.
  • implementations such as procedures or functions may be implemented with separate software modules that permit the execution of at least one function or operation.
  • The software code can be implemented by a software application (or program) written in any suitable programming language, and can be stored in the memory 160 and executed by the controller 180.
  • the mobile terminal has been described in terms of its function.
  • In the following, a slide-type mobile terminal among various types of mobile terminals such as folding, bar, swing, and slide types will be described as an example; however, the embodiments of the present invention are applicable to any type of mobile terminal and are not limited to the slide type.
  • FIG. 2 is a block diagram of the electrical structure of the camera of FIG. 1.
  • the photographic lens 1211 is composed of a plurality of optical lenses for forming a subject image, and is a single focus lens or a zoom lens.
  • The photographic lens 1211 is movable in the optical-axis direction under the control of the lens driver 1221. The lens driver 1221 controls the focus position of the photographic lens 1211 in accordance with a control signal from the lens drive control circuit 1222 and, in the case of a zoom lens, can also control the focal distance.
  • the lens drive control circuit 1222 performs drive control of the lens driver 1221 in accordance with a control command from the microcomputer 1217.
  • An imaging element 1212 is disposed on the optical axis of the photographic lens 1211 near the position of the subject image formed by the photographic lens 1211.
  • the imaging element 1212 is provided to image the subject image and acquire captured image data.
  • Photodiodes constituting each pixel are arranged two-dimensionally and in a matrix on the imaging element 1212.
  • the photodiode generates a photoelectric conversion current corresponding to the amount of received light, and the photoelectric conversion current is charged by a capacitor connected to the photodiode.
  • the front surface of each pixel is provided with a Bayer array of RGB color filters.
  • the imaging element 1212 is connected to the imaging circuit 1213.
  • the imaging circuit 1213 performs charge accumulation control and image signal readout control in the imaging element 1212, and performs waveform shaping after reducing the reset noise of the read image signal (analog image signal). Further, gain improvement or the like is performed to obtain an appropriate signal level.
  • the imaging circuit 1213 is connected to an A/D converter 1214 that performs analog-to-digital conversion on the analog image signal and outputs a digital image signal (hereinafter referred to as image data) to the bus 1227.
  • the bus 1227 is a transmission path for transmitting a variety of data read or generated inside the camera.
  • The A/D converter 1214 is connected to the bus 1227, to which an image processor 1215, a JPEG processor 1216, a microcomputer 1217, an SDRAM (synchronous dynamic random access memory) 1218, a memory interface (hereinafter referred to as memory I/F) 1219, and an LCD (liquid crystal display) driver 1220 are also connected.
  • The image processor 1215 performs OB subtraction processing, white balance adjustment, color matrix calculation, gamma conversion, color difference signal processing, noise removal processing, simultaneous (demosaicing) processing, edge processing, and the like on the image data output from the imaging element 1212.
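Of the pipeline steps listed above, gamma conversion is straightforward to illustrate: a camera pipeline typically applies it through a precomputed lookup table. A minimal sketch (the function name and the 2.2 default are assumptions, not from the disclosure):

```python
import numpy as np

def gamma_correct(image: np.ndarray, gamma: float = 2.2) -> np.ndarray:
    """Apply gamma conversion to 8-bit linear-light image data
    via a 256-entry lookup table, as a camera pipeline typically would."""
    lut = (255.0 * (np.arange(256) / 255.0) ** (1.0 / gamma)).astype(np.uint8)
    return lut[image.astype(np.uint8)]
```

Building the table once and indexing into it keeps the per-pixel cost to a single memory lookup, which is why hardware pipelines favor this form.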
  • the JPEG processor 1216 compresses the image data read out from the SDRAM 1218 in accordance with the JPEG compression method when the image data is recorded on the recording medium 1225. Further, the JPEG processor 1216 decompresses JPEG image data for image reproduction display: the file recorded on the recording medium 1225 is read, decompression processing is performed in the JPEG processor 1216, and the decompressed image data is temporarily stored in the SDRAM 1218 and displayed on the LCD 1226. Further, in the present embodiment, the JPEG method is adopted as the image compression/decompression method; however, the compression/decompression method is not limited thereto, and other compression/decompression methods such as MPEG, TIFF, and H.264 may be used.
  • the microcomputer 1217 functions as the control unit of the entire camera and controls the camera's multiple processing sequences in a unified manner.
  • the microcomputer 1217 is connected to the operation unit 1223 and the flash memory 1224.
  • the operation unit 1223 includes, but is not limited to, physical or virtual controls such as a power button, a camera button, an edit button, a dynamic image button, a reproduction button, a menu button, a cross key, an OK button, a delete button, an enlarge button, and various other input buttons and keys, and it detects the operational states of these controls.
  • the detection result is output to the microcomputer 1217. Further, a touch panel is provided on the front surface of the LCD 1226 serving as the display; it detects the position touched by the user and outputs that position to the microcomputer 1217.
  • the microcomputer 1217 executes the processing sequences corresponding to the user's operation in accordance with the detection results from the operation unit 1223.
  • the flash memory 1224 stores programs for executing various processing sequences of the microcomputer 1217.
  • the microcomputer 1217 performs overall control of the camera in accordance with the program. Further, the flash memory 1224 stores various adjustment values of the camera, and the microcomputer 1217 reads out the adjustment value, and performs control of the camera in accordance with the adjustment value.
  • the SDRAM 1218 is an electrically rewritable volatile memory for temporarily storing image data or the like.
  • the SDRAM 1218 temporarily stores image data output from the A/D converter 1214 and image data processed in the image processor 1215, the JPEG processor 1216, and the like.
  • the memory interface 1219 is connected to the recording medium 1225, and performs control for writing image data and a file header attached to the image data to the recording medium 1225 and reading out from the recording medium 1225.
  • the recording medium 1225 is, for example, a recording medium such as a memory card that can be detachably attached to the camera body.
  • the recording medium 1225 is not limited thereto, and may be a hard disk or the like built in the camera body.
  • the LCD driver 1220 is connected to the LCD 1226. When image data processed by the image processor 1215 is stored in the SDRAM 1218, the LCD driver reads that image data and displays it on the LCD 1226.
  • When JPEG-compressed image data is stored in the SDRAM 1218, the JPEG processor 1216 reads and decompresses it, and the decompressed image data is displayed through the LCD 1226.
  • the LCD 1226 is configured to display an image on the back of the camera body.
  • In this embodiment the display is an LCD, but it is not limited thereto, and various display panels such as organic EL panels may be used.
  • the image processing apparatus includes:
  • the dividing module 10 is configured to: obtain depth information of an image to be processed, and divide the image to be processed into a foreground image and a background image according to the obtained depth information;
  • the image processing apparatus provided in this embodiment can be applied to mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (Personal Digital Assistants), PADs (tablet computers), and PMPs (portable multimedia players). For example, when the image processing apparatus is built into a mobile phone and the user performs color transfer on a photo taken with the phone, the phone automatically separates the portrait from the background according to the photo's depth information and uses different reference images to transfer color to the portrait part and the background part respectively, or transfers color only to the background part, increasing the flexibility and visual effect of color transfer.
  • the depth information may be used to describe the distance of any point in the image to be processed from the lens that captured it. For example, if the image to be processed is a portrait photo taken by a mobile phone, its depth information can describe the distance of the “person” in the photo from the phone at shooting time, as well as the distance of the “background” from the phone at shooting time.
  • the partitioning module 10 acquires the depth information pre-associated with the image to be processed and uses it as the image's depth information; for example, it searches locally or in the cloud for depth information pre-associated with the image to be processed, and uses whatever is found as the depth information of the image to be processed.
  • the dividing module 10 clusters each pixel of the image to be processed according to the acquired depth information, and divides the image to be processed into a foreground image and a background image, for example, in conjunction with FIG. 4 to FIG. 6, the image to be processed is a portrait photo (as shown in FIG. 4), and the mobile phone divides the aforementioned portrait photo into "people" (foreground image, as shown in FIG. 5) and "Background” (background image, as shown in Figure 6).
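  • The depth-based split described above can be sketched as follows. This is a minimal illustration assuming the depth map is a per-pixel array aligned with the image; the simple mean-depth threshold stands in for the clustering step, whose exact algorithm is not specified here.

```python
import numpy as np

def split_by_depth(image, depth, threshold=None):
    """Split an H x W x 3 image into foreground and background layers.

    `depth` is an H x W array; smaller values are assumed to be closer
    to the lens. The threshold defaults to the mean depth, a crude
    stand-in for the clustering described in the text.
    """
    if threshold is None:
        threshold = depth.mean()
    fg_mask = depth < threshold                    # near pixels -> foreground
    fg = np.where(fg_mask[..., None], image, 0.0)  # background zeroed out
    bg = np.where(fg_mask[..., None], 0.0, image)  # foreground zeroed out
    return fg, bg, fg_mask
```

The mask returned alongside the two layers corresponds to the recorded segmentation edge used later when the layers are fused back together.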
  • the transform module 20 is configured to: obtain a reference image for color transfer, and perform color transfer on the foreground image and/or the background image according to the obtained reference image, such that the color features of the foreground image differ from those of the background image;
  • the transform module 20 acquires a reference image for color transfer, wherein the reference image may include a first reference image and a second reference image, the first reference image being used for color transfer of the foreground image and the second reference image for color transfer of the background image; or the reference image is used only for color transfer of the foreground image; or only for color transfer of the background image.
  • the reference image may be obtained according to a preset setting. For example, if the mobile phone's default setting performs color transfer only on the background image, the transform module 20 acquires a reference image for color transfer of the background image; if the user presets color transfer of both the background image and the foreground image, the transform module 20 acquires a first reference image for color transfer of the foreground image and a second reference image for color transfer of the background image.
  • After acquiring the reference image for color transfer, the transform module 20 performs color transfer on the foreground image and/or the background image according to the acquired reference image, so that the color features of the foreground image and the background image differ.
  • the transforming module 20 is configured to adopt color transfer based on low-order statistical information, or adopt color transfer based on high-order statistical information, or adopt cluster-based regional color transfer.
  • The following takes color transfer based on low-order statistical information as an example to describe how the transform module 20 performs color transfer on the background image.
  • the transform module 20 first converts the background image and the second reference image from the RGB color space to the Lαβ color space, and uses the Lαβ color space as the execution space for color transfer, where the L channel is the achromatic (luminance) channel, the α channel is the chromatic yellow-blue channel, and the β channel is the chromatic red-green channel.
  • After completing the color space conversion of the background image and the second reference image, the transform module 20 first calculates the channel means of the background image (μ_L^bg, μ_α^bg, μ_β^bg) and of the second reference image (μ_L^ref, μ_α^ref, μ_β^ref), together with the corresponding channel standard deviations (σ_L^bg, σ_α^bg, σ_β^bg) and (σ_L^ref, σ_α^ref, σ_β^ref). The transform module 20 then subtracts the background image's channel means, scales the remainder by the ratio of the standard deviations, and finally adds the channel means of the second reference image:

    l_dst = (σ_L^ref / σ_L^bg) · (l − μ_L^bg) + μ_L^ref
    α_dst = (σ_α^ref / σ_α^bg) · (α − μ_α^bg) + μ_α^ref
    β_dst = (σ_β^ref / σ_β^bg) · (β − μ_β^bg) + μ_β^ref

  • where l_dst, α_dst, and β_dst are the transferred channel values of the pixel being processed, and l, α, and β are its original channel values.
  • In this way, the low-order statistics of each channel of the background image are made consistent with those of the second reference image, so that the color features of the second reference image are transferred to the background image; the transformed background image is then converted from the Lαβ color space back to the RGB color space for display on the mobile terminal.
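  • As a concrete sketch, the low-order statistics transfer above can be implemented as follows. The RGB↔Lαβ conversion matrices below are the commonly published ones for this kind of transfer and are an assumption, since they are not listed here; `color_transfer` matches the source's per-channel mean and standard deviation to those of the reference.

```python
import numpy as np

# RGB <-> LMS <-> L-alpha-beta matrices as commonly published for this
# style of color transfer (an assumption; the text omits them).
_RGB2LMS = np.array([[0.3811, 0.5783, 0.0402],
                     [0.1967, 0.7244, 0.0782],
                     [0.0241, 0.1288, 0.8444]])
_LMS2RGB = np.array([[ 4.4679, -3.5873,  0.1193],
                     [-1.2186,  2.3809, -0.1624],
                     [ 0.0497, -0.2439,  1.2045]])
_LMS2LAB = np.diag([1 / np.sqrt(3), 1 / np.sqrt(6), 1 / np.sqrt(2)]) @ \
    np.array([[1.0, 1.0, 1.0], [1.0, 1.0, -2.0], [1.0, -1.0, 0.0]])
_LAB2LMS = np.array([[1.0, 1.0, 1.0], [1.0, 1.0, -1.0], [1.0, -2.0, 0.0]]) @ \
    np.diag([1 / np.sqrt(3), 1 / np.sqrt(6), 1 / np.sqrt(2)])

def rgb_to_lab(rgb):
    """RGB in [0, 1] -> L-alpha-beta (a log-LMS opponent space)."""
    lms = np.log10(np.clip(rgb @ _RGB2LMS.T, 1e-6, None))
    return lms @ _LMS2LAB.T

def lab_to_rgb(lab):
    lms = 10.0 ** (lab @ _LAB2LMS.T)
    return lms @ _LMS2RGB.T

def color_transfer(src_rgb, ref_rgb):
    """Match per-channel mean/std of src to ref in L-alpha-beta space."""
    src, ref = rgb_to_lab(src_rgb), rgb_to_lab(ref_rgb)
    mu_s, sd_s = src.mean(axis=(0, 1)), src.std(axis=(0, 1))
    mu_r, sd_r = ref.mean(axis=(0, 1)), ref.std(axis=(0, 1))
    out = (src - mu_s) * (sd_r / np.maximum(sd_s, 1e-6)) + mu_r
    return np.clip(lab_to_rgb(out), 0.0, 1.0)
```

Applied to the background layer with the second reference image (and optionally to the foreground layer with the first), this reproduces the subtract-mean, scale-by-std-ratio, add-reference-mean transform described above.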
  • the color transfer of the foreground image by using the first reference image may be implemented by referring to the foregoing technical solution, and details are not described herein again. It can be understood by those skilled in the art that in other embodiments, the manner of color transfer can be selected according to actual needs, for example, color transfer based on high-order statistical information or cluster-based regional color transfer can be adopted.
  • the composition module 30 is configured to fuse the foreground image and the background image into a resultant image upon completion of color transfer.
  • the image processing apparatus further includes a recording module, configured to: when the dividing module 10 divides the image to be processed into a foreground image and a background image, record the connection information between the foreground image and the background image (i.e., the segmentation edge).
  • the synthesizing module 30 fuses the foreground image and the background image into a result image according to the connection information between the foreground image and the background image recorded by the recording module.
  • FIG. 7 is a to-be-processed image taken by a user
  • FIG. 8 is a reference image for performing color transfer on the background portion of FIG. 7
  • FIG. 9 is the resulting image after color transfer is completed. As shown, the color atmosphere of the sunset-backlight scene of FIG. 8 is transferred to the background portion of the image to be processed shown in FIG. 7, while the foreground portion (the portrait) keeps its original color features, giving a fresh visual sense and a special artistic effect.
  • the image processing apparatus further includes a display module, configured to: display the result image.
  • the display module displays the result image on the mobile terminal where it is located, so that the user can immediately view the result of the color transfer of the image to be processed.
  • the display module may further display a prompt interface for the user to confirm whether to store the result image; when the user confirms storage based on the prompt interface, the display module stores the result image in the storage area pointed to by the preset storage path.
  • If the user is not satisfied with the result image of the color transfer, the user may choose not to store it.
  • the image processing device provided in this embodiment is built into the mobile terminal, so that when the mobile terminal performs color transfer, the image to be processed is first divided into a foreground image and a background image based on its depth information; a color-transfer technique is then applied to the foreground image and the background image using different reference images, or applied with a reference image to only one of them; and after the color transfer is completed, the foreground image and the background image are merged into a result image, so that the foreground portion and the background portion of the resulting image have different color atmospheres. Compared with approaches that can only perform overall color transfer on the image to be processed, the embodiment of the present invention performs color transfer on images more flexibly.
  • the image processing apparatus further includes:
  • the shooting module is configured to: when a shooting instruction is detected, capture the scene to be shot and obtain depth information of the scene to be shot;
  • the association module is configured to: associate the depth information of the scene to be shot with the captured image, and use the captured image as the image to be processed.
  • In this embodiment, the color transfer scheme described in the first embodiment is applied to photographing. For example, with the image processing device built into the mobile terminal, the mobile terminal can use color transfer to transfer the color atmosphere of a sunset-backlight scene to the captured image, so that the captured image exhibits a striking sunset atmosphere, or transfer the golden colors of autumn to the captured image, giving the captured image a seasonal-change effect.
  • When the photographing module captures the scene (i.e., calls the camera of the mobile terminal to take a photograph), the depth information of the scene to be photographed is obtained by a binocular camera or a depth sensor preset in the mobile terminal, where the binocular camera consists of two cameras located on the same side of the mobile terminal and a certain distance apart.
  • When the binocular camera is used to obtain the depth information of the scene to be photographed, the scene image captured by either camera of the binocular pair may be used as the image to be processed according to the default setting of the mobile terminal, or the scene image captured by a user-specified camera may be used as the image to be processed according to the user's setting.
  • the shooting module separately captures two scene images through the binocular camera of the mobile terminal, and generates a depth map using the gray-level information and imaging geometry of the two scene images; each pixel value in the depth map represents the distance between a point in the scene and the camera.
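  • For the binocular case, per-pixel depth follows from the disparity between the two views. The conversion below uses the standard pinhole-stereo relation depth = f · B / d (an assumed model; the text does not give the formula), where f is the focal length in pixels and B is the baseline between the two cameras.

```python
import numpy as np

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Convert a disparity map (in pixels) to metric depth.

    Standard pinhole-stereo relation: depth = f * B / d. Zero disparity
    (infinitely far points) is clamped to avoid division by zero.
    """
    d = np.maximum(np.asarray(disparity_px, dtype=float), 1e-6)
    return focal_px * baseline_m / d
```

Note the inverse relationship: halving the disparity doubles the estimated depth, which is why nearby foreground objects are the easiest to separate from the background.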
  • Alternatively, the photographing module receives, through a depth sensor provided in the mobile terminal, light energy emitted or reflected by the scene to be photographed, forming a light-energy distribution function related to the scene, i.e., a grayscale image; or the shooting module emits energy toward the scene through the depth sensor and receives the portion of that energy reflected back, likewise forming a grayscale image. The depth information of the captured scene is then restored from these images.
  • the shooting module may also designate another camera for shooting the scene to be photographed, while the binocular camera is designated for acquiring the depth information of the scene to be captured.
  • the image processing apparatus further includes a feathering module, configured to: when the color transfer is completed, feather the segmentation edge between the foreground image and the background image;
  • the synthesis module is further configured to fuse the foreground image and the background image into a resulting image upon completion of the feathering process.
  • Because the transform module 20 applies a color-transfer linear transformation to one of the foreground image and the background image, or applies linear transformations to the foreground image and the background image using different reference images, the foreground image and the background image end up with a certain color contrast. If the foreground image and the background image were fused directly, the transition at their merged edge (i.e., the segmentation edge) would be relatively stiff, affecting the display effect of the resulting image.
  • Therefore, before the synthesis module 30 merges the foreground image and the background image, the feathering module first performs feathering on their segmentation edge according to a preset feather value (set by default on the mobile terminal, or user-defined); after the feathering module completes the feathering process, the synthesis module 30 merges the foreground image and the background image into a result image.
  • The principle of feathering is to blur the edge of the image-fusion region so that it changes gradually, achieving a natural transition. The larger the feather value, the wider the blurred range and the softer the color change; the smaller the feather value, the narrower the blurred range and the sharper the color change. The feather value can be adjusted according to the actual situation.
  • In this embodiment, the segmentation edge between the foreground image and the background image is feathered, so that the fused edge of the resulting image transitions relatively naturally, which can improve the display effect of the resulting image.
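  • The feathering and fusion steps can be sketched as follows. A separable box blur stands in for the unspecified feathering kernel, and `radius` plays the role of the feather value: a larger radius gives a wider, softer transition band.

```python
import numpy as np

def feather_mask(mask, radius):
    """Soften a binary foreground mask with a separable box blur.

    A larger radius widens the blurred band (softer transition); a
    smaller radius keeps the segmentation edge sharper.
    """
    k = np.ones(2 * radius + 1) / (2 * radius + 1)
    soft = np.apply_along_axis(lambda r: np.convolve(r, k, mode='same'),
                               1, mask.astype(float))
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode='same'),
                               0, soft)

def fuse(fg, bg, mask, radius=3):
    """Alpha-blend foreground over background along the feathered edge."""
    alpha = feather_mask(mask, radius)[..., None]
    return alpha * fg + (1.0 - alpha) * bg
```

Deep inside either region the blend is unchanged; only pixels near the segmentation edge receive intermediate weights, which is exactly the gradual transition the feathering step is meant to produce.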
  • a fourth embodiment of the image processing apparatus of the present invention is provided.
  • the transformation module 20 is further configured to: display a selection interface for the reference image, so that the user can select, based on the selection interface, a reference image for color transfer of the foreground image and/or the background image; and, upon receiving a selection instruction triggered by the user via the selection interface, acquire the reference image corresponding to the selection instruction.
  • In this embodiment, when performing the color-transfer linear transformation of the foreground image and the background image, the user may specify a reference image for each of the foreground image and the background image, or specify a reference image for only one of the two.
  • the transformation module 20 displays a selection interface for the reference image, allowing the user to select, based on the interface, a reference image for color transfer of the foreground image and/or the background image; for example, the user may select a first reference image for color transfer of the foreground image and a second reference image for color transfer of the background image, or select a reference image for only one of the foreground image and the background image.
  • Upon receiving a selection instruction triggered by the user via the selection interface, the transformation module 20 acquires, according to the selection instruction, a first reference image corresponding to the foreground image and a second reference image corresponding to the background image; or, when the selection instruction designates only one of the foreground image and the background image, the transformation module 20 acquires the reference image corresponding to that one.
  • For example, with the image processing device built into a mobile phone and the image to be processed being a landscape photo containing a portrait taken by the user, the phone divides the “person” (the foreground part) of the photo into the foreground image and the “landscape” (the background part) into the background image. The user can then select a reference image with the golden colors of autumn for the “landscape” and select no reference image for the “person”. After the selection is completed, the phone transfers the golden colors of autumn to the “landscape”, finally giving the photo a seasonal-change effect.
  • By specifying, in response to user operations, different reference images for the foreground portion and the background portion of the image to be processed, or specifying a reference image for only one of them, the user experience can be improved.
  • a fifth embodiment of the image processing apparatus of the present invention is proposed.
  • the transformation module 20 is further configured to: perform color transfer on the foreground image and the background image using the reference image corresponding to each of them respectively.
  • In this embodiment, the transformation module 20 acquires a first reference image corresponding to the foreground image and a second reference image corresponding to the background image, performs color transfer on the foreground image using the acquired first reference image, and performs color transfer on the background image using the acquired second reference image.
  • The following describes how the transform module 20 performs color transfer on the background image using the acquired second reference image.
  • the transform module 20 first converts the background image and the second reference image from the RGB color space to the Lαβ color space, and uses the Lαβ color space as the execution space for color transfer, where the L channel is the achromatic (luminance) channel, the α channel is the chromatic yellow-blue channel, and the β channel is the chromatic red-green channel.
  • After completing the color space conversion of the background image and the second reference image, the transform module 20 first calculates the channel means of the background image (μ_L^bg, μ_α^bg, μ_β^bg) and of the second reference image (μ_L^ref, μ_α^ref, μ_β^ref), together with the corresponding channel standard deviations (σ_L^bg, σ_α^bg, σ_β^bg) and (σ_L^ref, σ_α^ref, σ_β^ref). The transform module 20 then subtracts the background image's channel means, scales the remainder by the ratio of the standard deviations, and finally adds the channel means of the second reference image:

    l_dst = (σ_L^ref / σ_L^bg) · (l − μ_L^bg) + μ_L^ref
    α_dst = (σ_α^ref / σ_α^bg) · (α − μ_α^bg) + μ_α^ref
    β_dst = (σ_β^ref / σ_β^bg) · (β − μ_β^bg) + μ_β^ref

  • where l_dst, α_dst, and β_dst are the transferred channel values of the pixel being processed, and l, α, and β are its original channel values.
  • In this way, the low-order statistics of each channel of the background image are made consistent with those of the second reference image, so that the color features of the second reference image are transferred to the background image; the transformed background image is then converted from the Lαβ color space back to the RGB color space for display on the mobile terminal.
  • the color transfer of the foreground image by using the first reference image may be implemented by referring to the foregoing technical solution, and details are not described herein again. It can be understood by those skilled in the art that in other embodiments, the manner of color transfer can be selected according to actual needs, for example, color transfer based on high-order statistical information or cluster-based regional color transfer can be adopted.
  • An embodiment of the present invention further provides an image processing method.
  • the image processing method includes:
  • Step S10: obtain depth information of an image to be processed, and divide the image to be processed into a foreground image and a background image according to the obtained depth information;
  • the image processing method provided in this embodiment can be applied to mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (Personal Digital Assistants), PADs (tablet computers), and PMPs (portable multimedia players). For example, when the user performs color transfer on a photo taken with a mobile phone, the phone automatically separates the portrait from the background according to the photo's depth information and uses different reference images to transfer color to the portrait part and the background part respectively, or transfers color only to the background part, increasing the flexibility and visual effect of color transfer.
  • the depth information may be used to describe the distance of any point in the image to be processed from the lens that captured it. For example, if the image to be processed is a portrait photo taken by a mobile phone, its depth information can describe the distance of the “person” in the photo from the phone at shooting time, as well as the distance of the “background” from the phone at shooting time.
  • the mobile terminal acquires the depth information pre-associated with the image to be processed and uses it as the image's depth information; for example, it searches locally or in the cloud for depth information pre-associated with the image to be processed, and uses whatever is found as the depth information of the image to be processed.
  • the mobile terminal clusters each pixel of the image to be processed according to the acquired depth information, and divides the image to be processed into a foreground image and a background image. For example, in conjunction with FIG. 4 to FIG. 6, the image to be processed is a portrait photo (as shown in FIG. 4), and the mobile phone divides it into “person” (the foreground image, as shown in FIG. 5) and “background” (the background image, as shown in FIG. 6).
  • Step S20: acquire a reference image for color transfer, and perform color transfer on the foreground image and/or the background image according to the obtained reference image, so that the foreground image and the background image have different color features;
  • After dividing the image to be processed into the foreground image and the background image, the mobile terminal acquires a reference image for color transfer, wherein the reference image may include a first reference image and a second reference image, the first reference image being used for color transfer of the foreground image and the second reference image for color transfer of the background image; or the reference image is used only for color transfer of the foreground image; or only for color transfer of the background image.
  • The reference image is obtained according to a preset setting. For example, if the mobile phone's default setting performs color transfer only on the background image, the phone acquires a reference image for color transfer of the background image; if the user presets color transfer of both the background image and the foreground image, the phone acquires a first reference image for color transfer of the foreground image and a second reference image for color transfer of the background image.
  • After acquiring the reference image for color transfer, the mobile terminal performs color transfer on the foreground image and/or the background image according to the acquired reference image, so that the color features of the foreground image and the background image differ.
  • Color transfer based on low-order statistical information, color transfer based on high-order statistical information, or cluster-based regional color transfer may be employed.
  • Taking color transfer based on low-order statistical information as an example, the mobile terminal first converts the background image and the second reference image from the RGB color space to the Lαβ color space, and uses the Lαβ color space as the execution space for color transfer, where the L channel is the achromatic (luminance) channel, the α channel is the chromatic yellow-blue channel, and the β channel is the chromatic red-green channel.
  • After completing the color space conversion of the background image and the second reference image, the mobile terminal first calculates the channel means of the background image (μ_L^bg, μ_α^bg, μ_β^bg) and of the second reference image (μ_L^ref, μ_α^ref, μ_β^ref), together with the corresponding channel standard deviations (σ_L^bg, σ_α^bg, σ_β^bg) and (σ_L^ref, σ_α^ref, σ_β^ref). The mobile terminal then subtracts the background image's channel means, scales the remainder by the ratio of the standard deviations, and finally adds the channel means of the second reference image:

    l_dst = (σ_L^ref / σ_L^bg) · (l − μ_L^bg) + μ_L^ref
    α_dst = (σ_α^ref / σ_α^bg) · (α − μ_α^bg) + μ_α^ref
    β_dst = (σ_β^ref / σ_β^bg) · (β − μ_β^bg) + μ_β^ref

  • where l_dst, α_dst, and β_dst are the transferred channel values of the pixel being processed, and l, α, and β are its original channel values.
  • In this way, the low-order statistics of each channel of the background image are made consistent with those of the second reference image, so that the color features of the second reference image are transferred to the background image; the transformed background image is then converted from the Lαβ color space back to the RGB color space for display on the mobile terminal.
  • The color transfer of the foreground image using the first reference image may be implemented in the same way and is not described again here. Those skilled in the art will understand that in other embodiments the manner of color transfer can be chosen according to actual needs; for example, color transfer based on high-order statistical information or cluster-based regional color transfer may be adopted.
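The low-order statistics transfer described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the patent's implementation: it assumes both images have already been converted to the Lαβ color space (the per-channel mean/std matching itself is identical in any channel space), and the function name `color_transfer` is an illustrative choice.

```python
import numpy as np

def color_transfer(src: np.ndarray, ref: np.ndarray) -> np.ndarray:
    """Match per-channel mean and standard deviation of `src` to `ref`.

    Both arrays are H x W x 3 images in the L-alpha-beta color space.
    For each channel c:  out_c = (sigma_ref / sigma_src) * (src_c - mu_src) + mu_ref
    """
    out = np.empty_like(src, dtype=np.float64)
    for c in range(3):
        mu_s, sigma_s = src[..., c].mean(), src[..., c].std()
        mu_r, sigma_r = ref[..., c].mean(), ref[..., c].std()
        scale = sigma_r / sigma_s if sigma_s > 0 else 1.0  # guard against a flat channel
        out[..., c] = scale * (src[..., c] - mu_s) + mu_r
    return out
```

After the call, every channel of the output has the reference image's mean and standard deviation, which is exactly the "remove the mean, scale by the ratio of standard deviations, add the reference mean" rule given above.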
  • Step S30: when the color transfer is completed, the foreground image and the background image are fused into a result image.
  • Optionally, when dividing the image to be processed, the mobile terminal records the connection information between the foreground image and the background image (i.e., the segmentation edge). On completing the color transfer, the mobile terminal fuses the foreground image and the background image into a result image based on the recorded connection information.
  • For example, referring to FIG. 7 to FIG. 9: FIG. 7 is an image to be processed taken by a user, FIG. 8 is the reference image used for color transfer of the background portion of FIG. 7, and FIG. 9 is the result image after the color transfer is completed. As shown in FIG. 9, this embodiment uses color transfer to pass the color atmosphere of the sunset backlight scene of FIG. 8 to the background portion of the image to be processed in FIG. 7, while keeping the original color features of the foreground portion (the portrait), giving a fresh visual impression and a special artistic effect.
  • Optionally, the method further includes: displaying the result image.
  • After fusing the foreground image and the background image into the result image, the mobile terminal displays the result image so that the user can immediately view the result of the color transfer applied to the image to be processed.
  • While displaying the result image, the mobile terminal may further display a prompt interface for the user to confirm whether to store the result image. Upon receiving the confirmation input through the prompt interface, the mobile terminal stores the result image in the storage area pointed to by a preset storage path. If the user is not satisfied with the result image, the user may choose not to store it.
  • In summary, the mobile terminal first divides the image to be processed into a foreground image and a background image based on the depth information of the image to be processed; it then applies the color transfer technique to the foreground image and the background image using different reference images, or applies it to only one of the two using a single reference image. After completing the color transfer, it fuses the foreground image and the background image into a result image whose foreground portion and background portion have different color atmospheres. Compared with the related art, which can only apply color transfer to the image to be processed as a whole, the embodiment of the present invention performs color transfer of images more flexibly.
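The division step can be illustrated with a simple per-pixel threshold on the depth map: pixels closer than a cut-off depth form the foreground, the rest the background. This is a hedged sketch; the midpoint threshold is an illustrative stand-in for the clustering of pixels by depth that the embodiment describes, and the helper name `split_by_depth` is not from the patent.

```python
import numpy as np

def split_by_depth(image: np.ndarray, depth: np.ndarray):
    """Divide `image` (H x W x 3) into foreground/background using `depth` (H x W).

    Smaller depth values mean closer to the camera. Pixels outside each
    region are zeroed so the two parts can later be fused back together.
    """
    threshold = (depth.min() + depth.max()) / 2.0  # illustrative cut-off
    fg_mask = depth < threshold
    foreground = np.where(fg_mask[..., None], image, 0)
    background = np.where(fg_mask[..., None], 0, image)
    return foreground, background, fg_mask
```

The returned mask is exactly the "connection information" needed later: it records which pixels belong to which part, so the two parts tile the original image without overlap.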
  • Optionally, based on the first embodiment, a second embodiment of the image processing method of the present invention is provided, in which the method further includes: upon detecting a shooting instruction, photographing the scene to be photographed and acquiring depth information of the scene to be photographed; and associating the depth information of the scene to be photographed with the captured image, taking the captured image as the image to be processed.
  • This embodiment applies the color transfer scheme described in the first embodiment to photographing. For example, the color atmosphere of a sunset backlight scene can be passed to the captured image by color transfer so that the captured image exhibits a high-contrast sunset atmosphere, or the golden color atmosphere of autumn can be passed to the captured image so that it appears to change season.
  • The depth information of the scene to be photographed is acquired through a preset binocular camera or a depth sensor, where the binocular camera refers to two cameras located on the same side of the mobile terminal and separated by a certain distance. When the binocular camera is used to obtain the depth information, the scene image captured by either of the two cameras may be used as the image to be processed according to the default setting of the mobile terminal, or the scene image captured by a user-specified camera may be used as the image to be processed according to the user setting.
  • For example, the mobile terminal captures two scene images through the binocular camera and generates a depth map from the gray-level information and imaging geometry of the two scene images; each pixel value in the depth map represents the distance between a point in the scene and the mobile terminal. As another example, the mobile terminal receives light energy emitted or reflected by the scene to be photographed through a depth sensor, forms a light energy distribution function of the scene (i.e., a grayscale image), and then restores the depth information of the scene from these images. Alternatively, the mobile terminal emits energy toward the scene through the depth sensor, receives the energy reflected back by the scene, forms a light energy distribution function of the scene (i.e., a grayscale image), and restores the depth information of the scene on that basis.
  • In addition, when the mobile terminal includes other cameras located on the same side as the binocular camera, the other cameras may be designated for shooting the scene while the binocular camera is designated for acquiring the scene depth information.
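For the binocular case, per-pixel depth follows from the standard stereo triangulation relation depth = f · B / d, where f is the focal length in pixels, B is the baseline between the two cameras, and d is the disparity between the two scene images. A minimal sketch follows; the focal length and baseline numbers in the example are made up, and computing the disparity map itself (stereo matching) is outside this snippet.

```python
import numpy as np

def disparity_to_depth(disparity: np.ndarray, focal_px: float, baseline_m: float) -> np.ndarray:
    """Convert a disparity map (pixels) to a depth map (same units as the baseline).

    depth = focal_length * baseline / disparity; zero disparity (no match, or a
    point at infinity) is mapped to +inf rather than dividing by zero.
    """
    depth = np.full(disparity.shape, np.inf, dtype=np.float64)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth
```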
  • Optionally, based on any of the foregoing embodiments, a third embodiment of the image processing method of the present invention is provided. In this embodiment, before step S30 the method further includes: on completing the color transfer, feathering the segmentation edges of the foreground image and the background image; and on completing the feathering, performing the step of fusing the foreground image and the background image into a result image.
  • After the image to be processed is divided into a foreground image and a background image, a linear color transfer transform is applied to one of them, or to both of them using different reference images, so a certain color contrast exists between the foreground image and the background image. If the two were fused directly, the transition at their merged edge (i.e., the aforementioned segmentation edge) would be rather stiff and would degrade the display quality of the result image. Therefore, before fusing, the segmentation edges of the foreground image and the background image are first feathered with a preset feathering value (set by the mobile terminal by default, or user-defined), and only after the feathering is completed are the foreground image and the background image fused into a result image.
  • The principle of feathering is to blur the fusion edge of the images so that the colors change gradually, achieving a natural transition. The larger the feathering value, the wider the blurred range and the softer the color gradient; the smaller the feathering value, the narrower the blurred range and the sharper the color gradient. It can be adjusted according to the actual situation.
  • By feathering the segmentation edges of the foreground image and the background image before fusing them, this embodiment makes the fusion edge of the result image transition naturally and improves the display quality of the result image.
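The feathering described in this embodiment can be sketched as blurring the binary foreground mask and using the blurred mask as a per-pixel alpha when fusing, so colors blend gradually across the segmentation edge. The repeated 3×3 box blur below is an illustrative softening filter (not the patent's specified one); the number of passes plays the role of the feathering value: more passes give a wider, softer transition.

```python
import numpy as np

def feather_and_fuse(foreground, background, mask, radius=2):
    """Fuse two H x W x 3 images with a feathered (softened) binary mask.

    `mask` is 1.0 on the foreground region and 0.0 elsewhere. It is softened
    by `radius` passes of a 3x3 box blur, then used as alpha:
    result = alpha * foreground + (1 - alpha) * background.
    """
    alpha = mask.astype(np.float64)
    for _ in range(radius):  # each pass widens the transition band
        padded = np.pad(alpha, 1, mode="edge")
        alpha = sum(padded[i:i + alpha.shape[0], j:j + alpha.shape[1]]
                    for i in range(3) for j in range(3)) / 9.0
    return alpha[..., None] * foreground + (1.0 - alpha[..., None]) * background
```

Deep inside either region the alpha stays at 1 or 0, so only the band around the segmentation edge is actually mixed.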
  • Optionally, based on any of the foregoing embodiments, a fourth embodiment of the image processing method of the present invention is provided. In this embodiment, the acquiring of the reference image for color transfer in the foregoing step S20 includes:
  • When performing the linear color transfer transforms of the foreground image and the background image, the user may specify a reference image for each of the two, or specify a reference image for only one of them. The mobile terminal displays a reference image selection interface for the user to select, through that interface, the reference image(s) used for color transfer of the foreground image and/or the background image. For example, the user may select a first reference image for color transformation of the foreground image and a second reference image for color transformation of the background image, or the user may select a reference image for only one of the foreground image and the background image.
  • Upon receiving a selection instruction triggered by the user through the selection interface, the mobile terminal acquires, according to the selection instruction, the first reference image corresponding to the foreground image and the second reference image corresponding to the background image; alternatively, the mobile terminal acquires, according to the selection instruction, the reference image corresponding to one of the foreground image and the background image.
  • For example, the image to be processed is a landscape photo taken by a user that includes a portrait. The mobile phone divides the "person" (the foreground portion) of the photo into a foreground image and the "landscape" background portion into a background image. The user can select a reference image with a golden autumn color atmosphere for the "landscape" and select no reference image for the "person". After the selection, the phone passes the golden autumn color atmosphere to the "landscape", so that the photo ultimately shows a season-change effect.
  • By letting the user specify different reference images for the foreground portion and the background portion of the image to be processed, or specify a reference image for only one of the two, this embodiment improves the user experience.
  • Optionally, based on any of the foregoing embodiments, a fifth embodiment of the image processing method of the present invention is proposed. In this embodiment, when there are multiple reference images, the performing of color transfer on the foreground image and/or the background image according to the acquired reference images in step S20 includes:
  • performing color transfer on the foreground image and the background image separately, using the reference image corresponding to each of them.
  • In this embodiment, the mobile terminal acquires the first reference image corresponding to the foreground image and the second reference image corresponding to the background image, performs color transfer on the foreground image using the acquired first reference image, and performs color transfer on the background image using the acquired second reference image.
  • Hereinafter, the color transfer of the background image using the acquired second reference image, based on low-order statistical information, is described.
  • Because the Lαβ color space conforms to the human visual perception system better than the RGB color space, applying it to natural scenes significantly reduces the correlation between the color channels, giving the channels a degree of mutual independence. This minimizes the influence of a change in one channel on the other two, so different operations can be performed on different color channels without channel cross-talk. Therefore, to achieve a better color transfer effect, the mobile terminal first converts the background image and the second reference image from the RGB color space to the Lαβ color space and uses the Lαβ color space as the execution space for color transfer.
  • Here, the L channel is the achromatic (luminance) channel, α is the chromatic yellow-blue channel, and β is the chromatic red-green channel.
  • After completing the color space conversion of the background image and the second reference image, the mobile terminal first calculates the L, α, and β channel means of the background image (μ^l_bg, μ^α_bg, μ^β_bg) and of the second reference image (μ^l_ref, μ^α_ref, μ^β_ref), together with the corresponding L, α, and β channel standard deviations (σ^l_bg, σ^α_bg, σ^β_bg and σ^l_ref, σ^α_ref, σ^β_ref). The mobile terminal then removes the background image's means, scales the remainder by the ratio of the standard deviations, and finally adds the means of the second reference image, giving the transformation formulas:

    l′_dst = (σ^l_ref / σ^l_bg) · (l_dst − μ^l_bg) + μ^l_ref
    α′_dst = (σ^α_ref / σ^α_bg) · (α_dst − μ^α_bg) + μ^α_ref
    β′_dst = (σ^β_ref / σ^β_bg) · (β_dst − μ^β_bg) + μ^β_ref

  • where l_dst, α_dst, and β_dst denote the channel values of a selected pixel before the transform, and l′_dst, α′_dst, and β′_dst denote the transformed channel values.
  • This computation makes the low-order statistics of each channel of the background image match those of the second reference image, transferring the color features of the second reference image to the background image. The transformed background image is then converted from the Lαβ color space back to the RGB color space, which is convenient for display on the mobile terminal.
  • The color transfer of the foreground image using the first reference image may be implemented in the same way and is not described again here. Those skilled in the art will understand that in other embodiments the manner of color transfer can be chosen according to actual needs; for example, color transfer based on high-order statistical information or cluster-based regional color transfer may be adopted.
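Putting the fifth embodiment together: each region is transferred independently against its own reference. The sketch below is illustrative, not the patent's code; `transfer_stats` re-implements the per-channel mean/std matching described above, and the masks decide which pixels supply statistics and which are rewritten.

```python
import numpy as np

def transfer_stats(values: np.ndarray, ref_values: np.ndarray) -> np.ndarray:
    """Match mean/std of a flat (N, 3) pixel array to the reference pixels."""
    mu_s, sd_s = values.mean(axis=0), values.std(axis=0)
    mu_r, sd_r = ref_values.mean(axis=0), ref_values.std(axis=0)
    # guard flat channels to avoid dividing by a zero standard deviation
    scale = np.where(sd_s > 0, sd_r / np.where(sd_s > 0, sd_s, 1.0), 1.0)
    return scale * (values - mu_s) + mu_r

def transfer_per_region(image, fg_mask, fg_ref, bg_ref):
    """Color-transfer foreground and background pixels with separate references."""
    out = image.astype(np.float64).copy()
    out[fg_mask] = transfer_stats(image[fg_mask], fg_ref.reshape(-1, 3))
    out[~fg_mask] = transfer_stats(image[~fg_mask], bg_ref.reshape(-1, 3))
    return out
```

With this structure, choosing a reference for only one of the two regions (as in the fourth embodiment) amounts to simply skipping one of the two `transfer_stats` calls.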
  • Through the description of the foregoing embodiments, those skilled in the art can clearly understand that the above method may be implemented by software plus a necessary general-purpose hardware platform, and of course also by hardware, although in many cases the former is the better implementation. Based on this understanding, the part of the technical solution of the present application that is essential, or that contributes over the related art, may be embodied in the form of a software product stored in a storage medium (such as ROM/RAM, magnetic disk, or CD-ROM), including a number of instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the embodiments of the present invention.
  • In summary, when performing color transfer of an image to be processed, the mobile terminal first divides the image to be processed into a foreground image and a background image based on the depth information of the image to be processed, then applies color transfer to the foreground image and/or the background image using the acquired reference image(s), and finally fuses the two into a result image.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract

Disclosed herein are an image processing device and method. The image processing device comprises: a dividing module configured to acquire depth information of an image to be processed and divide the image to be processed into a foreground image and a background image according to the acquired depth information; a transform module configured to acquire a reference image for color transfer and perform color transfer on the foreground image and/or the background image according to the acquired reference image, so that the color features of the foreground image and the background image differ; and a compositing module configured to fuse the foreground image and the background image into a result image when the color transfer is completed.

Description

图像处理装置及方法 技术领域
本申请涉及但不限于图像处理技术领域,尤指一种图像处理装置及方法。
背景技术
在图像处理领域,色彩传递是一种利用特定方法自动进行图像色彩添加或变更的方法,即指定原图像和参考图像,对原图像和目标图像的低阶统计特征进行线性变换,将参考图像中的色彩信息传递到原图像中,使原图像的色彩改变并具有与参考图像相似的颜色特征。
然而,相关技术在进行色彩传递时,通常是针对全局的色彩传递,即对原图像整体的色彩传递,例如,用户需求将拍摄的人像照片中的“人像”和“背景”进行不同参考图像的色彩传递时,相关技术将无法实现,存在色彩传递不够灵活的问题。
发明内容
以下是对本文详细描述的主题的概述。本概述并非是为了限制权利要求的保护范围。
本文提出一种图像处理装置及方法,旨在更灵活的进行图像的色彩传递。
本发明实施例提供一种图像处理装置,该图像处理装置包括:
划分模块,设置为:获取待处理图像的深度信息,并根据获取的所述深度信息将所述待处理图像划分为前景图像和背景图像;
变换模块,设置为:获取色彩传递的参考图像,并根据获取的所述参考图像对所述前景图像和/或所述背景图像进行色彩传递,以使得所述前景图像与所述背景图像的颜色特征不同;
合成模块,设置为:在完成所述色彩传递时,将所述前景图像以及所述背景图像融合为结果图像。
可选地,所述图像处理装置还包括:
拍摄模块,设置为:在侦测到拍摄指令时,对待拍摄场景进行拍摄,并获取所述待拍摄场景的深度信息;
关联模块,设置为:将所述待拍摄场景的深度信息与拍摄的图像关联,并将拍摄的所述图像作为所述待处理图像。
可选地,所述划分模块,设置为:搜索本地或者云端是否存在所述待处理图像预先关联的深度信息,在搜索到所述待处理图像预先关联的深度信息时,将搜索到的所述深度信息作为所述待处理图像的深度信息。
可选地,所述变换模块,设置为:采用基于低阶统计信息的色彩传递、或采用基于高阶统计信息的色彩传递、或采用基于聚类的区域色彩传递。
可选地,所述图像处理装置还包括羽化模块,设置为:在完成所述色彩传递时,对所述前景图像以及所述背景图像的分割边缘进行羽化处理;
所述合成模块还设置为:在完成所述羽化处理时,将所述前景图像以及所述背景图像融合为结果图像。
可选地,所述变换模块还设置为:显示参考图像的选择界面,以供用户基于所述选择界面选择对所述前景图像和/或所述背景图像进行色彩传递的参考图像;以及在接收到用户基于所述选择界面触发的选择指令时,获取所述选择指令对应的参考图像。
可选地,当所述参考图像为多个时,所述变换模块还设置为:采用所述前景图像以及所述背景图像各自对应的参考图像分别对所述前景图像和所述背景图像进行色彩传递。
可选地,所述图像处理装置还包括:
记录模块,设置为:在所述划分模块将所述待处理图像划分为前景图像和背景图像时,记录所述前景图像以及所述背景图像间的连接信息;
所述合成模块,设置为:在完成所述色彩传递时,根据所述记录模块记录的所述前景图像以及所述背景图像间的连接信息将所述前景图像以及所述背景图像融合为结果图像。
可选地,所述图像处理装置还包括:
显示模块,设置为:显示所述结果图像。
可选地,所述显示模块还设置为:在显示所述结果图像时,显示提示信息显示界面,供用户基于所述提示信息显示界面确认是否存储所述结果图像;在接收到用户基于所述提示信息显示界面输入的确认信息时,将所述结果图像存储至预设的存储路径指向的存储区域。
本发明实施例还提出一种图像处理方法,该图像处理方法包括:
获取待处理图像的深度信息,并根据获取的所述深度信息将所述待处理图像划分为前景图像和背景图像;
获取色彩传递的参考图像,并根据获取的所述参考图像对所述前景图像和/或所述背景图像进行色彩传递,以使得所述前景图像与所述背景图像的颜色特征不同;
在完成所述色彩传递时,将所述前景图像以及所述背景图像融合为结果图像。
可选地,所述获取待处理图像的深度信息,并根据获取的所述深度信息将所述待处理图像划分为前景图像和背景图像的步骤之前,还包括:
在侦测到拍摄指令时,对待拍摄场景进行拍摄,并获取所述待拍摄场景的深度信息;
将所述待拍摄场景的深度信息与拍摄的图像关联,并将拍摄的所述图像作为所述待处理图像。
可选地,所述获取待处理图像的深度信息包括:
搜索本地或者云端是否存在所述待处理图像预先关联的深度信息,在搜索到所述待处理图像预先关联的深度信息时,将搜索到的所述深度信息作为所述待处理图像的深度信息。
可选地,所述获取色彩传递的参考图像,并根据获取的所述参考图像对所述前景图像和/或所述背景图像进行色彩传递的步骤中,采用基于低阶统计信息的色彩传递、或采用基于高阶统计信息的色彩传递、或采用基于聚类的区域色彩传递。
可选地,所述将所述前景图像以及所述背景图像融合为结果图像的步骤之前,还包括:
在完成所述色彩传递时,对所述前景图像以及所述背景图像的分割边缘进行羽化处理;
在完成所述羽化处理时,执行所述将所述前景图像以及所述背景图像融合为结果图像的步骤。
可选地,所述获取色彩传递的参考图像包括:
显示参考图像的选择界面,以供用户基于所述选择界面选择对所述前景图像和/或所述背景图像进行色彩传递的参考图像;
在接收到用户基于所述选择界面触发的选择指令时,获取所述选择指令对应的参考图像。
可选地,当所述参考图像为多个时,所述根据获取的所述参考图像对所述前景图像和/或所述背景图像进行色彩传递包括:
采用所述前景图像以及所述背景图像各自对应的参考图像分别对所述前景图像和所述背景图像进行色彩传递。
可选地,根据获取的所述深度信息将所述待处理图像划分为前景图像和背景图像时,还包括:记录所述前景图像以及所述背景图像间的连接信息;
所述将所述前景图像以及所述背景图像融合为结果图像包括:根据所述前景图像以及所述背景图像间的连接信息将所述前景图像以及所述背景图像融合为结果图像。
可选地,将所述前景图像以及所述背景图像融合为结果图像之后,还包括:
显示所述结果图像。
可选地,所述显示所述结果图像时,还包括:显示提示信息显示界面,供用户基于所述提示信息显示界面确认是否存储所述结果图像;在接收到用户基于所述提示信息显示界面输入的确认信息时,将所述结果图像存储至预设的存储路径指向的存储区域。
本发明实施例还提出一种计算机可读存储介质,存储有计算机可执行指令,所述计算机可执行指令被处理器执行时实现上述方法。
本发明实施例提出的图像处理装置及方法,在进行待处理图像的色彩传递时,移动终端首先基于待处理图像的深度信息将待处理图像划分为前景图像和背景图像,然后对于所述前景图像和所述背景图像采用不同的参考图像应用色彩传递技术,或者仅对所述前景图像和所述背景图像之一采用参考图像应用色彩传递技术,在完成色彩传递之后,再将所述前景图像以及所述背景图像组合为结果图像,使得结果图像的前景部分和背景部分具有不同的颜色氛围,相较于相关技术仅能对待处理图像进行整体色彩传递的方式,本发明实施例能够更灵活进行图像的色彩传递。
在阅读并理解了附图和详细描述后,可以明白其他方面。
附图概述
图1为实现本发明实施例的移动终端的硬件结构示意图;
图2为如图1所示的移动终端的无线通信装置示意图;
图3为本发明图像处理装置第一实施例的功能模块示意图;
图4为本发明图像处理装置第一实施例中一待处理图像的示例图;
图5为本发明图像处理装置第一实施例中前景图像的示例图;
图6为本发明图像处理装置第一实施例中背景图像的示例图;
图7为本发明图像处理装置第一实施例中另一待处理图像的示例图;
图8为本发明图像处理装置第一实施例中用于色彩传递的参考图像的示例图;
图9为本发明图像处理装置第一实施例中另一待处理图像完成色彩传递的结果图像的示例图;
图10为本发明图像处理方法第一实施例的流程示意图。
本发明的实施方式
应当理解,此处所描述的实施例仅仅用以解释本申请,并不用于限定本申请。
现在将参考附图描述实现本发明实施例的移动终端。在后续的描述中,使用用于表示元件的诸如“模块”、“部件”或“单元”的后缀仅为了有利于本文的说明,其本身并没有特定的意义。因此,“模块”与“部件”可以混合地使用。
移动终端可以以多种形式来实施。例如,本文中描述的终端可以包括诸如移动电话、智能电话、笔记本电脑、数字广播接收器、PDA(个人数字助理)、PAD(平板电脑)、PMP(便携式多媒体播放器)、导航装置等等的移动终端以及诸如数字TV、台式计算机等等的固定终端。下面,假设终端是移动终端。然而,本领域技术人员将理解的是,除了特别用于移动目的的元件之外,根据本发明的实施方式的构造也能够应用于固定类型的终端。
图1为实现本发明实施例的移动终端的硬件结构示意图。
移动终端100可以包括无线通信单元110、A/V(音频/视频)输入单元120、用户输入单元130、感测单元140、输出单元150、存储器160、接口单元170、控制器180和电源单元190等等。图1示出了具有多种组件的移动终端,但是应理解的是,并不要求实施所有示出的组件。可以替代地实施更多或更少的组件。将在下面详细描述移动终端的元件。
无线通信单元110通常包括一个或多个组件,其允许移动终端100与无线通信装置或网络之间的无线电通信。例如,无线通信单元可以包括广播接收模块111、移动通信模块112、无线互联网模块113、短程通信模块114和位置信息模块115中的至少一个。
广播接收模块111经由广播信道从外部广播管理服务器接收广播信号和/或广播相关信息。广播信道可以包括卫星信道和/或地面信道。广播管理 服务器可以是生成并发送广播信号和/或广播相关信息的服务器或者接收之前生成的广播信号和/或广播相关信息并且将其发送给终端的服务器。广播信号可以包括TV广播信号、无线电广播信号、数据广播信号等等。而且,广播信号可以进一步包括与TV或无线电广播信号组合的广播信号。广播相关信息也可以经由移动通信网络提供,并且在该情况下,广播相关信息可以由移动通信模块112来接收。广播信号可以以多种形式存在,例如,其可以以数字多媒体广播(DMB)的电子节目指南(EPG)、数字视频广播手持(DVB-H)的电子服务指南(ESG)等等的形式而存在。广播接收模块111可以通过使用多种类型的广播装置接收信号广播。特别地,广播接收模块111可以通过使用诸如多媒体广播-地面(DMB-T)、数字多媒体广播-卫星(DMB-S)、数字视频广播-手持(DVB-H),前向链路媒体(MediaFLO@)的数据广播装置、地面数字广播综合服务(ISDB-T)等等的数字广播装置接收数字广播。广播接收模块111可以被构造为适合提供广播信号的广播装置以及上述数字广播装置。经由广播接收模块111接收的广播信号和/或广播相关信息可以存储在存储器160(或者其它类型的存储介质)中。
移动通信模块112将无线电信号发送到基站(例如,接入点、节点B等等)、外部终端以及服务器中的至少一个和/或从其接收无线电信号。这样的无线电信号可以包括语音通话信号、视频通话信号、或者根据文本和/或多媒体消息发送和/或接收的多种类型的数据。
无线互联网模块113支持移动终端的无线互联网接入。该模块可以内部或外部地耦接到终端。该模块所涉及的无线互联网接入技术可以包括WLAN(无线LAN)(Wi-Fi)、Wibro(无线宽带)、Wimax(全球微波互联接入)、HSDPA(高速下行链路分组接入)等等。
短程通信模块114是设置为支持短程通信。短程通信技术的一些示例包括蓝牙TM、射频识别(RFID)、红外数据协会(IrDA)、超宽带(UWB)、紫蜂TM等等。
位置信息模块115是设置为检查或获取移动终端的位置信息。位置信息模块的典型示例是GPS(全球定位装置)。根据当前的技术,GPS模块115计算来自三个或更多卫星的距离信息和准确的时间信息并且对于计算的信 息应用三角测量法,从而根据经度、纬度和高度准确地计算三维当前位置信息。当前,用于计算位置和时间信息的方法使用三颗卫星并且通过使用另外的一颗卫星校正计算出的位置和时间信息的误差。此外,GPS模块115能够通过实时地连续计算当前位置信息来计算速度信息。
A/V输入单元120设置为接收音频或视频信号。A/V输入单元120可以包括相机121和麦克风122,相机121对在视频捕获模式或图像捕获模式中由图像捕获装置获得的静态图片或视频的图像数据进行处理。处理后的图像帧可以显示在显示单元151上。经相机121处理后的图像帧可以存储在存储器160(或其它存储介质)中或者经由无线通信单元110进行发送,可以根据移动终端的构造提供两个或更多相机121。麦克风122可以在电话通话模式、记录模式、语音识别模式等等运行模式中经由麦克风接收声音(音频数据),并且能够将这样的声音处理为音频数据。处理后的音频(语音)数据可以在电话通话模式的情况下转换为可经由移动通信模块112发送到移动通信基站的格式输出。麦克风122通过噪声消除(或抑制)算法以消除(或抑制)在接收和发送音频信号的过程中产生的噪声或者干扰。
用户输入单元130可以根据用户输入的命令生成键输入数据以控制移动终端的操作。用户输入单元130允许用户输入多种类型的信息,并且可以包括键盘、锅仔片、触摸板(例如,检测由于被接触而导致的电阻、压力、电容等等的变化的触敏组件)、滚轮、摇杆等等。特别地,当触摸板以层的形式叠加在显示单元151上时,可以形成触摸屏。
感测单元140检测移动终端100的当前状态,(例如,移动终端100的打开或关闭状态)、移动终端100的位置、用户对于移动终端100的接触(即,触摸输入)的有无、移动终端100的取向、移动终端100的加速或将速移动和方向等等,并且生成用于控制移动终端100的操作的命令或信号。例如,当移动终端100实施为滑动型移动电话时,感测单元140可以感测该滑动型电话是打开还是关闭。另外,感测单元140能够检测电源单元190是否提供电力或者接口单元170是否与外部装置耦接。感测单元140可以包括接近传感器141将在下面结合触摸屏来对此进行描述。
接口单元170用作至少一个外部装置与移动终端100连接可以通过的接 口。例如,外部装置可以包括有线或无线头戴式耳机端口、外部电源(或电池充电器)端口、有线或无线数据端口、存储卡端口、用于连接具有识别模块的装置的端口、音频输入/输出(I/O)端口、视频I/O端口、耳机端口等等。识别模块可以是存储用于验证用户使用移动终端100的多种信息并且可以包括用户识别模块(UIM)、客户识别模块(SIM)、通用客户识别模块(USIM)等等。另外,具有识别模块的装置(下面称为“识别装置”)可以采取智能卡的形式,因此,识别装置可以经由端口或其它连接装置与移动终端100连接。接口单元170可以设置为接收来自外部装置的输入(例如,数据信息、电力等等)并且将接收到的输入传输到移动终端100内的一个或多个元件或者可以设置为在移动终端和外部装置之间传输数据。
另外,当移动终端100与外部底座连接时,接口单元170可以用作允许通过其将电力从底座提供到移动终端100的路径或者可以用作允许从底座输入的多种命令信号通过其传输到移动终端的路径。从底座输入的多种命令信号或电力可以用作用于识别移动终端是否准确地安装在底座上的信号。输出单元150被构造为以视觉、音频和/或触觉方式提供输出信号(例如,音频信号、视频信号、警报信号、振动信号等等)。输出单元150可以包括显示单元151、音频输出模块152、警报单元153等等。
显示单元151可以显示在移动终端100中处理的信息。例如,当移动终端100处于电话通话模式时,显示单元151可以显示与通话或其它通信(例如,文本消息收发、多媒体文件下载等等)相关的用户界面(UI)或图形用户界面(GUI)。当移动终端100处于视频通话模式或者图像捕获模式时,显示单元151可以显示捕获的图像和/或接收的图像、示出视频或图像以及相关功能的UI或GUI等等。
同时,当显示单元151和触摸板以层的形式彼此叠加以形成触摸屏时,显示单元151可以用作输入装置和输出装置。显示单元151可以包括液晶显示器(LCD)、薄膜晶体管LCD(TFT-LCD)、有机发光二极管(OLED)显示器、柔性显示器、三维(3D)显示器等等中的至少一种。这些显示器中的一些可以被构造为透明状以允许用户从外部观看,这可以称为透明显示器,典型的透明显示器可以例如为TOLED(透明有机发光二极管)显示器等 等。根据特定想要的实施方式,移动终端100可以包括两个或更多显示单元(或其它显示装置),例如,移动终端可以包括外部显示单元(未示出)和内部显示单元(未示出)。触摸屏可设置为检测触摸输入压力以及触摸输入位置和触摸输入面积。
音频输出模块152可以在移动终端处于呼叫信号接收模式、通话模式、记录模式、语音识别模式、广播接收模式等等模式下时,将无线通信单元110接收的或者在存储器160中存储的音频数据转换音频信号并且输出为声音。而且,音频输出模块152可以提供与移动终端100执行的特定功能相关的音频输出(例如,呼叫信号接收声音、消息接收声音等等)。音频输出模块152可以包括拾音器、蜂鸣器等等。
警报单元153可以提供输出以将事件的发生通知给移动终端100。典型的事件可以包括呼叫接收、消息接收、键信号输入、触摸输入等等。除了音频或视频输出之外,警报单元153可以以不同的方式提供输出以通知事件的发生。例如,警报单元153可以以振动的形式提供输出,当接收到呼叫、消息或一些其它进入通信(incoming communication)时,警报单元153可以提供触觉输出(即,振动)以将其通知给用户。通过提供这样的触觉输出,即使在用户的移动电话处于用户的口袋中时,用户也能够识别出事件的发生。警报单元153也可以经由显示单元151或音频输出模块152提供通知事件的发生的输出。
存储器160可以存储由控制器180执行的处理和控制操作的软件程序等等,或者可以暂时地存储己经输出或将要输出的数据(例如,电话簿、消息、静态图像、视频等等)。而且,存储器160可以存储关于当触摸施加到触摸屏时输出的多种方式的振动和音频信号的数据。
存储器160可以包括至少一种类型的存储介质,所述存储介质包括闪存、硬盘、多媒体卡、卡型存储器(例如,SD或DX存储器等等)、随机访问存储器(RAM)、静态随机访问存储器(SRAM)、只读存储器(ROM)、电可擦除可编程只读存储器(EEPROM)、可编程只读存储器(PROM)、磁性存储器、磁盘、光盘等等。而且,移动终端100可以与通过网络连接执行存储器160的存储功能的网络存储装置协作。
控制器180通常控制移动终端的总体操作。例如,控制器180执行与语音通话、数据通信、视频通话等等相关的控制和处理。另外,控制器180可以包括用于再现(或回放)多媒体数据的多媒体模块181,多媒体模块181可以构造在控制器180内,或者可以构造为与控制器180分离。控制器180可以执行模式识别处理,以将在触摸屏上执行的手写输入或者图片绘制输入识别为字符或图像。
电源单元190在控制器180的控制下接收外部电力或内部电力并且提供操作每个元件和组件所需的适当的电力。
这里描述的实施方式可以以使用例如计算机软件、硬件或其任何组合的计算机可读介质来实施。对于硬件实施,这里描述的实施方式可以通过使用特定用途集成电路(ASIC)、数字信号处理器(DSP)、数字信号处理装置(DSPD)、可编程逻辑装置(PLD)、现场可编程门阵列(FPGA)、处理器、控制器、微控制器、微处理器、被设计为执行这里描述的功能的电子单元中的至少一种来实施,在一些情况下,这样的实施方式可以在控制器180中实施。对于软件实施,诸如过程或功能的实施方式可以与允许执行至少一种功能或操作的单独的软件模块来实施。软件代码可以由以任何适当的编程语言编写的软件应用程序(或程序)来实施,软件代码可以存储在存储器160中并且由控制器180执行。
至此,己经按照其功能描述了移动终端。下面,为了简要起见,将描述诸如折叠型、直板型、摆动型、滑动型移动终端等等的多种类型的移动终端中的滑动型移动终端作为示例。因此,本发明实施例能够应用于任何类型的移动终端,并且不限于滑动型移动终端。
参照图2,图2为图1中相机的电气结构框图。
摄影镜头1211由用于形成被摄体像的多个光学镜头构成,为单焦点镜头或变焦镜头。摄影镜头1211在镜头驱动器1221的控制下能够在光轴方向上移动,镜头驱动器1221根据来自镜头驱动控制电路1222的控制信号,控制摄影镜头1211的焦点位置,在变焦镜头的情况下,也可控制焦点距离。镜头驱动控制电路1222按照来自微型计算机1217的控制命令进行镜头驱动器1221的驱动控制。
在摄影镜头1211的光轴上、由摄影镜头1211形成的被摄体像的位置附近配置有摄像元件1212。摄像元件1212设置为对被摄体像摄像并取得摄像图像数据。在摄像元件1212上二维且呈矩阵状配置有构成每个像素的光电二极管。光电二极管产生与受光量对应的光电转换电流,该光电转换电流由与光电二极管连接的电容器进行电荷蓄积。每个像素的前表面配置有拜耳排列的RGB滤色器。
摄像元件1212与摄像电路1213连接,该摄像电路1213在摄像元件1212中进行电荷蓄积控制和图像信号读出控制,对该读出的图像信号(模拟图像信号)降低重置噪声后进行波形整形,进而进行增益提高等以成为适当的信号电平。
摄像电路1213与A/D转换器1214连接,该A/D转换器1214对模拟图像信号进行模数转换,向总线1227输出数字图像信号(以下称之为图像数据)。
总线1227是用于传送在相机的内部读出或生成的多种数据的传送路径。在总线1227连接着上述A/D转换器1214,此外还连接着图像处理器1215、JPEG处理器1216、微型计算机1217、SDRAM(Synchronous Dynamic random access memory,同步动态随机存取内存)1218、存储器接口(以下称之为存储器I/F)1219、LCD(Liquid Crystal Display,液晶显示器)驱动器1220。
图像处理器1215对基于摄像元件1212的输出的图像数据进行OB相减处理、白平衡调整、颜色矩阵运算、伽马转换、色差信号处理、噪声去除处理、同时化处理、边缘处理等多种图像处理。JPEG处理器1216在将图像数据记录于记录介质1225时,按照JPEG压缩方式压缩从SDRAM1218读出的图像数据。此外,JPEG处理器1216为了进行图像再现显示而进行JPEG图像数据的解压缩。进行解压缩时,读出记录在记录介质1225中的文件,在JPEG处理器1216中实施了解压缩处理后,将解压缩的图像数据暂时存储于SDRAM1218中并在LCD1226上进行显示。另外,在本实施方式中,作为图像压缩解压缩方式采用的是JPEG方式,然而压缩解压缩方式不限于此,当然可以采用MPEG、TIFF、H.264等其他的压缩解压缩方式。
微型计算机1217发挥作为该相机整体的控制部的功能,统一控制相机 的多种处理序列。微型计算机1217连接着操作单元1223和闪存1224。
操作单元1223包括但不限于实体按键或者虚拟按键,该实体或虚拟按键可以为电源按钮、拍照键、编辑按键、动态图像按钮、再现按钮、菜单按钮、十字键、OK按钮、删除按钮、放大按钮等多种输入按钮和多种输入键等操作控件,检测这些操作控件的操作状态。
将检测结果向微型计算机1217输出。此外,在作为显示器的LCD1226的前表面设有触摸面板,检测用户的触摸位置,将该触摸位置向微型计算机1217输出。微型计算机1217根据来自操作单元1223的操作位置的检测结果,执行与用户的操作对应的多种处理序列。
闪存1224存储用于执行微型计算机1217的多种处理序列的程序。微型计算机1217根据该程序进行相机整体的控制。此外,闪存1224存储相机的多种调整值,微型计算机1217读出调整值,按照该调整值进行相机的控制。
SDRAM1218是用于对图像数据等进行暂时存储的可电改写的易失性存储器。该SDRAM1218暂时存储从A/D转换器1214输出的图像数据和在图像处理器1215、JPEG处理器1216等中进行了处理后的图像数据。
存储器接口1219与记录介质1225连接,进行将图像数据和附加在图像数据中的文件头等数据写入记录介质1225和从记录介质1225中读出的控制。记录介质1225例如为能够在相机主体上自由拆装的存储器卡等记录介质,然而不限于此,也可以是内置在相机主体中的硬盘等。
LCD驱动器1210与LCD1226连接,将由图像处理器1215处理后的图像数据存储于SDRAM1218,需要显示时,读取SDRAM1218存储的图像数据并在LCD1226上显示,或者,JPEG处理器1216压缩过的图像数据存储于SDRAM1218,在需要显示时,JPEG处理器1216读取SDRAM1218的压缩过的图像数据,再进行解压缩,将解压缩后的图像数据通过LCD1226进行显示。
LCD1226配置在相机主体的背面进行图像显示。该LCD1226采用LCD,然而不限于此,也可以采用有机EL等多种显示面板。
基于上述移动终端硬件结构以及相机的电气结构示意图,提出本发明图像处理装置实施例。
参照图3,在本发明图像处理装置的第一实施例中,所述图像处理装置包括:
划分模块10,设置为:获取待处理图像的深度信息,并根据获取的所述深度信息将所述待处理图像划分为前景图像和背景图像;
本实施例提供的图像处理装置可以应用于手机、笔记本电脑、数字广播接收器、PDA(个人数字助理)、PAD(平板电脑)、PMP(便携式多媒体播放器)等移动终端,例如,图像处理装置内置于手机运行,用户在通过手机对拍摄的照片进行色彩传递时,手机自动根据照片的深度信息将照片中的人像和背景分离,使用不同的参考图像分别对人像部分和背景部分进行色彩传递,或者只对背景部分进行色彩传递,增加了色彩传递的灵活性和视觉效果。
在本实施例中,所述深度信息可用于描述所述待处理图像中任一点相对于拍摄所述待处理图像的镜头的距离,例如,所述待处理图像为手机拍摄的人像照片,所述待处理图像的深度信息可以描述照片中的“人”在拍摄时与手机的距离,以及描述照片中“背景”在拍摄时与手机的距离。
本实施例中,划分模块10获取所述待处理图像预先关联的深度信息,并将获取的所述深度信息作为所述待处理图像的深度信息,例如,可搜索本地或者云端是否存在所述待处理图像预先关联的深度信息,在搜索到所述待处理图像预先关联的深度信息时,将搜索到的所述深度信息作为所述待处理图像的深度信息。
在获取到所述待处理图像的深度信息之后,所述划分模块10根据获取的所述深度信息对所述待处理图像的每个像素进行聚类,将所述待处理图像划分为前景图像以及背景图像,例如,结合参照图4至图6,所述待处理图像为人像照片(如图4所示),手机将前述人像照片划分为“人”(前景图像,如图5所示)和“背景”(背景图像,如图6所示)。
变换模块20,设置为:获取色彩传递的参考图像,并根据获取的所述参考图像对所述前景图像和/或所述背景图像进行色彩传递,以使得所述前景图像与所述背景图像的颜色特征不同;
本实施例中,在所述划分模块10将所述待处理图像划分为所述前景图像以及所述背景图像之后,变换模块20获取色彩传递的参考图像,其中,所述参考图像可以包括第一参考图像和第二参考图像,所述第一参考图像用于对所述前景图像进行色彩传递,所述第二参考图像用于对所述背景图像进行色彩传递;或者所述参考图像仅用于对所述前景图像进行色彩传递;或者所述参考图像仅用于对所述背景图像进行色彩传递。所述参考图像的获取可以按照预先设置进行,例如手机缺省设置仅对背景图像进行色彩传递,则所述变换模块20将获取到用于对所述背景图像进行色彩传递的参考图像;又例如,用户预设设置同时对背景图像及前景图像进行色彩传递,则所述变换模块20将获取到用于对所述前景图像进行色彩传递的第一参考图像,以及获取到用于对所述背景图像进行色彩传递的第二参考图像。
在获取到色彩传递的参考图像之后,所述变换模块20根据获取的所述参考图像对所述前景图像和/或所述背景图像进行色彩传递,以使得所述前景图像与所述背景图像的颜色特征不同。
可选地,所述变换模块20,设置为:采用基于低阶统计信息的色彩传递、或采用基于高阶统计信息的色彩传递、或采用基于聚类的区域色彩传递。
以下以获取到第一参考图像以及第二参考图像时,变换模块20基于低阶统计信息对所述背景图像的色彩传递进行说明。
由于Lαβ颜色空间相较于RGB颜色空间更符合人类视觉感知***,将其运用到自然场景中时,Lαβ颜色空间能够显著降低颜色通道之间的相关性,使通道之间具有一定的相互独立性,可以最大限度地减小一个通道的变化给另外两个通道造成的影响,从而可以在不同的颜色通道进行不同的运算,而不会出现通道交叉的问题。因此,为了达到较好的色彩传递效果,所述变换模块20首先将所述背景图像以及所述第二参考图像由RGB颜色空间转换至Lαβ颜色空间,将Lαβ颜色空间作为色彩传递的执行空间。其中,L通道表示非彩色通道,即亮度通道,α表示彩色的黄蓝通道,β表示彩色的红绿通道。
在完成所述背景图像以及所述第二参考图像的颜色空间转换之后,所述变换模块20首先计算所述背景图像的L通道均值μ^l_bg、α通道均值μ^α_bg、β通道均值μ^β_bg,以及所述第二参考图像的L通道均值μ^l_ref、α通道均值μ^α_ref、β通道均值μ^β_ref;再计算所述背景图像的L通道标准差σ^l_bg、α通道标准差σ^α_bg、β通道标准差σ^β_bg,以及所述第二参考图像的L通道标准差σ^l_ref、α通道标准差σ^α_ref、β通道标准差σ^β_ref。然后所述变换模块20从所述背景图像中移走均值,将剩余部分按照标准差的比值缩放,最后加入所述第二参考图像的均值,其变换公式如下:

    l′dst = (σ^l_ref / σ^l_bg) · (ldst − μ^l_bg) + μ^l_ref
    α′dst = (σ^α_ref / σ^α_bg) · (αdst − μ^α_bg) + μ^α_ref
    β′dst = (σ^β_ref / σ^β_bg) · (βdst − μ^β_bg) + μ^β_ref

其中,ldst、αdst和βdst分别表示选中的像素变换前的每个通道值,l′dst、α′dst和β′dst分别表示选中的像素变换后的每个通道值。
经过上述运算即可使得所述背景图像和所述第二参考图像的每个通道的低阶统计信息一致,达到将所述第二参考图像的颜色特征传递到所述背景图像的目的,然后将运算后的背景图像由Lαβ颜色空间转换为RGB颜色空间,有利于移动终端显示。
采用第一参考图像对所述前景图像进行的色彩传递可参照上述技术方案实施,此处不再赘述。本领域技术人员可以理解的是,在其他实施例中,可以按实际需要选取色彩传递的方式,例如,可以采用基于高阶统计信息的色彩传递,或者采用基于聚类的区域色彩传递。
合成模块30,设置为:在完成色彩传递时,将所述前景图像以及所述背景图像融合为结果图像。
可选地,所述图像处理装置还包括记录模块,设置为:在所述划分模块10将所述待处理图像划分为前景图像和背景图像时,记录所述前景图像以及所述背景图像间的连接信息(即分割边缘)。在完成色彩传递时,所述合成模块30根据所述记录模块记录的所述前景图像以及所述背景图像间的连接信息将所述前景图像以及所述背景图像融合为结果图像。
例如,结合参照图7至图9,图7为用户拍摄的待处理图像,图8为对图7背景部分进行色彩传递的参考图像,图9为完成色彩传递的结果图像,如图9所示,本实施例利用色彩传递将图8所示的夕阳背光特效场景的颜色氛围传递给图7所示待处理图像的背景部分,并保持前景部分(人像)原有的颜色特征,给人一种全新的视觉感和特殊的艺术效果。
可选地,在本实施例中,所述图像处理装置还包括显示模块,设置为:显示所述结果图像。
在所述合成模块30将所述前景图像以及所述背景图像融合为结果图像之后,显示模块在其所在移动终端显示所述结果图像,使得用户能够立即查看对所述待处理图像进行色彩传递的结果。
可选地,所述显示模块在显示所述结果图像的同时,还可以显示提示信息显示界面,供用户基于所述提示信息显示界面确认是否存储所述结果图像;以及在接收到用户基于所述提示信息显示界面输入的确认信息时,所述显示模块将所述结果图像存储至预设的存储路径指向的存储区域。此外,若用户不满意色彩传递的结果图像,可选择不存储。
本实施例提出的图像处理装置,内置于移动终端运行,使得移动终端在进行色彩传递时,首先基于待处理图像的深度信息将待处理图像划分为前景图像和背景图像,然后对于所述前景图像和所述背景图像采用不同的参考图像应用色彩传递技术,或者仅对所述前景图像和所述背景图像之一采用参考图像应用色彩传递技术,在完成色彩传递之后,再将所述前景图像以及所述背景图像组合为结果图像,使得结果图像的前景部分和背景部分具有不同的颜色氛围,相较于相关技术仅能对待处理图像进行整体色彩传递的方式,本发明实施例能够更灵活进行图像的色彩传递。
可选地,基于第一实施例,提出本发明图像处理装置的第二实施例,在本实施例中,所述图像处理装置还包括:
拍摄模块,设置为:在侦测到拍摄指令时,对待拍摄场景进行拍摄,并获取所述待拍摄场景的深度信息;
关联模块,设置为:将所述待拍摄场景的深度信息与拍摄的图像关联,并将拍摄的所述图像作为所述待处理图像。
本实施例将第一实施例所述的色彩传递方案应用到拍摄中,例如,图像处理装置内置于移动终端运行,移动终端在拍摄时,利用色彩传递将夕阳背光特效场景的颜色氛围传递给拍摄的图像,使得拍摄图像呈现出对比度鲜明的夕阳氛围效果,或者将秋天金黄的颜色氛围传递给拍摄的图像,使得拍摄图像出现季节变幻的效果。
本实施例中,拍摄模块在拍摄(调用移动终端的摄像头进行拍摄)时,通过其所在移动终端预先设置的双目摄像头或者深度传感器获取待拍摄场景的深度信息,其中,双目摄像头是指位于移动终端同一面且相距一定距离的两个摄像头。在采用双目摄像头获取待拍摄场景的深度信息时,可按移动终端的缺省设置将双目摄像头中任一摄像头拍摄的场景图像作为所述待处理图像,或者按用户设置将用户指定的摄像头拍摄的场景图像作为所述待处理图像。
例如,所述拍摄模块通过其所在移动终端设置的双目摄像头分别拍摄两幅场景图像,通过两幅场景图像的灰度信息和成像几何来生成深度图,深度图中的每一个像素值表示场景中某一点与所述移动终端之间的距离。又例如,所述拍摄模块通过其所在移动终端设置的深度传感器接收来自待拍摄场景发射或反射的光能量,形成有关待拍摄场景的光能量分布函数,即灰度图像,然后在这些图像的基础上恢复拍摄场景的深度信息;或者所述拍摄模块通过所述深度传感器向待拍摄场景发射能量,然后接收待拍摄场景对所发射能量的反射能量,形成有关待拍摄场景的光能量分布函数,即灰度图像,然后在这些图像的基础上恢复拍摄场景的深度信息。
此外,在所述移动终端包括除所述双目摄像头之外的其它摄像头时,所述拍摄模块还可将所述其它摄像头指定用于待拍摄场景的拍摄,将所述双目摄像头指定用于待拍摄场景深度信息的获取。
可选地,基于前述任一实施例,提出本发明图像处理装置的第三实施例,在本实施例中,所述图像处理装置还包括羽化模块,设置为:在完成色彩传递时,对所述前景图像以及所述背景图像的分割边缘进行羽化处理;
所述合成模块还设置为:在完成羽化处理时,将所述前景图像以及所述背景图像融合为结果图像。
本领域技术人员可以理解的是,在前述实施例中,在所述划分模块10将所述待处理图像划分为前景图像以及背景图像之后,所述变换模块20对所述前景图像和所述背景图像之一进行了色彩传递的线性变换,或者分别采用不同的参考图像对所述前景图像以及所述背景图像进行了色彩传递的线性变换,所述前景图像和所述背景图像存在一定的颜色反差,若直接将所述前景图像以及所述背景图像进行融合,所述前景图像和所述背景图像的融合边缘(即前述分割边缘)处的过渡将比较生硬,影响结果图像的显示效果。因此,在本实施例中,在所述合成模块30对所述前景图像以及所述背景图像进行融合之前,羽化模块先对所述前景图像以及所述背景图像的分割边缘按预设的羽化值(可由移动终端缺省设定,或者用户自定义)进行羽化处理,在所述羽化模块完成羽化处理后,所述合成模块30再将所述前景图像以及所述背景图像融合为结果图像。羽化处理原理是将图像融合边缘处虚化,起到渐变的作用从而达到自然衔接的效果,其中,羽化值越大,虚化范围越宽,也就是说颜色递变更柔和,羽化值越小,虚化范围越窄,颜色递变更剧烈,可根据实际情况进行调节。
本实施例通过在将所述前景图像以及所述背景图像进行融合之前,先对所述前景图像以及所述背景图像的分割边缘进行羽化处理,使得融合得到的结果图像的融合边缘过渡比较自然,能够提高结果图像的显示效果。
可选地,基于前述任一实施例,提出本发明图像处理装置的第四实施例,在本实施例中,所述变换模块20还设置为:显示参考图像的选择界面,以供用户基于所述选择界面选择对所述前景图像和/或所述背景图像进行色彩传递的参考图像;以及在接收到用户基于所述选择界面触发的选择指令时,获取所述选择指令对应的参考图像。
本实施例中,在进行所述前景图像以及所述背景图像的色彩传递的线性变换时,用户可分别指定所述前景图像以及所述背景图像各自对应的参考图像,也可仅为二者之一指定对应的参考图像。在所述划分模块10将所述待处理图像划分为前景图像以及背景图像之后,所述变换模块20显示参考图像的选择界面,以供用户基于所述选择界面选择对所述前景图像和/或所述背景图像进行色彩传递的参考图像,例如,用户可选择对所述前景图像进行色彩变换的第一参考图像,以及选择对所述背景图像进行色彩变换的第二参考图像;或者用户仅为所述前景图像和所述背景图像之一选择进行色彩变换的参考图像。
在接收到用户基于所述选择界面触发的选择指令时,所述变换模块20根据所述选择指令获取所述前景图像对应的第一参考图像,以及获取所述背景图像对应的第二参考图像;
或者,在接收到用户基于所述选择界面触发的选择指令时,所述变换模块20根据所述选择指令获取所述前景图像和所述背景图像之一所对应的参考图像。
例如,图像处理装置内置于手机运行,所述待处理图像为用户拍摄的包括人像的风景照片,手机将照片中的“人”(前景部分)划分为前景图像,将照片中的“风景”背景部分划分为背景图像,用户可为“风景”选择秋天金黄的颜色氛围的参考图像,不为“人”选择参考图像,在完成选择后,手机将秋天金黄的颜色氛围传递给“风景”,最终使得拍摄的照片出现季节变换的效果。
本实施例通过响应用户操作分别为待处理图像的前景部分和背景部分指定不同的参考图像,或者仅为待处理图像的前景部分和背景部分之一指定参考图像,能够提升用户体验。
可选地,基于前述任一实施例,提出本发明图像处理装置的第五实施例,在本实施例中,当所述参考图像为多个时,所述变换模块20还设置为:采用所述前景图像以及所述背景图像各自对应的参考图像分别对所述前景图像和所述背景图像进行色彩传递。
在本实施例中,所述变换模块20获取到所述前景图像对应的第一参考图像,以及获取到所述背景图像对应的第二参考图像,并采用获取的所述第一参考图像对所述前景图像进行色彩传递,采用获取的所述第二参考图像对所述背景图像进行色彩传递。以下以基于低阶统计信息,所述变换模块20采用获取的所述第二参考图像对所述背景图像的色彩传递进行说明。
由于Lαβ颜色空间相较于RGB颜色空间更符合人类视觉感知系统，将其运用到自然场景中时，Lαβ颜色空间能够显著降低颜色通道之间的相关性，使通道之间具有一定的相互独立性，可以最大限度地减小一个通道的变化给另外两个通道造成的影响，从而可以在不同的颜色通道进行不同的运算，而不会出现通道交叉的问题。因此，为了达到较好的色彩传递效果，所述变换模块20首先将所述背景图像以及所述第二参考图像由RGB颜色空间转换至Lαβ颜色空间，将Lαβ颜色空间作为色彩传递的执行空间。其中，L通道表示非彩色通道，即亮度通道，α表示彩色的黄蓝通道，β表示彩色的红绿通道。
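上述 RGB 到 Lαβ 的颜色空间转换可用如下示意代码说明（示意性草图，采用文献中常用的 RGB→LMS→对数→Lαβ 变换；矩阵取值为常用文献值，并非本文记载的内容）：

```python
import numpy as np

# 文献中常用的 RGB→LMS 变换矩阵（示例取值，非本文记载）
RGB2LMS = np.array([[0.3811, 0.5783, 0.0402],
                    [0.1967, 0.7244, 0.0782],
                    [0.0241, 0.1288, 0.8444]])
# 对数 LMS→Lαβ：L 为亮度通道，α 为黄蓝通道，β 为红绿通道
LMS2LAB = np.diag([1 / np.sqrt(3), 1 / np.sqrt(6), 1 / np.sqrt(2)]) @ \
          np.array([[1, 1, 1],
                    [1, 1, -2],
                    [1, -1, 0]])

def rgb_to_lab(rgb):
    """把形如 (..., 3) 的 RGB 数组（取值 0~1）转换到 Lαβ 颜色空间。"""
    lms = np.asarray(rgb, dtype=np.float64) @ RGB2LMS.T
    log_lms = np.log10(np.maximum(lms, 1e-6))  # 避免 log(0)
    return log_lms @ LMS2LAB.T

gray = rgb_to_lab(np.array([0.5, 0.5, 0.5]))  # 灰色像素的 α、β 接近 0
```

灰色像素的 α、β 分量接近 0，彩色信息几乎全部落在 α、β 两个彩色通道上，这正是三个通道可以相互独立运算的原因。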
在完成所述背景图像以及所述第二参考图像的颜色空间转换之后，所述变换模块20首先计算所述背景图像的L通道均值 $\bar{l}_{dst}$、α通道均值 $\bar{\alpha}_{dst}$、β通道均值 $\bar{\beta}_{dst}$，所述第二参考图像的L通道均值 $\bar{l}_{ref}$、α通道均值 $\bar{\alpha}_{ref}$、β通道均值 $\bar{\beta}_{ref}$，以及所述背景图像的L通道标准差 $\sigma_{dst}^{l}$、α通道标准差 $\sigma_{dst}^{\alpha}$、β通道标准差 $\sigma_{dst}^{\beta}$，和所述第二参考图像的L通道标准差 $\sigma_{ref}^{l}$、α通道标准差 $\sigma_{ref}^{\alpha}$、β通道标准差 $\sigma_{ref}^{\beta}$。

然后所述变换模块20从所述背景图像中移走均值，将剩余部分按照标准差的比值缩放，最后加入所述第二参考图像的均值，其变换公式如下：

$$l^{*} = l_{dst} - \bar{l}_{dst}, \qquad \alpha^{*} = \alpha_{dst} - \bar{\alpha}_{dst}, \qquad \beta^{*} = \beta_{dst} - \bar{\beta}_{dst}$$

$$l' = \frac{\sigma_{ref}^{l}}{\sigma_{dst}^{l}}\,l^{*} + \bar{l}_{ref}, \qquad \alpha' = \frac{\sigma_{ref}^{\alpha}}{\sigma_{dst}^{\alpha}}\,\alpha^{*} + \bar{\alpha}_{ref}, \qquad \beta' = \frac{\sigma_{ref}^{\beta}}{\sigma_{dst}^{\beta}}\,\beta^{*} + \bar{\beta}_{ref}$$

其中，$l_{dst}$、$\alpha_{dst}$ 和 $\beta_{dst}$ 分别表示选中的像素的每个通道值，$l'$、$\alpha'$ 和 $\beta'$ 分别表示选中的像素变换后的每个通道值。
经过上述运算即可使得所述背景图像和所述第二参考图像的每个通道的低阶统计信息一致,达到将所述第二参考图像的颜色特征传递到所述背景图像的目的,然后将运算后的背景图像由Lαβ颜色空间转换为RGB颜色空间,有利于移动终端显示。
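上述"移走均值、按标准差比值缩放、加入参考均值"的统计匹配可用如下示意代码说明（示意性草图，对 Lαβ 三个通道分别独立处理；函数名为示例假设）：

```python
import numpy as np

def transfer_channel(dst, ref):
    """把 ref 通道的低阶统计信息传递给 dst 通道：
    先从 dst 中移走其均值，按标准差比值缩放，再加入 ref 的均值。"""
    dst = np.asarray(dst, dtype=np.float64)
    ref = np.asarray(ref, dtype=np.float64)
    scale = ref.std() / dst.std()
    return scale * (dst - dst.mean()) + ref.mean()

def color_transfer(dst_lab, ref_lab):
    """对 Lαβ 三个通道分别独立做统计匹配，dst_lab/ref_lab 形如 (..., 3)。"""
    dst_lab = np.asarray(dst_lab, dtype=np.float64)
    out = np.empty_like(dst_lab)
    for c in range(3):
        out[..., c] = transfer_channel(dst_lab[..., c], ref_lab[..., c])
    return out
```

变换之后，目标图像每个通道的均值与标准差都与参考图像一致，即两者的低阶统计信息一致，参考图像的颜色特征由此传递到目标图像。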
采用第一参考图像对所述前景图像进行的色彩传递可参照上述技术方案实施,此处不再赘述。本领域技术人员可以理解的是,在其他实施例中,可以按实际需要选取色彩传递的方式,例如,可以采用基于高阶统计信息的色彩传递,或者采用基于聚类的区域色彩传递。
本发明实施例还提供一种图像处理方法,参照图10,在本发明图像处理方法的第一实施例中,所述图像处理方法包括:
步骤S10,获取待处理图像的深度信息,并根据获取的所述深度信息将所述待处理图像划分为前景图像和背景图像;
本实施例提供的图像处理方法可以应用于手机、笔记本电脑、数字广播接收器、PDA（个人数字助理）、PAD（平板电脑）、PMP（便携式多媒体播放器）等移动终端，例如，用户在通过手机对拍摄的照片进行色彩传递时，手机自动根据照片的深度信息将照片中的人像和背景分离，使用不同的参考图像分别对人像部分和背景部分进行色彩传递，或者只对背景部分进行色彩传递，增加了色彩传递的灵活性和视觉效果。
在本实施例中,所述深度信息可用于描述所述待处理图像中任一点相对于拍摄所述待处理图像的镜头的距离,例如,所述待处理图像为手机拍摄的人像照片,所述待处理图像的深度信息可以描述照片中的“人”在拍摄时与手机的距离,以及描述照片中“背景”在拍摄时与手机的距离。
本实施例中,移动终端获取所述待处理图像预先关联的深度信息,并将获取的所述深度信息作为所述待处理图像的深度信息,例如,可搜索本地或者云端是否存在所述待处理图像预先关联的深度信息,在搜索到所述待处理图像预先关联的深度信息时,将搜索到的所述深度信息作为所述待处理图像的深度信息。
在获取到所述待处理图像的深度信息之后,所述移动终端根据获取的所述深度信息对所述待处理图像的每个像素进行聚类,将所述待处理图像划分为前景图像以及背景图像,例如,结合参照图4至图6,所述待处理图像为人像照片(如图4所示),手机将前述人像照片划分为“人”(前景图像,如图5所示)和“背景”(背景图像,如图6所示)。
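上述"根据深度信息把待处理图像划分为前景图像与背景图像"的步骤可用如下示意代码说明（示意性草图：以深度阈值二分代替对每个像素的聚类，阈值取法仅为示例假设）：

```python
import numpy as np

def split_by_depth(image, depth, threshold=None):
    """根据深度图把待处理图像划分为前景图像与背景图像。

    threshold 缺省取深度中位数（仅为示例取法）；
    深度小于阈值的像素视为前景，其余视为背景，
    被抠除的区域以 0 填充，便于之后按掩膜重新融合。
    """
    depth = np.asarray(depth, dtype=np.float64)
    if threshold is None:
        threshold = np.median(depth)
    mask = depth < threshold
    foreground = np.where(mask, image, 0)
    background = np.where(mask, 0, image)
    return foreground, background, mask
```

返回的掩膜同时记录了前景图像与背景图像间的连接信息（即分割边缘），后续融合步骤据此把两部分拼回结果图像。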
步骤S20,获取色彩传递的参考图像,并根据获取的所述参考图像对所述前景图像和/或所述背景图像进行色彩传递,以使得所述前景图像与所述背景图像的颜色特征不同;
本实施例中，所述移动终端在将所述待处理图像划分为所述前景图像以及所述背景图像之后，获取色彩传递的参考图像，其中，所述参考图像可以包括第一参考图像和第二参考图像，所述第一参考图像用于对所述前景图像进行色彩传递，所述第二参考图像用于对所述背景图像进行色彩传递；或者所述参考图像仅用于对所述前景图像进行色彩传递；或者所述参考图像仅用于对所述背景图像进行色彩传递。所述参考图像的获取按照预先设置进行，例如手机缺省设置仅对背景图像进行色彩传递，则所述手机将获取到用于对所述背景图像进行色彩传递的参考图像；又例如，用户预先设置同时对背景图像及前景图像进行色彩传递，则所述手机将获取到用于对所述前景图像进行色彩传递的第一参考图像，以及获取到用于对所述背景图像进行色彩传递的第二参考图像。
在获取到色彩传递的参考图像之后,所述移动终端根据获取的所述参考图像对所述前景图像和/或所述背景图像进行色彩传递,以使得所述前景图像与所述背景图像的颜色特征不同。
可以采用基于低阶统计信息的色彩传递、或采用基于高阶统计信息的色彩传递、或采用基于聚类的区域色彩传递。
以下以获取到第一参考图像以及第二参考图像时,基于低阶统计信息对所述背景图像的色彩传递进行说明。
由于Lαβ颜色空间相较于RGB颜色空间更符合人类视觉感知系统，将其运用到自然场景中时，Lαβ颜色空间能够显著降低颜色通道之间的相关性，使通道之间具有一定的相互独立性，可以最大限度地减小一个通道的变化给另外两个通道造成的影响，从而可以在不同的颜色通道进行不同的运算，而不会出现通道交叉的问题。因此，为了达到较好的色彩传递效果，所述移动终端首先将所述背景图像以及所述第二参考图像由RGB颜色空间转换至Lαβ颜色空间，将Lαβ颜色空间作为色彩传递的执行空间。其中，L通道表示非彩色通道，即亮度通道，α表示彩色的黄蓝通道，β表示彩色的红绿通道。
在完成所述背景图像以及所述第二参考图像的颜色空间转换之后，所述移动终端首先计算所述背景图像的L通道均值 $\bar{l}_{dst}$、α通道均值 $\bar{\alpha}_{dst}$、β通道均值 $\bar{\beta}_{dst}$，所述第二参考图像的L通道均值 $\bar{l}_{ref}$、α通道均值 $\bar{\alpha}_{ref}$、β通道均值 $\bar{\beta}_{ref}$，以及所述背景图像的L通道标准差 $\sigma_{dst}^{l}$、α通道标准差 $\sigma_{dst}^{\alpha}$、β通道标准差 $\sigma_{dst}^{\beta}$，和所述第二参考图像的L通道标准差 $\sigma_{ref}^{l}$、α通道标准差 $\sigma_{ref}^{\alpha}$、β通道标准差 $\sigma_{ref}^{\beta}$。

然后所述移动终端从所述背景图像中移走均值，将剩余部分按照标准差的比值缩放，最后加入所述第二参考图像的均值，其变换公式如下：

$$l^{*} = l_{dst} - \bar{l}_{dst}, \qquad \alpha^{*} = \alpha_{dst} - \bar{\alpha}_{dst}, \qquad \beta^{*} = \beta_{dst} - \bar{\beta}_{dst}$$

$$l' = \frac{\sigma_{ref}^{l}}{\sigma_{dst}^{l}}\,l^{*} + \bar{l}_{ref}, \qquad \alpha' = \frac{\sigma_{ref}^{\alpha}}{\sigma_{dst}^{\alpha}}\,\alpha^{*} + \bar{\alpha}_{ref}, \qquad \beta' = \frac{\sigma_{ref}^{\beta}}{\sigma_{dst}^{\beta}}\,\beta^{*} + \bar{\beta}_{ref}$$

其中，$l_{dst}$、$\alpha_{dst}$ 和 $\beta_{dst}$ 分别表示选中的像素的每个通道值，$l'$、$\alpha'$ 和 $\beta'$ 分别表示选中的像素变换后的每个通道值。
经过上述运算即可使得所述背景图像和所述第二参考图像的每个通道的低阶统计信息一致,达到将所述第二参考图像的颜色特征传递到所述背景图像的目的,然后将运算后的背景图像由Lαβ颜色空间转换为RGB颜色空间,有利于移动终端显示。
采用第一参考图像对所述前景图像进行的色彩传递可参照上述技术方案实施,此处不再赘述。本领域技术人员可以理解的是,在其他实施例中,可以按实际需要选取色彩传递的方式,例如,可以采用基于高阶统计信息的色彩传递,或者采用基于聚类的区域色彩传递。
步骤S30,在完成色彩传递时,将所述前景图像以及所述背景图像融合为结果图像。
本实施例在对所述待处理图像进行分割时,记录了所述前景图像以及所述背景图像间的连接信息(即分割边缘)。在完成色彩传递时,所述移动终端根据记录的所述前景图像以及所述背景图像间的连接信息将所述前景图像以及所述背景图像融合为结果图像。
例如，结合参照图7至图9，图7为用户拍摄的待处理图像，图8为对图7背景部分进行色彩传递的参考图像，图9为完成色彩传递的结果图像，如图9所示，本实施例利用色彩传递将图8所示的夕阳背光特效场景的颜色氛围传递给图7所示待处理图像的背景部分，并保持前景部分（人像）原有的颜色特征，给人一种全新的视觉感和特殊的艺术效果。
可选地,在本实施例中,上述步骤S30之后,还包括:
显示所述结果图像。
在将所述前景图像以及所述背景图像融合为结果图像之后,所述移动终端显示所述结果图像,使得用户能够立即查看对所述待处理图像进行色彩传递的结果。
可选地,所述移动终端在显示所述结果图像的同时,还可以显示提示信息显示界面,供用户基于所述提示信息显示界面确认是否存储所述结果图像;以及在接收到用户基于所述提示信息显示界面输入的确认信息时,所述移动终端将所述结果图像存储至预设的存储路径指向的存储区域。此外,若用户不满意色彩传递的结果图像,可选择不存储。
本实施例提出的图像处理方法,移动终端首先基于待处理图像的深度信息将待处理图像划分为前景图像和背景图像,然后对于所述前景图像和所述背景图像采用不同的参考图像应用色彩传递技术,或者仅对所述前景图像和所述背景图像之一采用参考图像应用色彩传递技术,在完成色彩传递之后,再将所述前景图像以及所述背景图像组合为结果图像,使得结果图像的前景部分和背景部分具有不同的颜色氛围,相较于相关技术仅能对待处理图像进行整体色彩传递的方式,本发明实施例能够更灵活进行图像的色彩传递。
可选地,基于第一实施例,提出本发明图像处理方法的第二实施例,在本实施例中,上述步骤S10之前,还包括:
在侦测到拍摄指令时,对待拍摄场景进行拍摄,并获取所述待拍摄场景的深度信息;
将所述待拍摄场景的深度信息与拍摄的图像关联,并将拍摄的所述图像作为所述待处理图像。
本实施例将第一实施例所述的色彩传递方案应用到拍摄中，例如，移动终端在拍摄时，利用色彩传递将夕阳背光特效场景的颜色氛围传递给拍摄的图像，使得拍摄图像呈现出对比度鲜明的夕阳氛围效果，或者将秋天金黄的颜色氛围传递给拍摄的图像，使得拍摄图像出现季节变幻的效果。
本实施例中,移动终端在拍摄时,通过预先设置的双目摄像头或者深度传感器获取待拍摄场景的深度信息,其中,双目摄像头是指位于移动终端同一面且相距一定距离的两个摄像头。在采用双目摄像头获取待拍摄场景的深度信息时,可按移动终端的缺省设置将双目摄像头中任一摄像头拍摄的场景图像作为所述待处理图像,或者按用户设置将用户指定的摄像头拍摄的场景图像作为所述待处理图像。
例如,所述移动终端通过设置的双目摄像头分别拍摄两幅场景图像,通过两幅场景图像的灰度信息和成像几何来生成深度图,深度图中的每一个像素值表示场景中某一点与所述移动终端之间的距离。又例如,所述移动终端通过深度传感器接收来自待拍摄场景发射或反射的光能量,形成有关待拍摄场景的光能量分布函数,即灰度图像,然后在这些图像的基础上恢复拍摄场景的深度信息;或者所述移动终端通过深度传感器向待拍摄场景发射能量,然后接收待拍摄场景对所发射能量的反射能量,形成有关待拍摄场景的光能量分布函数,即灰度图像,然后在这些图像的基础上恢复拍摄场景的深度信息。
此外,在所述移动终端还包括与所述双目摄像头位于同一面的其它摄像头时,还可将所述其它摄像头指定用于待拍摄场景的拍摄,将所述双目摄像头指定用于待拍摄场景深度信息的获取。
可选地,基于前述任一实施例,提出本发明图像处理方法的第三实施例,在本实施例中,上述步骤S30之前,还包括:
在完成色彩传递时,对所述前景图像以及所述背景图像的分割边缘进行羽化处理;
在完成羽化处理时,执行所述将所述前景图像以及所述背景图像融合为结果图像的步骤。
本领域技术人员可以理解的是，在前述实施例中，在将所述待处理图像划分为前景图像以及背景图像之后，对所述前景图像和所述背景图像之一进行了色彩传递的线性变换，或者分别采用不同的参考图像对所述前景图像以及所述背景图像进行了色彩传递的线性变换，所述前景图像和所述背景图像存在一定的颜色反差，若直接将所述前景图像以及所述背景图像进行融合，所述前景图像和所述背景图像的融合边缘（即前述分割边缘）处的过渡将比较生硬，影响结果图像的显示效果。因此，在本实施例中，在对所述前景图像以及所述背景图像进行融合之前，先对所述前景图像以及所述背景图像的分割边缘按预设的羽化值（可由移动终端缺省设定，或者用户自定义）进行羽化处理，在完成羽化处理后，再将所述前景图像以及所述背景图像融合为结果图像。羽化处理原理是将图像融合边缘处虚化，起到渐变的作用从而达到自然衔接的效果，其中，羽化值越大，虚化范围越宽，也就是说颜色递变更柔和，羽化值越小，虚化范围越窄，颜色递变更剧烈，可根据实际情况进行调节。
本实施例通过在将所述前景图像以及所述背景图像进行融合之前,先对所述前景图像以及所述背景图像的分割边缘进行羽化处理,使得融合得到的结果图像的融合边缘过渡比较自然,能够提高结果图像的显示效果。
可选地,基于前述任一实施例,提出本发明图像处理方法的第四实施例,在本实施例中,上述步骤S20中所述获取色彩传递的参考图像包括:
显示参考图像的选择界面,以供用户基于所述选择界面选择对所述前景图像和/或所述背景图像进行色彩传递的参考图像;
在接收到用户基于所述选择界面触发的选择指令时,获取所述选择指令对应的参考图像。
本实施例中，在进行所述前景图像以及所述背景图像的色彩传递的线性变换时，用户可分别指定所述前景图像以及所述背景图像各自对应的参考图像，也可仅为二者之一指定对应的参考图像。所述移动终端在将所述待处理图像划分为前景图像以及背景图像之后，显示参考图像的选择界面，以供用户基于所述选择界面选择对所述前景图像和/或所述背景图像进行色彩传递的参考图像，例如，用户可选择对所述前景图像进行色彩变换的第一参考图像，以及选择对所述背景图像进行色彩变换的第二参考图像；或者用户仅为所述前景图像和所述背景图像之一选择进行色彩变换的参考图像。
在接收到用户基于所述选择界面触发的选择指令时,所述移动终端根据所述选择指令获取所述前景图像对应的第一参考图像,以及获取所述背景图像对应的第二参考图像;
或者,在接收到用户基于所述选择界面触发的选择指令时,所述移动终端根据所述选择指令获取所述前景图像和所述背景图像之一所对应的参考图像。
例如,所述待处理图像为用户拍摄的包括人像的风景照片,手机将照片中的“人”(前景部分)划分为前景图像,将照片中的“风景”背景部分划分为背景图像,用户可为“风景”选择秋天金黄的颜色氛围的参考图像,不为“人”选择参考图像,在完成选择后,手机将秋天金黄的颜色氛围传递给“风景”,最终使得拍摄的照片出现季节变换的效果。
本实施例通过响应用户操作分别为待处理图像的前景部分和背景部分指定不同的参考图像,或者仅为待处理图像的前景部分和背景部分之一指定参考图像,能够提升用户体验。
可选地,基于前述任一实施例,提出本发明图像处理方法的第五实施例,在本实施例中,当所述参考图像为多个时,上述步骤S20中所述根据获取的所述参考图像对所述前景图像和/或所述背景图像进行色彩传递包括:
采用所述前景图像以及所述背景图像各自对应的参考图像分别对所述前景图像和所述背景图像进行色彩传递。
在本实施例中,所述移动终端获取到所述前景图像对应的第一参考图像,以及获取到所述背景图像对应的第二参考图像,并采用获取的所述第一参考图像对所述前景图像进行色彩传递,采用获取的所述第二参考图像对所述背景图像进行色彩传递。以下以基于低阶统计信息,采用获取的所述第二参考图像对所述背景图像的色彩传递进行说明。
由于Lαβ颜色空间相较于RGB颜色空间更符合人类视觉感知系统，将其运用到自然场景中时，Lαβ颜色空间能够显著降低颜色通道之间的相关性，使通道之间具有一定的相互独立性，可以最大限度地减小一个通道的变化给另外两个通道造成的影响，从而可以在不同的颜色通道进行不同的运算，而不会出现通道交叉的问题。因此，为了达到较好的色彩传递效果，所述移动终端首先将所述背景图像以及所述第二参考图像由RGB颜色空间转换至Lαβ颜色空间，将Lαβ颜色空间作为色彩传递的执行空间。其中，L通道表示非彩色通道，即亮度通道，α表示彩色的黄蓝通道，β表示彩色的红绿通道。
在完成所述背景图像以及所述第二参考图像的颜色空间转换之后，所述移动终端首先计算所述背景图像的L通道均值 $\bar{l}_{dst}$、α通道均值 $\bar{\alpha}_{dst}$、β通道均值 $\bar{\beta}_{dst}$，所述第二参考图像的L通道均值 $\bar{l}_{ref}$、α通道均值 $\bar{\alpha}_{ref}$、β通道均值 $\bar{\beta}_{ref}$，以及所述背景图像的L通道标准差 $\sigma_{dst}^{l}$、α通道标准差 $\sigma_{dst}^{\alpha}$、β通道标准差 $\sigma_{dst}^{\beta}$，和所述第二参考图像的L通道标准差 $\sigma_{ref}^{l}$、α通道标准差 $\sigma_{ref}^{\alpha}$、β通道标准差 $\sigma_{ref}^{\beta}$。

然后所述移动终端从所述背景图像中移走均值，将剩余部分按照标准差的比值缩放，最后加入所述第二参考图像的均值，其变换公式如下：

$$l^{*} = l_{dst} - \bar{l}_{dst}, \qquad \alpha^{*} = \alpha_{dst} - \bar{\alpha}_{dst}, \qquad \beta^{*} = \beta_{dst} - \bar{\beta}_{dst}$$

$$l' = \frac{\sigma_{ref}^{l}}{\sigma_{dst}^{l}}\,l^{*} + \bar{l}_{ref}, \qquad \alpha' = \frac{\sigma_{ref}^{\alpha}}{\sigma_{dst}^{\alpha}}\,\alpha^{*} + \bar{\alpha}_{ref}, \qquad \beta' = \frac{\sigma_{ref}^{\beta}}{\sigma_{dst}^{\beta}}\,\beta^{*} + \bar{\beta}_{ref}$$

其中，$l_{dst}$、$\alpha_{dst}$ 和 $\beta_{dst}$ 分别表示选中的像素的每个通道值，$l'$、$\alpha'$ 和 $\beta'$ 分别表示选中的像素变换后的每个通道值。
经过上述运算即可使得所述背景图像和所述第二参考图像的每个通道的低阶统计信息一致,达到将所述第二参考图像的颜色特征传递到所述背景图像的目的,然后将运算后的背景图像由Lαβ颜色空间转换为RGB颜色空间,有利于移动终端显示。
采用第一参考图像对所述前景图像进行的色彩传递可参照上述技术方案实施,此处不再赘述。本领域技术人员可以理解的是,在其他实施例中,可以按实际需要选取色彩传递的方式,例如,可以采用基于高阶统计信息的色彩传递,或者采用基于聚类的区域色彩传递。
通过以上的实施方式的描述,本领域的技术人员可以清楚地了解到上述实施例方法可借助软件加必需的通用硬件平台的方式来实现,当然也可以通过硬件,但很多情况下前者是更佳的实施方式。基于这样的理解,本申请的技术方案本质上或者说对相关技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质(如ROM/RAM、磁碟、光盘)中,包括多个指令用以使得一台终端设备(可以是手机,计算机,服务器,空调器,或者网络设备等)执行本发明实施例所述的方法。
以上仅为本发明的可选实施例,并非因此限制本申请的专利范围,凡是利用本文说明书及附图内容所作的等效结构或等效流程变换,或直接或间接运用在其他相关的技术领域,均同理包括在本申请保护范围内。
工业实用性
本发明实施例提出的图像处理装置及方法,在进行待处理图像的色彩传递时,移动终端首先基于待处理图像的深度信息将待处理图像划分为前景图像和背景图像,然后对于所述前景图像和所述背景图像采用不同的参考图像应用色彩传递技术,或者仅对所述前景图像和所述背景图像之一采用参考图像应用色彩传递技术,在完成色彩传递之后,再将所述前景图像以及所述背景图像组合为结果图像,使得结果图像的前景部分和背景部分具有不同的颜色氛围,相较于相关技术仅能对待处理图像进行整体色彩传递的方式,本发明实施例能够更灵活进行图像的色彩传递。

Claims (20)

  1. 一种图像处理装置,所述图像处理装置包括:
    划分模块,设置为:获取待处理图像的深度信息,并根据获取的所述深度信息将所述待处理图像划分为前景图像和背景图像;
    变换模块,设置为:获取色彩传递的参考图像,并根据获取的所述参考图像对所述前景图像和/或所述背景图像进行色彩传递,以使得所述前景图像与所述背景图像的颜色特征不同;
    合成模块,设置为:在完成所述色彩传递时,将所述前景图像以及所述背景图像融合为结果图像。
  2. 如权利要求1所述的图像处理装置,所述图像处理装置还包括:
    拍摄模块,设置为:在侦测到拍摄指令时,对待拍摄场景进行拍摄,并获取所述待拍摄场景的深度信息;
    关联模块,设置为:将所述待拍摄场景的深度信息与拍摄的图像关联,并将拍摄的所述图像作为所述待处理图像。
  3. 如权利要求1所述的图像处理装置,其中,
    所述划分模块,设置为:搜索本地或者云端是否存在所述待处理图像预先关联的深度信息,在搜索到所述待处理图像预先关联的深度信息时,将搜索到的所述深度信息作为所述待处理图像的深度信息。
  4. 如权利要求1所述的图像处理装置,其中,
    所述变换模块,设置为:采用基于低阶统计信息的色彩传递、或采用基于高阶统计信息的色彩传递、或采用基于聚类的区域色彩传递。
  5. 如权利要求1所述的图像处理装置,所述图像处理装置还包括羽化模块,设置为:在完成所述色彩传递时,对所述前景图像以及所述背景图像的分割边缘进行羽化处理;
    所述合成模块还设置为:在完成所述羽化处理时,将所述前景图像以及所述背景图像融合为结果图像。
  6. 如权利要求1-5任一项所述的图像处理装置，其中，所述变换模块还设置为：显示参考图像的选择界面，以供用户基于所述选择界面选择对所述前景图像和/或所述背景图像进行色彩传递的参考图像；以及在接收到用户基于所述选择界面触发的选择指令时，获取所述选择指令对应的参考图像。
  7. 如权利要求1-5任一项所述的图像处理装置,其中,当所述参考图像为多个时,所述变换模块设置为:采用所述前景图像以及所述背景图像各自对应的参考图像分别对所述前景图像和所述背景图像进行色彩传递。
  8. 如权利要求1-5任一项所述的图像处理装置,所述图像处理装置还包括:
    记录模块,设置为:在所述划分模块将所述待处理图像划分为前景图像和背景图像时,记录所述前景图像以及所述背景图像间的连接信息;
    所述合成模块设置为:在完成所述色彩传递时,根据所述记录模块记录的所述前景图像以及所述背景图像间的连接信息将所述前景图像以及所述背景图像融合为结果图像。
  9. 如权利要求1-5任一项所述的图像处理装置,所述图像处理装置还包括:
    显示模块,设置为:显示所述结果图像。
  10. 如权利要求9所述的图像处理装置,其中,
    所述显示模块还设置为:在显示所述结果图像时,显示提示信息显示界面,供用户基于所述提示信息显示界面确认是否存储所述结果图像;在接收到用户基于所述提示信息显示界面输入的确认信息时,将所述结果图像存储至预设的存储路径指向的存储区域。
  11. 一种图像处理方法,所述图像处理方法包括:
    获取待处理图像的深度信息,并根据获取的所述深度信息将所述待处理图像划分为前景图像和背景图像;
    获取色彩传递的参考图像,并根据获取的所述参考图像对所述前景图像和/或所述背景图像进行色彩传递,以使得所述前景图像与所述背景图像的颜色特征不同;
    在完成所述色彩传递时,将所述前景图像以及所述背景图像融合为结果图像。
  12. 如权利要求11所述的图像处理方法,其中,所述获取待处理图像的深度信息,并根据获取的所述深度信息将所述待处理图像划分为前景图像和背景图像的步骤之前,还包括:
    在侦测到拍摄指令时,对待拍摄场景进行拍摄,并获取所述待拍摄场景的深度信息;
    将所述待拍摄场景的深度信息与拍摄的图像关联,并将拍摄的所述图像作为所述待处理图像。
  13. 如权利要求11所述的图像处理方法,其中,所述获取待处理图像的深度信息包括:
    搜索本地或者云端是否存在所述待处理图像预先关联的深度信息,在搜索到所述待处理图像预先关联的深度信息时,将搜索到的所述深度信息作为所述待处理图像的深度信息。
  14. 如权利要求11所述的图像处理方法,其中,所述获取色彩传递的参考图像,并根据获取的所述参考图像对所述前景图像和/或所述背景图像进行色彩传递的步骤中,采用基于低阶统计信息的色彩传递、或采用基于高阶统计信息的色彩传递、或采用基于聚类的区域色彩传递。
  15. 如权利要求11所述的图像处理方法,其中,所述将所述前景图像以及所述背景图像融合为结果图像的步骤之前,还包括:
    在完成所述色彩传递时,对所述前景图像以及所述背景图像的分割边缘进行羽化处理;
    在完成所述羽化处理时,执行所述将所述前景图像以及所述背景图像融合为结果图像的步骤。
  16. 如权利要求11-15任一项所述的图像处理方法,其中,所述获取色彩传递的参考图像包括:
    显示参考图像的选择界面,以供用户基于所述选择界面选择对所述前景图像和/或所述背景图像进行色彩传递的参考图像;
    在接收到用户基于所述选择界面触发的选择指令时,获取所述选择指令对应的参考图像。
  17. 如权利要求11-15任一项所述的图像处理方法,其中,当所述参考图像为多个时,所述根据获取的所述参考图像对所述前景图像和/或所述背景图像进行色彩传递包括:
    采用所述前景图像以及所述背景图像各自对应的参考图像分别对所述前景图像和所述背景图像进行色彩传递。
  18. 如权利要求11-15任一项所述的图像处理方法,其中,根据获取的所述深度信息将所述待处理图像划分为前景图像和背景图像时,还包括:记录所述前景图像以及所述背景图像间的连接信息;
    所述将所述前景图像以及所述背景图像融合为结果图像包括：根据所述前景图像以及所述背景图像间的连接信息将所述前景图像以及所述背景图像融合为结果图像。
  19. 如权利要求11-15任一项所述的图像处理方法,其中,将所述前景图像以及所述背景图像融合为结果图像之后,还包括:
    显示所述结果图像。
  20. 如权利要求19所述的图像处理方法,其中,所述显示所述结果图像时,还包括:显示提示信息显示界面,供用户基于所述提示信息显示界面确认是否存储所述结果图像;在接收到用户基于所述提示信息显示界面输入的确认信息时,将所述结果图像存储至预设的存储路径指向的存储区域。
PCT/CN2016/103238 2015-10-30 2016-10-25 图像处理装置及方法 WO2017071559A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510733347.1A CN105430295B (zh) 2015-10-30 2015-10-30 图像处理装置及方法
CN201510733347.1 2015-10-30

Publications (1)

Publication Number Publication Date
WO2017071559A1 true WO2017071559A1 (zh) 2017-05-04






Also Published As

Publication number Publication date
CN105430295A (zh) 2016-03-23
CN105430295B (zh) 2019-07-12


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 16858994; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 16858994; Country of ref document: EP; Kind code of ref document: A1)