CN106993091B - Image blurring method and mobile terminal - Google Patents


Info

Publication number
CN106993091B
CN106993091B · Application CN201710198188.9A
Authority
CN
China
Prior art keywords
image
depth data
blurring
image areas
depth
Prior art date
Legal status
Active
Application number
CN201710198188.9A
Other languages
Chinese (zh)
Other versions
CN106993091A (en)
Inventor
王俊
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201710198188.9A priority Critical patent/CN106993091B/en
Publication of CN106993091A publication Critical patent/CN106993091A/en
Application granted granted Critical
Publication of CN106993091B publication Critical patent/CN106993091B/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M1/72439User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/2224Studio circuitry; Studio devices; Studio equipment related to virtual studio applications
    • H04N5/2226Determination of depth image, e.g. for foreground/background separation

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Image Processing (AREA)
  • Telephone Function (AREA)

Abstract

The invention provides an image blurring method, which comprises the following steps: acquiring N image areas selected by a mobile terminal user on an image; acquiring initial depth data of each of the N image areas; and blurring the image based on the initial depth data of each of the N image areas, wherein N is an integer greater than 1. In the image blurring scheme provided by the invention, the depth data corresponding to different depth layers are fused into the same depth data, so that a plurality of areas can be merged into the same depth layer. The generated image therefore displays the fused layer clearly while the depth layers other than the fused layer are displayed blurred; that is, the areas selected by the user are displayed clearly while the other areas are blurred, and any number of targets can be selected for image blurring.

Description

Image blurring method and mobile terminal
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image blurring method and a mobile terminal.
Background
With the rapid development of science and technology, the mobile terminal has become an indispensable part of people's lives, and people routinely use mobile terminals to take photographs. To bring a better experience to the user, two cameras are provided in the mobile terminal, so that images can be blurred according to the user's needs.
Current blurring on mobile terminals mainly takes the picture first and focuses afterwards. Dual-camera blurring mainly divides the picture into two layers, a foreground and a background; clicking a position on the foreground or the background switches which of the two is blurred and which is sharp.
The existing image blurring processing scheme cannot perform blurring across multiple depth layers. For example, if a person holds an article out in front of them and the image is focused on the main target layer, the depth data can be divided mainly into three layers: the article (front layer), the person (middle layer), and the background (rear layer). With the existing dual-camera focusing mode, clicking any one layer makes the clicked layer sharp while the other two layers are blurred. Because the existing blurring scheme can keep only one of the front, middle, and rear layers sharp, it cannot keep the article and the person sharp while blurring the background, nor blur the article while keeping the person and the background sharp.
Disclosure of Invention
The invention provides an image blurring method and a mobile terminal, which aim to solve the prior-art problem that image blurring can only keep a single layer sharp.
The invention discloses an image blurring method, which comprises the following steps: acquiring N image areas selected by a mobile terminal user on an image; acquiring initial depth data of each of the N image areas; and blurring the image based on the initial depth data of each of the N image areas, wherein N is an integer greater than 1.
The invention also discloses a mobile terminal, which comprises: the first acquisition module is used for acquiring N image areas selected by a mobile terminal user on an image; the second acquisition module is used for acquiring initial depth data of each of the N image areas; and the blurring module is used for blurring the image based on the initial depth data of each of the N image areas, wherein N is an integer larger than 1.
The image blurring scheme provided by the embodiment of the invention acquires N image areas selected by a mobile terminal user on an image; acquires initial depth data of each of the N image areas; and blurs the image based on the initial depth data of each of the N image areas, wherein N is an integer greater than 1. In this scheme, the depth data corresponding to different depth layers are fused into the same depth data, so that a plurality of areas can be merged into the same depth layer; the generated image displays the fused layer clearly while the other depth layers are blurred. That is, the areas selected by the user are displayed clearly while the other areas are blurred, and any number of targets can be selected for image blurring.
Drawings
In order to illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without inventive labor.
FIG. 1 is a flow chart of an image blurring method according to an embodiment of the present invention;
FIG. 2 is a flow chart of an image blurring method according to an embodiment of the present invention;
fig. 3 is a block diagram of a mobile terminal according to an embodiment of the present invention;
fig. 4 is a block diagram of a mobile terminal according to an embodiment of the present invention;
FIG. 5 is a block diagram of a blurring module according to an embodiment of the present invention;
FIG. 6 is a block diagram of a judgment sub-module according to an embodiment of the present invention;
fig. 7 is a block diagram of a mobile terminal according to an embodiment of the present invention;
fig. 8 is a block diagram of a mobile terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to FIG. 1, a flow chart of the steps of an image blurring method of the present invention is shown.
The image blurring method provided by the embodiment of the invention comprises the following steps:
step 101: n image areas selected by a mobile terminal user on an image are obtained.
When a user needs to perform multi-depth-layer blurring on an image, the user performs a touch operation on the areas of the image displayed on the screen of the mobile terminal that need to remain sharp or need to be blurred. It should be noted that a person skilled in the art may define the touch operation according to the actual situation; it may be a single tap, a double tap, a long press, or the like, and is not specifically limited here.
According to the user's touch operations on the image, the plurality of areas selected by the user are obtained, which determines the image areas on which the user needs blurring processing to be performed.
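As an illustration of step 101, the sketch below turns each touch point into a small rectangular image area. The square-region representation, the helper name, and the `half` parameter are illustrative assumptions; the patent does not fix how a selected area is delimited around a touch point.

```python
def regions_from_touches(touch_points, half=2):
    """Expand each (row, col) touch coordinate into a square pixel region,
    returned as (r0, c0, r1, c1) with r1/c1 exclusive. In practice the
    rectangle would also be clamped to the image bounds."""
    return [(r - half, c - half, r + half + 1, c + half + 1)
            for (r, c) in touch_points]

# Two taps on the image yield N = 2 selected areas
areas = regions_from_touches([(10, 20), (40, 5)], half=2)
print(areas)  # [(8, 18, 13, 23), (38, 3, 43, 8)]
```

A real implementation would map screen coordinates to image pixel coordinates first, but the region list produced here is the shape of data the later steps consume.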
Step 102: initial depth data for each of the N image regions is acquired.
Initial depth data is determined for each pixel in the image. Because the targets in the image lie at different distances, some pixels belong to the same depth layer while others do not, and the depth data of different depth layers differ; the initial depth data of each pixel in the image is therefore determined.
Once the initial depth data of each of the N image areas has been acquired, subsequent operations that require image processing do not need to acquire the initial depth data of the areas again, which saves time.
Note that the initial depth data is the unprocessed depth data of each pixel.
Step 103: the image is blurred based on the initial depth data for each of the N image regions.
Wherein N is an integer greater than 1.
Processing based on the initial depth data of each of the N image areas completes the multi-depth-layer blurring of the image.
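Steps 101–103 can be sketched end to end as follows for a grayscale image stored as a list of rows. The rectangle representation of a selected area and the simple box blur are illustrative assumptions; the patent does not prescribe a data layout or blur kernel, and the depth-data handling is elaborated in the second embodiment.

```python
def blur_outside_regions(image, regions, radius=1):
    """Keep the N user-selected rectangular regions sharp and box-blur
    every other pixel (integer grayscale values)."""
    h, w = len(image), len(image[0])
    selected = [[False] * w for _ in range(h)]
    for (r0, c0, r1, c1) in regions:           # step 101: mark the selected areas
        for r in range(r0, r1):
            for c in range(c0, c1):
                selected[r][c] = True
    out = [row[:] for row in image]
    for r in range(h):                          # step 103: blur unselected pixels
        for c in range(w):
            if selected[r][c]:
                continue                        # selected pixels stay sharp
            total = count = 0
            for dr in range(-radius, radius + 1):
                for dc in range(-radius, radius + 1):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < h and 0 <= cc < w:
                        total += image[rr][cc]
                        count += 1
            out[r][c] = total // count          # average of the in-bounds window
    return out

img = [[0, 0, 0],
       [0, 9, 0],
       [0, 0, 0]]
# One selected area: the center pixel; it stays 9 while the rest are averaged
print(blur_outside_regions(img, [(1, 1, 2, 2)]))  # [[2, 1, 2], [1, 9, 1], [2, 1, 2]]
```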
The image blurring method provided by the embodiment of the invention acquires N image areas selected by a mobile terminal user on an image; acquires initial depth data of each of the N image areas; and blurs the image based on the initial depth data of each of the N image areas, wherein N is an integer greater than 1. In this method, the depth data corresponding to different depth layers are fused into the same depth data, so that a plurality of areas can be merged into the same depth layer; the generated image displays the fused layer clearly while the other depth layers are blurred. That is, the areas selected by the user are displayed clearly while the other areas are blurred, and any number of targets can be selected for image blurring.
Referring to fig. 2, a flow chart of the steps of an image blurring method of the present invention is shown.
The image blurring method provided by the embodiment of the invention comprises the following steps:
step 201: n image areas selected by a mobile terminal user on an image are obtained.
When a user needs to perform multi-depth-layer blurring on an image, the user performs a touch operation on the areas of the image displayed on the screen of the mobile terminal that need to remain sharp or need to be blurred. A person skilled in the art may define the touch operation according to the actual situation; it may be a single tap, a double tap, a long press, or the like, and is not specifically limited here. The plurality of areas selected by the user on the image are obtained according to these touch operations.
Step 202: initial depth data for each of the N image regions is acquired.
Wherein N is an integer greater than 1.
Initial depth data is determined for each pixel in the image. Because the targets in the image lie at different distances, some pixels belong to the same depth layer while others do not, and the depth data of different depth layers differ; the initial depth data of each pixel in the image is therefore determined.
Note that the initial depth data is the unprocessed depth data of each pixel.
Once the initial depth data of each of the N image areas has been acquired, subsequent operations that require image processing do not need to acquire the initial depth data of the areas again, which saves time.
The acquired initial depth data of each of the N image areas can be used to perform blurring on the N image areas; when blurring of other areas is required, the initial depth data of each area can be used again to blur the different areas.
Step 203: judging whether the N image areas are in the same depth layer or not based on the initial depth data of each of the N image areas; if yes, go to step 204, otherwise go to step 205.
Specifically, it is judged whether the initial depth data of each of the N image areas has the same depth value or belongs to the same depth range; if so, the N image areas are judged to be in the same depth layer.
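The same-depth-layer test of step 203 can be sketched as below, with one representative depth value per area. Treating "belongs to the same depth range" as a tolerance around the first area's depth is an assumption for illustration; the patent does not define how a depth range is delimited.

```python
def in_same_depth_layer(region_depths, tolerance=0.0):
    """Judge whether all areas share one depth layer: every area's depth
    must equal the first area's depth (tolerance 0), or lie within
    `tolerance` of it (a stand-in for a shared depth range)."""
    if not region_depths:
        return True
    reference = region_depths[0]
    return all(abs(d - reference) <= tolerance for d in region_depths)

print(in_same_depth_layer([3.0, 3.0, 3.0]))            # True: same depth value
print(in_same_depth_layer([2.9, 3.1], tolerance=0.3))  # True: same depth range
print(in_same_depth_layer([1.0, 5.0]))                 # False: different layers
```

When the result is True the method proceeds to step 204 (blur directly); otherwise to step 205 (fuse the depth data first).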
Step 204: and if the N image areas are in the same depth layer, blurring the image.
When the N image areas are judged to be in the same depth layer, blurring processing is carried out directly, without any processing of the initial depth data corresponding to the selected areas.
Step 205: if the N image areas are in different depth layers, setting the depth data of all pixel points of each area in the N image areas to be the same depth value or the same depth range.
If the judgment result is that the N image areas are not in the same depth layer, all the pixel points of each of the N image areas are fused into the same depth value or the same depth range, so that the N image areas can be blurred as one layer.
Setting the different depth values or depth ranges corresponding to the N image areas to the same depth value or the same depth range makes the depth data of the image areas consistent, so that areas from different depth ranges can be handled together in the blurring processing without affecting the other, unselected areas.
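Steps 203–205 merge the selected areas into one depth layer. A minimal sketch, assuming the depth data is a per-pixel 2D list and each area is a pixel rectangle (both representational assumptions not specified by the patent):

```python
def fuse_regions_to_common_depth(depth_map, regions, target_depth):
    """Set the depth of every pixel inside the selected regions to one shared
    target value, merging the regions into a single depth layer; pixels
    outside the regions keep their original depth."""
    fused = [row[:] for row in depth_map]      # copy; unselected pixels untouched
    for (r0, c0, r1, c1) in regions:           # each region: a pixel rectangle
        for r in range(r0, r1):
            for c in range(c0, c1):
                fused[r][c] = target_depth
    return fused

depth = [[1, 1],
         [2, 3]]
# Two one-pixel areas at different depths (1 and 3) are fused to depth 2
print(fuse_regions_to_common_depth(depth, [(0, 0, 1, 1), (1, 1, 2, 2)], 2))
# [[2, 1], [2, 2]]
```

The choice of `target_depth` (e.g. the mean of the areas' depths) is left open here, matching the "preset target depth data" wording of the claims.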
Step 206: and blurring the image based on the depth data of each of the N image areas after the depth data is adjusted.
As a further refinement of the blurring processing, the blurring may be applied to the N image areas themselves, or to all image areas of the image other than the N image areas.
In the process of blurring the image, the selected image areas may be blurred, or all parts of the image except the N image areas may be blurred.
The image blurring method provided by the embodiment of the invention acquires N image areas selected by a mobile terminal user on an image; acquires initial depth data of each of the N image areas; and blurs the image based on the initial depth data of each of the N image areas, wherein N is an integer greater than 1. In this method, the depth data corresponding to different depth layers are fused into the same depth data, and a plurality of areas can be merged into the same depth layer, so that the generated image displays the fused layer clearly while the other depth layers are blurred; that is, the areas selected by the user are displayed clearly while the other areas are blurred, and the user experience is improved by this image blurring scheme.
Referring to fig. 3, a block diagram of a mobile terminal according to a third embodiment of the present invention is shown.
The mobile terminal provided by the embodiment of the invention comprises: a first obtaining module 301, configured to obtain N image areas selected by a mobile terminal user on an image; a second obtaining module 302, configured to obtain initial depth data of each of the N image regions; a blurring module 303, configured to perform blurring processing on the image based on the initial depth data of each of the N image regions, where N is an integer greater than 1.
The embodiment of the invention provides a mobile terminal, which acquires N image areas selected by a mobile terminal user on an image; acquires initial depth data of each of the N image areas; and blurs the image based on the initial depth data of each of the N image areas, wherein N is an integer greater than 1. In this mobile terminal, the depth data corresponding to different depth layers are fused into the same depth data, so that a plurality of areas can be merged into the same depth layer; the generated image displays the fused layer clearly while the other depth layers are blurred. That is, the areas selected by the user are displayed clearly while the other areas are blurred, and any number of targets can be selected for image blurring.
Referring to fig. 4, there is shown a block diagram of a mobile terminal of the present invention.
The mobile terminal provided by the embodiment of the invention comprises: a first obtaining module 401, configured to obtain N image areas selected by a mobile terminal user on an image; a second obtaining module 402, configured to obtain initial depth data of each of the N image regions; a blurring module 403, configured to perform blurring processing on the image based on the initial depth data of each of the N image regions, where N is an integer greater than 1.
Preferably, as shown in fig. 5, which is a block diagram of a blurring module, the blurring module 403 includes: a determining submodule 4031, configured to determine whether the N image regions are in the same depth layer based on initial depth data of each of the N image regions; a blurring sub-module 4032, configured to perform blurring processing on the image if the N image areas are in the same depth layer.
Preferably, the mobile terminal further includes: a fusion module 404, configured to set initial depth data of each of the N image regions as preset target depth data if the N image regions are in different depth layers; an image blurring module 405, configured to perform blurring processing on the image based on the depth data of each of the N image regions after the depth data is adjusted.
Preferably, the fusion module 404 is specifically configured to set the depth data of all the pixel points in each of the N image regions to be the same depth value or the same depth range.
Preferably, as shown in fig. 6, which is a block diagram of the judgment sub-module, the judgment sub-module 4031 includes: a judging unit 40311, configured to judge whether the initial depth data of each of the N image areas is the same depth value or belongs to the same depth range; and a determining unit 40322, configured to determine that the N image areas are in the same depth layer if the initial depth data of each of the N image areas is the same depth value or belongs to the same depth range.
Preferably, as shown in fig. 5, the blurring module 403 further includes a processing sub-module 4033, configured to perform blurring processing on the N image areas, or on all image areas in the image except the N image areas.
The embodiment of the invention provides a mobile terminal, which acquires N image areas selected by a mobile terminal user on an image; acquires initial depth data of each of the N image areas; and blurs the image based on the initial depth data of each of the N image areas, wherein N is an integer greater than 1. In this mobile terminal, the depth data corresponding to different depth layers are fused into the same depth data, so that a plurality of areas can be merged into the same depth layer; the generated image displays the fused layer clearly while the other depth layers are blurred. That is, the areas selected by the user are displayed clearly while the other areas are blurred, and the user experience is improved by this image blurring scheme.
Referring to fig. 7, there is shown a block diagram of the structure of the mobile terminal of the present invention.
The mobile terminal 800 of the embodiment of the present invention includes: at least one processor 801, memory 802, at least one network interface 804, and other user interfaces 803. The various components in the mobile terminal 800 are coupled together by a bus system 805. It is understood that the bus system 805 is used to enable communications among the components connected. The bus system 805 includes a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled as bus system 805 in fig. 7.
The user interface 803 may include, among other things, a display, a keyboard, or a pointing device (e.g., a mouse, a trackball, a touch pad, or a touch screen).
It will be appreciated that the memory 802 in embodiments of the invention may be volatile memory or nonvolatile memory, or may include both. The nonvolatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be a Random Access Memory (RAM), which acts as an external cache. By way of example, and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and Direct Rambus RAM (DRRAM). The memory 802 of the systems and methods described in connection with the embodiments of the invention is intended to comprise, without being limited to, these and any other suitable types of memory.
In some embodiments, memory 802 stores the following elements, executable modules or data structures, or a subset thereof, or an expanded set thereof: an operating system 8021 and application programs 8022.
The operating system 8021 includes various system programs, such as a framework layer, a core library layer, a driver layer, and the like, and is used for implementing various basic services and processing hardware-based tasks. The application program 8022 includes various application programs, such as a Media Player (Media Player), a Browser (Browser), and the like, for implementing various application services. A program implementing a method according to an embodiment of the present invention may be included in application program 8022.
In the embodiment of the present invention, by calling the program or instruction stored in the memory 802, specifically, the program or instruction stored in the application program 8022, the processor 801 is configured to: acquiring N image areas selected by a mobile terminal user on an image; acquiring initial depth data of each of the N image areas; blurring the image based on the initial depth data of each of the N image regions; wherein N is an integer greater than 1.
The methods disclosed in the embodiments of the present invention described above may be applied to the processor 801 or implemented by the processor 801. The processor 801 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or by instructions in the form of software in the processor 801. The processor 801 may be a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present invention. A general purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the embodiments of the present invention may be embodied directly in a hardware decoding processor, or in a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM or EPROM, or registers. The storage medium is located in the memory 802, and the processor 801 reads the information in the memory 802 and completes the steps of the method in combination with its hardware.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or any combination thereof. For a hardware implementation, the Processing units may be implemented within one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), general purpose processors, controllers, micro-controllers, microprocessors, other electronic units configured to perform the functions described herein, or a combination thereof.
For a software implementation, the techniques described in this embodiment of the invention may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described in this embodiment of the invention. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
Optionally, the processor 801 is further configured to: judging whether the N image areas are in the same depth layer or not based on the initial depth data of each of the N image areas; and if the N image areas are in the same depth layer, performing blurring processing on the image.
Optionally, the processor 801 is further configured to: if the N image areas are in different depth layers, setting the initial depth data of each of the N image areas as preset target depth data; and performing blurring processing on the image based on the depth data of each of the N image areas after the depth data is adjusted.
Optionally, the processor 801 is further configured to: and setting the depth data of all pixel points of each of the N image areas as the same depth value or the same depth range.
Optionally, the processor 801 is further configured to: judging whether the initial depth data of each of the N image areas is the same depth value or belongs to the same depth range; and if the initial depth data of each of the N image areas is the same depth value or belongs to the same depth range, judging that the N image areas are in the same depth layer.
Optionally, the processor 801 is further configured to: and performing blurring processing on the N image areas, or performing blurring processing on all image areas except the N image areas in the image.
The mobile terminal 800 can implement each process implemented by the mobile terminal in the foregoing embodiments, and details are not repeated here to avoid repetition.
The mobile terminal provided by the embodiment of the invention acquires N image areas selected by a mobile terminal user on an image; acquires initial depth data of each of the N image areas; and blurs the image based on the initial depth data of each of the N image areas, wherein N is an integer greater than 1. In this mobile terminal, the depth data corresponding to different depth layers are fused into the same depth data, so that a plurality of areas can be merged into the same depth layer; the generated image displays the fused layer clearly while the other depth layers are blurred. That is, the areas selected by the user are displayed clearly while the other areas are blurred, and any number of targets can be selected for image blurring.
Referring to fig. 8, there is shown a block diagram of a mobile terminal of the present invention.
The mobile terminal in the embodiment of the present invention may be a mobile phone, a tablet computer, a Personal Digital Assistant (PDA), or a vehicle-mounted computer.
The mobile terminal in fig. 8 includes a Radio Frequency (RF) circuit 910, a memory 920, an input unit 930, a display unit 940, a processor 960, an audio circuit 970, a Wi-Fi (Wireless Fidelity) module 980, and a power supply 990.
The input unit 930 may be used to receive numeric or character information input by a user and to generate signal inputs related to user settings and function control of the mobile terminal. Specifically, in the embodiment of the present invention, the input unit 930 may include a touch panel 931. The touch panel 931, also referred to as a touch screen, can collect touch operations performed by the user on or near it (for example, operations performed on the touch panel 931 with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connected device according to a preset program. Optionally, the touch panel 931 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends them to the processor 960, and can receive and execute commands sent by the processor 960. In addition, the touch panel 931 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 931, the input unit 930 may also include other input devices 932, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick.
Among them, the display unit 940 may be used to display information input by the user or information provided to the user, as well as the various menu interfaces of the mobile terminal. The display unit 940 may include a display panel 941, which may optionally be configured in the form of a Liquid Crystal Display (LCD) or an Organic Light-Emitting Diode (OLED) display.
It should be noted that the touch panel 931 may cover the display panel 941 to form a touch display screen. When the touch display screen detects a touch operation on or near it, the touch operation is transmitted to the processor 960 to determine the type of the touch event, and the processor 960 then provides a corresponding visual output on the touch display screen according to the type of the touch event.
The touch display screen comprises an application interface display area and a common control display area. The arrangement of the two display areas is not limited; they may be arranged in any manner that distinguishes them, such as one above the other or side by side. The application interface display area may be used to display the interface of an application. Each interface may contain at least one interface element, such as an icon and/or a widget desktop control of an application. The application interface display area may also be an empty interface that does not contain any content. The common control display area is used to display controls with a high usage rate, such as a settings button, interface numbers, scroll bars, and application icons such as a phone book icon.
The processor 960 is the control center of the mobile terminal. It connects the various parts of the entire mobile terminal by using various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing software programs and/or modules stored in the first memory 921 and calling data stored in the second memory 922, thereby monitoring the mobile terminal as a whole. Optionally, the processor 960 may include one or more processing units.
In an embodiment of the present invention, the processor 960 is configured, by invoking a software program and/or module stored in the first memory 921 and/or data stored in the second memory 922, to: acquire N image areas selected by a user of the mobile terminal on an image; acquire initial depth data of each of the N image areas; and perform blurring processing on the image based on the initial depth data of each of the N image areas, where N is an integer greater than 1.
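The flow the processor 960 is configured to perform can be sketched as follows. This is a minimal illustrative Python sketch, not the patented implementation: the representation of an image area as a (y0, y1, x0, x1) index range, the mean-depth comparison, the tolerance `tol`, and the function name are all assumptions made for illustration.

```python
import numpy as np

def process_selection(depth_map, regions, target_depth=1.0, tol=0.1):
    """Sketch of the dispatch step: if the N selected areas already lie in
    one depth layer, the image can be blurred directly; otherwise their
    depth data is first set to a preset target depth (fused), and the
    blurring is then based on the adjusted depth data.

    depth_map: H x W array of per-pixel depth values.
    regions:   list of (y0, y1, x0, x1) index ranges; N must be > 1.
    Returns the (possibly adjusted) depth map and whether fusion occurred.
    """
    assert len(regions) > 1, "N must be an integer greater than 1"
    # Acquire the initial depth data of each selected area
    means = [float(depth_map[y0:y1, x0:x1].mean()) for y0, y1, x0, x1 in regions]
    # "Same depth layer" is modeled here as: all mean depths within one band
    same_layer = max(means) - min(means) <= tol
    depth = depth_map.copy()
    if not same_layer:
        # Fuse: set every pixel of every selected area to the target depth
        for y0, y1, x0, x1 in regions:
            depth[y0:y1, x0:x1] = target_depth
    return depth, not same_layer
```

The blurring step itself would then use the returned depth map, keeping the fused layer sharp and blurring the rest.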
Optionally, the processor 960 is further configured to: judge whether the N image areas are in the same depth layer based on the initial depth data of each of the N image areas; and, if the N image areas are in the same depth layer, perform blurring processing on the image.
Optionally, the processor 960 is further configured to: if the N image areas are in different depth layers, set the initial depth data of each of the N image areas to preset target depth data; and perform blurring processing on the image based on the depth data of each of the N image areas after the adjustment.
Optionally, the processor 960 is further configured to set the depth data of all pixel points of each of the N image areas to the same depth value or the same depth range.
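The setting step above can be sketched in Python as below. This is an illustrative sketch only; in particular, representing "the same depth range" as a (lo, hi) pair into which existing depths are clipped is an assumed interpretation, since the patent does not specify how a shared range is imposed.

```python
import numpy as np

def fuse_depth(depth_map, regions, target):
    """Set the depth data of all pixel points of each selected area to the
    preset target: either one depth value (a float) or one depth range
    (a (lo, hi) pair into which existing depths are clipped -- an assumed
    interpretation of imposing "the same depth range")."""
    depth = depth_map.copy()
    for y0, y1, x0, x1 in regions:
        if isinstance(target, tuple):   # same depth range
            lo, hi = target
            depth[y0:y1, x0:x1] = np.clip(depth[y0:y1, x0:x1], lo, hi)
        else:                           # same depth value
            depth[y0:y1, x0:x1] = target
    return depth
```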
Optionally, the processor 960 is further configured to: judge whether the initial depth data of each of the N image areas is the same depth value or belongs to the same depth range; and, if the initial depth data of each of the N image areas is the same depth value or belongs to the same depth range, judge that the N image areas are in the same depth layer.
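This same-depth-layer check can be sketched as follows. The list of predefined depth ranges is an assumption made for illustration; the patent does not state how the ranges are defined.

```python
import numpy as np

def in_same_depth_layer(region_depths, depth_ranges):
    """Judge whether N areas are in one depth layer: either all their depth
    data equal one value, or all fall inside one predefined depth range."""
    values = np.concatenate([np.asarray(d).ravel() for d in region_depths])
    if np.all(values == values[0]):                       # same depth value
        return True
    return any(np.all((values >= lo) & (values <= hi))    # same depth range
               for lo, hi in depth_ranges)
```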
Optionally, the processor 960 is further configured to perform blurring processing on the N image areas, or to perform blurring processing on all image areas except the N image areas in the image.
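The two blurring options can be sketched with a naive box blur on a grayscale image; the kernel size and the choice of a box blur are illustrative assumptions, and any blur filter could stand in.

```python
import numpy as np

def selective_blur(image, regions, blur_selected=False, k=3):
    """Blur either the N selected areas (blur_selected=True) or all areas
    of the image except the N selected ones (blur_selected=False).
    Uses a naive k x k box blur on a 2-D grayscale image."""
    h, w = image.shape
    pad = k // 2
    padded = np.pad(image.astype(float), pad, mode="edge")
    blurred = np.zeros((h, w), dtype=float)
    for dy in range(k):          # accumulate the k*k shifted copies
        for dx in range(k):
            blurred += padded[dy:dy + h, dx:dx + w]
    blurred /= k * k
    mask = np.zeros((h, w), dtype=bool)   # True inside the selected areas
    for y0, y1, x0, x1 in regions:
        mask[y0:y1, x0:x1] = True
    keep_sharp = ~mask if blur_selected else mask
    return np.where(keep_sharp, image.astype(float), blurred)
```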
With the mobile terminal provided by the embodiment of the present invention, N image areas selected by the user on an image are acquired; initial depth data of each of the N image areas is acquired; and blurring processing is performed on the image based on the initial depth data of each of the N image areas, where N is an integer greater than 1. The mobile terminal fuses depth data corresponding to different depth layers into the same depth data, so that multiple areas can be merged into one depth layer. The generated image displays the fused layer clearly, while the depth layers other than the fused layer are blurred; that is, the areas selected by the user are displayed clearly while the other areas are blurred, and any number of targets can be selected when blurring an image.
For the device embodiment, since it is basically similar to the method embodiment, the description is relatively brief; for relevant details, refer to the description of the method embodiment.
The image blurring methods provided herein are not inherently related to any particular computer, virtual system, or other apparatus. Various general purpose systems may also be used with the teachings herein. The structure required to construct a system incorporating aspects of the present invention will be apparent from the description above. Moreover, the present invention is not directed to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the teachings of the present invention as described herein, and any descriptions of specific languages are provided above to disclose the best mode of the invention.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. It will be appreciated by those skilled in the art that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functions of some or all of the components of the image blurring method according to embodiments of the invention. The present invention may also be embodied as apparatus or device programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present invention may be stored on computer-readable media or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In a unit claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not indicate any ordering; these words may be interpreted as names.
The embodiments in the present specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the same or similar parts the embodiments may be referred to one another. Since the system embodiment is basically similar to the method embodiment, its description is relatively brief; for relevant details, refer to the description of the method embodiment.

Claims (6)

1. An image blurring method applied to a mobile terminal is characterized by comprising the following steps:
acquiring N image areas selected by a mobile terminal user on an image;
acquiring initial depth data of each of the N image areas;
blurring the image based on the initial depth data of each of the N image areas;
wherein N is an integer greater than 1; wherein the step of blurring the image based on the initial depth data of each of the N image areas comprises:
judging whether the N image areas are in the same depth layer or not based on the initial depth data of each of the N image areas;
if the N image areas are in the same depth layer, performing blurring processing on the image;
after the step of judging whether the N image areas are in the same depth layer based on the initial depth data of each of the N image areas, the method further includes:
if the N image areas are in different depth layers, setting the initial depth data of each of the N image areas to preset target depth data;
blurring the image based on the depth data of each of the N image areas after the depth data is adjusted;
wherein the step of setting the initial depth data of each of the N image areas to preset target depth data includes:
setting the depth data of all pixel points of each of the N image areas to the same depth value or the same depth range.
2. The method of claim 1, wherein the step of judging whether the N image areas are in the same depth layer based on the initial depth data of each of the N image areas comprises:
judging whether the initial depth data of each of the N image areas is the same depth value or belongs to the same depth range;
if the initial depth data of each of the N image areas is the same depth value or belongs to the same depth range, judging that the N image areas are in the same depth layer.
3. The method of claim 1, wherein the step of blurring the image comprises:
performing blurring processing on the N image areas, or performing blurring processing on all image areas except the N image areas in the image.
4. A mobile terminal, comprising:
the first acquisition module is used for acquiring N image areas selected by a mobile terminal user on an image;
a second obtaining module, configured to obtain initial depth data of each of the N image areas;
a blurring module, configured to perform blurring processing on the image based on the initial depth data of each of the N image areas, where N is an integer greater than 1;
wherein the blurring module comprises:
the judging submodule is used for judging whether the N image areas are in the same depth layer or not based on the initial depth data of each of the N image areas;
the blurring submodule is used for blurring the image if the N image areas are in the same depth layer;
the mobile terminal further includes:
the fusion module is used for setting the initial depth data of each of the N image areas to preset target depth data if the N image areas are in different depth layers;
the image blurring module is used for blurring the image based on the depth data of each of the N image areas after the depth data is adjusted;
the fusion module is specifically configured to set the depth data of all the pixel points in each of the N image regions to the same depth value or the same depth range.
5. The mobile terminal of claim 4, wherein: the judgment sub-module includes:
the judging unit is used for judging whether the initial depth data of each of the N image areas is the same depth value or belongs to the same depth range;
a determining unit, configured to determine that the N image areas are in the same depth layer if the initial depth data of each of the N image areas is the same depth value or belongs to the same depth range.
6. The mobile terminal of claim 4, wherein the blurring module comprises:
and the processing submodule is used for carrying out blurring processing on the N image areas or carrying out blurring processing on all the image areas except the N image areas in the image.
CN201710198188.9A 2017-03-29 2017-03-29 Image blurring method and mobile terminal Active CN106993091B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710198188.9A CN106993091B (en) 2017-03-29 2017-03-29 Image blurring method and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710198188.9A CN106993091B (en) 2017-03-29 2017-03-29 Image blurring method and mobile terminal

Publications (2)

Publication Number Publication Date
CN106993091A CN106993091A (en) 2017-07-28
CN106993091B true CN106993091B (en) 2020-05-12

Family

ID=59412950

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710198188.9A Active CN106993091B (en) 2017-03-29 2017-03-29 Image blurring method and mobile terminal

Country Status (1)

Country Link
CN (1) CN106993091B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108230333B (en) * 2017-11-28 2021-01-26 深圳市商汤科技有限公司 Image processing method, image processing apparatus, computer program, storage medium, and electronic device
CN108234865A (en) * 2017-12-20 2018-06-29 深圳市商汤科技有限公司 Image processing method, device, computer readable storage medium and electronic equipment
CN108335323B (en) * 2018-03-20 2020-12-29 厦门美图之家科技有限公司 Blurring method of image background and mobile terminal
CN110569453B (en) * 2018-03-27 2021-10-15 阿里巴巴(中国)有限公司 Path blurring method and device based on canvas element
CN112889265B (en) * 2018-11-02 2022-12-09 Oppo广东移动通信有限公司 Depth image processing method, depth image processing device and electronic device
CN110062157B (en) * 2019-04-04 2021-09-17 北京字节跳动网络技术有限公司 Method and device for rendering image, electronic equipment and computer readable storage medium
CN112541867A (en) * 2020-12-04 2021-03-23 Oppo(重庆)智能科技有限公司 Image processing method, image processing device, electronic equipment and computer readable storage medium

Citations (5)

Publication number Priority date Publication date Assignee Title
CN104081765A (en) * 2011-09-14 2014-10-01 三星电子株式会社 Image processing apparatus and image processing method thereof
CN105488511A (en) * 2015-11-25 2016-04-13 小米科技有限责任公司 Image identification method and device
EP3035681A1 (en) * 2014-12-15 2016-06-22 Sony Computer Entertainment Europe Ltd. Image processing method and apparatus
CN105933589A (en) * 2016-06-28 2016-09-07 广东欧珀移动通信有限公司 Image processing method and terminal
CN106060423A (en) * 2016-06-02 2016-10-26 广东欧珀移动通信有限公司 Bokeh photograph generation method and device, and mobile terminal

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
JP5397190B2 (en) * 2009-11-27 2014-01-22 ソニー株式会社 Image processing apparatus, image processing method, and program


Also Published As

Publication number Publication date
CN106993091A (en) 2017-07-28

Similar Documents

Publication Publication Date Title
CN106993091B (en) Image blurring method and mobile terminal
CN107678644B (en) Image processing method and mobile terminal
CN107509030B (en) focusing method and mobile terminal
CN105975152B (en) Fingerprint-based application program calling method and mobile terminal
CN107613203B (en) Image processing method and mobile terminal
CN106354373B (en) Icon moving method and mobile terminal
CN106095250B (en) Application icon layout method and mobile terminal
CN107506130B (en) Character deleting method and mobile terminal
CN107659722B (en) Image selection method and mobile terminal
CN107562345B (en) Information storage method and mobile terminal
CN106339436B (en) Picture-based shopping method and mobile terminal
CN108366169B (en) Notification message processing method and mobile terminal
CN106780314B (en) Jigsaw previewing method and mobile terminal
CN106372893B (en) Code scanning interface switching method and device
CN109218819B (en) Video preview method and mobile terminal
CN107221347B (en) Audio playing method and terminal
CN107479818B (en) Information interaction method and mobile terminal
CN106168894B (en) Content display method and mobile terminal
CN107592458B (en) Shooting method and mobile terminal
CN106775378B (en) Method for determining candidate words of input method and mobile terminal
CN107547738B (en) Prompting method and mobile terminal
CN106341530B (en) Payment page switching method and mobile terminal
CN107678813B (en) Desktop icon theme switching method and mobile terminal
CN106991150B (en) Webpage data display method and mobile terminal
CN114779977A (en) Interface display method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant