CN109785226B - Image processing method and device and terminal equipment - Google Patents


Info

Publication number
CN109785226B
Authority
CN
China
Prior art keywords
depth information
pixel point
target image
image
filter
Prior art date
Legal status
Active
Application number
CN201811627079.5A
Other languages
Chinese (zh)
Other versions
CN109785226A (en)
Inventor
谢晋 (Xie Jin)
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201811627079.5A priority Critical patent/CN109785226B/en
Publication of CN109785226A publication Critical patent/CN109785226A/en
Application granted granted Critical
Publication of CN109785226B publication Critical patent/CN109785226B/en


Landscapes

  • Image Processing (AREA)

Abstract

The invention provides an image processing method, an image processing apparatus, and a terminal device, relating to the technical field of image processing. The method comprises the following steps: acquiring depth information of a target image; determining filter parameters according to the depth information; and performing filter processing on the target image according to the filter parameters. In the embodiment of the invention, for pixel points with different depth information in the target image, the filter parameter corresponding to any pixel point can be determined according to the depth information of that pixel point, and the pixel point can then be filtered according to that parameter. Stereoscopic distortion during filter processing is thereby avoided, so that the filtered image retains a stereoscopic impression while keeping the filter effect.

Description

Image processing method and device and terminal equipment
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image processing method, an image processing device, and a terminal device.
Background
Nowadays, various image processing applications provide personalized image processing functions. Users can employ these functions to perform operations such as image stitching, image warping, and filter superposition, meeting personalized needs such as creation and sharing.
Filter processing is a common image processing mode: superimposing different filter styles makes an image display different effects. For example, superimposing a grey-scale filter gives the image a grey-scale display effect, and superimposing a cartoon filter gives it the effect of a hand-drawn cartoon.
However, in practical applications, after an image undergoes filter processing it is often distorted to some extent, so that the original stereoscopic impression of the image is reduced.
Disclosure of Invention
The invention provides an image processing method, an image processing apparatus, and a terminal device, to solve the problem that an image is distorted to a certain extent after filter processing, reducing its original stereoscopic impression.
In order to solve the technical problems, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides an image processing method, applied to a terminal device, including:
acquiring depth information of a target image;
determining filter parameters according to the depth information;
and carrying out filter processing on the target image according to the filter parameters.
In a second aspect, an embodiment of the present invention further provides an image processing apparatus, including:
the acquisition module is used for acquiring depth information of the target image;
the determining module is used for determining filter parameters according to the depth information;
and the processing module is used for carrying out filter processing on the target image according to the filter parameters.
In a third aspect, an embodiment of the present invention further provides a terminal device, including a processor, a memory, and a computer program stored in the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the image processing method according to the present invention.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the image processing method according to the present invention.
In the embodiment of the invention, the terminal device may first acquire the depth information of the target image, then determine filter parameters according to the depth information, and further perform filter processing on the target image according to those parameters. For pixel points with different depth information in the target image, the filter parameter corresponding to any pixel point can be determined according to the depth information of that pixel point, and the pixel point can then be filtered according to that parameter. Stereoscopic distortion during filter processing is thereby avoided, so that the filtered image retains a stereoscopic impression while keeping the filter effect.
Drawings
Fig. 1 is a flowchart showing an image processing method in the first embodiment of the present invention;
fig. 2 is a flowchart showing an image processing method in the second embodiment of the present invention;
fig. 3 is a block diagram showing the structure of an image processing apparatus in a third embodiment of the present invention;
fig. 4 is a block diagram showing the structure of another image processing apparatus in the third embodiment of the present invention;
fig. 5 is a schematic diagram showing a hardware structure of a terminal device in various embodiments of the present invention.
Detailed Description
The following description of the embodiments of the present invention is made clearly and completely with reference to the accompanying drawings. It is evident that the described embodiments are some, but not all, embodiments of the invention. All other embodiments obtained by those skilled in the art based on the embodiments of the invention without inventive effort fall within the scope of the invention.
Example 1
Referring to fig. 1, a flowchart of an image processing method according to a first embodiment of the present invention is shown; the method may specifically include the following steps:
step 101, depth information of a target image is acquired.
In an embodiment of the present invention, the target image is a depth image having depth information. In practical application, the terminal device may itself capture the target image and its depth information. Alternatively, an imaging device other than the terminal device may capture the target image and its depth information and, when connected to the terminal device, send both to it, so that the terminal device obtains the target image and its depth information.
And 102, determining filter parameters according to the depth information.
In the embodiment of the invention, after the terminal device acquires the target image and its depth information, it can determine, for any pixel point in the target image, the filter parameter corresponding to that pixel point according to the pixel point's depth information. That is, the parameters used to filter pixel points with different depth information differ.
And 103, performing filter processing on the target image according to the filter parameters.
In the embodiment of the invention, for any pixel point in the target image, the terminal device can multiply the filter parameter corresponding to the pixel point by the pixel value of the pixel point to obtain a new pixel value, and then replace the original pixel value with the new one. Prior-art filter processing applies the same filter to all pixel points in an image indiscriminately, so pixel points in one depth range suffer little stereoscopic distortion while those in another depth range suffer much. By instead superimposing, on pixel points with different depth information, filter parameters that correspond to that depth information, the filtered image keeps the filter effect while retaining a stereoscopic impression.
In the embodiment of the invention, the terminal equipment can firstly acquire the depth information of the target image, then can determine the filter parameters according to the depth information, and can further carry out filter processing on the target image according to the filter parameters. In the embodiment of the invention, for the pixel points with different depth information in the target image, the filter parameters corresponding to any pixel point can be determined according to the depth information of the pixel point, and further the filter processing is carried out on the pixel point according to the filter parameters, so that the stereoscopic distortion of the image in the filter processing process can be avoided, and the image after the filter processing can have stereoscopic impression while the filter effect is maintained.
Example two
Referring to fig. 2, a flowchart of an image processing method according to a second embodiment of the present invention is shown; the method may specifically include the following steps:
in step 201, a target image is acquired.
In the embodiment of the invention, the terminal device may have a built-in or externally connected image depth detection apparatus that combines image capture with depth detection: it can photograph a scene to obtain a target image and collect the depth information of that image. In one implementation, the terminal device acquires the target image through the built-in or external image depth detection apparatus. In another implementation, the terminal device may acquire the target image through an ordinary camera.
Step 202, acquiring depth information of a target image through an image depth detection device.
In the embodiment of the invention, the terminal device can acquire the depth information of the target image through the built-in or external image depth detection apparatus. In practice, the image depth detection apparatus may be a binocular stereoscopic vision device, a time-of-flight (TOF) device, or a structured light device.
A binocular stereoscopic vision device simulates the way two eyes observe an object, determining the depth of the photographed object from the parallax between the left and right views. It may comprise two depth cameras at fixed positions and spacing, which stand in for the left and right eyes; the depth information of each object in the target image is then calculated from the disparity between the two cameras together with their positions and spacing. For the detailed procedure by which an image depth detection apparatus acquires depth information, reference may be made to the related art, which is not repeated here.
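The patent defers the binocular calculation to the related art. For context, the standard stereo triangulation relation — depth = focal length × baseline / disparity — can be sketched as follows; this formula is not given in the patent, and all names are illustrative:

```python
# Standard stereo triangulation, included for context only -- the patent
# itself does not give this formula. All names are illustrative.

def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth (in metres) of a point seen by two parallel cameras.

    focal_px     -- focal length expressed in pixels
    baseline_m   -- distance between the two camera centres, in metres
    disparity_px -- horizontal shift of the point between the two views
    """
    return focal_px * baseline_m / disparity_px

# A point with 40 px disparity, seen by 800 px focal-length cameras 10 cm apart,
# lies about 2 m away.
depth_m = stereo_depth(800.0, 0.10, 40.0)
```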
A time-of-flight device may comprise an optical pulse generator and a sensor. The generator sends light pulses towards each object in the scene; the sensor receives the pulses reflected from each object, and the time of flight, i.e. the round-trip time of the pulse, is measured. Multiplying half the time of flight by the speed of light gives the depth information of each object in the target image.
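The round-trip computation above can be sketched directly; this is a minimal illustration, and the function and variable names are not from the patent:

```python
# Time-of-flight depth: half the round-trip time of a light pulse,
# multiplied by the speed of light. Names are illustrative.

SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_depth(round_trip_s: float) -> float:
    """Depth in metres from the measured round-trip time of a light pulse."""
    return (round_trip_s / 2.0) * SPEED_OF_LIGHT_M_S

# A pulse that returns after 20 ns corresponds to a depth of about 3 m.
depth_m = tof_depth(20e-9)
```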
A structured light device may comprise an infrared laser projector and an infrared camera. The projector emits infrared laser light, which after passing through a cylindrical lens converges into a very narrow band of light known as structured light. The projector casts this light, which has known structural characteristics, onto the photographed object, and the infrared camera collects the reflection. Because regions of the object at different depths impose different degrees of phase modulation on the reflected light, the infrared camera captures different image phase information, and the phase change can be converted into depth information.
It should be noted that the embodiments of the present invention do not limit the specific internal components of the binocular stereoscopic vision device, the time-of-flight device, or the structured light device, nor their specific calculation processes for obtaining depth information; for these, reference may be made to the related art.
In addition, the embodiments of the present invention do not limit the order in which the target image and its depth information are acquired: in practical application the target image may be acquired first and the depth information afterwards, or both may be acquired simultaneously.
And 203, determining filter parameters according to the depth information.
In the embodiment of the present invention, this step may specifically include: determining the depth information minimum of the target image from the depth information of each pixel point in the target image; subtracting that minimum from the depth information of each pixel point to obtain the relative depth information of each pixel point; summing the relative depth information over all pixel points to obtain the relative-depth sum of the target image; dividing that sum by the total number of pixels of the target image to obtain the average relative depth information of the target image; determining the depth information maximum of the target image from the depth information of each pixel point; determining the target parameter of each pixel point from its relative depth information, the average relative depth information, and the depth information minimum and maximum through formula (1) below; and multiplying the target parameter of each pixel point by at least one preset processing parameter to obtain the filter parameter of each pixel point.
T(x,y)=[d(x,y)+max_D(x,y)-min_D(x,y)]/d’(x,y) (1)
Wherein T(x, y) is the target parameter corresponding to the pixel point at display position (x, y), d(x, y) is the relative depth information corresponding to that pixel point, max_D(x, y) is the maximum value of the depth information, min_D(x, y) is the minimum value of the depth information, and d'(x, y) is the average relative depth information.
For each pixel point in the target image, the terminal device may first determine the relative depth information d(x, y) of that pixel point's depth information D(x, y) with respect to the depth information minimum min_D(x, y) of the entire target image; that is, d(x, y) satisfies the following formula (2).
d(x,y)=D(x,y)-min_D(x,y) (2)
The terminal device can then determine the average relative depth information d'(x, y) of the entire target image through the following formula (3), where w is the width of the target image, that is, the total number of pixels in the horizontal direction, h is the height of the target image, that is, the total number of pixels in the vertical direction, and the sum runs over all w*h pixel points of the target image.

d'(x,y)=[Σd(x,y)]/(w*h) (3)
The terminal device may then determine the depth information maximum max_D(x, y) of the entire target image, and further determine the target parameter T(x, y) of each pixel point through formula (1) above. It can then multiply the target parameter of each pixel point by at least one preset processing parameter to obtain that pixel point's filter parameter; that is, the parameters used to filter pixel points with different depth information differ. The preset processing parameters may include the constant parameters of at least one conventional filter formula in current use, for example the constants of a grey-scale filter formula, of a nostalgic filter formula, or of a cartoon filter formula.
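Formulas (1)-(3) can be sketched end to end as follows. This is a minimal illustration, assuming a rectangular depth map whose depths are not all equal (so that the average relative depth d'(x, y) is non-zero); all names are hypothetical:

```python
# Sketch of the target-parameter computation of step 203, following
# formulas (1)-(3). depth_map is a list of rows of per-pixel depths D(x, y).
# Assumes the depths are not all equal, so the average relative depth is non-zero.

def target_parameters(depth_map):
    """Return T(x, y) for every pixel point, per formula (1)."""
    depths = [D for row in depth_map for D in row]
    min_D = min(depths)                      # depth information minimum
    max_D = max(depths)                      # depth information maximum
    rel = [D - min_D for D in depths]        # formula (2): d = D - min_D
    d_avg = sum(rel) / len(rel)              # formula (3): sum of d over w*h pixels
    # formula (1): T = (d + max_D - min_D) / d'
    return [[((D - min_D) + max_D - min_D) / d_avg for D in row]
            for row in depth_map]

# Example: a 2x2 depth map in millimetres.
T = target_parameters([[1000, 1500], [2000, 2500]])
```

Note that nearer pixel points (smaller d) receive smaller target parameters than farther ones, which is how the per-pixel scaling preserves the depth gradient through the filter.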
It should be noted that the step of determining the depth information maximum of the target image need only be performed after the target image is obtained and before the target parameter of each pixel point is determined through the formula T(x, y) = [d(x, y) + max_D(x, y) - min_D(x, y)]/d'(x, y). For example, it may be performed after the depth information minimum is determined, or after the average relative depth information is obtained.
Step 204, multiplying the first pixel value of each pixel by the filter parameter corresponding to each pixel to obtain the second pixel value of each pixel.
In the embodiment of the invention, after determining the filter parameters corresponding to each pixel point, the terminal device may multiply the first pixel value of each pixel point by the filter parameters corresponding to each pixel point, so as to obtain the second pixel value of each pixel point.
Specifically, when the preset processing parameter includes a constant parameter in the gray filter formula, the terminal device may convert the first pixel value (R0, G0, B0) of each pixel point in the target image into the second pixel value (R1, G1, B1) by the following formulas (4), (5) and (6).
R1=T(x,y)(0.3*R0+0.59*G0+0.11*B0) (4)
G1=T(x,y)(0.3*R0+0.59*G0+0.11*B0) (5)
B1=T(x,y)(0.3*R0+0.59*G0+0.11*B0) (6)
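Formulas (4)-(6) amount to the standard luminance-weighted grey conversion scaled per pixel by T(x, y); a minimal sketch, with a hypothetical function name:

```python
# Depth-aware grey filter, per formulas (4)-(6): all three output channels
# take the luminance 0.3*R0 + 0.59*G0 + 0.11*B0, scaled by T(x, y).

def depth_aware_grey(r0: float, g0: float, b0: float, t: float):
    grey = 0.3 * r0 + 0.59 * g0 + 0.11 * b0
    return (t * grey, t * grey, t * grey)

# With T(x, y) = 1 this reduces to the ordinary grey filter.
r1, g1, b1 = depth_aware_grey(120, 200, 80, 1.0)
```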
When the preset processing parameters include constant parameters in the nostalgic filter formula, the terminal device may convert the first pixel value (R0, G0, B0) of each pixel point in the target image into the second pixel value (R1, G1, B1) by the following formulas (7), (8) and (9).
R1=T(x,y)(0.393*R0+0.769*G0+0.189*B0) (7)
G1=T(x,y)(0.349*R0+0.686*G0+0.168*B0) (8)
B1=T(x,y)(0.272*R0+0.534*G0+0.131*B0) (9)
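Formulas (7)-(9) are the familiar sepia-tone channel mix scaled by T(x, y); sketched below with a hypothetical function name:

```python
# Depth-aware nostalgic (sepia) filter, per formulas (7)-(9).

def depth_aware_sepia(r0: float, g0: float, b0: float, t: float):
    r1 = t * (0.393 * r0 + 0.769 * g0 + 0.189 * b0)
    g1 = t * (0.349 * r0 + 0.686 * g0 + 0.168 * b0)
    b1 = t * (0.272 * r0 + 0.534 * g0 + 0.131 * b0)
    return r1, g1, b1

# Channel values can exceed 255 after scaling; a real implementation would
# clamp them, which the patent leaves implicit.
r1, g1, b1 = depth_aware_sepia(100, 100, 100, 1.0)
```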
When the preset processing parameters include constant parameters in the cartoon filter formula, the terminal device may convert the first pixel value (R0, G0, B0) of each pixel point in the target image into the second pixel value (R1, G1, B1) by the following formulas (10), (11) and (12).
R1=T(x,y)(|2*G0-B0+R0|*R0/256) (10)
G1=T(x,y)(|2*B0-G0+R0|*R0/256) (11)
B1=T(x,y)(|2*B0-G0+R0|*G0/256) (12)
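Formulas (10)-(12) likewise scale a channel-mixing cartoon transform by T(x, y); sketched below, reading the formulas' channel symbols as the first pixel value (R0, G0, B0), with a hypothetical function name:

```python
# Depth-aware cartoon filter, per formulas (10)-(12), with the channel
# symbols taken to mean the input channels R0, G0, B0.

def depth_aware_cartoon(r0: float, g0: float, b0: float, t: float):
    r1 = t * (abs(2 * g0 - b0 + r0) * r0 / 256)
    g1 = t * (abs(2 * b0 - g0 + r0) * r0 / 256)
    b1 = t * (abs(2 * b0 - g0 + r0) * g0 / 256)
    return r1, g1, b1

r1, g1, b1 = depth_aware_cartoon(100, 100, 100, 1.0)
```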
It should be noted that, in specific applications, the preset processing parameters superimposed with the target parameter are not limited to the constants of the above three conventional filter formulas; the embodiments of the present invention do not specifically limit the preset processing parameters.
Step 205, replace the first pixel value of each pixel point with the second pixel value of each pixel point.
In the embodiment of the invention, the terminal device can replace the first pixel value of each pixel point with its second pixel value, thereby completing the filter processing of the target image. Prior-art filter processing applies the same filter to all pixel points in an image indiscriminately, so pixel points in one depth range suffer little stereoscopic distortion while those in another depth range suffer much. By superimposing, on pixel points with different depth information, filter parameters that correspond to that depth information, the filtered image keeps the filter effect while retaining a stereoscopic impression.
In addition, it should be noted that in practical application the target image may be an original image not yet subjected to any filter processing. The target parameter T(x, y) then forms part of the filter processing, and the terminal device filters the original image with filter parameters that include T(x, y), so that the processed image keeps the filter effect while having a stereoscopic impression. Alternatively, the target image may be an image obtained after applying at least one conventional filter to the original image; the target parameter T(x, y) can then serve on its own as a filter for restoring the stereoscopic impression of the image, so that after conventional filter processing the terminal device can apply T(x, y) as a filter that restores the image's stereoscopic impression.
In the embodiment of the invention, the terminal device may first acquire the depth information of the target image, then determine the filter parameter corresponding to each pixel point according to the depth information and the formula T(x, y) = [d(x, y) + max_D(x, y) - min_D(x, y)]/d'(x, y), and further filter the target image according to the filter parameter of each pixel point. For pixel points with different depth information in the target image, the filter parameter corresponding to any pixel point can be determined according to the depth information of that pixel point, and the pixel point can then be filtered according to that parameter. Stereoscopic distortion during filter processing is thereby avoided, so that the filtered image retains a stereoscopic impression while keeping the filter effect.
Example III
Referring to fig. 3, a block diagram of an image processing apparatus 300 according to a third embodiment of the present invention is shown; the apparatus may specifically include:
an acquisition module 301, configured to acquire depth information of a target image;
a determining module 302, configured to determine filter parameters according to the depth information;
and the processing module 303 is used for performing filter processing on the target image according to the filter parameters.
Optionally, referring to fig. 4, the determining module 302 includes:
a first determining submodule 3021, configured to determine a minimum value of depth information of the target image from depth information corresponding to each pixel point in the target image;
a first operation submodule 3022, configured to subtract the minimum value of the depth information from the depth information corresponding to each pixel point, to obtain the relative depth information corresponding to each pixel point;
a second operation submodule 3023, configured to add the relative depth information corresponding to each pixel point to obtain a sum of the relative depth information of the target image;
a third operation submodule 3024, configured to divide the sum of the relative depth information by the total number of pixels of the target image to obtain the average relative depth information of the target image;
a second determining submodule 3025, configured to determine, according to the relative depth information and the average relative depth information corresponding to each pixel point, and the depth information minimum and maximum, the target parameter corresponding to each pixel point through the formula T(x, y) = [d(x, y) + max_D(x, y) - min_D(x, y)]/d'(x, y);
a fourth operation submodule 3026, configured to multiply the target parameter corresponding to each pixel point by at least one preset processing parameter to obtain a filter parameter corresponding to each pixel point;
wherein T(x, y) is the target parameter corresponding to the pixel point at display position (x, y), d(x, y) is the relative depth information corresponding to that pixel point, max_D(x, y) is the maximum value of the depth information, min_D(x, y) is the minimum value of the depth information, and d'(x, y) is the average relative depth information.
Optionally, referring to fig. 4, the determining module 302 further includes:
a third determining submodule 3027, configured to determine a maximum value of depth information of the target image from the depth information corresponding to each pixel point in the target image.
Optionally, referring to fig. 4, the processing module 303 includes:
a fifth operation submodule 3031, configured to multiply the first pixel value of each pixel point by the filter parameter corresponding to each pixel point to obtain a second pixel value of each pixel point;
and the replacing submodule 3032 is used for replacing the first pixel value of each pixel point with the second pixel value of each pixel point.
Optionally, referring to fig. 4, the acquiring module 301 includes:
a first acquisition submodule 3011, configured to acquire a target image;
the second acquisition submodule 3012 is used for acquiring depth information of the target image through the image depth detection device;
wherein the image depth detection device comprises a binocular stereoscopic vision device, a time-of-flight device, or a structured light device.
The image processing apparatus provided in the embodiment of the present invention can implement each process implemented by the terminal device in the method embodiments of fig. 1 and fig. 2; to avoid repetition, the description is not repeated here.
In the embodiment of the invention, the image processing device can acquire the depth information of the target image through the acquisition module, can determine the filter parameters according to the depth information through the determination module, and can perform filter processing on the target image according to the filter parameters through the processing module. In the embodiment of the invention, for the pixel points with different depth information in the target image, the filter parameters corresponding to any pixel point can be determined according to the depth information of the pixel point, and further the filter processing is carried out on the pixel point according to the filter parameters, so that the stereoscopic distortion of the image in the filter processing process can be avoided, and the image after the filter processing can have stereoscopic impression while the filter effect is maintained.
Example IV
Figure 5 is a schematic diagram of a hardware architecture of a terminal device implementing various embodiments of the present invention.
the terminal device 500 includes, but is not limited to: radio frequency unit 501, network module 502, audio output unit 503, input unit 504, sensor 505, display unit 506, user input unit 507, interface unit 508, memory 509, processor 510, and power source 511. It will be appreciated by those skilled in the art that the terminal device structure shown in fig. 5 does not constitute a limitation of the terminal device, and the terminal device may comprise more or less components than shown, or may combine certain components, or may have a different arrangement of components. In the embodiment of the invention, the terminal equipment comprises, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer and the like.
The processor 510 is configured to obtain depth information of a target image; determining filter parameters according to the depth information; and carrying out filter processing on the target image according to the filter parameters.
In the embodiment of the invention, the terminal equipment can firstly acquire the depth information of the target image, then can determine the filter parameters according to the depth information, and can further carry out filter processing on the target image according to the filter parameters. In the embodiment of the invention, for the pixel points with different depth information in the target image, the filter parameters corresponding to any pixel point can be determined according to the depth information of the pixel point, and further the filter processing is carried out on the pixel point according to the filter parameters, so that the stereoscopic distortion of the image in the filter processing process can be avoided, and the image after the filter processing can have stereoscopic impression while the filter effect is maintained.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 501 may be used to receive and send signals during information transmission or a call: specifically, it receives downlink data from a base station and delivers it to the processor 510 for processing, and it transmits uplink data to the base station. Typically, the radio frequency unit 501 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, and a duplexer. In addition, the radio frequency unit 501 may communicate with networks and other devices through a wireless communication system.
The terminal device provides wireless broadband internet access to the user through the network module 502, such as helping the user to send and receive e-mail, browse web pages, access streaming media, etc.
The audio output unit 503 may convert audio data received by the radio frequency unit 501 or the network module 502, or stored in the memory 509, into an audio signal and output it as sound. The audio output unit 503 may also provide audio output related to a specific function performed by the terminal device 500 (e.g., a call signal reception sound or a message reception sound). The audio output unit 503 includes a speaker, a buzzer, a receiver, and the like.
The input unit 504 is used for receiving an audio or video signal. The input unit 504 may include a graphics processor (Graphics Processing Unit, GPU) 5041 and a microphone 5042. The graphics processor 5041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 506. The image frames processed by the graphics processor 5041 may be stored in the memory 509 (or other storage medium) or transmitted via the radio frequency unit 501 or the network module 502. The microphone 5042 may receive sound and process it into audio data. In a phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 501, and then output.
The terminal device 500 further includes at least one sensor 505, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 5061 according to the brightness of ambient light, and the proximity sensor can turn off the display panel 5061 and/or the backlight when the terminal device 500 is moved to the ear. As one kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in all directions (generally three axes), and can detect the magnitude and direction of gravity when stationary; it can be used for recognizing the posture of the terminal device (such as horizontal/vertical screen switching, related games, and magnetometer posture calibration) and for vibration-recognition-related functions (such as a pedometer or tapping). The sensor 505 may further include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described herein.
The display unit 506 is used to display information input by a user or information provided to the user. The display unit 506 may include a display panel 5061, and the display panel 5061 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 507 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 507 includes a touch panel 5071 and other input devices 5072. The touch panel 5071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations of the user on or near the touch panel 5071 using any suitable object or accessory, such as a finger or a stylus). The touch panel 5071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 510, and receives and executes commands sent by the processor 510. In addition, the touch panel 5071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 5071, the user input unit 507 may include other input devices 5072. In particular, the other input devices 5072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys and a switch key), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 5071 may be overlaid on the display panel 5061, and when the touch panel 5071 detects a touch operation thereon or thereabout, the touch operation is transmitted to the processor 510 to determine a type of touch event, and then the processor 510 provides a corresponding visual output on the display panel 5061 according to the type of touch event. Although in fig. 5, the touch panel 5071 and the display panel 5061 are two independent components to implement the input and output functions of the terminal device, in some embodiments, the touch panel 5071 and the display panel 5061 may be integrated to implement the input and output functions of the terminal device, which is not limited herein.
The interface unit 508 is an interface for connecting an external device to the terminal apparatus 500. For example, the external devices may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 508 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal apparatus 500 or may be used to transmit data between the terminal apparatus 500 and an external device.
The memory 509 may be used to store software programs and various data. The memory 509 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the terminal device (such as audio data or a phonebook), and the like. In addition, the memory 509 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 510 is a control center of the terminal device, connects various parts of the entire terminal device using various interfaces and lines, and performs various functions of the terminal device and processes data by running or executing software programs and/or modules stored in the memory 509 and calling data stored in the memory 509, thereby performing overall monitoring of the terminal device. Processor 510 may include one or more processing units; preferably, the processor 510 may integrate an application processor that primarily handles operating systems, user interfaces, applications, etc., with a modem processor that primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 510.
The terminal device 500 may further include a power source 511 (e.g., a battery) for powering the various components, and preferably the power source 511 may be logically coupled to the processor 510 via a power management system that performs functions such as managing charging, discharging, and power consumption.
In addition, the terminal device 500 includes some functional modules, which are not shown, and will not be described herein.
Preferably, the embodiment of the present invention further provides a terminal device, including a processor 510, a memory 509, and a computer program stored in the memory 509 and executable on the processor 510. When executed by the processor 510, the computer program implements each process of the above embodiment of the image processing method and can achieve the same technical effects; to avoid repetition, details are not repeated here.
The embodiment of the present invention also provides a computer-readable storage medium on which a computer program is stored. When executed by a processor, the computer program implements each process of the above embodiment of the image processing method and can achieve the same technical effects; to avoid repetition, details are not repeated here. The computer-readable storage medium may be, for example, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general hardware platform, or of course by hardware, although in many cases the former is the preferred implementation. Based on such understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (e.g., ROM/RAM, a magnetic disk, or an optical disk) and comprising instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the method according to the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the above-described embodiments, which are merely illustrative and not restrictive. Those of ordinary skill in the art may derive many other forms without departing from the spirit of the present invention and the scope of the claims, all of which fall within the protection of the present invention.

Claims (8)

1. An image processing method applied to a terminal device, the method comprising:
acquiring depth information of a target image;
determining filter parameters according to the depth information;
performing filter processing on the target image according to the filter parameters;
wherein, the step of determining the filter parameters according to the depth information includes:
determining the minimum value of the depth information of the target image from the depth information corresponding to each pixel point in the target image;
subtracting the minimum value of the depth information from the depth information corresponding to each pixel point to obtain the relative depth information corresponding to each pixel point;
adding the relative depth information corresponding to each pixel point to obtain the sum of the relative depth information of the target image;
dividing the sum of the relative depth information by the total number of pixels of the target image to obtain average relative depth information of the target image;
determining a target parameter corresponding to each pixel point according to the relative depth information and the average relative depth information corresponding to each pixel point, and the minimum value and the maximum value of the depth information through a formula T (x, y) = [ D (x, y) +max_D (x, y) -min_D (x, y) ]/D' (x, y);
multiplying the target parameter corresponding to each pixel point by at least one preset processing parameter to obtain a filter parameter corresponding to each pixel point;
wherein T (x, y) is the target parameter corresponding to the pixel point at display position (x, y), D (x, y) is the relative depth information corresponding to the pixel point at display position (x, y), max_D (x, y) is the maximum value of the depth information, min_D (x, y) is the minimum value of the depth information, and D' (x, y) is the average relative depth information.
2. The method according to claim 1, wherein the step of determining the target parameter corresponding to each pixel point, according to the relative depth information and the average relative depth information corresponding to each pixel point and the minimum value and the maximum value of the depth information, through the formula T (x, y) = [ D (x, y) + max_D (x, y) - min_D (x, y) ] / D' (x, y), further comprises:
and determining the maximum value of the depth information of the target image from the depth information corresponding to each pixel point in the target image.
3. The method of claim 1, wherein the step of performing filter processing on the target image based on the filter parameters comprises:
multiplying the first pixel value of each pixel point by the filter parameter corresponding to each pixel point to obtain a second pixel value of each pixel point;
and replacing the first pixel value of each pixel point with the second pixel value of each pixel point.
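A minimal sketch of this multiplication-and-replacement step follows; it is illustrative only and not part of the claim. The use of NumPy, 8-bit pixel values, and clipping to the [0, 255] range are assumptions not stated in the claim.

```python
import numpy as np

def apply_filter_params(image, filter_params):
    """Illustrative sketch of the claimed filter processing:
    second pixel value = first pixel value * per-pixel filter parameter,
    then the first value is replaced by the second (clipped to 8-bit range,
    an assumed detail)."""
    second = image.astype(np.float64) * filter_params  # second pixel value per pixel
    return np.clip(second, 0, 255).astype(np.uint8)    # replace, keeping valid 8-bit values
```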
4. The method according to claim 1, wherein the step of acquiring depth information of the target image comprises:
collecting a target image;
acquiring depth information of the target image through an image depth detection device;
Wherein the image depth detection device comprises a binocular stereoscopic vision device, a time-of-flight device, or a structured light device.
5. An image processing apparatus, characterized in that the apparatus comprises:
the acquisition module is used for acquiring depth information of the target image;
the determining module is used for determining filter parameters according to the depth information;
the processing module is used for carrying out filter processing on the target image according to the filter parameters;
wherein the determining module comprises:
the first determining submodule is used for determining the minimum value of the depth information of the target image from the depth information corresponding to each pixel point in the target image;
the first operation submodule is used for subtracting the minimum value of the depth information from the depth information corresponding to each pixel point to obtain the relative depth information corresponding to each pixel point;
The second operation submodule is used for adding the relative depth information corresponding to each pixel point to obtain the sum of the relative depth information of the target image;
the third operation submodule is used for dividing the sum of the relative depth information by the total number of pixels of the target image to obtain the average relative depth information of the target image;
the second determining submodule is used for determining the target parameter corresponding to each pixel point, according to the relative depth information and the average relative depth information corresponding to each pixel point and the minimum value and the maximum value of the depth information, through the formula T (x, y) = [ D (x, y) + max_D (x, y) - min_D (x, y) ] / D' (x, y);
a fourth operation submodule, configured to multiply the target parameter corresponding to each pixel point by at least one preset processing parameter to obtain a filter parameter corresponding to each pixel point;
wherein T (x, y) is the target parameter corresponding to the pixel point at display position (x, y), D (x, y) is the relative depth information corresponding to the pixel point at display position (x, y), max_D (x, y) is the maximum value of the depth information, min_D (x, y) is the minimum value of the depth information, and D' (x, y) is the average relative depth information.
6. The apparatus of claim 5, wherein the means for determining further comprises:
and the third determining submodule is used for determining the maximum value of the depth information of the target image from the depth information corresponding to each pixel point in the target image.
7. A terminal device comprising a processor, a memory and a computer program stored on the memory and executable on the processor, which when executed by the processor implements the steps of the image processing method according to any one of claims 1 to 4.
8. A computer-readable storage medium, on which a computer program is stored, which computer program, when being executed by a processor, implements the steps of the image processing method according to any one of claims 1 to 4.
CN201811627079.5A 2018-12-28 2018-12-28 Image processing method and device and terminal equipment Active CN109785226B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811627079.5A CN109785226B (en) 2018-12-28 2018-12-28 Image processing method and device and terminal equipment

Publications (2)

Publication Number Publication Date
CN109785226A CN109785226A (en) 2019-05-21
CN109785226B true CN109785226B (en) 2023-11-17

Family

ID=66498802

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811627079.5A Active CN109785226B (en) 2018-12-28 2018-12-28 Image processing method and device and terminal equipment

Country Status (1)

Country Link
CN (1) CN109785226B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113902786B (en) * 2021-09-23 2022-05-27 珠海视熙科技有限公司 Depth image preprocessing method, system and related device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105825494A (en) * 2015-08-31 2016-08-03 维沃移动通信有限公司 Image processing method and mobile terminal
WO2017016030A1 (en) * 2015-07-30 2017-02-02 宇龙计算机通信科技(深圳)有限公司 Image processing method and terminal
WO2017067523A1 (en) * 2015-10-22 2017-04-27 努比亚技术有限公司 Image processing method, device and mobile terminal
CN107580209A (en) * 2017-10-24 2018-01-12 维沃移动通信有限公司 Take pictures imaging method and the device of a kind of mobile terminal
CN107948530A (en) * 2017-12-28 2018-04-20 努比亚技术有限公司 A kind of image processing method, terminal and computer-readable recording medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
FPGA-based embedded color image detection system; Lu Zhou et al.; Chinese Journal of Sensors and Actuators (传感技术学报); 30 Mar. 2007 (No. 03); full text *

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant