CN110933313B - Dark light photographing method and related equipment - Google Patents

Dark light photographing method and related equipment

Info

Publication number
CN110933313B
Authority
CN
China
Prior art keywords
management module
module
image
algorithm
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911253797.5A
Other languages
Chinese (zh)
Other versions
CN110933313A (en)
Inventor
陈岩
方攀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201911253797.5A
Publication of CN110933313A
Application granted
Publication of CN110933313B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/617 Upgrading or updating of programs or applications for camera control
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/64 Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image

Abstract

The application discloses a dim light photographing method and related equipment, which are applied to electronic equipment, wherein the electronic equipment comprises a media service module and an android system, and an application layer of the android system is provided with a third-party application; the method comprises the following steps: when the current photographing environment of the electronic equipment is a dark light environment, the third-party application sends a data request to a hardware abstraction layer of the android system; the hardware abstraction layer receives the data request, obtains original application data to be processed, and calls an image enhancement algorithm to process the original application data to obtain target application data, wherein the image enhancement algorithm is an algorithm that the third-party application has requested the android system, in advance and through the media service module, to open to the third-party application; the hardware abstraction layer sends the target application data to the third-party application. By adopting the embodiment of the application, the third-party application can directly use the image enhancement algorithm provided by the system to process the original application data in a dark environment.

Description

Dark light photographing method and related equipment
Technical Field
The present application relates to the field of electronic technologies, and in particular, to a dim light photographing method and related devices.
Background
Photographing has become an indispensable function of electronic equipment and an arena in which electronics manufacturers compete. On the Android platform, a third-party application can currently only access the bottom layer through an Application Programming Interface (API); it cannot call an algorithm inside the system to process application data, so it can only passively receive the application data transmitted by the bottom layer. In a dim light environment, how to call an algorithm inside the system to process the original application data is a problem to be solved.
Disclosure of Invention
The embodiment of the application provides a dim light photographing method and related equipment, which are beneficial to processing original application data by a third party application directly using an image enhancement algorithm provided by a system in a dim light environment.
In a first aspect, an embodiment of the present application provides a dim light photographing method, which is applied to an electronic device, where the electronic device includes a media service module and an android system, and an application layer of the android system is provided with a third-party application; the method comprises the following steps:
when the current photographing environment of the electronic equipment is a dark light environment, the third-party application sends a data request to a hardware abstraction layer of the android system;
the hardware abstraction layer receives the data request, obtains original application data to be processed, and calls an image enhancement algorithm to process the original application data to obtain target application data, wherein the image enhancement algorithm is an algorithm that the third-party application has requested the android system, in advance and through the media service module, to open to the third-party application;
the hardware abstraction layer sends the target application data to the third party application.
In a second aspect, an embodiment of the present application provides a dim light photographing device, which is applied to an electronic device, where the electronic device includes a media service module and an android system, and an application layer of the android system is provided with a third-party application; the device comprises:
a processing unit configured to: when the current photographing environment of the electronic equipment is a dark light environment, control the third-party application to send a data request to a hardware abstraction layer of the android system; control the hardware abstraction layer to receive the data request, acquire original application data to be processed, and call an image enhancement algorithm to process the original application data to obtain target application data, wherein the image enhancement algorithm is an algorithm that the third-party application has requested the android system, in advance and through the media service module, to open to the third-party application; and control the hardware abstraction layer to send the target application data to the third-party application.
In a third aspect, embodiments of the present application provide an electronic device, which includes a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the programs including instructions for performing some or all of the steps described in the method according to the first aspect of the embodiments of the present application.
In a fourth aspect, an embodiment of the present application provides a chip, including a processor configured to call and run a computer program from a memory, so that a device equipped with the chip performs some or all of the steps described in any method of the first aspect of the embodiments of the present application.
In a fifth aspect, the present application provides a computer-readable storage medium, where the computer-readable storage medium is used to store a computer program, where the computer program is executed by a processor to implement part or all of the steps described in the method according to the first aspect of the present application.
In a sixth aspect, embodiments of the present application provide a computer program product, where the computer program product includes a non-transitory computer-readable storage medium storing a computer program, where the computer program is operable to cause a computer to perform some or all of the steps described in the method according to the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
It can be seen that, in the embodiment of the application, when the current photographing environment of the electronic device is a dim light environment, a third-party application set in the application layer of the android system of the electronic device sends a data request to the hardware abstraction layer of the android system; the hardware abstraction layer receives the data request, obtains original application data to be processed, and calls an image enhancement algorithm to process the original application data to obtain target application data, wherein the image enhancement algorithm is an algorithm that the third-party application has requested the android system, in advance and through the media service module, to open to the third-party application; the hardware abstraction layer sends the target application data to the third-party application. Therefore, the technical scheme provided by the application is beneficial in that the third-party application can directly use the image enhancement algorithm provided by the system to process the original application data in a dark environment.
These and other aspects of the present application will be more readily apparent from the following description of the embodiments.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1A is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present disclosure;
fig. 1B is a schematic diagram of a software architecture of an electronic device according to an embodiment of the present application;
fig. 2A is an interaction diagram of a dark light photographing method according to an embodiment of the present disclosure;
fig. 2B is a schematic structural diagram of an image enhancement algorithm provided in an embodiment of the present application;
fig. 3 is an interaction diagram of a dark light photographing method according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an electronic device provided in an embodiment of the present application;
fig. 5 is a schematic structural diagram of a dark-light photographing device according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is obvious that the described embodiments are only some of the embodiments of the present application, not all of them. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The following are detailed below.
The terms "first," "second," "third," and "fourth," etc. in the description and claims of this application and in the accompanying drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Hereinafter, some terms in the present application are explained to facilitate understanding by those skilled in the art.
As shown in fig. 1A, fig. 1A is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present disclosure. The electronic device includes a processor, a memory, a signal processor, a communication interface, a display screen, a speaker, a microphone, a Random Access Memory (RAM), a camera module, a sensor, and the like. The memory, the signal processor, the display screen, the speaker, the microphone, the RAM, the camera module, the sensor, and the IR sensor are connected to the processor, and the communication interface is connected to the signal processor.
The display screen may be a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, an Active-Matrix Organic Light-Emitting Diode (AMOLED) display, or the like.
The camera module can include a common camera and an infrared camera, and is not limited herein. The camera may be a front camera or a rear camera, and is not limited herein.
Wherein the sensor comprises at least one of: light sensors, gyroscopes, Infrared light (IR) sensors, fingerprint sensors, pressure sensors, and the like. Among them, the light sensor, also called an ambient light sensor, is used to detect the ambient light brightness. The light sensor may include a light sensitive element and an analog to digital converter. The photosensitive element is used for converting collected optical signals into electric signals, and the analog-to-digital converter is used for converting the electric signals into digital signals. Optionally, the light sensor may further include a signal amplifier, and the signal amplifier may amplify the electrical signal converted by the photosensitive element and output the amplified electrical signal to the analog-to-digital converter. The photosensitive element may include at least one of a photodiode, a phototransistor, a photoresistor, and a silicon photocell.
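As a toy illustration of the signal chain just described (photosensitive element, optional signal amplifier, analog-to-digital converter), the following Python sketch converts an ambient light reading into a digital value. All constants and function names here are invented for illustration; none of the numbers come from the patent.

ADC_BITS = 12                      # assumed converter resolution
V_REF = 3.3                        # assumed ADC reference voltage (volts)

def photodiode_voltage(ambient_lux, responsivity=0.001):
    """Photosensitive element: light in, (tiny) analog voltage out."""
    return ambient_lux * responsivity

def amplify(voltage, gain=100.0):
    """Optional signal amplifier between the photosensitive element and the ADC."""
    return min(voltage * gain, V_REF)

def adc(voltage):
    """Analog-to-digital converter: analog voltage -> integer digital code."""
    return round(voltage / V_REF * (2 ** ADC_BITS - 1))

digital_brightness = adc(amplify(photodiode_voltage(ambient_lux=8.0)))
print(digital_brightness)          # digital signal handed to the processor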
The processor is the control center of the electronic device. It connects the various parts of the whole electronic device through various interfaces and lines, and performs the various functions of the electronic device and processes its data by running or executing the software programs and/or modules stored in the memory and calling the data stored in the memory, thereby monitoring the electronic device as a whole.
The processor may integrate an application processor and a modem processor, wherein the application processor mainly handles operating systems, user interfaces, application programs, and the like, and the modem processor mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor.
The memory is used for storing software programs and/or modules, and the processor executes various functional applications and data processing of the electronic device by running the software programs and/or modules stored in the memory. The memory mainly comprises a program storage area and a data storage area, wherein the program storage area can store an operating system, a software program required by at least one function, and the like; the data storage area can store data created according to the use of the electronic device, and the like. Further, the memory may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
As shown in fig. 1B, fig. 1B is a schematic diagram of a software architecture of an electronic device according to an embodiment of the present disclosure. As shown in fig. 1B, the electronic device according to the embodiment of the present application includes a media Service module (ome Service) and an android system, an application layer of the android system is provided with a third party application and a media software development kit module (ome SDK), and a hardware abstraction layer of the android system is provided with a media policy module (ome Strategy), an algorithm management module (Algo Manager), and a Camera hardware abstraction module (Camera HAL). The third-party application is in communication connection with the media software development kit module, the media software development kit module is in communication connection with the media service module, the media service module is in communication connection with the camera hardware abstraction module, the camera hardware abstraction module is in communication connection with the media policy module, and the media policy module is in communication connection with the algorithm management module. In addition, the media service module may be further communicatively coupled to the media policy module and/or the algorithm management module.
The media software development kit module comprises a control interface; it can acquire information such as capability values and configure capability values, does not store static configuration information, can communicate with the media service module via Binder, and transmits the third-party application's configuration information to the media service module.
The media service module resides in a system service and runs after the electronic device is started; it authenticates and responds to configuration requests from the third-party application so that the configuration information can reach the bottom layer. In the present application, the media service module acquires the data request of the third-party application and sets the data processing scheme.
The media policy module is a bottom-layer policy module. It can send the information configured by the media service module to the bottom layer, convert that information into capabilities that the bottom layer can recognize, prevent the third-party application from being directly coupled to and seeing the capabilities of the bottom layer, convert upper-layer requests into a proprietary pipeline, and call algorithm information.
The algorithm management module can enable the capability configuration information issued by the upper layer and invoke the corresponding algorithm.
Wherein the third party application may directly notify the media service module that data processing or continuous shooting is required.
The electronic device of the embodiments of the present application adopts a framework based on a media platform (OMedia), so that a third-party application can use the continuous-shooting pipeline of the bottom layer to have clear captured images, rather than a preview stream, uploaded to the third-party application, and the media service module and the hardware abstraction layer can be configured through the media platform to use system functions such as high resolution, denoising, and beautification provided by the Image Signal Processor (ISP) and system software.
Meanwhile, since using the high-resolution processing provided by the image signal processor and the system software may make image delivery too slow, the problem may be addressed as follows: the bottom layer sends clear YUV data to the third-party application, a thumbnail can first be reported to the third-party application for display in the interim, and after the third-party application receives it, the third-party application performs post-processing and JPG generation.
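The module chain of Fig. 1B can be pictured with a minimal Python sketch. The class and method names below (AlgoManager, MediaPolicy, CameraHAL, MediaService, and their enable/route/handle methods) are hypothetical placeholders used only to model the request flow described above; they are not the actual platform API.

# Minimal illustrative model of the OMedia-style module chain (hypothetical names).
class AlgoManager:
    def __init__(self):
        self.enabled = set()          # capabilities opened to third-party apps

    def enable(self, capability):     # e.g. "image_enhance"
        self.enabled.add(capability)

    def run(self, capability, frames):
        if capability not in self.enabled:
            raise PermissionError(f"{capability} not opened to this app")
        return f"enhanced({frames})"  # stand-in for the real algorithm


class MediaPolicy:
    """Translates upper-layer configuration into bottom-layer capabilities."""
    def __init__(self, algo_manager):
        self.algo = algo_manager

    def configure(self, function_config):
        self.algo.enable(function_config["capability"])

    def route(self, capability, frames):
        return self.algo.run(capability, frames)


class CameraHAL:
    """Camera hardware abstraction module: obtains raw frames and forwards them."""
    def __init__(self, policy):
        self.policy = policy

    def handle_data_request(self, capability):
        raw_frames = ["raw_frame_%d" % i for i in range(4)]   # pretend capture
        return self.policy.route(capability, raw_frames)


class MediaService:
    """Resident service: authenticates the app and pushes configuration down."""
    def __init__(self, policy):
        self.policy = policy

    def open_capability(self, auth_code, capability):
        if auth_code != "valid-code":                          # toy check only
            raise PermissionError("authentication failed")
        self.policy.configure({"capability": capability})


# Wiring that mirrors Fig. 1B: app -> SDK -> service -> policy -> algo, HAL -> policy.
algo = AlgoManager()
policy = MediaPolicy(algo)
service = MediaService(policy)
hal = CameraHAL(policy)

service.open_capability("valid-code", "image_enhance")         # done in advance
print(hal.handle_data_request("image_enhance"))                # dark-light shot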
As shown in fig. 2A, fig. 2A is an interaction schematic diagram of a dim light photographing method provided in an embodiment of the present application, and is applied to an electronic device shown in fig. 1A and 1B, where the electronic device includes a media service module and an android system, and an application layer of the android system is provided with a third-party application; the method comprises the following steps:
step 201: and when the current photographing environment of the electronic equipment is a dark light environment, the third-party application sends a data request to a hardware abstraction layer of the android system.
For example, when the current photographing environment of the electronic device is a dark light environment and a third-party application installed on the electronic device needs to use an image enhancement function of the android system, the third-party application sends a data request to the hardware abstraction layer of the android system, where the data request may be a request to perform image enhancement on an image photographed by the camera, so as to obtain an image with higher definition in the dark light environment. Optionally, the data request may further include denoising and/or beautifying.
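The patent does not spell out how the dark light environment is detected; a natural reading, given the ambient light sensor described for Fig. 1A, is a simple brightness threshold. The sketch below is only an assumption for illustration: the read_ambient_lux function and the 10-lux cut-off are invented placeholders, not values from the source.

DARK_LIGHT_LUX_THRESHOLD = 10.0   # assumed cut-off, not specified in the patent

def read_ambient_lux():
    """Placeholder for reading the ambient light sensor (photodiode -> ADC)."""
    return 4.2                    # pretend measurement in lux

def is_dark_light_environment():
    return read_ambient_lux() < DARK_LIGHT_LUX_THRESHOLD

if is_dark_light_environment():
    print("dark light environment: send data request to the HAL")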
Step 202: the hardware abstraction layer receives the data request, obtains original application data to be processed, and calls an image enhancement algorithm to process the original application data to obtain target application data, wherein the image enhancement algorithm is an algorithm that the third-party application has requested the android system, in advance and through the media service module, to open to the third-party application.
the image enhancement algorithm may be stored in an algorithm library, and the algorithm library further includes other algorithms, such as an image fusion algorithm, an image denoising algorithm, and the like.
Step 203: the hardware abstraction layer sends the target application data to the third party application.
If the original application data and the target application data are images, the definition of the images of the target application data is higher than that of the images of the original application data.
Further, before the hardware abstraction layer sends the target application data to the third-party application, the method further includes:
the hardware abstraction layer determining a transmission bandwidth between the hardware abstraction layer and the application layer;
the hardware abstraction layer determines the transmission time of the target application data based on the transmission bandwidth and the size of the target application data;
the hardware abstraction layer determines that the transmission time is less than or equal to a first threshold.
Further, the method further comprises:
if the transmission time is greater than the first threshold, the hardware abstraction layer compresses the target application data, and the transmission time of the compressed target application data is less than or equal to the first threshold.
It can be seen that, in the embodiment of the application, when the current photographing environment of the electronic device is a dim light environment, a third-party application set in the application layer of the android system of the electronic device sends a data request to the hardware abstraction layer of the android system; the hardware abstraction layer receives the data request, obtains original application data to be processed, and calls an image enhancement algorithm to process the original application data to obtain target application data, wherein the image enhancement algorithm is an algorithm that the third-party application has requested the android system, in advance and through the media service module, to open to the third-party application; the hardware abstraction layer sends the target application data to the third-party application. Therefore, the technical scheme provided by the application is beneficial in that the third-party application can directly use the image enhancement algorithm provided by the system to process the original application data in a dark environment.
In an implementation manner of the present application, the hardware abstraction layer is provided with a camera hardware abstraction module and an algorithm management module, and the camera hardware abstraction module is connected to the algorithm management module;
the hardware abstraction layer receives the data request, obtains original application data to be processed, and calls an image enhancement algorithm to process the original application data to obtain target application data, and the method comprises the following steps:
the camera hardware abstraction module receives the data request, acquires original application data to be processed and sends the original application data to the algorithm management module;
and the algorithm management module receives the original application data, and processes the original application data by using an image enhancement algorithm to obtain target application data.
It can be seen that, in the embodiment of the present application, the third-party application may directly use the camera hardware abstraction module provided by the system to acquire the original application data, and directly use the algorithm management module provided by the system to call the image enhancement algorithm to process the original application data.
In one implementation manner of the present application, the hardware abstraction layer is provided with a camera hardware abstraction module, a media policy module and an algorithm management module; the camera hardware abstraction module is connected with the algorithm management module through the media strategy module;
the hardware abstraction layer receives the data request, obtains original application data to be processed, and calls an image enhancement algorithm to process the original application data to obtain target application data, and the method comprises the following steps:
the camera hardware abstraction module receives the data request, acquires original application data to be processed, and sends the original application data to the algorithm management module through the media strategy module;
and the algorithm management module receives the original application data, and processes the original application data by using an image enhancement algorithm to obtain target application data.
It can be seen that, in the embodiment of the present application, the third-party application may directly use the camera hardware abstraction module provided by the system to obtain the original application data, send the original application data to the algorithm management module through the media policy module, and call the image enhancement algorithm to process the original application data.
In an implementation manner of the present application, the original application data includes N frames of first images, the algorithm management module receives the original application data, and processes the original application data by using an image enhancement algorithm to obtain target application data, including:
the algorithm management module screens one frame of first image from the N frames of first images;
the algorithm management module determines 4 first matrixes based on the first image of one frame;
the algorithm management module calls a convolutional neural network model to process the 4 first matrixes to obtain 12 second matrixes;
the algorithm management module determines 3 third matrixes based on the 12 second matrixes, wherein the 3 third matrixes are a red channel of an image, a green channel of the image and a blue channel of the image respectively;
and the algorithm management module performs image synthesis based on the 3 third matrixes to obtain target application data.
The first image of one frame may be any one of the N frames, or may be the frame with the best image quality. The sizes of the 4 first matrixes and of the 12 second matrixes are all (R/2, S/2), and the sizes of the 3 third matrixes are (R, S), where the first image of one frame comprises R × S pixel points, R is the number of rows of pixel points in the first image of one frame, S is the number of columns of pixel points in the first image of one frame, R and S are both integer multiples of 2, and R and S are both positive integers.
In an implementation manner of the present application, the screening, by the algorithm management module, one of the N first images includes:
the algorithm management module determines Q image evaluation parameters, wherein Q is a positive integer;
the algorithm management module determines N image quality evaluation values based on the Q image evaluation parameters, wherein the N image quality evaluation values are used for evaluating the image quality of the N frames of first images, and the N image quality evaluation values are in one-to-one correspondence with the N frames of first images;
and the algorithm management module screens one frame of first image from the N frames of first images based on the N image quality evaluation values.
The Q image evaluation parameters may include, for example, contrast, sharpness, noise, artifacts, distortion, and the like. Contrast refers to the ability to distinguish different objects in an image by the black-and-white differences formed on the image. Sharpness refers to whether the boundaries of objects in the image are clear, i.e., the ability to distinguish the details of different objects, and can be expressed in LP/mm (line pairs per millimeter); the clearer the image, the better the image quality. Noise refers to randomly observable variations of optical density in the image, usually appearing as speckles, snowflakes, moire, and the like, and can be described by the signal-to-noise ratio (SNR); the greater the SNR, the better the image quality. An artifact is false information that appears in the image but does not exist in the detected object; the more serious the artifact, the worse the image quality. Distortion refers to the degree to which the shape, size, and position of objects are altered when they are rendered in the image.
Further, the image quality evaluation value of each frame of first image is determined based on the Q image evaluation parameters, Q second numerical values representing the weights of the Q image evaluation parameters when evaluating the image quality of the N frames of first images, and an image quality evaluation formula:

$G = \sum_{i=1}^{Q} E_i F_i$

where G is the image quality evaluation value, $E_i$ is the ith second numerical value, and $F_i$ is the ith image evaluation parameter.
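A compact sketch of this frame-selection step is given below. It assumes the weighted-sum formula reconstructed above, and it stands in simple numpy proxies for the Q evaluation parameters (standard deviation for contrast, gradient energy for sharpness); the concrete parameters, weights, and proxies are illustrative assumptions, not values from the patent.

import numpy as np

def contrast(img):
    return float(img.std())                      # proxy for black/white separation

def sharpness(img):
    gy, gx = np.gradient(img.astype(np.float64))
    return float(np.mean(gx * gx + gy * gy))     # proxy for edge clarity

def quality_score(img, weights=(0.5, 0.5)):
    params = (contrast(img), sharpness(img))     # F_i values (Q = 2 here)
    return sum(e * f for e, f in zip(weights, params))   # G = sum_i E_i * F_i

def select_best_frame(frames):
    """Pick the first image with the highest quality evaluation value G."""
    scores = [quality_score(f) for f in frames]  # one G per frame, in order
    return frames[int(np.argmax(scores))]

frames = [np.random.rand(8, 8) for _ in range(5)]   # N = 5 toy first images
best = select_best_frame(frames)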
In an implementation manner of the present application, the algorithm management module determines 4 first matrices based on the one frame of the first image, including:
the algorithm management module is used for converting the first image of one frame into data to obtain a fourth matrix;
the algorithm management module maps the fourth matrix to 4 color channels to obtain 4 fifth matrices, wherein the 4 color channels comprise 2 red channels, 1 green channel and 1 blue channel;
and the algorithm management module subtracts the black level matrix from the 4 fifth matrixes to obtain 4 first matrixes.
The image data can be represented by a matrix, so that the first image of one frame can be analyzed and processed by adopting matrix theory and matrix algorithm. The pixel data of the gray image of one frame of the first image is a matrix, the rows of the matrix correspond to the height (in pixels) of one frame of the first image, the columns of the matrix correspond to the width (in pixels) of one frame of the first image, the elements of the fourth matrix correspond to the pixels of one frame of the first image, and the values of the elements of the fourth matrix are the gray values of the pixels.
The information collected by the image sensor undergoes a series of conversions to finally generate a first image of original RAW-format data. Each pixel point of the RAW-data first image carries only one piece of color information, i.e., only one of R, G, and B; and because the image sensor passes the green light band essentially completely, two G sites, one R site, and one B site are arranged in the RAW-data first image. Therefore, the fourth matrix needs to be mapped to the four color channels, resulting in the four fifth matrixes.
The pixel value of the image data is generally 0 to 255, but when the image sensor is shipped from a factory, the factory generally sets the image data output range to be 5 to 250, i.e., the lowest level is not zero. Therefore, it is necessary to adjust the image data range so that the minimum value thereof becomes zero, i.e., black level correction, and the black level matrix is used for performing black level correction.
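The packing of a Bayer RAW frame into four half-resolution channels and the black-level subtraction described above can be sketched as follows. This mirrors a common RGGB packing; the actual Bayer order and the black level value of 5 are assumptions for illustration.

import numpy as np

BLACK_LEVEL = 5                    # assumed lowest output level of the sensor

def pack_raw(raw):
    """Map an R x S Bayer mosaic (assumed RGGB) to 4 planes of size (R/2, S/2)."""
    r  = raw[0::2, 0::2]           # red sites
    g1 = raw[0::2, 1::2]           # first green sites
    g2 = raw[1::2, 0::2]           # second green sites
    b  = raw[1::2, 1::2]           # blue sites
    return np.stack([r, g1, g2, b], axis=0)       # the 4 "fifth matrixes"

def black_level_correct(planes):
    """Subtract the black level so the minimum value becomes zero."""
    return np.clip(planes.astype(np.int32) - BLACK_LEVEL, 0, None)

raw = np.random.randint(5, 251, size=(8, 12), dtype=np.uint16)   # toy RAW frame
first_matrices = black_level_correct(pack_raw(raw))              # 4 x (R/2, S/2)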
As shown in fig. 2B, fig. 2B is a schematic structural diagram of an image enhancement algorithm provided in the embodiment of the present application. First, a frame of first image is converted into data and mapped to four color channels to obtain four fifth matrixes; a black level matrix is then subtracted from the four fifth matrixes to obtain four first matrixes, and the four first matrixes are amplified to obtain new first matrixes; the four new first matrixes are then used as the input of a Fully Convolutional Neural Network (FCNN), which processes them into twelve second matrixes; finally, 3 third matrixes are determined based on the 12 second matrixes, wherein the 3 third matrixes are the red channel, the green channel, and the blue channel of the image, respectively.
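The last stage of Fig. 2B, in which the 12 half-resolution second matrixes become 3 full-resolution color channels, behaves like a depth-to-space (sub-pixel) rearrangement: 12 = 3 x 2 x 2 channels of size (R/2, S/2) yield R x S red, green, and blue planes. The sketch below fakes the FCNN with a random 1x1 projection purely to make the shapes concrete; the real network architecture and weights are not reproduced here, and the amplification factor is an arbitrary placeholder.

import numpy as np

def fake_fcnn(x4):
    """Stand-in for the fully convolutional network: 4 planes in, 12 planes out."""
    rng = np.random.default_rng(0)
    w = rng.standard_normal((12, 4))             # 1x1 "convolution" for shape only
    return np.einsum('oc,chw->ohw', w, x4)

def depth_to_space(x12):
    """(12, R/2, S/2) -> (3, R, S): 2x2 sub-pixel rearrangement per color."""
    c, h, w = x12.shape
    x = x12.reshape(3, 2, 2, h, w)               # color, block_row, block_col, h, w
    x = x.transpose(0, 3, 1, 4, 2)               # color, h, block_row, w, block_col
    return x.reshape(3, h * 2, w * 2)

first_matrices = np.random.rand(4, 4, 6)         # 4 planes of size (R/2, S/2)
second_matrices = fake_fcnn(first_matrices * 8)  # amplification, then "FCNN"
r, g, b = depth_to_space(second_matrices)        # the 3 third matrixes, each (R, S)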
In an implementation manner of the present application, the hardware abstraction layer is provided with a camera hardware abstraction module, a media policy module and an algorithm management module, the camera hardware abstraction module is connected to the media policy module, and the media policy module is connected to the algorithm management module;
the image enhancement algorithm is specifically opened to the third-party application by the following operations:
the media policy module receives first function configuration information from the media service module, wherein the first function configuration information comprises description information of an image enhancement function; converting the first function configuration information into second function configuration information which can be identified by the algorithm management module, and sending the second function configuration information to the algorithm management module;
and the algorithm management module receives the second function configuration information and opens the use permission of the third-party application for the image enhancement algorithm of the android system according to the second function configuration information.
It can be seen that, in the embodiment of the present application, the media policy module receives first function configuration information from the media service module, where the first function configuration information includes description information of an image enhancement function, converts the first function configuration information into second function configuration information that can be identified by the algorithm management module, and sends the second function configuration information to the algorithm management module; and the algorithm management module receives the second function configuration information, opens the use permission of the third-party application for the image enhancement algorithm of the android system according to the second function configuration information, and is favorable for enabling the third-party application to directly use the image enhancement algorithm provided by the system.
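A minimal sketch of this two-step opening flow is shown below; the dictionary keys, the capability identifier, and the method names are assumptions used only to make the translation from first function configuration information into second function configuration information concrete.

# Hypothetical translation table inside the media policy module.
DESCRIPTION_TO_CAPABILITY = {"image_enhancement": "ALGO_DARK_LIGHT_ENHANCE"}

def policy_convert(first_config):
    """Media policy module: first function config -> second (bottom-layer) config."""
    return {"capability_id": DESCRIPTION_TO_CAPABILITY[first_config["function"]],
            "app": first_config["app"]}

class AlgoManagerStub:
    def __init__(self):
        self.open_to = {}                         # capability_id -> set of apps

    def apply(self, second_config):
        """Open the named algorithm to the requesting third-party application."""
        self.open_to.setdefault(second_config["capability_id"], set()).add(
            second_config["app"])

manager = AlgoManagerStub()
first = {"function": "image_enhancement", "app": "com.example.camera"}
manager.apply(policy_convert(first))              # use permission now opened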
In an implementation manner of the present application, before the third-party application sends a data request to a hardware abstraction layer of the android system, the method further includes:
the third-party application sends a media platform version acquisition request carrying an authentication code to the media service module;
the media service module receives the media platform version acquisition request, verifies the authentication code and passes the verification;
and the media service module sends the media platform version information to the third-party application.
Therefore, in this example, authentication is performed before the third-party application requests the system to open the use permission of the image enhancement algorithm, which is beneficial to ensuring the security of opening the algorithm.
In an implementation manner of the present application, the third-party application receives the media platform version information, and sends a capability obtaining request carrying the media platform version information to the media service module;
the media service module receives the capability acquisition request, inquires an application capability list of the media platform version information, and sends the application capability list to the third-party application;
the third-party application receives the application capability list, and inquires the application capability list to acquire a plurality of android native functions supported by the current media platform for the third-party application; and determining an image enhancement function selected to be open from the plurality of android native functions.
As can be seen, in this example, after the authentication code is verified and the verification passes, the media platform version information is returned to the third-party application, making the verification result clear; the third-party application then requests the application capability list from the media service module and selects the image enhancement function to be opened to it, which is favorable for accurately selecting the opened image enhancement algorithm to process the image.
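The authentication and capability-negotiation handshake described in this example can be modelled as below; the auth-code check, the version string, and the capability names are all invented for illustration and do not come from the patent.

class MediaServiceStub:
    """Toy media service: verifies the auth code, then serves the capability list."""
    PLATFORM_VERSION = "OMedia-1.0"               # assumed version string
    CAPABILITIES = {"OMedia-1.0": ["image_enhancement", "denoise", "beautify"]}

    def get_platform_version(self, auth_code):
        if auth_code != "expected-code":          # stand-in for real verification
            raise PermissionError("authentication failed")
        return self.PLATFORM_VERSION

    def get_capabilities(self, version):
        return self.CAPABILITIES[version]

# Third-party application side of the handshake.
service = MediaServiceStub()
version = service.get_platform_version("expected-code")   # version request + check
capabilities = service.get_capabilities(version)           # application capability list
selected = "image_enhancement" if "image_enhancement" in capabilities else None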
As shown in fig. 3, fig. 3 is a schematic flowchart of a dim light photographing method provided in an embodiment of the present application, and is applied to the electronic device shown in fig. 1A and 1B, where the electronic device includes a media service module and an android system, and an application layer of the android system is provided with a third-party application; the hardware abstraction layer of the android system is provided with a camera hardware abstraction module and an algorithm management module, and the camera hardware abstraction module is connected with the algorithm management module; the method comprises the following steps:
step 301: and the third-party application sends a media platform version acquisition request carrying an authentication code to the media service module.
Step 302: and the media service module receives the media platform version acquisition request, verifies the authentication code and passes the verification.
Step 303: and the media service module sends the media platform version information to the third-party application.
Step 304: and the third-party application receives the media platform version information and sends a capability acquisition request carrying the media platform version information to the media service module.
Step 305: and the media service module receives the capability acquisition request, inquires an application capability list of the media platform version information and sends the application capability list to the third-party application.
Step 306: the third-party application receives the application capability list, and inquires the application capability list to acquire a plurality of android native functions supported by the current media platform for the third-party application; and determining an image enhancement function selected to be open from the plurality of android native functions.
Step 307: and when the current photographing environment of the electronic equipment is a dark light environment, the third-party application sends a data request to a hardware abstraction layer of the android system.
Step 308: the camera hardware abstraction module receives the data request, acquires original application data to be processed, and sends the original application data to the algorithm management module, wherein the original application data comprises N frames of first images.
Step 309: the algorithm management module determines Q image evaluation parameters, wherein Q is a positive integer.
Step 310: the algorithm management module determines N image quality evaluation values based on the Q image evaluation parameters, wherein the N image quality evaluation values are used for evaluating the image quality of the N frames of first images, and the N image quality evaluation values are in one-to-one correspondence with the N frames of first images.
Step 311: and the algorithm management module screens one frame of first image from the N frames of first images based on the N image quality evaluation values.
Step 312: and the algorithm management module is used for converting the first image of one frame into data to obtain a fourth matrix.
Step 313: and the algorithm management module maps the fourth matrix to 4 color channels to obtain 4 fifth matrices, wherein the 4 color channels comprise 2 red channels, 1 green channel and 1 blue channel.
Step 314: and the algorithm management module subtracts the black level matrix from the 4 fifth matrixes to obtain 4 first matrixes.
Step 315: and the algorithm management module calls a convolutional neural network model to process the 4 first matrixes to obtain 12 second matrixes.
Step 316: the algorithm management module determines 3 third matrixes based on the 12 second matrixes, wherein the 3 third matrixes are a red channel of the image, a green channel of the image and a blue channel of the image respectively.
Step 317: and the algorithm management module performs image synthesis based on the 3 third matrixes to obtain target application data.
Step 318: and the algorithm management module sends the target application data to the third-party application.
It should be noted that, for the specific implementation process of the present embodiment, reference may be made to the specific implementation process described in the above method embodiment, and a description thereof is omitted here.
In accordance with the embodiments shown in fig. 2A and fig. 3, please refer to fig. 4, which is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in the figure, the electronic device includes a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the programs include instructions for performing the following steps:
when the current photographing environment of the electronic equipment is a dark light environment, controlling the third-party application to send a data request to a hardware abstraction layer of the android system;
controlling the hardware abstraction layer to receive the data request, acquiring original application data to be processed, and calling an image enhancement algorithm to process the original application data to obtain target application data, wherein the image enhancement algorithm is an algorithm that the third-party application has requested the android system, in advance and through the media service module, to open to the third-party application;
and controlling the hardware abstraction layer to send the target application data to the third-party application.
In an implementation manner of the present application, the photographing request carries photographing parameters, the hardware abstraction layer is provided with a camera hardware abstraction module and an algorithm management module, and the camera hardware abstraction module is connected to the algorithm management module;
in controlling the hardware abstraction layer to receive the data request, obtain original application data to be processed, and call an image enhancement algorithm to process the original application data to obtain target application data, the program includes instructions specifically for performing the following steps:
controlling the camera hardware abstraction module to receive the data request, acquiring original application data to be processed, and sending the original application data to the algorithm management module;
and controlling the algorithm management module to receive the original application data, and processing the original application data by using an image enhancement algorithm to obtain target application data.
In one implementation manner of the present application, the hardware abstraction layer is provided with a camera hardware abstraction module, a media policy module and an algorithm management module; the camera hardware abstraction module is connected with the algorithm management module through the media strategy module;
in controlling the hardware abstraction layer to receive the data request, obtain original application data to be processed, and call an image enhancement algorithm to process the original application data to obtain target application data, the program includes instructions specifically for performing the following steps:
controlling the camera hardware abstraction module to receive the data request, acquiring original application data to be processed, and sending the original application data to the algorithm management module through the media policy module;
and controlling the algorithm management module to receive the original application data, and processing the original application data by using an image enhancement algorithm to obtain target application data.
In an implementation manner of the present application, the original application data includes N frames of first images, and in terms of controlling the algorithm management module to receive the original application data and processing the original application data by using an image enhancement algorithm to obtain target application data, the program includes instructions specifically configured to perform the following steps:
controlling the algorithm management module to screen one frame of first image from the N frames of first images;
controlling the algorithm management module to determine 4 first matrices based on the one frame of the first image;
controlling the algorithm management module to call a convolutional neural network model to process the 4 first matrixes to obtain 12 second matrixes;
controlling the algorithm management module to determine 3 third matrixes based on the 12 second matrixes, wherein the 3 third matrixes are a red channel of an image, a green channel of the image and a blue channel of the image respectively;
and controlling the algorithm management module to perform image synthesis based on the 3 third matrixes to obtain target application data.
In an implementation manner of the present application, in controlling the algorithm management module to screen one of the N first images, the program includes instructions specifically configured to:
controlling the algorithm management module to determine Q image evaluation parameters, wherein Q is a positive integer;
controlling the algorithm management module to determine N image quality evaluation values based on the Q image evaluation parameters, wherein the N image quality evaluation values are used for evaluating the image quality of the N frames of first images, and the N image quality evaluation values are in one-to-one correspondence with the N frames of first images;
and controlling the algorithm management module to screen one frame of first image from the N frames of first images based on the N image quality evaluation values.
In an implementation of the present application, in controlling the algorithm management module to determine the 4 first matrices based on the one of the first images, the program includes instructions specifically configured to:
controlling the algorithm management module to convert the first image of one frame into data to obtain a fourth matrix;
controlling the algorithm management module to map the fourth matrix to 4 color channels to obtain 4 fifth matrices, wherein the 4 color channels comprise 2 red channels, 1 green channel and 1 blue channel;
and controlling the algorithm management module to subtract the black level matrix from the 4 fifth matrixes to obtain 4 first matrixes.
In an implementation manner of the present application, the hardware abstraction layer is provided with a camera hardware abstraction module, a media policy module and an algorithm management module, the camera hardware abstraction module is connected to the media policy module, and the media policy module is connected to the algorithm management module;
the image enhancement algorithm is specifically opened to the third-party application by the following operations:
controlling the media policy module to receive first function configuration information from the media service module, wherein the first function configuration information comprises description information of an image enhancement function; converting the first function configuration information into second function configuration information which can be identified by the algorithm management module, and sending the second function configuration information to the algorithm management module;
and controlling the algorithm management module to receive the second function configuration information, and opening the use permission of the third-party application for the image enhancement algorithm of the android system according to the second function configuration information.
In an implementation manner of the present application, before controlling the third-party application to send a data request to a hardware abstraction layer of the android system, the program includes instructions further configured to perform the following steps:
controlling the third-party application to send a media platform version acquisition request carrying an authentication code to the media service module;
controlling the media service module to receive the media platform version acquisition request, and checking the authentication code and passing the check;
and controlling the media service module to send the media platform version information to the third-party application.
In an implementation manner of the present application, after controlling the media service module to send the media platform version information to the third-party application, the program includes instructions further configured to:
controlling the third-party application to receive the media platform version information and send a capability acquisition request carrying the media platform version information to the media service module;
controlling the media service module to receive the capability acquisition request, inquiring an application capability list of the media platform version information, and sending the application capability list to the third-party application;
controlling the third-party application to receive the application capability list, and inquiring the application capability list to acquire a plurality of android native functions supported by the current media platform for the third-party application; and determining an image enhancement function selected to be open from the plurality of android native functions.
It should be noted that, for the specific implementation process of the present embodiment, reference may be made to the specific implementation process described in the above method embodiment, and a description thereof is omitted here.
The above embodiments mainly introduce the scheme of the embodiments of the present application from the perspective of the method-side implementation process. It is understood that the electronic device comprises corresponding hardware structures and/or software modules for performing the respective functions in order to realize the above-mentioned functions. Those of skill in the art would readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is performed as hardware or computer software drives hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the electronic device may be divided into the functional units according to the method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
The following is an embodiment of the apparatus of the present application, which is used to execute the method implemented by the embodiment of the method of the present application. Referring to fig. 5, fig. 5 is a schematic structural diagram of a dim light photographing device provided in an embodiment of the present application, and is applied to an electronic device, where the electronic device includes a media service module and an android system, and an application layer of the android system is provided with a third-party application; the device comprises:
a processing unit 501, configured to: when the current photographing environment of the electronic equipment is a dark light environment, control the third-party application to send a data request to a hardware abstraction layer of the android system; control the hardware abstraction layer to receive the data request, acquire original application data to be processed, and call an image enhancement algorithm to process the original application data to obtain target application data, wherein the image enhancement algorithm is an algorithm that the third-party application has requested the android system, in advance and through the media service module, to open to the third-party application; and control the hardware abstraction layer to send the target application data to the third-party application.
In an implementation manner of the present application, the photographing request carries photographing parameters, the hardware abstraction layer is provided with a camera hardware abstraction module and an algorithm management module, and the camera hardware abstraction module is connected to the algorithm management module;
in controlling the hardware abstraction layer to receive the data request, obtain original application data to be processed, and call an image enhancement algorithm to process the original application data to obtain target application data, the processing unit 501 is specifically configured to:
controlling the camera hardware abstraction module to receive the data request, acquiring original application data to be processed, and sending the original application data to the algorithm management module;
and controlling the algorithm management module to receive the original application data, and processing the original application data by using an image enhancement algorithm to obtain target application data.
In one implementation manner of the present application, the hardware abstraction layer is provided with a camera hardware abstraction module, a media policy module and an algorithm management module; the camera hardware abstraction module is connected with the algorithm management module through the media policy module;
in controlling the hardware abstraction layer to receive the data request, obtain original application data to be processed, and call an image enhancement algorithm to process the original application data to obtain target application data, the processing unit 501 is specifically configured to:
controlling the camera hardware abstraction module to receive the data request, acquiring original application data to be processed, and sending the original application data to the algorithm management module through the media policy module;
and controlling the algorithm management module to receive the original application data, and processing the original application data by using an image enhancement algorithm to obtain target application data.
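By way of illustration only, the two implementation manners above differ solely in whether the camera hardware abstraction module hands the original application data directly to the algorithm management module or relays it through the media policy module. The sketch below is a minimal Python stand-in for that routing; the class and method names are assumptions and do not correspond to real Android HAL interfaces.

```python
# Minimal sketch of the two routing variants; names are illustrative stand-ins.

class AlgorithmManagementModule:
    def process(self, raw_frames):
        # Placeholder for the image enhancement algorithm sketched further
        # below (frame selection, CNN inference, image synthesis).
        return {"target_application_data": len(raw_frames)}

class MediaPolicyModule:
    def __init__(self, algorithm_module):
        self.algorithm_module = algorithm_module

    def forward(self, raw_frames):
        # Relays the original application data to the algorithm management module.
        return self.algorithm_module.process(raw_frames)

class CameraHardwareAbstractionModule:
    def __init__(self, next_hop):
        # next_hop: AlgorithmManagementModule (direct connection) or
        # MediaPolicyModule (connection through the media policy module).
        self.next_hop = next_hop

    def handle_data_request(self, data_request):
        raw_frames = self.acquire_raw_frames(data_request)
        forward = getattr(self.next_hop, "forward", self.next_hop.process)
        return forward(raw_frames)

    def acquire_raw_frames(self, data_request):
        # Stand-in for reading N RAW frames from the image sensor.
        return [object() for _ in range(data_request.get("n_frames", 4))]

# Direct path (first implementation manner) and relayed path (second one):
direct = CameraHardwareAbstractionModule(AlgorithmManagementModule())
relayed = CameraHardwareAbstractionModule(
    MediaPolicyModule(AlgorithmManagementModule()))
```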
In an implementation manner of the present application, the original application data includes N frames of first images, and in terms of controlling the algorithm management module to receive the original application data and to process the original application data by using an image enhancement algorithm to obtain target application data, the processing unit 501 is specifically configured to:
controlling the algorithm management module to screen one frame of first image from the N frames of first images;
controlling the algorithm management module to determine 4 first matrices based on the one frame of the first image;
controlling the algorithm management module to call a convolutional neural network model to process the 4 first matrixes to obtain 12 second matrixes;
controlling the algorithm management module to determine 3 third matrixes based on the 12 second matrixes, wherein the 3 third matrixes are a red channel of an image, a green channel of the image and a blue channel of the image respectively;
and controlling the algorithm management module to perform image synthesis based on the 3 third matrixes to obtain target application data.
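The description above fixes only the matrix counts (4 in, 12 out, 3 final); the network architecture and the mapping from 12 second matrices to 3 third matrices are not specified. The sketch below assumes a common low-light pipeline in which the convolutional neural network outputs 12 half-resolution planes that are rearranged into full-resolution red, green and blue channels by a depth-to-space step; `cnn` is any callable standing in for the convolutional neural network model, and the [0, 1] output range is also an assumption.

```python
import numpy as np

def depth_to_space_12_to_rgb(channels_12):
    """Assumed mapping from 12 second matrices (each H/2 x W/2) to 3 third
    matrices (each H x W): each group of 4 planes is rearranged into one
    full-resolution colour plane (depth-to-space, block size 2)."""
    c, h, w = channels_12.shape
    assert c == 12
    planes = channels_12.reshape(3, 2, 2, h, w)       # (rgb, dy, dx, h, w)
    out = np.zeros((3, h * 2, w * 2), dtype=channels_12.dtype)
    for dy in range(2):
        for dx in range(2):
            out[:, dy::2, dx::2] = planes[:, dy, dx]
    return out                                        # red, green, blue planes

def enhance_selected_frame(first_matrices_4, cnn):
    """first_matrices_4: array of shape (4, H/2, W/2), built as in the packing
    sketch further below; cnn: any callable mapping a (4, h, w) array to a
    (12, h, w) array (stand-in for the convolutional neural network model)."""
    second_matrices_12 = cnn(first_matrices_4)              # 4 -> 12 planes
    r, g, b = depth_to_space_12_to_rgb(second_matrices_12)  # 12 -> 3 planes
    rgb = np.stack([r, g, b], axis=-1)                      # image synthesis
    # Assumes the network output is normalised to [0, 1].
    return np.clip(rgb * 255.0, 0, 255).astype(np.uint8)
```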
In an implementation manner of the present application, in terms of controlling the algorithm management module to screen one frame of first image from the N frames of first images, the processing unit 501 is specifically configured to:
controlling the algorithm management module to determine Q image evaluation parameters, wherein Q is a positive integer;
controlling the algorithm management module to determine N image quality evaluation values based on the Q image evaluation parameters, wherein the N image quality evaluation values are used for evaluating the image quality of the N frames of first images, and the N image quality evaluation values are in one-to-one correspondence with the N frames of first images;
and controlling the algorithm management module to screen one frame of first image from the N frames of first images based on the N image quality evaluation values.
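The Q image evaluation parameters are not named in the description; the sketch below assumes Q = 2 (a Laplacian-based sharpness measure and mean brightness) with illustrative weights, normalises each parameter across the N frames, and keeps the frame with the highest quality evaluation value.

```python
import numpy as np

def sharpness(frame):
    # Variance of a simple Laplacian response as a proxy for focus/noise.
    lap = (-4.0 * frame[1:-1, 1:-1] + frame[:-2, 1:-1] + frame[2:, 1:-1]
           + frame[1:-1, :-2] + frame[1:-1, 2:])
    return lap.var()

def brightness(frame):
    return frame.mean()

def select_best_frame(frames, evaluation_params=(sharpness, brightness),
                      weights=(0.7, 0.3)):
    """frames: list of N 2-D arrays; evaluation_params: Q callables (Q = 2
    here). Each parameter is normalised over the N frames, then combined into
    one quality evaluation value per frame; the best frame is returned."""
    raw = np.array([[p(f.astype(np.float64)) for p in evaluation_params]
                    for f in frames])                       # shape (N, Q)
    span = raw.max(axis=0) - raw.min(axis=0) + 1e-12
    normalised = (raw - raw.min(axis=0)) / span
    scores = normalised @ np.array(weights)                 # N quality values
    return frames[int(np.argmax(scores))]
```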
In an implementation manner of the present application, in terms of controlling the algorithm management module to determine 4 first matrices based on the one frame of first image, the processing unit 501 is specifically configured to:
controlling the algorithm management module to convert the one frame of first image into data to obtain a fourth matrix;
controlling the algorithm management module to map the fourth matrix to 4 color channels to obtain 4 fifth matrices, wherein the 4 color channels comprise 2 red channels, 1 green channel and 1 blue channel;
and controlling the algorithm management module to subtract the black level matrix from the 4 fifth matrixes to obtain 4 first matrixes.
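The description above fixes the channel split (2 red channels, 1 green channel, 1 blue channel) and the black level subtraction, but not the pixel layout; the (row, column) offsets, the black and white levels, and the final normalisation in the sketch below are assumptions added for illustration.

```python
import numpy as np

def pack_raw_to_first_matrices(fourth_matrix, black_level=64, white_level=1023):
    """fourth_matrix: the selected RAW frame converted to a 2-D array of shape
    (H, W) with even H and W. Returns the 4 first matrices, each (H/2, W/2).
    The split into 2 red, 1 green and 1 blue channels follows the description
    above; the sub-pixel offsets below are an assumed sensor layout."""
    fifth_matrices = np.stack([
        fourth_matrix[0::2, 0::2],   # red channel 1
        fourth_matrix[1::2, 1::2],   # red channel 2 (assumed offset)
        fourth_matrix[0::2, 1::2],   # green channel
        fourth_matrix[1::2, 0::2],   # blue channel
    ]).astype(np.float32)
    # Subtract the black level matrix; clamping and normalisation to [0, 1]
    # are common preprocessing steps, not required by the description.
    first_matrices = np.maximum(fifth_matrices - black_level, 0.0)
    return first_matrices / (white_level - black_level)
```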
In an implementation manner of the present application, the hardware abstraction layer is provided with a camera hardware abstraction module, a media policy module and an algorithm management module, the camera hardware abstraction module is connected to the media policy module, and the media policy module is connected to the algorithm management module;
the image enhancement algorithm is specifically opened to the third-party application by the following operations:
controlling the media policy module to receive first function configuration information from the media service module, wherein the first function configuration information comprises description information of an image enhancement function; converting the first function configuration information into second function configuration information which can be identified by the algorithm management module, and sending the second function configuration information to the algorithm management module;
and controlling the algorithm management module to receive the second function configuration information, and opening the use permission of the third-party application for the image enhancement algorithm of the android system according to the second function configuration information.
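The format of the first and second function configuration information is not specified; the dictionary fields and function names below are purely hypothetical, sketching only the translation step performed by the media policy module and the permission record kept by the algorithm management module.

```python
# Hypothetical sketch of the configuration hand-off; field names are assumed.

OPENED_ALGORITHMS = {}   # app_id -> set of algorithms opened to that app

def media_policy_convert(first_config, app_id):
    """Translate the first function configuration information (from the media
    service module) into the form the algorithm management module is assumed
    to understand (the second function configuration information)."""
    return {
        "algorithm": "image_enhancement",
        "description": first_config.get("image_enhancement_description", ""),
        "app_id": app_id,
    }

def algorithm_management_open(second_config):
    """Open the native image enhancement algorithm to the requesting app."""
    OPENED_ALGORITHMS.setdefault(second_config["app_id"], set()).add(
        second_config["algorithm"])

def is_algorithm_open(app_id, algorithm="image_enhancement"):
    return algorithm in OPENED_ALGORITHMS.get(app_id, set())
```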
In an implementation manner of the present application, before controlling the third-party application to send a data request to a hardware abstraction layer of the android system, the processing unit 501 is further configured to:
controlling the third-party application to send a media platform version acquisition request carrying an authentication code to the media service module;
controlling the media service module to receive the media platform version acquisition request and to check the authentication code, where the check passes;
and controlling the media service module to send the media platform version information to the third-party application.
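The description only requires that the media service module check the authentication code before replying with the media platform version information; the HMAC-based scheme and key handling below are assumptions used to make the sketch concrete.

```python
import hmac
import hashlib

MEDIA_PLATFORM_VERSION = "1.2.0"                       # illustrative value
SHARED_SECRET = b"per-app-secret-provisioned-out-of-band"   # assumed

def make_auth_code(app_id: str) -> str:
    # Third-party application side: derive the authentication code.
    return hmac.new(SHARED_SECRET, app_id.encode(), hashlib.sha256).hexdigest()

def handle_version_request(app_id: str, auth_code: str) -> str:
    """Media service module side: verify the authentication code and, only if
    the check passes, return the media platform version information."""
    expected = make_auth_code(app_id)
    if not hmac.compare_digest(expected, auth_code):
        raise PermissionError("authentication code check failed")
    return MEDIA_PLATFORM_VERSION
```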
In an implementation manner of the present application, after controlling the media service module to send the media platform version information to the third-party application, the processing unit 501 is further configured to:
controlling the third-party application to receive the media platform version information and send a capability acquisition request carrying the media platform version information to the media service module;
controlling the media service module to receive the capability acquisition request, inquiring an application capability list of the media platform version information, and sending the application capability list to the third-party application;
controlling the third-party application to receive the application capability list and to query the application capability list to acquire a plurality of android native functions supported by the current media platform for the third-party application; and determining, from the plurality of android native functions, an image enhancement function selected to be opened.
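As a hypothetical continuation of the handshake sketched above: the media service module keeps an application capability list per media platform version, and the third-party application selects the image enhancement function from it. The capability names other than image enhancement are invented for illustration.

```python
# Hypothetical capability query; the list contents are illustrative only.

CAPABILITY_LISTS = {
    "1.2.0": ["image_enhancement", "hdr", "beauty", "super_resolution"],
}

def handle_capability_request(platform_version: str):
    """Media service module side: look up the application capability list for
    the reported media platform version."""
    return CAPABILITY_LISTS.get(platform_version, [])

def choose_open_function(capability_list, wanted="image_enhancement"):
    """Third-party application side: query the android native functions
    supported for it and select the image enhancement function to be opened."""
    if wanted not in capability_list:
        raise RuntimeError(f"{wanted} is not supported on this media platform")
    return wanted
```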
The dim light photographing device may further include a communication unit 502 and a storage unit 503, where the storage unit 503 is configured to store program code and data of the electronic device. The processing unit 501 may be a processor, the communication unit 502 may be a touch display screen or a transceiver, and the storage unit 503 may be a memory.
It can be understood that, since the method embodiments and the apparatus embodiment are different presentation forms of the same technical concept, the content of the method embodiment portion of the present application applies correspondingly to the apparatus embodiment portion and is not repeated here.
Embodiments of the present application further provide a chip, where the chip includes a processor configured to call and run a computer program from a memory, so that a device in which the chip is installed performs some or all of the steps described for the electronic device in the above method embodiments.
Embodiments of the present application also provide a computer storage medium, where the computer storage medium stores a computer program for electronic data exchange, and the computer program causes a computer to execute some or all of the steps of any of the methods described in the above method embodiments; the computer includes an electronic device.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any of the methods as described in the above method embodiments. The computer program product may be a software installation package, the computer comprising an electronic device.
It should be noted that, for simplicity of description, the above method embodiments are described as a series of acts or a combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may be performed in other orders or concurrently according to the present application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules involved are not necessarily required by the present application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative; for instance, the above division of units is only one type of logical function division, and other divisions may be used in practice; for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit may be stored in a computer-readable memory if it is implemented in the form of a software functional unit and sold or used as a stand-alone product. Based on such understanding, the technical solution of the present application, in essence or in the part that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned memory includes various media capable of storing program code, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing related hardware, where the program may be stored in a computer-readable memory, and the memory may include a flash memory disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, and the like.
The foregoing detailed description of the embodiments of the present application illustrates the principles and implementations of the present application; the above description of the embodiments is provided only to help understand the method and core concept of the present application. Meanwhile, for a person skilled in the art, there may be variations in the specific embodiments and the application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (13)

1. The dim light photographing method is applied to electronic equipment, the electronic equipment comprises a media service module and an android system, and an application layer of the android system is provided with a third-party application; the method comprises the following steps:
when the current photographing environment of the electronic equipment is a dark light environment, the third-party application sends a data request to a hardware abstraction layer of the android system;
the hardware abstraction layer receives the data request through the media service module, obtains original application data to be processed, and calls an image enhancement algorithm to process the original application data to obtain target application data, wherein the image enhancement algorithm is an algorithm that the third-party application has requested the android system, in advance and through the media service module, to open to the third-party application;
the hardware abstraction layer sends the target application data to the third party application.
2. The method according to claim 1, wherein the hardware abstraction layer is provided with a camera hardware abstraction module and an algorithm management module, and the camera hardware abstraction module is connected with the algorithm management module;
the hardware abstraction layer receives the data request, obtains original application data to be processed, and calls an image enhancement algorithm to process the original application data to obtain target application data, and the method comprises the following steps:
the camera hardware abstraction module receives the data request, acquires original application data to be processed and sends the original application data to the algorithm management module;
and the algorithm management module receives the original application data, and processes the original application data by using an image enhancement algorithm to obtain target application data.
3. The method of claim 1, wherein the hardware abstraction layer is provided with a camera hardware abstraction module, a media policy module, and an algorithm management module; the camera hardware abstraction module is connected with the algorithm management module through the media policy module;
the hardware abstraction layer receives the data request, obtains original application data to be processed, and calls an image enhancement algorithm to process the original application data to obtain target application data, and the method comprises the following steps:
the camera hardware abstraction module receives the data request, acquires original application data to be processed, and sends the original application data to the algorithm management module through the media policy module;
and the algorithm management module receives the original application data, and processes the original application data by using an image enhancement algorithm to obtain target application data.
4. The method of claim 2 or 3, wherein the original application data comprises N frames of first images, and wherein the algorithm management module receiving the original application data and processing the original application data by using an image enhancement algorithm to obtain target application data comprises:
the algorithm management module screens one frame of first image from the N frames of first images;
the algorithm management module determines 4 first matrixes based on the first image of one frame;
the algorithm management module calls a convolutional neural network model to process the 4 first matrixes to obtain 12 second matrixes;
the algorithm management module determines 3 third matrixes based on the 12 second matrixes, wherein the 3 third matrixes are a red channel of an image, a green channel of the image and a blue channel of the image respectively;
and the algorithm management module performs image synthesis based on the 3 third matrixes to obtain target application data.
5. The method of claim 4, wherein the algorithm management module screening one frame of first image from the N frames of first images comprises:
the algorithm management module determines Q image evaluation parameters, wherein Q is a positive integer;
the algorithm management module determines N image quality evaluation values based on the Q image evaluation parameters, wherein the N image quality evaluation values are used for evaluating the image quality of the N frames of first images, and the N image quality evaluation values are in one-to-one correspondence with the N frames of first images;
and the algorithm management module screens one frame of first image from the N frames of first images based on the N image quality evaluation values.
6. The method according to claim 4 or 5, wherein the algorithm management module determining 4 first matrices based on the one frame of first image comprises:
the algorithm management module converts the one frame of first image into data to obtain a fourth matrix;
the algorithm management module maps the fourth matrix to 4 color channels to obtain 4 fifth matrices, wherein the 4 color channels comprise 2 red channels, 1 green channel and 1 blue channel;
and the algorithm management module subtracts the black level matrix from the 4 fifth matrixes to obtain 4 first matrixes.
7. The method according to claim 1, wherein the hardware abstraction layer is provided with a camera hardware abstraction module, a media policy module and an algorithm management module, the camera hardware abstraction module is connected with the media policy module, and the media policy module is connected with the algorithm management module;
the image enhancement algorithm is specifically opened to the third-party application by the following operations:
the media policy module receives first function configuration information from the media service module, wherein the first function configuration information comprises description information of an image enhancement function; converting the first function configuration information into second function configuration information which can be identified by the algorithm management module, and sending the second function configuration information to the algorithm management module;
and the algorithm management module receives the second function configuration information and opens the use permission of the third-party application for the image enhancement algorithm of the android system according to the second function configuration information.
8. The method of claim 7, wherein before the third-party application sends a data request to a hardware abstraction layer of the android system, the method further comprises:
the third-party application sends a media platform version acquisition request carrying an authentication code to the media service module;
the media service module receives the media platform version acquisition request and verifies the authentication code, and the verification passes;
and the media service module sends the media platform version information to the third-party application.
9. The method of claim 8, wherein after the media service module sends the media platform version information to the third-party application, the method further comprises:
the third party application receives the media platform version information and sends a capability acquisition request carrying the media platform version information to the media service module;
the media service module receives the capability acquisition request, inquires an application capability list of the media platform version information, and sends the application capability list to the third-party application;
the third-party application receives the application capability list, and inquires the application capability list to acquire a plurality of android native functions supported by the current media platform for the third-party application; and determining an image enhancement function selected to be open from the plurality of android native functions.
10. The dim light photographing device is applied to electronic equipment, wherein the electronic equipment comprises a media service module and an android system, and an application layer of the android system is provided with a third-party application; the device comprises:
a processing unit to: when the current photographing environment of the electronic equipment is a dark light environment, control the third-party application to send a data request to a hardware abstraction layer of the android system; control the hardware abstraction layer to receive the data request through the media service module, acquire original application data to be processed, and call an image enhancement algorithm to process the original application data to obtain target application data, wherein the image enhancement algorithm is an algorithm that the third-party application has requested the android system, in advance and through the media service module, to open to the third-party application; and control the hardware abstraction layer to send the target application data to the third-party application.
11. A chip, comprising: a processor for calling and running a computer program from a memory so that a device on which the chip is installed performs the method of any one of claims 1-9.
12. An electronic device comprising a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the programs comprising instructions for performing the steps in the method of any of claims 1-9.
13. A computer-readable storage medium, characterized in that a computer program for electronic data exchange is stored, wherein the computer program causes a computer to perform the method according to any one of claims 1-9.
CN201911253797.5A 2019-12-09 2019-12-09 Dark light photographing method and related equipment Active CN110933313B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911253797.5A CN110933313B (en) 2019-12-09 2019-12-09 Dark light photographing method and related equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911253797.5A CN110933313B (en) 2019-12-09 2019-12-09 Dark light photographing method and related equipment

Publications (2)

Publication Number Publication Date
CN110933313A CN110933313A (en) 2020-03-27
CN110933313B true CN110933313B (en) 2021-07-16

Family

ID=69857828

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911253797.5A Active CN110933313B (en) 2019-12-09 2019-12-09 Dark light photographing method and related equipment

Country Status (1)

Country Link
CN (1) CN110933313B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111061524A (en) * 2019-12-09 2020-04-24 Oppo广东移动通信有限公司 Application data processing method and related device
CN112463897B (en) * 2020-10-14 2023-06-02 麒麟软件有限公司 Method and system for redirecting positioning data

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012023675A (en) * 2010-07-16 2012-02-02 Fujifilm Corp Imaging module, imaging apparatus, and signal processing method of imaging module
US9118864B2 (en) * 2012-08-17 2015-08-25 Flextronics Ap, Llc Interactive channel navigation and switching
CN103279730B (en) * 2013-06-05 2016-09-28 北京奇虎科技有限公司 Mobile terminal processes the method and apparatus of visual graphic code
CN103442172B (en) * 2013-08-15 2017-09-19 Tcl集团股份有限公司 A kind of camera image quality adjusting method, system and mobile terminal based on Android platform
US10922148B2 (en) * 2015-04-26 2021-02-16 Intel Corporation Integrated android and windows device
KR20180023326A (en) * 2016-08-25 2018-03-07 삼성전자주식회사 Electronic device and method for providing image acquired by the image sensor to application
CN108012084A (en) * 2017-12-14 2018-05-08 维沃移动通信有限公司 A kind of image generating method, application processor AP and third party's picture processing chip
CN109462732B (en) * 2018-10-29 2021-01-15 努比亚技术有限公司 Image processing method, device and computer readable storage medium
CN110177218B (en) * 2019-06-28 2021-06-04 广州鲁邦通物联网科技有限公司 Photographing image processing method of android device

Also Published As

Publication number Publication date
CN110933313A (en) 2020-03-27


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant