CN114095666A - Photographing method, electronic device and computer-readable storage medium - Google Patents

Photographing method, electronic device and computer-readable storage medium

Info

Publication number
CN114095666A
Authority
CN
China
Prior art keywords
image
exposure value
electronic device
electronic equipment
brightness
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110927005.9A
Other languages
Chinese (zh)
Other versions
CN114095666B (en)
Inventor
陈珂
商亚洲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202110927005.9A
Publication of CN114095666A
Priority to PCT/CN2022/091901
Application granted
Publication of CN114095666B
Legal status: Active


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70: Circuitry for compensating brightness variation in the scene
    • H04N23/72: Combination of two or more compensation controls
    • H04N23/74: Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • H04N23/741: Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • H04N23/95: Computational photography systems, e.g. light-field imaging systems
    • H04N23/951: Computational photography systems using two or more images to influence resolution, frame rate or aspect ratio

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The application belongs to the field of image processing and provides a photographing method, an electronic device, and a computer-readable storage medium. The method comprises the following steps: the electronic device triggers a photographing instruction and controls the flash to be in a normally-on state; the electronic device captures images in the normally-on state, the captured images comprising a first image captured at a first exposure value, a second image captured at a second exposure value, and a third image captured at a third exposure value, where the first exposure value is smaller than the second exposure value, the second exposure value is smaller than the third exposure value, and the first exposure value is the standard exposure value of the current scene; the electronic device then generates the image to be output from the first, second, and third images. Because the flash is held in a normally-on state, all three images are captured under the same ambient brightness, which shortens the interval between captures, reduces the probability of shake during capture, and keeps the capture parameters stable, so the fused image is clearer.

Description

Photographing method, electronic device and computer-readable storage medium
Technical Field
The present application relates to the field of image processing, and in particular, to a photographing method, an electronic device, and a computer-readable storage medium.
Background
When photographing in a dimly lit environment, a camera usually turns on the flash to raise the ambient brightness and obtain a relatively clear image. For example, in automatic flash mode, when the camera detects that the ambient brightness meets the flash working condition, receives a photographing instruction, and detects that the current scene calls for a high-dynamic-range image, it captures images with the flash in a pre-flash state and in a strong-flash state respectively, and synthesizes a photo from the captured images.
Because the brightness and sharpness of the images captured in the pre-flash state differ greatly from those captured in the strong-flash state, people or objects in the captured images are easily segmented inaccurately. In addition, the time gap between the pre-flash state and the strong-flash state can introduce image shake and disturb the stability of the 3A algorithms (auto focus, auto white balance, and auto exposure), which hinders registration and fusion of the images and degrades the quality of the photographed image.
Disclosure of Invention
The embodiments of the present application provide a photographing method and an electronic device, aiming to solve the prior-art problems that, when a user takes a picture, the captured images are easily segmented inaccurately and picture shake and 3A instability may occur, degrading the quality of the photographed image.
In a first aspect, an embodiment of the present application provides a photographing method. The method includes: the electronic device triggers a photographing instruction and controls the flash to be in a normally-on state; the electronic device captures images in the normally-on state, the captured images comprising a first image captured at a first exposure value, a second image captured at a second exposure value, and a third image captured at a third exposure value, where the first exposure value is smaller than the second exposure value, the second exposure value is smaller than the third exposure value, and the first exposure value is the standard exposure value of the current scene; and the electronic device generates the image to be output from the first image, the second image, and the third image.
The first image, captured at the first exposure value, is a normally exposed image. Note that a larger exposure value here denotes less exposure: the second exposure value being greater than the first means that the second image receives less exposure than the first image, and by the same reasoning the third image receives less exposure than the second. Highlight regions, including for example solid-color areas or light-source areas, can therefore be displayed more clearly in the third image, while normally exposed regions are displayed more clearly in the first image. The second image can be registered with both the first image and the third image, so the first and third images can be registered through the second image; the images can thus be effectively registered and fused to obtain a clear output image.
It can be seen that the electronic device keeps the flash in a constant-brightness, normally-on state throughout the capture of the first, second, and third images, so the ambient brightness remains essentially unchanged during acquisition. The three images can therefore be captured with stable camera parameters, including the auto exposure, auto white balance, and auto focus parameters, without frequently changing the capture parameters, which makes image capture more reliable.
Moreover, the first, second, and third images with different exposure values are obtained by keeping the flash in a constant normally-on state and adjusting the exposure values in software. Compared with adjusting the flash drive current, adjusting the exposure value in software is more convenient and easier to implement, and it avoids the shooting-stability problems that drive-current adjustment can introduce.
The HDR image is generated from a fixed set of frames: the first, second, and third images. Existing HDR generation methods must decide the brightness of each frame from ambient-brightness information, which makes the decision computation cumbersome; the fixed frame-output scheme here improves decision efficiency and completes the photographing process faster.
In some possible implementations of the first aspect, the controlling, by the electronic device, of the flash to be in a normally-on state includes: the electronic device acquires the brightness of the photographing scene; according to a preset correspondence between brightness and working current, the electronic device determines the working current corresponding to the scene brightness; and the electronic device drives the flash into the normally-on state with that working current.
That is, when the flash is to be placed in the normally-on state, the working current is further computed from the ambient brightness. When the flash is driven with the computed working current, the environment is neither too bright nor too dark, which better satisfies the requirement of capturing the normally exposed first image.
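The correspondence between scene brightness and working current can be implemented as a small lookup table with linear interpolation. Below is a minimal sketch; the patent discloses no concrete brightness or current values, so the table entries here are purely hypothetical.

```python
import bisect

# (scene brightness, flash working current in mA): hypothetical sample points;
# dimmer scenes get a stronger flash so the first image can be normally exposed.
BRIGHTNESS_TO_CURRENT = [(0, 180.0), (10, 140.0), (30, 100.0), (60, 60.0)]

def working_current(scene_brightness: float) -> float:
    """Linearly interpolate the working current for a measured brightness."""
    xs = [b for b, _ in BRIGHTNESS_TO_CURRENT]
    ys = [c for _, c in BRIGHTNESS_TO_CURRENT]
    if scene_brightness <= xs[0]:
        return ys[0]
    if scene_brightness >= xs[-1]:
        return ys[-1]
    i = bisect.bisect_right(xs, scene_brightness)
    t = (scene_brightness - xs[i - 1]) / (xs[i] - xs[i - 1])
    return ys[i - 1] + t * (ys[i] - ys[i - 1])

print(working_current(20))  # 120.0 mA with this made-up table
```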
In another possible implementation, the flash may instead be driven at a fixed working current. This is simpler to implement, and since the first, second, and third images are still captured under the same ambient brightness, image stability remains high.
In some possible implementations of the first aspect, the electronic device capturing images in the normally-on state includes: the electronic device determines the first exposure value according to the brightness of the current scene; according to a preset exposure ratio, the electronic device determines the second exposure value and the third exposure value; and the electronic device captures images at the determined first, second, and third exposure values respectively.
That is, starting from the normal exposure of the first image, the exposure is reduced step by step according to the preset ratio to capture the second and third images. In one possible implementation, the first exposure value may be EV0 and the second exposure value EV2, i.e. the exposure of the second image is reduced by a factor of 4 relative to the first; the third exposure value may be EV6, i.e. a reduction by a factor of 64. In practice the ratio is not limited to this: with the first exposure value at EV0, the second exposure value may be any of EV1 to EV4, and the third exposure value any of EV4 to EV10. With these exposure values, regions of different brightness are rendered with different clarity in the differently exposed images.
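In conventional EV notation, each step up halves the captured exposure, which is where the factors of 4 (2^2 for EV2) and 64 (2^6 for EV6) come from. A minimal sketch of the arithmetic:

```python
def relative_exposure(ev_offset: int) -> float:
    """Exposure received at EV(n) relative to the EV0 standard exposure;
    each EV step above the reference halves the exposure."""
    return 2.0 ** (-ev_offset)

for ev in (0, 2, 6):
    print(f"EV{ev}: 1/{int(1 / relative_exposure(ev))} of the standard exposure")
# EV0: 1/1, EV2: 1/4 (the factor of 4), EV6: 1/64 (the factor of 64)
```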
In some possible implementations of the first aspect, the electronic device capturing images at the determined first, second, and third exposure values includes: the electronic device determines the aperture size and/or exposure time corresponding to each of the first, second, and third exposure values; and the electronic device captures the images with the determined aperture sizes and/or exposure times.
When capturing images at the determined exposure values, only the exposure parameters, including aperture and exposure time, need to be adjusted in software, so images at different exposure values can be captured effectively. Compared with producing different exposures by using the drive current to vary the flash brightness, adjusting parameters in software improves the stability of the photographing process.
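One common formalisation, assumed here rather than spelled out in the patent, is the photographic relation EV = log2(N^2 / t), with N the f-number and t the exposure time in seconds. With the aperture held fixed, as on most phone cameras, each exposure value then maps directly to an exposure time:

```python
def shutter_time(ev: float, f_number: float = 1.8) -> float:
    """Solve EV = log2(N^2 / t) for t with the aperture N fixed;
    f/1.8 is an assumed example value, not taken from the patent."""
    return f_number ** 2 / (2.0 ** ev)

for ev in (0, 2, 6):
    print(f"EV{ev}: t = {shutter_time(ev):.4f} s at f/1.8")
# Each +2 EV divides the exposure time by 4; EV6 uses 1/64 of the EV0 time.
```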
In some possible implementations of the first aspect, the electronic device generating the image to be output from the first, second, and third images includes: the electronic device performs optimization and fusion processing on the first image in RAW format, the second image in RAW format, and the third image in RAW format to obtain a first output image in RAW format; and the electronic device performs a color space transformation on the first output image to obtain the image to be output.
The electronic device optimizes and fuses the captured images in the RAW domain. Compared with optimizing and fusing in the YUV or RGB domain, RAW-domain processing retains richer detail, so the fused image is sharper and the quality of the photo is improved.
When performing the image optimization in the RAW domain, the electronic device may perform the optimization and fusion on the RAW-format first, second, and third images to obtain the RAW-format first output image as follows: apply dead pixel correction, lens shading correction, black level correction, RAW-domain noise reduction, white balance gain, and image fusion to the first, second, and third images to obtain the first output image in RAW format.
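A toy sketch of that stage order follows. Every calibration value is a placeholder, the dead pixel, shading, and denoising steps are elided, and the fusion is reduced to an exposure-normalised average; a real pipeline would use per-pixel weights, registration, and ghost removal.

```python
import numpy as np

def correct_raw(frame: np.ndarray, black_level: float = 64.0) -> np.ndarray:
    """Stand-in for the correction chain; only black-level subtraction shown."""
    return np.clip(frame.astype(np.float32) - black_level, 0.0, None)

def fuse_raw(frames: list[np.ndarray], ev_offsets: list[int]) -> np.ndarray:
    """Scale each frame back to the EV0 exposure, then average."""
    scaled = [f * (2.0 ** ev) for f, ev in zip(frames, ev_offsets)]
    return np.mean(scaled, axis=0)

# Hypothetical usage: first, second, third are RAW frames at EV0, EV2, EV6.
# fused = fuse_raw([correct_raw(f) for f in (first, second, third)], [0, 2, 6])
```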
For the RAW-domain noise reduction, the RAW image can be processed with a network model so that the processed image has better clarity, sharpness, and signal-to-noise ratio. The network model may be a convolutional neural network model or a U-Net model.
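As a rough illustration of such a model, and not the patent's actual network, a small residual convolutional denoiser over packed Bayer planes might look like this:

```python
import torch
import torch.nn as nn

class TinyRawDenoiser(nn.Module):
    """Minimal DnCNN-style sketch: predict a noise residual and subtract it."""
    def __init__(self, channels: int = 4):  # 4 = packed RGGB Bayer planes
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, channels, kernel_size=3, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x - self.body(x)  # residual learning

# denoised = TinyRawDenoiser()(noisy)  # noisy: (N, 4, H, W) RAW planes
```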
In some possible implementations of the first aspect, the electronic device performing the color space conversion on the first output image to obtain the image to be output includes: the electronic device demosaics the first output image to obtain an RGB image; performs color correction and global color mapping on the RGB image; applies a color space transformation to the processed image; and performs YUV-domain processing on the transformed image to obtain the image to be output.
Because RAW-domain processing preserves more image detail, performing the color space conversion on the RAW-processed image yields high-definition images in different color spaces, meeting users' needs for high-definition images in those color spaces.
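For the RGB-to-YUV step, one common choice is the BT.601 matrix; the patent does not pin the transform down, so this is only an assumed example:

```python
import numpy as np

# Full-range BT.601 RGB -> YUV; U and V (Cb, Cr) come out centred on zero.
RGB_TO_YUV = np.array([
    [ 0.299,  0.587,  0.114],   # Y
    [-0.169, -0.331,  0.500],   # U
    [ 0.500, -0.419, -0.081],   # V
], dtype=np.float32)

def rgb_to_yuv(rgb: np.ndarray) -> np.ndarray:
    """rgb: (..., 3) array with values in [0, 1]."""
    return rgb @ RGB_TO_YUV.T
```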
In some possible implementations of the first aspect, after the electronic device captures the images in the normally-on state, the method further includes: the upper layer of the electronic device sends an instruction to the bottom layer of the electronic device to turn off the flash's normally-on state; and when the upper layer receives the flash-off response, the photographing function of the electronic device is restored to the available state.
By controlling when the photographing function becomes available again, useless clicks by the user while it is unavailable can be reduced. Once photographing is finished and the flash-off response has been received, the photographing button is restored to the available state so that the user can conveniently continue taking other photos.
In some possible implementations of the first aspect, after the upper layer of the electronic device sends the flash turn-off instruction to the bottom layer, the method further includes: if the upper layer does not receive the flash-off response within a preset first time period, the hardware abstraction layer (HAL) of the electronic device turns off the flash.
Forcibly turning off the flash through the HAL avoids the situation in which the bottom layer fails to turn off the flash in time and the upper-layer instruction goes unanswered. The preset first time period may be 200 ms to 5 s.
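A sketch of that fallback as a watchdog timer. `hal_force_flash_off` is a hypothetical callback standing in for the HAL-level force-off, not a real Android API:

```python
import threading

FIRST_TIMEOUT_S = 1.0  # anywhere in the 200 ms to 5 s range given above

class FlashOffWatchdog:
    """Arm a timer when the turn-off command is sent; if no acknowledgement
    arrives in time, force the flash off at the HAL level."""
    def __init__(self, hal_force_flash_off):
        self._timer = threading.Timer(FIRST_TIMEOUT_S, hal_force_flash_off)

    def send_off_command(self) -> None:
        self._timer.start()    # start counting when the command goes down

    def on_ack_received(self) -> None:
        self._timer.cancel()   # bottom layer answered in time; no force-off
```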
In a second aspect, an embodiment of the present application provides a photographing apparatus, including: a triggering unit, configured to trigger a photographing instruction and control the flash to be in a normally-on state; an image capture unit, configured to capture images in the normally-on state, where the captured images comprise a first image captured at a first exposure value, a second image captured at a second exposure value, and a third image captured at a third exposure value, the first exposure value is smaller than the second exposure value, the second exposure value is smaller than the third exposure value, and the first exposure value is the standard exposure value of the current scene; and an image generation unit, configured to generate the image to be output from the first image, the second image, and the third image.
This photographing apparatus corresponds to the photographing method of the first aspect.
In a third aspect, an embodiment of the present application provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the method according to the first aspect or any of its possible implementations.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium in which a computer program is stored, and the computer program, when executed by a processor, implements the method according to the first aspect or any of its possible implementations.
In a fifth aspect, embodiments of the present application provide a chip system. The chip system includes a processor coupled with a memory, and the processor executes a computer program stored in the memory to implement the method according to the first aspect or any of its possible implementations. The chip system may be a single chip or a chip module composed of multiple chips.
In a sixth aspect, embodiments of the present application provide a computer program product which, when run on an electronic device, causes the electronic device to perform the method of the first aspect or any of its possible implementations.
It is understood that the beneficial effects of the second to sixth aspects follow from the description of the first aspect and are not repeated here.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 2 is a block diagram of a software structure of an electronic device according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram of a photographing scene provided in the present application;
fig. 4 is a schematic diagram of a photographing method for improving image definition according to the present application;
fig. 5 is a schematic flowchart of a photographing method provided by the present application;
fig. 6 is a schematic flowchart of the normally-on mode of a flash provided by an embodiment of the present application;
fig. 7 is a schematic diagram of the normally-on state of a flash under different ambient brightness provided by an embodiment of the present application;
fig. 8 is a schematic diagram of a photographing process provided in an embodiment of the present application;
FIG. 9 is a schematic diagram of a first image captured at a first exposure value according to an embodiment of the present disclosure;
FIG. 10 is a schematic diagram of a second image captured at a second exposure value according to an embodiment of the present application;
FIG. 11 is a schematic diagram of a third image captured at a third exposure value according to an embodiment of the present application;
fig. 12 is a schematic view of an image processing flow based on a RAW domain according to an embodiment of the present application;
fig. 13 is a schematic diagram illustrating an image signal processing flow of a night scene image according to an embodiment of the present disclosure;
fig. 14 is a schematic diagram of image signal processing of a night view image according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application.
The following provides an exemplary description of relevant content to which embodiments of the present application may relate.
(1) An electronic device. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K via an I2C interface, such that the processor 110 and the touch sensor 180K communicate via an I2C bus interface to implement the touch functionality of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, to transfer data between the electronic device 100 and peripheral devices, or to connect earphones and play audio through them. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the connection relationship between the modules according to the embodiment of the present invention is only illustrative, and is not limited to the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to the mobile communication module 150 and antenna 2 is coupled to the wireless communication module 160, so that electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, with N being a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic device 100 answers a call or voice information, it can answer the voice by placing the receiver 170B close to the human ear.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a voice signal to the microphone 170C by speaking the user's mouth near the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further include three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The headphone interface 170D is used to connect a wired headphone. The headset interface 170D may be the USB interface 130, or may be a 3.5mm open mobile electronic device platform (OMTP) standard interface, a cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, the touch operations that are applied to the same touch position but different touch operation intensities may correspond to different operation instructions. For example: and when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message. And when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects a shake angle of the electronic device 100, calculates a distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of a flip holster. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D, and then set features such as automatic unlocking on flip-open according to the detected open or closed state of the holster or flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. The method can also be used for recognizing the posture of the electronic equipment, and is applied to horizontal and vertical screen switching, pedometers and other applications.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, taking a picture of a scene, electronic device 100 may utilize range sensor 180F to range for fast focus.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared LED. The electronic device 100 emits infrared light outward through the LED and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, the electronic device 100 can determine that there is an object nearby; when insufficient reflected light is detected, it can determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding it close to the ear during a call, so as to automatically turn off the screen and save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, electronic device 100 implements a temperature processing strategy using the temperature detected by temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 performs a reduction in performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature is below another threshold to avoid the low temperature causing the electronic device 100 to shut down abnormally. In other embodiments, when the temperature is lower than a further threshold, the electronic device 100 performs boosting on the output voltage of the battery 142 to avoid abnormal shutdown due to low temperature.
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The electronic apparatus 100 may receive a key input, and generate a key signal input related to user setting and function control of the electronic apparatus 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming-call vibration cues as well as for touch vibration feedback. For example, touch operations on different applications (e.g., photographing, audio playing) may correspond to different vibration feedback effects, and touch operations on different areas of the display screen 194 may likewise correspond to different vibration feedback effects. Different application scenes (e.g., time reminders, incoming messages, alarm clocks, games) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be connected to and disconnected from the electronic device 100 by being inserted into or pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time; the cards may be of the same type or different types. The SIM card interface 195 is also compatible with different types of SIM cards and with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from it.
After the hardware architecture of the electronic device 100 is introduced, the software architecture of the electronic device 100 is exemplarily described below.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of the electronic device 100.
Fig. 2 is a block diagram of a software structure of the electronic device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 2, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like. The content provider is used to store and retrieve data and make it accessible to applications. Such data may include video, images, audio, and the like.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions of the electronic device 100, such as management of call status (including connected, disconnected, and the like). The resource manager provides various resources for applications, such as localized strings, icons, pictures, layout files, and video files. The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a short stay without requiring user interaction, such as notifications of completed downloads or message alerts. The notification manager may also present notifications in the form of a chart or scrolling text in the status bar at the top of the system, such as notifications of applications running in the background, or in the form of a dialog window on the screen. Examples include text prompts in the status bar, an alert sound, device vibration, or a flashing indicator light.
The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for scheduling and managing the Android system.
The core library comprises two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files, and performs functions such as object life-cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), Media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files. The media library may support multiple audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphics processing library is used to implement three-dimensional graphics drawing, image rendering, composition, layer processing, and the like. The 2D graphics engine is a drawing engine for 2D drawing. The kernel layer is a layer between hardware and software, and contains at least a display driver, a camera driver, an audio driver, and a sensor driver.
The following describes exemplary work flows of software and hardware of the electronic device 100 in connection with a photographing scene.
In a photographing scene, a finger or a stylus of the user contacts the display screen 105 of the electronic device 100, the touch sensor 107 disposed on the display screen 105 receives the touch operation, and a corresponding hardware interrupt is sent to the kernel layer. The kernel layer processes the touch operation into a touch event that the upper layers can read (e.g., an action down event, an action move event, an action up event). An action down event represents a finger or stylus making initial contact with the touch screen, for example the user touching the stylus down in a certain area of the screen; an action move event represents a finger or stylus sliding on the touch screen, for example the stylus sliding after touching down; an action up event indicates that the finger or stylus has left the touch screen, for example the user lifting the stylus after it has touched down and slid some distance.
Touch events are stored at the kernel layer. The application framework layer acquires the touch event from the kernel layer and identifies the control corresponding to the touch event and the touch operation it represents; touch operations include single tap, double tap, slide, and the like. Taking the touch operation as a tap, and the corresponding control as the photographing control of the camera application, as an example: the camera application calls the interface of the application framework layer, which in turn calls the corresponding callback function to implement the application function corresponding to the touch operation.
(2) Photographing scene. After the exemplary description of the hardware architecture and the software architecture of the electronic device 100 provided in the embodiment of the present application, an exemplary description of the photographing scenario related to the embodiment of the present application is provided below.
Fig. 3 is a schematic diagram of a photographing scene. The photographing scene may be an indoor scene, a night scene, or another setting with low ambient brightness. To improve the clarity of the shot, an electronic device (such as the mobile phone in fig. 3) usually raises the scene brightness with supplementary light. In such a low-brightness scene, the electronic device enters the photographing interface after the user opens the camera application. The photographing interface comprises a preview area and key areas. The preview area displays a preview image at the electronic device's current shooting angle. The right part of the preview area includes a zoom button, i.e. the round button on the right side of the preview area; when the round button receives an up-or-down drag instruction, the magnification of the shot image is adjusted accordingly. Below the preview area is a first key area, which includes a photographing mode selection key, a photographing key, an album key, and a lens switching key; the corresponding photographing mode can be entered through the mode key. Above the preview area are shooting parameter keys and function selection keys. For example, the shooting parameter keys may include a flash mode key and a parameter setting key, and the function selection keys may include a smart photo key, a color style key, and a photo shopping key. As shown in the left diagram of fig. 3, when the user selects the automatic flash mode, the electronic device determines whether the flash needs to be turned on according to the brightness of the current scene.
In the auto-flash mode, when determining whether the flash needs to be turned on, the acquired ambient brightness may be compared with a preset brightness threshold. If the current ambient brightness is greater than or equal to the preset brightness threshold, as shown in the lower diagram on the right side of fig. 3, the flash is not triggered during photographing. If the current ambient brightness is less than the preset brightness threshold, as shown in the upper diagram on the right side of fig. 3, the flash is triggered when a picture is taken. When the flash is triggered and the electronic device receives a photographing instruction from the user, the flash is controlled to emit a strong flash, during which the electronic device can acquire an image whose ambient brightness has been raised by the strong flash.
Fig. 4 shows an implementation process of improving image sharpness based on an image in the strong flash state and an image in the pre-flash state. As shown in fig. 4, the electronic device is in a low-brightness scene and its flash operating mode is the auto-flash mode. The electronic device compares the acquired ambient brightness with the preset brightness threshold, determines that the ambient brightness is below the threshold, and therefore triggers the flash when photographing.
When taking a picture, the image acquisition process can be divided into a pre-flash state and a strong flash state. The pre-flash state is the time period after the electronic device receives the photographing instruction and before the flash fires. The strong flash state is the time period during which the flash emits light. The electronic device may capture one or more images in each state; if multiple images are captured in the pre-flash state or in the strong flash state, the multiple images of that state can be fused into a single image. Because the electronic device does not raise the scene brightness in the pre-flash state, the brightness of the image acquired there is low. In the strong flash state, the flash raises the scene brightness, so the acquired image is brighter than the one acquired in the pre-flash state. Images of different brightness reflect different detail information: a low-brightness image can clearly show brighter objects, while a brighter image can clearly show darker objects. As shown in the third row of images in fig. 4, fusing the image generated in the pre-flash state with the image generated in the strong flash state yields an image with higher definition.
Although acquisition and fusion across the strong flash state and the pre-flash state can effectively improve image quality, the large brightness difference between the two states makes the brightness and sharpness of the captured images differ greatly, which is unfavorable for accurately segmenting the objects in the images, including people and things. Moreover, there is a large time gap between the image acquired in the pre-flash state and the image acquired in the strong flash state; camera shake during this gap affects the stability of auto focus, auto white balance, and auto exposure, and hinders accurate and efficient registration and fusion, thereby degrading the shooting quality of the image.
Based on this, an embodiment of the present application provides a photographing method. The photographing method adjusts the control mode of the flash: during photographing, the pre-flash and strong flash states are replaced by a normally-on state, so that the electronic device acquires images with consistent scene brightness, which facilitates segmenting the people or objects in the scene. By adjusting the exposure value of the captured images, a first image is captured at the standard exposure value (i.e., the first exposure value), a second image at a second exposure value, and a third image at a third exposure value, where the first exposure value is smaller than the second exposure value and the second exposure value is smaller than the third exposure value. Because adjusting the exposure value makes the interval between image captures shorter than the interval between the pre-flash and strong flash captures, image shake is effectively reduced, the stability of auto focus, auto white balance, and auto exposure is improved, image registration and fusion are facilitated, and the photographing quality of the image is improved.
Fig. 5 is a schematic flow chart of an implementation of a photographing method provided in the embodiment of the present application, which is detailed as follows:
in S501, the electronic device triggers a photographing instruction to control the flash to be in a normally on state.
The electronic device in the embodiment of the application includes a smart phone, a tablet computer, a notebook computer or other electronic devices with cameras.
The photographing instruction triggered by the electronic device can be a key instruction, a touch instruction, a voice control instruction, or an instruction triggered by picture content. A key instruction may be an instruction triggered by a preset photographing function key while the camera application in the electronic device is running; for example, the user clicks a volume up or volume down key to trigger the camera application's photographing instruction. Alternatively, the key instruction may be sent by the photographing key of a connected photographing accessory; for example, a selfie stick may transmit its key instruction to the electronic device over a wired or wireless connection, causing the electronic device to trigger the photographing instruction. Or, when the camera application is not running, a shortcut key may quickly start the camera and trigger the photographing instruction, meeting the need for quick snapshots.
The touch instruction may be an instruction generated when a photographing key in a touch screen of the electronic device is triggered. For example, in the photographing interface of the electronic device shown in the left diagram of fig. 3, three touch buttons are included below the preview area, and when a touch instruction of the user within the range of the button is received, the photographing instruction is triggered.
The voice control instruction may be based on the user's voice content detected by a voice detection system while the camera application of the electronic device is running: if the detected voice content includes a preset photographing keyword, the photographing instruction is triggered. Alternatively, the user's voice can be detected in real time by the voice detection system, and if the detected content matches a preset photographing keyword, the camera application is started and the photographing instruction is triggered.
When the photographing instruction is triggered according to the picture content, the feature content which needs to trigger the photographing instruction can be selected according to the requirement. For example, when the preview screen includes a smiling face feature, the photographing instruction is automatically triggered. Without limitation, other feature content may also be included, including, for example, a particular face, a vehicle, etc.
After the electronic equipment triggers the photographing instruction, the flash lamp is controlled to be in a normally-on state.
The normally-on state means that the flash remains continuously lit during the image acquisition process. When image acquisition is completed, the flash may be turned off. The brightness of the flash in the normally-on state may be fixed, or it may be determined according to the ambient brightness.
Fig. 6 is a schematic view of an implementation flow of an electronic device triggering a photographing instruction to control a flash lamp to be in a normally on state according to an embodiment of the present application. As shown in fig. 6, the implementation process of triggering the photographing instruction by the electronic device and controlling the flash to be in the normally on state includes:
in S601, the electronic device determines that flash operation is required for photographing according to a preset flash mode.
The preset flash mode may be an automatic flash mode or a forced flash mode.
Each time a picture is taken, the electronic device acquires the ambient brightness, i.e. the brightness in the shooting scene, through the underlying hardware, and compares it with a preset brightness threshold. If the acquired brightness is lower than the preset brightness threshold, the comparison result is reported to the upper layer, and the upper layer determines, according to the reported conclusion, to trigger the flash during shooting. If the acquired brightness is not lower than the preset brightness threshold, the flash is not triggered during shooting.
When the preset mode is the forced flash mode, the electronic device does not need to compare the ambient brightness with the preset brightness threshold for each shot; the upper layer always determines to trigger the flash during shooting.
In S602, the electronic device triggers a photographing instruction.
When the electronic device triggers a photographing instruction, a camera application program of the electronic device may be in an operating state or a non-operating state. If the camera application program in the electronic equipment is in a non-running state, the camera application program can be started through the photographing instruction.
The photographing instruction can be triggered according to preset key operation, touch operation, sound or picture content.
In S603, the electronic device determines the operating current of the flash according to the ambient brightness.
S601, S602, and S603 need not be executed strictly in the order of their sequence numbers. For example, the photographing instruction may be triggered first, after which it is determined that the flash needs to operate and its operating current is determined; or it may first be determined that the flash needs to operate and its operating current determined, after which the photographing instruction is triggered.
When the electronic device determines the operating current of the flash according to the ambient brightness, a correspondence between brightness and operating current can be preset. When the ambient brightness is detected, the operating current corresponding to it is looked up according to this preset correspondence.
Compared with a fixed operating current, an operating current determined by the ambient brightness can adapt to the light-supplementing needs of different brightness levels, keeping the adjusted ambient brightness within a stable range. When a fixed operating current drives the flash, a fixed amount of brightness is added on top of the ambient brightness, so the adjusted brightness may deviate considerably.
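As an illustration of this lookup, the following Python sketch maps a measured ambient brightness to a flash drive current by interpolating a preset correspondence table. The lux breakpoints, current values, and function name are illustrative assumptions, not values from the embodiment.

```python
import numpy as np

# Hypothetical calibration table mapping ambient brightness (lux) to flash
# drive current (mA); the breakpoints and currents are illustrative only.
BRIGHTNESS_LUX = np.array([1.0, 5.0, 20.0, 50.0, 100.0])
CURRENT_MA = np.array([900.0, 700.0, 450.0, 250.0, 120.0])

def flash_current_for(ambient_lux: float) -> float:
    """Look up (by interpolation) the drive current for the measured
    ambient brightness: darker scenes get larger currents so the flash
    lifts the scene to a roughly constant target brightness."""
    return float(np.interp(ambient_lux, BRIGHTNESS_LUX, CURRENT_MA))

print(flash_current_for(3.0))   # dim scene -> large current
print(flash_current_for(80.0))  # brighter scene -> small current
```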
In S604, the flash is turned on according to the operating current, and a normally on state is maintained for a predetermined period of time.
The lower the ambient brightness, the larger the determined operating current; the higher the ambient brightness, the smaller the determined operating current. For example, in the schematic diagram of flash brightness under different ambient brightness shown in fig. 7, the ambient brightness in the left diagram is higher than in the middle and right diagrams, so the determined operating current is smaller and the flash driven by it is dimmer. The ambient brightness in the right diagram is lower than in the middle and left diagrams, so the determined operating current is larger and the flash driven by it is brighter.
After the flash is lit with the determined operating current, the brightness corresponding to that current is kept unchanged for a preset time period, i.e. the flash is kept in the normally-on state. In the normally-on state, the process may proceed to S502 to acquire images in the scene, obtaining multiple images with uniform scene brightness.
When image acquisition is finished, the upper layer of the electronic device sends an instruction to the bottom layer to end the normally-on state of the flash, so that the bottom layer turns the flash off according to the instruction and feeds back a close response. If the upper layer receives the response to the close instruction fed back by the bottom layer, for example status-bit information of a register, and determines that the flash is currently off, the photographing function of the camera application can be updated from the unavailable state to the available state. The photographing function remains unavailable from the moment the photographing instruction is triggered until the upper layer of the electronic device receives the response to the flash close instruction.
Alternatively, when the upper layer of the electronic device fails to receive a response from the bottom layer within a preset time period (which may be set to any value between 300 ms and 5 s), the flash may be turned off through the HAL (hardware abstraction layer), so that the flash exits the normally-on state. When the HAL turns off the flash, the photographing function of the camera application is updated from the unavailable state to the available state. In this case, the photographing function is unavailable from the moment the photographing instruction is triggered until the HAL of the electronic device turns off the flash.
While the photographing function is unavailable, the electronic device does not respond to trigger instructions, or may generate a no-response prompt, for example a prompt window.
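The close handshake and HAL fallback described above can be pictured with a small sketch. Here `read_ack` and `hal_force_off` are hypothetical callables standing in for the register status read and the HAL control path, and the timeout is just one choice from the 300 ms to 5 s range mentioned above.

```python
import time

FLASH_OFF_TIMEOUT_S = 1.0  # one choice from the 300 ms - 5 s range above

def wait_flash_off(read_ack, hal_force_off) -> None:
    """Wait for the bottom layer to acknowledge the flash-off instruction;
    fall back to the HAL if no acknowledgement arrives in time."""
    deadline = time.monotonic() + FLASH_OFF_TIMEOUT_S
    while time.monotonic() < deadline:
        if read_ack():      # e.g. a register status bit set by the driver
            return          # flash confirmed off; photographing re-enabled
        time.sleep(0.01)
    hal_force_off()         # timeout: exit the normally-on state via the HAL
```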
In S502, the electronic device captures images in the normally-on state, the captured images including a first image captured at a first exposure value, a second image captured at a second exposure value, and a third image captured at a third exposure value.
Wherein the exposure values in the embodiment of the present application include a first exposure value, a second exposure value, and a third exposure value. The exposure value (EV) is a base-2 logarithmic scale. The calculation formula can be expressed as:

$$EV = \log_2 \frac{N^2}{t}$$
where N denotes the aperture (f value), EV denotes the exposure value, and t denotes the exposure time (in seconds). Since the exposure value is calculated with the exposure time and the aperture, different combinations of apertures and exposure times can be selected for the same exposure value.
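A minimal sketch of this formula in Python: it computes the EV for an aperture/time pair, solves for the exposure time that realizes a target EV at a fixed aperture, and shows that different combinations can share one EV.

```python
import math

def exposure_value(aperture_n: float, exposure_time_s: float) -> float:
    """EV = log2(N^2 / t), per the formula above."""
    return math.log2(aperture_n ** 2 / exposure_time_s)

def exposure_time_for(ev: float, aperture_n: float) -> float:
    """Solve t = N^2 / 2^EV: the exposure time that realizes a target EV
    at a fixed aperture."""
    return aperture_n ** 2 / (2.0 ** ev)

# Different aperture/time combinations can land on (almost) the same EV:
print(exposure_value(2.0, 1 / 15))  # f/2,   1/15 s -> EV ~ 5.91
print(exposure_value(2.8, 1 / 8))   # f/2.8, 1/8 s  -> EV ~ 5.97

# Exposure amount scales as 2^(-EV): EV0 -> EV3 cuts it by 2^3 = 8x, and
# EV0 -> EV6 by 2^6 = 64x, matching the worked example later in this section.
print(2.0 ** 3, 2.0 ** 6)
```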
The first exposure value is the standard exposure value of the current scene. The standard exposure value is the exposure value set so that the image is normally exposed; the standard exposure value set for environments of different brightness therefore differs. In the embodiment of the application, the ambient brightness is kept within a certain range by adjusting the operating current. Therefore, when the flash is lit with an operating current that keeps the scene brightness consistent, the standard exposure value in the embodiment of the present application may be the same exposure value.
In a possible implementation, a correspondence table between the standard exposure value and the ambient brightness or ambient illuminance, or between the brightness of the captured image area and the standard exposure value, may be established in advance. The standard exposure value is then determined according to the ambient brightness or illuminance detected by the electronic device, or the brightness of the captured image area.
On the basis of the determined standard exposure value, i.e. the first exposure value, the present application also provides a second exposure value and a third exposure value, where the second exposure value is greater than the first and the third exposure value is greater than the second. As follows from the definition of the exposure value, the larger the exposure value, the lower the brightness of the image. As shown in the image capture and synthesis diagram of fig. 8, the brightness of the second image 82 captured at the second exposure value is less than that of the first image 81 captured at the first exposure value, and the brightness of the third image 83 acquired at the third exposure value is less than that of the second image 82.
In the embodiment of the present application, the first exposure value may be expressed as EV0, the second exposure value may be any value in the range EV1 to EV4, the third exposure value may be any value in the range EV4 to EV8, and the second exposure value is smaller than the third exposure value. For example, let the first exposure value be EV0 and the second exposure value be EV3. According to the definition of the exposure value,

$$\frac{N_2^2}{t_2} = 2^3 \cdot \frac{N_1^2}{t_1},$$

so the exposure amount of the first exposure value is $2^3 = 8$ times the exposure amount of the second exposure value. Similarly, when the first exposure value is EV0 and the third exposure value is EV6,

$$\frac{N_3^2}{t_3} = 2^6 \cdot \frac{N_1^2}{t_1},$$

i.e. the exposure amount of the first exposure value is $2^6 = 64$ times the exposure amount of the third exposure value. Here $N_1$, $N_2$, $N_3$ denote the apertures of the first, second, and third exposure values, and $t_1$, $t_2$, $t_3$ denote the corresponding exposure times.
When acquiring the first image, the second image, and the third image, the electronic device controls the flash to stay in the normally-on state with fixed brightness, so the ambient brightness information is essentially consistent throughout the acquisition. Therefore, the first, second, and third images with consistent ambient brightness can be acquired with stable camera parameters, including the auto exposure, auto white balance, and auto focus parameters, without frequently changing the acquisition parameters, which makes image acquisition more reliable.
The first, second, and third images with different exposure values are obtained by keeping the flash in a constant normally-on state and adjusting the exposure value at the software level. Compared with adjusting the drive current, adjusting the exposure value in software is more convenient and easier to implement, and reduces the shooting stability problems that drive-current adjustment brings.
The HDR image is generated from a fixed set of frames, namely the first image, the second image, and the third image. Existing HDR image generation methods need to decide the brightness of each frame from ambient brightness information, and this decision calculation is cumbersome; the fixed frame-output mode can therefore effectively improve decision efficiency and complete the photographing process more efficiently.
The first image in the embodiment of the present application is a normally exposed image, and therefore contains more image detail information. To improve the quality of the taken picture, the first image may consist of multiple images; applying denoising and image quality optimization to these images makes it easier to obtain a clearer fused image.
In S503, the electronic device generates an image to be output according to the first image, the second image, and the third image.
As shown in fig. 8, the first image 81 is acquired at the first exposure value. Since the first exposure value is the standard exposure value, the sharpness of most of the objects or figures in the first image 81 is higher than in the second image 82 and the third image 83. As shown in fig. 9, in the first image acquired at the first exposure value, the portrait information is relatively clear, but the solid color area above the portrait is relatively blurred due to overexposure.
In a possible implementation, the first image may comprise a plurality of images, for example 5 to 8 frames. The multiple images acquired at the first exposure value can be denoised and fused into a standard exposure image with higher definition.
The second image 82 is acquired at the second exposure value. Since the second exposure value is larger than the first, the definition of the portrait is somewhat reduced in the second image, as shown in fig. 10, but the definition of the solid color area above the portrait is somewhat improved.
The exposure amount of the third exposure value is much smaller than that of the first exposure value. Therefore, in the third image 83 acquired at the third exposure value, such as the one shown in fig. 11, the solid color area can be captured more clearly, but the portrait content is very blurred.
The region that is sharp in the first image is very blurred in the third image, and the region that is sharp in the third image is very blurred in the first image; therefore, the first image and the third image cannot be accurately registered directly. However, the portrait area in the second image is clearer than that in the third image, and the solid color area in the second image is clearer than that in the first image. Thus, the first image may be registered with the second image based on the portrait area, and the second image with the third image based on the solid color area. The registration result of the first image and the third image is then obtained from the registration result of the first and second images and the registration result of the second and third images. An image 84 with better definition is generated for output from the registered first and third images.
In the embodiment of the present application, the portrait area and the solid color area are merely examples; the areas used for image registration are not limited to them. In a possible implementation, the portrait area may be any other normally exposed image area, and the solid color area may be replaced with any high-brightness area in the first image, such as a light source area.
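The chained registration can be sketched as follows. The patent does not specify a registration algorithm, so this sketch assumes an OpenCV feature-based homography (ORB plus RANSAC) and composes the first-to-second and second-to-third warps to register the first image to the third; cropping each input to the portrait or highlight region before matching is indicated in the comments.

```python
import cv2
import numpy as np

def estimate_homography(src_gray: np.ndarray, dst_gray: np.ndarray) -> np.ndarray:
    """Feature-based homography between two grayscale images (ORB + RANSAC).
    In the scheme above, src/dst would first be cropped (or masked) to the
    region that is sharp in both frames, e.g. the portrait or highlight area."""
    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(src_gray, None)
    k2, d2 = orb.detectAndCompute(dst_gray, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
    src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    h, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return h

# first<->second registered on the well-exposed region, second<->third on the
# highlight region; composing the two warps registers first to third directly:
# h12 = estimate_homography(first_gray, second_gray)
# h23 = estimate_homography(second_gray, third_gray)
# h13 = h23 @ h12   # registration of the first image to the third
```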
In addition, in the embodiment of the present application, an HDR image with higher definition can be synthesized based on the first, second, and third images. Compared with current HDR synthesis methods, this reduces the decision calculation needed to choose the photos for synthesis. For example, existing methods determine from scene brightness information whether overexposed and underexposed images need to be acquired, and then decide which images to capture according to the calculation result. The method of the present application directly acquires the required first, second, and third images and obtains a clear HDR image, which effectively reduces decision calculation and improves image generation efficiency.
Fig. 13 is a schematic flow chart of processing an acquired image in the RAW domain according to an embodiment of the present application. As shown in fig. 13, the shooting flow includes:
1301, triggering a flash lamp normally-on mode.
The camera application can respond to the photographing function or the video shooting function according to a preset trigger instruction, and the flash is kept in the normally-on state while the photographing or video shooting function is carried out.
The photographing method in the embodiment of the application is mainly intended for night scene shooting, indoor shooting, or other low-brightness scenes. Therefore, the normally-on flash mode can be triggered when the user selects the night scene shooting mode, or when the electronic device detects that the current scene brightness is low and automatically switches to the night mode.
In the normally-on mode, the electronic device determines the operating current of the flash according to the ambient brightness, so the operating current is tied to the scene brightness: when the scene is bright the current is small, and when the scene is dark the current is large. By using an operating current matched to the ambient or scene brightness, the ambient brightness remains stable while the flash is normally on.
1302, multiple frames of RAW images of different EVs are acquired.
When the flash is in the normally-on state, the ambient brightness remains stable. Acquiring images under stable ambient brightness makes the acquisition parameters, such as the auto exposure, auto white balance, and auto focus parameters, more stable, yielding images of better quality.
Moreover, images of different exposure levels are obtained by adjusting the exposure value under the same driving current. Only the exposure parameters need to be modified at the software level, which avoids adjusting the driving current, improves convenience, and reduces the stability problems caused by current adjustment.
The HDR image is generated from a fixed set of frames, namely the first image, the second image, and the third image. Existing HDR generation methods need to decide frame brightness from ambient brightness information, which makes the decision calculation cumbersome; the fixed frame-output mode therefore improves decision efficiency and completes the photographing process more efficiently.
1303, image registration.
Among the first, second, and third images acquired in the embodiment of the application, the first image is a normally exposed image and therefore carries the most detail. Two or more frames can be collected as the first image, making it easy to obtain a clear picture through denoising and fusion.
The highlight areas of the scene are blurred in the first image, while more detail is retained in the rest of the first image.
The third image is the image with the highest exposure value, namely the third image has the lowest exposure. In the third image, the high brightness area in the scene can be displayed more clearly, but the low brightness area in the scene is blurred.
The overexposed area of the scene is sharper in the second image than in the first image, and the low-brightness area is sharper in the second image than in the third image. Thus, the first image may be registered with the second image, and the second image with the third image. The registration of the first image and the third image is then obtained from the registration result of the first and second images and the registration result of the second and third images. In this way, the clear rendering of the low-brightness areas of the scene can be effectively matched with the clear rendering of the high-brightness areas.
1304, lens shading correction/black level correction.
Lens Shading Correction (LSC) is performed to solve the problem that the image darkens around the edges of the lens due to non-uniform optical refraction. Lens shading correction methods include the concentric circle method, the mesh method, and the like.
The black level, i.e. the lowest level value of black data, generally refers to the sensor signal level corresponding to sensed image data of 0. Current Black Level Correction (BLC) schemes include subtracting a fixed value, and correcting with a drift curve of the black level over temperature and gain.
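A minimal sketch of these two corrections on a single-channel RAW frame: black level correction by subtracting a fixed value, and a concentric-circle lens shading correction that raises the gain with distance from the optical center. The fixed black level, the quadratic falloff model, and `max_gain` are assumptions for illustration.

```python
import numpy as np

def black_level_correct(raw: np.ndarray, black_level: int = 64) -> np.ndarray:
    """Subtract a fixed black level (the simpler of the two BLC schemes)."""
    return np.clip(raw.astype(np.int32) - black_level, 0, None).astype(np.uint16)

def lens_shading_correct(raw: np.ndarray, max_gain: float = 2.0) -> np.ndarray:
    """Concentric-circle LSC: apply a gain that grows with the distance
    from the optical center (quadratic falloff model assumed here)."""
    h, w = raw.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h / 2, xx - w / 2) / np.hypot(h / 2, w / 2)  # 0 center, 1 corner
    gain = 1.0 + (max_gain - 1.0) * r ** 2
    return np.clip(raw * gain, 0, 65535).astype(np.uint16)

raw = np.random.randint(64, 1024, (480, 640), dtype=np.uint16)  # synthetic frame
corrected = lens_shading_correct(black_level_correct(raw))
```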
1305, network model processing.
In the embodiment of the present application, the network model may be a convolutional neural network model, a U-Net model, or the like. Before the network model is used for optimization, training images can be fed into it in advance, and the change in parameters such as definition, signal-to-noise ratio, and sharpness between the model's output and input is assessed. By continuously adjusting the parameters in the model, the output image attains higher definition, a higher signal-to-noise ratio, and better sharpness than the input. After the parameter optimization training is completed, the first, second, and third images can be optimized by the trained network model to obtain output images with higher definition, higher signal-to-noise ratio, and better sharpness.
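The training loop described here can be sketched with a deliberately small PyTorch model; the residual architecture, layer sizes, and L1 loss are illustrative assumptions rather than the network actually used in the embodiment.

```python
import torch
import torch.nn as nn

class TinyDenoiser(nn.Module):
    """A deliberately small stand-in for the convolutional/U-Net model in
    the text: it predicts a residual that is subtracted from the input."""
    def __init__(self, channels: int = 1):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, channels, 3, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x - self.body(x)  # residual denoising

# One training step: shrink the gap between the model output and a clean
# reference, i.e. the "keep adjusting the parameters" loop described above.
model = TinyDenoiser()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
noisy, clean = torch.rand(4, 1, 64, 64), torch.rand(4, 1, 64, 64)
loss = nn.functional.l1_loss(model(noisy), clean)
opt.zero_grad(); loss.backward(); opt.step()
```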
1306, automatic white balancing.
To adjust the color of objects under different illuminations to the color perceived by human eyes, the image needs white balance processing. Automatic white balance methods typically include the gray world method, the white patch method, the light source parameter voting method, and the tone scale correction method.
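Of these, the gray world method is the simplest to show. The sketch below scales each channel so the channel means match, under the gray-world assumption that the scene averages to neutral gray.

```python
import numpy as np

def gray_world_awb(rgb: np.ndarray) -> np.ndarray:
    """Gray world white balance: scale each channel so the channel means
    become equal, assuming the scene averages to neutral gray."""
    means = rgb.reshape(-1, 3).mean(axis=0) + 1e-6
    gains = means.mean() / means
    return np.clip(rgb * gains, 0, 255).astype(np.uint8)

img = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
balanced = gray_world_awb(img.astype(np.float32))
```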
1307, image fusion.
After the acquired first, second, and third images undergo optimization such as denoising, the optimized images are fused directly in the RAW domain to obtain a fused image. Because the RAW data retains a large amount of image detail, an image optimized and fused in the RAW domain has richer detail than one optimized and fused in YUV, making it easier to obtain an image of higher quality.
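A simplified stand-in for this RAW-domain fusion: aligned, normalized frames are blended with per-pixel weights that favor well-exposed (near mid-gray) pixels. The Gaussian weighting and `sigma` value are assumptions; the embodiment does not specify its fusion weights.

```python
import numpy as np

def fuse_exposures(images, sigma: float = 0.2) -> np.ndarray:
    """Blend aligned, [0, 1]-normalized frames with per-pixel weights that
    favor well-exposed (near mid-gray) pixels, so each region of the output
    is dominated by the frame that exposed it best."""
    stack = np.stack([im.astype(np.float64) for im in images])  # (n, h, w)
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True) + 1e-12
    return (weights * stack).sum(axis=0)

# first/second/third images, already registered to one another:
frames = [np.random.rand(480, 640) for _ in range(3)]
fused = fuse_exposures(frames)
```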
1308, nonlinear superposition & ghosting correction.
In the embodiment of the present application, the fused image may undergo nonlinear superposition processing. For example, when the fused image is an HDR image, an LDR image is generated through nonlinear mapping and superposition, so that an LDR screen can effectively display the generated image.
Due to registration problems or object motion in the scene, the fused image may contain ghosts. A ghost is a double image of the same object in the picture, also called an artifact. The ghost areas contained in the fused image can be determined through ghost detection, and corrected by replacing each ghost area of the fused image with the corresponding area of one of the pre-fusion images. Without being limited to this, ghost correction may also be achieved by improving registration accuracy, among other means.
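The detect-and-replace correction described above can be sketched as follows; the difference threshold is an assumed value, and both inputs are taken to be aligned, normalized, single-channel frames.

```python
import numpy as np

def deghost(fused: np.ndarray, reference: np.ndarray, thresh: float = 0.15) -> np.ndarray:
    """Detect pixels where the fused result disagrees strongly with a chosen
    pre-fusion reference frame, and fall back to the reference there."""
    ghost_mask = np.abs(fused - reference) > thresh  # simple ghost detection
    out = fused.copy()
    out[ghost_mask] = reference[ghost_mask]          # replace ghosted areas
    return out
```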
1309, RAW image output.
After the above processing, compared with optimizing and fusing in the YUV domain, this image processing flow performs optimization and fusion in the detail-rich RAW domain and obtains a RAW output image that is clear and rich in detail.
Fig. 14 is a schematic diagram of image signal processing of a night view image according to an embodiment of the present application, and as shown in fig. 14, the process includes:
1401, the sensor acquires a plurality of frames of RAW images.
When the sensor acquires multiple frames of RAW images, multiple frames including the first, second, and third images may be acquired while the flash is in the normally-on state, based on S502 in fig. 5. The exposure values of the first, second, and third images can be controlled by software-level parameters. The exposure value of the first image is the standard exposure value, so the first image is a standard exposure image. The second image is acquired at a second exposure value larger than the first exposure value, and the third image at a third exposure value larger than the second exposure value.
1402, removing dead spots and PD spots.
Due to the CMOS process and cost constraints, sensors typically contain dead pixels (for example, around 200 per sensor may be tolerated, provided the dead pixels are not adjacent), and these dead pixels need to be removed in the ISP process.
The dead pixel removing method may include an averaging method, a linkage method, a debugging method, or the like.
1403, lens shading correction.
Lens Shading Correction (LSC) is performed to solve the problem that the image darkens around the edges of the lens due to non-uniform optical refraction. Lens shading correction methods include the concentric circle method, the mesh method, and the like.
1404, processing the night scene in the RAW domain.
The method for processing the night scenes in the RAW domain can comprise the steps of 1303, 1304, 1305, 1306, 1307, 1308 and the like shown in fig. 13.
1405, demosaicing/RAW2RGB.
Demosaicing is used to convert Bayer array (RAW) format images into RGB images. Demosaicing is a digital image process that reconstructs a full-color image from the incomplete color samples output by an image sensor covered with a Color Filter Array (CFA); it is also known as CFA interpolation or color reconstruction. The reconstructed image is usually accurate in uniformly colored regions but loses resolution (detail and sharpness) and exhibits edge artifacts.
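For illustration, OpenCV exposes this CFA interpolation directly; the Bayer pattern below (BG) is an assumption and must match the actual sensor layout.

```python
import cv2
import numpy as np

# Synthetic 10-bit Bayer mosaic, scaled to 8 bits for the demosaicer; the
# BG pattern is an assumption and must match the actual sensor layout.
bayer = np.random.randint(0, 1024, (480, 640), dtype=np.uint16)
bayer8 = (bayer >> 2).astype(np.uint8)
rgb = cv2.cvtColor(bayer8, cv2.COLOR_BayerBG2RGB)  # CFA interpolation to RGB
```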
1406, color correction.
Since the color matching characteristics of a camera generally do not satisfy the Luther condition, i.e. the RGB responses of the sensor are generally not linearly related to those of the CIE (Commission Internationale de l'Eclairage) standard observer, the camera's characteristics need to be corrected, through color correction, to approximate the standard observer. Color correction of an image is typically accomplished using a color correction matrix.
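Applying a color correction matrix is a per-pixel matrix multiply, as in the sketch below. The matrix entries are illustrative (each row sums to 1 so white is preserved); real CCMs come from per-module calibration against a color chart.

```python
import numpy as np

# Illustrative 3x3 color correction matrix; each row sums to 1 so that
# neutral (white) pixels are preserved.
CCM = np.array([[ 1.60, -0.40, -0.20],
                [-0.30,  1.50, -0.20],
                [-0.10, -0.50,  1.60]])

def color_correct(rgb: np.ndarray) -> np.ndarray:
    """Apply the CCM to every pixel: out = CCM @ [R, G, B]^T."""
    flat = rgb.reshape(-1, 3).astype(np.float64)
    out = flat @ CCM.T
    return np.clip(out, 0, 255).reshape(rgb.shape).astype(np.uint8)
```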
1407, tone mapping.
Since the luminance range of the synthesized image is large, the high-dynamic-range image must be tone-mapped into a low-dynamic-range image so that it can be displayed normally on an ordinary LDR display. Tone mapping may be global or local.
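As one concrete example of a global operator, the sketch below applies Reinhard-style tone mapping to an HDR luminance plane: scale by the log-average luminance, then compress with L/(1+L). The `key` value is a conventional choice, not one taken from the patent.

```python
import numpy as np

def reinhard_global(hdr_luma: np.ndarray, key: float = 0.18) -> np.ndarray:
    """Global Reinhard operator: scale by the log-average luminance, then
    compress with L / (1 + L), mapping [0, inf) into [0, 1)."""
    log_avg = np.exp(np.mean(np.log(hdr_luma + 1e-6)))
    scaled = key * hdr_luma / log_avg
    return scaled / (1.0 + scaled)

hdr = np.random.rand(480, 640) * 100.0  # synthetic high-dynamic-range luminance
ldr = reinhard_global(hdr)
```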
1408, RGB2YUV.
After tone mapping is completed, the RGB image may be converted into an image in YUV format, or the image in RGB format may be directly output.
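A sketch of this conversion using the BT.601 full-range matrix, a common choice in camera pipelines; the patent does not specify which RGB-to-YUV matrix is used.

```python
import numpy as np

# BT.601 full-range RGB -> YUV (YCbCr) matrix.
RGB2YUV = np.array([[ 0.299,  0.587,  0.114],
                    [-0.169, -0.331,  0.500],
                    [ 0.500, -0.419, -0.081]])

def rgb_to_yuv(rgb: np.ndarray) -> np.ndarray:
    """Per-pixel matrix transform; chroma channels are re-centered on 128."""
    yuv = rgb.reshape(-1, 3).astype(np.float64) @ RGB2YUV.T
    yuv[:, 1:] += 128.0
    return np.clip(yuv, 0, 255).reshape(rgb.shape).astype(np.uint8)
```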
1409, YUV processing.
In the YUV domain, optimization processing such as noise reduction can be further carried out on the image, and the image quality is further improved.
Throughout the image signal processing of the night scene image, this photographing scheme performs image optimization and fusion in the RAW domain, which preserves more image detail, improves the definition of the output image, and improves its quality. Notably, this image processing method can be used not only in photo scenes to generate pictures of higher definition and more stable quality, but also in video scenes to generate videos of higher definition and more stable quality.
The electronic device provided by the embodiment of the present application may include a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the processor implements the method according to any one of the above method embodiments.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps that can be implemented in the above method embodiments.
An embodiment of the present application provides a computer program product which, when run on an electronic device, enables the electronic device to implement the steps in the above method embodiments.
Embodiments of the present application further provide a chip system, where the chip system includes a processor, the processor is coupled with a memory, and the processor executes a computer program stored in the memory to implement the methods according to the above method embodiments. The chip system can be a single chip or a chip module consisting of a plurality of chips.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment. It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application. Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance. Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise.
Finally, it should be noted that: the above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (12)

1. A method of taking a picture, the method comprising:
the electronic equipment triggers a photographing instruction to control the flash lamp to be in a normally-on state;
the electronic equipment collects images in the normally-on state, the collected images comprise a first image collected under a first exposure value, a second image collected under a second exposure value and a third image collected under a third exposure value, the first exposure value is smaller than the second exposure value, the second exposure value is smaller than the third exposure value, and the first exposure value is a standard exposure value of the current scene;
and the electronic equipment generates an image needing to be output according to the first image, the second image and the third image.
2. The method of claim 1, wherein the electronic device controls the flash to be in a normally on state, comprising:
the electronic equipment acquires the brightness of a photographing scene;
according to the preset corresponding relation between the brightness and the working current, the electronic equipment determines the working current corresponding to the brightness of the scene;
and the electronic equipment drives the flash lamp to be in a normally-on state according to the working current.
3. The method of claim 1, wherein the electronic device acquiring the image in the normally on state comprises:
the electronic equipment determines a first exposure value according to the brightness of the current scene;
according to a preset exposure proportional relation, the electronic equipment determines the second exposure value and the third exposure value;
the electronic device collects images according to the determined first exposure value, second exposure value and third exposure value, respectively.
4. The method of claim 3, wherein the electronic device separately captures images based on the determined first exposure value, second exposure value, and third exposure value, comprising:
the electronic equipment determines the aperture size and/or the exposure time corresponding to the first exposure value, the second exposure value and the third exposure value respectively;
and the electronic equipment respectively acquires images according to the determined aperture size and/or exposure time.
5. The method of claim 1, wherein generating, by an electronic device, an image to be output from the first image, the second image, and the third image comprises:
the electronic equipment performs optimization and fusion processing according to the first image in the RAW format, the second image in the RAW format and the third image in the RAW format to obtain a first output image in the RAW format;
and the electronic equipment performs color space transformation on the first output image to obtain an image needing to be output.
6. The method according to claim 5, wherein the electronic device performs optimization and fusion processing on the first image in RAW format, the second image in RAW format, and the third image in RAW format to obtain a first output image in RAW format, and the method comprises:
and carrying out dead pixel correction, lens shading correction, black level correction, RAW domain noise reduction, white balance gain and image fusion processing on the first image, the second image and the third image to obtain a first output image in a RAW format.
7. The method according to claim 6, wherein when the first image, the second image and the third image are subjected to RAW domain noise reduction processing, the first image, the second image and the third image are subjected to noise reduction processing by a network model.
8. The method of claim 5, wherein the electronic device performs color space conversion on the first output image to obtain an image to be output, comprising:
the electronic equipment demosaicing processing is carried out on the first output image to obtain an RGB image;
and carrying out color correction and global color mapping processing on the RGB image, carrying out color space transformation on the processed image, and carrying out YUV domain processing on the transformed image to obtain an image to be output.
9. The method of claim 1, wherein after the electronic device acquires the image in the normally on state, the method further comprises:
the upper layer of the electronic equipment sends a closing instruction of a normally-on state of a flash lamp to the bottom layer of the electronic equipment;
and when the upper layer of the electronic equipment receives the response of turning off the flash lamp, the photographing function of the electronic equipment is updated to be in an available state.
10. The method of claim 9, wherein after the upper layer of the electronic device sends a flash turn off command to the bottom layer of the electronic device, the method further comprises:
when the upper layer of the electronic equipment does not receive the response of turning off the flash lamp within the preset first time, the HAL of the electronic equipment turns off the flash lamp.
11. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the method of any of claims 1 to 10 when executing the computer program.
12. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-10.
CN202110927005.9A 2021-08-12 2021-08-12 Photographing method, electronic device, and computer-readable storage medium Active CN114095666B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110927005.9A CN114095666B (en) 2021-08-12 2021-08-12 Photographing method, electronic device, and computer-readable storage medium
PCT/CN2022/091901 WO2023015991A1 (en) 2021-08-12 2022-05-10 Photography method, electronic device, and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110927005.9A CN114095666B (en) 2021-08-12 2021-08-12 Photographing method, electronic device, and computer-readable storage medium

Publications (2)

Publication Number Publication Date
CN114095666A true CN114095666A (en) 2022-02-25
CN114095666B CN114095666B (en) 2023-09-22

Family

ID=80296148

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110927005.9A Active CN114095666B (en) 2021-08-12 2021-08-12 Photographing method, electronic device, and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN114095666B (en)
WO (1) WO2023015991A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023015991A1 (en) * 2021-08-12 2023-02-16 荣耀终端有限公司 Photography method, electronic device, and computer readable storage medium
CN116074634A (en) * 2022-05-27 2023-05-05 荣耀终端有限公司 Exposure parameter determination method and device
CN117408927A (en) * 2023-12-12 2024-01-16 荣耀终端有限公司 Image processing method, device and storage medium

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117689559A (en) * 2023-08-07 2024-03-12 上海荣耀智慧科技开发有限公司 Image fusion method and device, electronic equipment and storage medium
CN117499789B (en) * 2023-12-25 2024-05-17 荣耀终端有限公司 Shooting method and related device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3967510B2 (en) * 1999-12-28 2007-08-29 富士フイルム株式会社 Digital camera
JP2006197243A (en) * 2005-01-13 2006-07-27 Canon Inc Imaging apparatus and method, program, and storage medium
JP7306269B2 (en) * 2017-10-24 2023-07-11 ソニーグループ株式会社 Control device, control method and program
CN109729279B (en) * 2018-12-20 2020-11-17 华为技术有限公司 Image shooting method and terminal equipment
CN109862282B * 2019-02-18 2021-04-30 Oppo广东移动通信有限公司 Method and apparatus for processing portrait images
CN110198417A (en) * 2019-06-28 2019-09-03 Oppo广东移动通信有限公司 Image processing method, device, storage medium and electronic equipment
CN114095666B (en) * 2021-08-12 2023-09-22 荣耀终端有限公司 Photographing method, electronic device, and computer-readable storage medium

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002223387A * 2001-01-26 2002-08-09 Olympus Optical Co Ltd Imaging unit
US20100245609A1 * 2009-03-26 2010-09-30 Texas Instruments Incorporated Digital Image Segmentation Using Flash
CN103957363A * 2014-05-16 2014-07-30 深圳市中兴移动通信有限公司 Flash photographing method and photographing apparatus
US20160352994A1 * 2015-05-28 2016-12-01 Blackberry Limited Camera having hdr during pre-flash
CN105163047A * 2015-09-15 2015-12-16 厦门美图之家科技有限公司 HDR (High Dynamic Range) image generation method and system based on color space conversion, and shooting terminal
CN107888842A * 2017-12-28 2018-04-06 上海传英信息技术有限公司 Flash control method and control system based on an intelligent terminal
WO2019183813A1 * 2018-03-27 2019-10-03 华为技术有限公司 Image capture method and device
CN108717691A * 2018-06-06 2018-10-30 成都西纬科技有限公司 Image fusion method and apparatus, electronic device, and medium
CN109194873A * 2018-10-29 2019-01-11 浙江大华技术股份有限公司 Image processing method and apparatus
CN109788207A * 2019-01-30 2019-05-21 Oppo广东移动通信有限公司 Image synthesis method and apparatus, electronic device, and readable storage medium
CN110198419A * 2019-06-28 2019-09-03 Oppo广东移动通信有限公司 Image processing method and apparatus, storage medium, and electronic device
KR20210018121A * 2019-08-06 2021-02-17 삼성전자주식회사 Device and method for performing local histogram matching with global regularization and motion exclusion
WO2021082580A1 * 2019-10-31 2021-05-06 北京迈格威科技有限公司 Night scene high dynamic range image generation method, device, and electronic apparatus
US10911691B1 * 2019-11-19 2021-02-02 Samsung Electronics Co., Ltd. System and method for dynamic selection of reference image frame
CN111064898A * 2019-12-02 2020-04-24 联想(北京)有限公司 Image shooting method and apparatus, device, and storage medium
CN113038027A * 2021-03-05 2021-06-25 上海商汤临港智能科技有限公司 Exposure control method, apparatus, device, and storage medium

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
华顺刚; 王丽丹; 欧宗瑛: "Registration of multi-exposure images of the same scene and HDR image synthesis", Journal of Computer-Aided Design & Computer Graphics, no. 04 *
李卫中; 易本顺; 邱康; 彭红: "Detail-preserving multi-exposure image fusion", Optics and Precision Engineering, no. 09 *
李英杰; 张俊举; 常本康; 钱芸生; 刘磊: "A joint registration and fusion method for multi-band infrared images", Journal of Electronics & Information Technology, no. 01 *
都琳; 孙华燕; 王帅; 高宇轩; 齐莹莹: "Research on high dynamic range image fusion algorithms for dynamic targets", Acta Optica Sinica, no. 04 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023015991A1 (en) * 2021-08-12 2023-02-16 荣耀终端有限公司 Photography method, electronic device, and computer readable storage medium
CN116074634A (en) * 2022-05-27 2023-05-05 荣耀终端有限公司 Exposure parameter determination method and device
CN116074634B (en) * 2022-05-27 2023-11-14 荣耀终端有限公司 Exposure parameter determination method and device
CN117408927A (en) * 2023-12-12 2024-01-16 荣耀终端有限公司 Image processing method, device and storage medium

Also Published As

Publication number Publication date
WO2023015991A1 (en) 2023-02-16
CN114095666B (en) 2023-09-22

Similar Documents

Publication Publication Date Title
CN113132620B (en) Image shooting method and related device
CN112532857B (en) Shooting method and equipment for delayed photography
CN112532859B (en) Video acquisition method and electronic equipment
CN109951633B (en) Method for shooting moon and electronic equipment
CN114095666B (en) Photographing method, electronic device, and computer-readable storage medium
CN111327814A (en) Image processing method and electronic equipment
CN113475057B (en) Video frame rate control method and related device
CN112532892B (en) Image processing method and electronic device
CN113810601B (en) Terminal image processing method and device and terminal equipment
CN112492193B (en) Method and equipment for processing callback stream
CN113810603B (en) Point light source image detection method and electronic equipment
CN113542580B (en) Method and device for removing light spots of glasses and electronic equipment
CN111770282B (en) Image processing method and device, computer readable medium and terminal equipment
WO2023273323A9 (en) Focusing method and electronic device
CN113891009B (en) Exposure adjusting method and related equipment
CN113572948B (en) Video processing method and video processing device
CN113837984A (en) Playback abnormality detection method, electronic device, and computer-readable storage medium
CN115967851A (en) Quick photographing method, electronic device and computer readable storage medium
CN114863494A (en) Screen brightness adjusting method and device and terminal equipment
CN115705241A (en) Application scheduling method and electronic equipment
CN113542574A (en) Shooting preview method under zooming, terminal, storage medium and electronic equipment
CN113923372B (en) Exposure adjusting method and related equipment
CN115706869A (en) Terminal image processing method and device and terminal equipment
CN114070916A (en) Light supplementing method for shooting and related device
US20240236504A9 (en) Point light source image detection method and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant