CN114095666B - Photographing method, electronic device, and computer-readable storage medium - Google Patents

Photographing method, electronic device, and computer-readable storage medium

Info

Publication number
CN114095666B
CN114095666B (application CN202110927005.9A)
Authority
CN
China
Prior art keywords: image, exposure value, electronic device, exposure, electronic equipment
Legal status
Active
Application number
CN202110927005.9A
Other languages
Chinese (zh)
Other versions
CN114095666A
Inventor
陈珂
商亚洲
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Application filed by Honor Device Co Ltd
Priority to CN202110927005.9A
Publication of CN114095666A
Priority to PCT/CN2022/091901 (WO2023015991A1)
Application granted
Publication of CN114095666B

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; control thereof
    • H04N 23/70: Circuitry for compensating brightness variation in the scene
    • H04N 23/74: Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • H04N 23/72: Combination of two or more compensation controls
    • H04N 23/741: Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • H04N 23/95: Computational photography systems, e.g. light-field imaging systems
    • H04N 23/951: Computational photography systems by using two or more images to influence resolution, frame rate or aspect ratio

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The application belongs to the field of image processing, and provides a photographing method, an electronic device, and a computer-readable storage medium. The method comprises the following steps: the electronic device, triggered by a photographing instruction, controls the flash to enter a normally-on state; the electronic device collects images in the normally-on state, the collected images including a first image collected at a first exposure value, a second image collected at a second exposure value, and a third image collected at a third exposure value, where the first exposure value is smaller than the second exposure value, the second exposure value is smaller than the third exposure value, and the first exposure value is the standard exposure value of the current scene; the electronic device then generates an image to be output from the first, second, and third images. Because the flash is held in a normally-on state, all images are acquired under the same ambient brightness, which shortens the time gap between captures, reduces the probability of shake during acquisition, and keeps the capture parameters stable, all of which helps fuse the frames into a clearer image.

Description

Photographing method, electronic device, and computer-readable storage medium
Technical Field
The present application relates to the field of image processing, and in particular, to a photographing method, an electronic device, and a computer-readable storage medium.
Background
When photographing in a dark environment, a camera generally turns on the flash to raise the ambient brightness so that a clearer image can be obtained. For example, in automatic flash mode, if the camera detects that the ambient brightness meets the condition for using the flash, a photographing instruction is received, and the current scene is detected to be a high-dynamic-range scene, the camera collects images with the flash in a pre-flash state and in a strong-flash state respectively, and synthesizes a photo from the collected images.
Because the brightness and sharpness of the images collected in the pre-flash state differ greatly from those collected in the strong-flash state, people or objects in the collected images are easily segmented inaccurately. In addition, a time gap exists between the pre-flash state and the strong-flash state, so image shake can occur and the stability of 3A (auto focus, auto white balance, and auto exposure) is affected, which hinders image registration and fusion and degrades the quality of the captured photo.
Disclosure of Invention
The embodiments of the application provide a photographing method and an electronic device, to address the prior-art problems that, when photographing, subjects in the acquired images are easily segmented inaccurately, and that picture shake and 3A instability can degrade the quality of the captured photo.
In a first aspect, an embodiment of the application provides a photographing method, including: the electronic device, triggered by a photographing instruction, controls the flash to be in a normally-on state; the electronic device collects images in the normally-on state, the collected images including a first image collected at a first exposure value, a second image collected at a second exposure value, and a third image collected at a third exposure value, where the first exposure value is smaller than the second exposure value, the second exposure value is smaller than the third exposure value, and the first exposure value is the standard exposure value of the current scene; and the electronic device generates an image to be output from the first image, the second image, and the third image.
The first image, acquired at the first exposure value, is a normally exposed image. A larger exposure value here means less exposure: since the second exposure value is larger than the first, the second image receives less exposure than the first image, and likewise the third image receives less exposure than the second. Highlight areas, including for example uniformly bright areas or light-source areas, may therefore be rendered more clearly in the third image, while normally exposed areas are rendered most clearly in the first image. The second image can be registered with the first image and with the third image, so the first and third images can be registered to each other via the second image; the images can thus be effectively registered and fused to obtain a clear output image.
It can be seen that the electronic device holds the flash at a constant, fixed brightness while the first, second, and third images are acquired, so the ambient brightness information is essentially unchanged throughout the capture. The three images can therefore be acquired with stable camera parameters, including the auto exposure, auto white balance, and auto focus parameters, without frequent changes to the capture parameters, making image acquisition more reliable.
Moreover, with the flash held in a constant normally-on state, the first, second, and third images with different exposure values are obtained by adjusting the exposure value in software. Compared with adjusting the flash drive current, adjusting the exposure value in software is more convenient, easier to implement, and avoids the photographing-stability problems that drive-current adjustment can introduce.
The embodiment of the application generates the HDR image from a fixed set of frames, namely the first, second, and third images. Existing HDR generation methods must decide the brightness of each frame from ambient brightness information, which makes the decision computation cumbersome; fixing the frames removes that decision, effectively improving decision efficiency and completing the photographing process more efficiently.
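The patent describes behavior at the vendor level, but the same flow can be sketched against Android's public Camera2 API, assuming torch mode stands in for the normally-on state: switch auto exposure off so the 3A state stays fixed, hold FLASH_MODE_TORCH, and issue a three-frame burst whose frames differ only in software exposure settings. This is a minimal sketch under those assumptions; the EV0/EV2/EV6 ladder is the example used later in the text, and all API names are real Camera2 symbols.

```kotlin
import android.hardware.camera2.CameraCaptureSession
import android.hardware.camera2.CameraDevice
import android.hardware.camera2.CameraMetadata
import android.hardware.camera2.CaptureRequest
import android.os.Handler
import android.view.Surface

// Capture the EV0/EV2/EV6 frames while the flash stays in torch (normally-on)
// mode. Exposure is varied purely in software (sensor exposure time), so all
// three frames see the same ambient brightness and 3A never has to re-converge.
fun captureBracketedBurst(
    camera: CameraDevice,
    session: CameraCaptureSession,
    target: Surface,
    baseExposureNs: Long,  // standard (EV0) exposure time metered for the scene
    baseIso: Int,          // ISO metered for the scene
    handler: Handler
) {
    val evSteps = listOf(0, 2, 6)  // EV0, EV2 (1/4 exposure), EV6 (1/64 exposure)
    val requests = evSteps.map { ev ->
        camera.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE).apply {
            addTarget(target)
            // Manual exposure keeps the EV ladder exact and the 3A state frozen.
            set(CaptureRequest.CONTROL_AE_MODE, CameraMetadata.CONTROL_AE_MODE_OFF)
            // Torch mode holds the flash constantly lit for every frame.
            set(CaptureRequest.FLASH_MODE, CameraMetadata.FLASH_MODE_TORCH)
            set(CaptureRequest.SENSOR_SENSITIVITY, baseIso)
            // Each EV step halves the light: t = t0 / 2^ev.
            set(CaptureRequest.SENSOR_EXPOSURE_TIME, baseExposureNs shr ev)
        }.build()
    }
    session.captureBurst(requests, null, handler)
}
```

Because the torch brightness and 3A state are frozen, the only parameter that changes between frames is the sensor exposure time, which is exactly the software-level exposure adjustment the text argues for.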
In some possible implementations of the first aspect, the electronic device controlling the flash to be in a normally-on state includes: the electronic device acquires the brightness of the photographing scene; according to a preset correspondence between brightness and working current, the electronic device determines the working current corresponding to the scene brightness; and the electronic device drives the flash into the normally-on state with that working current.
That is, when the flash is in the normally-on state, its working current is calculated from the ambient brightness. When the flash is driven with the calculated working current, the environment is neither too bright nor too dark, which better satisfies the requirement of acquiring the first image at normal exposure.
In a possible implementation, the flash may instead be driven at a fixed working current. This is simpler to implement, and since the first, second, and third images are still acquired under the same ambient brightness, image stability remains high.
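The patent discloses the existence of a preset brightness-to-current correspondence but not its values, so the breakpoints and milliamp figures below are illustrative placeholders; only the shape of the mapping, darker scene to higher drive current, follows the text.

```kotlin
// Hypothetical lux-to-torch-current table; the thresholds and currents are
// made-up placeholders, not values disclosed in the patent.
data class TorchStep(val maxLux: Float, val currentMa: Int)

private val torchTable = listOf(
    TorchStep(maxLux = 1f, currentMa = 160),   // near-dark: strongest torch
    TorchStep(maxLux = 10f, currentMa = 120),
    TorchStep(maxLux = 50f, currentMa = 80),
    TorchStep(maxLux = 200f, currentMa = 40)   // dim indoor: gentle fill
)

/** Picks a drive current so the torch neither washes out nor underfills the scene. */
fun torchCurrentFor(sceneLux: Float): Int =
    torchTable.firstOrNull { sceneLux <= it.maxLux }?.currentMa ?: 0  // bright enough: no torch
```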
In some possible implementations of the first aspect, the electronic device acquiring images in the normally-on state includes: the electronic device determines the first exposure value according to the brightness of the current scene; according to a preset exposure ratio, the electronic device determines the second exposure value and the third exposure value; and the electronic device acquires images at the determined first, second, and third exposure values respectively.
In other words, starting from the normal exposure of the first image, the exposure is reduced step by step to acquire the second and third images. In a possible implementation, if the first exposure value is EV0, the second exposure value may reduce the exposure by a factor of 4, i.e., the second exposure value may be EV2; the third exposure value may reduce the exposure by a factor of 64, i.e., EV6 (each EV step halves the exposure). In practice these ratios are not limiting: with the first exposure value at EV0, the second exposure value may be any of EV1 to EV4, and the third exposure value any of EV4 to EV10. With the exposure values set this way, objects of different brightness can be captured at different exposures and with different levels of detail.
In some possible implementations of the first aspect, the electronic device acquiring images at the determined first, second, and third exposure values includes: the electronic device determines the aperture size and/or exposure time corresponding to each of the first, second, and third exposure values; and the electronic device acquires each image with the determined aperture size and/or exposure time.
When acquiring images at the determined exposure values, the exposure parameters, including aperture and exposure time, are adjusted purely in software, which effectively realizes the acquisition of images at different exposure values. Compared with realizing different exposures by adjusting the flash brightness through its drive current, adjusting parameters in software effectively improves the stability of the photographing process.
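For reference, exposure value ties aperture N and exposure time t together as EV = log2(N^2 / t), so with the fixed aperture typical of phone cameras, stepping the EV up by n stops simply divides the exposure time by 2^n. A small sketch of that arithmetic; the f/1.8 aperture and the EV5 metering are assumed example values, not figures from the patent:

```kotlin
import kotlin.math.pow

// EV = log2(N^2 / t): with a fixed aperture N, raising EV by n stops
// divides the exposure time by 2^n.
fun exposureTimeSeconds(aperture: Double, ev: Double): Double =
    aperture * aperture / 2.0.pow(ev)

fun main() {
    val n = 1.8                                // typical fixed phone aperture (assumed)
    val t0 = exposureTimeSeconds(n, ev = 5.0)  // suppose AE metered the scene at EV5
    for (step in listOf(0, 2, 6)) {            // the EV0/EV2/EV6 ladder
        val t = t0 / 2.0.pow(step)
        println("EV+$step -> exposure time %.4f s (about 1/%.0f)".format(t, 1 / t))
    }
}
```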
In some possible implementations of the first aspect, the electronic device generating the image to be output from the first, second, and third images includes: the electronic device performs optimization and fusion on the first image in RAW format, the second image in RAW format, and the third image in RAW format to obtain a first output image in RAW format; and the electronic device performs a color space transformation on the first output image to obtain the image to be output.
Optimizing and fusing the acquired images in the RAW domain, rather than in the YUV domain or the RGB domain, yields a sharper fused image and helps improve the quality of the captured photo.
When performing the optimization in the RAW domain, the electronic device may obtain the first output image in RAW format by applying dead pixel correction, lens shading correction, black level correction, RAW-domain noise reduction, white balance gain, and image fusion to the first, second, and third images.
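A structural sketch of that RAW-domain chain follows, in the stage order the text lists. The stages are stubs standing in for vendor ISP operators, pixel values are assumed normalized to [0, 1], and the fusion weight is a generic well-exposedness weight (Mertens-style) that the patent does not specify:

```kotlin
// One Bayer frame: RAW samples (normalized to [0, 1]) plus its EV step.
class RawFrame(val pixels: FloatArray, val width: Int, val height: Int, val ev: Int)

// Stage stubs; real implementations live in the vendor ISP / NPU pipeline.
fun deadPixelCorrect(f: RawFrame): RawFrame = f   // replace defective sensels with a neighbor estimate
fun lensShadingCorrect(f: RawFrame): RawFrame = f // flatten radial vignetting gain
fun blackLevelCorrect(f: RawFrame): RawFrame = f  // subtract the sensor's dark offset
fun rawDenoise(f: RawFrame): RawFrame = f         // e.g. a U-Net style model on the Bayer plane
fun whiteBalanceGain(f: RawFrame): RawFrame = f   // per-channel R/G/B gains

// Exposure fusion: weight each frame by how well-exposed each pixel is, so
// highlights come from the short (EV6) frame and midtones from the EV0 frame.
fun fuse(frames: List<RawFrame>): RawFrame {
    val out = FloatArray(frames[0].pixels.size)
    for (i in out.indices) {
        var wSum = 0f; var acc = 0f
        for (f in frames) {
            val v = f.pixels[i]
            val gain = (1 shl f.ev).toFloat()          // bring the frame back to EV0 scale
            val w = 1f - kotlin.math.abs(2f * v - 1f)  // favor mid-gray samples
            acc += w * v * gain; wSum += w
        }
        out[i] = if (wSum > 0f) acc / wSum else frames[0].pixels[i]
    }
    return RawFrame(out, frames[0].width, frames[0].height, ev = 0)
}

fun processRawDomain(frames: List<RawFrame>): RawFrame =
    fuse(frames.map { whiteBalanceGain(rawDenoise(blackLevelCorrect(lensShadingCorrect(deadPixelCorrect(it))))) })
```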
During the RAW-domain noise reduction, the RAW image can be processed with a network model, giving the processed image better sharpness and a better signal-to-noise ratio. The network model may be a convolutional neural network model or a U-Net model.
In some possible implementations of the first aspect, the electronic device performing the color space conversion on the first output image to obtain the image to be output includes: the electronic device demosaics the first output image to obtain an RGB image; performs color correction and global color mapping on the RGB image; converts the result to another color space; and performs YUV-domain processing on the converted image to obtain the image to be output.
Because more image detail is preserved when processing in the RAW domain, converting the RAW-processed image to other color spaces yields high-definition images in those spaces, meeting users' needs for high-definition images in different color spaces.
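The patent does not name the conversion matrix; as one concrete possibility, the common BT.601 full-range RGB-to-YUV transform looks like this:

```kotlin
// BT.601 full-range RGB -> YUV, one common choice for the final conversion;
// the patent does not specify which matrix its YUV-domain processing assumes.
fun rgbToYuv(r: Float, g: Float, b: Float): Triple<Float, Float, Float> {
    val y = 0.299f * r + 0.587f * g + 0.114f * b
    val u = 0.5f * (b - y) / (1f - 0.114f)  // = -0.169R - 0.331G + 0.500B
    val v = 0.5f * (r - y) / (1f - 0.299f)  // =  0.500R - 0.419G - 0.081B
    return Triple(y, u, v)
}
```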
In some possible implementations of the first aspect, after the electronic device acquires the images in the normally-on state, the method further includes: the upper layer of the electronic device sends an instruction to the bottom layer to exit the flash's normally-on state; and when the upper layer receives a response confirming that the flash is off, the photographing function of the electronic device is restored to the available state.
Gating the availability of the photographing function in this way spares the user useless clicks while the function is unavailable. Once photographing completes and the flash-off response is received, the shutter control returns to the available state, so the user can conveniently continue taking other photos.
In some possible implementations of the first aspect, after the upper layer of the electronic device sends the flash-off instruction to the bottom layer, the method further includes: if the upper layer does not receive a flash-off response within a predetermined first duration, the HAL of the electronic device turns the flash off.
Having the hardware abstraction layer (HAL) forcibly turn off the flash avoids the situation where the bottom layer fails to turn the flash off in time because it cannot respond promptly to the upper-layer instruction. The predetermined first duration may be 200 ms to 5 s.
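A sketch of that timeout backstop: the upper layer posts a delayed force-off that is cancelled if the driver acknowledges in time. The callback names are illustrative stand-ins, not a real HAL interface, and 200 ms is just one point in the 200 ms to 5 s window the text allows.

```kotlin
import android.os.Handler
import android.os.Looper

// Illustrative torch-off watchdog; `sendTorchOff` / `forceTorchOffInHal`
// stand in for vendor HAL calls and are not a real Android interface.
class TorchOffWatchdog(
    private val sendTorchOff: () -> Unit,
    private val forceTorchOffInHal: () -> Unit,
    private val timeoutMs: Long = 200  // anywhere in the 200 ms .. 5 s window
) {
    private val handler = Handler(Looper.getMainLooper())
    private val forceOff = Runnable { forceTorchOffInHal() }

    fun requestTorchOff() {
        sendTorchOff()                            // normal path: ask the driver
        handler.postDelayed(forceOff, timeoutMs)  // backstop if no response arrives
    }

    fun onTorchOffAck() {
        handler.removeCallbacks(forceOff)  // driver answered in time, cancel the backstop
        // the photographing UI can be re-enabled here
    }
}
```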
In a second aspect, an embodiment of the application provides a photographing apparatus, including: a triggering unit, configured to trigger a photographing instruction and control the flash to be in a normally-on state; an image acquisition unit, configured to acquire images in the normally-on state, the acquired images including a first image acquired at a first exposure value, a second image acquired at a second exposure value, and a third image acquired at a third exposure value, where the first exposure value is smaller than the second exposure value, the second exposure value is smaller than the third exposure value, and the first exposure value is the standard exposure value of the current scene; and an image generation unit, configured to generate an image to be output from the first, second, and third images.
The photographing apparatus corresponds to the photographing method of the first aspect.
In a third aspect, an embodiment of the application provides an electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the method of the first aspect or any implementation thereof.
In a fourth aspect, an embodiment of the application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the method of the first aspect or any implementation thereof.
In a fifth aspect, an embodiment of the application provides a chip system including a processor coupled to a memory; the processor executes a computer program stored in the memory to implement the method of the first aspect or any implementation thereof. The chip system may be a single chip or a chip module composed of multiple chips.
In a sixth aspect, an embodiment of the application provides a computer program product which, when run on an electronic device, causes the electronic device to perform the method of the first aspect or any implementation thereof.
It will be appreciated that the advantages of the second to sixth aspects may be found in the relevant description of the first aspect, and are not described here again.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the application;
Fig. 2 is a software structure block diagram of an electronic device according to an embodiment of the application;
Fig. 3 is a schematic view of a photographing scene provided by the application;
Fig. 4 is a schematic diagram of a photographing method for improving image definition according to the application;
Fig. 5 is a schematic diagram of the implementation flow of the photographing method provided by the application;
Fig. 6 is a schematic flow chart of turning on the normally-on mode of the flash according to an embodiment of the application;
Fig. 7 is a schematic diagram of the normally-on state of the flash at different ambient brightnesses according to an embodiment of the application;
Fig. 8 is a schematic diagram of a photographing process according to an embodiment of the application;
Fig. 9 is a schematic view of a first image acquired at a first exposure value according to an embodiment of the application;
Fig. 10 is a schematic view of a second image acquired at a second exposure value according to an embodiment of the application;
Fig. 11 is a schematic view of a third image acquired at a third exposure value according to an embodiment of the application;
Fig. 12 is a schematic diagram of a RAW-domain image processing flow according to an embodiment of the application;
Fig. 13 is a schematic diagram of an image signal processing flow for a night scene image according to an embodiment of the application;
Fig. 14 is an image signal processing schematic diagram of a night scene image according to an embodiment of the application.
Detailed Description
In the following description, for purposes of explanation rather than limitation, specific details such as the particular system architecture and techniques are set forth in order to provide a thorough understanding of the embodiments of the application.
The following exemplarily describes aspects that may be relevant to the embodiments of the application.
(1) An electronic device. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation code and timing signals, completing the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated accesses and reduces the waiting time of the processor 110, improving system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, and so on through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 communicates with the touch sensor 180K through the I2C bus interface to implement the touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the photographing functions of electronic device 100. The processor 110 and the display 194 communicate via a DSI interface to implement the display functionality of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices, etc.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also employ different interfacing manners in the above embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs, so that it can play or record video in a variety of encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
The receiver 170B, also referred to as an "earpiece", is used to convert the audio electrical signal into a sound signal. When the electronic device 100 is used to answer a call or receive a voice message, voice can be heard by placing the receiver 170B close to the human ear.
The microphone 170C, also referred to as a "mike" or "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak close to the microphone 170C to input a sound signal into it. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In still other embodiments, the electronic device 100 may be provided with three, four, or more microphones 170C to implement sound signal collection, noise reduction, sound source identification, directional recording functions, and the like.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are various types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates carrying conductive material; when a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation with the pressure sensor 180A, and can also calculate the position of the touch from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is smaller than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
The gyro sensor 180B may be used to determine the motion posture of the electronic device 100. In some embodiments, the angular velocities of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined through the gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance the lens module needs to compensate according to the angle, and lets the lens counteract the shake of the electronic device 100 through reverse motion, realizing anti-shake. The gyro sensor 180B may also be used in navigation and somatosensory gaming scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of a flip holster. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover with the magnetic sensor 180D, and then set features such as automatic unlocking upon flip-open according to the detected open or closed state of the holster or the flip cover.
The acceleration sensor 180E may detect the magnitude of the acceleration of the electronic device 100 in various directions (typically along three axes), and may detect the magnitude and direction of gravity when the electronic device 100 is stationary. It may also be used to recognize the posture of the electronic device, and is applied in landscape/portrait switching, pedometers, and other applications.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, the electronic device 100 may range using the distance sensor 180F to achieve quick focus.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device 100 emits infrared light outward through the light-emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it may be determined that there is an object near the electronic device 100; when insufficient reflected light is detected, the electronic device 100 may determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
The fingerprint sensor 180H is used to collect fingerprints. The electronic device 100 may use the collected fingerprint characteristics to implement fingerprint unlocking, application lock access, fingerprint photographing, fingerprint-based call answering, and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 executes a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to avoid an abnormal shutdown caused by low temperature. In still other embodiments, when the temperature is below yet another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194; together, the touch sensor 180K and the display screen 194 form a touchscreen. The touch sensor 180K is used to detect touch operations on or near it, and may pass the detected touch operation to the application processor to determine the type of touch event. Visual output related to the touch operation may be provided through the display 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the electronic device 100 at a position different from that of the display 194.
The bone conduction sensor 180M may acquire vibration signals. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of the vibrating bone mass of the human voice. The bone conduction sensor 180M may also contact the human pulse and receive the blood-pressure beating signal. In some embodiments, the bone conduction sensor 180M may also be provided in a headset, combined into a bone conduction headset. The audio module 170 may parse out a voice signal from the vibration signal of the vibrating bone mass obtained by the bone conduction sensor 180M, realizing a voice function. The application processor may parse heart rate information from the blood-pressure beating signal obtained by the bone conduction sensor 180M, realizing a heart rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The SIM card interface 195 is used to connect a SIM card. A SIM card may be inserted into or removed from the SIM card interface 195 to achieve contact with or separation from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano-SIM cards, Micro-SIM cards, and the like. Multiple cards may be inserted into the same SIM card interface 195 simultaneously; the types of the cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, and with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from it.
After describing the hardware architecture of the electronic device 100, the following exemplarily describes the software architecture of the electronic device 100.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiment of the application, taking an Android system with a layered architecture as an example, a software structure of the electronic device 100 is illustrated.
Fig. 2 is a block diagram of a software architecture of the electronic device 100 according to an embodiment of the present application.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
The application layer may include a series of application packages.
As shown in fig. 2, the application package may include applications for cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used to manage window programs. It can acquire the display screen size, judge whether there is a status bar, lock the screen, take screenshots, and the like. The content provider is used to store and retrieve data and make such data accessible to applications; such data may include video, images, audio, and the like.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100, such as management of call states (connected, hung up, etc.). The resource manager provides various resources for applications, such as localized strings, icons, pictures, layout files, and video files. The notification manager allows an application to display notification information in the status bar; it can convey notification-type messages that disappear automatically after a short stay without user interaction, for example notifications of completed downloads or message alerts. The notification manager may also present notifications in the system top status bar as a chart or scrolling text (such as a notification of an application running in the background), or on the screen as a dialog window. For example, a text message may be prompted in the status bar, a prompt tone emitted, the electronic device vibrated, or an indicator light blinked.
The Android Runtime includes a core library and virtual machines. The Android Runtime is responsible for the scheduling and management of the Android system.
The core library consists of two parts: one part comprises the function interfaces that the Java language needs to call, and the other part is the Android core library.
The application layer and the application framework layer run in virtual machines. A virtual machine converts the Java files of the application layer and the application framework layer into binary files and executes them. The virtual machine performs functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files, etc. The media libraries may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphics processing library is used to implement three-dimensional graphics drawing, image rendering, compositing, layer processing, and the like. The 2D graphics engine is a drawing engine for 2D drawing. The kernel layer is the layer between hardware and software; it contains at least a display driver, a camera driver, an audio driver, and a sensor driver.
The workflow of the electronic device 100 software and hardware is illustrated below in connection with a photo scene.
In the photographing scene, the user's finger or stylus contacts the display screen 105 of the electronic device 100, the touch sensor 107 arranged on the display screen 105 receives the touch operation, and a corresponding hardware interrupt is sent to the kernel layer. The kernel layer processes the touch operation into touch events that the upper layers can read (e.g., action down, action move, and action up events). The action down event indicates that a finger or stylus makes initial contact with the touch screen, for example, the user touches the stylus down in a certain area of the touch screen; the action move event indicates that the finger or stylus slides across the touch screen, e.g., the stylus slides after touching down; the action up event indicates that the finger or stylus leaves the touch screen, e.g., after touching down and sliding some distance, the user lifts the stylus off the touch screen.
Touch events are stored at the kernel layer. The application framework layer acquires the touch event from the kernel layer and identifies the control corresponding to the touch event and the touch operation corresponding to the event; for example, touch operations include single click, double click, slide, etc. Taking the touch operation as a click operation whose corresponding control is the shooting control of the video application as an example: the video application calls the interface of the application framework layer and then, by calling the corresponding callback function, realizes the application function corresponding to the touch operation.
(2) Photographing scene. Having described the hardware architecture and the software architecture of the electronic device 100 provided by the embodiments of the present application, the following exemplarily describes a photographing scenario according to the embodiments of the present application.
Fig. 3 is a schematic view of a photographing scene. The photographing scene may be an indoor scene, a night scene, or another setting with low ambient brightness. To increase the clarity of the picture taken, electronic devices (such as the cell phone in Fig. 3) typically raise the scene brightness with supplemental light. In such a low-brightness scene, the electronic device enters the photographing interface after the user opens the camera application. The photographing interface comprises a preview area and a key area. The preview area displays a preview image at the device's current shooting angle. The right part of the preview area includes a zoom button, i.e., the circular button at the right side of the preview area; when this button receives an up-or-down drag, the magnification of the shot image is adjusted accordingly. Below the preview area is the key area, which comprises a shooting-mode selection key, a shooting key, an album key, and a lens-switching key; the corresponding shooting mode can be entered through the mode key. Above the preview area are shooting-parameter keys and function-selection keys. For example, the shooting-parameter keys may include a flash-mode key, a parameter-setting key, and the like; the function-selection keys may include a smart-photo key, a color-style key, a photo-shopping key, and the like. As shown in the left diagram of Fig. 3, when the user selects the auto flash mode, the electronic device determines whether the flash needs to be turned on according to the brightness of the current scene.
In the auto flash mode, when determining whether the flash needs to be turned on, the acquired ambient brightness may be compared with a preset brightness threshold. If the current ambient brightness is greater than or equal to the preset brightness threshold, as shown in the lower diagram on the right side of Fig. 3, the flash is not triggered when photographing. If the current ambient brightness is less than the preset brightness threshold, as shown in the upper diagram on the right side of Fig. 3, the flash is triggered when photographing. Once it is determined that the flash will be triggered, if the electronic device receives the user's photographing instruction, it controls the flash to emit an intense flash of light, during which the electronic device can acquire an image whose ambient brightness has been raised by the intense flash.
Fig. 4 shows an implementation of the present method for improving image sharpness based on images captured in the intense-flash and pre-flash states. As shown in Fig. 4, the electronic device is in a low-brightness scene and its flash operates in the auto flash mode. The electronic device compares the collected ambient brightness with the preset brightness threshold, determines that it is smaller than the threshold, and thus will trigger the flash when photographing.
When photographing, the image acquisition process can be divided into a pre-flash state and an intense-flash state. The pre-flash state is the period after the electronic device receives the photographing instruction and before the flash fires; the intense-flash state is the period during which the flash fires. In each state the electronic device may capture one or more images; if multiple images are captured in the pre-flash state or the intense-flash state, the multiple images of that state can be fused into one image. Since the electronic device does not raise the scene brightness in the pre-flash state, the brightness of the image acquired there is low. In the intense-flash state the flash raises the scene brightness, so the acquired image is brighter than the pre-flash image. Images of different brightness capture different detail information: a low-brightness image can clearly show brighter objects, while a high-brightness image can clearly show darker objects. As shown in the third row of images in Fig. 4, fusing the image generated in the pre-flash state with the image generated in the intense-flash state yields an image of higher definition.
Although acquisition and fusion across the intense-flash and pre-flash states can effectively improve image quality, the large brightness difference between the two states makes the acquired images differ greatly in brightness and definition, which hinders accurate segmentation of objects (such as people or things) in the images. Moreover, there is a long time interval between the moment the pre-flash image is captured and the moment the intense-flash image is captured; picture shake during this interval can affect the stability of auto focus, auto white balance, and auto exposure, making it hard to register and fuse the images accurately and efficiently, and thus degrading the shooting quality of the images.
Based on the above, the embodiment of the application provides a photographing method. The method adjusts the control of the flash so that, during photographing, the pre-flash and intense-flash states are replaced by a normally-on state; the electronic device can then acquire images with consistent scene brightness, which helps segment the people or objects in the scene. By adjusting the exposure value of the acquired images, a first image is acquired at a standard exposure value (i.e., a first exposure value), a second image at a second exposure value, and a third image at a third exposure value, where the third exposure value is greater than the second exposure value and the second exposure value is greater than the first, so image brightness decreases from the first image to the third. Because the time interval between images acquired by adjusting the exposure value is shorter than the interval between the pre-flash and intense-flash captures, image shake is effectively reduced, the stability of auto focus, auto white balance, and auto exposure is improved, the registration and fusion operations are easier, and the photographing quality of the images is improved.
Fig. 5 is a schematic implementation flow chart of a photographing method according to an embodiment of the present application, which is described in detail below:
in S501, the electronic device triggers a photographing instruction to control the flash to be in a normally-on state.
The electronic equipment in the embodiment of the application comprises a smart phone, a tablet personal computer, a notebook computer or other electronic equipment with a camera.
The photographing instruction triggered by the electronic device may be a key instruction, a touch instruction, a voice-control instruction, or an instruction triggered by picture content. A key instruction may be one triggered by a preset photographing function key while the camera application in the electronic device is running; for example, the user presses the volume-up or volume-down key to trigger the camera application's photographing instruction. Alternatively, when the electronic device is connected to a photographing accessory, the key instruction may be sent by the accessory's photographing key; for example, a selfie stick can transmit its key instruction to the electronic device over a wired or wireless connection, causing the electronic device to trigger the photographing instruction. A key instruction can also quickly start the camera and trigger the photographing instruction via a shortcut key while the camera application is not running, so as to capture a scene quickly.
The touch instruction may be an instruction generated when a photographing key is triggered in a touch screen of the electronic device. For example, in the photographing interface of the electronic device shown in the left diagram of fig. 3, three touch buttons are included below the preview area, and when a touch instruction of a user within the key range is received, the photographing instruction is triggered.
A voice-control instruction may be triggered when the voice detection system detects the user's voice content while the camera application of the electronic device is running: if the detected voice content includes a preset photographing keyword, the photographing instruction is triggered. Alternatively, the user's voice may be detected in real time by the voice detection system; if the detected content matches the preset photographing keyword, the camera application is started and the photographing instruction is triggered.
When the photographing instruction is triggered according to picture content, the feature content that triggers it can be selected as required. For example, when the preview screen includes a smiling-face feature, a photographing instruction is automatically triggered. Other features may also be used, including but not limited to specific faces, vehicles, etc.
After triggering a photographing instruction, the electronic equipment controls the flash lamp to be in a normally-on state.
A normally-on state means that the flash remains continuously lit during the image acquisition process. When image acquisition is completed, the flash may be turned off. The brightness of the flash in the normally-on state may be fixed, or it may be determined according to the ambient brightness.
Fig. 6 is a schematic implementation flow chart of triggering a photographing instruction by an electronic device to control a flash lamp to be in a normally-on state according to an embodiment of the present application. As shown in fig. 6, the implementation flow of triggering a photographing instruction by the electronic device to control the flash to be in a normally-on state includes:
in S601, the electronic device determines that the flash is required to work for photographing according to a preset flash mode.
The preset flash mode may be an automatic flash mode or a flash-on mode.
Each time it photographs, the electronic device acquires the ambient brightness, i.e., the brightness of its shooting scene, through the bottom-layer hardware and compares the acquired brightness with a preset brightness threshold. If the acquired brightness is lower than the preset threshold, the comparison result is reported to the upper layer, and the upper layer decides from the report to trigger the flash when shooting. If the acquired brightness is greater than or equal to the preset threshold, the flash is not triggered when shooting.
When the preset mode is the flash-on mode, the electronic device does not need to compare the ambient brightness with the preset threshold each time it photographs; the upper layer always decides to trigger the flash when shooting.
In S602, the electronic device triggers a photographing instruction.
When the electronic device triggers a photographing instruction, a camera application program of the electronic device can be in an operating state or a non-operating state. If the camera application program in the electronic equipment is in a non-running state, the camera application program can be started through the photographing instruction.
The photographing instruction may be an instruction triggered according to a preset key operation, touch operation, sound or picture content.
In S603, the electronic device determines an operating current of the flash according to the ambient brightness.
S601, S602, and S603 need not be executed strictly in the order of their sequence numbers. For example, the photographing instruction may be triggered first, and then the need for the flash and its working current determined; or the need for the flash and its working current may be determined first, and the photographing instruction triggered afterwards.
When the electronic device determines the working current of the flash according to the ambient brightness, a correspondence between brightness and working current can be preset. When the ambient brightness is detected, the working current corresponding to it is looked up according to the preset correspondence.
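A minimal Python sketch of such a preset brightness-to-current correspondence; the table entries and the linear interpolation between them are illustrative assumptions, since the text specifies only that a correspondence is preset:

```python
# Preset correspondence between ambient brightness and flash working current.
# The (lux, mA) pairs below are illustrative assumptions, not patent values.
BRIGHTNESS_TO_CURRENT_MA = [
    (1.0, 900.0),
    (10.0, 600.0),
    (50.0, 300.0),
    (100.0, 150.0),
]

def flash_current_ma(ambient_lux: float) -> float:
    """Return the flash working current for the measured ambient brightness."""
    pts = BRIGHTNESS_TO_CURRENT_MA
    if ambient_lux <= pts[0][0]:
        return pts[0][1]
    if ambient_lux >= pts[-1][0]:
        return pts[-1][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= ambient_lux <= x1:
            # Interpolate between table entries: darker scenes
            # get a larger driving current, brighter scenes a smaller one.
            return y0 + (y1 - y0) * (ambient_lux - x0) / (x1 - x0)
    return pts[-1][1]
```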
Compared with using a fixed working current, determining the working current from the ambient brightness can adapt to the light-filling requirements of different brightness levels, keeping the adjusted ambient brightness within a stable range. When a fixed working current drives the flash for light filling, a fixed brightness increment is added on top of the ambient brightness, so the adjusted brightness may deviate considerably.
In S604, the flash lamp is turned on according to the operation current, and a normally-on state is maintained for a predetermined period of time.
The determined working current is larger when the ambient brightness is lower, and smaller when the ambient brightness is higher. For example, in the flash-brightness schematic of Fig. 7, the ambient brightness in the left graph is higher than in the middle and right graphs, so the determined working current is smaller and the flash brightness it drives is lower; the ambient brightness in the right graph is lower than in the middle and left graphs, so the determined working current is larger and the flash brightness is higher.
When the flash is lit at the determined working current, the brightness corresponding to that current is kept unchanged for a preset duration, i.e., the flash is kept in the normally-on state. In this state the method can proceed to S502 and acquire images of the scene, obtaining a plurality of images with consistent scene brightness.
After image acquisition is completed, the upper layer of the electronic device sends a flash-off instruction for the normally-on state to the bottom layer, so that the bottom layer turns off the flash according to the instruction and feeds back response information. If the upper layer receives the response information fed back by the bottom layer (for example, status-bit information of a register) and determines that the flash is now off, the photographing function of the camera application can be updated from the unavailable state to the available state. In this case, the photographing function remains unavailable during the period from when the electronic device triggers the photographing instruction to when the upper layer receives the response to the flash-off instruction.
Alternatively, if the upper layer of the electronic device fails to receive the response from the bottom layer within a preset time period (which may be set to any value from 300 milliseconds to 5 seconds), the flash can be turned off through the HAL (hardware abstraction layer), so that the flash exits the normally-on state. When the flash is turned off by the HAL, the photographing function of the camera application is updated from the unavailable state to the available state. In this case, the photographing function is unavailable from when the electronic device triggers the photographing instruction to when the HAL of the electronic device turns off the flash.
While the photographing function of the electronic device is unavailable, trigger instructions receive no response, or a no-response prompt message, such as a prompt window, may be generated.
In S502, the electronic device acquires an image in the normally-on state, the acquired image including a first image acquired at a first exposure value, a second image acquired at a second exposure value, and a third image acquired at a third exposure value.
The exposure values in the embodiment of the application include a first exposure value, a second exposure value, and a third exposure value. The exposure value is a base-2 logarithmic scale; its calculation formula can be expressed as EV = log₂(N²/t), where N represents the aperture (f-number), t represents the exposure time (in seconds), and EV represents the exposure value. Since the exposure value is calculated from the exposure time and the aperture, different combinations of aperture and exposure time can be selected for the same exposure value.
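A short Python sketch of this formula, showing that different aperture/exposure-time pairs can yield the same EV:

```python
import math

def exposure_value(aperture_n: float, exposure_time_s: float) -> float:
    """EV = log2(N^2 / t): base-2 log of aperture squared over exposure time."""
    return math.log2(aperture_n ** 2 / exposure_time_s)

def exposure_time_for_ev(ev: float, aperture_n: float) -> float:
    """Solve t = N^2 / 2^EV for the exposure time at a fixed aperture."""
    return aperture_n ** 2 / (2.0 ** ev)

# Different aperture/exposure-time combinations can give the same EV:
assert abs(exposure_value(1.0, 1.0)) < 1e-9           # f/1,   1 s     -> EV0
assert abs(exposure_value(2.0, 4.0)) < 1e-9           # f/2,   4 s     -> EV0
assert abs(exposure_value(2.8, 1 / 100) - 9.61) < 0.01  # f/2.8, 1/100 s -> ~EV9.6
```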
The first exposure value is a standard exposure value of the current scene. The standard exposure value is an exposure value set for normally exposing an image. Therefore, the set standard exposure value is also different for environments of different brightness. In the embodiment of the application, the brightness in the environment is kept in a certain range by adjusting different working currents. Therefore, when the flash lamp is turned on by the working current so that the brightness of the scene is kept consistent, the standard exposure value in the embodiment of the application can be the same exposure value.
In a possible implementation manner, a corresponding relation table of the standard exposure value and the ambient brightness or the ambient illuminance, or a corresponding relation table of the brightness of the area where the photographed image is located and the standard exposure value may be established in advance. And determining the standard exposure value according to the ambient brightness, the ambient illuminance or the brightness of the area where the shot image is located, which are detected by the electronic equipment.
On the basis of determining the standard exposure value, namely the first exposure value, the application is also provided with a second exposure value and a third exposure value. The second exposure value is larger than the first exposure value, and the third exposure value is larger than the second exposure value. As is known from the definition of the exposure value, the larger the exposure value is, the smaller the brightness of the image is. As shown in fig. 8, the brightness of the second image 82 acquired by the second exposure value is smaller than the brightness of the first image 81 acquired by the first exposure value. The brightness of the third image 83 acquired by the third exposure value is smaller than the brightness of the second image 82 acquired by the second exposure value.
In the embodiment of the present application, the first exposure value may be expressed as EV0, the second exposure value may be selected from the range EV1 to EV4, and the third exposure value from EV4 to EV8, with the second exposure value smaller than the third exposure value. For example, suppose the first exposure value is EV0 and the second exposure value is EV3. By the definition of the exposure value, the exposure parameter value N₂²/t₂ of the second exposure value is 2³ times the exposure parameter value N₁²/t₁ of the first exposure value, i.e., the exposure amount of the first exposure value is 8 times that of the second exposure value. When the first exposure value is EV0 and the third exposure value is EV6, the exposure parameter value N₃²/t₃ of the third exposure value is 2⁶ times the exposure parameter value N₁²/t₁ of the first exposure value, i.e., the exposure amount of the first exposure value is 64 times that of the third exposure value. Here N₁, N₂, N₃ respectively denote the aperture sizes at the first, second, and third exposure values, and t₁, t₂, t₃ respectively denote the exposure times of the first, second, and third exposure values.
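Continuing the sketch above (reusing exposure_time_for_ev), the stated 8x and 64x exposure ratios can be checked numerically:

```python
# Raising the EV by k stops multiplies N^2/t by 2^k, which divides the
# exposure amount (proportional to t/N^2) by 2^k.
N = 2.0                             # same aperture assumed for all three frames
t0 = exposure_time_for_ev(0.0, N)   # first exposure value, EV0
t3 = exposure_time_for_ev(3.0, N)   # second exposure value, EV3
t6 = exposure_time_for_ev(6.0, N)   # third exposure value, EV6

assert abs(t0 / t3 - 8.0) < 1e-9    # EV0 exposes 8x more than EV3
assert abs(t0 / t6 - 64.0) < 1e-9   # EV0 exposes 64x more than EV6
```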
The electronic device controls the flash lamp to keep a constant-brightness state when the first image, the second image and the third image are acquired. In the image acquisition process of the electronic equipment, the environment brightness information is basically consistent. Therefore, when the first image, the second image and the third image with the consistent ambient brightness information are acquired, stable camera parameters including an automatic exposure parameter, an automatic white balance parameter and an automatic focusing parameter can be adopted for image acquisition, and frequent image acquisition parameter change is not needed, so that the reliability of image acquisition is higher.
With the flash held in a constant normally-on state, the first, second, and third images with different exposure values are obtained by adjusting the exposure value at the software level. Compared with adjusting the driving current, adjusting the exposure value in software is more convenient and easier to implement, and reduces the photographing stability problems caused by driving-current adjustment.
The embodiment of the application generates the HDR image from a fixed set of frames, namely the first image, the second image, and the third image. Existing HDR generation methods must judge the brightness of each frame from ambient brightness information, which makes the decision calculation cumbersome; fixing the frames therefore effectively improves decision efficiency and completes the photographing process more efficiently.
The first image in the embodiment of the application is a normally exposed image, so it contains more image detail information. To improve the quality of the photographed picture, the first image may comprise a plurality of frames. Denoising and image-quality optimization are applied to these frames, making it convenient to obtain a clearer fused image.
In S503, the electronic device generates an image to be output from the first image, the second image, and the third image.
As shown in Fig. 8, the embodiment of the present application acquires the first image 81 at the first exposure value. Since the first exposure value is the standard exposure value, the sharpness of most objects or figures in the first image 81 is higher than in the second image 82 and the third image 83. As shown in the first image at the first exposure value in Fig. 9, the portrait information is clearer in the first image, but the solid-color area above the portrait is blurred due to overexposure.
In a possible implementation, the first image may comprise a plurality of frames, for example 5-8 frames. A plurality of images are acquired at the first exposure value and fused, with denoising, into a standard-exposure image of higher definition.
A second image 82 is acquired at the second exposure value. Since the second exposure value is larger than the first exposure value, the sharpness of the portrait is lower in the second image, as shown in Fig. 10, but the sharpness of the solid-color region above the portrait is improved.
The exposure amount of the third exposure value is much smaller than that of the first exposure value, so in the third image 83 acquired at the third exposure value, as shown in Fig. 11, the solid-color region is captured clearly, but the portrait content is very blurred.
The region that is clear in the third image is very blurred in the first image, and vice versa, so the first image and the third image cannot be accurately registered directly. The portrait area in the second image is clearer than that of the third image, and the solid-color area in the second image is clearer than that of the first image. Thus, the first image may be registered with the second image according to the portrait region, and the second image registered with the third image according to the solid-color region. From the registration result of the first and second images and the registration result of the second and third images, the registration result of the first and third images can be obtained. From the registered first and third images, an image 84 to be output with better sharpness can be generated.
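A possible registration chain of this kind, sketched with OpenCV feature matching; the mask names and the use of ORB features with RANSAC are illustrative assumptions, not the patent's prescribed algorithm:

```python
import cv2
import numpy as np

def estimate_homography(src_gray, dst_gray, region_mask=None):
    """Estimate a src->dst homography from ORB matches inside region_mask."""
    orb = cv2.ORB_create(nfeatures=2000)
    k1, d1 = orb.detectAndCompute(src_gray, region_mask)
    k2, d2 = orb.detectAndCompute(dst_gray, region_mask)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)[:200]
    src_pts = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst_pts = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    h, _ = cv2.findHomography(src_pts, dst_pts, cv2.RANSAC, 3.0)
    return h

# img1/img2/img3, their grayscale versions, and the uint8 region masks are
# assumed to have been loaded and computed elsewhere.
h12 = estimate_homography(img1_gray, img2_gray, portrait_mask)  # image 1 -> 2
h23 = estimate_homography(img2_gray, img3_gray, solid_mask)     # image 2 -> 3
h13 = h23 @ h12  # chaining the two mappings registers image 1 to image 3
img1_in_3 = cv2.warpPerspective(img1, h13, (img3.shape[1], img3.shape[0]))
```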
In the embodiment of the present application, the portrait area and the solid-color area are merely examples, and the area for image registration is not limited to the portrait area and the solid-color area. In a possible implementation, the portrait area may be other normally exposed image areas. The solid color region may be replaced with a region of high brightness in the first image, such as a light source region or the like.
In addition, in the embodiment of the present application, an HDR image of higher definition can be synthesized based on the first, second, and third images. Compared with current HDR synthesis methods, the decision calculation needed to choose the photos for synthesizing the HDR image can be reduced. For example, current HDR synthesis first determines, from scene brightness information, whether overexposed and underexposed images need to be acquired, and then determines the images to acquire from the calculation result. The present application directly acquires the required first, second, and third images, obtains a clear HDR image, effectively reduces decision calculation, and improves image generation efficiency.
Fig. 13 is a schematic diagram of a process flow of capturing an image and processing the image in a RAW domain according to an embodiment of the present application. As shown in fig. 13, the photographing flow includes:
1301, a flash normally on mode is triggered.
In response to a preset trigger instruction, the camera application can invoke the photographing function or the video shooting function, and the flash is kept in a normally-on state while that function runs.
The photographing method in the embodiment of the application can be mainly used for night scene image photographing, indoor image photographing or other low-brightness scenes. Therefore, when the user selects the night scene shooting mode or the electronic equipment detects that the brightness of the current scene is low, the electronic equipment automatically switches to the night scene mode, and the flash lamp normally-on mode is triggered.
In the normally-on mode, the electronic device determines an operating current of the flash according to the brightness of the environment, such that the operating current is associated with the brightness of the scene. When the brightness of the scene is higher, the working current is smaller, and if the brightness of the scene is lower, the working current is larger. By adopting the working current matched with the environment or scene brightness, the environment brightness of the flash lamp in a normally-on state is kept stable.
1302, a plurality of frame RAW images of different EVs are acquired.
When the flash lamp is in a normally-on state, the brightness of the environment is kept stable. The image acquisition is carried out under the condition that the ambient brightness is kept stable, so that parameters of the acquired image, such as an automatic exposure parameter, an automatic white balance parameter and an automatic focusing parameter, are more stable, and an image with better image quality can be obtained.
Also, based on the same driving current, the exposure value is adjusted to obtain images of different exposure levels. Only the exposure parameters need to be modified at the software level; adjusting the driving current is avoided, which improves operational convenience and reduces the stability problems caused by current adjustment.
The HDR image is generated from a fixed set of frames, namely the first, second, and third images. Existing HDR generation methods must judge the brightness of each frame from ambient brightness information, which makes the decision calculation cumbersome; fixing the frames therefore effectively improves decision efficiency and completes the photographing process more efficiently.
1303, image registration.
Among the first, second, and third images acquired in the embodiment of the application, the first image is a normally exposed image and therefore contains the most detail. Two or more frames can be collected as the first image, and a clearer photo is conveniently obtained through denoising and fusion.
While more detail content remains in the first image, the highlight region in the scene is blurred in the first image.
The third image has the highest exposure value, i.e., the lowest exposure amount. In the third image, the highlight region of the scene can be displayed more clearly, but the low-brightness region of the scene is blurred.
The overexposed region of the scene is clearer in the second image than in the first, and the low-brightness region is clearer in the second image than in the third. Thus, the first image may be registered with the second image, and the second image with the third. From the registration result of the first and second images and the registration result of the second and third images, the registration of the first and third images is realized. In this way, the clear rendering of the scene's low-brightness region can be accurately matched with the clear rendering of its high-brightness region.
1304, lens shading correction/black level correction.
Lens shading correction (LSC) addresses the shadow that appears around the lens due to uneven optical refraction. Lens shading correction methods may include the concentric-circle method, the grid method, and the like.
The black level, i.e., the lowest level value of black data, generally refers to the sensor signal level corresponding to photosensitive image data of 0. Current black level correction (BLC) schemes include subtracting a fixed value and correcting according to a drift curve of the black level with temperature and gain.
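A minimal sketch of the fixed-value variant of black level correction; the 10-bit black and white levels below are assumed values:

```python
import numpy as np

def black_level_correct(raw: np.ndarray, black_level: float = 64.0,
                        white_level: float = 1023.0) -> np.ndarray:
    """Subtract a fixed black level and renormalize to [0, 1].

    The 64/1023 levels are illustrative 10-bit assumptions; real values are
    sensor-specific (and may drift with temperature and gain).
    """
    corrected = np.clip(raw.astype(np.float32) - black_level, 0.0, None)
    return corrected / (white_level - black_level)
```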
1305, network model processing.
In the embodiment of the application, the network model may be a convolutional neural network model or a U-Net model. Before the network model is used for optimization, training images are fed into it in advance, and the changes in definition, signal-to-noise ratio, sharpness, and other parameters between the network's output and the input are evaluated. Model parameters are adjusted continuously until the output image has higher definition, higher signal-to-noise ratio, and better sharpness than the input. After this parameter-optimization training is completed, the first, second, and third images can be optimized by the trained network model to obtain output images with higher definition, higher signal-to-noise ratio, and better sharpness.
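As a loose illustration only, a tiny residual convolutional denoiser in PyTorch; the layer layout and the 4-channel packed-RAW input are assumptions, not the patent's network design:

```python
import torch
import torch.nn as nn

class TinyDenoiser(nn.Module):
    """A minimal stand-in for a learned RAW-domain denoiser (assumed design)."""

    def __init__(self, ch: int = 32):
        super().__init__()
        # 4 input/output channels assume an RGGB Bayer frame packed into planes.
        self.net = nn.Sequential(
            nn.Conv2d(4, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, 4, 3, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Predict the noise residual and subtract it from the noisy input.
        return x - self.net(x)
```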
1306, automatic white balance.
In order to adjust the color of an object to the color the human eye would perceive under different illumination, the image needs white balance processing. Automatic white balance methods generally include the gray-world method, the white-patch method, the light-source parameter voting method, the color-level correction method, and the like.
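A minimal sketch of the gray-world method mentioned above, which scales each channel so that the channel means become equal:

```python
import numpy as np

def gray_world_gains(rgb: np.ndarray) -> np.ndarray:
    """Gray-world AWB: per-channel gains that equalize the R/G/B means."""
    means = rgb.reshape(-1, 3).mean(axis=0)
    return means.mean() / means

def apply_awb(rgb: np.ndarray) -> np.ndarray:
    """Apply the gains to an HxWx3 image normalized to [0, 1]."""
    return np.clip(rgb * gray_world_gains(rgb), 0.0, 1.0)
```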
1307, image fusion.
After the acquired first, second, and third images undergo optimization such as noise reduction, the optimized images are fused directly in the RAW domain to obtain a fused image. Since RAW retains a great amount of image detail, an image optimized and fused in the RAW domain has richer detail content than one optimized and fused in YUV, making it convenient to obtain an image of higher quality.
1308, nonlinear superposition & ghost correction.
In the embodiment of the application, nonlinear superposition processing can be performed on the fused image: the fused image is an HDR image, and an LDR image is generated from it by nonlinear mapping and superposition, so that an LDR screen can effectively display the generated image.
Due to registration problems, or when an object moves between frames, the fused image may contain ghosting. Ghosting is a duplicate of the same object appearing in an image, and may also be called an artifact. Ghost regions in the fused image can be located by ghost detection, and corrected by replacing the ghost region of the fused image with the corresponding region of one of the pre-fusion images. Ghost correction is not limited to this; it may also be achieved by improving registration accuracy, etc.
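One simple way to realize the replacement-based correction described here, sketched under the assumption that a registered, brightness-matched reference frame is available; the threshold is an illustrative assumption:

```python
import numpy as np

def deghost(fused: np.ndarray, reference: np.ndarray,
            threshold: float = 0.10) -> np.ndarray:
    """Replace likely-ghost pixels in the fused image with the reference frame.

    A pixel is flagged as ghost where the fused result deviates strongly from
    the registered, brightness-matched reference (both HxWx3 in [0, 1]).
    """
    diff = np.abs(fused - reference).mean(axis=-1, keepdims=True)
    ghost_mask = diff > threshold
    return np.where(ghost_mask, reference, fused)
```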
1309, RAW image output.
Through the above processing, compared with optimizing and fusing images in the YUV domain, this image processing flow performs optimization and fusion in the detail-rich RAW domain, obtaining a RAW output image that is clearer and richer in detail.
Fig. 14 is an image signal processing schematic diagram of a night scene image according to an embodiment of the present application, as shown in fig. 14, the process includes:
1401, the sensor acquires a plurality of frames of RAW images.
When the sensor collects multiple frames of RAW images, based on S502 in Fig. 5, multiple frames including a first image, a second image, and a third image may be collected with the flash in the normally-on state. The exposure values of the first, second, and third images can be made different through parameter control at the software level. The exposure value of the first image is the standard exposure value, so the first image is a standard-exposure image. The second exposure value of the second image is greater than the first exposure value of the first image, and the third exposure value of the third image is greater than the second exposure value of the second image.
1402, dead pixel and PD point removal.
Due to the CMOS process and cost constraints, a sensor generally leaves the factory with some dead pixels (for example, on the order of 200, non-adjacent), which need to be removed during ISP processing.
The dead pixel removing method can comprise a mean value method, a linkage method or a debugging method and the like.
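A simple median-based variant of dead pixel removal, sketched on a single Bayer color plane; the 3x3 window and the threshold are illustrative assumptions:

```python
import numpy as np
from scipy.ndimage import median_filter

def remove_dead_pixels(plane: np.ndarray, threshold: float = 0.25) -> np.ndarray:
    """Replace isolated outlier pixels with their 3x3 neighborhood median.

    `plane` is one Bayer color plane (same-channel samples) normalized to
    [0, 1], so the median is taken over like-colored neighbors.
    """
    med = median_filter(plane, size=3)
    dead = np.abs(plane - med) > threshold
    return np.where(dead, med, plane)
```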
1403, lens shading correction.
Lens shading correction (LSC) addresses the shadow that appears around the lens due to uneven optical refraction. Lens shading correction methods may include the concentric-circle method, the grid method, and the like.
1404, raw domain night scene processing.
The RAW domain night scene processing method may include steps 1303, 1304, 1305, 1306, 1307, 1308 and the like shown in fig. 13.
1405, demosaicing/RAW2RGB.
The demosaicing process converts a Bayer-array (RAW format) image into an RGB image. It is a digital image process that reconstructs a full-color image from the incomplete color samples output by an image sensor covered with a color filter array (CFA), and is also known as CFA interpolation or color reconstruction. The reconstructed image is usually accurate in uniformly colored regions but suffers some loss of resolution (detail and sharpness) and edge artifacts.
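A minimal demosaicing sketch using OpenCV's built-in CFA interpolation; the BGGR pattern is an assumption, as the text does not fix one:

```python
import cv2
import numpy as np

def demosaic_bggr(raw: np.ndarray) -> np.ndarray:
    """Demosaic a single-channel Bayer frame to RGB (BGGR pattern assumed).

    OpenCV expects an 8- or 16-bit single-channel Bayer image as input.
    """
    return cv2.cvtColor(raw, cv2.COLOR_BayerBG2RGB)
```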
1406, color correction.
Since the color-matching characteristics of cameras do not typically satisfy the Luther condition, i.e., the RGB responses of the sensor are generally not linearly related to the color-matching functions of the CIE (International Commission on Illumination) standard observer, the camera characteristics need to be corrected, through color correction, to approximate the standard observer. Color correction of an image is typically accomplished using a color correction matrix.
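A sketch of applying a color correction matrix; the CCM coefficients below are illustrative (each row sums to 1 so that gray is preserved) and would in practice come from per-sensor calibration:

```python
import numpy as np

# Illustrative 3x3 color correction matrix; real CCMs are calibrated per
# sensor and illuminant, so these coefficients are assumptions.
CCM = np.array([[ 1.60, -0.45, -0.15],
                [-0.30,  1.50, -0.20],
                [-0.05, -0.55,  1.60]], dtype=np.float32)

def color_correct(rgb: np.ndarray) -> np.ndarray:
    """Apply the CCM to an HxWx3 linear-RGB image in [0, 1]."""
    return np.clip(rgb.reshape(-1, 3) @ CCM.T, 0.0, 1.0).reshape(rgb.shape)
```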
1407, tone mapping.
Since the luminance range of the synthesized image is large, to enable the generated image to be displayed normally on a common LDR display, the high-dynamic-range image must be tone-mapped to a low-dynamic-range image. Tone mapping may be global or local.
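As one concrete global operator, a sketch of Reinhard tone mapping on the luminance channel; the patent does not name a specific operator, so this choice is an assumption:

```python
import numpy as np

def reinhard_global(luminance: np.ndarray, key: float = 0.18) -> np.ndarray:
    """Global Reinhard tone mapping: compress HDR luminance into [0, 1)."""
    # Log-average luminance of the scene; the epsilon avoids log(0).
    log_avg = np.exp(np.mean(np.log(luminance + 1e-6)))
    scaled = key * luminance / log_avg
    return scaled / (1.0 + scaled)
```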
1408, RGB2YUV.
After tone mapping is completed, the RGB image may be converted into a YUV-format image, or the RGB-format image may be output directly.
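A minimal RGB-to-YUV conversion sketch; the BT.601 full-range coefficients are an assumption, since the text does not fix a standard:

```python
import numpy as np

# BT.601 full-range RGB -> YCbCr coefficients (assumed standard).
RGB2YUV = np.array([[ 0.299,     0.587,     0.114   ],
                    [-0.168736, -0.331264,  0.5     ],
                    [ 0.5,      -0.418688, -0.081312]], dtype=np.float32)

def rgb_to_yuv(rgb: np.ndarray) -> np.ndarray:
    """Convert an HxWx3 RGB image in [0, 1] to YUV, chroma centered at 0.5."""
    yuv = rgb.reshape(-1, 3) @ RGB2YUV.T
    yuv[:, 1:] += 0.5
    return yuv.reshape(rgb.shape)
```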
1409, YUV domain processing.
In YUV domain, the image can be further subjected to noise reduction and other optimization processing, so that the image quality is further improved.
Throughout the night-scene image signal processing flow, the photographing scheme of the application performs image optimization and fusion in the RAW domain, so the image retains more detail, improving the definition and quality of the output image. It is worth noting that this image processing method can be used not only in photographing scenarios, to generate photos of higher definition and more stable quality, but also in video shooting scenarios, to generate videos of higher definition and more stable quality.
The electronic device provided by the embodiment of the application may include a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor executes the computer program to implement the method according to any one of the above method embodiments.
The embodiments of the present application also provide a computer readable storage medium storing a computer program, which when executed by a processor implements steps of the above-described respective method embodiments.
Embodiments of the present application provide a computer program product which, when run on an electronic device, causes the electronic device to perform steps that may be carried out in the various method embodiments described above.
The embodiment of the application also provides a chip system, which comprises a processor, wherein the processor is coupled with a memory, and the processor executes a computer program stored in the memory to realize the method according to each method embodiment. The chip system can be a single chip or a chip module formed by a plurality of chips.
In the foregoing embodiments, the description of each embodiment has its own emphasis; for parts not described or detailed in one embodiment, reference may be made to the related descriptions of other embodiments. It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an execution order; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application. Furthermore, the terms "first," "second," "third," and the like in this specification and the appended claims are used to distinguish between descriptions and are not to be construed as indicating or implying relative importance. Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification do not necessarily all refer to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise.
Finally, it should be noted that: the foregoing is merely illustrative of specific embodiments of the present application, and the scope of the present application is not limited thereto, but any changes or substitutions within the technical scope of the present application should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (12)

1. A photographing method, the method comprising:
the electronic equipment triggers a photographing instruction to control the flash lamp to be in a normally-on state;
the electronic equipment acquires images in the normally-on state, wherein the acquired images comprise a first image acquired at a first exposure value, a second image acquired at a second exposure value, and a third image acquired at a third exposure value, and the first exposure value is a standard exposure value of the current scene; the calculation formula of the exposure value is EV = log₂(N²/t), wherein N represents the aperture, t represents the exposure time, and EV represents the exposure value; the exposure parameter value N₂²/t₂ of the second exposure value is 2³ times the exposure parameter value N₁²/t₁ of the first exposure value, and the exposure parameter value N₃²/t₃ of the third exposure value is 2⁶ times the exposure parameter value N₁²/t₁ of the first exposure value, wherein N₁, N₂, N₃ respectively represent the aperture size at the first exposure value, the second exposure value, and the third exposure value, and t₁, t₂, t₃ respectively represent the exposure time of the first exposure value, the second exposure value, and the third exposure value;
the electronic equipment registers the first image and the second image according to a low-brightness area, registers the second image and the third image according to a high-brightness area, obtains the registration result of the first image and the third image according to the registration result of the first image and the second image and the registration result of the second image and the third image, and performs fusion processing on the registered first image, second image, and third image in the RAW domain to generate an image to be output, wherein the high-brightness area is an area whose definition in the second image is greater than its definition in the first image, and the low-brightness area is an area whose definition in the second image is greater than its definition in the third image.
2. The method of claim 1, wherein the electronic device controlling the flash to be in a normally on state comprises:
The electronic equipment acquires the brightness of a photographing scene;
according to the preset corresponding relation between the brightness and the working current, the electronic equipment determines the working current corresponding to the brightness of the scene;
and the electronic equipment drives the flash lamp to be in a normally-on state according to the working current.
3. The method of claim 1, wherein the electronic device acquiring an image in the normally-on state comprises:
the electronic equipment determines a first exposure value according to the brightness of the current scene;
according to a preset exposure proportional relation, the electronic equipment determines the second exposure value and the third exposure value;
the electronic device respectively acquires images according to the determined first exposure value, second exposure value and third exposure value.
4. A method according to claim 3, wherein the electronic device separately acquiring images from the determined first, second and third exposure values, comprises:
the electronic equipment respectively determines the aperture sizes and/or the exposure times corresponding to the first exposure value, the second exposure value and the third exposure value;
and the electronic equipment respectively performs image acquisition according to the determined aperture size and/or exposure time.
5. The method of claim 1, wherein the electronic device performs fusion processing on the registered first image, the second image, and the third image in a RAW domain to generate an image to be output, and the method comprises:
the electronic equipment performs optimization and fusion processing according to the first image in the RAW format, the second image in the RAW format and the third image in the RAW format to obtain a first output image in the RAW format;
and the electronic equipment performs color space transformation on the first output image to obtain an image to be output.
6. The method of claim 5, wherein the electronic device performs optimization and fusion processing according to the first image in the RAW format, the second image in the RAW format, and the third image in the RAW format to obtain a first output image in the RAW format, and the method comprises:
and performing dead point correction, lens shading correction, black level correction, RAW domain noise reduction, white balance gain and image fusion processing on the first image, the second image and the third image to obtain a first output image in a RAW format.
7. The method of claim 6, wherein the first, second, and third images are noise reduced by a network model when the first, second, and third images are RAW domain noise reduced.
8. The method of claim 5, wherein the electronic device performs color space conversion on the first output image to obtain an image to be output, comprising:
the electronic equipment performs demosaicing processing on the first output image to obtain an RGB image;
and carrying out color correction and global color mapping on the RGB image, carrying out color space conversion on the processed image, and carrying out YUV domain processing on the converted image to obtain the image to be output.
9. The method of claim 1, wherein after the electronic device captures an image in the normally-on state, the method further comprises:
the upper layer of the electronic equipment sends a closing instruction of a normally-on state of a flash lamp to the bottom layer of the electronic equipment;
and when the upper layer of the electronic equipment receives a response of closing the flash lamp, the photographing function of the electronic equipment is updated to be in an available state.
10. The method of claim 9, wherein after the upper layer of the electronic device sends a flash off instruction to the lower layer of the electronic device, the method further comprises:
and the upper layer of the electronic equipment does not receive a response of closing the flash lamp within a preset first time period, and the HAL of the electronic equipment closes the flash lamp.
11. An electronic device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the method of any one of claims 1 to 10 when executing the computer program.
12. A computer readable storage medium storing a computer program, which when executed by a processor implements the method according to any one of claims 1-10.
CN202110927005.9A 2021-08-12 2021-08-12 Photographing method, electronic device, and computer-readable storage medium Active CN114095666B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110927005.9A CN114095666B (en) 2021-08-12 2021-08-12 Photographing method, electronic device, and computer-readable storage medium
PCT/CN2022/091901 WO2023015991A1 (en) 2021-08-12 2022-05-10 Photography method, electronic device, and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110927005.9A CN114095666B (en) 2021-08-12 2021-08-12 Photographing method, electronic device, and computer-readable storage medium

Publications (2)

Publication Number Publication Date
CN114095666A CN114095666A (en) 2022-02-25
CN114095666B true CN114095666B (en) 2023-09-22

Family

ID=80296148

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110927005.9A Active CN114095666B (en) 2021-08-12 2021-08-12 Photographing method, electronic device, and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN114095666B (en)
WO (1) WO2023015991A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114095666B (en) * 2021-08-12 2023-09-22 荣耀终端有限公司 Photographing method, electronic device, and computer-readable storage medium
CN116074634B (en) * 2022-05-27 2023-11-14 荣耀终端有限公司 Exposure parameter determination method and device
CN117689559A (en) * 2023-08-07 2024-03-12 上海荣耀智慧科技开发有限公司 Image fusion method and device, electronic equipment and storage medium
CN117408927A (en) * 2023-12-12 2024-01-16 荣耀终端有限公司 Image processing method, device and storage medium
CN117499789B (en) * 2023-12-25 2024-05-17 荣耀终端有限公司 Shooting method and related device

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002223387A (en) * 2001-01-26 2002-08-09 Olympus Optical Co Ltd Imaging unit
CN103957363A (en) * 2014-05-16 2014-07-30 深圳市中兴移动通信有限公司 Flash photographing method and photographing device
CN105163047A (en) * 2015-09-15 2015-12-16 厦门美图之家科技有限公司 HDR (High Dynamic Range) image generation method and system based on color space conversion and shooting terminal
CN107888842A (en) * 2017-12-28 2018-04-06 上海传英信息技术有限公司 Flash control method and control system based on an intelligent terminal
CN108717691A (en) * 2018-06-06 2018-10-30 成都西纬科技有限公司 Image fusion method and device, electronic device, and medium
CN109194873A (en) * 2018-10-29 2019-01-11 浙江大华技术股份有限公司 Image processing method and device
CN109788207A (en) * 2019-01-30 2019-05-21 Oppo广东移动通信有限公司 Image synthesis method and device, electronic device, and readable storage medium
CN110198419A (en) * 2019-06-28 2019-09-03 Oppo广东移动通信有限公司 Image processing method, device, storage medium and electronic equipment
WO2019183813A1 (en) * 2018-03-27 2019-10-03 华为技术有限公司 Image capture method and device
CN111064898A (en) * 2019-12-02 2020-04-24 联想(北京)有限公司 Image shooting method and device, equipment and storage medium
US10911691B1 (en) * 2019-11-19 2021-02-02 Samsung Electronics Co., Ltd. System and method for dynamic selection of reference image frame
KR20210018121A (en) * 2019-08-06 2021-02-17 삼성전자주식회사 Device and method for performing local histogram matching with global regularization and motion exclusion
WO2021082580A1 (en) * 2019-10-31 2021-05-06 北京迈格威科技有限公司 Night scene high dynamic range image generation method, device, and electronic apparatus
CN113038027A (en) * 2021-03-05 2021-06-25 上海商汤临港智能科技有限公司 Exposure control method, device, equipment and storage medium

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3967510B2 (en) * 1999-12-28 2007-08-29 富士フイルム株式会社 Digital camera
JP2006197243A (en) * 2005-01-13 2006-07-27 Canon Inc Imaging apparatus and method, program, and storage medium
US8174611B2 (en) * 2009-03-26 2012-05-08 Texas Instruments Incorporated Digital image segmentation using flash
US9706130B2 (en) * 2015-05-28 2017-07-11 Blackberry Limited Camera having HDR during pre-flash
WO2019082539A1 (en) * 2017-10-24 2019-05-02 ソニー株式会社 Control apparatus and control method, and program
CN109729279B (en) * 2018-12-20 2020-11-17 华为技术有限公司 Image shooting method and terminal equipment
CN109862282B (en) * 2019-02-18 2021-04-30 Oppo广东移动通信有限公司 Method and device for processing person image
CN110198417A (en) * 2019-06-28 2019-09-03 Oppo广东移动通信有限公司 Image processing method, device, storage medium and electronic equipment
CN114095666B (en) * 2021-08-12 2023-09-22 荣耀终端有限公司 Photographing method, electronic device, and computer-readable storage medium

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
A joint registration and fusion method for multi-band infrared images; 李英杰; 张俊举; 常本康; 钱芸生; 刘磊; Journal of Electronics & Information Technology, no. 01; full text *
Registration of differently exposed images of the same scene and HDR image synthesis; 华顺刚; 王丽丹; 欧宗瑛; Journal of Computer-Aided Design & Computer Graphics, no. 04; full text *
Detail-preserving multi-exposure image fusion; 李卫中; 易本顺; 邱康; 彭红; Optics and Precision Engineering, no. 09; full text *
Research on high dynamic range image fusion algorithms for dynamic targets; 都琳; 孙华燕; 王帅; 高宇轩; 齐莹莹; Acta Optica Sinica, no. 04; full text *

Also Published As

Publication number Publication date
WO2023015991A1 (en) 2023-02-16
CN114095666A (en) 2022-02-25

Similar Documents

Publication Publication Date Title
CN113132620B (en) Image shooting method and related device
WO2020168956A1 (en) Method for photographing the moon and electronic device
CN114095666B (en) Photographing method, electronic device, and computer-readable storage medium
CN113475057B (en) Video frame rate control method and related device
CN112887583B (en) Shooting method and electronic equipment
CN111327814A (en) Image processing method and electronic equipment
CN113810601B (en) Terminal image processing method and device and terminal equipment
WO2020029306A1 (en) Image capture method and electronic device
CN113542580B (en) Method and device for removing light spots of glasses and electronic equipment
US20240137659A1 (en) Point light source image detection method and electronic device
WO2023273323A9 (en) Focusing method and electronic device
CN113891009B (en) Exposure adjusting method and related equipment
CN113837984A (en) Playback abnormality detection method, electronic device, and computer-readable storage medium
CN113572948B (en) Video processing method and video processing device
US11816494B2 (en) Foreground element display method and electronic device
CN116389884B (en) Thumbnail display method and terminal equipment
CN116708751B (en) Method and device for determining photographing duration and electronic equipment
CN116051351B (en) Special effect processing method and electronic equipment
CN116017138B (en) Light measuring control display method, computer equipment and storage medium
CN115460343B (en) Image processing method, device and storage medium
CN116055872B (en) Image acquisition method, electronic device, and computer-readable storage medium
CN116233599B (en) Video mode recommendation method and electronic equipment
CN115705663B (en) Image processing method and electronic equipment
WO2024078275A1 (en) Image processing method and apparatus, electronic device and storage medium
CN117850989A (en) Service calling method, system and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant