WO2023016025A1 - Image capture method and device - Google Patents

Image capture method and device

Info

Publication number
WO2023016025A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
camera
blur
electronic device
strength
Prior art date
Application number
PCT/CN2022/093613
Other languages
French (fr)
Chinese (zh)
Inventor
乔晓磊
肖斌
丁大钧
朱聪超
Original Assignee
荣耀终端有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 荣耀终端有限公司 (Honor Device Co., Ltd.)
Publication of WO2023016025A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N23/951 Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57 Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices

Definitions

  • the present application relates to the field of photography, and in particular to a photographing method and device.
  • the camera function of mobile phones is becoming more and more powerful, and more and more users use mobile phones to take pictures.
  • many mobile phones are equipped with a conventional main camera plus an additional telephoto camera and/or an ultra-wide-angle camera, etc.
  • when taking photos, the mobile phone often selects a camera to shoot with according to the adjusted zoom factor.
  • two cameras are also used to shoot the subject at the same time, and then the images captured by the two cameras are fused, thereby improving the imaging quality of the photos taken by the mobile phone.
  • the present application provides a photographing method and device, which solves the problem that the fusion boundary of the photographed image is obvious when two images are acquired by two cameras for fusion to obtain a photographed image.
  • the present application provides a method for taking pictures, which can be applied to electronic devices.
  • the electronic device includes a first camera and a second camera, and the angle of view of the first camera is different from that of the second camera.
  • the method includes: the electronic device starts the camera; displays a preview interface, where the preview interface includes a first control; detects a first operation on the first control; in response to the first operation, acquires a first image through the first camera and a second image through the second camera, where the definition of the second image is higher than that of the first image; performs blur processing on the second image to obtain a third image; fuses the third image with the first image to obtain a fourth image; and saves the fourth image.
  • in this way, when the electronic device takes pictures with two cameras and fuses the separately acquired images into a captured image, the electronic device blurs the higher-definition image of the two, reducing its sharpness. This reduces the difference in definition between the images acquired by the two cameras that is caused by differences in resolution and noise-reduction capability between the cameras. Fusing two images with a small difference in definition then yields a captured image with an indistinct fusion boundary and a weak sense of splicing.
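the capture path described above can be sketched as follows. The `blur` and `fuse` helpers are hypothetical placeholders (the application does not prescribe particular blur or fusion algorithms); images are modeled as 2-D lists of floats:

```python
def blur(image, strength):
    """Toy box blur: average each pixel with its horizontal neighbors,
    repeated `strength` times (stands in for any blur in the text)."""
    for _ in range(strength):
        image = [
            [sum(row[max(0, x - 1):x + 2]) / len(row[max(0, x - 1):x + 2])
             for x in range(len(row))]
            for row in image
        ]
    return image

def fuse(base, detail):
    """Toy fusion: average the two aligned images pixel by pixel."""
    return [[(a + b) / 2 for a, b in zip(r1, r2)]
            for r1, r2 in zip(base, detail)]

def capture(first_image, second_image, strength=1):
    # The second image is the sharper one, so it is blurred before
    # fusion to narrow the sharpness gap at the fusion boundary.
    third_image = blur(second_image, strength)
    fourth_image = fuse(first_image, third_image)
    return fourth_image  # the saved, captured image
```

Real implementations would operate on camera buffers and use pyramid or frequency-domain fusion; this only illustrates the ordering of the steps.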
  • the field angle of the first camera is larger than the field angle of the second camera.
  • the image captured by the camera with a small field of view has a higher definition. Therefore, when the field of view of the first camera is larger than that of the second camera, the second image captured by the second camera is blurred. By reducing the sharpness of the second image, the difference between the sharpness of the first image and the blurred second image (ie, the third image) can be reduced, thereby reducing the splicing feeling of the fused image.
  • blurring the second image to obtain the third image includes: determining the blur strength according to the similarity between the second image and the first image and a preset correspondence between similarity and blur strength; and performing blur processing on the second image according to the determined blur strength.
  • the sharpness difference between the first image and the second image can be inferred from the similarity: the higher the similarity, the smaller the sharpness difference. Determining the blur strength according to the similarity therefore lets the degree of blur processing track the actual sharpness gap between the two images. This avoids over-blurring the second image to the point where the processed image is less sharp than the first image and the fused image gains no sharpness over the first image.
  • the similarity is a structural similarity SSIM value.
  • the similarity is represented by the structural similarity value, and the similarity between the first image and the second image can be more accurately measured from the perspective of image composition.
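as an illustration of the SSIM measure, here is a minimal global (single-window) version for two equal-length grayscale pixel lists. Practical implementations slide a local window over the images; the constants follow the usual K1 = 0.01, K2 = 0.03 convention for 8-bit data:

```python
def ssim(x, y, c1=(0.01 * 255) ** 2, c2=(0.03 * 255) ** 2):
    """Global structural similarity between two equal-length pixel
    lists (a single-window simplification of sliding-window SSIM)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n                      # means
    vx = sum((a - mx) ** 2 for a in x) / n               # variances
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))
```

Identical images score 1.0; structural differences pull the value toward 0 (or negative for anti-correlated structure).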
  • the similarity is inversely proportional to the blur strength.
  • making the similarity inversely proportional to the blur strength avoids over-blurring the second image when the two images are already close in sharpness, which would otherwise leave the processed second image less sharp than the first image and the fused image no sharper than the first image.
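one possible form of the preset correspondence (the endpoints here are illustrative; the application only fixes the inverse trend, cf. FIG. 9):

```python
def blur_strength_from_similarity(ssim_value,
                                  s_min=0.6, s_max=1.0,
                                  k_min=0.0, k_max=5.0):
    """Map an SSIM value to a blur strength: the higher the similarity
    (i.e. the smaller the sharpness gap), the weaker the blur."""
    # Clamp the similarity into the configured range.
    s = min(max(ssim_value, s_min), s_max)
    # Linear interpolation with a negative slope (inverse relationship).
    t = (s - s_min) / (s_max - s_min)
    return k_max - t * (k_max - k_min)
```

With these hypothetical endpoints, perfectly similar images (SSIM 1.0) receive no blur, and the blur ramps up as similarity falls toward 0.6.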
  • performing blur processing on the second image to obtain the third image includes: determining the blur strength according to the sensitivity of the second image and a preset correspondence between sensitivity and blur strength; and performing blur processing on the second image according to the determined blur strength.
  • the camera capable of obtaining the clearer image has a stronger denoising capability; therefore, when image noise increases with increased sensitivity, the second image remains clearer than the first image. That is, the higher the sensitivity, the greater the difference in definition between the second image and the first image. Determining the blur strength according to the sensitivity therefore lets the degree of blur processing track the sharpness gap between the two images, avoiding over-blurring the second image such that the processed image is less sharp than the first image and the fused image gains no sharpness over the first image.
  • ISO is proportional to blur strength.
  • making the sensitivity proportional to the blur strength avoids, when the sensitivity is low and the sharpness difference between the first image and the second image is small, over-blurring the second image such that the processed image is less sharp than the first image and the fused image gains no sharpness over the first image.
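the sensitivity-keyed correspondence could be stored as a piecewise-linear lookup table; the ISO breakpoints and strengths below are invented for illustration, and the same scheme would apply to the ambient-brightness correspondence with a brightness-keyed table:

```python
def blur_strength_from_iso(iso, table=((100, 0.0), (800, 1.0),
                                       (3200, 3.0), (12800, 5.0))):
    """Look up a blur strength from sensitivity via a piecewise-linear
    table (illustrative numbers): higher ISO yields a stronger blur,
    since the sharpness gap between the two cameras grows with noise."""
    if iso <= table[0][0]:
        return table[0][1]
    for (x0, y0), (x1, y1) in zip(table, table[1:]):
        if iso <= x1:
            # Linear interpolation inside the bracketing segment.
            return y0 + (iso - x0) * (y1 - y0) / (x1 - x0)
    return table[-1][1]
```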
  • performing blur processing on the second image to obtain the third image includes: determining the blur strength according to the ambient brightness corresponding to the second image and a preset correspondence between ambient brightness and blur strength; and performing blur processing on the second image according to the determined blur strength.
  • the camera capable of obtaining the clearer image has a stronger denoising capability; therefore, when image noise increases with ambient brightness, the second image remains clearer than the first image. That is, the higher the ambient brightness, the greater the difference in definition between the second image and the first image. Determining the blur strength according to the ambient brightness therefore lets the degree of blur processing track the sharpness gap between the two images, avoiding over-blurring the second image such that the processed image is less sharp than the first image and the fused image gains no sharpness over the first image.
  • the ambient brightness is proportional to the blur strength.
  • making the ambient brightness proportional to the blur strength avoids, when the sharpness difference between the first image and the second image is small, over-blurring the second image such that the processed image is less sharp than the first image and the fused image gains no sharpness over the first image.
  • the blur processing includes any of the following: Gaussian blur, surface blur, box blur, Kawase blur, dual blur, bokeh blur, tilt-shift blur, iris blur, grainy blur, radial blur, and directional blur.
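of these, Gaussian blur is the most common; a minimal separable implementation on a 2-D list of floats (edge-clamped, for illustration only; production code would use an optimized library routine) looks like:

```python
import math

def gaussian_kernel(sigma, radius=None):
    """Normalized 1-D Gaussian kernel; radius defaults to ceil(3*sigma)."""
    radius = radius if radius is not None else max(1, math.ceil(3 * sigma))
    k = [math.exp(-(i * i) / (2 * sigma * sigma))
         for i in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def gaussian_blur(image, sigma):
    """Separable Gaussian blur: one horizontal pass, one vertical pass."""
    kernel = gaussian_kernel(sigma)
    r = len(kernel) // 2

    def convolve_row(row):
        n = len(row)
        # Clamp indices at the edges instead of zero-padding.
        return [sum(kernel[j + r] * row[min(max(i + j, 0), n - 1)]
                    for j in range(-r, r + 1)) for i in range(n)]

    rows = [convolve_row(row) for row in image]           # horizontal pass
    cols = [convolve_row(list(col)) for col in zip(*rows)]  # vertical pass
    return [list(row) for row in zip(*cols)]
```

Because the kernel is normalized, a constant image passes through unchanged, which is a quick sanity check for any blur implementation.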
  • the first image is an image obtained by digitally zooming the image captured by the first camera to the current zoom factor.
  • alternatively, the first image is an image directly captured by the first camera; fusing the third image with the first image to obtain the fourth image then includes: digitally zooming the first image to adjust it to the current zoom factor; and fusing the third image with the digitally zoomed first image to obtain the fourth image.
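the digital zoom step mentioned here is crop-then-upscale; a sketch using nearest-neighbor resampling for brevity (real pipelines use bilinear or better interpolation):

```python
def digital_zoom(image, factor):
    """Digitally zoom a 2-D list of pixels: crop the central 1/factor
    region, then upscale back to the original size (nearest-neighbor)."""
    h, w = len(image), len(image[0])
    ch, cw = max(1, round(h / factor)), max(1, round(w / factor))
    top, left = (h - ch) // 2, (w - cw) // 2
    crop = [row[left:left + cw] for row in image[top:top + ch]]
    # Nearest-neighbor upscale back to h x w.
    return [[crop[int(y * ch / h)][int(x * cw / w)] for x in range(w)]
            for y in range(h)]
```

The third image can then be fused with `digital_zoom(first_image, current_zoom)`; the upscale step is exactly why digital zoom loses sharpness relative to a native capture.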
  • the present application provides a photographing device, which can be applied to an electronic device, and the electronic device includes a first camera and a second camera, and the first camera has a different field of view than the second camera.
  • the device is used to implement the method in the first aspect above.
  • the functions of the device can be realized by hardware, or by hardware executing corresponding software.
  • Hardware or software includes one or more modules corresponding to the above functions, for example, a processing module and a display module.
  • the display module can be used to display a preview interface when the electronic device starts the camera, and the preview interface includes the first control;
  • the processing module can be used to detect the first operation on the first control; in response to the first operation, through the first The first image is acquired by the camera, the second image is acquired by the second camera, and the definition of the second image is higher than that of the first image; the second image is blurred to obtain a third image; the third image and the first The images are fused to obtain a fourth image; and the fourth image is saved.
  • the field angle of the first camera is larger than the field angle of the second camera.
  • the processing module is specifically configured to determine the blur strength according to the similarity between the second image and the first image and the preset correspondence between similarity and blur strength, and to perform blur processing on the second image according to the determined blur strength.
  • the similarity is a structural similarity SSIM value.
  • the similarity is inversely proportional to the blur strength.
  • the processing module is specifically configured to determine the blur strength according to the sensitivity corresponding to the second image and the preset correspondence between sensitivity and blur strength, and to perform blur processing on the second image according to the determined blur strength.
  • ISO is proportional to blur strength.
  • the processing module is specifically configured to determine the blur strength according to the ambient brightness corresponding to the second image and the preset correspondence between ambient brightness and blur strength, and to perform blur processing on the second image according to the determined blur strength.
  • the ambient brightness is proportional to the blur strength.
  • the blur processing includes any of the following: Gaussian blur, surface blur, box blur, Kawase blur, dual blur, bokeh blur, tilt-shift blur, iris blur, grainy blur, radial blur, and directional blur.
  • the first image is an image obtained by digitally zooming the image captured by the first camera to the current zoom factor.
  • the first image is an image directly captured by the first camera; the processing module is specifically configured to digitally zoom the first image so as to adjust it to the current zoom factor, and to fuse the third image with the digitally zoomed first image to obtain the fourth image.
  • an embodiment of the present application provides an electronic device, including: a processor, and a memory configured to store instructions executable by the processor.
  • when the instructions are executed by the processor, the electronic device implements the photographing method described in the first aspect or any possible implementation manner of the first aspect.
  • the embodiment of the present application provides a computer-readable storage medium on which computer program instructions are stored.
  • when the computer program instructions are executed by an electronic device, the electronic device is made to implement the photographing method described in the first aspect or any possible implementation manner of the first aspect.
  • the embodiment of the present application provides a computer program product, including computer-readable code; when the computer-readable code is run on an electronic device, the electronic device implements the photographing method described in the first aspect or any possible implementation manner of the first aspect.
  • FIG. 1 is a schematic diagram of an application using dual cameras provided by the related art.
  • FIG. 2 is a schematic diagram of an image fusion application provided by the related art.
  • FIG. 3 is a schematic diagram of another image fusion application provided by the related art.
  • FIG. 4 is an application schematic diagram of a photographing method provided in an embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
  • FIG. 6 is a schematic composition diagram of a system architecture of an electronic device provided by an embodiment of the present application.
  • FIG. 7 is a schematic flowchart of a photographing method provided in an embodiment of the present application.
  • FIG. 8 is a schematic diagram of a scene of a photographing operation provided by an embodiment of the present application.
  • FIG. 9 is a schematic diagram of the relationship between structural similarity and blur strength provided by the embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of a photographing device provided by an embodiment of the present application.
  • the cameras of mobile phones are constantly upgraded, and the camera functions of mobile phones are becoming more and more powerful.
  • many mobile phones will also be equipped with additional cameras, such as telephoto cameras and ultra-wide-angle cameras, whose focal lengths differ from that of the main camera (that is, cameras with different fields of view, where the camera with the larger field of view has the shorter focal length). Therefore, when the user uses the mobile phone to take pictures, the mobile phone can provide a longer focal length through the telephoto camera to obtain a better telephoto shooting effect, and can provide a larger field of view through the ultra-wide-angle camera to obtain a better wide-angle shooting effect.
  • generally, the mobile phone will select the corresponding camera to take pictures according to the zoom factor. For example, when the user increases the zoom factor to enlarge the captured image, the mobile phone can choose the telephoto camera, so as to obtain a higher-quality captured image while enlarging it. Likewise, when the user lowers the zoom factor to shrink the captured image, the mobile phone can choose the ultra-wide-angle camera. Moreover, since most cameras currently installed on mobile phones are fixed-focus cameras (i.e. cameras with fixed focal lengths), each camera can obtain a high-quality image only at the zoom factor corresponding to its focal length.
  • for example, the zoom factor corresponding to the focal length of the main camera can be set to 1.0x, that of the ultra-wide-angle camera to 0.4x, and that of the telephoto camera to 3.5x. That is, when the zoom factor is adjusted to 1.0x, the image captured by the main camera has relatively high imaging quality; when it is adjusted to 0.4x, the ultra-wide-angle camera captures the higher-quality image; and when it is adjusted to 3.5x, the telephoto camera does.
  • on this basis, when the zoom factor adjusted by the user is greater than or equal to 0.4x and less than 1.0x, the mobile phone can use the ultra-wide-angle camera; when it is greater than or equal to 1.0x and less than 3.5x, the mobile phone can use the main camera; and when it is greater than or equal to 3.5x, the mobile phone can use the telephoto camera.
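the selection rule above, using the example thresholds from the text (camera names are illustrative labels, not identifiers from the application):

```python
def select_camera(zoom):
    """Pick the capture camera from the user-adjusted zoom factor,
    using the example thresholds 0.4x / 1.0x / 3.5x."""
    if 0.4 <= zoom < 1.0:
        return "ultra-wide"
    if 1.0 <= zoom < 3.5:
        return "main"
    if zoom >= 3.5:
        return "telephoto"
    raise ValueError("zoom factor below the supported range")
```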
  • however, when the zoom factor adjusted by the user does not match the zoom factor corresponding to any camera's focal length, the imaging quality of the captured image will be reduced.
  • for example, at 1.0x the main camera of the mobile phone can capture images with higher image quality, but when the user adjusts the zoom factor to 2.5x, the mobile phone will continue to use the main camera. Because the focal length of the main camera is fixed, the captured image is obtained by digital zoom of the image captured by the main camera (that is, the image captured by the main camera is enlarged to obtain a captured image corresponding to the zoom factor), so its clarity is reduced compared with the image captured directly by the main camera.
  • similarly, when the user adjusts the zoom factor to 0.9x, the mobile phone will use the ultra-wide-angle camera, but because this zoom factor is larger than the zoom factor corresponding to the focal length of the ultra-wide-angle camera, the captured image is obtained by digital zoom of the image captured by the ultra-wide-angle camera (that is, by enlarging it to the zoom factor), so its clarity is reduced compared with the image captured directly by the ultra-wide-angle camera.
  • that is, only at 0.4x can the mobile phone use the ultra-wide-angle camera to capture images with the higher imaging quality.
  • for this reason, in the related art the mobile phone will simultaneously use two cameras with adjacent focal lengths to shoot, and then fuse the images respectively captured by the two cameras to obtain the final captured image.
  • continue with the example in which the zoom factor corresponding to the focal length of the main camera is 1.0x, that of the ultra-wide-angle camera is 0.4x, and that of the telephoto camera is 3.5x.
  • in this case, when the user adjusts the zoom factor to 2.0x-3.5x, the mobile phone will use the main camera and the telephoto camera to shoot at the same time, so that the digitally zoomed image captured by the main camera can be fused with the image captured by the telephoto camera to obtain the final captured image.
  • when the user adjusts the zoom factor to 0.6x-0.9x, the mobile phone will use the main camera in addition to the ultra-wide-angle camera, so that the digitally zoomed image captured by the ultra-wide-angle camera can be fused with the image captured by the main camera to obtain the final captured image.
  • for example, when the user adjusts the zoom factor to 2.5x, the mobile phone will use the telephoto camera for shooting in addition to the main camera.
  • since the zoom factor corresponding to the focal length of the telephoto camera is 3.5x, which is greater than the user-adjusted 2.5x, the image captured by the telephoto camera (as shown in (b) in Figure 2) is part of the image obtained when the image captured by the main camera is digitally zoomed to 2.5x (as shown in (a) in Figure 2).
  • therefore, the mobile phone can obtain a captured image by fusing the image captured by the main camera, digitally zoomed to 2.5x, with the image captured by the telephoto camera (see (c) in Figure 2).
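the overlap implied here is easy to compute: when the wide image is digitally zoomed to the current zoom factor, the longer-focal-length camera's frame covers the central `current_zoom / tele_zoom` fraction of each dimension. A hedged sketch (pixel dimensions are examples, not values from the application):

```python
def overlap_region(width, height, current_zoom, tele_zoom):
    """Rectangle (left, top, w, h) of the digitally zoomed wide image
    (at `current_zoom`) covered by the camera whose native zoom factor
    is `tele_zoom`: the central current_zoom/tele_zoom fraction."""
    frac = current_zoom / tele_zoom
    rw, rh = round(width * frac), round(height * frac)
    left, top = (width - rw) // 2, (height - rh) // 2
    return left, top, rw, rh
```

For a 1400x700 frame at 2.5x fused with a 3.5x telephoto, the telephoto detail lands in the central 1000x500 region; outside that region only the digitally zoomed wide image contributes, which is where the fusion boundary appears.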
  • likewise, when the user adjusts the zoom factor to 0.7x, the mobile phone will use the main camera in addition to the ultra-wide-angle camera. Since the zoom factor corresponding to the focal length of the main camera is 1.0x, which is greater than the user-adjusted 0.7x, the image captured by the main camera is part of the image obtained when the image captured by the ultra-wide-angle camera is digitally zoomed to 0.7x. Therefore, the mobile phone can fuse the ultra-wide-angle image, digitally zoomed to 0.7x, with the image captured by the main camera to improve the sharpness of the overlapping region, thereby improving the clarity of the final captured image.
  • an embodiment of the present application provides a photographing method, which can be applied to a scene where an electronic device with a photographing function takes pictures through multiple cameras provided.
  • in the photographing method, as shown in FIG. 4, the electronic device may use two cameras with different focal lengths (that is, cameras with different fields of view) to acquire two images, for example a first image and a second image. Then one of the images, such as the second image, is blurred, and the blurred image (the third image) is fused with the first image; the fused image can serve as the captured image (also called the fourth image).
  • among the two images, the second image may be the one obtained by the camera with the relatively larger focal length (that is, the camera with the relatively smaller field of view), and that image is the one that is blurred.
  • alternatively, the second image may be the image acquired by the camera with the relatively smaller focal length (that is, the camera with the relatively larger field of view); in that case, the electronic device determines which of the two images has the higher definition and performs blurring on that image.
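the application does not specify how definition is compared; one common proxy (an assumption here, not the patented method) is the variance of the Laplacian, which is larger for sharper images:

```python
def sharpness(image):
    """Variance of a 4-neighbor Laplacian over the interior of a
    2-D list of floats; higher means sharper (more high-frequency detail)."""
    h, w = len(image), len(image[0])
    lap = [4 * image[y][x] - image[y - 1][x] - image[y + 1][x]
           - image[y][x - 1] - image[y][x + 1]
           for y in range(1, h - 1) for x in range(1, w - 1)]
    m = sum(lap) / len(lap)
    return sum((v - m) ** 2 for v in lap) / len(lap)

def pick_image_to_blur(img_a, img_b):
    """Return the label of the image to blur: the higher-definition one."""
    return "a" if sharpness(img_a) > sharpness(img_b) else "b"
```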
  • the photographing method may be applied when the zoom factor adjusted by the user is not the zoom factor corresponding to the focal length of each camera set in the electronic device. And when the number of cameras with different focal lengths set on the electronic device is three or more, the two cameras involved in the photographing method can be specifically determined according to the zoom factor adjusted by the user. For example, a camera with a focal length corresponding to a zoom factor greater than the user-adjusted zoom factor and a camera with a focal length corresponding to a zoom factor smaller than the user-adjusted zoom factor may be used as the two cameras involved in the photographing method.
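the pair-selection rule just described (choose the camera whose native zoom factor is below the user-adjusted value and the one whose native factor is above it) can be sketched as follows; camera names and zoom values are the example configuration from the text:

```python
def select_camera_pair(zoom, cameras=(("ultra-wide", 0.4),
                                      ("main", 1.0),
                                      ("telephoto", 3.5))):
    """With three or more fixed-focus cameras, pick the pair whose
    native zoom factors bracket the user-adjusted zoom factor."""
    below = [c for c in cameras if c[1] <= zoom]
    above = [c for c in cameras if c[1] > zoom]
    if not below or not above:
        return None  # outside the bracketed range; single-camera case
    return (max(below, key=lambda c: c[1])[0],
            min(above, key=lambda c: c[1])[0])
```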
  • the electronic device may also always use two fixed cameras with different focal lengths to perform image capture and the like. Therefore, in the embodiment of the present application, there is no limitation on when the electronic device applies the photographing method to capture images together through the first camera and the second camera, and it can be set according to actual needs.
  • the focal lengths of two cameras with different focal lengths used at the same time may be adjacent focal lengths or the like.
  • one of the two cameras is a super wide-angle camera and the other is a main camera, or one is a main camera and the other is a telephoto camera.
  • the captured image (that is, the above-mentioned fourth image) refers to an image finally captured by the mobile phone when the user uses the mobile phone to capture the image, or an image finally captured by the mobile phone and displayed to the user.
  • it should be noted that the image captured by the camera with the relatively smaller focal length of the two cameras is usually adjusted by digital zoom to the zoom factor adjusted by the user (that is, the current zoom factor) before being fused with the third image, so that the fused image can satisfy the current zoom factor.
  • no limitation is imposed on the specific manner of making the final captured image satisfy the zoom factor adjusted by the user.
  • in this way, when the electronic device shoots through two cameras and fuses the separately captured images into a captured image, the electronic device reduces the sharpness of the higher-definition image of the two by blurring it. This reduces the difference in definition between the images acquired by the two cameras that is caused by differences in resolution and noise-reduction capability between the cameras. Fusing two images with a small difference in definition then yields a captured image with an indistinct fusion boundary and a weak sense of splicing.
  • the electronic device with a camera function can be a mobile phone, tablet computer, handheld computer, PC, personal digital assistant (PDA), wearable device (such as a smart watch or smart bracelet), smart home device (such as a TV), in-vehicle computer, smart screen, game console, or augmented reality (AR)/virtual reality (VR) device, etc.
  • the electronic device is provided with at least two cameras with different focal lengths (that is, two cameras with different viewing angles).
  • the electronic device is provided with a main camera (usually a wide-angle camera), a telephoto camera with a longer focal length than the main camera, and an ultra-wide-angle camera with a shorter focal length than the main camera.
  • FIG. 5 shows a schematic structural diagram of an electronic device provided by an embodiment of the present application. For example, the electronic device shown in FIG. 5 may be a mobile phone.
  • the electronic device may include a processor 510, an external memory interface 520, an internal memory 521, a universal serial bus (USB) interface 530, a charging management module 540, a power management module 541, a battery 542, antenna 1, antenna 2, a mobile communication module 550, a wireless communication module 560, an audio module 570, a speaker 570A, a receiver 570B, a microphone 570C, an earphone jack 570D, a sensor module 580, a button 590, a motor 591, an indicator 592, a camera 593, a display screen 594, a subscriber identification module (SIM) card interface 595, etc.
  • the sensor module 580 may include a pressure sensor 580A, a gyroscope sensor 580B, an air pressure sensor 580C, a magnetic sensor 580D, an acceleration sensor 580E, a distance sensor 580F, a proximity light sensor 580G, a fingerprint sensor 580H, a temperature sensor 580J, a touch sensor 580K, an ambient light sensor 580L, a bone conduction sensor 580M, etc.
  • the structure shown in this embodiment does not constitute a specific limitation on the electronic device.
  • the electronic device may include more or fewer components than shown, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
• the processor 510 may include one or more processing units. For example, the processor 510 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • a controller can be the nerve center and command center of an electronic device.
• the controller can generate an operation control signal according to the instruction opcode and timing signal, thereby controlling instruction fetching and execution.
  • a memory may also be provided in the processor 510 for storing instructions and data.
  • the memory in processor 510 is a cache memory.
• the memory may hold instructions or data that the processor 510 has just used or used cyclically. If the processor 510 needs to use the instructions or data again, it can retrieve them directly from this memory, avoiding repeated access and reducing the waiting time of the processor 510, thus improving system efficiency.
  • processor 510 may include one or more interfaces.
• the interface may include an inter-integrated circuit (inter-integrated circuit, I2C) interface, an inter-integrated circuit sound (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver/transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (general-purpose input/output, GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, etc.
  • the wireless communication function of the electronic device can be realized by the antenna 1, the antenna 2, the mobile communication module 550, the wireless communication module 560, the modem processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in an electronic device can be used to cover a single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 550 can provide wireless communication solutions including 2G/3G/4G/5G applied to electronic devices.
  • the mobile communication module 550 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 550 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 550 can also amplify the signal modulated by the modem processor, convert it into electromagnetic wave and radiate it through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 550 may be set in the processor 510 .
  • at least part of the functional modules of the mobile communication module 550 and at least part of the modules of the processor 510 may be set in the same device.
• the wireless communication module 560 can provide wireless communication solutions applied to electronic devices, including wireless local area networks (wireless local area networks, WLAN) (such as a wireless fidelity (wireless fidelity, Wi-Fi) network), Bluetooth (bluetooth, BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication technology (near field communication, NFC), infrared technology (infrared, IR), etc.
  • the wireless communication module 560 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 560 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 510 .
  • the wireless communication module 560 can also receive the signal to be transmitted from the processor 510 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the electronic device is coupled to the mobile communication module 550, and the antenna 2 is coupled to the wireless communication module 560, so that the electronic device can communicate with the network and other devices through wireless communication technology.
• the wireless communication technology may include global system for mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, etc.
• the GNSS may include a global positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a Beidou navigation satellite system (beidou navigation satellite system, BDS), a quasi-zenith satellite system (quasi-zenith satellite system, QZSS), and/or satellite based augmentation systems (satellite based augmentation systems, SBAS).
  • the electronic device realizes the display function through the GPU, the display screen 594, and the application processor.
  • the GPU is a microprocessor for image processing, connected to the display screen 594 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 510 may include one or more GPUs that execute program instructions to generate or alter display information.
  • the display screen 594 is used to display images, videos and the like.
  • Display 594 includes a display panel.
• the display panel can be a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light emitting diodes (quantum dot light emitting diodes, QLED), etc.
  • the electronic device may include 1 or N display screens 594, where N is a positive integer greater than 1.
  • the electronic device can realize the shooting function through ISP, camera 593 , video codec, GPU, display screen 594 and application processor.
  • the electronic device may include 1 or N cameras 593, where N is a positive integer greater than 1.
  • the electronic device may include three cameras, one of which is a main camera, one is a telephoto camera, and one is a super wide-angle camera.
  • the internal memory 521 may be used to store computer-executable program codes including instructions.
  • the processor 510 executes various functional applications and data processing of the electronic device by executing instructions stored in the internal memory 521 .
  • the internal memory 521 may include an area for storing programs and an area for storing data.
  • the stored program area can store an operating system, at least one application program required by a function (such as a sound playing function, an image playing function, etc.) and the like.
  • the storage data area can store data (such as audio data, phone book, etc.) created during the use of the electronic device.
  • the internal memory 521 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash memory (universal flash storage, UFS) and the like.
• the structure of the electronic device may include fewer structures than those shown in FIG. 5, or may include more structures than those shown in FIG. 5, which is not limited here.
• the system architecture of the electronic device may include an application layer, a framework layer, a hardware abstraction layer, a driver layer, a firmware layer, and a hardware layer.
  • the application layer may be used to deploy application programs.
  • a camera application may be deployed in the application layer.
• the framework layer may be a system framework such as Frameworks, which is not limited here.
  • the hardware abstraction layer can deploy a unified interface of each piece of hardware.
  • a camera hardware abstraction layer (Camera HAL3) can be deployed in the hardware abstraction layer.
  • the module (camera algorithm module (Libcamera algo)) for realizing the photographing method provided by the embodiment of the present application may also be deployed in the hardware abstraction layer.
  • the driver layer can be used to deploy the driver components of each hardware device.
• the driver layer can be deployed with a video device driver (V4L2 Driver), an image video processor (image video processor, IVP) driver or a DSP driver (DSP Driver), an NPU driver (NPU Driver), a GPU driver (GPU Driver), etc.
  • the firmware layer can be used to deploy the firmware of each hardware device.
• the firmware layer can deploy Internet of Things firmware (lite-OS FW) so as to drive the image sensor, the time of flight (time of flight, TOF) sensor, the ISP, and so on.
  • the hardware layer includes various hardware provided by the electronic device.
  • the hardware layer may include image sensor, TOF sensor, ISP, IVP or DSP, NPU, GPU, etc.
  • the module (camera algorithm module) implementing the photographing method provided by the embodiment of the present application may be initialized in the hardware abstraction layer when the user opens the camera application deployed in the application layer.
• the camera application includes a preview interface, and the preview interface includes a first control (also called a shutter control).
  • the camera operation is the user’s first operation on the first control, such as clicking the shutter control, etc.
• the camera application in the application layer can send the photographing command through the framework layer, the camera hardware abstraction layer, the video device driver, and the IoT firmware to the image sensor, so that the image sensor can acquire an image in response to the command.
• each camera has its own image sensor, and the camera application may send the photographing instruction to the image sensor of the corresponding camera according to the cameras to be used.
• for example, if the electronic device needs to use the main camera and the telephoto camera to shoot together, the camera application can send a photographing instruction to the image sensor of the main camera and the image sensor of the telephoto camera respectively; if the electronic device needs to use the main camera and the ultra-wide-angle camera to shoot together, the camera application can send a photographing instruction to the image sensor of the main camera and the image sensor of the ultra-wide-angle camera respectively.
  • the image sensor may send the image to the ISP.
• after the ISP processes the received images according to a preset method, it can send the two processed images to the camera hardware abstraction layer through the IoT firmware and the video device driver.
• after the camera hardware abstraction layer receives the two images, it can send them to the camera algorithm module that implements the photographing method of the embodiment of the present application.
• after the camera algorithm module receives the two images, it can use corresponding drivers (such as the IVP or DSP driver, NPU driver, GPU driver, etc.) to call corresponding hardware (such as the IVP or DSP, NPU, GPU, etc.) to perform blur processing on the image captured by the camera with the relatively long focal length among the two images, and fuse the blurred image with the other image to obtain a captured image. Finally, the camera algorithm module can obtain the captured image from the hardware that performed the fusion, and send the captured image, obtained by fusing the blurred image with the other image, through the camera hardware abstraction layer and the framework layer to the camera application deployed in the application layer, so that the camera application displays and/or stores the received captured image.
• the following takes as an example an electronic device that is a mobile phone provided with a main camera (wide-angle camera), a telephoto camera, and an ultra-wide-angle camera, where the zoom factor corresponding to the focal length of the main camera is set to 1.0x, the zoom factor corresponding to the focal length of the ultra-wide-angle camera is set to 0.4x, and the zoom factor corresponding to the focal length of the telephoto camera is set to 3.5x.
• when the user adjusts the zoom factor to 2.0x-3.5x, the mobile phone uses the main camera and the telephoto camera for shooting; when the user adjusts the zoom factor to 0.6x-0.9x, the mobile phone uses the ultra-wide-angle camera and the main camera for shooting.
• the following illustrates a specific implementation of the photographing method provided in the embodiment of the present application.
  • FIG. 7 shows a schematic flowchart of a photographing method provided by an embodiment of the present application. As shown in FIG. 7, the photographing method may include the following steps S701-S703.
• when the user takes a picture, the mobile phone can use the corresponding two cameras with different focal lengths to shoot, so as to obtain the images respectively captured by the two cameras. To this end, the mobile phone executes the following S701.
  • the mobile phone acquires a first image through a first camera, and acquires a second image through a second camera.
• the acquisition of the first image by the first camera and the acquisition of the second image by the second camera may be performed at the same time, or may be performed separately at an interval, which is not limited here.
  • the focal lengths of the first camera and the second camera are different.
• in this example, the focal length of the second camera is greater than the focal length of the first camera (hereinafter this is taken as an example); that is, the first camera may be the main camera of the electronic device, in which case the second camera may be the telephoto camera of the electronic device, or the first camera may be the ultra-wide-angle camera of the electronic device, in which case the second camera may be the main camera of the electronic device. Therefore, the content of the second image captured by the second camera with the relatively long focal length is included in the first image captured by the first camera with the relatively short focal length, which facilitates subsequent fusion of the first image and the second image.
  • the camera combination composed of the first camera and the second camera corresponds to a preset range, that is, different preset ranges correspond to different camera combinations.
• for example, when the preset range is 2.0x-3.5x, the first camera may be the main camera and the second camera may be the telephoto camera; when the preset range is 0.6x-0.9x, the first camera may be the ultra-wide-angle camera and the second camera may be the main camera.
• for example, when the current zoom factor is within 2.0x-3.5x, the mobile phone can capture the first image through the main camera (that is, the first camera is the main camera) and capture the second image through the telephoto camera (that is, the second camera is the telephoto camera); when the current zoom factor is within 0.6x-0.9x, the mobile phone can capture the first image through the ultra-wide-angle camera (that is, the first camera is the ultra-wide-angle camera) and capture the second image through the main camera (that is, the second camera is the main camera).
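The zoom-range-to-camera-pair selection described above can be sketched as follows. This is a minimal illustration using the example ranges 0.6x-0.9x and 2.0x-3.5x; the function name and the fallback to single-camera capture for other zoom factors are assumptions, not part of the embodiment:

```python
def select_cameras(zoom: float):
    """Pick the (first camera, second camera) pair for a requested zoom factor.

    The first camera has the shorter focal length of the pair, so its image
    fully contains the second camera's field of view.
    """
    if 0.6 <= zoom <= 0.9:
        return ("ultra-wide", "main")
    if 2.0 <= zoom <= 3.5:
        return ("main", "telephoto")
    return None  # outside the dual-camera preset ranges: single-camera capture
```

For instance, a current zoom factor of 2.5x selects the main + telephoto pair, matching the preview example with the 2.5x zoom control.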
  • the camera interface of the mobile phone may include a preview interface
  • the preview interface includes a first control (or called a shutter control, a camera control), and the user's camera operation may be the user's first operation on the first control (such as clicking operation, long press operation, etc.).
  • a preview interface is displayed on the mobile phone, and the interface includes a preview frame, a camera control 801 and a zoom control 802 .
  • the preview frame is used to display the current zoom factor and the preview image of the subject in the photographing mode.
  • the camera control 801 is used to trigger the camera action of the mobile phone.
  • the zoom control 802 can be used to adjust the zoom factor, and the current zoom factor can be displayed on the zoom control.
  • a preview image when the zoom factor is 2.5x may be displayed in the preview box.
  • the user can click the camera control 801 to perform a camera operation.
• in some embodiments, the user's photographing operation may also be a pressing operation on a preset key (such as a power key or a volume key). The embodiment of the present application therefore places no restriction on the user's photographing operation: any operation used to trigger the mobile phone to take a photograph is the user's photographing operation.
• the first image captured by the mobile phone through the first camera may be an image captured by the first camera that matches the zoom factor corresponding to the focal length of the first camera (for example, an image captured by the first camera and processed by the ISP), or may be an image captured by the first camera that matches the zoom factor corresponding to the focal length of the first camera and further processed through digital zoom, that is, an image that matches the zoom factor adjusted by the user (i.e., the current zoom factor).
  • the second image captured by the mobile phone through the second camera may be an image captured by the second camera that matches the zoom factor corresponding to the focal length of the second camera (for example, the image captured by the second camera is processed by the ISP).
• in some embodiments, the mobile phone can digitally zoom the first image to adjust it to the zoom factor adjusted by the user (that is, the current zoom factor) before fusing it with the third image, so as to obtain a fused image that matches the zoom factor adjusted by the user; the fused image can then be used as the captured image.
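The digital zoom step mentioned here, cropping the center of the image by the zoom ratio and resampling back to the original size, can be illustrated with the following dependency-free sketch. Nearest-neighbor resampling is used only to keep the example short; a real implementation would use higher-quality interpolation:

```python
import numpy as np

def digital_zoom(img: np.ndarray, zoom: float) -> np.ndarray:
    """Center-crop the image by a factor of 1/zoom, then resample the crop
    back to the original size (nearest-neighbor)."""
    H, W = img.shape[:2]
    ch, cw = int(round(H / zoom)), int(round(W / zoom))
    top, left = (H - ch) // 2, (W - cw) // 2
    crop = img[top:top + ch, left:left + cw]
    # nearest-neighbor index maps from output pixels back into the crop
    rows = (np.arange(H) * ch // H).clip(0, ch - 1)
    cols = (np.arange(W) * cw // W).clip(0, cw - 1)
    return crop[rows][:, cols]
```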
  • the mobile phone may perform the following S702.
• the blur processing can include Gaussian blur, surface blur, box blur, Kawase blur, double blur, bokeh blur, tilt-shift blur, aperture blur, grainy blur, radial blur, direction blur, etc. That is, any of these blur algorithms can be used to blur the second image.
• in some embodiments, the preset rule for blurring the second image may be to determine the corresponding blur strength (blur) according to the similarity between the first image and the second image (which can be used to characterize the sharpness difference between the first image and the second image; for example, the higher the similarity, the smaller the sharpness difference, and the lower the similarity, the larger the sharpness difference), and then use a blur algorithm to blur the second image according to the corresponding blur strength.
  • the similarity between the first image and the second image may be represented by structural similarity (structural similarity, SSIM).
• the SSIM value of the first image (i.e. image x) and the second image (i.e. image y) can be calculated using the following formula:

SSIM(x, y) = ((2·μx·μy + c1)·(2·σxy + c2)) / ((μx² + μy² + c1)·(σx² + σy² + c2))

• where x is image x (such as the first image), y is image y (such as the second image), μx is the average value of x, μy is the average value of y, σx² is the variance of x, σy² is the variance of y, σxy is the covariance of x and y, c1 = (k1·L)² and c2 = (k2·L)² are stabilizing constants, and L is the dynamic range of the pixel values.
• SSIM values range from 0 to 1; when the two images are exactly the same, the SSIM value is equal to 1.
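As a concrete illustration, a minimal global-SSIM computation might look like the following. This is a simplification: it computes one SSIM value over the whole image, whereas practical implementations compute SSIM in a sliding window and average the local results; the default constants k1 = 0.01, k2 = 0.03 and L = 255 are conventional assumptions:

```python
import numpy as np

def ssim(x, y, L: float = 255.0, k1: float = 0.01, k2: float = 0.03) -> float:
    """Global SSIM of two same-sized grayscale images."""
    x = np.asarray(x, dtype=np.float64)
    y = np.asarray(y, dtype=np.float64)
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()                  # sigma_x^2, sigma_y^2
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()        # sigma_xy
    c1, c2 = (k1 * L) ** 2, (k2 * L) ** 2            # stabilizing constants
    return ((2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)) / (
        (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2))
```

Feeding the same image in twice yields exactly 1, consistent with the property stated above.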
• in some embodiments, the maximum blur strength and the minimum blur strength can be calibrated based on the SSIM values of the first image and the second image, so as to obtain the correspondence between the SSIM value and the blur strength, which facilitates subsequently determining the corresponding blur strength based on the SSIM value of the first image and the second image.
• specifically, a low SSIM value can be defined below which the similarity between the first image and the second image is considered too low (that is, when the SSIM value is smaller than the low SSIM value, it can be determined that the similarity between the first image and the second image is too low), and a high SSIM value can be defined above which the similarity between the first image and the second image is considered high (that is, when the SSIM value is greater than the high SSIM value, it can be determined that the similarity between the first image and the second image is high).
  • the maximum blur strength can be calibrated based on a low SSIM value
  • the minimum blur strength can be calibrated based on a high SSIM value.
• for example, at the low SSIM value, the blur strength is adjusted linearly until the fused image, obtained by fusing the third image (the blurred second image) with the first image, shows improved sharpness compared with the first image and the fusion boundary is not obvious; the blur strength at this time can be used as the maximum blur strength. Similarly, at the high SSIM value, the blur strength is adjusted linearly until the fused image shows a large improvement in sharpness compared with the first image and the fusion boundary is not obvious; the blur strength at this time can be used as the minimum blur strength.
• in this way, the blur strength corresponding to SSIM values lower than the low SSIM value can be set to the maximum blur strength, the blur strength corresponding to SSIM values higher than the high SSIM value can be set to the minimum blur strength, and SSIM values between the low SSIM value and the high SSIM value can be mapped linearly to blur strengths between the maximum blur strength and the minimum blur strength.
• for example, if the blur strength calibrated at an SSIM value of 0.25 is the maximum blur strength 9, and the blur strength calibrated at an SSIM value of 0.38 is the minimum blur strength 1, the correspondence curve between the blur strength and the SSIM value shown in FIG. 9 can be obtained.
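Using the calibration example above (SSIM 0.25 mapped to the maximum blur strength 9, SSIM 0.38 mapped to the minimum blur strength 1), the piecewise-linear correspondence of FIG. 9 can be sketched as follows; the function name and default arguments are illustrative:

```python
def blur_strength_from_ssim(ssim_value: float,
                            low_ssim: float = 0.25, high_ssim: float = 0.38,
                            max_blur: float = 9.0, min_blur: float = 1.0) -> float:
    """Map an SSIM value to a blur strength: clamped at the calibrated
    endpoints, linear in between."""
    if ssim_value <= low_ssim:
        return max_blur          # similarity too low: strongest blur
    if ssim_value >= high_ssim:
        return min_blur          # similarity high: weakest blur
    # linear interpolation between the two calibrated endpoints
    t = (ssim_value - low_ssim) / (high_ssim - low_ssim)
    return max_blur + t * (min_blur - max_blur)
```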
• in some embodiments, the preset rule for blurring the second image may be to determine the corresponding blur strength according to the sensitivity (ISO) corresponding to the second image (that is, the sensitivity of the second camera when capturing the second image), and then use a blur algorithm to blur the second image according to the corresponding blur strength. In this way, when the sharpness difference between the first image and the second image is small, over-blurring the second image, which would leave the fused image no sharper than the first image, is avoided.
• the sensitivity range can be divided into segments, and blur strengths corresponding to the different sensitivity segments, from low to high, can then be set according to the rule that the higher the sensitivity, the greater the blur strength.
• when the blur strength would exceed a certain value, that value can be used as the maximum blur strength, so that all higher sensitivity segments correspond to the maximum blur strength; this avoids blurring with a higher strength, which would over-blur the second image and leave the fused image no sharper than the first image.
• for example, the sensitivity can be divided into segments of 100-1000, 1000-2000, 2000-3000, 3000-4000, 4000-5000, 5000-6000, and so on. The blur strength corresponding to sensitivity 100-1000 can be set to 1, that for 1000-2000 to 3, that for 2000-3000 to 5, that for 3000-4000 to 7, and that for 4000-5000 to 9.
  • the specific parameter settings of the sensitivity segment and its corresponding blur strength in the above example may be as follows:
  • ⁇ iso100> etc. indicate the index of the sensitivity segment, and ⁇ blur>1 ⁇ /blur> indicates the blur intensity corresponding to the sensitivity segment.
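The sensitivity-segment lookup described above can be sketched as follows. The table encodes the example segments and strengths; treating segment boundaries as inclusive upper bounds and clamping all higher segments to the maximum strength 9 are assumptions consistent with, but not dictated by, the text:

```python
# (inclusive upper bound of the sensitivity segment, blur strength)
ISO_BLUR_TABLE = [(1000, 1), (2000, 3), (3000, 5), (4000, 7), (5000, 9)]
MAX_BLUR = 9  # segments above the table are clamped to the maximum strength

def blur_strength_from_iso(iso: int) -> int:
    """Return the blur strength for the segment containing this ISO value."""
    for upper, strength in ISO_BLUR_TABLE:
        if iso <= upper:
            return strength
    return MAX_BLUR
```

The ambient-brightness rule described below follows the same segment-lookup pattern, with brightness segments in place of ISO segments.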
• in some embodiments, the preset rule for blurring the second image may be to determine the corresponding blur strength according to the ambient brightness corresponding to the second image, and then use a blur algorithm to blur the second image according to the corresponding blur strength. In this way, when the sharpness difference between the first image and the second image is small, over-blurring the second image, which would leave the fused image no sharper than the first image, is avoided.
  • the ambient brightness is usually the average brightness of the ambient light obtained by the mobile phone according to the ambient light measurement.
  • the exposure parameters adopted by the mobile phone can be calculated according to the ambient brightness, that is, the exposure parameters of the image captured by the camera are calculated according to the ambient brightness. Therefore, in the embodiment of the present application, the ambient brightness can be obtained according to the exposure parameters of the second image.
  • the ambient brightness may be divided into segments, and then the blurring strengths corresponding to different ambient brightness segments are set according to the rule that the higher the ambient brightness, the greater the blurring strength, from low to high.
• when the blur strength would exceed a certain value, that value can be used as the maximum blur strength, so that all higher ambient brightness segments correspond to the maximum blur strength; this avoids blurring with a higher strength, which would over-blur the second image and leave the fused image no sharper than the first image.
• for example, the ambient brightness can be divided into segments of 100-1000, 1000-2000, 2000-3000, 3000-4000, 4000-5000, 5000-6000, and so on. The blur strength corresponding to ambient brightness 100-1000 can be set to 1, that for 1000-2000 to 3, that for 2000-3000 to 5, that for 3000-4000 to 7, and that for 4000-5000 to 9.
  • the specific parameter settings of the ambient brightness segment and its corresponding blur strength in the above example may be as follows:
• <lv 100> etc. indicate the index of the ambient brightness segment, and <blur>1</blur> indicates the blur strength corresponding to the ambient brightness segment.
  • blur parameters corresponding to different blur strengths may be determined according to specific blur algorithms.
• for example, the formula of Gaussian blur can be as follows:

G(u, v) = (1 / (2π·σ²)) · exp(−(u² + v²) / (2·σ²))

• where u and v are the horizontal and vertical distances from the kernel center (so that √(u² + v²) is the blur radius) and σ is the standard deviation of the normal distribution.
  • the Gaussian matrix of Gaussian blur when the blur strength is 3 is:
  • the blurring process can be performed according to the above-mentioned Gaussian matrix.
  • the Gaussian matrix of Gaussian blur is:
  • the blurring process can be performed according to the above-mentioned Gaussian matrix.
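A Gaussian matrix such as the ones referred to above can be generated from the Gaussian blur formula by sampling it on a (2r+1)×(2r+1) grid and normalizing the weights to sum to 1. This is a generic sketch; the mapping from a blur strength such as 3 to the radius r and standard deviation σ is not specified in the text:

```python
import numpy as np

def gaussian_kernel(radius: int, sigma: float) -> np.ndarray:
    """Sample G(u, v) on a square grid and normalize so the weights sum to 1
    (the 1/(2*pi*sigma^2) factor cancels in the normalization)."""
    ax = np.arange(-radius, radius + 1)
    u, v = np.meshgrid(ax, ax)
    g = np.exp(-(u ** 2 + v ** 2) / (2.0 * sigma ** 2))
    return g / g.sum()
```

Convolving the second image with such a matrix performs the Gaussian blur; larger radius/σ values correspond to stronger blurring.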
• in some embodiments, when fusing the third image with the first image, the third image can be superimposed on the part of the first image whose content overlaps with the third image, or the third image can directly replace the part of the first image whose content overlaps with the third image, or other fusion algorithms can be used, which is not limited here.
  • the fused image (that is, the fourth image) may be saved as a captured image.
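As one concrete illustration of the replacement-style fusion mentioned above, the sketch below assumes the two images are already registered and that the second camera's field of view sits at the center of the first image; alignment and blending at the fusion boundary are omitted:

```python
import numpy as np

def fuse_center(first: np.ndarray, third: np.ndarray) -> np.ndarray:
    """Replace the centered overlap region of the first image with the
    (blurred) third image."""
    H, W = first.shape[:2]
    h, w = third.shape[:2]
    top, left = (H - h) // 2, (W - w) // 2
    fused = first.copy()
    fused[top:top + h, left:left + w] = third  # direct replacement
    return fused
```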
• for example, the camera algorithm module can call the IVP, DSP, or CPU according to the above-mentioned embodiment, and send the SSIM value of the first image and the second image together with the correspondence curve between the SSIM value and the blur strength (or the configuration parameters of the correspondence between the sensitivity of the second image and the blur strength, or the configuration parameters of the correspondence between the ambient brightness of the second image and the blur strength) to the IVP or DSP, so that the IVP or DSP can determine, according to these parameters, how strongly to blur the second image.
• the IVP or DSP can then return the determined blur strength to the camera algorithm module, and the camera algorithm module can send the determined blur strength together with the first image and the second image to the GPU, so as to call the GPU to blur the second image according to the determined blur strength to obtain the third image, and to fuse the first image and the third image to obtain the captured image.
  • the GPU can return the captured image to the camera algorithm module, and the camera algorithm module can send it to the camera application deployed in the application layer through the camera hardware abstraction layer and the framework layer, so that the camera application can display and/or store the received captured image.
  • the camera algorithm module can also flexibly call the IVP, DSP, CPU, GPU, etc., so that the fused image of the third image and the first image is used as the final captured image.
  • if the first image is an image captured by the first camera that matches the zoom factor corresponding to the focal length of the first camera, then when fusion is performed, the first image can first be digitally zoomed to the zoom factor adjusted by the user (that is, the current zoom factor) and then fused with the third image, so that the fused image matches the zoom factor adjusted by the user and serves as the final captured image. Therefore, there is no limitation on when the image is adjusted through digital zoom so that the final image matches the zoom factor adjusted by the user.
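The digital-zoom step above can be sketched as a centre crop followed by an upscale back to the original size. Nearest-neighbour resampling is used here only to keep the sketch dependency-free; the device's actual interpolation is not specified.

```python
import numpy as np

def digital_zoom(image: np.ndarray, zoom: float) -> np.ndarray:
    """Centre-crop by the zoom factor, then nearest-neighbour upscale back
    to the original size -- a minimal stand-in for digital zoom."""
    h, w = image.shape[:2]
    crop_h = max(1, int(h / zoom))
    crop_w = max(1, int(w / zoom))
    top, left = (h - crop_h) // 2, (w - crop_w) // 2
    crop = image[top:top + crop_h, left:left + crop_w]
    rows = np.arange(h) * crop_h // h   # nearest-neighbour row indices
    cols = np.arange(w) * crop_w // w   # nearest-neighbour column indices
    return crop[np.ix_(rows, cols)]

zoomed = digital_zoom(np.arange(64, dtype=float).reshape(8, 8), 2.0)
```

After this step the zoomed first image and the third image share the same zoom factor and can be fused as described earlier.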
  • when the electronic device takes pictures with two cameras and then fuses the separately captured images to obtain the captured image, the electronic device blurs the image with the higher definition of the two acquired images, reducing the sharpness of that image. Therefore, the difference in definition between the images captured by the two cameras, caused by the difference in resolution and noise reduction capability between the cameras, can be reduced. Furthermore, by fusing two images with a small difference in definition, a captured image with an indistinct fusion boundary and a weak sense of splicing can be obtained.
  • the embodiments of the present application further provide a photographing device.
  • the apparatus may be applied to the above-mentioned electronic equipment to implement the methods in the foregoing embodiments.
  • the functions of the device can be realized by hardware, and can also be realized by executing corresponding software by hardware.
  • Hardware or software includes one or more modules corresponding to the above-mentioned functions.
  • FIG. 10 shows a schematic structural diagram of a photographing device. As shown in FIG. 10, the device includes: a processing module 1001, a display module 1002, and the like.
  • the processing module 1001 and the display module 1002 may cooperate to implement the related methods in the foregoing embodiments.
  • the division of units in the above device is only a division of logical functions; in actual implementation, they may be fully or partially integrated into one physical entity or physically separated.
  • the units in the device can all be implemented in the form of software called by the processing element; they can also be implemented in the form of hardware; some units can also be implemented in the form of software called by the processing element, and some units can be implemented in the form of hardware.
  • each unit can be a separate processing element, or it can be integrated into a certain chip of the device. In addition, it can also be stored in the memory in the form of a program, whose function is called and executed by a certain processing element of the device. Moreover, all or part of these units can be integrated together, or implemented independently.
  • the processing element described here may also be referred to as a processor, and may be an integrated circuit with a signal processing capability. In the process of implementation, each step of the above method or each unit above may be implemented by an integrated logic circuit of hardware in the processor element or implemented in the form of software called by the processing element.
  • the units in the above device may be one or more integrated circuits configured to implement the above method, for example: one or more ASICs, or one or more DSPs, or one or more FPGAs, or a combination of at least two of these integrated circuit forms.
  • the processing element can be a general-purpose processor, such as a CPU or other processors that can call programs.
  • these units can be integrated together and implemented in the form of a system-on-a-chip (SOC).
  • the units of the above apparatus for implementing each corresponding step in the above method may be implemented in the form of a program scheduled by a processing element.
  • the apparatus may include a processing element and a storage element, and the processing element invokes a program stored in the storage element to execute the methods described in the above method embodiments.
  • the storage element may be a storage element on the same chip as the processing element, that is, an on-chip storage element.
  • the program for executing the above method may be stored in a storage element on a different chip from the processing element, that is, an off-chip storage element.
  • the processing element invokes or loads a program from the off-chip storage element to the on-chip storage element, so as to invoke and execute the methods described in the above method embodiments.
  • an embodiment of the present application may also provide an apparatus, such as an electronic device, which may include a processor, and a memory configured to store instructions executable by the processor.
  • when the processor is configured to execute the above instructions, the electronic device implements the photographing method implemented by the electronic device in the foregoing embodiments.
  • the memory can be located inside the electronic device or outside the electronic device.
  • there may be one or more processors.
  • the unit of the apparatus that implements each step in the above method may be configured as one or more processing elements, and these processing elements may be provided on the corresponding electronic equipment described above, where the processing elements may be integrated circuits, for example: one or more ASICs, or one or more DSPs, or one or more FPGAs, or a combination of these types of integrated circuits. These integrated circuits can be integrated together to form a chip.
  • an embodiment of the present application further provides a chip system, and the chip system may be applied to the above-mentioned electronic device.
  • the chip system includes one or more interface circuits and one or more processors; the interface circuits and the processors are interconnected through lines; the processor receives and executes computer instructions from the memory of the electronic device through the interface circuits, so as to implement the methods related to the electronic device in the above method embodiments.
  • An embodiment of the present application further provides a computer program product, including computer instructions to be executed by an electronic device, such as the above-mentioned electronic device.
  • the disclosed devices and methods may be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the modules or units is only a logical function division, and there may be other division methods in actual implementation.
  • multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • the unit described as a separate component may or may not be physically separated, and the component displayed as a unit may be one physical unit or multiple physical units, that is, it may be located in one place or distributed to multiple different places. Part or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware or in the form of software functional units.
  • if the integrated unit is realized in the form of a software functional unit and sold or used as an independent product, it can be stored in a readable storage medium.
  • the technical solution may be embodied in the form of a software product, such as a program. The software product is stored in a program product, such as a computer-readable storage medium, and includes several instructions to make a device (which may be a single-chip microcomputer, a chip, etc.) or a processor execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk or an optical disc.
  • the embodiments of the present application may also provide a computer-readable storage medium on which computer program instructions are stored.
  • when the computer program instructions are executed by the electronic device, the electronic device is made to implement the photographing method described in the foregoing method embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The present application discloses an image capture method and device, which relate to the field of image capturing. The present application solves the problem of an obvious fusion boundary in the captured image when two images are acquired by means of two cameras and fused to obtain a captured image. The specific solution is as follows: an electronic device starts a camera; a preview interface is displayed, the preview interface comprising a first control; a first operation on the first control is detected; in response to the first operation, a first camera acquires a first image, and a second camera acquires a second image, the definition of the second image being higher than that of the first image; the second image is blurred to obtain a third image; the third image is fused with the first image to obtain a fourth image; and the fourth image is saved.

Description

Image capture method and device
This application claims priority to the Chinese patent application filed with the State Intellectual Property Office on August 11, 2021, with application number 202110919953.8 and application title "A Method and Device for Taking Photos", the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of photographing, and in particular to a photographing method and device.
Background
With the continuous optimization of mobile phone cameras, the photographing function of mobile phones has become more and more powerful, and more and more users take photos with their phones. At present, in order to take photos with a longer focal length and/or a larger field of view, many mobile phones are equipped not only with a conventional main camera but also with an additional telephoto camera and/or ultra-wide-angle camera. When a mobile phone provided with multiple cameras takes a photo, it often selects the camera corresponding to the current zoom factor. Within some zoom ranges, some mobile phones also use two cameras to photograph the subject at the same time and then fuse the images captured by the two cameras, thereby improving the imaging quality of the photos taken by the phone.
However, since the images captured by different cameras differ, when the mobile phone uses two cameras to photograph the subject at the same time, the definition of the images captured by the two cameras will differ considerably. After the two separately captured images are fused, there will be a large difference in definition at the fusion boundary; the fusion boundary is obvious, giving the resulting fused image a strong sense of splicing.
Summary
The present application provides a photographing method and device, which solve the problem that the fusion boundary of the captured image is obvious when two images are acquired by two cameras and fused to obtain the captured image.
To achieve the above object, the present application adopts the following technical solutions:
In a first aspect, the present application provides a photographing method, which can be applied to an electronic device. The electronic device includes a first camera and a second camera, and the field of view of the first camera is different from that of the second camera. The method includes: the electronic device starts the camera; a preview interface is displayed, the preview interface including a first control; a first operation on the first control is detected; in response to the first operation, the first camera acquires a first image and the second camera acquires a second image, the definition of the second image being higher than that of the first image; the second image is blurred to obtain a third image; the third image is fused with the first image to obtain a fourth image; and the fourth image is saved.
With the above technical solution, when the electronic device shoots with two cameras and then fuses the separately captured images to obtain a captured image, the electronic device reduces the definition of the sharper of the two acquired images by blurring it. This reduces the difference in definition between the images acquired by the two cameras that is caused by differences in resolution and noise reduction capability between the cameras. Then, by fusing two images with a small difference in definition, a captured image with an indistinct fusion boundary and a weak sense of splicing can be obtained.
In a possible implementation, the field of view of the first camera is larger than the field of view of the second camera.
Generally, the image captured by a camera with a smaller field of view has higher definition. Therefore, when the field of view of the first camera is larger than that of the second camera, blurring the second image acquired by the second camera to reduce its definition can reduce the difference in definition between the first image and the blurred second image (that is, the third image), thereby reducing the sense of splicing of the fused image.
In another possible implementation, blurring the second image to obtain the third image includes: determining a blur strength according to the similarity between the second image and the first image and a preset correspondence between similarity and blur strength; and blurring the second image according to the determined blur strength.
The difference in definition between the first image and the second image can be determined from their similarity; the higher the similarity, the smaller the difference in definition. Therefore, determining the blur strength from the similarity between the first image and the second image allows the degree of blurring applied to the second image to be adjusted based on the difference in definition between the two images. This avoids the situation in which over-blurring makes the processed second image less clear than the first image, so that the definition of the fused image is not improved relative to the first image.
In another possible implementation, the similarity is a structural similarity (SSIM) value.
Representing the similarity by a structural similarity value allows the similarity between the first image and the second image to be measured more accurately from the perspective of image composition.
In another possible implementation, the similarity is inversely proportional to the blur strength.
The higher the similarity, the smaller the difference in definition between the first image and the second image. Making the similarity inversely proportional to the blur strength therefore avoids, when the similarity is high (that is, when the difference in definition between the first image and the second image is small), the situation in which over-blurring makes the processed second image less clear than the first image, so that the definition of the fused image is not improved relative to the first image.
In another possible implementation, blurring the second image to obtain the third image includes: determining a blur strength according to the sensitivity corresponding to the second image and a preset correspondence between sensitivity and blur strength; and blurring the second image according to the determined blur strength.
Generally, a camera capable of obtaining a relatively clear image has a strong denoising capability. Therefore, when image noise increases because the sensitivity is increased, the second image will be clearer than the first image; that is, the higher the sensitivity, the greater the difference in definition between the second image and the first image. Determining the blur strength from the sensitivity therefore allows the degree of blurring applied to the second image to be adjusted based on the difference in definition between the two images, avoiding the situation in which over-blurring makes the processed second image less clear than the first image, so that the definition of the fused image is not improved relative to the first image.
In another possible implementation, the sensitivity is proportional to the blur strength.
The higher the sensitivity, the greater the difference in definition between the first image and the second image. Making the sensitivity proportional to the blur strength therefore avoids, when the sensitivity is low (that is, when the difference in definition between the first image and the second image is small), the situation in which over-blurring makes the processed second image less clear than the first image, so that the definition of the fused image is not improved relative to the first image.
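The sensitivity-to-blur-strength correspondence above can be sketched as a clamped monotonically increasing mapping. The ISO endpoints and the maximum strength below are illustrative assumptions; the application only states that the correspondence is preset and that sensitivity is proportional to blur strength.

```python
def blur_strength_from_iso(iso: float,
                           iso_min: float = 100.0,
                           iso_max: float = 6400.0,
                           max_strength: float = 3.0) -> float:
    """Hypothetical preset correspondence: blur strength grows with ISO,
    clamped to the configured ISO range."""
    iso = min(max(iso, iso_min), iso_max)
    return max_strength * (iso - iso_min) / (iso_max - iso_min)

low = blur_strength_from_iso(100.0)
mid = blur_strength_from_iso(3250.0)
high = blur_strength_from_iso(6400.0)
```

An analogous mapping over ambient brightness would implement the ambient-brightness variant described below.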
In another possible implementation, blurring the second image to obtain the third image includes: determining a blur strength according to the ambient brightness corresponding to the second image and a preset correspondence between ambient brightness and blur strength; and blurring the second image according to the determined blur strength.
Generally, a camera capable of obtaining a relatively clear image has a strong denoising capability. Therefore, when image noise increases due to increased ambient brightness, the second image will be clearer than the first image; that is, the higher the ambient brightness, the greater the difference in definition between the second image and the first image. Determining the blur strength from the ambient brightness therefore allows the degree of blurring applied to the second image to be adjusted based on the difference in definition between the two images, avoiding the situation in which over-blurring makes the processed second image less clear than the first image, so that the definition of the fused image is not improved relative to the first image.
In another possible implementation, the ambient brightness is proportional to the blur strength.
The higher the ambient brightness, the greater the difference in definition between the first image and the second image. Making the ambient brightness proportional to the blur strength therefore avoids, when the ambient brightness is low (that is, when the difference in definition between the first image and the second image is small), the situation in which over-blurring makes the processed second image less clear than the first image, so that the definition of the fused image is not improved relative to the first image.
In another possible implementation, the blurring includes any one of the following: Gaussian blur, surface blur, box blur, Kawase blur, dual blur, bokeh blur, tilt-shift blur, iris blur, grainy blur, radial blur, and directional blur.
In another possible implementation, the first image is an image captured by the first camera and adjusted to the current zoom factor by digital zoom.
In another possible implementation, the first image is an image directly captured by the first camera; fusing the third image with the first image to obtain the fourth image includes: digitally zooming the first image to adjust it to the current zoom factor; and fusing the third image with the digitally zoomed first image to obtain the fourth image.
In a second aspect, the present application provides a photographing device, which can be applied to an electronic device including a first camera and a second camera, the field of view of the first camera being different from that of the second camera. The device is used to implement the method in the above first aspect. The functions of the device can be realized by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the above functions, for example, a processing module and a display module.
The display module can be used to display a preview interface when the electronic device starts the camera, the preview interface including a first control. The processing module can be used to detect a first operation on the first control; in response to the first operation, acquire a first image through the first camera and a second image through the second camera, the definition of the second image being higher than that of the first image; blur the second image to obtain a third image; fuse the third image with the first image to obtain a fourth image; and save the fourth image.
In a possible implementation, the field of view of the first camera is larger than the field of view of the second camera.
In another possible implementation, the processing module is specifically configured to determine the blur strength according to the similarity between the second image and the first image and a preset correspondence between similarity and blur strength, and to blur the second image according to the determined blur strength.
In another possible implementation, the similarity is a structural similarity (SSIM) value.
In another possible implementation, the similarity is inversely proportional to the blur strength.
In another possible implementation, the processing module is specifically configured to determine the blur strength according to the sensitivity corresponding to the second image and a preset correspondence between sensitivity and blur strength, and to blur the second image according to the determined blur strength.
In another possible implementation, the sensitivity is proportional to the blur strength.
In another possible implementation, the processing module is specifically configured to determine the blur strength according to the ambient brightness corresponding to the second image and a preset correspondence between ambient brightness and blur strength, and to blur the second image according to the determined blur strength.
In another possible implementation, the ambient brightness is proportional to the blur strength.
In another possible implementation, the blurring includes any one of the following: Gaussian blur, surface blur, box blur, Kawase blur, dual blur, bokeh blur, tilt-shift blur, iris blur, grainy blur, radial blur, and directional blur.
In another possible implementation, the first image is an image captured by the first camera and adjusted to the current zoom factor by digital zoom.
In another possible implementation, the first image is an image directly captured by the first camera; the processing module is specifically configured to digitally zoom the first image to adjust it to the current zoom factor, and to fuse the third image with the digitally zoomed first image to obtain the fourth image.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor and a memory configured to store instructions executable by the processor. When the processor is configured to execute the above instructions, the electronic device implements the photographing method described in the first aspect or any possible implementation of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium on which computer program instructions are stored. When the computer program instructions are executed by an electronic device, the electronic device is made to implement the photographing method described in the first aspect or any possible implementation of the first aspect.
In a fifth aspect, an embodiment of the present application provides a computer program product, including computer-readable code which, when run on an electronic device, causes the electronic device to implement the photographing method described in the first aspect or any possible implementation of the first aspect.
It should be understood that, for the beneficial effects of the second to fifth aspects, reference may be made to the relevant descriptions in the first aspect, which are not repeated here.
Description of Drawings
FIG. 1 is a schematic diagram of an application of shooting with dual cameras in the related art;
FIG. 2 is a schematic diagram of an image fusion application in the related art;
FIG. 3 is a schematic diagram of another image fusion application in the related art;
FIG. 4 is a schematic diagram of an application of a photographing method provided by an embodiment of the present application;
FIG. 5 is a schematic structural diagram of an electronic device provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of the composition of a system architecture of an electronic device provided by an embodiment of the present application;
FIG. 7 is a schematic flowchart of a photographing method provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of a photographing operation scene provided by an embodiment of the present application;
FIG. 9 is a schematic diagram of the relationship between structural similarity and blur strength provided by an embodiment of the present application;
FIG. 10 is a schematic structural diagram of a photographing device provided by an embodiment of the present application.
具体实施方式Detailed ways
随着手机的不断发展，手机的摄像头不断升级，手机拍照功能越来越强大。目前，很多手机除了设置常规的主摄摄像头（一般为广角摄像头）外，还会额外设置如长焦摄像头、超广角摄像头等与主摄摄像头采用不同焦距的摄像头（即不同视场角的摄像头，其中，视场角越大的摄像头焦距越短）。从而用户使用手机进行照片拍摄时，手机能够通过长焦摄像头提供更长的焦距以获得更好的长焦拍摄效果，手机还能够通过超广角摄像头提供更大的视场角以获得更好的广角拍摄效果。因此，当用户调整变焦倍数以放大或缩小拍摄图像时，手机则会根据不同的变焦倍数选择对应的摄像头来进行拍摄。例如，当用户调高变焦倍数以放大拍摄图像时，手机可以选择采用长焦摄像头来进行拍摄，从而在放大拍摄图像的同时获得质量较高的拍摄图像。又例如，当用户调低变焦倍数以缩小拍摄图像时，手机可以选择采用超广角摄像头来进行拍摄，从而在缩小拍摄图像的同时获得质量较高的拍摄图像。并且，由于目前手机设置的摄像头大多数采用定焦摄像头（即焦距固定的摄像头），因此手机设置的不同焦距的摄像头只有在摄像头焦距对应的某个变焦倍数上才能拍摄得到成像质量较高的拍摄图像。With the continuous development of mobile phones, their cameras are constantly upgraded and their photographing functions become more and more powerful. At present, in addition to the conventional main camera (generally a wide-angle camera), many mobile phones are additionally equipped with cameras whose focal lengths differ from that of the main camera, such as a telephoto camera and an ultra-wide-angle camera (that is, cameras with different fields of view, where a camera with a larger field of view has a shorter focal length). Thus, when a user takes photos with the mobile phone, the phone can provide a longer focal length through the telephoto camera for a better telephoto effect, and a larger field of view through the ultra-wide-angle camera for a better wide-angle effect. Therefore, when the user adjusts the zoom factor to enlarge or reduce the captured image, the mobile phone selects the corresponding camera according to the zoom factor. For example, when the user increases the zoom factor to zoom in, the phone may choose the telephoto camera, so as to obtain a higher-quality image while zooming in. For another example, when the user lowers the zoom factor to zoom out, the phone may choose the ultra-wide-angle camera, so as to obtain a higher-quality image while zooming out. Moreover, since most cameras on current mobile phones are fixed-focus cameras (that is, cameras with fixed focal lengths), each camera can only produce an image of high imaging quality at the zoom factor corresponding to its focal length.
示例地，可以将主摄摄像头的焦距对应的变焦倍数设置为1.0x，将超广角摄像头的焦距对应的变焦倍数设置为0.4x，将长焦摄像头的焦距对应的变焦倍数设置为3.5x。即，当变焦倍数调整为1.0x时，手机采用主摄摄像头拍摄得到的拍摄图像成像质量较高，当变焦倍数调整为0.4x时，手机采用超广角摄像头拍摄得到的拍摄图像成像质量较高，当变焦倍数调整为3.5x时，手机采用长焦摄像头拍摄得到的拍摄图像成像质量较高。因此，当用户调整的变焦倍数大于或等于0.4x，且小于1.0x时，手机可以采用超广角摄像头进行拍摄，当用户调整的变焦倍数大于或等于1.0x，且小于3.5x时，手机可以采用主摄摄像头进行拍摄，当用户调整的变焦倍数大于或等于3.5x时，手机可以采用长焦摄像头进行拍摄。For example, the zoom factor corresponding to the focal length of the main camera may be set to 1.0x, that of the ultra-wide-angle camera to 0.4x, and that of the telephoto camera to 3.5x. That is, when the zoom factor is adjusted to 1.0x, the image captured by the main camera has high imaging quality; when the zoom factor is adjusted to 0.4x, the image captured by the ultra-wide-angle camera has high imaging quality; and when the zoom factor is adjusted to 3.5x, the image captured by the telephoto camera has high imaging quality. Therefore, when the user-adjusted zoom factor is greater than or equal to 0.4x and less than 1.0x, the phone may use the ultra-wide-angle camera; when it is greater than or equal to 1.0x and less than 3.5x, the phone may use the main camera; and when it is greater than or equal to 3.5x, the phone may use the telephoto camera.
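The camera-selection rule in the example above can be sketched as a small function. This is an illustrative sketch only: the function name `select_camera` and the string labels are hypothetical, and the thresholds 0.4x / 1.0x / 3.5x simply follow the example values in the text.

```python
def select_camera(zoom: float) -> str:
    """Map a user zoom factor to the camera used for capture.

    Thresholds follow the 0.4x / 1.0x / 3.5x example in the text; a real
    device would read them from its camera configuration.
    """
    if zoom < 0.4:
        raise ValueError("zoom factor below the ultra-wide camera's native 0.4x")
    if zoom < 1.0:
        return "ultra_wide"   # 0.4x <= zoom < 1.0x
    if zoom < 3.5:
        return "main"         # 1.0x <= zoom < 3.5x
    return "telephoto"        # zoom >= 3.5x
```

At zoom factors strictly between the thresholds, the selected camera's output must additionally be digitally zoomed, which is where the quality loss discussed below comes from.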
基于上述示例，当用户调整的变焦倍数不是0.4x、1.0x或3.5x时，手机拍摄得到的拍摄图像成像质量会有所降低。例如，当用户将变焦倍数调整为1.0x时，手机采用主摄摄像头能够拍摄得到成像质量较高的拍摄图像，而当用户将变焦倍数调整为2.5x时，虽然手机会继续采用主摄摄像头，但此时由于主摄摄像头的焦距固定，因此拍摄得到的拍摄图像为基于主摄摄像头拍摄的图像的数码变焦（即将主摄摄像头拍摄的图像进行放大处理得到与变焦倍数对应的拍摄图像），清晰度相比于主摄摄像头拍摄的图像会有所降低。又例如，当用户将变焦倍数调整为0.9x时，手机会采用超广角摄像头进行拍摄，但是此时由于变焦倍数比超广角摄像头焦距对应的变焦倍数大，因此拍摄得到的图像为基于超广角摄像头拍摄的拍摄图像的数码变焦（即将超广角摄像头拍摄的图像进行放大处理得到与变焦倍数对应的拍摄图像），清晰度相比于超广角摄像头拍摄的图像会有所降低，而当用户将变焦倍数调整为0.4x时，手机使用超广角摄像头才能够拍摄得到成像质量较高的拍摄图像。Based on the above example, when the user-adjusted zoom factor is not 0.4x, 1.0x, or 3.5x, the imaging quality of the captured image is reduced. For example, when the user sets the zoom factor to 1.0x, the main camera can capture an image of high imaging quality; when the user sets it to 2.5x, the phone still uses the main camera, but since the main camera's focal length is fixed, the captured image is a digital zoom based on the main-camera image (that is, the main-camera image is enlarged to obtain an image corresponding to the zoom factor), and its sharpness is lower than that of the main-camera image. For another example, when the user sets the zoom factor to 0.9x, the phone uses the ultra-wide-angle camera, but since this zoom factor is larger than the one corresponding to the ultra-wide-angle camera's focal length, the captured image is a digital zoom based on the ultra-wide-angle image (that is, the ultra-wide-angle image is enlarged to obtain an image corresponding to the zoom factor), and its sharpness is lower than that of the ultra-wide-angle image; only when the user sets the zoom factor to 0.4x can the ultra-wide-angle camera capture an image of high imaging quality.
目前，为了提高手机拍摄的拍摄图像的成像质量，在用户调整的变焦倍数不是各个摄像头的焦距对应的变焦倍数时，手机会同时采用焦距相邻的两个摄像头进行拍摄，然后将两个摄像头分别拍摄得到的图像进行融合以得到最终的拍摄图像。At present, in order to improve the imaging quality of images captured by a mobile phone, when the user-adjusted zoom factor does not correspond to the focal length of any camera, the phone simultaneously shoots with two cameras of adjacent focal lengths, and then fuses the two images captured by them to obtain the final captured image.
示例地，继续以主摄摄像头的焦距对应的变焦倍数设置为1.0x，超广角摄像头的焦距对应的变焦倍数设置为0.4x，长焦摄像头的焦距对应的变焦倍数设置为3.5x为例。如图1所示，当用户调整变焦倍数为2.0x-3.5x时，手机会在使用主摄摄像头进行拍摄的基础上，同时采用长焦摄像头进行拍摄，以便于将主摄摄像头拍摄的图像经数码变焦后的图像与长焦摄像头拍摄的图像进行融合得到最终的拍摄图像。当用户调整变焦倍数为0.6x-0.9x时，手机会在使用超广角摄像头进行拍摄的基础上，同时采用主摄摄像头进行拍摄，以便于将超广角摄像头拍摄的图像经数码变焦后的图像与主摄摄像头拍摄的图像进行融合得到最终的拍摄图像。As an example, continue with the case where the zoom factor corresponding to the focal length of the main camera is set to 1.0x, that of the ultra-wide-angle camera to 0.4x, and that of the telephoto camera to 3.5x. As shown in Figure 1, when the user adjusts the zoom factor to 2.0x-3.5x, the phone shoots with the main camera and, at the same time, with the telephoto camera, so that the digitally zoomed main-camera image can be fused with the telephoto image to obtain the final captured image. When the user adjusts the zoom factor to 0.6x-0.9x, the phone shoots with the ultra-wide-angle camera and, at the same time, with the main camera, so that the digitally zoomed ultra-wide-angle image can be fused with the main-camera image to obtain the final captured image.
例如，当用户调整变焦倍数为2.5x时，手机会在使用主摄摄像头进行拍摄的基础上，同时采用长焦摄像头进行拍摄。此时，由于长焦摄像头的焦距对应的变焦倍数为3.5x，大于用户调整的变焦倍数2.5x，因此如图2所示，长焦摄像头拍摄的图像（如图2中的(b)所示）是主摄摄像头拍摄的图像通过数码变焦调整到变焦倍数为2.5x时的图像（如图2中的(a)所示）内的部分图像。所以，手机可以通过将主摄摄像头拍摄得到的图像经数码变焦调整到变焦倍数为2.5x时的图像和长焦摄像头拍摄得到的图像进行融合得到拍摄图像（如图2中的(c)），以提高主摄摄像头经数码变焦得到的图像中与长焦摄像头拍摄得到的图像重合部分的图像的清晰度，从而提高最终得到的拍摄图像的清晰度。For example, when the user adjusts the zoom factor to 2.5x, the phone shoots with the main camera and, at the same time, with the telephoto camera. Since the zoom factor corresponding to the telephoto camera's focal length is 3.5x, greater than the user-adjusted 2.5x, as shown in Figure 2, the telephoto image (shown in (b) of Figure 2) is part of the main-camera image digitally zoomed to 2.5x (shown in (a) of Figure 2). The phone can therefore fuse the main-camera image digitally zoomed to 2.5x with the telephoto image to obtain the captured image (shown in (c) of Figure 2), improving the sharpness of the portion of the digitally zoomed main-camera image that overlaps the telephoto image, and thereby the sharpness of the final captured image.
又例如，当用户调整变焦倍数为0.7x时，手机会在使用超广角摄像头进行拍摄的基础上，同时采用主摄摄像头进行拍摄。此时，由于主摄摄像头的焦距对应的变焦倍数为1.0x，大于用户调整的变焦倍数0.7x，因此主摄摄像头拍摄的图像是超广角摄像头拍摄的图像通过数码变焦调整到变焦倍数为0.7x时的图像内的部分图像。所以，手机可以通过将超广角摄像头拍摄得到的图像经数码变焦调整到变焦倍数为0.7x时的图像和主摄摄像头拍摄得到的图像进行融合，以提高超广角摄像头经数码变焦得到的图像中与主摄摄像头拍摄得到的图像重合部分的图像的清晰度，从而提高最终得到的拍摄图像的清晰度。For another example, when the user adjusts the zoom factor to 0.7x, the phone shoots with the ultra-wide-angle camera and, at the same time, with the main camera. Since the zoom factor corresponding to the main camera's focal length is 1.0x, greater than the user-adjusted 0.7x, the main-camera image is part of the ultra-wide-angle image digitally zoomed to 0.7x. The phone can therefore fuse the ultra-wide-angle image digitally zoomed to 0.7x with the main-camera image, improving the sharpness of the portion of the digitally zoomed ultra-wide-angle image that overlaps the main-camera image, and thereby the sharpness of the final captured image.
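The digital zoom and overlap fusion described in the two examples above can be sketched as follows. This is a minimal illustration under strong simplifying assumptions: the frames are taken as perfectly co-registered, so the registration, parallax correction, and blending that a real pipeline needs are omitted, and the function names `digital_zoom` and `fuse_center` are hypothetical.

```python
import numpy as np

def digital_zoom(img: np.ndarray, factor: float) -> np.ndarray:
    """Center-crop by 1/factor, then upscale back to the original size
    (nearest-neighbour, to keep the sketch dependency-free)."""
    h, w = img.shape[:2]
    ch, cw = int(h / factor), int(w / factor)
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    crop = img[y0:y0 + ch, x0:x0 + cw]
    ys = np.arange(h) * ch // h   # source row for each output row
    xs = np.arange(w) * cw // w   # source column for each output column
    return crop[np.ix_(ys, xs)]

def fuse_center(base: np.ndarray, detail: np.ndarray) -> np.ndarray:
    """Paste the higher-detail frame into the centre of the base frame.
    `detail` is assumed to be pre-aligned and pre-sized to the region
    of `base` it covers."""
    h, w = base.shape[:2]
    dh, dw = detail.shape[:2]
    y0, x0 = (h - dh) // 2, (w - dw) // 2
    out = base.copy()
    out[y0:y0 + dh, x0:x0 + dw] = detail
    return out
```

A hard paste like `fuse_center` is exactly what produces the visible fusion boundary criticized below: the pasted region is sharper than its surroundings.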
但是，在通过上述将两个摄像头分别拍摄得到的图像进行融合以得到拍摄图像的方式来提升拍摄图像的成像质量（如清晰度、色彩饱和度等）时，由于融合前的两个图像是分别通过不同的摄像头拍摄得到的，而不同的摄像头分辨率、降噪能力等均有差异，因此，如图3所示，两个图像的清晰度会有较大差异，导致融合后的拍摄图像在融合边界处会有较大的清晰度差异，融合边界较明显，使得最终得到的拍摄图像（即融合图像）拼接感较强。However, when the imaging quality (such as sharpness and color saturation) of the captured image is improved by fusing the images separately captured by two cameras as above, the two images to be fused are captured by different cameras, and different cameras differ in resolution, noise-reduction capability, and so on. Therefore, as shown in Figure 3, the sharpness of the two images can differ considerably, so the fused image has a large sharpness difference at the fusion boundary; the boundary is obvious, and the final captured image (that is, the fused image) has a strong sense of splicing.
为解决上述问题，本申请实施例提供一种拍照方法，该方法可以应用于具有拍照功能的电子设备通过设置的多个摄像头进行拍摄的场景中。In order to solve the above problems, an embodiment of the present application provides a photographing method, which can be applied to a scenario in which an electronic device with a photographing function shoots through multiple cameras.
在本申请实施例中，该拍照方法可以是，如图4所示，电子设备可以通过两个焦距不同的摄像头（即视场角不同的摄像头）进行拍摄以获取得到两个图像，例如，第一图像和第二图像。然后对其中一个图像，如第二图像，进行模糊处理，再将该图像（如第二图像）模糊处理后的图像（如第三图像）与第一图像进行融合，可将该融合后的图像作为拍摄图像（或称为第四图像）。In an embodiment of the present application, the photographing method may be as shown in Figure 4: the electronic device shoots through two cameras with different focal lengths (that is, cameras with different fields of view) to acquire two images, for example, a first image and a second image. One of the images, such as the second image, is then blurred, and the blurred image (such as a third image) is fused with the first image; the fused image may be used as the captured image (also called a fourth image).
示例地，通常焦距相对较大的摄像头拍摄的图像清晰度更高，则第二图像可以是这两个图像中由焦距相对较大的摄像头（即视场角相对较小的摄像头）获取得到的图像，即对焦距相对较大的摄像头获取得到的图像进行模糊处理。当然，在其他实施方式中，第二图像也可以是两个图像中由焦距相对较小的摄像头（即视场角相对较大的摄像头）获取得到的图像，此处不做限制，一般可根据两个图像中具体哪个图像清晰度更高，来确定对清晰度更高的图像进行模糊处理。For example, since the image captured by the camera with the relatively longer focal length is usually sharper, the second image may be the one of the two images acquired by the camera with the relatively longer focal length (that is, the camera with the relatively smaller field of view); in other words, the image acquired by the longer-focal-length camera is blurred. Of course, in other implementations, the second image may also be the image acquired by the camera with the relatively shorter focal length (that is, the camera with the relatively larger field of view); this is not limited here. In general, whichever of the two images is sharper is the one to be blurred.
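The text leaves open how sharpness is compared. One common no-reference measure that could stand in for it is the variance of the Laplacian response; the higher-scoring image would be the blur candidate. This is an assumption for illustration, not the measure used by the embodiment, and both function names are hypothetical.

```python
import numpy as np

def laplacian_variance(gray: np.ndarray) -> float:
    """Variance of the Laplacian response: a common no-reference
    sharpness score (higher means sharper)."""
    k = np.array([[0, 1, 0],
                  [1, -4, 1],
                  [0, 1, 0]], dtype=np.float64)
    h, w = gray.shape
    resp = np.zeros((h - 2, w - 2))
    for dy in range(3):               # 3x3 convolution over the valid region
        for dx in range(3):
            resp += k[dy, dx] * gray[dy:dy + h - 2, dx:dx + w - 2]
    return float(resp.var())

def index_of_sharper(img_a: np.ndarray, img_b: np.ndarray) -> int:
    """Return 0 if img_a is sharper, else 1; per the method described
    above, the sharper image is the one to blur before fusion."""
    return 0 if laplacian_variance(img_a) >= laplacian_variance(img_b) else 1
```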
其中，该拍照方法可在用户调整的变焦倍数不是电子设备设置的各个摄像头的焦距对应的变焦倍数时应用。且当电子设备设置的不同焦距的摄像头的个数为三个及以上时，该拍照方法中涉及的两个摄像头可以根据用户调整的变焦倍数来具体确定。如，可将焦距对应的变焦倍数大于用户调整的变焦倍数的一个摄像头和焦距对应的变焦倍数小于用户调整的变焦倍数的一个摄像头分别作为该拍照方法中涉及的两个摄像头。又如，可根据不同的变焦倍数范围设置不同的两个摄像头的组合，然后根据用户调整的变焦倍数所处的变焦倍数范围确定相应的两个摄像头。当然，在本申请实施例的其他实施方式中，电子设备还可以始终通过固定的两个不同焦距的摄像头来进行图像拍摄等。因此，在本申请实施例中，对于电子设备何时应用该拍照方法以通过第一摄像头和第二摄像头一起进行图像拍摄不做限制，可根据实际需求设置。The photographing method may be applied when the user-adjusted zoom factor is not the zoom factor corresponding to the focal length of any camera of the electronic device. When the electronic device has three or more cameras with different focal lengths, the two cameras involved in the method may be determined according to the user-adjusted zoom factor. For example, a camera whose focal length corresponds to a zoom factor greater than the user-adjusted zoom factor and a camera whose focal length corresponds to a zoom factor smaller than it may serve as the two cameras. For another example, different combinations of two cameras may be set for different zoom-factor ranges, and the two cameras are then determined according to the range in which the user-adjusted zoom factor falls. Of course, in other implementations of the embodiments of the present application, the electronic device may also always capture images through two fixed cameras with different focal lengths. Therefore, in the embodiments of the present application, there is no limitation on when the electronic device applies the method to capture images through the first camera and the second camera together, and this may be set according to actual needs.
同时使用的两个焦距不同的摄像头，其焦距可以为相邻的焦距等。例如，两个摄像头一个为超广角摄像头，一个为主摄摄像头，或一个为主摄摄像头，一个为长焦摄像头等。The two cameras with different focal lengths used at the same time may have adjacent focal lengths. For example, one of the two cameras is the ultra-wide-angle camera and the other is the main camera, or one is the main camera and the other is the telephoto camera.
拍摄图像(即上述的第四图像)是指用户通过手机进行拍摄时,手机最终拍摄得到的图像,或手机最终拍摄展示给用户的图像。The captured image (that is, the above-mentioned fourth image) refers to an image finally captured by the mobile phone when the user uses the mobile phone to capture the image, or an image finally captured by the mobile phone and displayed to the user.
需要说明的是，由两个摄像头中焦距相对较小的摄像头拍摄得到的图像，如上述的第一图像，通常为对应摄像头（例如，两个摄像头中焦距相对较小的摄像头）拍摄的图像经数码变焦调整到用户调整的变焦倍数（即当前的变焦倍数）后的图像。从而使融合后的图像能够满足用户调整的变焦倍数。当然，在一些其他实施方式中，还可以是直接将对应摄像头拍摄得到的图像作为上述的第一图像，在后续进行融合时，电子设备可以将上述的第一图像经数码变焦调整到用户调整的变焦倍数（即当前的变焦倍数）后再与第三图像融合以作为拍摄图像。此处，对于使最终的拍摄图像满足用户调整的变焦倍数的具体方式，不做限制。It should be noted that the image captured by the camera with the relatively shorter focal length of the two, such as the above first image, is usually the image obtained after the corresponding camera's output is digitally zoomed to the user-adjusted zoom factor (that is, the current zoom factor), so that the fused image satisfies the user-adjusted zoom factor. Of course, in some other implementations, the image captured by the corresponding camera may also be used directly as the first image; during subsequent fusion, the electronic device may digitally zoom the first image to the user-adjusted zoom factor (that is, the current zoom factor) before fusing it with the third image to obtain the captured image. The specific manner of making the final captured image satisfy the user-adjusted zoom factor is not limited here.
如此，当电子设备通过两个摄像头拍摄，再将分别拍摄得到的图像进行融合以得到拍摄图像时，电子设备通过对获取的两个图像中清晰度更高的图像进行模糊处理的方式降低对应图像的清晰度。从而能够减小因摄像头间的分辨率、降噪能力的差异导致的两个摄像头分别获取得到的图像之间的清晰度差异。进而，将两个清晰度差异较小的图像进行融合，便能够得到融合边界不明显，拼接感较弱的拍摄图像。In this way, when the electronic device shoots through two cameras and fuses the separately captured images to obtain the captured image, the electronic device reduces the sharpness of the sharper of the two acquired images by blurring it. This reduces the sharpness difference between the two cameras' images that is caused by differences in resolution and noise-reduction capability between the cameras. Fusing two images with a small sharpness difference then yields a captured image with an inconspicuous fusion boundary and a weak sense of splicing.
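The blur-then-fuse idea described above can be sketched as follows. A simple box blur stands in for whatever blur the device actually applies (Figure 9 in the drawings relates blur strength to structural similarity, suggesting the strength would be tuned rather than fixed), and the fixed blend weight in `fuse` is an assumption for illustration; both function names are hypothetical.

```python
import numpy as np

def box_blur(img: np.ndarray, radius: int = 1) -> np.ndarray:
    """Mean filter with edge replication; a stand-in for the blur step,
    whose strength would in practice be chosen to match the other
    camera's sharpness."""
    pad = np.pad(img, radius, mode="edge")
    out = np.zeros(img.shape, dtype=np.float64)
    k = 2 * radius + 1
    for dy in range(k):               # accumulate the k x k neighbourhood
        for dx in range(k):
            out += pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def fuse(first: np.ndarray, third: np.ndarray, alpha: float = 0.5) -> np.ndarray:
    """Blend the digitally zoomed first image with the blurred third
    image; both frames are assumed co-registered and equal-sized."""
    return alpha * first + (1.0 - alpha) * third
```

Because the blurred frame's sharpness is brought closer to the other frame's before blending, the sharpness jump at the fusion boundary is smaller than with the hard paste used in the related-art approach.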
以下,将结合附图对本申请实施例提供的拍照方法进行说明。Hereinafter, the photographing method provided in the embodiment of the present application will be described with reference to the accompanying drawings.
在本申请实施例中，具有拍照功能的电子设备，可以是手机、平板电脑、手持计算机，PC，蜂窝电话，个人数字助理（personal digital assistant，PDA），可穿戴式设备（如：智能手表、智能手环），智能家居设备（如：电视机），车机（如：车载电脑），智慧屏，游戏机，以及增强现实（augmented reality，AR）/虚拟现实（virtual reality，VR）设备等。本申请实施例对于电子设备的具体设备形态不作特殊限制。In the embodiments of the present application, the electronic device with a photographing function may be a mobile phone, a tablet computer, a handheld computer, a PC, a cellular phone, a personal digital assistant (PDA), a wearable device (such as a smart watch or a smart band), a smart home device (such as a television), an in-vehicle device (such as an in-vehicle computer), a smart screen, a game console, an augmented reality (AR)/virtual reality (VR) device, or the like. The embodiments of the present application do not specially limit the specific device form of the electronic device.
其中,本申请实施例中,电子设备设置有至少两个不同焦距的摄像头(即两个不同视场角的摄像头)。示例地,电子设备设置有一个主摄摄像头(通常为广角摄像头),一个相比于主摄摄像头焦距更长的长焦摄像头和一个相比于主摄摄像头焦距更短的超广角摄像头。Wherein, in the embodiment of the present application, the electronic device is provided with at least two cameras with different focal lengths (that is, two cameras with different viewing angles). For example, the electronic device is provided with a main camera (usually a wide-angle camera), a telephoto camera with a longer focal length than the main camera, and an ultra-wide-angle camera with a shorter focal length than the main camera.
示例地,以电子设备为手机为例,图5示出了本申请实施例提供的一种电子设备的结构示意图。也即,示例性的,图5所示的电子设备可以是手机。Exemplarily, taking the electronic device as a mobile phone as an example, FIG. 5 shows a schematic structural diagram of an electronic device provided by an embodiment of the present application. That is, for example, the electronic device shown in FIG. 5 may be a mobile phone.
如图5所示，电子设备可以包括处理器510，外部存储器接口520，内部存储器521，通用串行总线（universal serial bus，USB）接口530，充电管理模块540，电源管理模块541，电池542，天线1，天线2，移动通信模块550，无线通信模块560，音频模块570，扬声器570A，受话器570B，麦克风570C，耳机接口570D，传感器模块580，按键590，马达591，指示器592，摄像头593，显示屏594，以及用户标识模块（subscriber identification module，SIM）卡接口595等。其中，传感器模块580可以包括压力传感器580A，陀螺仪传感器580B，气压传感器580C，磁传感器580D，加速度传感器580E，距离传感器580F，接近光传感器580G，指纹传感器580H，温度传感器580J，触摸传感器580K，环境光传感器580L，骨传导传感器580M等。As shown in Figure 5, the electronic device may include a processor 510, an external memory interface 520, an internal memory 521, a universal serial bus (USB) interface 530, a charging management module 540, a power management module 541, a battery 542, an antenna 1, an antenna 2, a mobile communication module 550, a wireless communication module 560, an audio module 570, a speaker 570A, a receiver 570B, a microphone 570C, an earphone jack 570D, a sensor module 580, a button 590, a motor 591, an indicator 592, a camera 593, a display screen 594, a subscriber identification module (SIM) card interface 595, and the like. The sensor module 580 may include a pressure sensor 580A, a gyroscope sensor 580B, an air pressure sensor 580C, a magnetic sensor 580D, an acceleration sensor 580E, a distance sensor 580F, a proximity light sensor 580G, a fingerprint sensor 580H, a temperature sensor 580J, a touch sensor 580K, an ambient light sensor 580L, a bone conduction sensor 580M, and the like.
可以理解的是,本实施例示意的结构并不构成对电子设备的具体限定。在另一些实施例中,电子设备可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。It should be understood that the structure shown in this embodiment does not constitute a specific limitation on the electronic device. In other embodiments, the electronic device may include more or fewer components than shown, or combine certain components, or separate certain components, or arrange different components. The illustrated components can be realized in hardware, software or a combination of software and hardware.
处理器510可以包括一个或多个处理单元，例如：处理器510可以包括应用处理器（application processor，AP），调制解调处理器，图形处理器（graphics processing unit，GPU），图像信号处理器（image signal processor，ISP），控制器，存储器，视频编解码器，数字信号处理器（digital signal processor，DSP），基带处理器，和/或神经网络处理器（neural-network processing unit，NPU）等。其中，不同的处理单元可以是独立的器件，也可以集成在一个或多个处理器中。The processor 510 may include one or more processing units. For example, the processor 510 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated into one or more processors.
控制器可以是电子设备的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。A controller can be the nerve center and command center of an electronic device. The controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of fetching and executing the instruction.
处理器510中还可以设置存储器，用于存储指令和数据。在一些实施例中，处理器510中的存储器为高速缓冲存储器。该存储器可以保存处理器510刚用过或循环使用的指令或数据。如果处理器510需要再次使用该指令或数据，可从所述存储器中直接调用。避免了重复存取，减少了处理器510的等待时间，因而提高了系统的效率。A memory may also be provided in the processor 510 for storing instructions and data. In some embodiments, the memory in the processor 510 is a cache. The memory may hold instructions or data that the processor 510 has just used or uses repeatedly. If the processor 510 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated access and reduces the waiting time of the processor 510, thereby improving the efficiency of the system.
在一些实施例中,处理器510可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。In some embodiments, processor 510 may include one or more interfaces. The interface may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous transmitter (universal asynchronous receiver/transmitter, UART) interface, mobile industry processor interface (mobile industry processor interface, MIPI), general-purpose input and output (general-purpose input/output, GPIO) interface, subscriber identity module (subscriber identity module, SIM) interface, and /or universal serial bus (universal serial bus, USB) interface, etc.
电子设备的无线通信功能可以通过天线1,天线2,移动通信模块550,无线通信模块560,调制解调处理器以及基带处理器等实现。The wireless communication function of the electronic device can be realized by the antenna 1, the antenna 2, the mobile communication module 550, the wireless communication module 560, the modem processor and the baseband processor.
天线1和天线2用于发射和接收电磁波信号。电子设备中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。 Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in an electronic device can be used to cover a single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
移动通信模块550可以提供应用在电子设备上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块550可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块550可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块550还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块550的至少部分功能模块可以被设置于处理器510中。在一些实施例中,移动通信模块550的至少部分功能模块可以与处理器510的至少部分模 块被设置在同一个器件中。The mobile communication module 550 can provide wireless communication solutions including 2G/3G/4G/5G applied to electronic devices. The mobile communication module 550 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like. The mobile communication module 550 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation. The mobile communication module 550 can also amplify the signal modulated by the modem processor, convert it into electromagnetic wave and radiate it through the antenna 1 . In some embodiments, at least part of the functional modules of the mobile communication module 550 may be set in the processor 510 . In some embodiments, at least part of the functional modules of the mobile communication module 550 and at least part of the modules of the processor 510 may be set in the same device.
无线通信模块560可以提供应用在电子设备上的包括无线局域网（wireless local area networks，WLAN）（如无线保真（wireless fidelity，Wi-Fi）网络），蓝牙（bluetooth，BT），全球导航卫星系统（global navigation satellite system，GNSS），调频（frequency modulation，FM），近距离无线通信技术（near field communication，NFC），红外技术（infrared，IR）等无线通信的解决方案。无线通信模块560可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块560经由天线2接收电磁波，将电磁波信号调频以及滤波处理，将处理后的信号发送到处理器510。无线通信模块560还可以从处理器510接收待发送的信号，对其进行调频，放大，经天线2转为电磁波辐射出去。The wireless communication module 560 can provide solutions for wireless communication applied on the electronic device, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and the like. The wireless communication module 560 may be one or more devices integrating at least one communication processing module. The wireless communication module 560 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signals, and sends the processed signals to the processor 510. The wireless communication module 560 may also receive signals to be sent from the processor 510, perform frequency modulation and amplification on them, and convert them into electromagnetic waves for radiation via the antenna 2.
在一些实施例中,电子设备的天线1和移动通信模块550耦合,天线2和无线通信模块560耦合,使得电子设备可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯***(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位***(global positioning system,GPS),全球导航卫星***(global navigation satellite system,GLONASS),北斗卫星导航***(beidou navigation satellite system,BDS),准天顶卫星***(quasi-zenith satellite system,QZSS)和/或星基增强***(satellite based augmentation systems,SBAS)。In some embodiments, the antenna 1 of the electronic device is coupled to the mobile communication module 550, and the antenna 2 is coupled to the wireless communication module 560, so that the electronic device can communicate with the network and other devices through wireless communication technology. The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), broadband Code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC , FM, and/or IR techniques, etc. The GNSS may include a global positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a Beidou navigation satellite system (beidou navigation satellite system, BDS), a quasi-zenith satellite system (quasi -zenith satellite system (QZSS) and/or satellite based augmentation systems (SBAS).
电子设备通过GPU,显示屏594,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏594和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器510可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。The electronic device realizes the display function through the GPU, the display screen 594, and the application processor. The GPU is a microprocessor for image processing, connected to the display screen 594 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering. Processor 510 may include one or more GPUs that execute program instructions to generate or alter display information.
显示屏594用于显示图像,视频等。显示屏594包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备可以包括1个或N个显示屏594,N为大于1的正整数。The display screen 594 is used to display images, videos and the like. Display 594 includes a display panel. The display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active matrix organic light emitting diode or an active matrix organic light emitting diode (active-matrix organic light emitting diode, AMOLED), flexible light-emitting diode (flex light-emitting diode, FLED), Miniled, MicroLed, Micro-oLed, quantum dot light emitting diodes (quantum dot light emitting diodes, QLED), etc. In some embodiments, the electronic device may include 1 or N display screens 594, where N is a positive integer greater than 1.
电子设备可以通过ISP,摄像头593,视频编解码器,GPU,显示屏594以及应用处理器等实现拍摄功能。在一些实施例中,电子设备可以包括1个或N个摄像头593,N为大于1的正整数。示例地,在本申请实施例中,电子设备可以包括三个摄像头,其中一个为主摄摄像头,一个为长焦摄像头,一个为超广角摄像头。The electronic device can realize the shooting function through ISP, camera 593 , video codec, GPU, display screen 594 and application processor. In some embodiments, the electronic device may include 1 or N cameras 593, where N is a positive integer greater than 1. For example, in the embodiment of the present application, the electronic device may include three cameras, one of which is a main camera, one is a telephoto camera, and one is a super wide-angle camera.
内部存储器521可以用于存储计算机可执行程序代码，所述可执行程序代码包括指令。处理器510通过运行存储在内部存储器521的指令，从而执行电子设备的各种功能应用以及数据处理。内部存储器521可以包括存储程序区和存储数据区。其中，存储程序区可存储操作系统，至少一个功能所需的应用程序(比如声音播放功能，图像播放功能等)等。存储数据区可存储电子设备使用过程中所创建的数据(比如音频数据，电话本等)等。此外，内部存储器521可以包括高速随机存取存储器，还可以包括非易失性存储器，例如至少一个磁盘存储器件，闪存器件，通用闪存存储器(universal flash storage,UFS)等。The internal memory 521 may be used to store computer-executable program code, which includes instructions. The processor 510 executes the various functional applications and data processing of the electronic device by running the instructions stored in the internal memory 521. The internal memory 521 may include a program storage area and a data storage area. The program storage area may store an operating system and at least one application program required by a function (such as a sound playing function, an image playing function, etc.). The data storage area may store data created during use of the electronic device (such as audio data, a phone book, etc.). In addition, the internal memory 521 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
当然，可以理解的，上述图5所示仅仅为电子设备的形态为手机时的示例性说明。若电子设备是平板电脑，手持计算机，PC，PDA，可穿戴式设备(如：智能手表、智能手环)，智能家居设备(如：电视机)，车机(如：车载电脑)，智慧屏，游戏机以及AR/VR设备等其他设备形态时，电子设备的结构中可以包括比图5中所示更少的结构，也可以包括比图5中所示更多的结构，在此不作限制。Of course, it can be understood that FIG. 5 above is only an exemplary illustration for the case where the electronic device is a mobile phone. If the electronic device takes another form, such as a tablet computer, a handheld computer, a PC, a PDA, a wearable device (e.g., a smart watch or smart band), a smart home device (e.g., a television), an in-vehicle unit (e.g., an in-vehicle computer), a smart screen, a game console, or an AR/VR device, the structure of the electronic device may include fewer or more components than those shown in FIG. 5, which is not limited here.
以下实施例中的方法均可以在具有上述硬件结构的电子设备中实现。The methods in the following embodiments can all be implemented in electronic devices having the above hardware structure.
在本申请实施例中，如图6所示，电子设备的系统架构可以包括应用层、框架层(framework)、硬件抽象层(hardware abstraction layer,HAL)、驱动层(driver)、固件层(firmware,FW)以及硬件层(hardware,HW)。其中，应用层可用于部署应用程序，例如本申请实施例中，应用层中可以部署有相机应用。框架层可以是[系统框架，原文以图示给出]框架、[系统框架，原文以图示给出]框架等系统框架，此处不做限制。硬件抽象层能够部署各个硬件的统一接口，例如，本申请实施例中，硬件抽象层中可以部署有相机硬件抽象层(Camera HAL3)。用于实现本申请实施例提供的拍照方法的模块(相机算法模块(Libcamera algo))也可以部署在硬件抽象层中。驱动层可以用于部署各硬件设备的驱动组件，例如，本申请实施例中，驱动层可以部署有视频设备驱动(V4L2 Driver)，图像视频处理器(image video processor,IVP)驱动(IVP Driver)或DSP驱动(DSP Driver)，NPU驱动(NPU Driver)，GPU驱动(GPU Driver)等。固件层能够用于部署各硬件设备的固件，例如，本申请实施例中固件层可以部署物联网固件(lite-OS FW)以便于驱动影像传感器、飞行时间(time of flight,TOF)传感器、以及ISP等。硬件层则包括电子设备设置的各种硬件，例如本申请实施例中，硬件层可以包括影像传感器、TOF传感器、ISP、IVP或DSP、NPU、GPU等。In the embodiments of this application, as shown in FIG. 6, the system architecture of the electronic device may include an application layer, a framework layer, a hardware abstraction layer (HAL), a driver layer, a firmware layer (FW), and a hardware layer (HW). The application layer is used to deploy applications; for example, in the embodiments of this application, a camera application may be deployed in the application layer. The framework layer may be a system framework such as the [system framework shown as an image in the original] framework or the [system framework shown as an image in the original] framework, which is not limited here. The hardware abstraction layer provides a unified interface for each piece of hardware; for example, a camera hardware abstraction layer (Camera HAL3) may be deployed in it, and the module implementing the photographing method provided by the embodiments of this application (the camera algorithm module, Libcamera algo) may also be deployed in the hardware abstraction layer. The driver layer is used to deploy the driver components of the hardware devices; for example, it may contain a video device driver (V4L2 Driver), an image video processor (IVP) driver or a DSP driver, an NPU driver, a GPU driver, and so on. The firmware layer is used to deploy the firmware of the hardware devices; for example, it may deploy Internet-of-Things firmware (lite-OS FW) to drive the image sensor, the time-of-flight (TOF) sensor, the ISP, and so on. The hardware layer includes the various hardware components of the electronic device; for example, it may include the image sensor, the TOF sensor, the ISP, the IVP or DSP, the NPU, the GPU, etc.
示例地，实现本申请实施例提供的拍照方法的模块（相机算法模块）可以在用户打开应用层中部署的相机应用时初始化到硬件抽象层中。用户在相机应用中调整变焦倍数为需要电子设备通过两个摄像头同时拍摄的变焦倍数时，响应于用户的拍照操作（例如，相机应用中包括预览界面，预览界面包括第一控件（或称为快门控件），拍照操作为用户对第一控件的第一操作，如，点击快门控件等），应用层中的相机应用便可以将拍照指令依次经框架层、相机硬件抽象层、视频设备驱动以及物联网固件发送给影像传感器，从而使影像传感器能够响应于拍照指令获取图像。其中，根据电子设备设置的摄像头不同，各个摄像头的影像传感器有所不同，相机应用可以根据所需使用的摄像头向对应摄像头的影像传感器下发拍照指令。例如，电子设备设置有主摄摄像头、长焦摄像头和超广角摄像头时，若电子设备需要采用主摄摄像头和长焦摄像头一同拍摄，则相机应用可以分别向主摄摄像头的影像传感器和长焦摄像头的影像传感器发送拍照指令。若电子设备需要采用主摄摄像头和超广角摄像头一同拍摄，则相机应用可以分别向主摄摄像头的影像传感器和超广角摄像头的影像传感器发送拍照指令。当相应的两个影像传感器接收到拍照指令获取了图像后，影像传感器可以将图像发送给ISP。ISP对接收到的图像按照预设方式进行处理后便能够将处理后的两个图像经物联网固件和视频设备驱动发送给相机硬件抽象层。相机硬件抽象层接收到两个图像后，可以将两个图像发送给用于实现本申请实施例的拍照方法的相机算法模块。该相机算法模块接收到两个图像后，可以根据本申请实施例的拍照方法利用相应的驱动（如，IVP或DSP Driver、NPU Driver、GPU Driver等）调用对应的硬件（如IVP或DSP、NPU、GPU等）对两个图像中由焦距相对较大的摄像头拍摄得到的图像进行模糊处理，并将模糊处理后的图像与另一图像进行融合以得到拍摄图像。最后，相机算法模块可以从处理融合图像得到拍摄图像的硬件中获取拍摄图像，并将由模糊处理后的图像和另一图像融合得到的拍摄图像经相机硬件抽象层和框架层发送给应用层中部署的相机应用，以便于相机应用显示和/或存储接收到的拍摄图像。For example, the module implementing the photographing method provided by the embodiments of this application (the camera algorithm module) may be initialized into the hardware abstraction layer when the user opens the camera application deployed in the application layer. When the user adjusts the zoom factor in the camera application to a zoom factor that requires the electronic device to shoot with two cameras simultaneously, then in response to the user's photographing operation (for example, the camera application includes a preview interface, the preview interface includes a first control (also called a shutter control), and the photographing operation is the user's first operation on the first control, such as tapping the shutter control), the camera application in the application layer sends a photographing instruction, in turn through the framework layer, the camera hardware abstraction layer, the video device driver, and the Internet-of-Things firmware, to the image sensor, so that the image sensor can acquire an image in response to the photographing instruction.
Depending on the cameras provided on the electronic device, each camera has its own image sensor, and the camera application sends the photographing instruction to the image sensor of whichever camera is to be used. For example, when the electronic device is equipped with a main camera, a telephoto camera, and an ultra-wide-angle camera, if it needs to shoot with the main camera and the telephoto camera together, the camera application sends a photographing instruction to the image sensor of the main camera and to the image sensor of the telephoto camera, respectively; if it needs to shoot with the main camera and the ultra-wide-angle camera together, the camera application sends a photographing instruction to the image sensor of the main camera and to the image sensor of the ultra-wide-angle camera, respectively. After the two corresponding image sensors receive the photographing instruction and acquire images, they send the images to the ISP. After processing the received images in a preset manner, the ISP sends the two processed images to the camera hardware abstraction layer through the Internet-of-Things firmware and the video device driver. After receiving the two images, the camera hardware abstraction layer forwards them to the camera algorithm module that implements the photographing method of the embodiments of this application. On receiving the two images, the camera algorithm module can, according to the photographing method of the embodiments of this application, use the corresponding driver (e.g., the IVP or DSP driver, the NPU driver, the GPU driver) to invoke the corresponding hardware (e.g., the IVP or DSP, the NPU, the GPU) to blur the one of the two images captured by the camera with the relatively longer focal length, and fuse the blurred image with the other image to obtain the captured image.
Finally, the camera algorithm module can obtain the captured image from the hardware that performed the fusion, and send this captured image, obtained by fusing the blurred image with the other image, through the camera hardware abstraction layer and the framework layer to the camera application deployed in the application layer, so that the camera application can display and/or store it.
以下将以电子设备为手机为例进行说明：该手机设置有一个主摄摄像头（广角摄像头）、一个长焦摄像头和一个超广角摄像头，其中，主摄摄像头的焦距对应的变焦倍数设置为1.0x，超广角摄像头的焦距对应的变焦倍数设置为0.4x，长焦摄像头的焦距对应的变焦倍数设置为3.5x。当用户调整变焦倍数为2.0x-3.5x时，手机使用主摄摄像头和长焦摄像头进行拍摄；当用户调整变焦倍数为0.6x-0.9x时，手机使用超广角摄像头和主摄摄像头进行拍摄。下面以此为例，对本申请实施例提供的一种拍照方法的具体实施方式进行举例说明。In the following, the electronic device is taken to be a mobile phone provided with a main camera (a wide-angle camera), a telephoto camera, and an ultra-wide-angle camera, where the zoom factor corresponding to the focal length of the main camera is set to 1.0x, that of the ultra-wide-angle camera to 0.4x, and that of the telephoto camera to 3.5x. When the user adjusts the zoom factor to 2.0x-3.5x, the mobile phone shoots with the main camera and the telephoto camera; when the user adjusts the zoom factor to 0.6x-0.9x, the mobile phone shoots with the ultra-wide-angle camera and the main camera. Taking this as an example, a specific implementation of the photographing method provided by the embodiments of this application is described below.
图7示出了本申请实施例提供的一种拍照方法的流程示意图。如图7所示,该拍照方法可以包括以下S701-S703。FIG. 7 shows a schematic flowchart of a photographing method provided by an embodiment of the present application. As shown in FIG. 7, the photographing method may include the following steps S701-S703.
当用户打开手机的拍照界面，并调整变焦倍数时，若调整的变焦倍数不是手机设置的各个摄像头的焦距对应的变焦倍数（即在本示例中调整的变焦倍数不是0.4x、1.0x和3.5x），并且调整后的变焦倍数位于预设的使用两个摄像头拍摄的变焦倍数范围内，则手机可以在用户进行拍照操作时，使用相应的两个不同焦距的摄像头进行拍摄以分别得到两个摄像头拍摄得到的图像。例如，手机执行以下S701。When the user opens the camera interface of the mobile phone and adjusts the zoom factor, if the adjusted zoom factor is not one of the zoom factors corresponding to the focal lengths of the cameras of the mobile phone (i.e., in this example, not 0.4x, 1.0x, or 3.5x), and the adjusted zoom factor falls within a preset zoom-factor range for shooting with two cameras, then when the user performs a photographing operation, the mobile phone shoots with the two corresponding cameras of different focal lengths to obtain one image from each camera. For example, the mobile phone performs the following S701.
S701、当变焦倍数位于预设范围时,响应于用户的拍照操作,手机通过第一摄像头拍摄获取第一图像,通过第二摄像头拍摄获取第二图像。S701. When the zoom factor is within a preset range, in response to a user's camera operation, the mobile phone acquires a first image through a first camera, and acquires a second image through a second camera.
其中，第一摄像头获取第一图像和第二摄像头获取第二图像可以同时进行，当然在一些其他实施方式中，第一摄像头获取第一图像和第二摄像头获取第二图像也可以以较小的时间间隔分别进行，此处不做限制。The first camera acquiring the first image and the second camera acquiring the second image may be performed simultaneously; of course, in some other implementations, they may also be performed separately at a small time interval, which is not limited here.
需要说明的是，第一摄像头和第二摄像头的焦距不同，示例地，在本申请实施例中第二摄像头的焦距大于第一摄像头的焦距（以下皆以此为例），即第一摄像头可以是本示例中电子设备设置的主摄摄像头，则第二摄像头可以是电子设备设置的长焦摄像头，或者，第一摄像头可以是本示例中电子设备设置的超广角摄像头，则第二摄像头可以是电子设备设置的主摄摄像头。从而能够使通过焦距相对较长的第二摄像头拍摄的第二图像包含于焦距相对较短的第一摄像头拍摄的第一图像中，以便于后续将第一图像和第二图像进行融合。It should be noted that the focal lengths of the first camera and the second camera are different. For example, in the embodiments of this application the focal length of the second camera is greater than that of the first camera (this is taken as the example hereinafter); that is, the first camera may be the main camera of the electronic device in this example and the second camera the telephoto camera, or the first camera may be the ultra-wide-angle camera and the second camera the main camera. In this way, the scene captured in the second image, taken by the second camera with the relatively longer focal length, is contained within the first image, taken by the first camera with the relatively shorter focal length, which facilitates the subsequent fusion of the first image and the second image.
通常，第一摄像头和第二摄像头组成的摄像头组合与预设范围对应，即不同的预设范围对应不同的摄像头组合。例如，在本申请实施例中，当预设范围为2.0x-3.5x时，第一摄像头可以是主摄摄像头、第二摄像头可以是长焦摄像头，而当预设范围为0.6x-0.9x时，第一摄像头可以是超广角摄像头、第二摄像头可以是主摄摄像头。Usually, the camera combination composed of the first camera and the second camera corresponds to a preset range; that is, different preset ranges correspond to different camera combinations. For example, in the embodiments of this application, when the preset range is 2.0x-3.5x, the first camera may be the main camera and the second camera the telephoto camera; when the preset range is 0.6x-0.9x, the first camera may be the ultra-wide-angle camera and the second camera the main camera.
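The range-to-camera-pair mapping described above can be sketched as follows. This is an illustrative sketch only: the function name and camera labels are invented for the example, while the zoom ranges are the ones used in this embodiment.

```python
def pick_camera_pair(zoom: float):
    """Return the (first camera, second camera) pair for a zoom factor.

    Labels are illustrative; per the description above, the second
    camera always has the longer focal length.
    """
    if 2.0 <= zoom <= 3.5:
        return ("main", "telephoto")   # preset range 2.0x-3.5x
    if 0.6 <= zoom <= 0.9:
        return ("ultra_wide", "main")  # preset range 0.6x-0.9x
    return None  # outside the dual-camera ranges: single-camera capture
```

For instance, a zoom factor of 2.5x selects the main camera as the first camera and the telephoto camera as the second, matching the example in the next paragraph.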
示例地，当用户调整设置的变焦倍数为2.5x时，由于该变焦倍数位于预设的2.0x-3.5x范围内，因此，响应于用户的拍照操作，手机可以通过主摄摄像头（即第一摄像头为主摄摄像头）拍摄获取第一图像、通过长焦摄像头（即第二摄像头为长焦摄像头）拍摄获取第二图像。当用户调整设置的变焦倍数为0.7x时，由于该变焦倍数位于预设的0.6x-0.9x范围内，因此，响应于用户的拍照操作，手机可以通过超广角摄像头（即第一摄像头为超广角摄像头）拍摄获取第一图像、通过主摄摄像头（即第二摄像头为主摄摄像头）拍摄获取第二图像。For example, when the zoom factor set by the user is 2.5x, since this zoom factor is within the preset 2.0x-3.5x range, in response to the user's photographing operation the mobile phone acquires the first image through the main camera (i.e., the first camera is the main camera) and the second image through the telephoto camera (i.e., the second camera is the telephoto camera). When the zoom factor set by the user is 0.7x, since this zoom factor is within the preset 0.6x-0.9x range, in response to the user's photographing operation the mobile phone acquires the first image through the ultra-wide-angle camera (i.e., the first camera is the ultra-wide-angle camera) and the second image through the main camera (i.e., the second camera is the main camera).
其中,手机的拍照界面中可以包括预览界面,在预览界面中包括有第一控件(或称为快门控件、拍照控件),用户的拍照操作可以是用户对第一控件的第一操作(如点击操作、长按操作等)。示例地,如图8所示,手机中显示有预览界面,该界面中包括预览框、拍照控件801以及变焦控件802。其中,预览框用于显示当前变焦倍数和拍照模式下的被摄对象的预览图像。拍照控件801则用于触发手机的拍照动作。变焦控件802则可用于调整变焦倍数,变焦控件上可以显示有当前的变焦倍数。如图8所示,用户将变焦倍数调整为2.5x时,预览框中可以显示变焦倍数为2.5x时的预览图像。此时,用户可以通过点击拍照控件801来进行拍照操作。Wherein, the camera interface of the mobile phone may include a preview interface, and the preview interface includes a first control (or called a shutter control, a camera control), and the user's camera operation may be the user's first operation on the first control (such as clicking operation, long press operation, etc.). For example, as shown in FIG. 8 , a preview interface is displayed on the mobile phone, and the interface includes a preview frame, a camera control 801 and a zoom control 802 . Wherein, the preview frame is used to display the current zoom factor and the preview image of the subject in the photographing mode. The camera control 801 is used to trigger the camera action of the mobile phone. The zoom control 802 can be used to adjust the zoom factor, and the current zoom factor can be displayed on the zoom control. As shown in FIG. 8 , when the user adjusts the zoom factor to 2.5x, a preview image when the zoom factor is 2.5x may be displayed in the preview box. At this point, the user can click the camera control 801 to perform a camera operation.
可选地,在本申请实施例的其他实施方式中,用户的拍照操作还可以是对预设的按键(如电源键、音量键等)进行的按压操作。因此,在本申请实施例中,对用户的拍照操作不做限制,只要是用于触发手机进行拍照的操作均为用户的拍照操作。Optionally, in other implementations of the embodiment of the present application, the user's photographing operation may also be a pressing operation on a preset key (such as a power key, a volume key, etc.). Therefore, in the embodiment of the present application, there is no restriction on the user's photographing operation, as long as the operation used to trigger the mobile phone to take a photograph is the user's photographing operation.
手机通过第一摄像头拍摄获取的第一图像，可以是通过第一摄像头拍摄得到的与第一摄像头焦距对应的变焦倍数匹配的图像（例如第一摄像头采集的图像经ISP处理后的图像），还可以是第一摄像头拍摄得到的与第一摄像头焦距对应的变焦倍数匹配的图像通过数码变焦处理后的图像，即与用户调整设置的变焦倍数（即当前的变焦倍数）相匹配的图像。手机通过第二摄像头拍摄获取的第二图像，则可以是第二摄像头拍摄得到的与第二摄像头焦距对应的变焦倍数匹配的图像（例如第二摄像头采集的图像经ISP处理后的图像）。其中，第一摄像头拍摄获取的第一图像为通过第一摄像头拍摄得到的与第一摄像头焦距对应的变焦倍数匹配的图像时，手机可以在后续进行融合时，将第一图像经数码变焦调整到用户调整的变焦倍数（即当前的变焦倍数）后再和第三图像融合以得到与用户调整设置的变焦倍数相匹配的融合后图像，从而便于后续以该融合后图像作为拍摄图像。The first image acquired by the mobile phone through the first camera may be an image captured by the first camera that matches the zoom factor corresponding to the focal length of the first camera (for example, an image collected by the first camera and processed by the ISP), or may be such an image further processed by digital zoom, i.e., an image matching the zoom factor set by the user (the current zoom factor). The second image acquired through the second camera may be an image captured by the second camera that matches the zoom factor corresponding to the focal length of the second camera (for example, an image collected by the second camera and processed by the ISP). When the first image matches the zoom factor corresponding to the focal length of the first camera, the mobile phone may, during the subsequent fusion, first adjust the first image by digital zoom to the zoom factor set by the user (the current zoom factor) and then fuse it with the third image (the blurred second image obtained in S702 below), so as to obtain a fused image matching the zoom factor set by the user, which is then used as the captured image.
当手机拍摄得到第一图像和第二图像后,手机可执行以下S702。After the first image and the second image are captured by the mobile phone, the mobile phone may perform the following S702.
S702、对第二图像按照预设规则进行模糊处理,得到第三图像。S702. Perform blurring processing on the second image according to a preset rule to obtain a third image.
其中，模糊处理可以包括高斯模糊、表面模糊、方框模糊、Kawase模糊、双重模糊、散景模糊、移轴模糊、光圈模糊、粒状模糊、径向模糊以及方向模糊等，即对第二图像可以采用高斯模糊、表面模糊、方框模糊、Kawase模糊、双重模糊、散景模糊、移轴模糊、光圈模糊、粒状模糊、径向模糊以及方向模糊等模糊算法进行模糊处理。The blurring may include Gaussian blur, surface blur, box blur, Kawase blur, dual blur, bokeh blur, tilt-shift blur, aperture blur, grainy blur, radial blur, directional blur, and so on; that is, the second image may be blurred using any of these blur algorithms.
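As one concrete example of the blur algorithms listed above, a Gaussian blur can be sketched with a separable convolution. This is a generic illustration, not the patent's implementation; in particular, how a "blur strength" maps to the kernel radius and sigma is an assumption made for the example.

```python
import numpy as np

def gaussian_kernel(radius: int, sigma: float) -> np.ndarray:
    # 1-D normalized Gaussian kernel of length 2*radius + 1.
    xs = np.arange(-radius, radius + 1, dtype=np.float64)
    kernel = np.exp(-(xs ** 2) / (2.0 * sigma ** 2))
    return kernel / kernel.sum()

def gaussian_blur(img: np.ndarray, strength: int) -> np.ndarray:
    # Assumption for illustration: a larger blur strength means a larger
    # kernel (the patent does not define this mapping).
    kernel = gaussian_kernel(radius=strength, sigma=max(strength / 2.0, 0.5))
    blurred = img.astype(np.float64)
    # Separable 2-D convolution: filter rows, then columns.
    blurred = np.apply_along_axis(np.convolve, 1, blurred, kernel, mode="same")
    blurred = np.apply_along_axis(np.convolve, 0, blurred, kernel, mode="same")
    return blurred
```

Blurring a single-pixel impulse spreads its energy over the neighborhood while preserving the total, which is the behavior the fusion step relies on: detail is attenuated, not relocated.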
作为一种示例，在本申请实施例中，对第二图像进行模糊处理的预设规则，可以是根据第一图像和第二图像之间的相似度（可以用于表征第一图像和第二图像间的清晰度差异大小，如第一图像和第二图像的相似度越高则清晰度差异越小，相似度越低则清晰度差异越大）来确定对应的模糊力度（blur），然后根据相应的模糊力度采用模糊算法对第二图像进行模糊处理。从而避免在第一图像和第二图像的清晰度差异较小时（即SSIM值较高时），对第二图像过度模糊处理导致融合后的图像相比于第一图像清晰度没有得到提升的情况。As an example, in the embodiments of this application, the preset rule for blurring the second image may be to determine the corresponding blur strength (blur) according to the similarity between the first image and the second image (which can characterize the sharpness difference between them: the higher the similarity, the smaller the sharpness difference; the lower the similarity, the larger the sharpness difference), and then blur the second image with a blur algorithm at that strength. This avoids the situation where, when the sharpness difference between the first image and the second image is small (i.e., when the SSIM value is high), over-blurring the second image leaves the fused image no sharper than the first image.
示例地,第一图像和第二图像之间的相似度可以采用结构相似性(structural similarity,SSIM)来表示。其中,第一图像(即图像x)和第二图像(即图像y)的SSIM值可以采用以下公式计算得到:For example, the similarity between the first image and the second image may be represented by structural similarity (structural similarity, SSIM). Wherein, the SSIM values of the first image (i.e. image x) and the second image (i.e. image y) can be calculated using the following formula:
SSIM(x,y) = ((2μ_x μ_y + c_1)(2σ_xy + c_2)) / ((μ_x² + μ_y² + c_1)(σ_x² + σ_y² + c_2))
其中，x为图像x（如第一图像），y为图像y（如第二图像），μ_x是x的平均值，μ_y是y的平均值，σ_x²是x的方差，σ_y²是y的方差，σ_xy是x和y的协方差，c_1=(k_1L)²，c_2=(k_2L)²，L是像素值的动态范围（即图像像素值的取值范围的最大值，例如，对于8位通道图像，其像素值取值范围为0-255，则L=255），k_1=0.01，k_2=0.03。Wherein x is image x (e.g., the first image), y is image y (e.g., the second image), μ_x is the mean of x, μ_y is the mean of y, σ_x² is the variance of x, σ_y² is the variance of y, σ_xy is the covariance of x and y, c_1 = (k_1·L)², c_2 = (k_2·L)², L is the dynamic range of the pixel values (i.e., the maximum of the pixel-value range; for example, for an 8-bit channel image whose pixel values range from 0 to 255, L = 255), k_1 = 0.01, and k_2 = 0.03.
SSIM值的范围为0到1。当两张图像一模一样时,SSIM值等于1。SSIM values range from 0 to 1. When the two images are exactly the same, the SSIM value is equal to 1.
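As an illustration of the formula above, a minimal single-window SSIM can be written as below. Note this computes the statistics once over the whole image rather than with the local sliding window used by the full SSIM algorithm; that simplification is mine, for clarity.

```python
import numpy as np

def global_ssim(x: np.ndarray, y: np.ndarray, L: float = 255.0) -> float:
    # Constants k1, k2 and the dynamic range L follow the text above.
    k1, k2 = 0.01, 0.03
    c1, c2 = (k1 * L) ** 2, (k2 * L) ** 2
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()            # sigma_x^2, sigma_y^2
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()  # sigma_xy
    return ((2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)) / (
        (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2)
    )
```

As stated above, identical images give an SSIM value of 1, and the value drops as the two images diverge.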
因此，可以基于第一图像和第二图像的SSIM值来标定最大模糊力度和最小模糊力度，从而得到第一图像和第二图像的SSIM值与模糊力度间的对应关系，以便于后续根据第一图像和第二图像的SSIM值来确定相应的模糊力度。Therefore, the maximum blur strength and the minimum blur strength can be calibrated based on the SSIM value of the first image and the second image, so as to obtain the correspondence between the SSIM value and the blur strength, which makes it possible to subsequently determine the corresponding blur strength from the SSIM value of the first image and the second image.
例如，可以根据经验数据（如测试验证得到的数据），确定能够界定第一图像和第二图像的相似度过低的低SSIM值（即SSIM值小于该低SSIM值时可确定第一图像和第二图像的相似度过低），以及能够界定第一图像和第二图像的相似度较高的高SSIM值（即SSIM值大于该高SSIM值时可确定第一图像和第二图像的相似度较高）。从而可基于低SSIM值来标定最大模糊力度，基于高SSIM值来标定最小模糊力度。如，在第一图像和第二图像的SSIM值为低SSIM值时线性调整模糊力度，直到对第二图像模糊处理后得到的第三图像与第一图像融合后能相比于第一图像具有较大清晰度提升，且融合边界不明显，则可以将此时的模糊力度作为最大模糊力度。同样地，在第一图像和第二图像的SSIM值为高SSIM值时线性调整模糊力度，直到对第二图像模糊处理后得到的第三图像与第一图像融合后能相比于第一图像具有较大清晰度提升，且融合边界不明显，则可以将此时的模糊力度作为最小模糊力度。然后，可以将低于上述低SSIM值的SSIM值对应的模糊力度均设置为上述最大模糊力度，将高于上述高SSIM值的SSIM值对应的模糊力度均设置为上述最小模糊力度，将介于低SSIM值和高SSIM值之间的SSIM值与最大模糊力度和最小模糊力度间的模糊力度进行线性对应。如，以确定出的低SSIM值为0.25，高SSIM值为0.38为例，若对SSIM值为0.25时的模糊力度进行标定得到最大模糊力度为9，对SSIM值为0.38时的模糊力度进行标定得到最小模糊力度为1，则可以得到如图9所示的模糊力度与SSIM值间的对应关系曲线。For example, based on empirical data (such as data obtained from test verification), a low SSIM value below which the similarity between the first image and the second image is considered too low, and a high SSIM value above which their similarity is considered high, can be determined. The maximum blur strength can then be calibrated at the low SSIM value, and the minimum blur strength at the high SSIM value. For instance, when the SSIM value of the first and second images equals the low SSIM value, the blur strength is adjusted linearly until the third image, obtained by blurring the second image, yields, after fusion with the first image, a clear sharpness improvement over the first image without an obvious fusion boundary; the blur strength at that point is taken as the maximum blur strength.
Similarly, when the SSIM value equals the high SSIM value, the blur strength is adjusted linearly until the fused image shows a clear sharpness improvement over the first image without an obvious fusion boundary; the blur strength at that point is taken as the minimum blur strength. Then, SSIM values below the low SSIM value are all mapped to the maximum blur strength, SSIM values above the high SSIM value are all mapped to the minimum blur strength, and SSIM values between the low and high SSIM values are mapped linearly to blur strengths between the maximum and minimum blur strengths. For example, with a determined low SSIM value of 0.25 and high SSIM value of 0.38, if calibration at an SSIM value of 0.25 gives a maximum blur strength of 9 and calibration at an SSIM value of 0.38 gives a minimum blur strength of 1, the correspondence curve between blur strength and SSIM value shown in FIG. 9 is obtained.
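The calibrated mapping just described (clamp to the maximum blur strength below the low SSIM value, clamp to the minimum above the high SSIM value, and interpolate linearly in between) can be sketched as follows, using the example calibration values 0.25 → 9 and 0.38 → 1 from the text:

```python
def blur_strength(ssim_value: float,
                  low_ssim: float = 0.25, high_ssim: float = 0.38,
                  max_blur: float = 9.0, min_blur: float = 1.0) -> float:
    # Below the low SSIM value: the images differ a lot -> strongest blur.
    if ssim_value <= low_ssim:
        return max_blur
    # Above the high SSIM value: the images are already similar -> weakest blur.
    if ssim_value >= high_ssim:
        return min_blur
    # Linear interpolation between (low_ssim, max_blur) and (high_ssim, min_blur).
    t = (ssim_value - low_ssim) / (high_ssim - low_ssim)
    return max_blur + t * (min_blur - max_blur)
```

This reproduces the shape of the correspondence curve of FIG. 9: flat at 9 below SSIM 0.25, flat at 1 above SSIM 0.38, and a straight line between the two calibration points.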
作为另一种示例，通常情况下由于用户拍摄获取第二图像的第二摄像头其焦距大于第一摄像头，第二摄像头的去噪能力相比于第一摄像头较强，因此，在因为增加感光度（ISO）而导致图像噪点增加时，第二摄像头拍摄获取的第二图像相比于第一摄像头拍摄获取的第一图像要更加清晰，因此，当第一图像和第二图像的感光度接近时，感光度越高则第二图像与第一图像的清晰度差异便越大。所以，在本申请实施例中，对第二图像进行模糊处理的预设规则，可以是根据第二图像对应的感光度（ISO），即拍摄获取第二图像时第二摄像头的感光度，来确定对应的模糊力度，然后根据相应的模糊力度采用模糊算法对第二图像进行模糊处理。从而避免在第一图像和第二图像的清晰度差异较小时，对第二图像过度模糊处理导致融合后的图像相比于第一图像清晰度没有得到提升的情况。As another example, since the second camera used to capture the second image usually has a longer focal length than the first camera, its denoising capability is stronger. Therefore, when image noise increases due to a higher sensitivity (ISO), the second image captured by the second camera is clearer than the first image captured by the first camera; when the sensitivities of the first image and the second image are close, the higher the sensitivity, the larger the sharpness difference between the second image and the first image. Accordingly, in the embodiments of this application, the preset rule for blurring the second image may be to determine the corresponding blur strength according to the sensitivity (ISO) corresponding to the second image, i.e., the sensitivity of the second camera when the second image was captured, and then blur the second image with a blur algorithm at that strength. This avoids the situation where, when the sharpness difference between the first image and the second image is small, over-blurring the second image leaves the fused image no sharper than the first image.
示例地，可以将感光度进行分段划分，然后从低到高按照感光度越高对应的模糊力度越大的规则设置不同感光度分段对应的模糊力度。其中，当模糊力度大于一定数值时，为了避免采用更高的模糊力度进行模糊处理，导致对第二图像过度模糊处理导致融合后的图像相比于第一图像清晰度没有得到提升的情况，可以将该模糊力度作为最大模糊力度，以使更高的感光度分段均与最大模糊力度对应。For example, the sensitivity range can be divided into segments, and, from low to high, the blur strengths for the segments are set following the rule that a higher sensitivity corresponds to a greater blur strength. When the blur strength exceeds a certain value, that value is taken as the maximum blur strength, so that all higher sensitivity segments correspond to the maximum blur strength; this avoids blurring the second image so strongly that the fused image is no sharper than the first image.
例如,可以将感光度划分为100-1000,1000-2000,2000-3000,3000-4000,4000-5000,5000-6000等。从而将感光度为100-1000时对应的模糊力度设置为1,将感光度为1000-2000时对应的模糊力度设置为3,将感光度为2000-3000时对应的模糊力度设置为5,将感光度为3000-4000时对应的模糊力度设置为7,将感光度为4000-5000时对应的模糊力度设置为9。示例地,上述示例的感光度分段及其对应的模糊力度的具体参数设置可以如下所示:For example, the sensitivity can be divided into 100-1000, 1000-2000, 2000-3000, 3000-4000, 4000-5000, 5000-6000 and so on. Therefore, set the corresponding blur strength when the sensitivity is 100-1000 to 1, set the corresponding blur strength to 3 when the sensitivity is 1000-2000, set the corresponding blur strength to 5 when the sensitivity is 2000-3000, and set When the sensitivity is 3000-4000, the corresponding blur strength is set to 7, and when the sensitivity is 4000-5000, the corresponding blur strength is set to 9. Exemplarily, the specific parameter settings of the sensitivity segment and its corresponding blur strength in the above example may be as follows:
[感光度分段及其对应模糊力度的XML参数配置，原文以图片形式给出。The XML parameter configuration of the sensitivity segments and their corresponding blur strengths is shown as an image in the original.]
其中，<?xml version="1.0" encoding="GB2312"?>表示该参数设置是以1.0版本的可扩展标记语言(extensible markup language,XML)文件的方式配置的。<iso100>等表示感光度分段的索引，<blur>1</blur>表示该感光度分段对应的模糊力度。Here, <?xml version="1.0" encoding="GB2312"?> indicates that the parameter settings are configured as a version 1.0 extensible markup language (XML) file; <iso100> and the like are the indexes of the sensitivity segments, and <blur>1</blur> is the blur strength corresponding to that sensitivity segment.
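The sensitivity-segment table encoded in the XML configuration described above can be expressed equivalently as a simple lookup; the same pattern applies to the ambient-brightness table later in this section. The Python rendering below is for illustration only (the patent stores this mapping as an XML file, not code):

```python
# Segment upper bounds and blur strengths from the example in the text.
ISO_BLUR_TABLE = [
    (1000, 1),  # ISO 100-1000  -> blur strength 1
    (2000, 3),  # ISO 1000-2000 -> blur strength 3
    (3000, 5),  # ISO 2000-3000 -> blur strength 5
    (4000, 7),  # ISO 3000-4000 -> blur strength 7
    (5000, 9),  # ISO 4000-5000 -> blur strength 9
]

def blur_for_iso(iso: int, max_blur: int = 9) -> int:
    for upper, blur in ISO_BLUR_TABLE:
        if iso <= upper:
            return blur
    # Higher sensitivity segments all map to the maximum blur strength,
    # as described above.
    return max_blur
```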
作为另一种示例，通常情况下由于用户拍摄获取第二图像的第二摄像头其焦距大于第一摄像头，第二摄像头的去噪能力相比于第一摄像头较强，因此，在因为增加环境亮度（light value,LV）而导致图像噪点增加时，第二摄像头拍摄获取的第二图像相比于第一摄像头拍摄获取的第一图像要更加清晰，因此，当第一图像和第二图像的环境亮度接近时，环境亮度越高则第二图像与第一图像的清晰度差异可能会越大。所以，在本申请实施例中，对第二图像进行模糊处理的预设规则，可以是根据第二图像对应的环境亮度来确定对应的模糊力度，然后根据相应的模糊力度采用模糊算法对第二图像进行模糊处理。从而避免在第一图像和第二图像的清晰度差异较小时，对第二图像过度模糊处理导致融合后的图像相比于第一图像清晰度没有得到提升的情况。As another example, since the second camera used to capture the second image usually has a longer focal length than the first camera, its denoising capability is stronger. Therefore, when image noise increases with increased ambient brightness (light value, LV), the second image captured by the second camera is clearer than the first image captured by the first camera; when the ambient brightnesses of the first image and the second image are close, the higher the ambient brightness, the larger the sharpness difference between the second image and the first image may be. Accordingly, in the embodiments of this application, the preset rule for blurring the second image may be to determine the corresponding blur strength according to the ambient brightness corresponding to the second image, and then blur the second image with a blur algorithm at that strength. This avoids the situation where, when the sharpness difference between the first image and the second image is small, over-blurring the second image leaves the fused image no sharper than the first image.
需要说明的是,环境亮度通常为手机根据环境光测量得到的环境光线的平均亮度。手机在使用摄像头进行拍摄时,手机采用的曝光参数可以根据环境亮度计算得到,即摄像头拍摄得到的图像的曝光参数是根据环境亮度计算得到的。因此,在本申请实施例中,可以根据第二图像的曝光参数得到环境亮度。It should be noted that the ambient brightness is usually the average brightness of the ambient light obtained by the mobile phone according to the ambient light measurement. When the mobile phone uses the camera to take pictures, the exposure parameters adopted by the mobile phone can be calculated according to the ambient brightness, that is, the exposure parameters of the image captured by the camera are calculated according to the ambient brightness. Therefore, in the embodiment of the present application, the ambient brightness can be obtained according to the exposure parameters of the second image.
For example, the ambient brightness may be divided into segments, and blur strengths may then be assigned to the segments, from low to high, following the rule that a higher ambient brightness corresponds to a greater blur strength. When the blur strength exceeds a certain value, that value may be taken as the maximum blur strength, so that all higher ambient brightness segments map to this maximum. This avoids applying an even greater blur strength that would over-blur the second image and leave the fused image no sharper than the first image.
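The segment lookup and the maximum-strength clamp described above can be sketched as follows. This is only an illustrative sketch: the segment boundaries and the strengths 1/3/5/7/9 are the example values used in this embodiment, not mandated ones.

```python
# Sketch of the brightness-segment lookup described above. The segment
# boundaries and strengths (1/3/5/7/9) are the illustrative values used
# in this embodiment; 9 is treated as the maximum blur strength so that
# all higher segments clamp to it.

LV_SEGMENTS = [           # (low, high, blur strength), low inclusive
    (100, 1000, 1),
    (1000, 2000, 3),
    (2000, 3000, 5),
    (3000, 4000, 7),
    (4000, 5000, 9),
]
MAX_BLUR_STRENGTH = 9


def blur_strength_for_lv(lv: float) -> int:
    """Map an ambient-brightness (LV) value to a blur strength."""
    for low, high, strength in LV_SEGMENTS:
        if low <= lv < high:
            return strength
    if lv >= LV_SEGMENTS[-1][1]:
        # Clamp: all higher brightness segments use the maximum strength,
        # avoiding over-blurring of the second image.
        return MAX_BLUR_STRENGTH
    return 0  # below the lowest segment: no blurring applied
```

The returned strength would then select the blur parameters (for example, the Gaussian matrix) actually applied to the second image.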
For example, the ambient brightness may be divided into segments of 100-1000, 1000-2000, 2000-3000, 3000-4000, 4000-5000, 5000-6000, and so on. The blur strength may then be set to 1 for ambient brightness 100-1000, 3 for 1000-2000, 5 for 2000-3000, 7 for 3000-4000, and 9 for 4000-5000. As an example, the specific parameter settings for these ambient brightness segments and their corresponding blur strengths may be as follows:
Figure PCTCN2022093613-appb-000007
Figure PCTCN2022093613-appb-000008
Here, <?xml version="1.0" encoding="GB2312"?> indicates that the parameter settings are configured as a version 1.0 extensible markup language (XML) file. <lv 100> and the like denote the index of an ambient brightness segment, and <blur>1</blur> denotes the blur strength corresponding to that segment.
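A configuration of this shape could be parsed as sketched below. Note two assumptions: the literal `<lv 100>` form quoted above is not well-formed XML, so an `index` attribute is assumed here, and UTF-8 is substituted for the declared GB2312 encoding because this ASCII-only sample does not need a Chinese code page.

```python
import xml.etree.ElementTree as ET

# Hypothetical, well-formed rendering of the configuration file described
# above. An "index" attribute replaces the quoted "<lv 100>" shorthand,
# and UTF-8 replaces the declared GB2312 for this ASCII-only sample.
CONFIG = b"""<?xml version="1.0" encoding="UTF-8"?>
<blur_config>
  <lv index="100"><blur>1</blur></lv>
  <lv index="1000"><blur>3</blur></lv>
  <lv index="2000"><blur>5</blur></lv>
  <lv index="3000"><blur>7</blur></lv>
  <lv index="4000"><blur>9</blur></lv>
</blur_config>
"""


def load_blur_table(xml_bytes: bytes) -> dict:
    """Parse segment-index -> blur-strength pairs from the XML config."""
    root = ET.fromstring(xml_bytes)
    return {int(seg.get("index")): int(seg.findtext("blur"))
            for seg in root.iter("lv")}
```

Loading the table once at camera start-up would let the blur strength be looked up per shot from the measured ambient brightness.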
Optionally, the blur parameters corresponding to different blur strengths may be determined according to the specific blur algorithm being used.
For example, taking Gaussian blur as the blur algorithm, the Gaussian blur formula may be as follows:
G(u, v) = \frac{1}{2\pi\sigma^{2}} e^{-\frac{u^{2}+v^{2}}{2\sigma^{2}}}
where u² + v² is the squared blur radius and σ is the standard deviation of the normal distribution.
If the standard deviation is set to 1, then, as an example, the Gaussian matrix for Gaussian blur with a blur strength of 3 can be obtained from the above formula as:
Figure PCTCN2022093613-appb-000010
That is, when the blur strength is 3 and Gaussian blur is used, the blurring can be performed with the above Gaussian matrix.
When the blur strength is 5, the Gaussian matrix for Gaussian blur is:
Figure PCTCN2022093613-appb-000011
That is, when the blur strength is 5 and Gaussian blur is used, the blurring can be performed with the above Gaussian matrix.
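The Gaussian matrices above can be reproduced numerically. The sketch below assumes, based on the strength-3 and strength-5 examples, that a blur strength of n corresponds to an n×n kernel, and normalizes the weights so they sum to 1.

```python
import math

# Numerical sketch of the Gaussian matrices above. A blur strength of n
# is assumed to mean an n x n kernel (matching the 3x3 and 5x5 examples),
# and the weights are normalized so that overall brightness is preserved.


def gaussian_kernel(strength: int, sigma: float = 1.0) -> list:
    """Build a normalized strength x strength Gaussian blur matrix."""
    if strength < 1 or strength % 2 == 0:
        raise ValueError("blur strength must be a positive odd number")
    half = strength // 2
    raw = [[math.exp(-(u * u + v * v) / (2.0 * sigma * sigma))
            / (2.0 * math.pi * sigma * sigma)
            for v in range(-half, half + 1)]
           for u in range(-half, half + 1)]
    total = sum(sum(row) for row in raw)
    return [[w / total for w in row] for row in raw]
```

For strength 3 and σ = 1 this gives a symmetric 3×3 matrix with a center weight of about 0.204; convolving the second image with such a matrix performs the blurring.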
S703. Fuse the third image with the first image.
Optionally, when fusing the third image with the first image, the third image may be superimposed on the part of the first image that overlaps with the content of the third image, or the third image may directly replace that overlapping part of the first image, or other fusion algorithms may be used; no limitation is imposed here.
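The "replace the overlapping region" option can be sketched minimally as follows, with grayscale images represented as nested lists. The images and the (top, left) placement are purely illustrative; a real pipeline would first register and align the two camera outputs.

```python
# Minimal sketch of the region-replacement fusion option described above.
# The stand-in images and offsets are illustrative only.


def fuse_replace(first, third, top, left):
    """Return a copy of `first` whose region at (top, left) is `third`."""
    fused = [row[:] for row in first]  # copy so the input stays intact
    for r, row in enumerate(third):
        for c, pixel in enumerate(row):
            fused[top + r][left + c] = pixel
    return fused


wide = [[10] * 6 for _ in range(6)]   # stand-in for the first image
tele = [[99] * 2 for _ in range(2)]   # stand-in for the third image
fused = fuse_replace(wide, tele, 2, 2)
```

The superposition option differs only in blending the overlapping pixels (for example, a weighted average) instead of overwriting them.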
After the third image and the first image are fused, the resulting image (that is, the fourth image) may be saved as the captured image.
For example, when the mobile phone adopts the system architecture shown in Figure 6, the camera algorithm module may, according to the above embodiments, invoke the IVP, DSP, or CPU: it sends the SSIM value of the first and second images together with the curve relating SSIM value to blur strength (or the configuration parameters relating the sensitivity of the second image to blur strength, or those relating the ambient brightness of the second image to blur strength) to the IVP or DSP, so that the IVP or DSP can determine from these parameters the blur strength to apply to the second image. The IVP or DSP may return the determined blur strength to the camera algorithm module, which may then send the determined blur strength together with the first and second images to the GPU, invoking the GPU to blur the second image at that strength to obtain the third image and to fuse the first image with the third image to obtain the captured image.
The GPU may return the captured image to the camera algorithm module, which can pass it through the camera hardware abstraction layer and the framework layer to the camera application deployed in the application layer, so that the camera application can display and/or store the received captured image. Of course, the above is only an example; in other embodiments of the present application, the camera algorithm module may also flexibly invoke the IVP, DSP, CPU, GPU, and so on to determine the blur strength, blur the second image, and fuse the resulting third image with the first image. Therefore, no specific limitation is placed here on how the camera algorithm module schedules the relevant hardware to implement the methods of the above embodiments; this can be configured according to the processing capabilities and functions of the IVP, DSP, CPU, GPU, and other hardware.
Optionally, in this embodiment of the present application, if the first image is an image captured by the first camera and then adjusted by digital zoom to the zoom factor set by the user (that is, the current zoom factor), the image obtained by fusing the third image with the first image can serve as the final captured image. If the first image is an image captured by the first camera at the zoom factor matching the first camera's focal length, then during fusion the first image may first be adjusted by digital zoom to the zoom factor set by the user (that is, the current zoom factor) and then fused with the third image, so that the fused image matches the user's zoom factor and serves as the final captured image. Hence, no limitation is imposed here on when the image is adjusted by digital zoom so that the final image matches the zoom factor set by the user.
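Digital zoom as used here can be sketched as a center crop followed by upscaling back to the original size. Nearest-neighbour upscaling is chosen below only for brevity; a real pipeline would use a higher-quality interpolator.

```python
# Rough sketch of digital zoom: center-crop by the zoom factor, then
# upscale back to the original resolution with nearest-neighbour
# sampling. Illustrative only; real pipelines interpolate more carefully.


def digital_zoom(image, factor):
    """Center-crop `image` by `factor`, then upscale to the original size."""
    h, w = len(image), len(image[0])
    ch, cw = max(1, round(h / factor)), max(1, round(w / factor))
    top, left = (h - ch) // 2, (w - cw) // 2
    crop = [row[left:left + cw] for row in image[top:top + ch]]
    return [[crop[r * ch // h][c * cw // w] for c in range(w)]
            for r in range(h)]


src = [[r * 4 + c for c in range(4)] for r in range(4)]
zoomed = digital_zoom(src, 2.0)   # 2x digital zoom on a 4x4 test image
```

Applying this to the first image before fusion brings it to the current zoom factor so that it lines up with the third image.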
With the method of the above embodiments, when the electronic device shoots with two cameras and fuses the separately captured images into one captured image, the device reduces the sharpness of the sharper of the two acquired images by blurring it. This narrows the sharpness gap between the two cameras' images caused by differences in resolution and noise reduction capability. Fusing two images with a small sharpness difference then yields a captured image with an inconspicuous fusion boundary and a weaker sense of stitching.
Corresponding to the methods in the foregoing embodiments, an embodiment of the present application further provides a photographing apparatus. The apparatus can be applied to the above electronic device to implement the methods in the foregoing embodiments. The functions of the apparatus may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the above functions. For example, Figure 10 shows a schematic structural diagram of a photographing apparatus. As shown in Figure 10, the apparatus includes a processing module 1001, a display module 1002, and so on. The processing module 1001 and the display module 1002 may cooperate to implement the related methods in the above embodiments.
It should be understood that the division into units or modules (hereinafter, units) in the above apparatus is merely a division by logical function; in actual implementation they may be fully or partially integrated into one physical entity or physically separated. The units in the apparatus may all be implemented as software invoked by a processing element, all as hardware, or partly as software invoked by a processing element and partly as hardware.
For example, each unit may be a separately established processing element, or may be integrated into a chip of the apparatus; a unit may also be stored in memory as a program, to be invoked by a processing element of the apparatus to execute the unit's function. In addition, these units may be fully or partially integrated, or implemented independently. The processing element described here, which may also be called a processor, may be an integrated circuit with signal processing capability. During implementation, the steps of the above method, or the above units, may be implemented by hardware integrated logic circuits in the processor element, or in the form of software invoked by the processing element.
In one example, the units in the above apparatus may be one or more integrated circuits configured to implement the above method, for example one or more ASICs, or one or more DSPs, or one or more FPGAs, or a combination of at least two of these integrated circuit forms.
As another example, when the units in the apparatus are implemented in the form of a processing element scheduling a program, the processing element may be a general-purpose processor, such as a CPU or another processor capable of invoking a program. As yet another example, these units may be integrated together and implemented as a system-on-a-chip (SOC).
In one implementation, the units by which the above apparatus implements the corresponding steps of the above method may be realized as a processing element scheduling a program. For example, the apparatus may include a processing element and a storage element, where the processing element invokes a program stored in the storage element to execute the methods described in the above method embodiments. The storage element may be on the same chip as the processing element, that is, an on-chip storage element.
In another implementation, the program for executing the above method may reside in a storage element on a different chip from the processing element, that is, an off-chip storage element. In this case, the processing element invokes or loads the program from the off-chip storage element onto the on-chip storage element, so as to invoke and execute the methods described in the above method embodiments.
For example, an embodiment of the present application may further provide an apparatus, such as an electronic device, which may include a processor and a memory for storing instructions executable by the processor. When the processor is configured to execute the instructions, the electronic device implements the photographing method implemented by the electronic device in the foregoing embodiments. The memory may be located inside or outside the electronic device, and there may be one or more processors.
In yet another implementation, the units by which the apparatus implements the steps of the above method may be configured as one or more processing elements, which may be disposed on the corresponding electronic device described above. The processing elements here may be integrated circuits, for example one or more ASICs, or one or more DSPs, or one or more FPGAs, or a combination of these types of integrated circuits. These integrated circuits may be integrated together to form a chip.
For example, an embodiment of the present application further provides a chip system that can be applied to the above electronic device. The chip system includes one or more interface circuits and one or more processors, interconnected by lines; the processor receives and executes computer instructions from the memory of the electronic device through the interface circuits, so as to implement the methods related to the electronic device in the above method embodiments.
An embodiment of the present application further provides a computer program product, including computer instructions run by an electronic device such as the above electronic device.
From the description of the above implementations, those skilled in the art will clearly understand that, for convenience and brevity of description, only the division into the above functional modules is used as an example. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative; the division into modules or units is only a division by logical function, and other divisions are possible in actual implementation. For example, multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not implemented. Furthermore, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and a component shown as a unit may be one physical unit or multiple physical units; that is, it may be located in one place or distributed across multiple places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, may exist separately physically, or two or more units may be integrated into one unit. The above integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented as a software functional unit and sold or used as an independent product, it may be stored in a readable storage medium. Based on this understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product, such as a program. The software product is stored in a program product, such as a computer-readable storage medium, and includes several instructions for causing a device (which may be a microcontroller, a chip, or the like) or a processor to execute all or some of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disc.
For example, an embodiment of the present application may further provide a computer-readable storage medium on which computer program instructions are stored. When the computer program instructions are executed by an electronic device, the electronic device implements the photographing method described in the foregoing method embodiments.
The above are only specific implementations of the present application, but the protection scope of the present application is not limited thereto. Any changes or substitutions within the technical scope disclosed in the present application shall be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (15)

  1. A photographing method, applied to an electronic device, the electronic device comprising a first camera and a second camera, wherein the field of view of the first camera differs from the field of view of the second camera, the method comprising:
    the electronic device starting a camera;
    displaying a preview interface, the preview interface comprising a first control;
    detecting a first operation on the first control;
    in response to the first operation, the first camera capturing a first image and the second camera capturing a second image, wherein the sharpness of the second image is higher than the sharpness of the first image;
    blurring the second image to obtain a third image;
    fusing the third image with the first image to obtain a fourth image;
    saving the fourth image.
  2. The method according to claim 1, wherein the field of view of the first camera is larger than the field of view of the second camera.
  3. The method according to claim 1 or 2, wherein blurring the second image to obtain the third image comprises:
    determining a blur strength according to the similarity between the second image and the first image, following a preset correspondence between similarity and blur strength;
    blurring the second image according to the determined blur strength.
  4. The method according to claim 3, wherein the similarity is a structural similarity (SSIM) value.
  5. The method according to claim 3, wherein the similarity is inversely proportional to the blur strength.
  6. The method according to claim 1 or 2, wherein blurring the second image to obtain the third image comprises:
    determining a blur strength according to the sensitivity corresponding to the second image, following a preset correspondence between sensitivity and blur strength;
    blurring the second image according to the determined blur strength.
  7. The method according to claim 6, wherein the sensitivity is directly proportional to the blur strength.
  8. The method according to claim 1 or 2, wherein blurring the second image to obtain the third image comprises:
    determining a blur strength according to the ambient brightness corresponding to the second image, following a preset correspondence between ambient brightness and blur strength;
    blurring the second image according to the determined blur strength.
  9. The method according to claim 8, wherein the ambient brightness is directly proportional to the blur strength.
  10. The method according to any one of claims 1 to 9, wherein the blurring comprises any one of the following: Gaussian blur, surface blur, box blur, Kawase blur, dual blur, bokeh blur, tilt-shift blur, iris blur, grainy blur, radial blur, or directional blur.
  11. The method according to claim 1, wherein the first image is an image captured by the first camera and adjusted by digital zoom to the current zoom factor.
  12. The method according to claim 1, wherein the first image is an image directly captured by the first camera, and fusing the third image with the first image to obtain the fourth image comprises:
    performing digital zoom on the first image to adjust the first image to the current zoom factor;
    fusing the third image with the digitally zoomed first image to obtain the fourth image.
  13. An electronic device, comprising a processor and a memory for storing instructions executable by the processor, wherein the processor is configured to execute the instructions so that the electronic device implements the method according to any one of claims 1 to 12.
  14. A computer-readable storage medium on which computer program instructions are stored, wherein, when the computer program instructions are executed by an electronic device, the electronic device implements the method according to any one of claims 1 to 12.
  15. A computer program product comprising computer-readable code which, when run in an electronic device, causes the electronic device to implement the method according to any one of claims 1 to 12.
PCT/CN2022/093613 2021-08-11 2022-05-18 Image capture method and device WO2023016025A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110919953.8 2021-08-11
CN202110919953.8A CN113810598B (en) 2021-08-11 2021-08-11 Photographing method, electronic device and storage medium

Publications (1)

Publication Number Publication Date
WO2023016025A1 true WO2023016025A1 (en) 2023-02-16

Family

ID=78893436

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/093613 WO2023016025A1 (en) 2021-08-11 2022-05-18 Image capture method and device

Country Status (2)

Country Link
CN (1) CN113810598B (en)
WO (1) WO2023016025A1 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113810598B (en) * 2021-08-11 2022-11-22 荣耀终端有限公司 Photographing method, electronic device and storage medium
CN116723394B (en) * 2022-02-28 2024-05-10 荣耀终端有限公司 Multi-shot strategy scheduling method and related equipment thereof
CN114782296B (en) * 2022-04-08 2023-06-09 荣耀终端有限公司 Image fusion method, device and storage medium
CN116245741B (en) * 2022-06-28 2023-11-17 荣耀终端有限公司 Image processing method and related device
CN116051368B (en) * 2022-06-29 2023-10-20 荣耀终端有限公司 Image processing method and related device
CN115348390A (en) * 2022-08-23 2022-11-15 维沃移动通信有限公司 Shooting method and shooting device
CN116051435B (en) * 2022-08-23 2023-11-07 荣耀终端有限公司 Image fusion method and electronic equipment
CN117835077A (en) * 2022-09-27 2024-04-05 华为终端有限公司 Shooting method, electronic equipment and medium
CN117729445A (en) * 2024-02-07 2024-03-19 荣耀终端有限公司 Image processing method, electronic device and computer readable storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180048832A1 (en) * 2015-10-06 2018-02-15 Light Labs Inc. Methods and apparatus for facilitating selective blurring of one or more image portions
CN110290300A (en) * 2019-06-28 2019-09-27 Oppo广东移动通信有限公司 Equipment imaging method, device, storage medium and electronic equipment
CN112995467A (en) * 2021-02-05 2021-06-18 深圳传音控股股份有限公司 Image processing method, mobile terminal and storage medium
CN113012085A (en) * 2021-03-18 2021-06-22 维沃移动通信有限公司 Image processing method and device
CN113810598A (en) * 2021-08-11 2021-12-17 荣耀终端有限公司 Photographing method and device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107959778B (en) * 2017-11-30 2019-08-20 Oppo广东移动通信有限公司 Imaging method and device based on dual camera
CN112188096A (en) * 2020-09-27 2021-01-05 北京小米移动软件有限公司 Photographing method and device, terminal and storage medium


Also Published As

Publication number Publication date
CN113810598B (en) 2022-11-22
CN113810598A (en) 2021-12-17


Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE