CN112004029A - Exposure processing method, exposure processing device, electronic apparatus, and computer-readable storage medium - Google Patents


Info

Publication number: CN112004029A (granted publication: CN112004029B)
Application number: CN201910447895.6A
Authority: CN (China)
Original language: Chinese (zh)
Inventor: 陈伟
Assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Prior art keywords: image, camera, parameter value, cameras, exposure
Legal status: Granted, Active

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70: Circuitry for compensating brightness variation in the scene
    • H04N23/71: Circuitry for evaluating the brightness variation
    • H04N23/75: Circuitry for compensating brightness variation in the scene by influencing optical camera components

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application relates to an exposure processing method and apparatus, an electronic device, and a computer-readable storage medium. The method comprises: controlling each of at least two first-type cameras to shoot the same shooting scene to respectively obtain a corresponding original image; aligning all of the captured original images and determining the region where all of the original images mutually overlap as a first photometric area; performing photometric processing on the first photometric area to obtain a first luminance parameter value, and determining exposure parameter values of the at least two first-type cameras according to the obtained first luminance parameter value; and controlling the at least two first-type cameras to expose according to the obtained exposure parameter values. The exposure processing method and apparatus, electronic device, and computer-readable storage medium can improve the accuracy of exposure processing.

Description

Exposure processing method, exposure processing device, electronic apparatus, and computer-readable storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an exposure processing method and apparatus, an electronic device, and a computer-readable storage medium.
Background
A camera generally consists of a lens and an image sensor: the lens collects light from the environment, and the image sensor converts that light into an electrical signal. During this conversion, the longer the image sensor senses light and the more light energy it receives, the stronger the resulting electrical signal.
The process by which the image sensor senses light is exposure; the longer the exposure time, the more light energy is received. However, if the exposure time is too long, the resulting image may be overexposed and too bright; if it is too short, the image may be underexposed and too dark.
Disclosure of Invention
The embodiments of the present application provide an exposure processing method and apparatus, an electronic device, and a computer-readable storage medium, which can improve the accuracy of exposure processing.
An exposure processing method comprising:
controlling each of at least two first-type cameras to shoot the same shooting scene to respectively obtain a corresponding original image, wherein the fields of view of the first-type cameras mutually overlap;
aligning all of the captured original images, and determining the region where all of the original images mutually overlap as a first photometric area;
performing photometric processing on the first photometric area to obtain a first luminance parameter value, and determining exposure parameter values of the at least two first-type cameras according to the obtained first luminance parameter value; and
controlling the at least two first-type cameras to expose according to the obtained exposure parameter values.
An exposure processing apparatus includes:
an image shooting module, configured to control each of at least two first-type cameras to shoot the same shooting scene to respectively obtain a corresponding original image, wherein the fields of view of the first-type cameras mutually overlap;
a region determining module, configured to align all of the captured original images and determine the region where all of the original images mutually overlap as a first photometric area;
a parameter value acquisition module, configured to perform photometric processing on the first photometric area to obtain a first luminance parameter value, and determine exposure parameter values of the at least two first-type cameras according to the obtained first luminance parameter value; and
an exposure control module, configured to control the at least two first-type cameras to expose according to the obtained exposure parameter values.
An electronic device comprising a memory and a processor, the memory having stored therein a computer program that, when executed by the processor, causes the processor to perform the steps of:
controlling each of at least two first-type cameras to shoot the same shooting scene to respectively obtain a corresponding original image, wherein the fields of view of the first-type cameras mutually overlap;
aligning all of the captured original images, and determining the region where all of the original images mutually overlap as a first photometric area;
performing photometric processing on the first photometric area to obtain a first luminance parameter value, and determining exposure parameter values of the at least two first-type cameras according to the obtained first luminance parameter value; and
controlling the at least two first-type cameras to expose according to the obtained exposure parameter values.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
controlling each of at least two first-type cameras to shoot the same shooting scene to respectively obtain a corresponding original image, wherein the fields of view of the first-type cameras mutually overlap;
aligning all of the captured original images, and determining the region where all of the original images mutually overlap as a first photometric area;
performing photometric processing on the first photometric area to obtain a first luminance parameter value, and determining exposure parameter values of the at least two first-type cameras according to the obtained first luminance parameter value; and
controlling the at least two first-type cameras to expose according to the obtained exposure parameter values.
With the above exposure processing method and apparatus, electronic device, and computer-readable storage medium, each of the at least two first-type cameras is controlled to capture a corresponding original image. All of the captured original images are then aligned, and the region where they mutually overlap is determined as a first photometric area. Photometric processing is performed on the first photometric area to obtain a first luminance parameter value, from which an exposure parameter value is determined. Finally, the at least two first-type cameras are controlled to expose according to the obtained exposure parameter value. In this way, the exposure parameter values of the cameras are adjusted based on the mutually overlapping region of the original images collected by the different cameras, ensuring exposure consistency across cameras, preventing different cameras from collecting images with different degrees of exposure, and improving exposure accuracy.
Drawings
To illustrate the technical solutions in the embodiments of the present application or in the prior art more clearly, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a diagram of an application environment of an exposure processing method in one embodiment;
FIG. 2 is a flow diagram of a method of exposure processing in one embodiment;
FIG. 3 is a schematic view of a first photometric area in one embodiment;
FIG. 4 is a diagram showing a hardware configuration for implementing an exposure processing method according to an embodiment;
FIG. 5 is a schematic illustration of registration of an original image with a reference image in one embodiment;
FIG. 6 is a schematic diagram of registration of an original image and a reference image in another embodiment;
FIG. 7 is a flowchart illustrating an exposure processing method according to another embodiment;
FIG. 8 is a block diagram showing the structure of an exposure processing apparatus according to an embodiment;
FIG. 9 is a schematic diagram of an image processing circuit in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first type of camera may be referred to as a second type of camera, and similarly, a second type of camera may be referred to as a first type of camera, without departing from the scope of the present application. Both the first type of camera and the second type of camera are cameras, but they are not the same type of camera.
When capturing images with multiple cameras, a wider scene can be covered by having each camera shoot a different scene area and then stitching the images captured by the multiple cameras into one image. In the traditional stitching approach, because the overlapping area between images is small, each camera obtains its exposure parameters from its own captured image. If the brightness of the images captured by the cameras differs greatly, each camera's adjusted exposure parameters will differ, and the resulting stitched image will look inconsistent as a whole.
FIG. 1 is a diagram of an application environment of the exposure processing method in one embodiment. As shown in FIG. 1, the application environment includes an electronic device 10 on which a camera 12 and a camera 14 are installed. In this environment, the camera 12 and the camera 14 can shoot the same shooting scene to obtain corresponding original images, namely an image 102 and an image 104; the image 102 and the image 104 are aligned, and the region where they mutually overlap is determined as a first photometric area; photometric processing is performed on the first photometric area to obtain a first luminance parameter value, and exposure parameter values of the two cameras are determined according to the obtained first luminance parameter value; and the camera 12 and the camera 14 are controlled to expose according to the obtained exposure parameter values. The electronic device 10 may be, but is not limited to, a mobile phone, a desktop computer, a tablet computer, a wearable device, a personal digital assistant, and the like.
FIG. 2 is a flow diagram of an exposure processing method in one embodiment. As shown in fig. 2, the exposure processing method includes steps 202 to 208. Wherein:
Step 202: control each of at least two first-type cameras to shoot the same shooting scene to respectively obtain a corresponding original image, wherein the fields of view of the first-type cameras mutually overlap.
A camera may consist of a lens and an image sensor: the lens collects light from the shooting scene, and the image sensor converts the collected light into an electrical signal, thereby generating an image. Cameras may be of different types, for example laser cameras, visible-light cameras, infrared cameras, and so on, without limitation; different types of cameras sense different types of light.
The number of cameras installed in the electronic device is not limited here; one or more cameras may be installed, and images are captured through the installed camera or cameras. In this embodiment, the electronic device is equipped with at least two first-type cameras, and it can control each first-type camera to capture a corresponding original image.
It can be understood that the fields of view of the first-type cameras overlap each other, that is, at least part of their fields of view coincide. All of the first-type cameras are controlled to shoot the same shooting scene, but different first-type cameras may cover different field ranges. As a result, the original images captured by different first-type cameras do not completely coincide and may exhibit parallax.
Specifically, when the electronic device controls the first-type cameras to shoot, it may control them to shoot one after another in a given order, or control all of them to shoot at the same time, which is not limited here. For example, one processor may control all of the first-type cameras, sending control signals in turn so that the cameras shoot sequentially; alternatively, each first-type camera may be connected to its own processor, and the processors may send control signals to their connected cameras simultaneously so that the cameras shoot at the same time.
Step 204: align all of the captured original images, and determine the region where all of the original images mutually overlap as a first photometric area.
Alignment refers to making the pixels in multiple images correspond to one another. In a specific implementation, alignment may associate the pixels that represent the same information in the shooting scene across the multiple images. For example, when aligning two images, the information in one image may be compared with the information in the other: feature points are detected in both images, the translation between the two images is determined from matching feature points, and the two images are then brought into correspondence according to the determined translation.
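The translation step described above can be sketched in a few lines. This is a minimal illustration, not the patent's implementation: it assumes matched feature points are already available (the function name and the sample points are invented for the example), and it takes the median of the per-point offsets as the translation so that a few bad matches do not skew the result.

```python
import numpy as np

def estimate_translation(points_a, points_b):
    """Estimate the translation that maps image A onto image B from
    matched feature points (one (x, y) pair per correspondence).
    The median of the per-point offsets is robust to a few outliers."""
    offsets = np.asarray(points_b, dtype=float) - np.asarray(points_a, dtype=float)
    return np.median(offsets, axis=0)

# Hypothetical matched corners of the same objects seen by two cameras
# whose views differ by a pure translation (parallax).
pts_a = [(10, 20), (50, 80), (30, 40)]
pts_b = [(110, 320), (150, 380), (130, 340)]
print(estimate_translation(pts_a, pts_b))  # translation of 100 px in x, 300 px in y
```

In practice a feature detector and matcher would supply the point pairs; the estimation step itself stays this simple when the cameras differ only by translation.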
As another example, the cameras installed on the electronic device have overlapping field ranges, and the positional relationship between images acquired by different cameras can be calibrated before the electronic device leaves the factory. Alignment can then be performed directly according to this calibrated positional relationship when images are captured, without searching for feature points again.
After all of the captured original images are aligned, pixels representing the same information in different original images are brought to the same position, thereby determining the positional relationship among the original images. Once the original images are aligned, the overlapping regions between different original images can be determined, and thus the region where all of the original images mutually overlap can be found.
The region where all of the original images mutually overlap is the region that exists in every original image. For example, if four original images are captured by the first-type cameras, the mutually overlapping region is the region present in all four. This region, present in all of the original images, is taken as the first photometric area.
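Under the assumption that each aligned original occupies an axis-aligned rectangle in a shared coordinate frame, the region present in every image can be found by intersecting the rectangles. A minimal sketch (the coordinates are illustrative, not from the patent):

```python
def overlap_region(rects):
    """Intersect axis-aligned rectangles (x0, y0, x1, y1) placed in a
    shared coordinate frame after alignment.  Returns the region present
    in every image, i.e. the first photometric area, or None if the
    images share no common area."""
    x0 = max(r[0] for r in rects)
    y0 = max(r[1] for r in rects)
    x1 = min(r[2] for r in rects)
    y1 = min(r[3] for r in rects)
    return (x0, y0, x1, y1) if x0 < x1 and y0 < y1 else None

# Four aligned originals offset toward the four corners of the scene.
rects = [(0, 0, 800, 600), (200, 0, 1000, 600),
         (0, 150, 800, 750), (200, 150, 1000, 750)]
print(overlap_region(rects))  # (200, 150, 800, 600)
```

The max-of-mins pattern generalizes to any number of cameras, which matches the four-image example in the text.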
Step 206: perform photometric processing on the first photometric area to obtain a first luminance parameter value, and determine exposure parameter values of the at least two first-type cameras according to the obtained first luminance parameter value.
Photometric processing measures the brightness of the light reflected by the subject, that is, the brightness of the captured scene. The first luminance parameter value indicates the brightness level of the captured scene and can be measured from the first photometric area. For example, the luminance values of all pixels in the first photometric area may be collected, and their average taken as the first luminance parameter value. Alternatively, a luminance histogram may be generated from the luminance values of all pixels in the first photometric area, and the first luminance parameter value calculated from the histogram. The first luminance parameter value may also be calculated in other ways, which are not limited here.
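The first strategy mentioned above, averaging every pixel in the first photometric area, can be sketched as follows; the function name and sample values are illustrative, not code from the patent:

```python
import numpy as np

def first_luminance(region_pixels):
    """Average the luminance of every pixel in the first photometric
    area -- the simplest of the statistics the text mentions."""
    return float(np.mean(region_pixels))

# A tiny 2x2 patch of luminance values standing in for the overlap region.
region = np.array([[100, 120], [140, 160]], dtype=np.uint8)
print(first_luminance(region))  # 130.0
```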
It can be understood that, when the camera acquires an image, the lens collects light from the shooting scene and the image sensor converts the light energy into an electrical signal. During this conversion, the longer the sensor senses light and the more light energy it receives, the stronger the resulting electrical signal.
The process by which the image sensor senses light is exposure. During exposure, both the amount of light admitted and the duration over which light is sensed affect the energy collected, and thus the strength of the converted electrical signal. Therefore, the exposure parameter value may include at least one of an exposure duration and an aperture parameter value: the exposure duration controls how long the image sensor is exposed, and the aperture parameter value controls the amount of light entering the lens.
Step 208: control the at least two first-type cameras to expose according to the obtained exposure parameter values.
An exposure parameter value is obtained from the first luminance parameter value measured over the first photometric area, and the at least two first-type cameras are then controlled to expose according to it. For example, if the luminance measured in the first photometric area is relatively high, the exposure duration of the first-type cameras can be reduced accordingly, lowering the brightness of the captured images; alternatively, the aperture parameter value can be adjusted to reduce the amount of light entering the first-type cameras, likewise lowering the brightness of the captured images.
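One simple way to realize "reduce the exposure duration when the metered luminance is high" is to scale the exposure time toward a target luminance. The proportional model and the mid-grey target of 118 below are assumptions for illustration, not values from the patent:

```python
def next_exposure_time(current_us, measured_luma, target_luma=118):
    """Scale the exposure time so the next frame moves toward a target
    brightness.  Assumes luminance is roughly proportional to exposure
    time; the mid-grey target of 118 is an illustrative choice."""
    return current_us * target_luma / measured_luma

# Scene metered at twice the target brightness: halve the exposure time.
print(next_exposure_time(10000, 236))  # 5000.0 microseconds
```

Real auto-exposure loops typically damp this update and clamp it to the sensor's supported exposure range, details the patent leaves open.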
It can be understood that when the first-type cameras are cameras of the same specification, the exposure parameter values obtained for all of them are the same. Controlling the exposure of all first-type cameras with the same exposure parameter values lets every first-type camera collect images with the same exposure, ensuring consistency among the captured images.
With this exposure processing method, each of the at least two first-type cameras is controlled to capture a corresponding original image. All of the captured original images are then aligned, and the region where they mutually overlap is determined as a first photometric area. Photometric processing is performed on the first photometric area to obtain a first luminance parameter value, from which an exposure parameter value is determined. Finally, the at least two first-type cameras are controlled to expose according to the obtained exposure parameter value. In this way, the exposure parameter values of the cameras are adjusted based on the mutually overlapping region of the original images collected by the different cameras, ensuring exposure consistency across cameras, preventing different cameras from collecting images with different degrees of exposure, and improving exposure accuracy.
FIG. 3 is a diagram of the first photometric area in one embodiment. As shown in FIG. 3, each first-type camera captures a corresponding original image, yielding four original images in total. After the four original images are aligned, the region where all four overlap is determined as the first photometric area 302.
In an embodiment, the step of aligning the original images may specifically include: obtaining the alignment parameters of the at least two first-type cameras, and aligning all of the captured original images according to the alignment parameters.
When original images are collected by at least two first-type cameras, there is a certain parallax between different first-type cameras, and their images can be aligned only after a certain translation. Generally, before the electronic device leaves the factory, the cameras are calibrated to determine the positional relationship between different cameras, and hence between the images they acquire.
The alignment parameters are the parameters used to align the images respectively acquired by the at least two first-type cameras. They can be determined by calibration before the original images are captured, so that after capture the original images can be aligned directly according to them. For example, with two images placed at the origin of the same coordinate system xoy, the alignment parameters may represent the translation between the images; for instance, translating one of the images by 100 pixels in the positive x direction and 300 pixels in the positive y direction brings the two images into alignment.
It can be understood that the position of a first-type camera may be fixed or movable, which is not limited here. If the first-type cameras are fixed, the alignment parameters are also fixed; if they are movable, the alignment parameters can be determined from the position a camera has moved to, with different positions corresponding to different alignment parameters.
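With fixed cameras, applying calibrated alignment parameters amounts to placing each original image onto a shared canvas at its stored offset. A minimal numpy sketch under that assumption (the canvas size and offsets are illustrative, not calibration data from the patent):

```python
import numpy as np

def align_with_calibration(img, dx, dy, canvas_shape):
    """Place an original image onto a shared canvas at its calibrated
    offset (dx, dy), as when alignment parameters are fixed at the
    factory.  No feature search is needed at capture time."""
    canvas = np.zeros(canvas_shape, dtype=img.dtype)
    h, w = img.shape
    canvas[dy:dy + h, dx:dx + w] = img
    return canvas

img = np.ones((2, 2), dtype=np.uint8)
canvas = align_with_calibration(img, dx=1, dy=1, canvas_shape=(4, 4))
print(int(canvas.sum()))  # 4 -- the 2x2 image placed at offset (1, 1)
```

Doing this for every original image puts them all in one coordinate frame, after which the overlap intersection described earlier applies directly.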
In an embodiment provided by the present application, the step of aligning the original images may further include: controlling a second-type camera to shoot the shooting scene to obtain a reference image, registering each captured original image with the reference image, and aligning all of the original images according to the registration results, wherein the field of view of the second-type camera is larger than that of any first-type camera and overlaps the field of view of every first-type camera.
Specifically, the electronic device may further include a second-type camera, which shoots the shooting scene to obtain the reference image. Since the field of view of the second-type camera is larger than that of any first-type camera, the reference image contains more image information than any single original image and can serve as the reference when aligning all of the original images.
Registering each captured original image with the reference image may specifically involve identifying feature points in each original image and in the reference image, and then comparing them. Corresponding feature points in an original image and the reference image are determined, and the positional relationship between each original image and the reference image is derived from these correspondences, so that all of the original images can be aligned.
The first-type cameras may be telephoto cameras and the second-type camera a wide-angle camera, with each original image corresponding to a different distortion region of the reference image. The field of view of the wide-angle camera is wider than that of the telephoto cameras, while the detail information acquired by a telephoto camera is finer than that acquired by the wide-angle camera.
It can be understood that because the wide-angle camera has a large field of view, the edge regions of its image may be distorted. In this case, the original images captured by the telephoto cameras can be used to correct the distortion regions of the reference image captured by the wide-angle camera, yielding a high-resolution image with reduced distortion. For example, four telephoto cameras each capture one original image and the wide-angle camera captures one reference image; the four original images are made to correspond to the four corners of the reference image, and the four original images and the reference image are combined.
In other embodiments, the first-type cameras are not limited to telephoto cameras, nor the second-type camera to a wide-angle camera. The first-type and second-type cameras may be of the same or different types, as long as the image information acquired by the second-type camera is more than that acquired by a first-type camera. For example, the first-type and second-type cameras may both be visible-light cameras, with the resolution of the image acquired by the first-type camera greater than that of the image acquired by the second-type camera.
In an embodiment, the step of capturing the original images may specifically include: a first processor controlling each of the at least two first-type cameras to shoot the same shooting scene in sequence, respectively obtaining a corresponding original image. The step of aligning all of the original images may specifically include: the first processor controlling the second-type camera to shoot the shooting scene to obtain a reference image, registering each captured original image with the reference image, and aligning all of the original images according to the registration results.
Specifically, in the above embodiment, all of the cameras may be controlled by one first processor: both the first-type cameras and the second-type camera are connected to the first processor, which in turn controls each first-type camera to capture a corresponding original image and controls the second-type camera to capture the reference image.
The first-type cameras send the captured original images to the first processor in turn, and the first processor stores the received original images in a cache. The second-type camera likewise sends the captured reference image to the first processor, which also stores it in the cache. After receiving the original images and the reference image, the first processor aligns the original images according to the results of registering them with the reference image, and obtains the exposure parameter values from the alignment result. The first processor can then control the exposure of each first-type camera according to the exposure parameter values.
It can be understood that the electronic device may further include a second processor, and the exposure processing method may further include: the first processor sending the reference image captured by the second-type camera to the second processor, so that the second processor controls output of the reference image as a preview.
Specifically, the first processor may be an ASIC (Application-Specific Integrated Circuit) chip, and the second processor may be an ISP (Image Signal Processor) chip. An ASIC can be connected to multiple cameras and control their shooting, whereas an ISP can control the shooting of only one camera.
The exposure parameter values are obtained by the first processor, which controls the exposure of the first-type cameras, while image preview is driven by the second-type camera. Specifically, when a given frame is shot, the first processor can obtain the exposure parameter values from that frame's original images and reference image, and use them to control the exposure of the first-type cameras when the next frame is shot. Meanwhile, since stitching and combining multiple original images is time-consuming, the first processor sends the obtained reference image to the second processor, which outputs and displays it directly so that the user can preview the image.
Fig. 4 is a hardware configuration diagram for implementing an exposure processing method in one embodiment. As shown in fig. 4, the cameras 402, 404, 406 and 408 are telephoto cameras, and the first processor 42 may control the telephoto cameras to capture an original image, and the telephoto cameras may transmit the original image to the first processor 42 after capturing the original image. The camera 410 is a wide-angle camera, and the first processor 42 may control the wide-angle camera to capture a reference image, and after the wide-angle camera captures the reference image, the reference image is sent to the first processor 42, and after the first processor 42 receives the reference image, the reference image is sent to the second processor 44. The first processor 42 may calculate exposure parameter values for the camera 402, the camera 404, the camera 406, and the camera 408 when capturing the next frame image from the original image and the reference image, and the second processor 44 may output a preview from the current reference image.
In the exposure processing method, the first processor controls the exposure of the cameras while the second processor controls the image preview, which reduces the user's waiting time and improves the image processing efficiency of the electronic device.
In one embodiment, the exposure processing method further includes: obtaining a second light metering area from the areas of the original images other than the first light metering area, and performing light metering processing on the second light metering area to obtain a second brightness parameter value. The step of obtaining the exposure parameter values may then comprise: determining the exposure parameter values of the at least two first-type cameras according to the first brightness parameter value and the second brightness parameter value.
Specifically, after the first-type cameras acquire the original images, the area where all the original images overlap may be used as the first light metering area, and the areas outside it as second light metering areas. That is, the first light metering area exists in every original image, while a second light metering area exists in only some of them. A first brightness parameter value can therefore be calculated from the first light metering area and a second brightness parameter value from the second light metering area, and the final exposure parameter value determined from the two values together.
It will be appreciated that, because the first light metering area is present in all original images while a second light metering area is present in only some of them, the light intensity of the current shooting scene can be analyzed after the two brightness parameter values are obtained, with the first brightness parameter value as the primary input and the second brightness parameter value as an auxiliary one, so as to determine the final exposure parameter value.
In this embodiment of the application, the step of obtaining the second brightness parameter value may further include: dividing the areas of the original images other than the first light metering area into different second light metering areas, where the different second light metering areas appear different numbers of times across the original images; and performing light metering processing on each second light metering area to obtain the corresponding second brightness parameter values.
Specifically, the areas of the original images other than the first light metering area may be divided into one or more different second light metering areas, where the resulting second light metering areas appear different numbers of times across the original images. That is, the remaining area may be divided into one or more different second light metering areas according to how many images overlap there.
For example, after four images are aligned, the first light metering area is the area present in all four images, and the second light metering areas may be the areas present in one, two, or three of the images. The aligned images are thus divided into different light metering areas, and a brightness parameter value is calculated for each.
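The division above can be sketched with a per-pixel coverage count: the first light metering area is where the count equals the number of originals, and the second light metering areas are where the count is smaller. The rectangle coordinates and frame size below are illustrative assumptions, not values from the embodiments.

```python
import numpy as np

def coverage_map(regions, height, width):
    """Count, per pixel of the common aligned frame, how many original
    images cover that pixel. Each region is (top, left, h, w)."""
    cover = np.zeros((height, width), dtype=np.int32)
    for top, left, h, w in regions:
        cover[top:top + h, left:left + w] += 1
    return cover

# Four aligned 60x60 originals placed at the corners of a 100x100 frame
# (hypothetical layout for illustration).
regions = [(0, 0, 60, 60), (0, 40, 60, 60), (40, 0, 60, 60), (40, 40, 60, 60)]
cover = coverage_map(regions, 100, 100)

first_metering = cover == len(regions)          # present in all originals
second_metering = (cover > 0) & ~first_metering  # present in only some
```

Here the central 20x20 block is covered by all four originals and so forms the first light metering area; the remaining covered pixels split into second light metering areas by their coverage count.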
The step of acquiring the exposure parameter values may specifically include: counting the number of times the first light metering area and each second light metering area appear across all the original images, and determining the weight of the first brightness parameter value and the weight of each second brightness parameter value from those counts; combining the first and second brightness parameter values into a single brightness parameter value according to those weights; and determining the exposure parameter values of the at least two first-type cameras from that brightness parameter value.
After the first and second light metering areas are determined, the number of times each area appears across all the original images is counted, that is, how many original images contain that area simultaneously. The more original images an area appears in, the more weight its brightness parameter value should carry in the finally acquired exposure parameter value.
Therefore, after counting the number of times the first and second light metering areas each appear across all the original images, the weights of the corresponding first and second brightness parameter values can be determined from those counts. The first and second brightness parameter values are then combined according to the weights to obtain the final brightness parameter value, and the exposure parameter value is determined from it. The manner of combining the values is not limited here: they may, for example, be combined in proportion to the weights using linear combinations such as addition, subtraction, multiplication, and division, or other nonlinear combinations may be used.
For example, if there are four original images in total, the number of times a light metering area exists across all the original images may be 1, 2, 3, or 4. The weight of the second light metering area appearing 1 time may be 1/(1+2+3+4) = 0.1, the weight of the one appearing 2 times may be 2/(1+2+3+4) = 0.2, the weight of the one appearing 3 times may be 3/(1+2+3+4) = 0.3, and the weight of the first light metering area appearing 4 times may be 4/(1+2+3+4) = 0.4. Assuming the areas appearing 1, 2, 3, and 4 times have corresponding brightness parameter values V1, V2, V3, and V4, the final brightness parameter value may be expressed as V = 0.1*V1 + 0.2*V2 + 0.3*V3 + 0.4*V4.
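The weighting example above can be checked numerically; the brightness values V1 through V4 below are illustrative assumptions, not values from the embodiments.

```python
# Occurrence counts of the light metering areas across the four originals.
counts = [1, 2, 3, 4]
total = sum(counts)                        # 1 + 2 + 3 + 4 = 10
weights = [n / total for n in counts]      # [0.1, 0.2, 0.3, 0.4]

# Hypothetical brightness parameter values for the four areas.
V1, V2, V3, V4 = 100.0, 120.0, 140.0, 160.0

# Weighted linear combination, as in V = 0.1*V1 + 0.2*V2 + 0.3*V3 + 0.4*V4.
V = sum(w * v for w, v in zip(weights, [V1, V2, V3, V4]))
```

With these sample values the combined brightness parameter comes out close to V4 because the area seen by all four cameras dominates the weighting.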
In this exposure processing method, the weights are calculated from the number of times the first and second light metering areas each appear across all the original images, the first and second brightness parameter values are combined according to those weights to obtain a brightness parameter value, and the exposure parameter value is determined from it, which improves the accuracy of the exposure processing.
FIG. 5 is a diagram illustrating registration of the original images with the reference image in one embodiment. As shown in fig. 5, the first type of camera is a telephoto camera and the second type of camera is a wide-angle camera. The original images taken by the four first-type cameras are image 504, image 506, image 508, and image 510, respectively; each is registered with the image 502 acquired by the second-type camera, so that the images 504, 506, 508, and 510 are aligned. After alignment, the images 504, 506, 508, and 510 can be divided into a first light metering area 52, a second light metering area 54, and a second light metering area 56, where the first light metering area 52 appears 4 times across the images 504, 506, 508, and 510, the second light metering area 54 appears 2 times, and the second light metering area 56 appears 1 time.
In one embodiment, the electronic device may be provided with at least two second-type cameras, where the at least two second-type cameras respectively acquire one image, and then generate one reference image according to the images acquired by the at least two second-type cameras respectively. Specifically, the step of acquiring the reference image may include: controlling each second camera of the at least two second cameras to shoot the shooting scene to respectively obtain a corresponding image to be synthesized, wherein the field angles of the at least two second cameras are mutually overlapped; and synthesizing all the obtained images to be synthesized to obtain a reference image.
The fields of view of the at least two second-type cameras may overlap partially or completely; this is not limited here. To synthesize all the obtained images to be synthesized, all of them may be aligned, only the area present in every image retained, and the mutually overlapping areas averaged to obtain the final reference image. Alternatively, after alignment, all areas may be retained: areas present in two or more images to be synthesized are averaged, while an area present in only one image is kept directly. In other embodiments, the reference image may be obtained by other synthesis methods, which are not limited here.
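The second synthesis option described above (keep all areas, average wherever two or more images overlap, keep singly covered pixels as-is) can be sketched as follows, assuming simple translational offsets and single-channel images; all shapes, offsets, and values are illustrative assumptions.

```python
import numpy as np

def synthesize_reference(images, offsets, height, width):
    """Place each image to be synthesized into a common frame at its
    (top, left) offset, then average per pixel over however many images
    cover that pixel."""
    acc = np.zeros((height, width), dtype=np.float64)
    cnt = np.zeros((height, width), dtype=np.int32)
    for img, (top, left) in zip(images, offsets):
        h, w = img.shape
        acc[top:top + h, left:left + w] += img
        cnt[top:top + h, left:left + w] += 1
    ref = np.zeros_like(acc)
    covered = cnt > 0
    ref[covered] = acc[covered] / cnt[covered]  # mean where >= 1 image covers
    return ref
```

For two constant-valued test images that overlap in the middle of the frame, the overlap comes out as the average of the two values while the non-overlapping parts keep their original values.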
After the images to be synthesized are obtained from the at least two second-type cameras, synthesizing them into the reference image yields a reference image with a wider field of view. Meanwhile, from the images to be synthesized acquired by the at least two second-type cameras, the physical distance of an object in those images can be calculated by the principle of triangulation. The physical distance information is the distance between the object and the second-type cameras and may be used to assist the cameras in processes such as exposure and focusing, but is not limited thereto.
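The triangulation mentioned above can be sketched, for a rectified stereo pair, by the standard relation distance = focal length * baseline / disparity; the numeric values below are illustrative assumptions, not parameters from the embodiments.

```python
def triangulate_distance(focal_px, baseline_m, disparity_px):
    """Distance to an object from two rectified cameras:
    z = f * B / d, with f in pixels, baseline B in meters,
    and disparity d in pixels between the two views."""
    if disparity_px <= 0:
        raise ValueError("non-positive disparity: no valid match")
    return focal_px * baseline_m / disparity_px

# e.g. f = 1000 px, baseline = 5 cm, disparity = 25 px -> about 2 m
```

Closer objects produce larger disparities between the two wide-angle views, so the computed distance shrinks as disparity grows.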
Fig. 6 is a schematic diagram of registration of the original images with the reference image in another embodiment. As shown in fig. 6, the first type of camera is a telephoto camera and the second type of camera is a wide-angle camera. The images to be synthesized shot by the two wide-angle cameras are image 602 and image 604; after these two images are aligned, the overlapping area is averaged to obtain the reference image (image 606). The original images taken by the four first-type cameras are image 608, image 610, image 612, and image 614, respectively; each is registered with the image 606 so that the images 608, 610, 612, and 614 are aligned. After alignment, the images 608, 610, 612, and 614 can be divided into a first light metering area 62, a second light metering area 64, and a second light metering area 66, where the first light metering area 62 appears 4 times across the images 608, 610, 612, and 614, the second light metering area 64 appears 2 times, and the second light metering area 66 appears 1 time.
Fig. 7 is a flowchart illustrating an exposure processing method according to another embodiment. As shown in fig. 7, the exposure processing method may specifically include:
step 702, a first processor controls each first type camera of at least two first type cameras to sequentially shoot the same shooting scene to respectively obtain a corresponding original image;
step 704, the first processor controls the second type of camera to shoot a shooting scene to obtain a reference image;
step 706, registering each obtained original image with a reference image through a first processor, and aligning all the original images according to a registration result;
step 708, determining, by the first processor, mutually overlapped regions in all the original images as a first photometric region;
step 710, performing photometry processing on a first photometry area through a first processor to obtain a first brightness parameter value;
step 712, dividing, by the first processor, the areas of the original images other than the first light metering area into different second light metering areas, where the different second light metering areas appear different numbers of times across all the original images;
step 714, performing photometry processing on each second photometry area through the first processor to obtain corresponding second brightness parameter values respectively;
step 716, counting, by the first processor, the times of occurrence of the first light metering area and each second light metering area in all the original images respectively, and determining the weight corresponding to the first luminance parameter value and the weight corresponding to each second luminance parameter value according to the times;
step 718, combining the first brightness parameter value and the second brightness parameter value to obtain a brightness parameter value through the first processor according to the weight corresponding to the first brightness parameter value and the weight corresponding to each second brightness parameter value;
step 720, determining exposure parameter values of at least two first-class cameras according to the brightness parameter values through a first processor;
step 722, controlling at least two first-class cameras to perform exposure through the first processor according to the obtained exposure parameter values;
and step 724, the first processor sends the reference image shot by the second type of camera to the second processor, and the second processor controls the reference image to output preview.
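The control flow of steps 702 to 724 might be sketched with stub objects as follows; every class, method, and callback name here is invented for illustration and is not part of the embodiments.

```python
class StubCamera:
    """Minimal stand-in for a camera controlled by the first processor."""

    def __init__(self, image):
        self.image = image
        self.exposure = None

    def capture(self):
        return self.image

    def set_exposure(self, value):
        # On a real device this would take effect when the next frame is shot.
        self.exposure = value

def run_frame(tele_cams, wide_cam, compute_exposure, preview):
    originals = [c.capture() for c in tele_cams]       # step 702: originals
    reference = wide_cam.capture()                     # step 704: reference
    preview(reference)                                 # step 724: preview out
    exposure = compute_exposure(originals, reference)  # steps 706-720: align,
                                                       # meter, weight, combine
    for c in tele_cams:                                # step 722: apply value
        c.set_exposure(exposure)
    return exposure
```

Note that the preview is dispatched before the exposure computation finishes, mirroring the point above that the second processor displays the reference image without waiting for the time-consuming stitching and metering work.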
It should be understood that although the steps in the flowcharts of fig. 2 and 7 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise, the steps need not be performed in the exact order shown and may be performed in other orders. Moreover, at least some of the steps in fig. 2 and 7 may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different moments, and these sub-steps or stages need not be performed sequentially: they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
Fig. 8 is a block diagram showing the structure of an exposure processing apparatus according to an embodiment. As shown in fig. 8, the exposure processing apparatus 800 includes an image capturing module 802, an area determining module 804, a parameter value acquiring module 806, and an exposure control module 808. Wherein:
the image capturing module 802 is configured to control each of the at least two first-type cameras to capture the same captured scene to obtain a corresponding original image, where the field angles of the first-type cameras overlap each other;
the region determining module 804 is configured to perform alignment processing on all the original images obtained by shooting, and determine a region overlapping with each other in all the original images as a first light metering region;
a parameter value obtaining module 806, configured to perform photometry processing on the first photometry area to obtain a first brightness parameter value, and determine exposure parameter values of at least two first-class cameras according to the obtained first brightness parameter value;
and the exposure control module 808 is configured to control at least two first-class cameras to perform exposure according to the obtained exposure parameter values.
The exposure processing device controls each of the at least two first-type cameras to shoot a corresponding original image. All the captured original images are then aligned, and the area where they all overlap is determined as the first light metering area. Light metering processing is performed on the first light metering area to obtain a first brightness parameter value, from which the exposure parameter values are determined. Finally, the at least two first-type cameras are controlled to expose according to the obtained exposure parameter values. In this way, the exposure parameter values of the cameras can be adjusted via the mutually overlapping areas of the original images collected by the different cameras, which ensures exposure consistency across the cameras, prevents different cameras from collecting images with different exposure levels, and improves the accuracy of the images.
In an embodiment, the area determining module 804 is further configured to obtain alignment parameters of at least two first-type cameras, and perform alignment processing on all captured original images according to the alignment parameters.
In an embodiment, the area determining module 804 is further configured to control the second type of camera to shoot a shooting scene to obtain a reference image, register each of the shot original images with the reference image, and perform alignment processing on all the original images according to a registration result; the field angle of the second camera is larger than that of any one first camera, and the field angle of the second camera and the field angle of any one first camera are overlapped.
In an embodiment, the image capturing module 802 is further configured to control each of the at least two first-type cameras to sequentially capture the same capturing scene to obtain a corresponding original image; the region determining module 804 is further configured to control the second type of camera to shoot a shooting scene to obtain a reference image; and respectively registering each obtained original image with the reference image, and aligning all the original images according to a registration result.
In an embodiment, the exposure processing apparatus 800 further includes a preview module, which is configured to send the reference image captured by the second type of camera to the second processor, and control the reference image to output a preview through the second processor.
In an embodiment, the parameter value obtaining module 806 is further configured to obtain a second light metering area according to another area in the original image except the first light metering area, and perform light metering processing on the second light metering area to obtain a second brightness parameter value; and determining exposure parameter values of at least two first-class cameras according to the first brightness parameter value and the second brightness parameter value.
In one embodiment, the parameter value obtaining module 806 is further configured to divide the areas of the original images other than the first light metering area into different second light metering areas, where the different second light metering areas appear different numbers of times across all the original images; perform light metering processing on each second light metering area to obtain the corresponding second brightness parameter values; count the number of times the first light metering area and each second light metering area appear across all the original images, and determine the weight of the first brightness parameter value and the weight of each second brightness parameter value from those counts; combine the first and second brightness parameter values into a brightness parameter value according to those weights; and determine the exposure parameter values of the at least two first-type cameras from the brightness parameter value.
The division of each module in the exposure processing apparatus is only for illustration, and in other embodiments, the exposure processing apparatus may be divided into different modules as needed to complete all or part of the functions of the exposure processing apparatus.
The implementation of each block in the exposure processing apparatus provided in the embodiment of the present application may be in the form of a computer program. The computer program may be run on a terminal or a server. The program modules constituted by the computer program may be stored on the memory of the terminal or the server. Which when executed by a processor, performs the steps of the method described in the embodiments of the present application.
The embodiment of the application also provides the electronic equipment. The electronic device includes therein an Image Processing circuit, which may be implemented using hardware and/or software components, and may include various Processing units defining an ISP (Image Signal Processing) pipeline. FIG. 9 is a schematic diagram of an image processing circuit in one embodiment. As shown in fig. 9, for convenience of explanation, only aspects of the image processing technique related to the embodiments of the present application are shown.
As shown in fig. 9, the image processing circuit includes an ISP processor 930, an ISP processor 940 and a control logic 950. Camera 910 includes one or more lenses 912 and an image sensor 914. Image sensor 914 may include an array of color filters (e.g., Bayer filters), and image sensor 914 may acquire light intensity and wavelength information captured with each imaging pixel of image sensor 914 and provide a set of image data that may be processed by ISP processor 930. The camera 920 includes one or more lenses 922 and an image sensor 924. The image sensor 924 may include an array of color filters (e.g., Bayer filters), and the image sensor 924 may acquire light intensity and wavelength information captured with each imaging pixel of the image sensor 924 and provide a set of image data that may be processed by the ISP processor 940.
The image collected by the camera 910 is transmitted to the ISP processor 930 for processing, after the ISP processor 930 processes the image, the ISP processor 930 may send statistical data of the image (such as brightness of the image, contrast value of the image, color of the image, etc.) to the control logic 950, and the control logic 950 may determine control parameters of the camera 910 according to the statistical data, so that the camera 910 may perform operations such as auto focus, auto exposure, etc. according to the control parameters. The image may be stored in the image memory 960 after being processed by the ISP processor 930, and the ISP processor 930 may also read the image stored in the image memory 960 to process the image. The image processed by the ISP processor 930 may be directly transmitted to the display 970 for display, or the display 970 may read the image in the image memory 960 for display.
Where ISP processor 930 processes the image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and ISP processor 930 may perform one or more image processing operations on the image data, gathering statistical information about the image data. Wherein the image processing operations may be performed with the same or different bit depth precision.
The image Memory 960 may be a portion of a Memory device, a storage device, or a separate dedicated Memory within an electronic device, and may include a DMA (Direct Memory Access) feature.
Upon receiving image data from the interface of image sensor 914, ISP processor 930 may perform one or more image processing operations, such as temporal filtering. The processed image data may be sent to image memory 960 for additional processing before being displayed. ISP processor 930 receives the processed data from image memory 960 and performs image data processing on it in the RGB and YCbCr color spaces. The image data processed by ISP processor 930 may be output to display 970 for viewing by a user and/or further processed by a Graphics Processing Unit (GPU). Furthermore, the output of ISP processor 930 may also be sent to image memory 960, and display 970 may read image data from image memory 960. In one embodiment, image memory 960 may be configured to implement one or more frame buffers.
The statistics determined by ISP processor 930 may be sent to control logic 950. For example, the statistical data may include image sensor 914 statistics such as auto-exposure, auto-white balance, auto-focus, flicker detection, black level compensation, lens 912 shading correction, and the like. Control logic 950 may include a processor and/or microcontroller that executes one or more routines (e.g., firmware) that may determine control parameters for camera 910 and control parameters for ISP processor 930 based on the received statistical data. For example, the control parameters of camera 910 may include gain, integration time for exposure control, anti-shake parameters, flash control parameters, lens 912 control parameters (e.g., focal length for focusing or zooming), or a combination of these parameters, and the like. The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (e.g., during RGB processing), as well as lens 912 shading correction parameters.
Similarly, the image collected by the camera 920 is transmitted to the ISP processor 940 for processing, after the ISP processor 940 processes the image, the statistical data of the image (such as the brightness of the image, the contrast value of the image, the color of the image, etc.) can be sent to the control logic 950, and the control logic 950 can determine the control parameters of the camera 920 according to the statistical data, so that the camera 920 can perform operations such as auto-focus and auto-exposure according to the control parameters. The image may be stored in the image memory 960 after being processed by the ISP processor 940, and the ISP processor 940 may also read the image stored in the image memory 960 to process the image. In addition, the image may be directly transmitted to the display 970 for display after being processed by the ISP processor 940, or the display 970 may read the image in the image memory 960 for display. Camera 920 and ISP processor 940 may also implement the processes described for camera 910 and ISP processor 930.
The steps of the exposure processing method provided in the foregoing embodiments can be implemented using the image processing circuit of fig. 9. Specifically, the camera 910 and the camera 920 are controlled to shoot the same shooting scene to obtain corresponding original images. The original image captured by the camera 910 may be sent to the ISP processor 930 and then forwarded to the image memory 960. The original image captured by the camera 920 may be sent to the ISP processor 940, which also reads the original image captured by the camera 910 from the image memory 960. The ISP processor 940 then aligns the original images captured by the cameras 910 and 920 and determines the area where all the original images overlap as the first light metering area. Light metering processing is performed on the first light metering area to obtain a first brightness parameter value, and the exposure parameter values of the at least two first-type cameras are determined from it. After obtaining the exposure parameter values, the ISP processor 940 sends them to the ISP processor 930, and the ISP processor 930 and the ISP processor 940 control the camera 910 and the camera 920, respectively, to perform exposure according to the obtained exposure parameter values.
The embodiment of the application also provides a computer readable storage medium. One or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the steps of the exposure processing methods provided by the embodiments described above.
A computer program product containing instructions which, when run on a computer, cause the computer to execute the exposure processing method provided in the above embodiments.
Any reference to memory, storage, a database, or another medium used by embodiments of the present application may include non-volatile and/or volatile memory. Suitable non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus Direct RAM (RDRAM), Direct Rambus Dynamic RAM (DRDRAM), and Rambus Dynamic RAM (RDRAM).
The above-mentioned embodiments express only several implementations of the present application, and although their description is relatively specific and detailed, they should not therefore be construed as limiting the scope of the present application. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. An exposure processing method, comprising:
controlling each of at least two first-type cameras to shoot the same shooting scene to respectively obtain a corresponding original image, wherein the fields of view of the first-type cameras overlap one another;
aligning all the original images obtained by shooting, and determining the mutually overlapping area of all the original images as a first photometric area;
performing photometric processing on the first photometric area to obtain a first brightness parameter value, and determining exposure parameter values of the at least two first-type cameras according to the obtained first brightness parameter value;
and controlling the at least two first-type cameras to perform exposure according to the obtained exposure parameter values.
2. The method according to claim 1, wherein the aligning all the original images obtained by shooting comprises:
acquiring alignment parameters of the at least two first-type cameras, and aligning all the original images obtained by shooting according to the alignment parameters.
3. The method according to claim 1, wherein the aligning all the original images obtained by shooting comprises:
controlling a second-type camera to shoot the shooting scene to obtain a reference image, respectively registering each original image obtained by shooting with the reference image, and aligning all the original images according to the registration result;
wherein the field of view of the second-type camera is larger than that of any one of the first-type cameras, and the field of view of the second-type camera overlaps that of each of the first-type cameras.
4. The method of claim 3, wherein the first-type cameras are telephoto cameras and the second-type camera is a wide-angle camera, and wherein each of the original images corresponds to a different distortion region in the reference image.
5. The method according to claim 3, wherein the controlling each of the at least two first-type cameras to shoot the same shooting scene to respectively obtain a corresponding original image comprises:
controlling, by a first processor, each of the at least two first-type cameras to sequentially shoot the same shooting scene to respectively obtain a corresponding original image;
wherein the controlling the second-type camera to shoot the shooting scene to obtain a reference image, respectively registering each original image obtained by shooting with the reference image, and aligning all the original images according to the registration result comprises:
controlling, by the first processor, the second-type camera to shoot the shooting scene to obtain the reference image, respectively registering each original image obtained by shooting with the reference image, and aligning all the original images according to the registration result;
and wherein the method further comprises:
sending, by the first processor, the reference image captured by the second-type camera to a second processor, so that the second processor controls output of the reference image for preview.
6. The method of claim 1, further comprising:
obtaining a second photometric area according to areas of the original images other than the first photometric area, and performing photometric processing on the second photometric area to obtain a second brightness parameter value;
wherein the determining exposure parameter values of the at least two first-type cameras according to the obtained first brightness parameter value comprises:
determining the exposure parameter values of the at least two first-type cameras according to the first brightness parameter value and the second brightness parameter value.
7. The method according to claim 6, wherein the obtaining a second photometric area according to areas of the original images other than the first photometric area, and performing photometric processing on the second photometric area to obtain a second brightness parameter value comprises:
dividing the areas of the original images other than the first photometric area into different second photometric areas, wherein the different second photometric areas appear in the original images different numbers of times;
performing photometric processing on each second photometric area to respectively obtain a corresponding second brightness parameter value;
and wherein the determining the exposure parameter values of the at least two first-type cameras according to the first brightness parameter value and the second brightness parameter value comprises:
counting the number of times the first photometric area and each second photometric area respectively appear in all the original images, and determining, according to the counted numbers, a weight corresponding to the first brightness parameter value and a weight corresponding to each second brightness parameter value;
combining the first brightness parameter value and each second brightness parameter value according to their respective weights to obtain a combined brightness parameter value;
and determining the exposure parameter values of the at least two first-type cameras according to the combined brightness parameter value.
8. An exposure processing apparatus, comprising:
an image shooting module, configured to control each of at least two first-type cameras to shoot the same shooting scene to respectively obtain a corresponding original image, wherein the fields of view of the first-type cameras overlap one another;
a region determining module, configured to align all the original images obtained by shooting and determine the mutually overlapping area of all the original images as a first photometric area;
a parameter value acquisition module, configured to perform photometric processing on the first photometric area to obtain a first brightness parameter value, and determine exposure parameter values of the at least two first-type cameras according to the obtained first brightness parameter value;
and an exposure control module, configured to control the at least two first-type cameras to perform exposure according to the obtained exposure parameter values.
9. An electronic device comprising a memory and a processor, the memory having stored therein a computer program that, when executed by the processor, causes the processor to perform the steps of the method according to any one of claims 1 to 7;
wherein, when the processor performs the method of claim 5, the processor serves as the first processor.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
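Read as an algorithm rather than claim language, the weighting in claims 6 and 7 — give each metered region a weight according to how many of the original images it appears in, then blend the brightness values — can be sketched as follows; this is an interpretive Python sketch, and the proportional-to-count weighting rule and plain weighted average are assumptions, not the patentee's implementation:

```python
def combined_brightness(first_value, first_count, second_values, second_counts):
    """Combine the first brightness parameter value (overlap area, appearing
    in all original images) with the second brightness parameter values
    (non-overlap areas, each appearing in fewer images), weighting each
    value by the number of original images its area appears in (assumed
    weighting rule)."""
    values = [first_value] + list(second_values)
    counts = [first_count] + list(second_counts)
    total = sum(counts)
    # Weight of each area = its occurrence count / total occurrence count.
    weights = [c / total for c in counts]
    return sum(w * v for w, v in zip(weights, values))
```

With the first area appearing in both of two images (count 2) and two second areas each appearing once, the weights come out 1/2, 1/4, and 1/4, so the overlap region dominates the combined brightness value.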
CN201910447895.6A 2019-05-27 2019-05-27 Exposure processing method, exposure processing device, electronic apparatus, and computer-readable storage medium Active CN112004029B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910447895.6A CN112004029B (en) 2019-05-27 2019-05-27 Exposure processing method, exposure processing device, electronic apparatus, and computer-readable storage medium


Publications (2)

Publication Number Publication Date
CN112004029A true CN112004029A (en) 2020-11-27
CN112004029B CN112004029B (en) 2022-03-15

Family

ID=73461268

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910447895.6A Active CN112004029B (en) 2019-05-27 2019-05-27 Exposure processing method, exposure processing device, electronic apparatus, and computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN112004029B (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101909152A (en) * 2009-05-26 2010-12-08 奥林巴斯映像株式会社 Camera head
JP2018186514A (en) * 2018-06-13 2018-11-22 パナソニックIpマネジメント株式会社 Image processing device, imaging system having the same and image processing method
US10165194B1 (en) * 2016-12-16 2018-12-25 Amazon Technologies, Inc. Multi-sensor camera system
US20190073751A1 (en) * 2017-09-01 2019-03-07 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113572970A (en) * 2021-06-24 2021-10-29 维沃移动通信(杭州)有限公司 Image processing method and device and electronic equipment
CN115550556A (en) * 2021-06-25 2022-12-30 荣耀终端有限公司 Exposure intensity adjusting method and related device
CN115550556B (en) * 2021-06-25 2023-10-24 荣耀终端有限公司 Exposure intensity adjusting method and related device
CN113691776A (en) * 2021-08-18 2021-11-23 南京领行科技股份有限公司 In-vehicle camera system and light supplementing method
CN113691776B (en) * 2021-08-18 2024-02-09 南京领行科技股份有限公司 In-vehicle camera system and light supplementing method
WO2023077421A1 (en) * 2021-11-05 2023-05-11 深圳市大疆创新科技有限公司 Movable platform control method and apparatus, and movable platform and storage medium
CN115514900A (en) * 2022-08-26 2022-12-23 中国科学院合肥物质科学研究院 Imaging spectrometer rapid automatic exposure imaging method and storage medium
CN115514900B (en) * 2022-08-26 2023-11-07 中国科学院合肥物质科学研究院 Imaging spectrometer rapid automatic exposure imaging method and storage medium

Also Published As

Publication number Publication date
CN112004029B (en) 2022-03-15

Similar Documents

Publication Publication Date Title
CN109767467B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN107948519B (en) Image processing method, device and equipment
CN112004029B (en) Exposure processing method, exposure processing device, electronic apparatus, and computer-readable storage medium
CN109089047B (en) Method and device for controlling focusing, storage medium and electronic equipment
CN110225248B (en) Image acquisition method and device, electronic equipment and computer readable storage medium
CN110213494B (en) Photographing method and device, electronic equipment and computer readable storage medium
CN109712192B (en) Camera module calibration method and device, electronic equipment and computer readable storage medium
CN109862269B (en) Image acquisition method and device, electronic equipment and computer readable storage medium
CN110290323B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN112087580B (en) Image acquisition method and device, electronic equipment and computer readable storage medium
CN109685853B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN109951641B (en) Image shooting method and device, electronic equipment and computer readable storage medium
CN109146906B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN107704798B (en) Image blurring method and device, computer readable storage medium and computer device
CN109598763B (en) Camera calibration method, device, electronic equipment and computer-readable storage medium
CN109963080B (en) Image acquisition method and device, electronic equipment and computer storage medium
CN107872631B (en) Image shooting method and device based on double cameras and mobile terminal
CN109559352B (en) Camera calibration method, device, electronic equipment and computer-readable storage medium
CN112019734B (en) Image acquisition method and device, electronic equipment and computer readable storage medium
CN107948617B (en) Image processing method, image processing device, computer-readable storage medium and computer equipment
CN109559353B (en) Camera module calibration method and device, electronic equipment and computer readable storage medium
CN111246100B (en) Anti-shake parameter calibration method and device and electronic equipment
CN110035206B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN112087571A (en) Image acquisition method and device, electronic equipment and computer readable storage medium
CN110049240B (en) Camera control method and device, electronic equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant