WO2021120107A1 - Method of generating captured image and electrical device - Google Patents


Info

Publication number
WO2021120107A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
camera
setting
electrical device
imaging
Application number
PCT/CN2019/126637
Other languages
French (fr)
Inventor
Ahmed BOUDISSA
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority to PCT/CN2019/126637 (WO2021120107A1)
Priority to CN201980102874.9A (CN114946170B)
Publication of WO2021120107A1

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 - Image signal generators
    • H04N13/204 - Image signal generators using stereoscopic image cameras
    • H04N13/239 - Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H04N13/296 - Synchronisation thereof; Control thereof
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45 - Cameras or camera modules for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/61 - Control of cameras or camera modules based on recognised objects
    • H04N23/80 - Camera processing pipelines; Components thereof
    • H04N23/81 - Camera processing pipelines for suppressing or minimising disturbance in the image signal generation
    • H04N23/95 - Computational photography systems, e.g. light-field imaging systems
    • H04N23/951 - Computational photography systems using two or more images to influence resolution, frame rate or aspect ratio

Definitions

  • the present disclosure relates to a method of generating a captured image in an electrical device including a camera assembly and such an electrical device including a camera assembly.
  • Electrical devices such as smartphones and tablet terminals are widely used in our daily life.
  • many of the electrical devices are equipped with a camera assembly to capture an image.
  • Some of the electrical devices are portable and are thus easy to carry. Therefore, a user of the electrical device can easily capture an image by using the camera assembly of the electrical device anytime, anywhere.
  • the captured image may include one or more light source areas which are overexposed or suffer from halation.
  • the overexposed light source areas and the halation areas in the captured image can be subject to an image correction process after capturing the image. In order to perform the image correction process to adjust the light source areas, it is important to precisely detect the light source areas.
  • the present disclosure aims to solve at least one of the technical problems mentioned above. Accordingly, the present disclosure aims to provide an imaging method and an electrical device.
  • a method of generating a captured image in an electrical device including a camera assembly may include:
  • the first setting of imaging is a setting of the camera assembly set by a user
  • the second setting of imaging is a setting to detect an area emitting a light stronger than other areas in the second image
  • the second setting of imaging may be a short exposure in which an exposure time is shorter than that of the first setting of imaging.
  • the exposure time of the short exposure may be short enough not to detect the light from areas other than the light source area in the second image.
  • the second setting of imaging may be a low ISO sensitivity which is lower than an ISO sensitivity of the first setting of imaging.
  • the low ISO sensitivity may be low enough not to detect the light from areas other than the light source area in the second image.
  • the second setting of imaging may be a low exposure value which is lower than an exposure value of the first setting of imaging.
  • the low exposure value may be low enough not to detect the light from areas other than the light source area in the second image.
  • the low exposure value may be negative.
  • the detecting the light source area may include:
  • the method may further include:
  • the camera assembly may include at least a first camera and a second camera
  • the second image is captured by the first camera or the second camera.
  • the first image may be captured by the first camera
  • the method may further include:
  • the camera assembly may include a main camera in a back side of the electrical device, and
  • the first image and the second image are captured by the main camera.
  • the camera assembly may include a sub camera in a front side of the electrical device, and
  • the first image and the second image are captured by the sub camera.
  • an electrical device may include:
  • a camera assembly configured to capture a first image in a first setting of imaging and a second image in a second setting of imaging, wherein the first setting of imaging is a setting of the camera assembly and the second setting of imaging is a setting to detect an area emitting a light stronger than other areas in the second image, and
  • an image processor configured to detect a light source area based on the second image to generate the captured image based on the first image.
  • the second setting of imaging may be a short exposure in which an exposure time is shorter than that of the first setting of imaging.
  • the exposure time of the short exposure may be short enough not to detect the light from areas other than the light source area in the second image.
  • the second setting of imaging may be a low ISO sensitivity which is lower than an ISO sensitivity of the first setting of imaging.
  • the low ISO sensitivity may be low enough not to detect the light from areas other than the light source area in the second image.
  • the second setting of imaging may be a low exposure value which is lower than an exposure value of the first setting of imaging.
  • the low exposure value may be low enough not to detect the light from areas other than the light source area in the second image.
  • the low exposure value may be negative.
  • the image processor may be further configured to compare a brightness of a certain area in the second image with a threshold value, and regard the certain area as the light source area if the brightness of the certain area is larger than the threshold value.
  • the image processor may be further configured to:
  • the camera assembly may include at least a first camera and a second camera
  • the second image is captured by the first camera or the second camera.
  • the first image may be captured by the first camera
  • the image processor may be further configured to:
  • the camera assembly may include a main camera in a back side of the electrical device, and
  • the first image and the second image are captured by the main camera.
  • the camera assembly may include a sub camera in a front side of the electrical device, and
  • the first image and the second image are captured by the sub camera.
  • FIG. 1 illustrates a back side view of an electrical device according to a first embodiment of the present disclosure
  • FIG. 2 illustrates a front side view of the electrical device according to the first embodiment of the present disclosure
  • FIG. 3 illustrates a block diagram of the electrical device according to the first embodiment of the present disclosure
  • FIG. 4 illustrates a flowchart of an image capturing process performed by the electrical device according to the first embodiment of the present disclosure
  • FIG. 5 shows a first main image captured by a first main camera or a second main image captured by a second main camera in a first setting of imaging
  • FIG. 6 shows an auxiliary image captured by the first main camera in a second setting of imaging
  • FIG. 7 illustrates a back side view of the electrical device according to a second embodiment of the present disclosure.
  • FIG. 8 illustrates a flowchart of an image capturing process performed by the electrical device according to the second embodiment of the present disclosure.
  • FIG. 1 illustrates a back side view of an electrical device 10 according to a first embodiment of the present disclosure
  • FIG. 2 illustrates a front side view of the electrical device 10 according to the first embodiment of the present disclosure.
  • the electrical device 10 may include a display 20 and a camera assembly 30.
  • the camera assembly 30 includes a first main camera 32, a second main camera 34 and a sub camera 36.
  • the first main camera 32 and the second main camera 34 can capture an image in a back side of the electrical device 10, and the sub camera 36 can capture an image in a front side of the electrical device 10. Therefore, the first main camera 32 and the second main camera 34 are so-called out-cameras whereas the sub camera 36 is a so-called in-camera.
  • the electrical device 10 can be a mobile phone, a tablet computer, a personal digital assistant, and so on.
  • the first main camera 32 and the second main camera 34 may have the same performance and/or characteristics or they may have different performance and/or characteristics.
  • the first main camera 32 may be equipped with a full color image sensor and the second main camera 34 may be equipped with a black-and-white image sensor.
  • the first main camera 32 may be a camera suitable for capturing a still image and the second main camera 34 may be a camera suitable for capturing a moving image.
  • the first main camera 32 may be a camera equipped with a wide-angle lens and the second main camera 34 may be a camera equipped with a telephoto lens.
  • the performance of the sub camera 36 is lower than that of the first main camera 32 and the second main camera 34.
  • the performance of the sub camera 36 may be the same as that of the first main camera 32 and the second main camera 34.
  • the electrical device 10 may have more than three cameras.
  • the electrical device 10 may have three, four, five, etc. main cameras.
  • FIG. 3 illustrates a block diagram of the electrical device 10 according to the present embodiment.
  • the electrical device 10 may include a main processor 40, an image signal processor 42, a memory 44, a power supply circuit 46 and a communication circuit 48.
  • the display 20, the camera assembly 30, the main processor 40, the image signal processor 42, the memory 44, the power supply circuit 46 and the communication circuit 48 are connected together via a bus 50.
  • the main processor 40 executes one or more programs stored in the memory 44.
  • the main processor 40 implements various applications and data processing of the electrical device 10 by executing the programs.
  • the main processor 40 may be one or more computer processors.
  • the main processor 40 is not limited to one CPU core, but it may have a plurality of CPU cores.
  • the main processor 40 may be a main CPU of the electrical device 10, an image processing unit (IPU) or a DSP provided with the camera assembly 30.
  • the image signal processor 42 controls the camera assembly 30 and processes the image captured by the camera assembly 30.
  • the image signal processor 42 can execute a de-mosaic process, a noise reduction process, an auto exposure process, an auto focus process, an auto white balance process, a high dynamic range process and so on, on the image captured by the camera assembly 30.
  • the main processor 40 and the image signal processor 42 collaborate with each other to obtain the captured image by the camera assembly 30. That is, the main processor 40 and the image signal processor 42 are configured to capture the image by the camera assembly 30 and execute various kinds of image processes for the captured image.
  • the memory 44 stores a program to be executed by the main processor 40 and various kinds of data. For example, data of the captured image are stored in the memory 44.
  • the memory 44 may be a high-speed RAM memory, or a non-volatile memory such as a flash memory and a magnetic disk memory.
  • the power supply circuit 46 may have a battery such as a lithium-ion rechargeable battery (not shown) and a battery management unit (BMU) for managing the battery.
  • the communication circuit 48 is configured to receive and transmit data to communicate with the Internet or other devices via wireless communication.
  • the wireless communication may adopt any communication standard or protocol, including but not limited to GSM (Global System for Mobile communication), CDMA (Code Division Multiple Access), LTE (Long Term Evolution), LTE-Advanced and 5th generation (5G).
  • the communication circuit 48 may include an antenna and an RF (radio frequency) circuit.
  • FIG. 4 illustrates a flowchart of an image capturing process performed by the electrical device 10 according to the present embodiment.
  • the image capturing process is executed by the main processor 40 in collaboration with the image signal processor 42. Therefore, the main processor 40 and the image signal processor 42 constitute an image processor in the present embodiment.
  • the electrical device 10 sets a first setting of imaging of the camera assembly 30 (Step S10).
  • the electrical device 10 may set a setting of the auto focus, a setting of the auto exposure and a setting of the auto white balance for the camera assembly 30.
  • the electrical device 10 may set a setting of ISO sensitivity, a setting of an exposure time, a setting of an exposure value, a setting of a shutter speed and so on for the camera assembly 30.
  • the first setting of imaging of the camera assembly 30 can be set automatically by the electrical device 10. In other words, the user can select an automatic setting for the first setting of imaging of the camera assembly 30.
  • the first setting of imaging of the camera assembly 30 can be previously set by the user. In other words, the user can previously set each item of the first setting of imaging of the camera assembly 30.
  • the electrical device 10 captures a first main image by the first main camera 32 in the first setting of imaging of the camera assembly 30 (Step S12) and a second main image by the second main camera 34 in the first setting of imaging of the camera assembly 30 (Step S14).
  • the first main camera 32 and the second main camera 34 capture the first main image and the second main image simultaneously.
  • the timing of capturing the first main image by the first main camera 32 and the timing of capturing the second main image by the second main camera 34 may be slightly different.
  • the electrical device 10 captures an auxiliary image by the first main camera 32 in a second setting of imaging which is a setting to detect an area emitting a light stronger than other areas in the auxiliary image (Step S16).
  • FIG. 5 shows the first main image captured by the first main camera 32 or the second main image captured by the second main camera 34 in the first setting of imaging set in the electrical device 10 by the user
  • FIG. 6 shows the auxiliary image captured by the first main camera 32 in the second setting of imaging to detect an area emitting a light.
  • the first main image and the second main image are normal photographs to be captured by the camera assembly 30. That is, the first main image and the second main image are images the user wants to capture.
  • the auxiliary image is a special photograph for easy detection of an area emitting strong light. That is, the electrical device 10 can detect the area emitting the strong light which emits a lot of photons directly reaching the camera assembly 30 from the light source.
  • the second setting of imaging of the first main camera 32 may be a short exposure in which an exposure time is shorter than that of the first setting of imaging.
  • the exposure time of the short exposure is short enough not to detect the light from areas other than the light source area in the auxiliary image.
  • the electrical device 10 may set the exposure time of the first main camera 32 between 1/8000s and 1/6000s in order to capture the auxiliary image. If the exposure time is shorter than 1/8000s, even the light in the light source area cannot be detected in the auxiliary image, whereas if the exposure time is longer than 1/6000s, the light in the other areas can be also detected in the auxiliary image.
  • the second setting of imaging of the first main camera 32 may be a low ISO sensitivity which is lower than the ISO sensitivity of the first setting of imaging.
  • the low ISO sensitivity is low enough not to detect the light from areas other than the light source area in the auxiliary image.
  • the electrical device 10 may set the ISO of the first main camera 32 between 50 and 100. If the ISO sensitivity is lower than 50, even the light in the light source area cannot be detected in the auxiliary image, whereas if the ISO sensitivity is higher than 100, the light in the other areas can be also detected in the auxiliary image.
  • the second setting of imaging of the first main camera 32 may be a low exposure value (EV) which is lower than an exposure value (EV) of the first setting of imaging.
  • the low exposure value (EV) is low enough not to detect the light from areas other than the light source area in the auxiliary image.
  • the exposure value (EV) is preferably negative.
  • the exposure value (EV) can be determined by the shutter speed, the ISO sensitivity and the diaphragm (aperture).
  • the electrical device 10 may set the exposure value (EV) of the first main camera 32 between -4.0 and -3.5. If the exposure value (EV) is lower than -4.0, even the light in the light source area cannot be detected in the auxiliary image, whereas if the exposure value (EV) is higher than -3.5, the light in the other areas can be also detected in the auxiliary image.
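The relation between these quantities can be illustrated with the standard photographic EV formula EV = log2(N²/t) − log2(ISO/100), where N is the f-number and t the exposure time. This formula is a common photographic convention, not taken from the present disclosure, and the helper name and parameters below are illustrative only:

```python
import math

def exposure_value(f_number: float, exposure_time_s: float, iso: float = 100.0) -> float:
    """ISO-adjusted exposure value (standard photographic convention,
    not defined in the present disclosure):
        EV = log2(N^2 / t) - log2(ISO / 100)
    A shorter exposure time or a lower ISO sensitivity raises the EV
    number, i.e. darkens the resulting image."""
    return math.log2(f_number ** 2 / exposure_time_s) - math.log2(iso / 100.0)

# Halving the exposure time raises the EV by one stop (approximately 1.0).
print(exposure_value(1.8, 1 / 250) - exposure_value(1.8, 1 / 125))
```

Note that the negative EV values of -4.0 to -3.5 quoted above are plausibly exposure compensation relative to the metered first setting of imaging rather than absolute exposure values.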
  • the electrical device 10 can also adjust various settings of the first main camera 32 other than the exposure time, the ISO sensitivity, the exposure value and the shutter speed in the second setting of imaging in order to capture the auxiliary image to detect the area emitting strong light.
  • the first main camera 32 is used for capturing the auxiliary image in the present embodiment
  • the second main camera 34 may be used for capturing the auxiliary image instead of the first main camera 32.
  • the auxiliary image is captured after the first main image and the second main image have been captured in the present embodiment
  • the auxiliary image may be captured before the first main image and the second main image are captured.
  • the step S16 is executed before the steps S12 and S14 are executed.
  • the electrical device 10 computes a depth map based on the first main image and the second main image (Step S18). More specifically, a position of the first main camera 32 is different from a position of the second main camera 34. Therefore, a viewpoint of the first main image captured by the first main camera 32 is different from a viewpoint of the second main image captured by the second main camera 34. Using a parallax of the first main image and the second main image, the electrical device 10 can generate the depth map which indicates a distance between the electrical device 10 and surfaces of objects in the first main image and the second main image.
  • the depth map may be computed by the main processor 40 and/or the image signal processor 42. Moreover, the depth map may be computed in the camera assembly 30. Alternatively, a dedicated depth map computation circuit for computing the depth map may be placed in the electrical device 10.
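The parallax-to-depth step can be sketched with the classic pinhole stereo relation Z = f·B/d. The disclosure does not specify the actual algorithm (a real pipeline also involves rectification and disparity matching), so the function and its parameter names below are illustrative assumptions only:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Pinhole stereo relation Z = f * B / d: a larger parallax (shift)
    between the first and second main images means a closer surface.
    disparity_px : 2-D list of per-pixel horizontal shifts, in pixels
    focal_px     : focal length of the cameras, in pixels
    baseline_m   : distance between the two camera centres, in metres
    Returns a 2-D depth map in metres; None marks points at infinity."""
    return [[focal_px * baseline_m / d if d > 0 else None for d in row]
            for row in disparity_px]
```

For example, with a 1000-pixel focal length and a 2 cm baseline, a 10-pixel disparity corresponds to a surface about 2 m away.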
  • the electrical device 10 detects the light source area based on the auxiliary image to detect the light source in the auxiliary image (Step S20).
  • the electrical device 10 compares a brightness of a certain area in the auxiliary image with a threshold value, and regards the certain area as the light source area if the brightness of the certain area is larger than the threshold value.
  • the electrical device 10 can generate a light source map to indicate where the light source area is located in the auxiliary image.
  • the electrical device 10 compares the brightness of every area in the auxiliary image with the threshold value.
  • the electrical device 10 compares the brightness of every pixel in the auxiliary image with the threshold value.
  • the threshold value may be stored in the memory 44, set in the program executed by the main processor 40, or set in the image signal processor 42.
  • for example, if the threshold value is set at 128 on a 0 to 255 brightness scale, the electrical device 10 can determine that a pixel is in the light source area if the brightness value of the pixel is more than 128.
  • the threshold value is not necessarily set by a certain value but may be set by a percentage.
  • the threshold value may be set at 50% of the maximum brightness.
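The per-pixel comparison of step S20 can be sketched as follows; the helper names and the plain-list image representation are illustrative assumptions (a real implementation would operate on sensor buffers):

```python
def light_source_map(aux_image, threshold=128):
    """Compare each pixel of the auxiliary image with the threshold value
    (step S20) and mark pixels brighter than it as light source area.
    aux_image is a 2-D list of brightness values on a 0-255 scale."""
    return [[1 if px > threshold else 0 for px in row] for row in aux_image]

def light_source_map_pct(aux_image, pct=50):
    """Percentage variant: the threshold is pct% of the 0-255 range."""
    return light_source_map(aux_image, 255 * pct / 100)

aux = [[200, 30],
       [129, 128]]
print(light_source_map(aux))  # [[1, 0], [1, 0]]
```

With the fixed threshold of 128, a pixel at exactly 128 is not marked; with the 50% variant (threshold 127.5) it is, which shows why the two conventions are spelled out separately above.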
  • although the electrical device 10 detects the light source area in the auxiliary image after the depth map has been computed in the present embodiment, the electrical device 10 may detect the light source area in the auxiliary image before the depth map is computed. In this case, the step S20 is executed before the step S18 is executed.
  • the light source area may be detected based on the auxiliary image by the main processor 40 and/or the image signal processor 42. Moreover, the light source area may be detected based on the auxiliary image by the camera assembly 30.
  • the electrical device 10 executes an image generation process to generate the captured image (Step S22).
  • the image generation process is executed based on at least the first main image captured in the step S12, the second main image captured in the step S14, the depth map computed in the step S18, and the light source area detected in the step S20.
  • the image generation process includes at least a Bokeh rendering process based on the depth map. That is, after a single integrated image is generated based on the first main image and the second main image, the Bokeh rendering process is applied to the single integrated image to defocus a background of the single integrated image based on the depth map. In other words, the background of the single integrated image is defocused to be blurred through the Bokeh rendering process.
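A minimal sketch of such a depth-gated blur follows, assuming a grayscale image and depth map as 2-D lists. Real Bokeh rendering uses depth-dependent kernel sizes and shapes, so this fixed box blur is only an illustration of the principle, and all names are hypothetical:

```python
def bokeh_render(image, depth, focus_depth, radius=1):
    """Sketch of the Bokeh rendering step: pixels farther than focus_depth
    are replaced by the integer mean of their (2*radius+1)^2 neighbourhood,
    leaving the in-focus subject sharp. image and depth are 2-D lists of
    equal size (brightness values and distances respectively)."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            if depth[y][x] <= focus_depth:
                continue  # in-focus subject stays sharp
            acc, n = 0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += image[yy][xx]
                        n += 1
            out[y][x] = acc // n  # box-blur the background pixel
    return out
```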
  • the image generation process includes at least a light source area adjustment process based on the detected light source area.
  • the light source area adjustment process is a sort of an image correction process to correct the image captured by the camera assembly 30.
  • the light source area in the single integrated image is adjusted based on the light source area detected in the step S20.
  • the light source area adjustment process may be executed after the Bokeh rendering process is executed, or the light source area adjustment process may be executed before the Bokeh rendering process is executed.
  • the captured image the user wants to capture is generated.
  • the image generation process to generate the captured image may include various processes being applied to the first main image and the second main image, other than the Bokeh rendering process and the light source area adjustment process.
  • the image generation process may include a de-mosaic process, a noise reduction process, an auto exposure process, an auto focus process, an auto white balance process, a high dynamic range process, and so on, to generate the captured image.
  • the image generation process may be executed by the main processor 40 and/or the image signal processor 42.
  • a Bokeh rendering circuit for executing the Bokeh rendering process may be placed in the electrical device 10.
  • a light source area adjustment circuit for executing the light source area adjustment process may be placed in the electrical device 10.
  • the electrical device 10 outputs the captured image (Step S24).
  • the electrical device 10 may show the captured image on the display 20.
  • the electrical device 10 may store the captured image in the memory 44.
  • the image capturing process according to the present embodiment is completed.
  • the light source area can be detected based on the auxiliary image captured in the second setting of imaging which is the setting to detect the area emitting strong light. Therefore, it is easy for the electrical device 10 to detect the light source area.
  • the auxiliary image is captured in the second setting of imaging which is, for example, the short exposure, the low ISO sensitivity, the low exposure value or the fast shutter speed. Therefore, the auxiliary image can be obtained without adding any hardware or functions to the electrical device 10. As a result, the manufacturing cost of the electrical device 10 can be suppressed.
  • the first main image and/or the second main image correspond to a first image captured in the first setting of imaging
  • the auxiliary image corresponds to a second image captured in the second setting of imaging
  • the second main image corresponds to a third image.
  • the first main image corresponds to the third image.
  • although the electrical device 10 has two main cameras on the back side thereof in the first embodiment of the present disclosure, the electrical device 10 has one main camera on the back side thereof in a second embodiment of the present disclosure.
  • in the following, only the differences from the first embodiment will be explained.
  • FIG. 7 illustrates the back side view of the electrical device 10 according to the second embodiment of the present disclosure. Also, FIG. 7 is a diagram which corresponds to FIG. 1 in the first embodiment. The front side view of the electrical device 10 according to the second embodiment is substantially the same as FIG. 2 in the first embodiment.
  • the electrical device according to the present embodiment has the camera assembly 30 which includes a main camera 38 but does not include any additional main cameras. That is, the camera assembly 30 of the electrical device 10 according to the present embodiment has one main camera 38 in the back side of the electrical device 10 in order to capture the image in the back side thereof.
  • the electrical device 10 has the sub camera 36 in the same manner as that of the first embodiment. That is, the camera assembly 30 of the electrical device 10 according to the second embodiment also includes one so called in-camera in the front side of the electrical device 10 in order to capture the image in the front side thereof.
  • FIG. 8 illustrates a flowchart of an image capturing process performed by the electrical device 10 according to the second embodiment. Also in the present embodiment, the image capturing process is executed by a collaboration of the main processor 40 and the image signal processor 42. Therefore, the main processor 40 and the image signal processor 42 constitute an image processor in the present embodiment.
  • the electrical device 10 sets the camera assembly 30 to a first setting of imaging (Step S10).
  • the step S10 in the second embodiment is substantially the same as that in the first embodiment.
  • the electrical device 10 captures a main image by the main camera 38 in the first setting of imaging of the camera assembly 30 (Step S30).
  • the step S30 in the second embodiment is substantially the same as the corresponding steps in the first embodiment except that a single main image is captured by the single main camera 38.
  • the electrical device 10 captures an auxiliary image by the main camera 38 in the second setting of imaging to detect an area emitting a light stronger than other areas in the auxiliary image (Step S32).
  • the step S32 in the second embodiment is substantially the same as that in the first embodiment except that the auxiliary image is captured by the main camera 38.
  • the electrical device 10 detects the light source area based on the auxiliary image to detect the light source in the auxiliary image (Step S20).
  • the step S20 in the second embodiment is substantially the same as that in the first embodiment.
  • the electrical device 10 executes an image generation process to generate the captured image (Step S34).
  • the image generation process is executed based on at least the main image captured in the step S30 and the light source area detected in the step S20.
  • the image generation process includes at least a light source area adjustment process based on the detected light source area.
  • the light source area adjustment process is a sort of an image correction process to correct the image captured by the camera assembly 30.
  • the light source area in the main image captured in the step S30 is adjusted based on the light source area detected in the step S20.
  • the captured image is generated.
  • the image generation process to generate the captured image may include various processes being applied to the main image other than the light source area adjustment process.
  • the image generation process may include a de-mosaic process, a noise reduction process, an auto exposure process, an auto focus process, an auto white balance process, a high dynamic range process and so on to generate the captured image.
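The single-camera flow of FIG. 8 can be condensed into the sketch below. The disclosure does not specify how the light source area adjustment process modifies the pixels, so the gain-based attenuation here is only a hypothetical stand-in, and all names are illustrative:

```python
def generate_captured_image(main_image, aux_image, threshold=128, gain=0.8):
    """Second-embodiment sketch: detect light source areas in the
    short-exposure auxiliary image (step S20), then attenuate the matching
    pixels of the main image as a stand-in for the light source area
    adjustment of step S34. Images are equal-sized 2-D lists of
    brightness values on a 0-255 scale."""
    out = []
    for main_row, aux_row in zip(main_image, aux_image):
        out.append([int(m * gain) if a > threshold else m
                    for m, a in zip(main_row, aux_row)])
    return out
```

Because only brightness in the auxiliary image gates the adjustment, the same sketch applies unchanged when the sub camera 36 captures both images, as noted below.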
  • the image generation process is executed by the main processor 40 and/or the image signal processor 42.
  • a light source area adjustment circuit for executing the light source area adjustment process may be placed in the electrical device 10.
  • step S34 in the second embodiment is substantially the same as that in the first embodiment except for the processes mentioned above.
  • the electrical device 10 outputs the captured image (Step S24) .
  • the step S24 in the second embodiment is substantially the same as that in the first embodiment.
  • the image capturing process according to the present embodiment is completed.
  • the light source area can be detected based on the auxiliary image captured in the second setting of imaging which is the setting to detect an area emitting strong light. Therefore, it is easy for the electrical device 10 to detect the light source area.
  • the auxiliary image is captured in the second setting of imaging which is, for example, the short exposure, the low ISO, the low exposure value or the fast shutter speed. Therefore, the auxiliary image can be obtained without requiring any additional functions in the electrical device 10. As a result, the manufacturing costs of the electrical device 10 can be suppressed.
  • the electrical device 10 can capture the auxiliary image and detect the light source area in the auxiliary image to adjust the light source area in the main image. Therefore, the present embodiment can be implemented by the sub camera 36 instead of the main camera 38. In other words, if the sub camera 36 captures the main image and the auxiliary image, the image capturing process in FIG. 8 can be executed based on the main image and the auxiliary image captured by the sub camera 36.
  • the main image corresponds to a first image in the first setting of imaging
  • the auxiliary image corresponds to a second image in the second setting of imaging
  • the terms “first” and “second” are used herein for purposes of description and are not intended to indicate or imply relative importance or significance or to imply the number of indicated technical features.
  • a feature defined with “first” or “second” may comprise one or more of such features.
  • “a plurality of” means two or more than two, unless specified otherwise.
  • the terms “mounted” , “connected” , “coupled” and the like are used broadly, and may be, for example, fixed connections, detachable connections, or integral connections; may also be mechanical or electrical connections; may also be direct connections or indirect connections via intervening structures; may also be inner communications of two elements, which can be understood by those skilled in the art according to specific situations.
  • a structure in which a first feature is “on” or “below” a second feature may include an embodiment in which the first feature is in direct contact with the second feature, and may also include an embodiment in which the first feature and the second feature are not in direct contact with each other, but are contacted via an additional feature formed therebetween.
  • a first feature “on” , “above” or “on top of” a second feature may include an embodiment in which the first feature is right or obliquely “on” , “above” or “on top of” the second feature, or just means that the first feature is at a height higher than that of the second feature; while a first feature “below” , “under” or “on bottom of” a second feature may include an embodiment in which the first feature is right or obliquely “below” , “under” or “on bottom of” the second feature, or just means that the first feature is at a height lower than that of the second feature.
  • Any process or method described in a flow chart or described herein in other ways may be understood to include one or more modules, segments or portions of codes of executable instructions for achieving specific logical functions or steps in the process, and the scope of a preferred embodiment of the present disclosure includes other implementations, in which it should be understood by those skilled in the art that functions may be implemented in a sequence other than the sequences shown or discussed, including in a substantially identical sequence or in an opposite sequence.
  • the logic and/or step described in other manners herein or shown in the flow chart, for example, a particular sequence table of executable instructions for realizing the logical function may be specifically achieved in any computer readable medium to be used by the instruction execution system, device or equipment (such as the system based on computers, the system comprising processors or other systems capable of obtaining the instruction from the instruction execution system, device and equipment and executing the instruction) , or to be used in combination with the instruction execution system, device and equipment.
  • the computer readable medium may be any device adaptive for including, storing, communicating, propagating or transferring programs to be used by or in combination with the instruction execution system, device or equipment.
  • the computer readable medium comprises, but is not limited to: an electronic connection (an electronic device) with one or more wires, a portable computer enclosure (a magnetic device) , a random access memory (RAM) , a read only memory (ROM) , an erasable programmable read-only memory (EPROM or a flash memory) , an optical fiber device and a portable compact disk read-only memory (CDROM) .
  • the computer readable medium may even be paper or another appropriate medium capable of having the programs printed thereon. This is because the paper or other appropriate medium may be optically scanned and then edited, decrypted or processed with other appropriate methods when necessary to obtain the programs electronically, and the programs may then be stored in the computer memories.
  • each part of the present disclosure may be realized by the hardware, software, firmware or their combination.
  • a plurality of steps or methods may be realized by the software or firmware stored in the memory and executed by the appropriate instruction execution system.
  • the steps or methods may be realized by one or a combination of the following techniques known in the art: a discrete logic circuit having a logic gate circuit for realizing a logic function of a data signal, an application-specific integrated circuit having an appropriate combination logic gate circuit, a programmable gate array (PGA) , a field programmable gate array (FPGA) , etc.
  • each function cell of the embodiments of the present disclosure may be integrated in a processing module, or these cells may exist as separate physical entities, or two or more cells may be integrated in a processing module.
  • the integrated module may be realized in a form of hardware or in a form of software function modules. When the integrated module is realized in a form of software function module and is sold or used as a standalone product, the integrated module may be stored in a computer readable storage medium.
  • the storage medium mentioned above may be read-only memories, magnetic disks, CDs, etc.
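The second-embodiment flow summarized in the points above (capture a main image in the user's setting, capture an auxiliary image in a light-detection setting, detect the light source area, then adjust that area in the main image) can be sketched as follows. The function names, the default threshold of 128 and the attenuation factor are illustrative assumptions, not values fixed by the disclosure.

```python
import numpy as np

def capture_and_adjust(capture, user_setting, detection_setting, threshold=128):
    """Sketch of the second-embodiment flow: one camera captures both the
    main image (Step S30, user setting) and the auxiliary image (Step S32,
    a setting in which only strong light sources register)."""
    main_image = capture(user_setting)        # Step S30: normal photograph
    aux_image = capture(detection_setting)    # Step S32: auxiliary image
    light_source_map = aux_image > threshold  # Step S20: detect light source area
    adjusted = main_image.astype(np.float32)
    adjusted[light_source_map] *= 0.7         # Step S34: illustrative attenuation
    return np.clip(adjusted, 0, 255).astype(np.uint8), light_source_map
```

A real implementation would replace the simple attenuation with the disclosure's light source area adjustment process; only the ordering of the steps follows the text.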


Abstract

A method of generating a captured image in an electrical device including a camera assembly according to the embodiments of the present disclosure includes capturing a first image in a first setting of imaging by the camera assembly, wherein the first setting of imaging is a setting of the camera assembly set by a user; capturing a second image in a second setting of imaging by the camera assembly, wherein the second setting of imaging is a setting to detect an area emitting a light stronger than other areas in the second image; and detecting a light source area based on the second image to generate the captured image based on the first image.

Description

METHOD OF GENERATING CAPTURED IMAGE AND ELECTRICAL DEVICE
FIELD
The present disclosure relates to a method of generating a captured image in an electrical device including a camera assembly and such an electrical device including a camera assembly.
BACKGROUND
Electrical devices such as smartphones and tablet terminals are widely used in our daily life. Nowadays, many of the electrical devices are equipped with a camera assembly to capture an image. Some of the electrical devices are portable and are thus easy to carry. Therefore, a user of the electrical device can easily capture an image by using the camera assembly of the electrical device anytime, anywhere.
However, if the user of the electrical device captures an image at night, the captured image may include one or more light source areas which are overexposed or exhibit halation. The overexposed light source areas and the halation areas in the captured image can be subjected to an image correction process after the image is captured. In order to perform the image correction process to adjust the light source areas, it is important to detect the light source areas precisely.
SUMMARY
The present disclosure aims to solve at least one of the technical problems mentioned above. Accordingly, the present disclosure aims to provide a method of generating a captured image and an electrical device.
In accordance with the present disclosure, a method of generating a captured image in an electrical device including a camera assembly may include:
capturing a first image in a first setting of imaging by the camera assembly, wherein the first setting of imaging is a setting of the camera assembly set by a user;
capturing a second image in a second setting of imaging by the camera assembly, wherein the second setting of imaging is a setting to detect an area emitting a light stronger than other areas in the second image, and
detecting a light source area based on the second image to generate the captured image based on the first image.
In some embodiments, the second setting of imaging may be a short exposure in which an exposure time is shorter than the first setting of imaging.
In some embodiments, the exposure time of the short exposure may be short enough not to detect the light from areas other than the light source area in the second image.
In some embodiments, the second setting of imaging may be a low ISO sensitivity which is lower than an ISO sensitivity of the first setting of imaging.
In some embodiments, the low ISO sensitivity may be low enough not to detect the light from areas other than the light source area in the second image.
In some embodiments, the second setting of imaging may be a low exposure value which is lower than an exposure value of the first setting of imaging.
In some embodiments, the low exposure value may be low enough not to detect the light from areas other than the light source area in the second image.
In some embodiments, the low exposure value may be negative.
In some embodiments, the detecting the light source area may include:
comparing a brightness of a certain area in the second image with a threshold value; and
regarding the certain area as the light source area if the brightness of the certain area is larger than the threshold value.
In some embodiments, the method may further include:
generating, based on the detected light source area, a light source map which indicates a place where the light source area is located in the second image.
In some embodiments, the camera assembly may include at least a first camera and a second camera; and
the second image is captured by the first camera or the second camera.
In some embodiments, the first image may be captured by the first camera, and
the method may further include:
capturing a third image by the second camera;
computing a depth map based on the first image and the third image; and
generating the captured image through an image generation process based on at least the first image, the second image, the depth map and the detected light source area.
In some embodiments, the camera assembly may include a main camera in a back side of the electrical device, and
the first image and the second image are captured by the main camera.
In some embodiments, the camera assembly may include a sub camera in a front side of the electrical device, and
the first image and the second image are captured by the sub camera.
In accordance with the present disclosure, an electrical device may include:
a camera assembly configured to capture a first image in a first setting of imaging and a second image in a second setting of imaging, wherein the first setting of imaging is a setting of the camera assembly and the second setting of imaging is a setting to detect an area emitting a light stronger than other areas in the second image, and
an image processor configured to detect a light source area based on the second image to generate the captured image based on the first image.
In some embodiments, the second setting of imaging  may be a short exposure in which an exposure time is shorter than the first setting of imaging.
In some embodiments, the exposure time of the short exposure may be short enough not to detect the light from areas other than the light source area in the second image.
In some embodiments, the second setting of imaging may be a low ISO sensitivity which is lower than an ISO sensitivity of the first setting of imaging.
In some embodiments, the low ISO sensitivity may be low enough not to detect the light from areas other than the light source area in the second image.
In some embodiments, the second setting of imaging may be a low exposure value which is lower than an exposure value of the first setting of imaging.
In some embodiments, the low exposure value may be low enough not to detect the light from areas other than the light source area in the second image.
In some embodiments, the low exposure value may be negative.
In some embodiments, the image processor may be further configured to:
compare a brightness of a certain area in the second image with a threshold value; and
regard the certain area as the light source area if the brightness of the certain area is larger than the threshold value.
In some embodiments, the image processor may be further configured to:
generate, based on the detected light source area, a light source map which indicates a place where the light source area is located in the second image.
In some embodiments, the camera assembly may include at least a first camera and a second camera; and
the second image is captured by the first camera or the second camera.
In some embodiments, the first image may be captured by the first camera, and
the image processor may be further configured to:
capture a third image by the second camera;
compute a depth map based on the first image and the third image; and
generate the captured image through an image generation process based on at least the first image, the second image, the depth map and the detected light source area.
In some embodiments, the camera assembly may include a main camera in a back side of the electrical device, and
the first image and the second image are captured by the main camera.
In some embodiments, the camera assembly may include a sub camera in a front side of the electrical device, and
the first image and the second image are captured by the sub camera.
BRIEF DESCRIPTION OF THE DRAWINGS
These and/or other aspects and advantages of embodiments of the present disclosure will become apparent and more readily appreciated from the following descriptions made with reference to the drawings, in which:
FIG. 1 illustrates a back side view of an electrical device according to a first embodiment of the present disclosure;
FIG. 2 illustrates a front side view of the electrical device according to the first embodiment of the present disclosure;
FIG. 3 illustrates a block diagram of the electrical device according to the first embodiment of the present disclosure;
FIG. 4 illustrates a flowchart of an image capturing process performed by the electrical device according to the first embodiment of the present disclosure;
FIG. 5 shows a first main image captured by a first main camera or a second main image captured by a second main camera in a first setting of imaging;
FIG. 6 shows an auxiliary image captured by the first main camera in a second setting of imaging;
FIG. 7 illustrates a back side view of the electrical device  according to a second embodiment of the present disclosure; and
FIG. 8 illustrates a flowchart of an image capturing process performed by the electrical device according to the second embodiment of the present disclosure.
DETAILED DESCRIPTION
Embodiments of the present disclosure will be described in detail and examples of the embodiments will be illustrated in the accompanying drawings. The same or similar elements and the elements having same or similar functions are denoted by like reference numerals throughout the descriptions. The embodiments described herein with reference to the drawings are explanatory, which aim to illustrate the present disclosure, but shall not be construed to limit the present disclosure.
[First Embodiment]
FIG. 1 illustrates a back side view of an electrical device 10 according to a first embodiment of the present disclosure and FIG. 2 illustrates a front side view of the electrical device 10 according to the first embodiment of the present disclosure.
As shown in FIG. 1 and FIG. 2, the electrical device 10 may include a display 20 and a camera assembly 30. In the present embodiment, the camera assembly 30 includes a first main camera 32, a second main camera 34 and a sub camera 36. The first main camera 32 and the second main camera 34 can capture an image on the back side of the electrical device 10 and the sub camera 36 can capture an image on the front side of the electrical device 10. Therefore, the first main camera 32 and the second main camera 34 are so-called out-cameras, whereas the sub camera 36 is a so-called in-camera. For example, the electrical device 10 can be a mobile phone, a tablet computer, a personal digital assistant, and so on.
The first main camera 32 and the second main camera 34 may have the same performance and/or characteristics or they may have different performance and/or characteristics. For example, in a case where the first main camera 32 and the  second main camera 34 have different performance and/or characteristics from each other, the first main camera 32 may be equipped with a full color image sensor and the second main camera 34 may be equipped with a black-and-white image sensor. Also, the first main camera 32 may be a camera suitable for capturing a still image and the second main camera 34 may be a camera suitable for capturing a moving image. Also, the first main camera 32 may be a camera equipped with a wide-angle lens and the second main camera 34 may be a camera equipped with a telephoto lens.
In the present embodiment, the performance of the sub camera 36 is lower than that of the first main camera 32 and the second main camera 34. However, the performance of the sub camera 36 may be the same as that of the first main camera 32 and the second main camera 34.
Although the electrical device 10 according to the present embodiment has three cameras, the electrical device 10 may have more than three cameras. For example, the electrical device 10 may have three, four, five or more main cameras.
FIG. 3 illustrates a block diagram of the electrical device 10 according to the present embodiment. As shown in FIG. 3, in addition to the display 20 and the camera assembly 30, the electrical device 10 may include a main processor 40, an image signal processor 42, a memory 44, a power supply circuit 46 and a communication circuit 48. The display 20, the camera assembly 30, the main processor 40, the image signal processor 42, the memory 44, the power supply circuit 46 and the communication circuit 48 are connected together via a bus 50.
The main processor 40 executes one or more programs stored in the memory 44. The main processor 40 implements various applications and data processing of the electrical device 10 by executing the programs. The main processor 40 may be one or more computer processors. The main processor 40 is not limited to one CPU core, but it may have a plurality of CPU cores. The main processor 40 may be a main CPU of the electrical device 10, an image processing unit (IPU) or a DSP provided with  the camera assembly 30.
The image signal processor 42 controls the camera assembly 30 and processes the image captured by the camera assembly 30. For example, the image signal processor 42 can execute a de-mosaic process, a noise reduction process, an auto exposure process, an auto focus process, an auto white balance process, a high dynamic range process and so on to the image captured by the camera assembly 30.
In the present embodiment, the main processor 40 and the image signal processor 42 collaborate with each other to obtain the captured image by the camera assembly 30. That is, the main processor 40 and the image signal processor 42 are configured to capture the image by the camera assembly 30 and execute various kinds of image processes for the captured image.
The memory 44 stores a program to be executed by the main processor 40 and various kinds of data. For example, data of the captured image are stored in the memory 44.
The memory 44 may be a high-speed RAM memory, or a non-volatile memory such as a flash memory and a magnetic disk memory.
The power supply circuit 46 may have a battery such as a lithium-ion rechargeable battery (not shown) and a battery management unit (BMU) for managing the battery.
The communication circuit 48 is configured to receive and transmit data to communicate with the Internet or other devices via wireless communication. The wireless communication may adopt any communication standard or protocol, including but not limited to GSM (Global System for Mobile communication) , CDMA (Code Division Multiple Access) , LTE (Long Term Evolution) , LTE-Advanced, 5th generation (5G) .
The communication circuit 48 may include an antenna and a RF (radio frequency) circuit.
FIG. 4 illustrates a flowchart of an image capturing process performed by the electrical device 10 according to the present embodiment. In the present embodiment, the image capturing process is executed by the main processor 40 in  collaboration with the image signal processor 42. Therefore, the main processor 40 and the image signal processor 42 constitute an image processor in the present embodiment.
As shown in FIG. 4, the electrical device 10 sets a first setting of imaging of the camera assembly 30 (Step S10) . In the present embodiment, for example, the electrical device 10 may set a setting of the auto focus, a setting of the auto exposure and a setting of the auto white balance for the camera assembly 30. In addition, the electrical device 10 may set a setting of ISO sensitivity, a setting of an exposure time, a setting of an exposure value, a setting of a shutter speed and so on for the camera assembly 30.
The first setting of imaging of the camera assembly 30 can be set automatically by the electrical device 10. In other words, the user can select an automatic setting for the first setting of imaging of the camera assembly 30. Alternatively, the first setting of imaging of the camera assembly 30 can be previously set by the user. In other words, the user can previously set each item of the first setting of imaging of the camera assembly 30.
Next, the electrical device 10 captures a first main image by the first main camera 32 in the first setting of imaging of the camera assembly 30 (Step S12) and a second main image by the second main camera 34 in the first setting of imaging of the camera assembly 30 (Step S14) .
In the present embodiment, the first main camera 32 and the second main camera 34 capture the first main image and the second main image simultaneously. However, the timing of capturing the first main image by the first main camera 32 and the timing of capturing the second main image by the second main camera 34 may be slightly different.
Next, the electrical device 10 captures an auxiliary image by the first main camera 32 in a second setting of imaging which is a setting to detect an area emitting a light stronger than other areas in the auxiliary image (Step S16) .
FIG. 5 shows the first main image captured by the first  main camera 32 or the second main image captured by the second main camera 34 in the first setting of imaging set in the electrical device 10 by the user, and FIG. 6 shows the auxiliary image captured by the first main camera 32 in the second setting of imaging to detect an area emitting a light.
As shown in FIG. 5, the first main image and the second main image are normal photographs to be captured by the camera assembly 30. That is, the first main image and the second main image are images the user wants to capture. On the other hand, as shown in FIG. 6, the auxiliary image is a special photograph for easy detection of an area emitting strong light. That is, the electrical device 10 can detect the area emitting the strong light which emits a lot of photons directly reaching the camera assembly 30 from the light source.
In the present embodiment, the second setting of imaging of the first main camera 32 may be a short exposure in which an exposure time is shorter than that of the first setting of imaging. In other words, the exposure time of the short exposure is short enough not to detect the light from areas other than the light source area in the auxiliary image.
For example, the electrical device 10 may set the exposure time of the first main camera 32 between 1/8000s and 1/6000s in order to capture the auxiliary image. If the exposure time is shorter than 1/8000s, even the light in the light source area cannot be detected in the auxiliary image, whereas if the exposure time is longer than 1/6000s, the light in the other areas can be also detected in the auxiliary image.
Also, in the present embodiment, the second setting of imaging of the first main camera 32 may be a low ISO sensitivity which is lower than the ISO sensitivity of the first setting of imaging. In other words, the low ISO sensitivity is low enough not to detect the light from areas other than the light source area in the auxiliary image.
For example, the electrical device 10 may set the ISO of the first main camera 32 between 50 and 100. If the ISO sensitivity is lower than 50, even the light in the light source  area cannot be detected in the auxiliary image, whereas if the ISO sensitivity is higher than 100, the light in the other areas can be also detected in the auxiliary image.
Also, in the present embodiment, the second setting of imaging of the first main camera 32 may be a low exposure value (EV) which is lower than an exposure value (EV) of the first setting of imaging. In other words, the low exposure value (EV) is low enough not to detect the light from areas other than the light source area in the auxiliary image. In order to detect the light source area in the auxiliary image, the exposure value (EV) is preferably negative. Here, the exposure value (EV) can be constituted of the shutter speed, the ISO sensitivity and the diaphragm.
For example, the electrical device 10 may set the exposure value (EV) of the first main camera 32 between -4.0 and -3.5. If the exposure value (EV) is lower than -4.0, even the light in the light source area cannot be detected in the auxiliary image, whereas if the exposure value (EV) is higher than -3.5, the light in the other areas can be also detected in the auxiliary image.
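The relationship between these quantities can be made concrete with the conventional photographic definition of EV. This formula is an assumption here: the disclosure only states that EV combines the shutter speed, the ISO sensitivity and the diaphragm, and the negative values in the passage above are best read as relative to the first setting of imaging.

```python
import math

def exposure_value(f_number, shutter_time_s, iso=100):
    """Conventional exposure value: EV = log2(N^2 / t) at ISO 100,
    shifted by log2(ISO / 100) for other sensitivities."""
    return math.log2(f_number ** 2 / shutter_time_s) + math.log2(iso / 100)
```

For instance, under this definition f/1.8 at 1/8000 s and ISO 50 gives roughly EV 13.7, illustrating how a fast shutter and a low ISO together push the exposure well below the first setting.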
Incidentally, the electrical device 10 can also adjust various settings of the first main camera 32 other than the exposure time, the ISO sensitivity, the exposure value and the shutter speed in the second setting of imaging in order to capture the auxiliary image to detect the area emitting strong light.
Moreover, although the first main camera 32 is used for capturing the auxiliary image in the present embodiment, the second main camera 34 may be used for capturing the auxiliary image instead of the first main camera 32.
Furthermore, although the auxiliary image is captured after the first main image and the second main image have been captured in the present embodiment, the auxiliary image may be captured before the first main image and the second main image are captured. In this case, the step S16 is executed before the step S12 and the step S14 are executed.
Next, as shown in FIG. 4, the electrical device 10 computes a depth map based on the first main image and the second main image (Step S18) . More specifically, a position of the first main camera 32 is different from a position of the second main camera 34. Therefore, a viewpoint of the first main image captured by the first main camera 32 is different from a viewpoint of the second main image captured by the second main camera 34. Using the parallax between the first main image and the second main image, the electrical device 10 can generate the depth map which indicates a distance between the electrical device 10 and surfaces of objects in the first main image and the second main image.
For example, the depth map may be computed by the main processor 40 and/or the image signal processor 42. Moreover, the depth map may be computed in the camera assembly 30. Alternatively, a dedicated depth map computation circuit for computing the depth map may be placed in the electrical device 10.
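The triangulation step behind this parallax-based computation can be sketched as follows. This is one standard way to turn stereo disparity into depth; the disclosure does not fix a particular algorithm, and the parameter names are illustrative assumptions.

```python
import numpy as np

def depth_from_disparity(disparity, focal_length_px, baseline_m):
    """With two horizontally offset cameras, depth Z = f * B / d for
    disparity d (pixels), focal length f (pixels) and baseline B (metres),
    where the baseline is the distance between the two cameras."""
    disparity = np.asarray(disparity, dtype=np.float32)
    depth = np.full(disparity.shape, np.inf, dtype=np.float32)
    valid = disparity > 0  # zero disparity corresponds to a point at infinity
    depth[valid] = focal_length_px * baseline_m / disparity[valid]
    return depth
```

In practice the disparity itself would first be estimated by matching corresponding pixels between the first main image and the second main image.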
Next, as shown in FIG. 4, the electrical device 10 detects the light source area based on the auxiliary image to detect the light source in the auxiliary image (Step S20) . In the present embodiment, the electrical device 10 compares a brightness of a certain area in the auxiliary image with a threshold value, and regards the certain area as the light source area if the brightness of the certain area is larger than the threshold value. For example, the electrical device 10 can generate a light source map to indicate where the light source area is located in the auxiliary image.
More specifically, the electrical device 10 compares the brightness of every area in the auxiliary image with the threshold value. In other words, the electrical device 10 compares the brightness of every pixel in the auxiliary image with the threshold value. The threshold value may be stored in the memory 44, set in the program executed by the main processor 40, or set in the image signal processor 42.
For example, in a case where every pixel in the auxiliary  image includes a brightness composed of 256 gradations, the electrical device 10 can determine that the pixel is in the light source area if the brightness value of the pixel is more than 128.
Incidentally, the threshold value is not necessarily set by a certain value but may be set by a percentage. For example, in a case where the maximum brightness is 100% and the minimum brightness is 0%, the threshold value may be set at 50% of the brightness.
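The per-pixel comparison described above can be sketched as follows. The absolute threshold (128 of 256 gradations) and the percentage variant both come directly from the text, while the function names are illustrative.

```python
import numpy as np

def light_source_map(aux_image, threshold=128):
    """Step S20 as described: compare every pixel's brightness in the
    auxiliary image against a threshold and mark the pixel as part of
    the light source area when its brightness is larger."""
    return np.asarray(aux_image) > threshold

def percent_threshold(aux_image, percent=50):
    """Variant from the text: the threshold is given as a percentage of
    the full brightness range rather than as an absolute value."""
    img = np.asarray(aux_image)
    limit = np.iinfo(img.dtype).max * percent / 100.0
    return img > limit
```

The resulting boolean array is one possible representation of the light source map indicating where the light source area is located in the auxiliary image.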
Moreover, although the electrical device 10 detects the light source area in the auxiliary image after the depth map has been computed in the present embodiment, the electrical device 10 may detect the light source area in the auxiliary image before the depth map is computed. In this case, the step S20 is executed before the step S18 is executed.
For example, the light source area may be detected based on the auxiliary image by the main processor 40 and/or the image signal processor 42. Moreover, the light source area may be detected based on the auxiliary image by the camera assembly 30.
Next, as shown in FIG. 4, the electrical device 10 executes an image generation process to generate the captured image (Step S22) . In the present embodiment, the image generation process is executed based on at least the first main image captured in the step S12, the second main image captured in the step S14, the depth map computed in the step S18, and the light source area detected in the step S20.
The image generation process includes at least a Bokeh rendering process based on the depth map. That is, after a single integrated image is generated based on the first main image and the second main image, the Bokeh rendering process is applied to the single integrated image to defocus a background of the single integrated image based on the depth map. In other words, the background of the single integrated image is blurred through the Bokeh rendering process.
In addition, the image generation process includes at least a light source area adjustment process based on the detected light source area. The light source area adjustment process is a sort of an image correction process to correct the image captured by the camera assembly 30. In the present embodiment, after the single integrated image is generated based on the first main image and the second main image, the light source area in the single integrated image is adjusted based on the light source area detected in the step S20.
In the present embodiment, the light source area adjustment process may be executed after the Bokeh rendering process is executed, or the light source area adjustment process may be executed before the Bokeh rendering process is executed.
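The image generation process of the step S22 can be sketched as follows. The patent specifies only the inputs and the fact that the two processes may run in either order; the merging, blur, and adjustment logic below are crude placeholders, and every helper name is an assumption.

```python
import numpy as np

def generate_captured_image(first_main, second_main, depth_map, light_map,
                            apply_adjustment_first=False):
    """Sketch of the image generation process of step S22.

    Inputs are (H, W) float arrays; light_map is the boolean light source
    map from step S20.
    """
    # A single integrated image is generated from the two main images first.
    integrated = (first_main + second_main) / 2.0

    def bokeh(img):
        # Defocus the background: here "background" means every pixel deeper
        # than the median depth, and the attenuation stands in for a real
        # depth-dependent blur kernel.
        background = depth_map > np.median(depth_map)
        out = img.copy()
        out[background] *= 0.5
        return out

    def adjust_light_source(img):
        # Correct the detected light source area, e.g. by attenuating it.
        out = img.copy()
        out[light_map] *= 0.25
        return out

    # The two processes may run in either order, as noted above.
    steps = ([adjust_light_source, bokeh] if apply_adjustment_first
             else [bokeh, adjust_light_source])
    for step in steps:
        integrated = step(integrated)
    return integrated
```

A real implementation would replace the placeholder attenuations with a depth-dependent blur and a glare-aware correction, but the control flow — integrate, then apply the two processes in a configurable order — mirrors the description above.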
When the image generation process is completed, the captured image that the user wants is generated. Of course, the image generation process to generate the captured image may include various processes applied to the first main image and the second main image, other than the Bokeh rendering process and the light source area adjustment process. For example, the image generation process may include a de-mosaic process, a noise reduction process, an auto exposure process, an auto focus process, an auto white balance process, a high dynamic range process, and so on, to generate the captured image.
For example, the image generation process may be executed by the main processor 40 and/or the image signal processor 42. However, a Bokeh rendering circuit for executing the Bokeh rendering process may be placed in the electrical device 10. Similarly, a light source area adjustment circuit for executing the light source area adjustment process may be placed in the electrical device 10.
Next, as shown in FIG. 4, the electrical device 10 outputs the captured image (Step S24) . For example, the electrical device 10 may show the captured image on the display 20. Also, the electrical device 10 may store the captured image in  the memory 44.
After the electrical device 10 has output the captured image, the image capturing process according to the present embodiment is completed.
As explained above, in accordance with the electrical device 10 of the present embodiment, the light source area can be detected based on the auxiliary image captured in the second setting of imaging which is the setting to detect the area emitting strong light. Therefore, it is easy for the electrical device 10 to detect the light source area.
Moreover, the auxiliary image is captured in the second setting of imaging, which is, for example, a short exposure, a low ISO sensitivity, a low exposure value or a fast shutter speed. Therefore, the auxiliary image can be obtained without requiring any additional functions in the electrical device 10. As a result, the manufacturing cost of the electrical device 10 can be suppressed.
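The relationship between the two settings of imaging can be illustrated with a small configuration sketch. The field names and concrete values are assumptions; the patent only names the knobs (exposure time, ISO sensitivity, exposure value), not a data structure or particular numbers.

```python
from dataclasses import dataclass

@dataclass
class ImagingSetting:
    """Hypothetical container for the camera parameters of one capture."""
    exposure_time_s: float
    iso: int
    exposure_value: float

# First setting: chosen by the user, used for the main image(s).
first_setting = ImagingSetting(exposure_time_s=1 / 60, iso=400, exposure_value=0.0)

# Second setting: deliberately underexposed so that only areas emitting
# strong light register in the auxiliary image. Any one of these changes
# (shorter exposure, lower ISO, or a negative exposure value) can realize it.
second_setting = ImagingSetting(exposure_time_s=1 / 2000, iso=50, exposure_value=-3.0)
```

The key property is only the ordering: each parameter of the second setting admits less light than its counterpart in the first setting, so areas other than the light source fall below the detection threshold.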
Incidentally, in the first embodiment, the first main image and/or the second main image correspond to a first image captured in the first setting of imaging, and the auxiliary image corresponds to a second image captured in the second setting of imaging.
Moreover, in the first embodiment, if the first main image corresponds to the first image, the second main image corresponds to a third image. On the other hand, if the second main image corresponds to the first image, the first main image corresponds to the third image.
[Second embodiment]
Although the electrical device 10 has two main cameras on the back side thereof in the first embodiment of the present disclosure, the electrical device 10 has one main camera on the back side thereof in a second embodiment of the present disclosure. Hereinafter, differences from the first embodiment will be explained.
FIG. 7 illustrates the back side view of the electrical  device 10 according to the second embodiment of the present disclosure. Also, FIG. 7 is a diagram which corresponds to FIG. 1 in the first embodiment. The front side view of the electrical device 10 according to the second embodiment is substantially the same as FIG. 2 in the first embodiment.
As shown in FIG. 7, the electrical device 10 according to the present embodiment has the camera assembly 30 which includes a main camera 38 but does not include any additional main cameras. That is, the camera assembly 30 of the electrical device 10 according to the present embodiment has one main camera 38 on the back side of the electrical device 10 in order to capture the image in the back side thereof.
As shown in FIG. 2, also in the second embodiment, the electrical device 10 has the sub camera 36 in the same manner as that of the first embodiment. That is, the camera assembly 30 of the electrical device 10 according to the second embodiment also includes one so-called in-camera on the front side of the electrical device 10 in order to capture the image in the front side thereof.
FIG. 8 illustrates a flowchart of an image capturing process performed by the electrical device 10 according to the second embodiment. Also in the present embodiment, the image capturing process is executed by a collaboration of the main processor 40 and the image signal processor 42. Therefore, the main processor 40 and the image signal processor 42 constitute an image processor in the present embodiment.
As shown in FIG. 8, the electrical device 10 sets the camera assembly 30 to a first setting of imaging of the camera assembly 30 (Step S10) . The step S10 in the second embodiment is substantially the same as that in the first embodiment.
Next, the electrical device 10 captures a main image by the main camera 38 in the first setting of imaging of the camera assembly 30 (Step S30) . The step S30 in the second embodiment is substantially the same as that in the first embodiment except that the single main image is captured by the single main camera 38.
Next, the electrical device 10 captures an auxiliary image by the main camera 38 in the second setting of imaging to detect an area emitting a light stronger than other areas in the auxiliary image (Step S32) . The step S32 in the second embodiment is substantially the same as that in the first embodiment except that the auxiliary image is captured by the main camera 38.
Next, as shown in FIG. 8, the electrical device 10 detects the light source area based on the auxiliary image, thereby locating the light source in the auxiliary image (Step S20) . The step S20 in the second embodiment is substantially the same as that in the first embodiment.
Next, as shown in FIG. 8, the electrical device 10 executes an image generation process to generate the captured image (Step S34) . In the present embodiment, the image generation process is executed based on at least the main image captured in the step S30 and the light source area detected in the step S20.
The image generation process includes at least a light source area adjustment process based on the detected light source area. The light source area adjustment process is a sort of an image correction process to correct the image captured by the camera assembly 30. In the present embodiment, the light source area in the main image captured in the step S30 is adjusted based on the light source area detected in the step S20.
When the image generation process is completed, the captured image is generated. Of course, the image generation process to generate the captured image may include various processes applied to the main image other than the light source area adjustment process. For example, the image generation process may include a de-mosaic process, a noise reduction process, an auto exposure process, an auto focus process, an auto white balance process, a high dynamic range process, and so on, to generate the captured image.
For example, the image generation process is executed by the main processor 40 and/or the image signal processor 42. However, a light source area adjustment circuit for executing the light source area adjustment process may be placed in the electrical device 10.
The step S34 in the second embodiment is substantially the same as that in the first embodiment except for the processes mentioned above.
Next, as shown in FIG. 8, the electrical device 10 outputs the captured image (Step S24) . The step S24 in the second embodiment is substantially the same as that in the first embodiment.
After the electrical device 10 has output the captured image, the image capturing process according to the present embodiment is completed.
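The overall single-camera flow of FIG. 8 can be sketched as a short driver. Every callable and signature here is a hypothetical stand-in for the corresponding step; the patent defines the sequence of steps, not an API.

```python
def image_capturing_process_single_camera(camera, first_setting, second_setting,
                                          detect_light_source_area, adjust):
    """Sketch of the FIG. 8 flow with one main camera.

    camera(setting) returns an image captured in that setting of imaging;
    detect_light_source_area and adjust stand in for steps S20 and S34.
    """
    main_image = camera(first_setting)                     # step S30
    auxiliary_image = camera(second_setting)               # step S32
    light_map = detect_light_source_area(auxiliary_image)  # step S20
    captured = adjust(main_image, light_map)               # step S34 (no depth map)
    return captured                                        # output at step S24
```

Unlike the first embodiment, no depth map or Bokeh rendering is involved: the same camera supplies both images, and only the light source area adjustment is applied.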
As explained above, according to the electrical device 10 of the present embodiment, the light source area can be detected based on the auxiliary image captured in the second setting of imaging which is the setting to detect an area emitting strong light. Therefore, it is easy for the electrical device 10 to detect the light source area.
Moreover, the auxiliary image is captured in the second setting of imaging, which is, for example, a short exposure, a low ISO sensitivity, a low exposure value or a fast shutter speed. Therefore, the auxiliary image can be obtained without requiring any additional functions in the electrical device 10. As a result, the manufacturing cost of the electrical device 10 can be suppressed.
Moreover, even though the electrical device 10 includes the single main camera 38, the electrical device 10 can capture the auxiliary image and detect the light source area in the auxiliary image to adjust the light source area in the main image. Accordingly, the present embodiment can also be implemented with the sub camera 36 instead of the main camera 38. In other words, if the sub camera 36 captures the main image and the auxiliary image, the image capturing process in FIG. 8 can be executed based on the main image and the auxiliary image captured by the sub camera 36.
Incidentally, in the second embodiment, the main image corresponds to a first image in the first setting of imaging, and the auxiliary image corresponds to a second image in the second setting of imaging.
In the description of embodiments of the present disclosure, it is to be understood that terms such as "central" , "longitudinal" , "transverse" , "length" , "width" , "thickness" , "upper" , "lower" , "front" , "rear" , "back" , "left" , "right" , "vertical" , "horizontal" , "top" , "bottom" , "inner" , "outer" , "clockwise" and "counterclockwise" should be construed to refer to the orientation or the position as described or as shown in the drawings under discussion. These relative terms are only used to simplify description of the present disclosure, and do not indicate or imply that the device or element referred to must have a particular orientation, or be constructed or operated in a particular orientation. Thus, these terms cannot be construed to limit the present disclosure.
In addition, terms such as "first" and "second" are used herein for purposes of description and are not intended to indicate or imply relative importance or significance or to imply the number of indicated technical features. Thus, the feature defined with "first" and "second" may comprise one or more of this feature. In the description of the present disclosure, "a plurality of" means two or more than two, unless specified otherwise.
In the description of embodiments of the present disclosure, unless specified or limited otherwise, the terms "mounted" , "connected" , "coupled" and the like are used broadly, and may be, for example, fixed connections, detachable connections, or integral connections; may also be mechanical or electrical connections; may also be direct connections or indirect connections via intervening structures; may also be  inner communications of two elements, which can be understood by those skilled in the art according to specific situations.
In the embodiments of the present disclosure, unless specified or limited otherwise, a structure in which a first feature is "on" or "below" a second feature may include an embodiment in which the first feature is in direct contact with the second feature, and may also include an embodiment in which the first feature and the second feature are not in direct contact with each other, but are contacted via an additional feature formed therebetween. Furthermore, a first feature "on" , "above" or "on top of" a second feature may include an embodiment in which the first feature is right or obliquely "on" , "above" or "on top of" the second feature, or just means that the first feature is at a height higher than that of the second feature; while a first feature "below" , "under" or "on bottom of" a second feature may include an embodiment in which the first feature is right or obliquely "below" , "under" or "on bottom of" the second feature, or just means that the first feature is at a height lower than that of the second feature.
Various embodiments and examples are provided in the above description to implement different structures of the present disclosure. In order to simplify the present disclosure, certain elements and settings are described in the above. However, these elements and settings are only by way of example and are not intended to limit the present disclosure. In addition, reference numbers and/or reference letters may be repeated in different examples in the present disclosure. This repetition is for the purpose of simplification and clarity and does not refer to relations between different embodiments and/or settings. Furthermore, examples of different processes and materials are provided in the present disclosure. However, it would be appreciated by those skilled in the art that other processes and/or materials may be also applied.
Reference throughout this specification to "an embodiment" , "some embodiments" , "an exemplary  embodiment" , "an example" , "a specific example" or "some examples" means that a particular feature, structure, material, or characteristics described in connection with the embodiment or example is included in at least one embodiment or example of the present disclosure. Thus, the appearances of the above phrases throughout this specification are not necessarily referring to the same embodiment or example of the present disclosure. Furthermore, the particular features, structures, materials, or characteristics may be combined in any suitable manner in one or more embodiments or examples.
Any process or method described in a flow chart or described herein in other ways may be understood to include one or more modules, segments or portions of codes of executable instructions for achieving specific logical functions or steps in the process, and the scope of a preferred embodiment of the present disclosure includes other implementations, in which it should be understood by those skilled in the art that functions may be implemented in a sequence other than the sequences shown or discussed, including in a substantially identical sequence or in an opposite sequence.
The logic and/or step described in other manners herein or shown in the flow chart, for example, a particular sequence table of executable instructions for realizing the logical function, may be specifically achieved in any computer readable medium to be used by the instruction execution system, device or equipment (such as the system based on computers, the system comprising processors or other systems capable of obtaining the instruction from the instruction execution system, device and equipment and executing the instruction) , or to be used in combination with the instruction execution system, device and equipment. As to the specification, "the computer readable medium" may be any device adaptive for including, storing, communicating, propagating or transferring programs to be used by or in combination with the instruction execution system, device or equipment. More specific examples of the computer readable medium comprise but are not limited to: an electronic  connection (an electronic device) with one or more wires, a portable computer enclosure (a magnetic device) , a random access memory (RAM) , a read only memory (ROM) , an erasable programmable read-only memory (EPROM or a flash memory) , an optical fiber device and a portable compact disk read-only memory (CDROM) . In addition, the computer readable medium may even be a paper or other appropriate medium capable of printing programs thereon, this is because, for example, the paper or other appropriate medium may be optically scanned and then edited, decrypted or processed with other appropriate methods when necessary to obtain the programs in an electric manner, and then the programs may be stored in the computer memories.
It should be understood that each part of the present disclosure may be realized by the hardware, software, firmware or their combination. In the above embodiments, a plurality of steps or methods may be realized by the software or firmware stored in the memory and executed by the appropriate instruction execution system. For example, if it is realized by the hardware, likewise in another embodiment, the steps or methods may be realized by one or a combination of the following techniques known in the art: a discrete logic circuit having a logic gate circuit for realizing a logic function of a data signal, an application-specific integrated circuit having an appropriate combination logic gate circuit, a programmable gate array (PGA) , a field programmable gate array (FPGA) , etc.
Those skilled in the art shall understand that all or parts of the steps in the above exemplifying method of the present disclosure may be achieved by commanding the related hardware with programs. The programs may be stored in a computer readable storage medium, and the programs comprise one or a combination of the steps in the method embodiments of the present disclosure when run on a computer.
In addition, each function cell of the embodiments of the present disclosure may be integrated in a processing module, or these cells may be separate physical existence, or two or more  cells are integrated in a processing module. The integrated module may be realized in a form of hardware or in a form of software function modules. When the integrated module is realized in a form of software function module and is sold or used as a standalone product, the integrated module may be stored in a computer readable storage medium.
The storage medium mentioned above may be read-only memories, magnetic disks, CD, etc.
Although embodiments of the present disclosure have been shown and described, it would be appreciated by those skilled in the art that the embodiments are explanatory and cannot be construed to limit the present disclosure, and changes, modifications, alternatives and variations can be made in the embodiments without departing from the scope of the present disclosure.

Claims (28)

  1. A method of generating a captured image in an electrical device including a camera assembly, comprising:
    capturing a first image in a first setting of imaging by the camera assembly, wherein the first setting of imaging is a setting of the camera assembly set by a user;
    capturing a second image in a second setting of imaging by the camera assembly, wherein the second setting of imaging is a setting to detect an area emitting a light stronger than other areas in the second image, and
    detecting a light source area based on the second image to generate the captured image based on the first image.
  2. The method according to claim 1, wherein the second setting of imaging is a short exposure in which an exposure time is shorter than the first setting of imaging.
  3. The method according to claim 2, wherein the exposure time of the short exposure is short enough not to detect the light from areas other than the light source area in the second image.
  4. The method according to claim 1, wherein the second setting of imaging is a low ISO sensitivity which is lower than an ISO sensitivity of the first setting of imaging.
  5. The method according to claim 4, wherein the low ISO sensitivity is low enough not to detect the light from areas other than the light source area in the second image.
  6. The method according to claim 1, wherein the second setting of imaging is a low exposure value which is lower than an exposure value of the first setting of imaging.
  7. The method according to claim 6, wherein the low  exposure value is low enough not to detect the light from areas other than the light source area in the second image.
  8. The method according to claim 7, wherein the low exposure value is negative.
  9. The method according to claim 1, wherein the detecting the light source area comprises:
    comparing a brightness of a certain area in the second image with a threshold value; and
    regarding the certain area as the light source area if the brightness of the certain area is larger than the threshold value.
  10. The method according to claim 1, further comprising:
    generating, based on the detected light source area, a light source map which indicates a place where the light source area is located in the second image.
  11. The method according to any one of claims 1-10, wherein
    the camera assembly comprises at least a first camera and a second camera; and
    the second image is captured by the first camera or the second camera.
  12. The method according to claim 11, wherein the first image is captured by one of the first camera and the second camera, and
    the method further comprises:
    capturing a third image by the other of the first camera and the second camera;
    computing a depth map based on the first image and the third image; and
    generating the captured image through an image generation process based on at least the first image, the second image, the depth map and the detected light source area.
  13. The method according to any one of claims 1-10, wherein
    the camera assembly comprises a main camera in a back side of the electrical device, and
    the first image and the second image are captured by the main camera.
  14. The method according to any one of claims 1-10, wherein
    the camera assembly comprises a sub camera in a front side of the electrical device, and
    the first image and the second image are captured by the sub camera.
  15. An electrical device, comprising:
    a camera assembly configured to capture a first image in a first setting of imaging and a second image in a second setting of imaging, wherein the first setting of imaging is a setting of the camera assembly and the second setting of imaging is a setting to detect an area emitting a light stronger than other areas in the second image, and
    an image processor configured to detect a light source area based on the second image to generate a captured image based on the first image.
  16. The electrical device according to claim 15, wherein the second setting of imaging is a short exposure in which an exposure time is shorter than the first setting of imaging.
  17. The electrical device according to claim 16, wherein the exposure time of the short exposure is short enough not to detect the light from areas other than the light source area in the second image.
  18. The electrical device according to claim 15, wherein the second setting of imaging is a low ISO sensitivity which is lower than an ISO sensitivity of the first setting of imaging.
  19. The electrical device according to claim 18, wherein the low ISO sensitivity is low enough not to detect the light from areas other than the light source area in the second image.
  20. The electrical device according to claim 15, wherein the second setting of imaging is a low exposure value which is lower than an exposure value of the first setting of imaging.
  21. The electrical device according to claim 20, wherein the low exposure value is low enough not to detect the light from areas other than the light source area in the second image.
  22. The electrical device according to claim 21, wherein the low exposure value is negative.
  23. The electrical device according to claim 15, wherein the image processor is further configured to:
    compare a brightness of a certain area in the second image with a threshold value; and
    regard the certain area as the light source area if the brightness of the certain area is larger than the threshold value.
  24. The electrical device according to claim 15, wherein the image processor is further configured to:
    generate, based on the detected light source area, a light source map which indicates a place where the light source area is located in the second image.
  25. The electrical device according to any one of claims 15-24, wherein
    the camera assembly comprises at least a first camera and a second camera; and
    the second image is captured by the first camera or the second camera.
  26. The electrical device according to claim 25, wherein
    the first image is captured by one of the first camera and the second camera, and
    the image processor is further configured to:
    capture a third image by the other of the first camera and the second camera;
    compute a depth map based on the first image and the third image; and
    generate the captured image through an image generation process based on at least the first image, the second image, the depth map and the detected light source area.
  27. The electrical device according to any one of claims 15-24, wherein
    the camera assembly comprises a main camera in a back side of the electrical device, and
    the first image and the second image are captured by the main camera.
  28. The electrical device according to any one of claims 15-24, wherein
    the camera assembly comprises a sub camera in a front side of the electrical device, and
    the first image and the second image are captured by the sub camera.
PCT/CN2019/126637 2019-12-19 2019-12-19 Method of generating captured image and electrical device WO2021120107A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2019/126637 WO2021120107A1 (en) 2019-12-19 2019-12-19 Method of generating captured image and electrical device
CN201980102874.9A CN114946170B (en) 2019-12-19 2019-12-19 Method for generating image and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/126637 WO2021120107A1 (en) 2019-12-19 2019-12-19 Method of generating captured image and electrical device

Publications (1)

Publication Number Publication Date
WO2021120107A1 true WO2021120107A1 (en) 2021-06-24

Family

ID=76477035

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/126637 WO2021120107A1 (en) 2019-12-19 2019-12-19 Method of generating captured image and electrical device

Country Status (2)

Country Link
CN (1) CN114946170B (en)
WO (1) WO2021120107A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104427254A (en) * 2013-09-10 2015-03-18 联想(北京)有限公司 Light sensing control method and light sensing control device
CN106952247A (en) * 2017-03-17 2017-07-14 成都通甲优博科技有限责任公司 A kind of dual camera terminal and its image processing method and system
CN107613218A (en) * 2017-09-15 2018-01-19 维沃移动通信有限公司 The image pickup method and mobile terminal of a kind of high dynamic range images
CN107707827A (en) * 2017-11-14 2018-02-16 维沃移动通信有限公司 A kind of high-dynamics image image pickup method and mobile terminal
US20190230334A1 (en) * 2016-06-15 2019-07-25 Sony Corporation Image producing apparatus and image producing method
CN110177221A (en) * 2019-06-25 2019-08-27 维沃移动通信有限公司 The image pickup method and device of high dynamic range images

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9307207B2 (en) * 2013-01-07 2016-04-05 GM Global Technology Operations LLC Glaring reduction for dynamic rearview mirror
CN107809582A (en) * 2017-10-12 2018-03-16 广东欧珀移动通信有限公司 Image processing method, electronic installation and computer-readable recording medium

Also Published As

Publication number Publication date
CN114946170B (en) 2024-04-19
CN114946170A (en) 2022-08-26

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19956363

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19956363

Country of ref document: EP

Kind code of ref document: A1