CN114143421B - Dual-sensor camera system and calibration method thereof - Google Patents
- Publication number
- CN114143421B (application CN202011625552.3A)
- Authority
- CN
- China
- Prior art keywords
- image
- infrared
- color
- sensor
- brightness
- Prior art date
- Legal status (assumption, not a legal conclusion): Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/45—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/64—Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/743—Bracketing, i.e. taking a series of images with varying exposure conditions
Abstract
A dual-sensor camera system and a calibration method thereof. The dual-sensor camera system includes at least one color sensor, at least one infrared sensor, a storage device, and a processor. The processor is configured to load and execute a computer program stored in the storage device to: control the color sensor and the infrared sensor to capture a plurality of color images and a plurality of infrared images of an imaging scene under a plurality of shooting conditions; calculate a plurality of color image parameters of the color images captured under each shooting condition and a plurality of infrared image parameters of the infrared images captured under each shooting condition, so as to compute the difference between the brightness of the color images and the brightness of the infrared images; and determine exposure settings suitable for the color sensor and the infrared sensor based on the computed difference.
Description
Technical Field
The present disclosure relates to an imaging system and method, and more particularly, to a dual-sensor imaging system and a calibration method thereof.
Background
The exposure conditions of a camera (including aperture, shutter speed, and sensitivity) affect the quality of the captured image, so many cameras automatically adjust the exposure conditions while capturing in order to obtain a clear and bright image. However, in high-contrast scenes such as low light or backlight, the automatically adjusted exposure may still produce excessive noise or overexpose some areas, and thus cannot guarantee adequate image quality across all regions of the image.
In view of this, a new image sensor architecture has been adopted in the prior art, which exploits the high photosensitivity of infrared (IR) sensors by interleaving IR pixels among the color pixels of the image sensor to assist brightness detection. For example, fig. 1 is a schematic diagram of conventional image acquisition using such an image sensor. Referring to fig. 1, the conventional image sensor 10 interleaves infrared (I) pixels among its red (R), green (G), and blue (B) pixels. The image sensor 10 can thus combine the color information 12 acquired by the R, G, and B color pixels with the luminance information 14 acquired by the I pixels to obtain an image 16 with suitable color and luminance.
However, under the above single-image-sensor architecture, every pixel in the image sensor shares the same exposure condition, so only an exposure condition suited to either the color pixels or the infrared pixels can be selected for capture; as a result, the characteristics of the two pixel types cannot both be exploited to improve the quality of the captured image.
Disclosure of Invention
The invention provides a dual-sensor camera system and a calibration method thereof, which use independently configured color and infrared sensors to acquire a plurality of images under different shooting conditions, perform image alignment and brightness matching on them, and apply the results to subsequently captured images, thereby improving the image quality of the captured images.
The dual-sensor camera system of the invention comprises at least one color sensor, at least one infrared sensor, a storage device, and a processor coupled to the color sensor, the infrared sensor, and the storage device. The processor is configured to load and execute a computer program stored in the storage device to: control the color sensor and the infrared sensor to capture a plurality of color images and a plurality of infrared images of an imaging scene under a plurality of shooting conditions; calculate a plurality of color image parameters of the color images captured under each shooting condition and a plurality of infrared image parameters of the infrared images captured under each shooting condition, so as to compute the difference between the brightness of the color images and the brightness of the infrared images; and determine exposure settings suitable for the color sensor and the infrared sensor based on the computed difference.
The calibration method of the dual-sensor camera system of the invention is suitable for a dual-sensor camera system comprising at least one color sensor, at least one infrared sensor, and a processor, and comprises the following steps: controlling the color sensor and the infrared sensor to capture a plurality of color images and a plurality of infrared images of an imaging scene under a plurality of shooting conditions; calculating a plurality of color image parameters of the color images captured under each shooting condition and a plurality of infrared image parameters of the infrared images captured under each shooting condition, so as to compute the difference between the brightness of the color images and the brightness of the infrared images; and determining exposure settings suitable for the color sensor and the infrared sensor based on the computed difference.
Based on the above, the dual-sensor image capturing system and calibration method of the invention acquire a plurality of images under different capturing conditions with independently configured color and infrared sensors, and determine exposure and alignment settings suitable for the two sensors from the positional relationship of corresponding pixels in the images and the brightness differences between the images. Image alignment and brightness matching can then be performed on subsequently acquired images, improving the image quality of the captured images.
In order to make the present disclosure more comprehensible, embodiments accompanied with figures are described in detail below.
Drawings
FIG. 1 is a schematic diagram of a prior art image acquisition using an image sensor;
FIG. 2 is a schematic diagram illustrating the use of an image sensor to acquire an image in accordance with an embodiment of the present invention;
FIG. 3 is a block diagram of a dual sensor camera system according to one embodiment of the present invention;
FIG. 4 is a flow chart illustrating a method of calibrating a dual sensor camera system in accordance with an embodiment of the present invention;
FIG. 5 is a flow chart illustrating a method of alignment calibration of a dual sensor camera system in accordance with an embodiment of the present invention;
FIG. 6 is a flow chart of a brightness matching calibration method for a dual sensor camera system according to an embodiment of the present invention;
FIG. 7 is a flowchart illustrating a method of calibrating a dual sensor camera system in accordance with an embodiment of the present invention.
Symbol description
10. 20: image sensor
12: color information
14: luminance information
16: image processing apparatus
22: color sensor
22a: color image
24: infrared sensor
24a: infrared image
26: scene image
30: dual sensor camera system
32: color sensor
34: infrared sensor
36: storage device
38: processor and method for controlling the same
R, G, B, I: pixel arrangement
S402 to S406, S502 to S506, S602 to S606, S702 to S706: step (a)
Detailed Description
Fig. 2 is a schematic diagram illustrating capturing an image using an image sensor in accordance with an embodiment of the present invention. Referring to fig. 2, the image sensor 20 of this embodiment adopts a dual-sensor architecture in which a color sensor 22 and an infrared (IR) sensor 24 are configured independently. Exploiting the respective characteristics of the color sensor 22 and the infrared sensor 24, each sensor acquires a plurality of images under exposure conditions suited to the current imaging scene. A color image 22a and an infrared image 24a with appropriate exposure are then selected, and image fusion uses the infrared image 24a to supply the texture details lacking in the color image 22a, thereby obtaining a scene image 26 with good color and texture detail.
Fig. 3 is a block diagram of a dual sensor camera system according to an embodiment of the present invention. Referring to fig. 3, the dual-sensor camera system 30 of the present embodiment can be configured in an electronic device such as a mobile phone, a tablet computer, a notebook computer, a navigation device, a driving recorder, a digital camera, a digital video camera, etc. for providing a camera function. The dual sensor camera system 30 includes at least one color sensor 32, at least one infrared sensor 34, a memory device 36, and a processor 38, the functions of which are as follows:
the color sensor 32 may, for example, comprise a charge coupled device (Charge Coupled Device, CCD), a complementary metal oxide semiconductor (Complementary Metal-Oxide Semiconductor, CMOS) device, or other type of photosensitive device, and may sense light intensity to produce an image of the camera scene. The color sensor 32 is, for example, a red, green and blue (RGB) image sensor, which includes red (R), green (G) and blue (B) color pixels, and is configured to acquire color information such as red light, green light and blue light in the imaging scene, and combine the color information to generate a color image of the imaging scene.
The infrared sensor 34 includes, for example, a CCD, a CMOS device, or other kind of photosensitive device, which is capable of sensing infrared light by adjusting a wavelength sensing range of the photosensitive device. The infrared sensor 34 acquires infrared light information in the imaging scene using the above-described photosensitive device as a pixel, for example, and synthesizes the infrared light information to generate an infrared image of the imaging scene.
The storage device 36 is, for example, any type of fixed or removable random access memory (RAM), read-only memory (ROM), flash memory, hard disk, or the like, or a combination thereof, and stores a computer program executable by the processor 38. In some embodiments, the storage device 36 may also store the color images acquired by the color sensor 32 and the infrared images acquired by the infrared sensor 34.
The processor 38 is, for example, a central processing unit (CPU), or another programmable general-purpose or special-purpose microprocessor, microcontroller, digital signal processor (DSP), programmable controller, application-specific integrated circuit (ASIC), programmable logic device (PLD), or other similar device, or a combination of such devices; the invention is not limited in this regard. In this embodiment, the processor 38 loads a computer program from the storage device 36 to perform the calibration method of the dual-sensor camera system according to the embodiments of the present invention.
Because the color sensor 32 and the infrared sensor 34 differ in characteristics such as resolution, wavelength range, and field of view (FOV), the embodiments of the present invention provide a calibration method that calibrates the color sensor 32 and the infrared sensor 34 mounted on the dual-sensor image capturing system 30 during the production stage, so as to balance their differences under different capturing conditions. The calibration results are stored in the storage device 36 and serve as a basis for adjusting the images acquired at subsequent run time.
Fig. 4 is a flowchart illustrating a calibration method of a dual sensor camera system according to an embodiment of the present invention. Referring to fig. 3 and fig. 4, the method of the present embodiment is suitable for the dual-sensor camera system 30, and is suitable for calibrating the color sensor 32 and the infrared sensor 34 of the dual-sensor camera system 30 in the production stage, and the following describes the detailed steps of the calibration method of the present embodiment with respect to each device of the dual-sensor camera system 30.
In step S402, at least one color sensor 32 and at least one infrared sensor 34 are mounted on the dual-sensor camera system 30, for example by a robotic arm, in the manner of the color sensor 22 and the infrared sensor 24 mounted on the image sensor 20 shown in fig. 2.
In step S404, calibration of the alignment between the color sensor 32 and the infrared sensor 34 is performed by the processor 38. The processor 38 executes an image alignment algorithm, such as a brute-force method, an optical flow method, a homography method, or a local warping method, to align the color image acquired by the color sensor 32 with the infrared image acquired by the infrared sensor 34; detailed embodiments are described later.
In step S406, the processor 38 performs calibration of the brightness matching of the color sensor 32 and the infrared sensor 34 under different photographing conditions. The processor 38 calculates differences between the color image and the infrared image acquired under different photographing conditions, so as to determine exposure settings suitable for the color sensor 32 and the infrared sensor 34, and detailed embodiments thereof will be described later.
For the alignment calibration described above, fig. 5 is a flowchart of an alignment calibration method of the dual sensor camera system according to an embodiment of the present invention. Referring to fig. 3 and 5, the method of the present embodiment is applicable to the dual-sensor imaging system 30, and the following describes the detailed steps of the alignment calibration method of the present embodiment with each device of the dual-sensor imaging system 30.
In step S502, the processor 38 controls the color sensor 32 and the infrared sensor 34 to each capture a test chart bearing a special pattern, so as to obtain a color test image and an infrared test image. The special pattern is, for example, a black-and-white checkerboard pattern, or any other pattern whose features can be clearly distinguished, but is not limited thereto.
In step S504, a plurality of feature points of the special pattern in the color test image and the infrared test image are detected by the processor 38.
In some embodiments, the processor 38 may, for example, cut each of the color test image and the infrared test image into a plurality of blocks and execute a feature detection algorithm, such as Harris corner detection, to detect at least one feature point within each block. The processor 38 determines the number of feature points to detect per block based on, for example, its own computing power. In some embodiments, the processor 38 may instead select edge pixels in each block, or pixels with high local variance in each block, as the detected feature points, without limitation.
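As an illustrative sketch of this block-wise feature detection, the following uses local variance as the per-pixel response (a simple stand-in for the Harris corner strength mentioned above); the function name, grid size, and window size are assumptions, not values from the patent:

```python
import numpy as np

def detect_block_features(img, grid=(4, 4), win=3):
    # For each block, pick the pixel whose surrounding (2*win+1)^2 window
    # has the highest local variance. Grid and window sizes are
    # illustrative assumptions.
    h, w = img.shape
    bh, bw = h // grid[0], w // grid[1]
    points = []
    for by in range(grid[0]):
        for bx in range(grid[1]):
            block = img[by * bh:(by + 1) * bh, bx * bw:(bx + 1) * bw]
            best, best_var = None, -1.0
            for y in range(win, block.shape[0] - win):
                for x in range(win, block.shape[1] - win):
                    v = block[y - win:y + win + 1, x - win:x + win + 1].var()
                    if v > best_var:
                        best_var = v
                        best = (by * bh + y, bx * bw + x)  # global coordinates
            points.append(best)
    return points
```

In practice the Harris response would replace the raw variance, but the block-partitioning logic is the same.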
In step S506, the processor 38 executes an image alignment algorithm to calculate a matching relationship between the color test image and the infrared test image according to the positional relationship between corresponding feature points in the two images, so as to align subsequently acquired color and infrared images. When performing the image alignment, the processor 38 may, for example, take all the feature points in the color image and all the feature points in the infrared image and run the image alignment algorithm over them.
In some embodiments, when the captured scene is planar, the processor 38 searches the infrared test image for the feature point corresponding to each specified feature point detected in the color test image by moving a patch comprising a plurality of pixels around the pixel location corresponding to that specified feature point. The processor 38 compares the pixels inside the patch with the pixels around the specified feature point in the color test image until they match (for example, the sum of the differences of the pixel values of all pixels is less than a predetermined threshold). Finally, the processor 38 may determine the center pixel of the patch at the matching position as the corresponding feature point of the specified feature point. The processor 38 repeats this matching until the correspondences of all feature points are obtained.
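A minimal sketch of this patch-matching step might look as follows, using the sum of absolute differences (SAD) as the pixel-difference measure; the function name, patch radius, and search radius are illustrative assumptions, since the patent does not fix them:

```python
import numpy as np

def find_corresponding_point(color_img, ir_img, pt, patch=3, search=5):
    # Compare the patch around pt in the color image against patches in a
    # (2*search+1)^2 neighbourhood of the same location in the IR image,
    # returning the center of the best-matching patch.
    y, x = pt
    ref = color_img[y - patch:y + patch + 1, x - patch:x + patch + 1].astype(np.int32)
    best, best_sad = None, np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cy, cx = y + dy, x + dx
            cand = ir_img[cy - patch:cy + patch + 1,
                          cx - patch:cx + patch + 1].astype(np.int32)
            sad = np.abs(ref - cand).sum()  # sum of pixel-value differences
            if sad < best_sad:
                best_sad, best = sad, (cy, cx)
    return best
```

A thresholded variant would return no match when `best_sad` exceeds the predetermined threshold mentioned in the text.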
The processor 38 may then, for example, execute a random sample consensus (RANSAC) algorithm to build a homography transformation matrix of the form:

x' = (a·x + b·y + c) / (g·x + h·y + 1)
y' = (d·x + e·y + f) / (g·x + h·y + 1)

wherein (x, y) represents the position of the designated feature point in the color test image, (x', y') represents the position of the corresponding feature point in the infrared test image, and a to h are the variables to be solved. The processor 38 may substitute the positions of the feature points in the color test image and of the corresponding feature points in the infrared test image into the homography above to solve for these variables, and use the solution as the matching relationship between the color test image and the infrared test image.
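The homography solve can be sketched as follows, fitting the eight variables a through h by direct linear transform over the matched feature points; the RANSAC outlier-rejection loop named in the text is omitted for brevity, and all function names here are illustrative:

```python
import numpy as np

def fit_homography(src, dst):
    # Solve the eight variables a-h of the homography
    #   x' = (a*x + b*y + c) / (g*x + h*y + 1)
    #   y' = (d*x + e*y + f) / (g*x + h*y + 1)
    # by direct linear transform over >= 4 point correspondences.
    rows, rhs = [], []
    for (x, y), (xp, yp) in zip(src, dst):
        rows.append([x, y, 1.0, 0.0, 0.0, 0.0, -xp * x, -xp * y]); rhs.append(xp)
        rows.append([0.0, 0.0, 0.0, x, y, 1.0, -yp * x, -yp * y]); rhs.append(yp)
    p, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    a, b, c, d, e, f, g, h = p
    return np.array([[a, b, c], [d, e, f], [g, h, 1.0]])

def apply_homography(H, pt):
    # Map a color-image point (x, y) to its infrared-image position (x', y').
    v = H @ np.array([pt[0], pt[1], 1.0])
    return v[0] / v[2], v[1] / v[2]
```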
In some embodiments, when the photographed scene contains multiple depths, the parallax between the color sensor 32 and the infrared sensor 34 causes disparities between the acquired images, so the matching relationship must be calculated separately for image planes at different depths. In this case, the processor 38 calculates a plurality of depths of the imaging scene using the color test image and the infrared test image, so as to divide the imaging scene into a plurality of depth scenes of different depths (for example, a near view and a far view). For each depth scene, the processor 38 establishes a quadratic equation relating the position (x, y) of a designated feature point in the color test image to the position (x', y') of the corresponding feature point in the infrared test image, with a to f as the variables to be solved. The processor 38 may, for example, substitute the positions of the feature points in the color test image and of the corresponding feature points in the infrared test image into this quadratic equation to solve for the variables, and use the solution as the matching relationship between the color test image and the infrared test image.
On the other hand, for the above-mentioned calibration of brightness matching, fig. 6 is a flowchart of a brightness matching calibration method of a dual-sensor camera system according to an embodiment of the present invention. Referring to fig. 3 and fig. 6, the method of the present embodiment is applicable to the dual-sensor image capturing system 30, and the following describes the detailed steps of the brightness matching calibration method of the present embodiment with each device of the dual-sensor image capturing system 30.
In step S602, the processor 38 controls the color sensor 32 and the infrared sensor 34 to acquire a plurality of color images and a plurality of infrared images of the imaging scene, respectively, under a plurality of shooting conditions. The shooting conditions include, for example, one or a combination of the wavelength range and brightness of the ambient light and the distances of the subject and the background in the imaging scene, but are not limited thereto.
In step S604, a plurality of color image parameters of the color image acquired under each photographing condition and a plurality of infrared image parameters of the infrared image acquired under each photographing condition are calculated by the processor 38, and used to calculate the difference between the brightness of the color image and the brightness of the infrared image.
In some embodiments, for images captured under different shooting conditions, the processor 38 may, for example, calculate the 3A statistics (auto focus (AF), auto exposure (AE), and auto white balance (AWB)) of each image and use them to calculate the differences (e.g., differences or ratios) between the image brightnesses.
In some embodiments, the processor 38 may, for example, divide each color image and each infrared image into a plurality of blocks and calculate the average pixel value of all pixels in each block, and then calculate the differences between the averages of corresponding blocks as the difference between the brightness of the color image and the brightness of the infrared image.
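A minimal sketch of this block-average comparison, with an assumed 4-by-4 grid (the patent does not fix the number of blocks):

```python
import numpy as np

def block_brightness_diff(color_luma, ir_img, grid=(4, 4)):
    # Split both images into grid blocks, average the pixels of each
    # block, and return the per-block brightness differences (IR minus
    # color). Grid size is an illustrative assumption.
    h, w = color_luma.shape
    bh, bw = h // grid[0], w // grid[1]
    diffs = np.empty(grid)
    for by in range(grid[0]):
        for bx in range(grid[1]):
            c = color_luma[by * bh:(by + 1) * bh, bx * bw:(bx + 1) * bw].mean()
            i = ir_img[by * bh:(by + 1) * bh, bx * bw:(bx + 1) * bw].mean()
            diffs[by, bx] = i - c
    return diffs
```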
In some embodiments, the processor 38 calculates, for example, the image histograms of each color image and each infrared image, and then calculates the difference between the image histograms of a color image and an infrared image as the difference between their brightnesses.
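The histogram-based measure might be sketched as follows; the bin count and the use of the L1 distance between histograms are assumptions, as the patent leaves the metric unspecified:

```python
import numpy as np

def histogram_difference(color_luma, ir_img, bins=32):
    # Build intensity histograms for both images over the 8-bit range and
    # use the sum of absolute bin differences as the brightness-difference
    # measure. Bin count and L1 metric are illustrative choices.
    hc, _ = np.histogram(color_luma, bins=bins, range=(0, 256))
    hi, _ = np.histogram(ir_img, bins=bins, range=(0, 256))
    return np.abs(hc - hi).sum()
```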
Returning to the flow of fig. 6, in step S606, the processor 38 determines exposure settings appropriate for the color sensor 32 and the infrared sensor 34 based on the calculated difference.
In some embodiments, to achieve frame synchronization, the processor 38 controls, for example, the color sensor 32 and the infrared sensor 34 to acquire color images and infrared images of the captured scene using the same exposure time, and calculates the brightness difference between the color images and the infrared images, thereby deriving a gain for adjusting the brightness of the color images and/or the infrared images. That is, the processor 38 calculates a gain that compensates for the brightness difference between the color image and the infrared image, which may be a gain applied to the color image, a gain applied to the infrared image, or both, without limitation.
For example, if the color image obtained with the same exposure time is brighter, a gain for the infrared image may be calculated such that, after multiplication by this gain, the brightness of the infrared image is equivalent to that of the color image. The calculated gains are stored in the storage device 36 together with their corresponding shooting conditions, so that at subsequent run time, whenever a color image and an infrared image are acquired, the processor 38 can identify the shooting conditions, obtain the gain for those conditions from the storage device 36, and multiply the pixel values of the acquired color image or infrared image by that gain, so that the brightness of the color image matches the brightness of the infrared image.
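A minimal sketch of computing and applying such a gain on the infrared side (the text equally allows a gain on the color side, or both; the mean-ratio formulation here is an illustrative assumption):

```python
import numpy as np

def brightness_gain(color_luma, ir_img):
    # Gain that, multiplied into the IR image, matches its mean
    # brightness to that of the color image.
    return color_luma.mean() / ir_img.mean()

def apply_gain(img, gain):
    # Scale pixel values and clamp back to the 8-bit range.
    return np.clip(img.astype(np.float64) * gain, 0, 255)
```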
In some embodiments, an infrared projector (IR projector) may be additionally provided in the dual-sensor camera system 30 to assist the processor 38 in calculating the distance between the dual-sensor camera system 30 and the subject and background in the camera scene in conjunction with the infrared sensor 34.
In detail, fig. 7 is a flowchart of a calibration method of a dual sensor camera system according to an embodiment of the present invention. Referring to fig. 3 and fig. 7, the method of the present embodiment is applicable to the dual-sensor imaging system 30, and the following describes the detailed steps of the calibration method of the present embodiment with respect to each device of the dual-sensor imaging system 30.
In step S702, the processor 38 controls the infrared projector to project invisible light having a special pattern to the imaging scene.
In step S704, the processor 38 controls two of the infrared sensors 34 to acquire a plurality of infrared images of the imaging scene having the special pattern, respectively.
In step S706, the processor 38 calculates the distances between the dual-sensor imaging system 30 and the subject and the background in the imaging scene based on the special pattern in the acquired infrared images and the parallax between the two infrared sensors. Since the special pattern projected onto the imaging scene by the infrared projector is not easily affected by the environment, the distances of the photographed subject and/or the background can be obtained accurately in this way, and the shooting conditions thus identified serve as the basis for subsequent image compensation.
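The distance computation rests on the standard stereo triangulation relation Z = f·B/d, where f is the focal length in pixels, B the baseline between the two infrared sensors, and d the disparity of the projected pattern between the two infrared images; the numbers below are illustrative, not values from the patent:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    # Classic stereo relation: depth Z = f * B / d. Units: focal length
    # in pixels, baseline in meters, disparity in pixels -> depth in meters.
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```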
In summary, the dual-sensor image capturing system and the calibration method thereof according to the present invention respectively acquire a plurality of images by using different capturing conditions for a color sensor and an infrared sensor disposed on the dual-sensor image capturing system, so as to calibrate the image alignment and brightness matching of the color sensor and the infrared sensor, and use the calibration result as a basis for adjusting the subsequently acquired images. Therefore, the image quality of the image acquired by the dual-sensor imaging system can be improved.
While the present disclosure has been described with reference to the exemplary embodiments, it should be understood that the invention is not limited thereto, but may be embodied with other specific forms and modifications without departing from the spirit or scope of the present disclosure.
Claims (20)
1. A dual sensor camera system comprising:
at least one color sensor;
at least one infrared sensor;
a storage device storing a computer program; and
a processor, coupled to the at least one color sensor, the at least one infrared sensor, and the storage device, configured to load and execute the computer program to:
controlling the at least one color sensor and the at least one infrared sensor to acquire a plurality of color images and a plurality of infrared images of a shooting scene respectively by adopting a plurality of shooting conditions;
calculating a plurality of color image parameters of the color image acquired under each of the photographing conditions and a plurality of infrared image parameters of the infrared image acquired under each of the photographing conditions to calculate a difference between a brightness of the color image and a brightness of the infrared image; and
according to the calculated difference between the brightness of the color image and the brightness of the infrared image, exposure settings suitable for the at least one color sensor and the at least one infrared sensor are determined so that the brightness of the color image is matched with the brightness of the infrared image.
2. The dual sensor camera system of claim 1, wherein the operations performed by the processor comprise:
cutting each color image and each infrared image into a plurality of blocks, and calculating the average pixel value of all pixels in each block; and
a difference in the pixel value averages of the corresponding blocks is calculated as a difference between the brightness of the color image and the brightness of the infrared image.
3. The dual sensor camera system of claim 1, wherein the operations performed by the processor comprise:
calculating an image histogram of each of the color images and each of the infrared images; and
a difference in the image histogram of the color image and the infrared image is calculated as a difference between the brightness of the color image and the brightness of the infrared image.
4. The dual sensor camera system of claim 1, wherein the processor is further configured to:
control the at least one color sensor and the at least one infrared sensor to acquire the color images and the infrared images of the shooting scene, respectively, using the same exposure time; and
calculate, based on the calculated difference between the brightness of the color images and the brightness of the infrared images, a gain for adjusting the brightness of the color images or the brightness of the infrared images.
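With identical exposure times as in claim 4, the remaining brightness mismatch can be compensated by a multiplicative gain. A minimal sketch, assuming the gain is simply the ratio of mean brightnesses and is applied to the infrared image (both assumptions are illustrative):

```python
import numpy as np

def match_brightness(color_luma, infrared):
    """Compute a gain from the mean-brightness ratio and apply it to the
    infrared image so both images end up with matching brightness."""
    gain = color_luma.mean() / infrared.mean()
    matched = np.clip(infrared * gain, 0, 255)
    return gain, matched
```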
5. The dual sensor camera system of claim 1, wherein the processor is further configured to:
control the at least one color sensor and the at least one infrared sensor to respectively capture a test target bearing a special pattern, so as to obtain a color test image and an infrared test image;
detect a plurality of feature points of the special pattern in the color test image and the infrared test image; and
execute an image alignment algorithm to calculate a matching relationship between the color test image and the infrared test image according to the positional relationship between corresponding feature points in the color test image and the infrared test image, so as to align color images and infrared images acquired subsequently.
6. The dual sensor camera system of claim 5, wherein the processor is further configured to:
divide the color test image and the infrared test image into a plurality of blocks, and execute a feature detection algorithm to detect at least one feature point in each of the blocks, wherein the feature detection algorithm comprises the Harris corner detection method.
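The per-block Harris detection of claim 6 could look roughly like the following NumPy sketch. The 3x3 box window, the Harris constant k = 0.04, and keeping only the single strongest corner per block are illustrative simplifications, not details taken from the patent.

```python
import numpy as np

def _box3(a):
    """3x3 box sum (the structure-tensor window), zero-padded."""
    p = np.pad(a, 1)
    h, w = a.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

def harris_response(img, k=0.04):
    """Harris corner response R = det(M) - k*trace(M)^2, where M is the
    3x3-windowed gradient structure tensor."""
    img = img.astype(np.float64)
    iy, ix = np.gradient(img)
    sxx, syy, sxy = _box3(ix * ix), _box3(iy * iy), _box3(ix * iy)
    det = sxx * syy - sxy * sxy
    trace = sxx + syy
    return det - k * trace ** 2

def strongest_corner_per_block(img, grid=(2, 2)):
    """Divide the image into blocks and return, for each block, the image
    coordinates of the pixel with the largest Harris response."""
    r = harris_response(img)
    h, w = img.shape
    gy, gx = grid
    corners = []
    for i in range(gy):
        for j in range(gx):
            y0, x0 = i * h // gy, j * w // gx
            block = r[y0:(i + 1) * h // gy, x0:(j + 1) * w // gx]
            dy, dx = np.unravel_index(np.argmax(block), block.shape)
            corners.append((y0 + dy, x0 + dx))
    return corners
```

Forcing at least one detection per block, as the claim does, spreads the feature points over the whole test image, which tends to stabilize the alignment estimated in claim 7.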
7. The dual sensor camera system of claim 5, wherein the processor is further configured to:
for a specified feature point among the feature points detected in the color test image, move a patch comprising a plurality of pixels, centered on the pixel in the infrared test image that corresponds to the specified feature point, to search the infrared test image for the feature point corresponding to the specified feature point; and
execute a random sample consensus (RANSAC) algorithm to establish a homography transformation matrix, and substitute the positions of the feature points in the color test image and the corresponding feature points in the infrared test image into the homography transformation matrix to solve for it, the solution being used as the matching relationship between the color test image and the infrared test image.
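The RANSAC homography step of claim 7 can be sketched with a toy NumPy implementation: fit a homography to four random correspondences via the direct linear transform (DLT), keep the model with the most inliers, then refit on all inliers. The iteration count, inlier threshold, and DLT formulation are illustrative choices, not the patent's parameters.

```python
import numpy as np

def homography_dlt(src, dst):
    """Direct linear transform: solve for the 3x3 homography H mapping
    src points to dst points (least squares via SVD)."""
    a = []
    for (x, y), (u, v) in zip(src, dst):
        a.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        a.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(a, dtype=np.float64))
    h = vt[-1].reshape(3, 3)
    return h / h[2, 2]

def project(h, pts):
    """Apply homography h to an (n, 2) array of points."""
    p = np.hstack([pts, np.ones((len(pts), 1))]) @ h.T
    return p[:, :2] / p[:, 2:3]

def ransac_homography(src, dst, iters=200, thresh=2.0, seed=0):
    """Toy RANSAC: repeatedly fit H to 4 random correspondences, keep the
    model with the most inliers, then refit H on all inliers."""
    rng = np.random.default_rng(seed)
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    best_inliers = None
    for _ in range(iters):
        idx = rng.choice(len(src), 4, replace=False)
        h = homography_dlt(src[idx], dst[idx])
        err = np.linalg.norm(project(h, src) - dst, axis=1)
        inliers = err < thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return homography_dlt(src[best_inliers], dst[best_inliers])
```

The random sampling makes the estimate robust to the occasional mismatched feature pair produced by the patch search of the first step.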
8. The dual sensor camera system of claim 5, wherein the processor is further configured to:
calculate a plurality of depths of the shooting scene using the color test image and the infrared test image, so as to divide the shooting scene into a plurality of depth scenes; and
establish a quadratic equation for each of the depth scenes, and substitute the positions of the feature points in the color test image and the corresponding feature points in the infrared test image in each depth scene into the corresponding quadratic equation to solve it, the solution being used as the matching relationship between the color test image and the infrared test image.
9. The dual sensor camera system of claim 5, wherein the image alignment algorithm comprises a brute-force method, an optical flow method, a homography transformation method, or a local warping method.
10. The dual sensor camera system of claim 1, further comprising an infrared projector, wherein the processor is further configured to:
control the infrared projector to project invisible light bearing a special pattern onto the shooting scene;
control two infrared sensors among the at least one infrared sensor to respectively acquire a plurality of infrared images of the shooting scene bearing the special pattern; and
calculate the distances between the dual sensor camera system and a subject and a background in the shooting scene, respectively, according to the special pattern in the acquired infrared images and the disparity between the two infrared sensors.
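The distance calculation of claim 10 presumably follows the standard pinhole stereo model, where depth is inversely proportional to disparity: Z = f·B/d, with focal length f in pixels, baseline B between the two infrared sensors in meters, and disparity d in pixels. A minimal sketch under that assumption:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Pinhole stereo model Z = f * B / d: depth in meters from the
    disparity (pixels), focal length (pixels) and baseline (meters)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

For example, with a 1000-pixel focal length and a 5 cm baseline, a 50-pixel disparity corresponds to a depth of 1 m; the projected pattern gives the matcher texture to measure that disparity even on plain surfaces.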
11. A method of calibrating a dual sensor camera system comprising at least one color sensor, at least one infrared sensor, and a processor, the method comprising the steps of:
controlling the at least one color sensor and the at least one infrared sensor to acquire a plurality of color images and a plurality of infrared images of a shooting scene, respectively, under a plurality of shooting conditions;
calculating a plurality of color image parameters of the color images acquired under each of the shooting conditions and a plurality of infrared image parameters of the infrared images acquired under each of the shooting conditions, so as to calculate a difference between the brightness of the color images and the brightness of the infrared images; and
determining, according to the calculated difference between the brightness of the color images and the brightness of the infrared images, exposure settings suitable for the at least one color sensor and the at least one infrared sensor, so that the brightness of the color images matches the brightness of the infrared images.
12. The method of claim 11, wherein the step of calculating a difference between the brightness of the color image and the brightness of the infrared image comprises:
dividing each of the color images and each of the infrared images into a plurality of blocks, and calculating an average pixel value of all pixels in each of the blocks; and
calculating a difference between the average pixel values of corresponding blocks as the difference between the brightness of the color images and the brightness of the infrared images.
13. The method of claim 11, wherein the step of calculating a difference between the brightness of the color image and the brightness of the infrared image comprises:
calculating an image histogram of each of the color images and each of the infrared images; and
calculating a difference between the image histograms of the color images and the infrared images as the difference between the brightness of the color images and the brightness of the infrared images.
14. The method of claim 11, further comprising:
controlling the at least one color sensor and the at least one infrared sensor to acquire the color images and the infrared images of the shooting scene, respectively, using the same exposure time; and
calculating, based on the calculated difference between the brightness of the color images and the brightness of the infrared images, a gain for adjusting the brightness of the color images or the brightness of the infrared images.
15. The method of claim 11, further comprising:
controlling the at least one color sensor and the at least one infrared sensor to respectively capture a test target bearing a special pattern, so as to obtain a color test image and an infrared test image;
detecting a plurality of feature points of the special pattern in the color test image and the infrared test image; and
executing an image alignment algorithm to calculate a matching relationship between the color test image and the infrared test image according to the positional relationship between corresponding feature points in the color test image and the infrared test image, so as to align color images and infrared images acquired subsequently.
16. The method of claim 15, wherein detecting a plurality of feature points of the special pattern in the color test image and the infrared test image comprises:
dividing the color test image and the infrared test image into a plurality of blocks, and executing a feature detection algorithm to detect at least one feature point in each of the blocks, wherein the feature detection algorithm comprises the Harris corner detection method.
17. The method of claim 15, wherein the step of calculating a matching relationship between the color test image and the infrared test image comprises:
for a specified feature point among the feature points detected in the color test image, moving a patch comprising a plurality of pixels, centered on the pixel in the infrared test image that corresponds to the specified feature point, to search the infrared test image for the feature point corresponding to the specified feature point; and
executing a random sample consensus (RANSAC) algorithm to establish a homography transformation matrix, and substituting the positions of the feature points in the color test image and the corresponding feature points in the infrared test image into the homography transformation matrix to solve for it, the solution being used as the matching relationship between the color test image and the infrared test image.
18. The method of claim 15, wherein the step of calculating a matching relationship between the color test image and the infrared test image comprises:
calculating a plurality of depths of the shooting scene using the color test image and the infrared test image, so as to divide the shooting scene into a plurality of depth scenes; and
establishing a quadratic equation for each of the depth scenes, and substituting the positions of the feature points in the color test image and the corresponding feature points in the infrared test image in each depth scene into the corresponding quadratic equation to solve it, the solution being used as the matching relationship between the color test image and the infrared test image.
19. The method of claim 15, wherein the image alignment algorithm comprises a brute-force method, an optical flow method, a homography transformation method, or a local warping method.
20. The method of claim 11, wherein the dual sensor camera system further comprises an infrared projector, the method further comprising:
controlling the infrared projector to project invisible light bearing a special pattern onto the shooting scene;
controlling two infrared sensors among the at least one infrared sensor to respectively acquire a plurality of infrared images of the shooting scene bearing the special pattern; and
calculating the distances between the dual sensor camera system and a subject and a background in the shooting scene, respectively, according to the special pattern in the acquired infrared images and the disparity between the two infrared sensors.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202063074477P | 2020-09-04 | 2020-09-04 | |
US63/074,477 | 2020-09-04 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114143421A CN114143421A (en) | 2022-03-04 |
CN114143421B true CN114143421B (en) | 2024-04-05 |
Family
ID=80438521
Family Applications (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011541300.2A Active CN114143418B (en) | 2020-09-04 | 2020-12-23 | Dual-sensor imaging system and imaging method thereof |
CN202011540274.1A Active CN114143443B (en) | 2020-09-04 | 2020-12-23 | Dual-sensor imaging system and imaging method thereof |
CN202011625552.3A Active CN114143421B (en) | 2020-09-04 | 2020-12-30 | Dual-sensor camera system and calibration method thereof |
CN202011622478.XA Active CN114143419B (en) | 2020-09-04 | 2020-12-30 | Dual-sensor camera system and depth map calculation method thereof |
CN202011625515.2A Active CN114143420B (en) | 2020-09-04 | 2020-12-30 | Dual-sensor camera system and privacy protection camera method thereof |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011541300.2A Active CN114143418B (en) | 2020-09-04 | 2020-12-23 | Dual-sensor imaging system and imaging method thereof |
CN202011540274.1A Active CN114143443B (en) | 2020-09-04 | 2020-12-23 | Dual-sensor imaging system and imaging method thereof |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011622478.XA Active CN114143419B (en) | 2020-09-04 | 2020-12-30 | Dual-sensor camera system and depth map calculation method thereof |
CN202011625515.2A Active CN114143420B (en) | 2020-09-04 | 2020-12-30 | Dual-sensor camera system and privacy protection camera method thereof |
Country Status (2)
Country | Link |
---|---|
CN (5) | CN114143418B (en) |
TW (5) | TWI767468B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116091341B (en) * | 2022-12-15 | 2024-04-02 | 南京信息工程大学 | Exposure difference enhancement method and device for low-light image |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004246252A (en) * | 2003-02-17 | 2004-09-02 | Takenaka Komuten Co Ltd | Apparatus and method for collecting image information |
CN104661008A (en) * | 2013-11-18 | 2015-05-27 | 深圳中兴力维技术有限公司 | Processing method and device for improving colorful image quality under condition of low-light level |
CN105009568A (en) * | 2012-12-21 | 2015-10-28 | 菲力尔***公司 | Compact multi-spectrum imaging with fusion |
JP2017011634A (en) * | 2015-06-26 | 2017-01-12 | キヤノン株式会社 | Imaging device, control method for the same and program |
JP2017163297A (en) * | 2016-03-09 | 2017-09-14 | キヤノン株式会社 | Imaging apparatus |
CN110248105A (en) * | 2018-12-10 | 2019-09-17 | 浙江大华技术股份有限公司 | A kind of image processing method, video camera and computer storage medium |
WO2020051898A1 (en) * | 2018-09-14 | 2020-03-19 | 浙江宇视科技有限公司 | Automatic exposure method and apparatus for dual-light image, and dual-light image camera and machine storage medium |
CN111586314A (en) * | 2020-05-25 | 2020-08-25 | 浙江大华技术股份有限公司 | Image fusion method and device and computer storage medium |
Family Cites Families (62)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005091434A (en) * | 2003-09-12 | 2005-04-07 | Noritsu Koki Co Ltd | Position adjusting method and image reader with damage compensation function using the same |
JP4244018B2 (en) * | 2004-03-25 | 2009-03-25 | ノーリツ鋼機株式会社 | Defective pixel correction method, program, and defective pixel correction system for implementing the method |
JP4341680B2 (en) * | 2007-01-22 | 2009-10-07 | セイコーエプソン株式会社 | projector |
US9307212B2 (en) * | 2007-03-05 | 2016-04-05 | Fotonation Limited | Tone mapping for low-light video frame enhancement |
EP3876510A1 (en) * | 2008-05-20 | 2021-09-08 | FotoNation Limited | Capturing and processing of images using monolithic camera array with heterogeneous imagers |
US8866920B2 (en) * | 2008-05-20 | 2014-10-21 | Pelican Imaging Corporation | Capturing and processing of images using monolithic camera array with heterogeneous imagers |
CN101404060B (en) * | 2008-11-10 | 2010-06-30 | 北京航空航天大学 | Human face recognition method based on visible light and near-infrared Gabor information amalgamation |
US8749635B2 (en) * | 2009-06-03 | 2014-06-10 | Flir Systems, Inc. | Infrared camera systems and methods for dual sensor applications |
WO2010104490A1 (en) * | 2009-03-12 | 2010-09-16 | Hewlett-Packard Development Company, L.P. | Depth-sensing camera system |
JP5670456B2 (en) * | 2009-08-25 | 2015-02-18 | アイピーリンク・リミテッド | Reduce noise in color images |
US8478123B2 (en) * | 2011-01-25 | 2013-07-02 | Aptina Imaging Corporation | Imaging devices having arrays of image sensors and lenses with multiple aperture sizes |
JP2013115679A (en) * | 2011-11-30 | 2013-06-10 | Fujitsu General Ltd | Imaging apparatus |
US10848731B2 (en) * | 2012-02-24 | 2020-11-24 | Matterport, Inc. | Capturing and aligning panoramic image and depth data |
TW201401186A (en) * | 2012-06-25 | 2014-01-01 | Psp Security Co Ltd | System and method for identifying human face |
US20150245062A1 (en) * | 2012-09-25 | 2015-08-27 | Nippon Telegraph And Telephone Corporation | Picture encoding method, picture decoding method, picture encoding apparatus, picture decoding apparatus, picture encoding program, picture decoding program and recording medium |
KR102070778B1 (en) * | 2012-11-23 | 2020-03-02 | 엘지전자 주식회사 | Rgb-ir sensor with pixels array and apparatus and method for obtaining 3d image using the same |
TWM458748U (en) * | 2012-12-26 | 2013-08-01 | Chunghwa Telecom Co Ltd | Image type depth information retrieval device |
JP6055681B2 (en) * | 2013-01-10 | 2016-12-27 | 株式会社 日立産業制御ソリューションズ | Imaging device |
CN104021548A (en) * | 2014-05-16 | 2014-09-03 | 中国科学院西安光学精密机械研究所 | Method for acquiring 4D scene information |
US9516295B2 (en) * | 2014-06-30 | 2016-12-06 | Aquifi, Inc. | Systems and methods for multi-channel imaging based on multiple exposure settings |
JP6450107B2 (en) * | 2014-08-05 | 2019-01-09 | キヤノン株式会社 | Image processing apparatus, image processing method, program, and storage medium |
JP6597636B2 (en) * | 2014-12-10 | 2019-10-30 | ソニー株式会社 | Imaging apparatus, imaging method, program, and image processing apparatus |
JP6185213B2 (en) * | 2015-03-31 | 2017-08-23 | 富士フイルム株式会社 | Imaging apparatus, image processing method of imaging apparatus, and program |
WO2016192437A1 (en) * | 2015-06-05 | 2016-12-08 | 深圳奥比中光科技有限公司 | 3d image capturing apparatus and capturing method, and 3d image system |
CN105049829B (en) * | 2015-07-10 | 2018-12-25 | 上海图漾信息科技有限公司 | Optical filter, imaging sensor, imaging device and 3-D imaging system |
CN105069768B (en) * | 2015-08-05 | 2017-12-29 | 武汉高德红外股份有限公司 | A kind of visible images and infrared image fusion processing system and fusion method |
US10523855B2 (en) * | 2015-09-24 | 2019-12-31 | Intel Corporation | Infrared and visible light dual sensor imaging system |
TW201721269A (en) * | 2015-12-11 | 2017-06-16 | 宏碁股份有限公司 | Automatic exposure system and auto exposure method thereof |
JP2017112401A (en) * | 2015-12-14 | 2017-06-22 | ソニー株式会社 | Imaging device, apparatus and method for image processing, and program |
CN206117865U (en) * | 2016-01-16 | 2017-04-19 | 上海图漾信息科技有限公司 | Range data monitoring device |
KR101747603B1 (en) * | 2016-05-11 | 2017-06-16 | 재단법인 다차원 스마트 아이티 융합시스템 연구단 | Color night vision system and operation method thereof |
CN106815826A (en) * | 2016-12-27 | 2017-06-09 | 上海交通大学 | Night vision image Color Fusion based on scene Recognition |
CN108280807A (en) * | 2017-01-05 | 2018-07-13 | 浙江舜宇智能光学技术有限公司 | Monocular depth image collecting device and system and its image processing method |
US11145077B2 (en) * | 2017-02-06 | 2021-10-12 | Photonic Sensors & Algorithms, S.L. | Device and method for obtaining depth information from a scene |
CN108419062B (en) * | 2017-02-10 | 2020-10-02 | 杭州海康威视数字技术股份有限公司 | Image fusion apparatus and image fusion method |
CN109474770B (en) * | 2017-09-07 | 2021-09-14 | 华为技术有限公司 | Imaging device and imaging method |
CN109712102B (en) * | 2017-10-25 | 2020-11-27 | 杭州海康威视数字技术股份有限公司 | Image fusion method and device and image acquisition equipment |
CN107846537B (en) * | 2017-11-08 | 2019-11-26 | 维沃移动通信有限公司 | A kind of CCD camera assembly, image acquiring method and mobile terminal |
CN112788249B (en) * | 2017-12-20 | 2022-12-06 | 杭州海康威视数字技术股份有限公司 | Image fusion method and device, electronic equipment and computer readable storage medium |
US10748247B2 (en) * | 2017-12-26 | 2020-08-18 | Facebook, Inc. | Computing high-resolution depth images using machine learning techniques |
US10757320B2 (en) * | 2017-12-28 | 2020-08-25 | Waymo Llc | Multiple operating modes to expand dynamic range |
TWI661726B (en) * | 2018-01-09 | 2019-06-01 | 呂官諭 | Image sensor with enhanced image recognition and application |
CN110136183B (en) * | 2018-02-09 | 2021-05-18 | 华为技术有限公司 | Image processing method and device and camera device |
CN108965654B (en) * | 2018-02-11 | 2020-12-25 | 浙江宇视科技有限公司 | Double-spectrum camera system based on single sensor and image processing method |
CN110572583A (en) * | 2018-05-18 | 2019-12-13 | 杭州海康威视数字技术股份有限公司 | method for shooting image and camera |
CN108961195B (en) * | 2018-06-06 | 2021-03-23 | Oppo广东移动通信有限公司 | Image processing method and device, image acquisition device, readable storage medium and computer equipment |
JP6574878B2 (en) * | 2018-07-19 | 2019-09-11 | キヤノン株式会社 | Image processing apparatus, image processing method, imaging apparatus, program, and storage medium |
JP7254461B2 (en) * | 2018-08-01 | 2023-04-10 | キヤノン株式会社 | IMAGING DEVICE, CONTROL METHOD, RECORDING MEDIUM, AND INFORMATION PROCESSING DEVICE |
CN109035193A (en) * | 2018-08-29 | 2018-12-18 | 成都臻识科技发展有限公司 | A kind of image processing method and imaging processing system based on binocular solid camera |
JP2020052001A (en) * | 2018-09-28 | 2020-04-02 | パナソニックIpマネジメント株式会社 | Depth acquisition device, depth acquisition method, and program |
US11176694B2 (en) * | 2018-10-19 | 2021-11-16 | Samsung Electronics Co., Ltd | Method and apparatus for active depth sensing and calibration method thereof |
CN109636732B (en) * | 2018-10-24 | 2023-06-23 | 深圳先进技术研究院 | Hole repairing method of depth image and image processing device |
US11120536B2 (en) * | 2018-12-12 | 2021-09-14 | Samsung Electronics Co., Ltd | Apparatus and method for determining image sharpness |
WO2020168465A1 (en) * | 2019-02-19 | 2020-08-27 | 华为技术有限公司 | Image processing device and method |
US10972649B2 (en) * | 2019-02-27 | 2021-04-06 | X Development Llc | Infrared and visible imaging system for device identification and tracking |
JP7316809B2 (en) * | 2019-03-11 | 2023-07-28 | キヤノン株式会社 | Image processing device, image processing device control method, system, and program |
CN110349117B (en) * | 2019-06-28 | 2023-02-28 | 重庆工商大学 | Infrared image and visible light image fusion method and device and storage medium |
CN110706178B (en) * | 2019-09-30 | 2023-01-06 | 杭州海康威视数字技术股份有限公司 | Image fusion device, method, equipment and storage medium |
CN111524175A (en) * | 2020-04-16 | 2020-08-11 | 东莞市东全智能科技有限公司 | Depth reconstruction and eye movement tracking method and system for asymmetric multiple cameras |
CN111540003A (en) * | 2020-04-27 | 2020-08-14 | 浙江光珀智能科技有限公司 | Depth image generation method and device |
CN111383206B (en) * | 2020-06-01 | 2020-09-29 | 浙江大华技术股份有限公司 | Image processing method and device, electronic equipment and storage medium |
IN202021032940A (en) * | 2020-07-31 | 2020-08-28 | .Us Priyadarsan |
- 2020
- 2020-12-23 TW TW109145632A patent/TWI767468B/en active
- 2020-12-23 TW TW109145614A patent/TWI778476B/en active
- 2020-12-23 CN CN202011541300.2A patent/CN114143418B/en active Active
- 2020-12-23 CN CN202011540274.1A patent/CN114143443B/en active Active
- 2020-12-30 CN CN202011625552.3A patent/CN114143421B/en active Active
- 2020-12-30 TW TW109146831A patent/TWI797528B/en active
- 2020-12-30 CN CN202011622478.XA patent/CN114143419B/en active Active
- 2020-12-30 TW TW109146764A patent/TWI764484B/en active
- 2020-12-30 CN CN202011625515.2A patent/CN114143420B/en active Active
- 2020-12-30 TW TW109146922A patent/TWI767484B/en active
Non-Patent Citations (1)
Title |
---|
Scene fusion *** based on ambient light detection; Dong Yue; Chen Yueting; Feng Huajun; Xu Zhihai; Li Qi; Acta Photonica Sinica (01); full text *
Also Published As
Publication number | Publication date |
---|---|
CN114143420A (en) | 2022-03-04 |
TWI767468B (en) | 2022-06-11 |
TWI764484B (en) | 2022-05-11 |
TW202211161A (en) | 2022-03-16 |
TW202211674A (en) | 2022-03-16 |
CN114143418A (en) | 2022-03-04 |
TWI778476B (en) | 2022-09-21 |
CN114143418B (en) | 2023-12-01 |
CN114143443A (en) | 2022-03-04 |
CN114143419B (en) | 2023-12-26 |
TW202211673A (en) | 2022-03-16 |
TW202211165A (en) | 2022-03-16 |
CN114143421A (en) | 2022-03-04 |
TWI797528B (en) | 2023-04-01 |
CN114143443B (en) | 2024-04-05 |
TW202211160A (en) | 2022-03-16 |
TWI767484B (en) | 2022-06-11 |
CN114143419A (en) | 2022-03-04 |
CN114143420B (en) | 2024-05-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108028895B (en) | Calibration of defective image sensor elements | |
US20130258139A1 (en) | Imaging apparatus | |
JP2014044345A (en) | Imaging apparatus | |
US8223258B2 (en) | Backlight photographing method | |
JP5848662B2 (en) | Image processing apparatus and control method thereof | |
US10972676B2 (en) | Image processing method and electronic device capable of optimizing hdr image by using depth information | |
US10764550B2 (en) | Image processing apparatus, image processing method, and storage medium | |
CN113691795A (en) | Image processing apparatus, image processing method, and storage medium | |
CN114143421B (en) | Dual-sensor camera system and calibration method thereof | |
JP6718253B2 (en) | Image processing apparatus and image processing method | |
JP6702752B2 (en) | Image processing device, imaging device, control method, and program | |
JP2013012940A (en) | Tracking apparatus and tracking method | |
JP6525503B2 (en) | Image processing apparatus and imaging apparatus | |
JP2019179463A (en) | Image processing device, control method thereof, program, and recording medium | |
US11418719B2 (en) | Dual sensor imaging system and calibration method which includes a color sensor and an infrared ray sensor to perform image alignment and brightness matching | |
US11496660B2 (en) | Dual sensor imaging system and depth map calculation method thereof | |
JP7455656B2 (en) | Image processing device, image processing method, and program | |
JP2013229698A (en) | Imaging apparatus, image processing apparatus, and imaging processing method, and program | |
JP2008053809A (en) | Camera and subject area extracting method | |
JP2017192027A (en) | Image processing apparatus, image processing method, and program | |
CN114170222A (en) | Image processing method, related device, equipment and storage medium | |
CN114298951A (en) | Image processing method, related device, equipment and storage medium | |
JP2021087125A (en) | Image processing device, control method thereof, and program | |
CN114697483A (en) | Device and method for shooting under screen based on compressed sensing white balance algorithm | |
JP2019003694A (en) | Image processing apparatus, control method of the same, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |