CN113242389A - Multi-frame dynamic range extension method and system for RCCB (red-clear-clear-blue) image sensor - Google Patents

Multi-frame dynamic range extension method and system for RCCB (red-clear-clear-blue) image sensor

Info

Publication number
CN113242389A
CN113242389A
Authority
CN
China
Prior art keywords
unit
fusion
image
rccb
image sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110784469.9A
Other languages
Chinese (zh)
Inventor
Inventor not announced (不公告发明人)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Iwaysense Intelligent Co ltd
Original Assignee
Shenzhen Iwaysense Intelligent Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Iwaysense Intelligent Co ltd filed Critical Shenzhen Iwaysense Intelligent Co ltd
Priority to CN202110784469.9A priority Critical patent/CN113242389A/en
Publication of CN113242389A publication Critical patent/CN113242389A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/741Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/81Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a multi-frame dynamic range extension method and system for an RCCB image sensor, belonging to the field of image processing. The method acquires, line by line, two groups of raw image data I1 and I2; the image data I1 and I2 come from a line buffer unit or are output by an upper-level fusion unit. Luminance information Y1 and Y2 is extracted from the raw image data I1 and I2 by a luminance extraction process, and so on. The method and system can overcome the local color deviation defect that commonly afflicts RCCB image sensors in multi-frame fusion applications. Only a small amount of line buffering is needed, and processing operates directly on raw-format data, which reduces system latency, complexity and cost; the scheme extends readily to various programmable devices and application-specific integrated circuits, and is well suited to wide deployment on image acquisition devices.

Description

Multi-frame dynamic range extension method and system for RCCB (red-clear-clear-blue) image sensor
Technical Field
The present invention relates to an image processing method, and more particularly, to a multi-frame dynamic range extension method and system for an RCCB image sensor.
Background
In everyday scenes, such as a dark tunnel portal or automobile headlights at night, the brightness across different areas of the same scene can span more than 8 orders of magnitude, i.e., a dynamic range greater than 160 dB. Through long evolution, the human eye can adapt to brightness variations of more than 6 orders of magnitude, i.e., a dynamic range above 120 dB.
An image sensor is a device that converts an optical image into an electronic signal and is widely used in digital cameras and other electro-optical devices. It comprises many elementary photosensitive elements, called pixels, arranged in rows and columns. Each pixel converts the incident light intensity into an electrical signal of corresponding strength, yielding a two-dimensional black-and-white image that reflects the light intensity distribution. To obtain a color image, a color filter must be added to the image sensor. The RGGB Bayer filter array is currently the most common approach: mimicking the human eye's sensitivity to color, it converts gray-level information into color information using an arrangement of 1 red, 2 green and 1 blue filter, i.e., RGGB. However, the RGGB filter cuts the light flux entering the image sensor to roughly 1/3, which is unfavorable for dark-environment applications. The RCCB Bayer filter array is a newer filter array used in vehicle-mounted image sensors; it differs from the prior-art RGGB array in that the filters at all green (G) positions are replaced with clear (C) ones, as shown in figure 3. This raises the sensor's overall light flux, markedly improves the signal-to-noise ratio at night, and improves the recognizability of targets.
Moreover, a single photoelectric conversion of the image sensor can provide at most a dynamic range of about 70 dB, far short of what is needed to capture and record high-dynamic scenes in the real world. To provide a dynamic range comparable to the human eye, multiple photoelectric conversions are commonly used: darker pixels in the scene are recorded with a larger photoelectric conversion rate, brighter pixels with a smaller one, and the dynamic range is extended to 100-140 dB by multi-frame image fusion.
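The multiple-conversion-rate principle described above can be sketched in a few lines. This is an illustrative example rather than the patent's own fusion formula; the function name and the 12-bit saturation level are assumptions.

```python
def fuse_two_exposures(long_px, short_px, ratio, sat_level=4095):
    """Merge one pixel from a long and a short exposure (12-bit samples).

    The short exposure is multiplied by `ratio` to bring it onto the
    long exposure's radiometric scale, extending the representable range.
    """
    if long_px < sat_level:      # long exposure still valid: use it
        return long_px
    return short_px * ratio      # long exposure clipped: use scaled short

# A scene point bright enough to clip the long exposure:
# long reads 4095 (saturated), short reads 300, ratio 16.
merged = fuse_two_exposures(4095, 300, 16)   # -> 4800, beyond the 12-bit ceiling
```

The scaled short exposure recovers detail where the long exposure saturates, which is exactly why an accurate conversion ratio matters: any error in `ratio` shifts the merged value and, across color channels, produces the proportion distortion discussed below.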
Meanwhile, because the light flux of the C channel in the RCCB Bayer filter array is much larger than that of the R and B channels, the R, C and B data produced at different photoelectric conversion rates are often selected within the same neighborhood. Owing to insufficient accuracy of the photoelectric conversion rate and device nonlinearity, the relative proportions of R, C and B within that neighborhood become distorted, producing a local color deviation defect. Attenuating this defect requires extra post-processing, which increases system overhead and cost, lengthens total system delay, and degrades real-time performance. The local color deviation defect therefore seriously affects an RCCB image sensor's recognition of traffic lights and traffic signs, hindering applications in safety-critical fields such as automotive driver assistance. Image processing methods for such sensors need further research and improvement to address this defect.
Disclosure of Invention
One objective of the present invention is to provide a multi-frame dynamic range extension method and system for an RCCB image sensor, so as to address the problems in the prior art: distortion of the relative proportions between channels, local color deviation, the extra post-processing needed to attenuate it, increased system overhead and cost, increased total system delay, and reduced real-time performance.
In order to solve the technical problems, the invention adopts the following technical scheme:
the invention provides a multi-frame dynamic range extension method for an RCCB image sensor, which comprises the following steps:
step A, acquiring two groups of original image data from an input interface by using a row unit
Figure 893130DEST_PATH_IMAGE001
And
Figure 51579DEST_PATH_IMAGE002
(ii) a The image data
Figure 763183DEST_PATH_IMAGE001
And
Figure 956267DEST_PATH_IMAGE002
from the line buffer unit or output by the upper level fusion unit;
step B, from the original image data
Figure 95124DEST_PATH_IMAGE003
And
Figure 475290DEST_PATH_IMAGE004
in which luminance information is extracted by a luminance extraction process
Figure 990585DEST_PATH_IMAGE005
And
Figure 536710DEST_PATH_IMAGE006
step C, according to the brightness information
Figure 908786DEST_PATH_IMAGE005
And an
Figure 713931DEST_PATH_IMAGE003
And
Figure 829654DEST_PATH_IMAGE004
the photoelectric conversion ratio of (2) was calculated
Figure 731751DEST_PATH_IMAGE004
The fusion weight coefficient of each pixel
Figure 212411DEST_PATH_IMAGE007
Step D, according to the brightness information
Figure 334213DEST_PATH_IMAGE005
And
Figure 925732DEST_PATH_IMAGE006
to calculate the reflection
Figure 947914DEST_PATH_IMAGE003
And
Figure 599476DEST_PATH_IMAGE004
reference value for the severity of each pixel's relative motion or brightness shift
Figure 441530DEST_PATH_IMAGE008
Step E, according to the reference value
Figure 899056DEST_PATH_IMAGE008
Calculating motion compensation coefficients
Figure 713428DEST_PATH_IMAGE009
Step F, according to the fusion weight coefficient
Figure 96743DEST_PATH_IMAGE010
And motion compensation coefficient
Figure 363776DEST_PATH_IMAGE011
Calculating the fusion result of each pixel to obtain the output image with expanded dynamic range
Figure 624993DEST_PATH_IMAGE012
Further, the luminance extraction process performed in step B comprises the following steps:

Use (x, y) to denote the coordinates of any element in the image together with its luminance value data.

The raw image data I is a pixel plane periodically arranged in the Bayer RCCB format.

For each R/B-position pixel in the raw image data, compute an estimate of the missing C channel value: first compute the pixel's horizontal gradient Gh and vertical gradient Gv (the concrete formulas appear only as embedded images in the original publication).

Then generate the horizontal interpolation Ch, the vertical interpolation Cv and the central interpolation Cc of the R/B position (formulas likewise given as images).

According to Gh and Gv, select Ch, Cv or Cc as the interpolation result for the missing C position, and construct the complete C plane.

Finally, calculate the luminance information Y from the completed C plane.
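The gradient-guided selection just described can be sketched as a generic edge-directed Bayer interpolation. The averaging formulas below are assumptions (the patent gives its own only as images); what the sketch illustrates is the Gh/Gv comparison choosing among the horizontal, vertical and central interpolations.

```python
def interpolate_c(c_left, c_right, c_up, c_down):
    """Estimate the missing C value at an R/B site from its four C neighbors.

    Edge-directed selection: interpolate along the direction with the
    smaller gradient to avoid averaging across an edge.
    (Assumed formulas; the patent's own appear only as images.)
    """
    gh = abs(c_left - c_right)                     # horizontal gradient Gh
    gv = abs(c_up - c_down)                        # vertical gradient Gv
    ch = (c_left + c_right) // 2                   # horizontal interpolation Ch
    cv = (c_up + c_down) // 2                      # vertical interpolation Cv
    cc = (c_left + c_right + c_up + c_down) // 4   # central interpolation Cc
    if gh < gv:
        return ch
    if gv < gh:
        return cv
    return cc

# Vertical edge through the site: left/right differ strongly, up/down agree,
# so the vertical interpolation is chosen.
print(interpolate_c(100, 900, 500, 510))   # -> 505
```

Note that the divisions by 2 and 4 are right shifts, consistent with the patent's preference for shift-based arithmetic in its hardware realizations.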
calculating the fusion weight coefficient of the pixel in the step C
Figure 727816DEST_PATH_IMAGE007
The method comprises the following steps:
by using
Figure 746587DEST_PATH_IMAGE037
Coordinates representing any element in the image and luminance value data;
by using
Figure 743404DEST_PATH_IMAGE038
Representing an input image
Figure 771403DEST_PATH_IMAGE003
The brightness value of the current position pixel;
constructing a plane rectangular coordinate system, wherein the horizontal axis is brightness, and the vertical axis is a fusion weight coefficient;
obtaining abscissa of piecewise polyline equation from system configuration
Figure 15302DEST_PATH_IMAGE039
Figure 470554DEST_PATH_IMAGE040
Figure 167115DEST_PATH_IMAGE041
Figure 733225DEST_PATH_IMAGE042
Figure 97211DEST_PATH_IMAGE043
Obtaining the ordinate of a piecewise polyline equation from a system configuration
Figure 723364DEST_PATH_IMAGE044
Figure 408686DEST_PATH_IMAGE045
Figure 840804DEST_PATH_IMAGE046
Using said abscissa
Figure 996979DEST_PATH_IMAGE039
Figure 856350DEST_PATH_IMAGE040
Figure 465186DEST_PATH_IMAGE041
Figure 435416DEST_PATH_IMAGE042
Figure 446098DEST_PATH_IMAGE043
And ordinate
Figure 974906DEST_PATH_IMAGE047
Figure 71038DEST_PATH_IMAGE048
Figure 844959DEST_PATH_IMAGE049
Constructing a piecewise polyline equation for said
Figure 710146DEST_PATH_IMAGE005
Performing coefficient mapping to calculate an input image by the following formula
Figure 911321DEST_PATH_IMAGE004
Fusion weight coefficient of current position
Figure 494749DEST_PATH_IMAGE007
Figure 72361DEST_PATH_IMAGE050
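A piecewise polyline mapping of this kind can be sketched as a breakpoint lookup. The breakpoint values and the clamp-at-the-ends behavior below are assumptions for illustration; in the patent the abscissas and ordinates come from the system configuration.

```python
def polyline_weight(y, xs, ys):
    """Map a luminance value y to a fusion weight via a piecewise polyline.

    xs: breakpoint luminances (increasing); ys: weight at each breakpoint.
    Values outside [xs[0], xs[-1]] clamp to the end weights.
    (Illustrative sketch of the configured polyline mapping.)
    """
    if y <= xs[0]:
        return ys[0]
    if y >= xs[-1]:
        return ys[-1]
    for i in range(len(xs) - 1):
        if xs[i] <= y <= xs[i + 1]:
            t = (y - xs[i]) / (xs[i + 1] - xs[i])   # position within segment
            return ys[i] + t * (ys[i + 1] - ys[i])  # linear interpolation
    return ys[-1]

# Weight ramps from 0 to 1 between luminance 512 and 1536 (12-bit scale).
xs, ys = [0, 512, 1536, 4095], [0.0, 0.0, 1.0, 1.0]
print(polyline_weight(1024, xs, ys))   # -> 0.5
```

Because the breakpoints are configuration parameters, the same hardware datapath serves any weight curve; only the stored xs/ys values change.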
The reference value d in step D is calculated as follows: obtain from the system configuration the photoelectric conversion ratio of the current input raw data I1 and I2, then compute the reference value d from the luminance information (the formula appears only as an image in the original publication).
Calculating the motion compensation coefficient m from the reference value d in step E comprises the following steps:

Obtain from the system configuration the threshold T1 at which motion compensation begins.

Obtain from the system configuration the threshold T2 at which motion is fully compensated.

Calculate the motion compensation coefficient m from d, T1 and T2 (the formula appears only as an image in the original publication).
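A plausible reading of the two-threshold scheme is a linear ramp: no compensation below T1, full compensation above T2, proportional in between. The formula below is an assumption; the patent's own is given only as an image.

```python
def motion_coeff(d, t1, t2):
    """Motion compensation coefficient m from the motion reference d.

    m == 0 below t1 (no compensation), m == 1 above t2 (full
    compensation), with a linear ramp in between. (Assumed form;
    the patent's exact formula appears only as an image.)
    """
    if d <= t1:
        return 0.0
    if d >= t2:
        return 1.0
    return (d - t1) / (t2 - t1)

print(motion_coeff(30, 20, 60))   # -> 0.25
```

The soft ramp between the two configured thresholds avoids hard on/off switching of compensation, which would otherwise show up as visible seams along moving-object boundaries.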
Obtaining the dynamic-range-extended output image O in step F comprises the following steps:

Obtain from the system configuration the photoelectric conversion ratio R of the current input raw data I1 and I2.

Using I1, I2, w, m and R, calculate the fusion result O of the current-position pixel (the formula appears only as an image in the original publication).
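Putting the pieces together, a per-pixel fusion combining w, m and R might look like the sketch below. The blend form is an assumption (the patent's formula is given only as an image): the lower-conversion-rate sample is scaled by R onto a common scale, the weight is pulled back toward the reference frame where motion is detected, and the two samples are mixed linearly.

```python
def fuse_pixel(i1, i2, w, m, ratio):
    """Fuse one pixel pair into an extended-dynamic-range output.

    i1: reference-frame sample; i2: lower-conversion-rate sample;
    w: luminance-derived fusion weight for i2; m: motion compensation
    coefficient (1 = motion detected, fall back to i1); ratio: the
    photoelectric conversion ratio R.
    (Assumed blend; the patent's formula appears only as an image.)
    """
    w_eff = w * (1.0 - m)                             # suppress i2 where motion is detected
    return (1.0 - w_eff) * i1 + w_eff * (i2 * ratio)  # linear mix on a common scale

# Static pixel (m = 0): half-weight blend of i1 and the rescaled i2.
print(fuse_pixel(1000.0, 100.0, 0.5, 0.0, 16.0))   # -> 1300.0
```

When m reaches 1 the output collapses to i1 alone, which matches the stated purpose of the compensation coefficient: fully suppressing ghosting from the other frame where motion or brightness shift is severe.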
The invention also provides a multi-frame dynamic range extension system for an RCCB image sensor, used to execute the above method. It comprises several line buffer units, all connected to an interface unit, each line buffer unit also being connected to its own fusion unit; one of the fusion units is connected to an output unit, the fusion units are further connected to a control unit, and the interface unit connects to the sensor unit.
Preferably, in a further scheme there are four line buffer units: a first, a second, a third and a fourth. The first and second line buffer units both feed the first fusion unit, while the third and fourth feed the second and third fusion units respectively. The first, second and third fusion units are connected in sequence, the third fusion unit feeding the output unit; all three fusion units are connected to the control unit.
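The four-buffer, three-fusion-unit topology amounts to a left fold over the aligned frames. The sketch below shows only the wiring; the `fuse` argument stands in for a fusion unit (steps A to F of the method), here replaced by a trivial pixelwise max for illustration.

```python
def cascade_fuse(frames, fuse):
    """Fold four aligned frames through a three-stage fusion cascade:
    F1 = fuse(I1, I2); F2 = fuse(F1, I3); F3 = fuse(F2, I4).
    `fuse` stands in for one fusion unit of the pipeline."""
    f = fuse(frames[0], frames[1])   # fusion unit 1: I1 + I2
    for nxt in frames[2:]:           # fusion units 2 and 3 take the
        f = fuse(f, nxt)             # previous result plus the next frame
    return f

# Placeholder fusion: pixelwise max over four single-"pixel" frames.
result = cascade_fuse([3, 9, 5, 7], max)
print(result)   # -> 9
```

The cascade keeps each fusion unit a two-input block, so the same hardware design is instantiated three times rather than building a single four-input fuser.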
In a further scheme, the multiple line buffer units buffer and align the multiple sets of image data with different photoelectric conversion rates generated by the image sensor, denoted I1, I2, I3 and I4.
In a further scheme, the first fusion unit detects and compensates the motion brightness shift in I2 and fuses the I1 and I2 image data, outputting a fused image F1; the second fusion unit detects and compensates I3 and fuses the F1 and I3 image data, outputting a fused image F2; the third fusion unit detects and compensates I4 and fuses the F2 and I4 image data, outputting a fused image F3.
In a further scheme, the image sensor unit generates four kinds of raw data with different photoelectric conversion rates in line-alternating fashion; the interface unit connects to the image sensor unit, separates the line-alternating image data it generates, and feeds the result into the system; the control unit provides the system with the necessary configuration parameter storage; and the output unit delivers the final fused image, in a given interface format, to an external or next-stage processing unit.
Compared with the prior art, the invention has the following beneficial effects: the method and system overcome the local color deviation defect that commonly afflicts RCCB image sensors in multi-frame fusion applications; only a small amount of line buffering is needed, and processing operates directly on raw-format data, reducing system latency, complexity and cost; and the scheme extends readily to various programmable devices and application-specific integrated circuits, making it well suited to wide deployment on image acquisition devices.
Drawings
FIG. 1 is a schematic block diagram of a system for illustrating one embodiment of the invention;
FIG. 2 is a flow chart illustrating the execution of a method according to one embodiment of the present invention;
FIG. 3 is a schematic diagram illustrating the luminance information extraction on the Bayer RCCB format according to the present invention;
FIG. 4 is a schematic diagram illustrating the principle of piecewise polygonal line mapping in the process of calculating the fusion coefficients in the method according to an embodiment of the present invention.
Detailed Description
The invention is further elucidated below with reference to the drawings.
Referring to figs. 1 and 2, one embodiment of the present invention is a multi-frame dynamic range extension system for an RCCB image sensor; for illustration, only a 4-frame dynamic range extension system is described. The system comprises an image sensor unit and a pipeline built from an interface unit, line buffer units (1 to 4), a control unit, fusion units (1 to 3) and an output unit. All line buffer units connect to the interface unit, and each also connects to its own fusion unit; one fusion unit connects to the output unit, the fusion units connect to the control unit, and the interface unit connects to the sensor unit. More specifically, as shown in fig. 1, there are four line buffer units, numbered 1 to 4. Line buffer units 1 and 2 both feed fusion unit 1; line buffer units 3 and 4 feed fusion units 2 and 3 respectively. Fusion units 1, 2 and 3 are connected in sequence, with fusion unit 3 feeding the output unit; all three fusion units connect to the control unit.
Following the flow of data through the system, the image sensor unit generates 4 kinds of raw data with different photoelectric conversion rates; the data passes through the interface unit in line-alternating fashion, is separated into the line buffer units (1 to 4), flows on into the fusion units (1 to 3), and finally leaves the system through the output unit. All or some of the units other than the image sensor unit (at least fusion units 1 to 3) are implemented as programmable devices; one realization is software running on a general-purpose processing chip (a CPU or DSP).
Based on this structure, another realization implements a dedicated image processing pipeline on a field-programmable gate array (FPGA); yet another implements such a pipeline in an application-specific integrated circuit (ASIC) or a system-on-chip (SoC).
As the system structure and fig. 1 show, the interface unit implements the communication protocol with the image sensor unit, establishes the data transmission channel between the sensor and the system, and separates the line-alternating image data into 4 sets with different photoelectric conversion rates for input to the system. The line buffer units (1 to 4) buffer and align these 4 sets. The fusion units (1 to 3), working as a cascade, detect and compensate the motion brightness shift in each lower-conversion-rate set and fuse the sets into the final fused image. The image sensor unit generates the raw data of the four photoelectric conversion rates in line-alternating fashion; the control unit provides the system with the necessary configuration parameter storage; and the output unit delivers the final fused image, in a given interface format, to an external or next-stage processing unit.
Based on the above system architecture, another embodiment of the present invention is a multi-frame dynamic range extension method for an RCCB image sensor, performed as follows:

S1: acquire, line by line from the input, two groups of raw image data I1 and I2;

S2: extract luminance information Y1 and Y2 from the input raw data I1 and I2 by a luminance extraction process; the input image data may come from a line buffer unit or from the output of a previous-level fusion unit;

S3: from the luminance information Y1 and the photoelectric conversion ratio between I1 and I2, calculate the fusion weight coefficient w of each pixel of I2;

S4: from the luminance information Y1 and Y2, calculate a reference value d reflecting the severity of each pixel's relative motion or brightness shift between I1 and I2;

S5: from the reference value d, calculate the motion compensation coefficient m;

S6: from the fusion weight coefficient w and the motion compensation coefficient m, calculate the fusion result of each pixel, obtaining the dynamic-range-extended output image O.
Preferably, in step S2 the luminance information is extracted from the input raw data as follows:

S21: use (x, y) to denote the coordinates of any element in the image together with its luminance value data;

S22: the raw data I is a pixel plane periodically arranged in the Bayer RCCB format, as shown in fig. 3;

S23: compute the C channel value estimate of each R/B-position pixel in the raw data, together with the pixel's horizontal gradient Gh and vertical gradient Gv (formulas given only as images in the original publication); the multiplications involved are preferably realized by left shifts;

S25: generate the horizontal interpolation Ch, the vertical interpolation Cv and the central interpolation Cc of the R/B position (formulas likewise given as images); the divisions involved are preferably realized by right shifts;

S26: according to Gh and Gv, select Ch, Cv or Cc as the interpolation result for the missing C position and construct the complete C plane; the multiplications involved can be realized by left shifts;

S27: calculate the luminance information Y.

Further, using the luminance information Y1 and a set of parameters obtained from the system configuration, the fusion weight coefficient of each pixel of the image I2 is calculated.
preferably, in the step S3, one calculation method of the fusion weight coefficient is as follows:
s31, use
Figure 848119DEST_PATH_IMAGE099
Coordinates representing any element in the image and luminance value data;
s32, use
Figure 245602DEST_PATH_IMAGE100
Representing an input image
Figure 558813DEST_PATH_IMAGE101
The brightness value of the current position pixel;
s33, constructing a plane rectangular coordinate system, wherein the horizontal axis is brightness, and the vertical axis is a fusion weight coefficient;
s34, obtaining the abscissa of the piecewise polyline equation from the system configuration
Figure 637628DEST_PATH_IMAGE102
Figure 197922DEST_PATH_IMAGE103
Figure 703990DEST_PATH_IMAGE104
Figure 982524DEST_PATH_IMAGE105
Figure 865030DEST_PATH_IMAGE106
S35, obtaining the ordinate of the piecewise polyline equation from the system configuration
Figure 279831DEST_PATH_IMAGE107
Figure 786161DEST_PATH_IMAGE108
Figure 224095DEST_PATH_IMAGE109
S36, using the abscissa
Figure 707029DEST_PATH_IMAGE102
Figure 179599DEST_PATH_IMAGE103
Figure 89786DEST_PATH_IMAGE104
Figure 15017DEST_PATH_IMAGE105
Figure 301641DEST_PATH_IMAGE106
And ordinate
Figure 628718DEST_PATH_IMAGE107
Figure 473920DEST_PATH_IMAGE108
Figure 948764DEST_PATH_IMAGE109
Constructing a piecewise polyline equation for said
Figure 711184DEST_PATH_IMAGE097
Performing coefficient mapping, namely:
Figure 955083DEST_PATH_IMAGE110
computing an input image
Figure 410335DEST_PATH_IMAGE098
Fusion weight coefficient of current position
Figure 372475DEST_PATH_IMAGE111
In step S4, the luminance information Y1 and Y2 is used to calculate the reference value d reflecting the severity of the relative motion or brightness shift of each pixel at the current position: obtain from the system configuration the photoelectric conversion ratio R of the current input raw data I1 and I2, then calculate d (formula given only as an image in the original publication).
in the above step S5, the motion compensation coefficient is calculated
Figure 965819DEST_PATH_IMAGE115
The method comprises the following steps of acquiring a group of parameters from system configuration, specifically:
obtaining threshold values for starting compensating motion from system configuration
Figure 139312DEST_PATH_IMAGE116
Deriving thresholds for fully compensated motion from system configuration
Figure 212310DEST_PATH_IMAGE117
Calculated by the formula, namely:
Figure 180266DEST_PATH_IMAGE118
obtaining the motion compensation coefficient of the current position
Figure 338715DEST_PATH_IMAGE115
Further, using the fusion weight coefficient w and the motion compensation coefficient m, and obtaining from the system configuration the photoelectric conversion ratio R of the current input raw data I1 and I2, step S6 calculates the fusion result O of the current-position pixel from I1, I2, w, m and R (formula given only as an image in the original publication).
in the invention, the multi-frame dynamic range extension method of the RCCB image sensor can be conveniently realized in the form of various integrated circuits, including ASIC, FPGA and the like, the whole system has good expansibility, is convenient for integrating other image algorithms, and can ensure the real-time operation of processing.
In the embodiment described here, the integrated-circuit implementation takes an FPGA as its example; those skilled in the art can readily extend it to other integrated circuits.
In addition to the foregoing, it should be noted that reference throughout this specification to "one embodiment," "another embodiment," "an embodiment," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment described generally throughout this application. The appearances of the same phrase in various places in the specification are not necessarily all referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with any embodiment, it is submitted that it is within the scope of the invention to effect such feature, structure, or characteristic in connection with other embodiments.
Although the invention has been described herein with reference to a number of illustrative embodiments thereof, it should be understood that numerous other modifications and embodiments can be devised by those skilled in the art that will fall within the spirit and scope of the principles of this disclosure. More specifically, various variations and modifications are possible in the component parts and/or arrangements of the subject combination arrangement within the scope of the disclosure, the drawings and the appended claims. In addition to variations and modifications in the component parts and/or arrangements, other uses will also be apparent to those skilled in the art.

Claims (10)

1. A multi-frame dynamic range extension method for an RCCB image sensor, the method comprising the steps of:
step A, acquiring two groups of original image data, denoted I1 and I2, from an input interface in line units; the image data I1 and I2 are output from the line buffer units or from an upstream fusion unit;
step B, extracting luminance information Y1 and Y2 from the original image data I1 and I2 by a luminance extraction process;
step C, calculating the fusion weight coefficient w of each pixel of I2 according to the luminance information Y1 and the photoelectric conversion ratio of I1 and I2;
step D, calculating from the luminance information Y1 and Y2 a reference value D that reflects the severity of the relative motion or brightness shift of each pixel between I1 and I2;
step E, calculating the motion compensation coefficient m according to the reference value D;
step F, calculating the fusion result of each pixel according to the fusion weight coefficient w and the motion compensation coefficient m, obtaining the dynamic-range-extended output image O.
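The per-pixel formulas of claim 1 are published only as embedded images, but the step structure A–F is clear. The following is a minimal sketch of a two-frame fusion along those lines; the saturation-based weight curve, the absolute-difference motion reference, the linear blend, and all thresholds and names are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def fuse_two_frames(i1, i2, ratio, t1=0.02, t2=0.10):
    """Two-frame fusion following the step structure A-F of claim 1.

    i1: longer-exposure frame, i2: shorter-exposure frame, both grayscale
    luminance planes with values in [0, 1]; ratio: the photoelectric
    conversion ratio R between the two exposures.  The weight curve, the
    motion reference, the thresholds and the blend are assumptions, since
    the patent's formulas are published only as images.
    """
    # Step B: luminance extraction (inputs assumed already demosaicked)
    y1, y2 = i1, i2
    # Step C: weight of i2 ramps from 0 to 1 as i1 nears saturation
    w = np.clip((y1 - 0.7) / 0.3, 0.0, 1.0)
    # Step D: reference value for relative motion / brightness shift
    d = np.abs(y1 - ratio * y2)
    # Step E: motion coefficient, 1 (fuse fully) below t1, 0 above t2
    m = np.clip((t2 - d) / (t2 - t1), 0.0, 1.0)
    # Step F: blend i1 with the ratio-scaled i2
    a = w * m
    return (1.0 - a) * y1 + a * ratio * y2
```

For a static, correctly exposed pixel (y1 = ratio * y2) the result equals y1; where i1 saturates and no motion is detected, the scaled short exposure takes over.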
2. The multi-frame dynamic range extension method for the RCCB image sensor of claim 1, wherein the luminance extraction process of step B comprises the steps of:
using (x, y) to denote the coordinates of any element in the image and I(x, y) its value, the original image data I being a pixel plane periodically arranged in the Bayer RCCB format;
calculating a C-channel estimate for each R/B-position pixel of the raw image data [estimation formulas given as images in the original];
calculating the horizontal gradient Gh and the vertical gradient Gv at each such pixel [gradient formulas given as images in the original];
then generating the horizontal interpolation Ch, the vertical interpolation Cv and the central interpolation Cc of the R/B position [interpolation formulas given as images in the original];
selecting Ch, Cv or Cc as the interpolation result for the missing C position according to Gh and Gv, thereby constructing the complete C plane;
finally, calculating the luminance information Y [formula given as an image in the original].
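The interpolation and gradient formulas of claim 2 are images in the published text, but the selection logic — choose the horizontal, vertical or central interpolation of the missing C sample according to the two gradients — is standard edge-directed demosaicking. A sketch of that decision, where the neighbour averages and the tie-break rule are assumptions standing in for the image-only formulas:

```python
def directional_c_estimate(c_h_pair, c_v_pair, g_h, g_v, eps=1e-6):
    """Edge-directed choice of the missing C value at an R/B pixel.

    c_h_pair / c_v_pair: the two horizontal / vertical C neighbours;
    g_h, g_v: horizontal / vertical gradient magnitudes at the pixel.
    Simple neighbour averages stand in for the patent's interpolation
    formulas, which are published only as images.
    """
    c_h = 0.5 * (c_h_pair[0] + c_h_pair[1])   # horizontal interpolation
    c_v = 0.5 * (c_v_pair[0] + c_v_pair[1])   # vertical interpolation
    c_c = 0.5 * (c_h + c_v)                   # central interpolation
    if g_h + eps < g_v:   # weaker horizontal gradient: edge runs horizontally
        return c_h
    if g_v + eps < g_h:   # weaker vertical gradient: interpolate vertically
        return c_v
    return c_c            # no dominant direction: use the central estimate
```

Interpolating along the weaker gradient avoids averaging across an edge, which would produce zipper artifacts in the reconstructed C plane.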
3. The multi-frame dynamic range extension method for the RCCB image sensor of claim 2, wherein calculating the fusion weight coefficient w of each pixel in step C comprises the steps of:
using (x, y) to denote the coordinates of any element in the image and its luminance value, and Y1(x, y) to denote the luminance value of the current-position pixel of the input image I1;
constructing a plane rectangular coordinate system whose horizontal axis is luminance and whose vertical axis is the fusion weight coefficient;
obtaining the abscissas x1, x2, x3, x4, x5 of a piecewise polyline equation from the system configuration;
obtaining the ordinates y1, y2, y3 of the piecewise polyline equation from the system configuration;
constructing the piecewise polyline equation from the abscissas x1 to x5 and the ordinates y1 to y3, performing coefficient mapping on the luminance Y1(x, y), and calculating the fusion weight coefficient w of the current position for the input image I2 [mapping formula given as an image in the original].
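The luminance-to-weight mapping of claim 3 is a piecewise polyline with configured break points. The exact pairing of the five abscissas with the three ordinates is shown only in an image, but any such polyline can be evaluated by linear interpolation between knots with clamping outside them; a sketch assuming one ordinate per abscissa:

```python
import numpy as np

def fusion_weight(y, xs, ys):
    """Evaluate a piecewise polyline weight curve at luminance y.

    xs: configured abscissas (luminance break points, ascending);
    ys: configured ordinates (weights at those break points).
    np.interp interpolates linearly between knots and clamps to
    ys[0] / ys[-1] outside [xs[0], xs[-1]].
    """
    return float(np.interp(y, xs, ys))
```

For example, with knots (0, 0), (0.4, 0), (0.6, 1), (1, 1) the weight stays 0 for dark pixels and ramps to 1 as the long exposure approaches saturation.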
4. The multi-frame dynamic range extension method for the RCCB image sensor of claim 1 or 3, wherein the reference value D of step D is calculated by obtaining from the system configuration the photoelectric conversion ratio of the current input raw data I1 and I2, and then calculating the reference value D [formula given as an image in the original].
5. The multi-frame dynamic range extension method for the RCCB image sensor of claim 1, wherein calculating the motion compensation coefficient m from the reference value D in step E comprises the steps of:
obtaining from the system configuration the threshold at which motion compensation starts;
obtaining from the system configuration the threshold at which motion is fully compensated;
calculating the motion compensation coefficient m [formula given as an image in the original];
and wherein obtaining the dynamic-range-extended output image O in step F comprises the steps of:
obtaining from the system configuration the photoelectric conversion ratio R of the current input raw data I1 and I2;
calculating the fusion result O of the current-position pixel from I1, I2, the fusion weight coefficient w, the motion compensation coefficient m and R [formula given as an image in the original].
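Claim 5 configures two thresholds — one where motion compensation starts and one where it is complete — and then blends using the weight, the motion coefficient and the conversion ratio R. Both formulas are images in the source; a linear ramp between the thresholds and a weight-gated blend are natural stand-ins (the ramp direction and the gating are assumptions):

```python
import numpy as np

def motion_coefficient(d, t_start, t_full):
    """Coefficient from the reference value d: 1 (no compensation, fuse
    fully) while d <= t_start, 0 (full compensation) once d >= t_full,
    and a linear ramp in between.  Shape and direction are assumptions."""
    return float(np.clip((t_full - d) / (t_full - t_start), 0.0, 1.0))

def fuse_pixel(i1, i2, w, m, ratio):
    """Step F: blend pixel i1 with the ratio-scaled pixel i2, gating the
    configured fusion weight w by the motion coefficient m."""
    a = w * m            # detected motion suppresses the contribution of i2
    return (1.0 - a) * i1 + a * ratio * i2
```

Gating w by m means a pixel flagged as moving falls back to the long exposure alone, avoiding ghosting in the fused result.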
6. A multi-frame dynamic range extension system for an RCCB image sensor, characterized in that the system is configured to perform the method of any of claims 1 to 5 and comprises a plurality of line buffer units, each line buffer unit being connected to an interface unit and each further being connected to a respective fusion unit, wherein one fusion unit is connected to an output unit, each fusion unit is further connected to a control unit, and the interface unit is configured to be connected to an image sensor unit.
7. The multi-frame dynamic range extension system for an RCCB image sensor of claim 6, wherein: there are four line buffer units, namely a first, a second, a third and a fourth line buffer unit; the first and second line buffer units both feed the first fusion unit, while the third and fourth line buffer units feed the second and third fusion units respectively; the first, second and third fusion units are connected in sequence, with the third fusion unit feeding the output unit; and the first, second and third fusion units are all connected to the control unit.
8. The multi-frame dynamic range extension system for an RCCB image sensor of claim 6 or 7, wherein: the plurality of line buffer units respectively buffer and align the multiple sets of image data, denoted I1, I2, I3 and I4, generated by the image sensor at different photoelectric conversion rates.
9. The multi-frame dynamic range extension system for an RCCB image sensor of claim 7, wherein: the first fusion unit detects and compensates the motion-induced luminance shift in I2, fuses the I1 and I2 image data, and outputs the fused image F1; the second fusion unit detects and compensates I3, fuses the F1 and I3 image data, and outputs the fused image F2; the third fusion unit detects and compensates I4, fuses the F2 and I4 image data, and outputs the fused image F3.
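The three fusion units of claims 7 and 9 form a fold over the exposure set: each stage fuses the running result with the next frame. With any two-frame fusion function, the cascade can be sketched as follows (the ratio convention and names are illustrative):

```python
def cascade_fuse(frames, ratios, fuse):
    """Fold four exposures into one through three fusion stages, as in the
    first/second/third fusion units of claim 9.  fuse(a, b, r) is any
    two-frame fusion; ratios[k] is the conversion ratio applied to frame
    k+1 relative to the running result (an assumed convention).
    """
    out = frames[0]                      # start from I1
    for frame, r in zip(frames[1:], ratios):
        out = fuse(out, frame, r)        # produces F1, F2, F3 in turn
    return out
```

Because each stage only needs the running result and one new frame, the structure maps directly onto the pipelined line-buffer/fusion-unit hardware of claim 7.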
10. The multi-frame dynamic range extension system for an RCCB image sensor of claim 7 or 9, wherein: the image sensor unit generates raw data at four different photoelectric conversion rates in a line-alternating manner; the interface unit connects to the image sensor unit, separates the line-alternating image data it generates, and feeds the data into the system; the control unit provides the configuration-parameter storage the system requires; and the output unit delivers the final fused image to an external or next-stage processing unit in a given interface format.
CN202110784469.9A 2021-07-12 2021-07-12 Multi-frame dynamic range extension method and system for RCCB image sensor Pending CN113242389A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110784469.9A CN113242389A (en) 2021-07-12 2021-07-12 Multi-frame dynamic range extension method and system for RCCB image sensor


Publications (1)

Publication Number Publication Date
CN113242389A true CN113242389A (en) 2021-08-10

Family

ID=77135425

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110784469.9A Pending CN113242389A (en) Multi-frame dynamic range extension method and system for RCCB image sensor

Country Status (1)

Country Link
CN (1) CN113242389A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102959957A (en) * 2010-07-06 2013-03-06 皇家飞利浦电子股份有限公司 Generation of high dynamic range images from low dynamic range images in multi-view video coding
CN102986214A (en) * 2010-07-06 2013-03-20 皇家飞利浦电子股份有限公司 Generation of high dynamic range images from low dynamic range images
CN101917629A (en) * 2010-08-10 2010-12-15 浙江大学 Green component and color difference space-based Bayer format color interpolation method
CN102254301A (en) * 2011-07-22 2011-11-23 西安电子科技大学 Demosaicing method for CFA (color filter array) images based on edge-direction interpolation
CN104881843A (en) * 2015-06-10 2015-09-02 京东方科技集团股份有限公司 Image interpolation method and image interpolation apparatus
CN110418065A (en) * 2018-04-27 2019-11-05 北京展讯高科通信技术有限公司 High dynamic range images motion compensation process, device and electronic equipment
CN111131718A (en) * 2019-07-16 2020-05-08 深圳市艾为智能有限公司 Multi-exposure image fusion method and system with LED flicker compensation function
US20210042881A1 (en) * 2019-08-09 2021-02-11 Samsung Electronics Co., Ltd. Method and apparatus for image processing


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210810