CN106572312B - Panoramic video self-adaptive illumination compensation method and system - Google Patents

Panoramic video self-adaptive illumination compensation method and system Download PDF

Info

Publication number
CN106572312B
CN106572312B CN201611007942.8A CN201611007942A CN106572312B CN 106572312 B CN106572312 B CN 106572312B CN 201611007942 A CN201611007942 A CN 201611007942A CN 106572312 B CN106572312 B CN 106572312B
Authority
CN
China
Prior art keywords
image
condition
parameter
value
illumination compensation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201611007942.8A
Other languages
Chinese (zh)
Other versions
CN106572312A (en
Inventor
马国强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Mengwang Video Co ltd
Original Assignee
Shenzhen Mengwang Video Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Mengwang Video Co ltd filed Critical Shenzhen Mengwang Video Co ltd
Priority to CN201611007942.8A priority Critical patent/CN106572312B/en
Publication of CN106572312A publication Critical patent/CN106572312A/en
Application granted granted Critical
Publication of CN106572312B publication Critical patent/CN106572312B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/71Circuitry for evaluating the brightness variation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/76Circuitry for compensating brightness variation in the scene by influencing the image signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a panoramic video self-adaptive illumination compensation method and a panoramic video self-adaptive illumination compensation system. On the other hand, the method analyzes the brightness of the panoramic spliced image and determines the type needing illumination compensation; then, the identification degree of an object in the scene with insufficient illumination is improved by utilizing a global compensation mode; the backlight defect image is solved by using a local compensation mode, so that the spliced panoramic image has a virtual natural light transition effect, and the advantages of panoramic video effect and calculated amount are improved.

Description

Panoramic video self-adaptive illumination compensation method and system
Technical Field
The invention relates to the field of video illumination compensation, in particular to a panoramic video self-adaptive illumination compensation method and system.
Background
The panoramic video can contain 360-degree picture range information due to the horizontal visual angle, so that the panoramic video has wide application fields, such as panoramic video monitoring, virtual environment construction, aerial shooting, commercial display and the like. The method for synthesizing the panoramic video by adopting the multi-path video code streams is adopted in many places because of convenient implementation and low cost. However, due to the problem of light conditions, the panoramic video always has the problems of the phenomenon of simultaneous smooth and reverse light, low image recognition degree in the foggy and rainy days, light change in different time periods and the like. On one hand, for actual monitoring or when a user wants to acquire complete display information, the conventional image stitching algorithm does not consider the problems, so that the presentation effect of the panoramic image acquired by direct stitching is reduced; on the other hand, the image frames of the panoramic video stream have strong correlation and carry compressed information, and if the conventional panoramic image splicing technology is directly adopted, unnecessary calculation amount is generated, so that real-time presentation is influenced.
Disclosure of Invention
The embodiment of the invention aims to provide a panoramic video self-adaptive illumination compensation method, and aims to solve the problems of poor image splicing effect and large calculated amount in the prior art.
The embodiment of the invention is realized in such a way that a panoramic video self-adaptive illumination compensation method comprises the following steps:
step J0: if the parameter par is judgedtIf the value is 1, the step J1 is carried out, otherwise, the step J3 is carried out;
step J1: computing
Figure GDA0002441501460000011
Is determined as statistic, including
Figure GDA0002441501460000012
Figure GDA0002441501460000013
Step J2: will be provided with
Figure GDA0002441501460000015
In descending order, is recorded as
Figure GDA0002441501460000014
The corresponding first, second and third distribution parameters are recorded as
Figure GDA0002441501460000021
Figure GDA0002441501460000022
The corresponding image is recorded as
Figure GDA0002441501460000023
Figure GDA0002441501460000024
Step J3: if (par)t1 and
Figure GDA0002441501460000025
) Or (par)t0 and part-1A global compensation mode is adopted during illumination compensation), and then the system enters the global compensation mode; otherwise, entering a local illumination compensation mode;
step J4: frame t image of N spliced code streams after illumination compensation
Figure GDA0002441501460000026
Figure GDA0002441501460000027
Carrying out conventional panoramic image splicing;
step J5: let t be t + 1;
step J6: if the code stream is spliced, ending; otherwise, re-enter step J0;
wherein the content of the first and second substances,
Figure GDA0002441501460000028
a t frame image representing the nth spliced code stream;
Figure GDA0002441501460000029
to represent
Figure GDA00024415014600000210
The peak value parameter of (a);
Figure GDA00024415014600000211
respectively representing a first distribution parameter, a second distribution parameter and a third distribution parameter; partA judgment parameter representing the t-th frame panorama; part-1A judgment parameter representing the t-1 th frame panorama; thres1Is the first decision threshold value for the first time,
another objective of an embodiment of the present invention is to provide a panoramic video adaptive illumination compensation system, which includes:
the parameter judgment module is used for judging whether the judgment parameter is 1 or not, entering the judgment statistic calculation device, and otherwise, entering the first judgment processing module;
the judgment parameter calculation method comprises the following steps:
Figure GDA00024415014600000212
wherein the content of the first and second substances,
Figure GDA00024415014600000213
condition 1 represents that t is 1 or
Figure GDA00024415014600000214
For intra-predicted frames or (there is one)
Figure GDA00024415014600000215
) (ii) a Condition 2 represents
Figure GDA00024415014600000216
Is an intra-frame prediction block or at least comprises one intra-frame prediction sub-block; partA judgment parameter representing the t-th frame panorama;
Figure GDA00024415014600000217
representing the t frame image of the nth spliced code stream, wherein the initial value of t is 1;
Figure GDA00024415014600000218
representing an intermediate variable; n is the number of code streams of the spliced panoramic video; sum (variable | condition) represents summing the variables that satisfy the condition;
Figure GDA00024415014600000219
Figure GDA00024415014600000220
to represent
Figure GDA00024415014600000221
Line ii jj decoding block; the size of the block is 16x16(H264 standard), 64x64(HEVC), when the block is further divided, these smaller blocks are called sub-blocks; bkw and bkh respectively indicate the number of columns and the number of rows of a frame of image in units of blocks after the image is divided into blocks.
Judgment statistic calculation device, calculation
Figure GDA0002441501460000031
The judgment statistic of (1);
a peak parameter descending order module for aligning
Figure GDA0002441501460000032
The peak value parameters are arranged in a descending order, and the image peak value parameter sequence after the descending order is obtained, and the corresponding first distribution parameter sequence and the second distribution parameter sequenceThe distribution parameter sequence, the third distribution parameter sequence and the corresponding image sequence;
the method specifically comprises the following steps: will be provided with
Figure GDA0002441501460000033
In descending order, is recorded as
Figure GDA0002441501460000034
The corresponding first, second and third distribution parameters are recorded as
Figure GDA0002441501460000035
Figure GDA0002441501460000036
The corresponding image is recorded as
Figure GDA0002441501460000037
Figure GDA0002441501460000038
To represent
Figure GDA0002441501460000039
The peak value parameter of (a);
Figure GDA00024415014600000310
a t frame image representing the nth spliced code stream;
a first judgment processing module for judging if (par) is satisfiedt1 and
Figure GDA00024415014600000311
) Or (par)t0 and part-1A global compensation mode is adopted during illumination compensation), and then the system enters a global illumination compensation device; otherwise, entering a local illumination compensation device; wherein part-1A judgment parameter representing the t-1 th frame panorama; thres1Is a first decision threshold, Thres1≤100;
Local illumination compensation means for judging if partWhen 1, then pair
Figure GDA00024415014600000312
Illumination compensation is carried out; otherwise, set up
Figure GDA00024415014600000313
Is composed of
Figure GDA00024415014600000314
The next frame of the code stream is spliced, and then the code stream is paired
Figure GDA00024415014600000315
Illumination compensation is carried out; wherein the content of the first and second substances,
Figure GDA00024415014600000316
show that
Figure GDA00024415014600000317
The Nth image after descending order arrangement;
Figure GDA00024415014600000318
a t-1 frame image representing the nth spliced code stream;
the global illumination compensation device is used for carrying out global illumination compensation on the images corresponding to the N spliced code stream image peak value parameter sequences after the descending order arrangement;
the panoramic image splicing module is used for splicing t frame images of N spliced code streams after illumination compensation
Figure GDA00024415014600000319
Figure GDA00024415014600000320
Carrying out conventional panoramic image splicing;
a frame number setting module, configured to set t as t + 1;
and the second judgment processing module is used for judging that the code stream is spliced completely, ending the process, and otherwise, re-entering the parameter judgment module.
The invention has the advantages of
The method determines whether the illumination compensation parameters need to be recalculated or not through the code stream information so as to reduce the calculated amount of the multi-path code stream splicing. On the other hand, the method analyzes the brightness of the panoramic spliced image and determines the type needing illumination compensation; then, the identification degree of an object in the scene with insufficient illumination is improved by utilizing a global compensation mode; and solving the backlight defect image by using a local compensation mode, so that the spliced panoramic image has a virtual and natural light ray transition effect. Therefore, the advantages of panoramic video effect and calculated amount in applications such as monitoring and the like are improved.
Drawings
FIG. 1 is a flow chart of a panoramic video adaptive illumination compensation method according to a preferred embodiment of the present invention;
FIG. 2 is a flow chart of a method of the local illumination compensation mode of FIG. 1;
FIG. 3 is a flow chart of a global illumination compensation pattern method of FIG. 1;
FIG. 4 is a block diagram of a panoramic video adaptive illumination compensation system according to a preferred embodiment of the present invention;
fig. 5 is a structural view of the partial illumination compensation apparatus of fig. 4.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail below with reference to the accompanying drawings and examples, and for convenience of description, only parts related to the examples of the present invention are shown. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The method of the embodiment of the invention determines whether the illumination compensation parameters need to be recalculated or not through the code stream information, thereby reducing the calculation amount of the multi-path code stream splicing. On the other hand, the method analyzes the brightness of the panoramic spliced image and determines the type needing illumination compensation; then, the identification degree of an object in the scene with insufficient illumination is improved by utilizing a global compensation mode; and solving the backlight defect image by using a local compensation mode, so that the spliced panoramic image has a virtual and natural light ray transition effect. Thereby promoting the advantages of panoramic video effect and calculation amount.
Example one
FIG. 1 is a flow chart of a panoramic video adaptive illumination compensation method according to a preferred embodiment of the present invention; the method comprises the following steps:
step 0: if the judgment parameter is 1, then Step1 is entered, otherwise, Step3 is entered.
The judgment parameter calculation method comprises the following steps:
Figure GDA0002441501460000051
wherein the content of the first and second substances,
Figure GDA0002441501460000052
condition 1 represents that t is 1 or
Figure GDA0002441501460000053
For intra-predicted frames or (there is one)
Figure GDA0002441501460000054
) (ii) a Condition 2 represents
Figure GDA0002441501460000055
Is an intra-frame prediction block or at least comprises one intra-frame prediction sub-block; partA judgment parameter representing the t-th frame panorama;
Figure GDA0002441501460000056
representing the t frame image of the nth spliced code stream, wherein the initial value of t is 1;
Figure GDA0002441501460000057
representing an intermediate variable; n is the number of code streams of the spliced panoramic video; sum (variable | condition) represents summing the variables that satisfy the condition;
Figure GDA0002441501460000058
Figure GDA0002441501460000059
to represent
Figure GDA00024415014600000510
Line ii jj decoding block; the size of the block is 16x16(H264 standard), 64x64(HEVC), when the block is further divided, these smaller blocks are called sub-blocks; bkw and bkh respectively indicate the number of columns and the number of rows of a frame of image in units of blocks after the image is divided into blocks.
Step 1: computing
Figure GDA00024415014600000511
N is more than or equal to 1 and less than or equal to N.
Step 11: computing
Figure GDA00024415014600000512
Wherein the content of the first and second substances,
Figure GDA00024415014600000513
the luminance k distribution value of the t frame image of the nth spliced code stream is represented, k represents the luminance value, k is more than or equal to 0 and less than or equal to 255,
Figure GDA00024415014600000514
to represent
Figure GDA00024415014600000515
The luminance value of the ith row and jth column of pixels after decoding; width and height represent the length and width resolution of the image.
Step 12: calculating a set of splice statistics parameters comprising
Figure GDA00024415014600000516
Figure GDA00024415014600000517
Figure GDA00024415014600000518
Figure GDA00024415014600000519
Figure GDA00024415014600000520
Wherein the content of the first and second substances,
Figure GDA00024415014600000521
to represent
Figure GDA00024415014600000522
The peak value parameter of (a); max (variable | condition) and min (variable | condition) respectively represent the maximum value and the minimum value of the variables satisfying the condition; arc max (variable | argument condition) represents the argument value corresponding to the maximum value of the variable, i.e., the argument value
Figure GDA0002441501460000061
Expression solution
Figure GDA0002441501460000062
Then the corresponding k value at this time is obtained, namely the expression
Figure GDA0002441501460000063
A value of (d);
Figure GDA0002441501460000064
respectively representing a first, a second and a third distribution parameter.
Step 2: will be provided with
Figure GDA00024415014600000627
In descending order, is recorded as
Figure GDA0002441501460000065
The corresponding first, second and third distribution parameters are recorded as
Figure GDA0002441501460000066
Figure GDA0002441501460000067
It corresponds toIs recorded as
Figure GDA0002441501460000068
Figure GDA0002441501460000069
Figure GDA00024415014600000610
To represent
Figure GDA00024415014600000611
The peak value parameter of (a);
Figure GDA00024415014600000612
a t frame image representing the nth spliced code stream;
step 3: if (par)t1 and
Figure GDA00024415014600000613
) Or (par)t0 and part-1A global compensation mode is adopted during illumination compensation), then a Step5 global compensation mode is entered; otherwise, entering Step4 local illumination compensation mode.
Wherein part-1A judgment parameter representing the t-1 th frame panorama; thres1For the first decision threshold, Thres may be generally taken1≤100。
FIG. 2 is a flow chart of a method of the local illumination compensation mode of FIG. 1;
step 4: if partWhen 1, then pair
Figure GDA00024415014600000614
Illumination compensation is carried out; otherwise, set up
Figure GDA00024415014600000615
Is composed of
Figure GDA00024415014600000616
The next frame of the code stream is spliced, and then the code stream is paired
Figure GDA00024415014600000617
And carrying out illumination compensation.
Wherein the content of the first and second substances,
Figure GDA00024415014600000618
show that
Figure GDA00024415014600000619
The Nth image after descending order arrangement;
Figure GDA00024415014600000620
a t-1 frame image representing the nth spliced code stream;
step 40: obtaining
Figure GDA00024415014600000621
First and second reference images.
Wherein the first reference image is
Figure GDA00024415014600000622
The left stitched image and the second reference image are
Figure GDA00024415014600000623
And the right side is spliced with the image.
Case1 direct acquisition when the image capturing position is known
Figure GDA00024415014600000624
The first and second reference images of (1);
case 2: when the position relation of the spliced panoramic image sequence is unknown, the first reference image and the second reference image can be obtained by the following method.
If par t1, then directly choose
Figure GDA00024415014600000625
The code stream corresponding to the first and second reference images is used as the next frame image
Figure GDA00024415014600000626
The first and second reference images of (1); otherwise, the first reference image is acquired by the following method:
step A1: computing
Figure GDA0002441501460000071
std (variable | condition) represents the mean square error of the variables satisfying the condition, and int represents the rounding operation.
Figure GDA0002441501460000072
To represent
Figure GDA0002441501460000073
The luminance value of the ith row and jth column of pixels after decoding;
Figure GDA0002441501460000074
representing the t frame image of the Nth spliced code stream, wherein the initial value of t is 1;
Figure GDA0002441501460000075
to represent
Figure GDA0002441501460000076
The luminance value of the ith row and the jth + width/2 column of pixels after decoding;
step A2:
Figure GDA0002441501460000077
expression solution
Figure GDA0002441501460000078
The minimum K value in the sequence is generally more than or equal to 1 and less than or equal to 5;
step A3: setting the image set corresponding to the minimum K values in the step A2 as an alternative set, and then using well-known image feature matching to the images in the alternative set to find out the final image
Figure GDA0002441501460000079
The first reference image of (1).
Acquiring a second reference image:
step (ii) ofB1: computing
Figure GDA00024415014600000710
Step B2:
Figure GDA00024415014600000711
expression solution
Figure GDA00024415014600000712
The minimum K value in the sequence is generally more than or equal to 1 and less than or equal to 5;
step B3: setting the image set corresponding to the minimum K values in the step B2 as an alternative set, and then using well-known image feature matching to the images in the alternative set to find out the final image
Figure GDA00024415014600000713
The second reference image of (1).
Step 41: calculating compensation parameters
refmaxMax (ref1_ y (i, j), ref2_ y | area condition 1)
Zone condition 1:
the region of the ref1_ y (i, j) ∈ right side 1/2 of the first reference image or the region of the ref2_ y (i, j) ∈ left side 1/2 of the second reference image;
ref1_ y (i, j) represents the luminance value of the ith row and jth column pixel of the first reference image;
ref2_ y (i, j) represents the luminance value of the ith row and jth column pixel of the second reference image;
refminmin (ref1_ y (i, j), ref2_ y | area condition 2);
zone condition 2:
the region of the right side 1/2 of the first reference image of ref1_ y (i, j) ∈ or the region of the left side 1/2 of the second reference image of ref2_ y (i, j) ∈
Figure GDA00024415014600000714
Wherein refmax、refminIndicates the compensation parameters (compensation parameter 1, compensation parameter 2); max (variable | Condition), min (variable | Condition)) Respectively representing the maximum and minimum values of variables meeting the conditions;
Figure GDA0002441501460000081
representing the brightness k distribution value of the Nth image;
Figure GDA0002441501460000082
the brightness k distribution value, max, of the t frame image representing the Nth spliced code streamN、minNIndicates the compensation parameters (compensation parameter 3, compensation parameter 4);
step 42: using compensation parameters, pair
Figure GDA0002441501460000083
The image is illumination compensated and then proceeds to Step 6.
Figure GDA0002441501460000084
FIG. 3 is a flow chart of a global illumination compensation pattern method of FIG. 1;
step 5: global illumination compensation mode.
Step 50: if partWhen equal to 0, set up
Figure GDA0002441501460000085
Is composed of
Figure GDA0002441501460000086
Splicing the next frame of the code stream; otherwise, go directly to Step 51.
Step 51: to pair
Figure GDA0002441501460000087
The image of (2) is first compensated for illumination.
Figure GDA0002441501460000088
Thres2For the second determination of the threshold value, it is generally preferable
Figure GDA0002441501460000089
min (variable 1, variable 2) represents the minimum value between variable 1 and variable 2;
Figure GDA00024415014600000810
stretching
Figure GDA00024415014600000811
Any linear or non-linear monotonically increasing function may be selected; compression
Figure GDA00024415014600000812
Any linear or non-linear monotonically increasing decreasing function may be selected.
Figure GDA00024415014600000813
Show that
Figure GDA00024415014600000814
After the peak parameters are arranged in a descending order, the nth peak parameter is obtained;
step 52: to pair
Figure GDA00024415014600000815
Illumination compensation is performed, and then Step6 is entered.
Case1 direct acquisition when the image capturing position is known
Figure GDA00024415014600000816
The first and second reference images of (1); then, the first and second reference images after illumination compensation are used for the image pair by using the method of Step41-Step42
Figure GDA00024415014600000817
Illumination compensation is performed, and then Step6 is entered.
Case 2: when the position relation of the spliced panoramic image sequence is unknown, the spliced panoramic image sequence without illumination compensation can be obtained by a method of Case2 in Step40
Figure GDA00024415014600000818
The first and second reference images of (1); then, the first and second reference images after illumination compensation are used for the image pair by using the method of Step41-Step42
Figure GDA0002441501460000091
Illumination compensation is performed, and then Step6 is entered.
Step 6: frame t image of N spliced code streams after illumination compensation
Figure GDA0002441501460000092
And carrying out conventional panoramic image stitching.
Step7:t=t+1。
Step 8: if the code stream is spliced, ending; otherwise, Step0 is re-entered.
Example two
FIG. 4 is a block diagram of a panoramic video adaptive illumination compensation system according to a preferred embodiment of the present invention; the system comprises:
the parameter judgment module is used for judging whether the judgment parameter is 1 or not, entering the judgment statistic calculation device, and otherwise, entering the first judgment processing module;
the judgment parameter calculation method comprises the following steps:
Figure GDA0002441501460000093
wherein the content of the first and second substances,
Figure GDA0002441501460000094
condition 1 represents that t is 1 or
Figure GDA0002441501460000095
For intra-predicted frames or (there is one)
Figure GDA0002441501460000096
) (ii) a Condition 2 represents
Figure GDA0002441501460000097
For intra-frame prediction of blocks or at least packetsContaining an intra-frame prediction sub-block; partA judgment parameter representing the t-th frame panorama;
Figure GDA0002441501460000098
representing the t frame image of the nth spliced code stream, wherein the initial value of t is 1;
Figure GDA0002441501460000099
representing an intermediate variable; n is the number of code streams of the spliced panoramic video; sum (variable | condition) represents summing the variables that satisfy the condition;
Figure GDA00024415014600000910
Figure GDA00024415014600000911
to represent
Figure GDA00024415014600000912
Line ii jj decoding block; the size of the block is 16x16(H264 standard), 64x64(HEVC), when the block is further divided, these smaller blocks are called sub-blocks; bkw and bkh respectively indicate the number of columns and the number of rows of a frame of image in units of blocks after the image is divided into blocks.
Judgment statistic calculation device, calculation
Figure GDA00024415014600000913
N is more than or equal to 1 and less than or equal to N;
a peak parameter descending order module for aligning
Figure GDA00024415014600000914
The peak value parameters are arranged in a descending order, and an image peak value parameter sequence after the descending order, a first distribution parameter sequence, a second distribution parameter sequence, a third distribution parameter sequence and a corresponding image sequence are obtained;
the method specifically comprises the following steps: will be provided with
Figure GDA00024415014600000915
In descending order, is recorded as
Figure GDA00024415014600000916
The corresponding first, second and third distribution parameters are recorded as
Figure GDA0002441501460000101
Figure GDA0002441501460000102
The corresponding image is recorded as
Figure GDA0002441501460000103
Figure GDA0002441501460000104
Figure GDA0002441501460000105
To represent
Figure GDA0002441501460000106
The peak value parameter of (a);
Figure GDA0002441501460000107
a t frame image representing the nth spliced code stream;
a first judgment processing module for judging if (par) is satisfiedt1 and
Figure GDA0002441501460000108
) Or (par)t0 and part-1A global compensation mode is adopted during illumination compensation), and then the system enters a global illumination compensation device; otherwise, entering the local illumination compensation device.
Wherein part-1A judgment parameter representing the t-1 th frame panorama; thres1For the first decision threshold, Thres may be generally taken1≤100。
Local illumination compensation means for judging if partWhen 1, then pair
Figure GDA0002441501460000109
Illumination compensation is carried out; otherwise, set up
Figure GDA00024415014600001010
Is composed of
Figure GDA00024415014600001011
The next frame of the code stream is spliced, and then the code stream is paired
Figure GDA00024415014600001012
Illumination compensation is carried out; wherein the content of the first and second substances,
Figure GDA00024415014600001013
show that
Figure GDA00024415014600001014
The Nth image after descending order arrangement;
Figure GDA00024415014600001015
a t-1 frame image representing the nth spliced code stream;
the global illumination compensation device is used for carrying out global illumination compensation on the images corresponding to the N spliced code stream image peak value parameter sequences after the descending order arrangement;
the panoramic image splicing module is used for splicing t frame images of N spliced code streams after illumination compensation
Figure GDA00024415014600001016
Figure GDA00024415014600001017
And carrying out conventional panoramic image stitching.
A frame number setting module, configured to set t as t + 1;
and the second judgment processing module is used for judging that the code stream is spliced completely, ending the process, and otherwise, re-entering the parameter judgment module.
Further, the judgment statistic calculation means further includes:
an image brightness distribution value calculating module for calculating
Figure GDA00024415014600001018
Wherein the content of the first and second substances,
Figure GDA00024415014600001019
the luminance k distribution value of the t frame image of the nth spliced code stream is represented, k represents the luminance value, k is more than or equal to 0 and less than or equal to 255,
Figure GDA00024415014600001020
to represent
Figure GDA00024415014600001021
The luminance value of the ith row and jth column of pixels after decoding; width and height represent the length and width resolution of the image.
The judgment statistic parameter set calculation module is used for calculating the splicing statistic parameter set;
Figure GDA0002441501460000111
Figure GDA0002441501460000112
Figure GDA0002441501460000113
Figure GDA0002441501460000114
wherein the content of the first and second substances,
Figure GDA0002441501460000115
to represent
Figure GDA0002441501460000116
The peak value parameter of (a); max (variable | condition) and min (variable | condition) respectively represent the maximum value and the minimum value of the variables satisfying the condition; arc max (variable | argument condition) indicates that the variable is a maximum valueThe value of the time-dependent argument, i.e.
Figure GDA0002441501460000117
Expression solution
Figure GDA0002441501460000118
Then the corresponding k value at this time is obtained, namely the expression
Figure GDA0002441501460000119
A value of (d);
Figure GDA00024415014600001110
respectively representing a first, a second and a third distribution parameter.
FIG. 5 is a block diagram of the local illumination compensation apparatus of FIG. 4; further, the local illumination compensation apparatus further includes:
a local reference image acquisition module for acquiring
Figure GDA00024415014600001111
First and second reference images.
Wherein the first reference image is
Figure GDA00024415014600001112
The left stitched image and the second reference image are
Figure GDA00024415014600001113
And the right side is spliced with the image.
Case1 direct acquisition when the image capturing position is known
Figure GDA00024415014600001114
The first and second reference images of (1);
case 2: when the position relation of the spliced panoramic image sequence is unknown, the first reference image and the second reference image can be obtained by the following method.
If par t1, then directly choose
Figure GDA00024415014600001115
The code stream corresponding to the first and second reference images is used as the next frame image
Figure GDA00024415014600001116
The first and second reference images of (1); otherwise, the first reference image is acquired by the following method:
acquiring a first reference image:
step A1: computing
Figure GDA00024415014600001117
std (variable | condition) represents the mean square error of the variables satisfying the condition, and int represents the rounding operation.
Wherein the content of the first and second substances,
Figure GDA0002441501460000121
to represent
Figure GDA0002441501460000122
The luminance value of the ith row and jth column of pixels after decoding;
Figure GDA0002441501460000123
representing the t frame image of the Nth spliced code stream, wherein the initial value of t is 1;
Figure GDA0002441501460000124
to represent
Figure GDA0002441501460000125
The luminance value of the ith row and the jth + width/2 column of pixels after decoding;
step A2:
Figure GDA0002441501460000126
expression solution
Figure GDA0002441501460000127
The minimum K value in the sequence is generally more than or equal to 1 and less than or equal to 5;
step A3: setting the image set corresponding to the minimum K values in the step A2 as an alternative set, and then enabling the images in the alternative setFinding the final image by using known image characteristic matching
Figure GDA0002441501460000128
The first reference image of (1).
Acquiring a second reference image:
step B1: computing
Figure GDA0002441501460000129
Step B2:
Figure GDA00024415014600001210
expression solution
Figure GDA00024415014600001211
The minimum K value in the sequence is generally more than or equal to 1 and less than or equal to 5;
step B3: setting the image set corresponding to the minimum K values in the step B2 as an alternative set, and then using well-known image feature matching to the images in the alternative set to find out the final image
Figure GDA00024415014600001212
The second reference image of (1).
The first compensation parameter calculation module is used for calculating a compensation parameter;
refmaxmax (ref1_ y (i, j), ref2_ y | area condition 1)
Zone condition 1:
the region of the ref1_ y (i, j) ∈ right side 1/2 of the first reference image or the region of the ref2_ y (i, j) ∈ left side 1/2 of the second reference image;
ref1_ y (i, j) represents the luminance value of the ith row and jth column pixel of the first reference image;
ref2_ y (i, j) represents the luminance value of the ith row and jth column pixel of the second reference image
refminMin (ref1_ y (i, j), ref2_ y | area condition 2)
Zone condition 2:
the region of the right side 1/2 of the first reference image of ref1_ y (i, j) ∈ or the region of the left side 1/2 of the second reference image of ref2_ y (i, j) ∈
Figure GDA00024415014600001213
Wherein refmax、refminIndicates the compensation parameters (compensation parameter 1, compensation parameter 2); max (variable | condition) and min (variable | condition) respectively represent the maximum value and the minimum value of the variables satisfying the condition;
Figure GDA0002441501460000131
representing the brightness k distribution value of the Nth image;
Figure GDA0002441501460000132
the brightness k distribution value, max, of the t frame image representing the Nth spliced code streamN、minNIndicates the compensation parameters (compensation parameter 3, compensation parameter 4);
a first illumination compensation module for using the compensation parameter
Figure GDA0002441501460000133
And performing illumination compensation on the image, and then entering a panoramic image splicing module. The method specifically comprises the following steps:
Figure GDA0002441501460000134
Figure GDA0002441501460000135
to represent
Figure GDA0002441501460000136
The luminance value of the ith row and jth column of pixels after decoding;
further, the global illumination compensation device is used for judging if partWhen equal to 0, set up
Figure GDA0002441501460000137
Is composed of
Figure GDA0002441501460000138
Splicing the next frame of the code stream; otherwise, directly pair
Figure GDA0002441501460000139
The image is subjected to illumination compensation in a first way and then is subjected to
Figure GDA00024415014600001310
Performing illumination compensation by a second mode; then entering a panoramic image splicing module;
the first mode is as follows:
Figure GDA00024415014600001311
Thres2for the second determination of the threshold value, it is generally preferable
Figure GDA00024415014600001312
min (variable 1, variable 2) represents the minimum value between variable 1 and variable 2;
Figure GDA00024415014600001313
stretching
Figure GDA00024415014600001314
Any linear or non-linear monotonically increasing function may be selected; compression
Figure GDA00024415014600001315
Any linear or non-linear monotonically increasing decreasing function may be selected.
Figure GDA00024415014600001316
Show that
Figure GDA00024415014600001317
After the peak parameters are arranged in a descending order, the nth peak parameter is obtained;
the second mode is as follows:
case1 direct acquisition when the image capturing position is known
Figure GDA00024415014600001318
The first and second reference images of (1); then, the first and second reference images after illumination compensation are used, and the first compensation parameter calculation module and the first illumination compensation module are used for matching
Figure GDA00024415014600001319
And performing illumination compensation, and then entering a panoramic image splicing module.
Case 2: when the position relation of the spliced panoramic image sequence is unknown, the spliced panoramic image sequence without illumination compensation can be firstly obtained by a method of Case2 in the local reference image obtaining module
Figure GDA0002441501460000141
The first and second reference images of (1); then, the first and second reference images after illumination compensation are used, and the first compensation parameter calculation module and the first illumination compensation module are used for matching
Figure GDA0002441501460000142
And performing illumination compensation, and then entering a panoramic image splicing module.
It will be understood by those skilled in the art that all or part of the steps in the method according to the above embodiments may be implemented by hardware related to program instructions, and the program may be stored in a computer readable storage medium, such as ROM, RAM, magnetic disk, optical disk, etc.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (2)

1. A panoramic video adaptive illumination compensation method is characterized by comprising the following steps:
step J0: if the parameter par is judgedtIf the value is 1, the step J1 is carried out, otherwise, the step J3 is carried out; wherein, judgingParameter partThe calculation method comprises the following steps:
Figure FDA0002441501450000011
Figure FDA0002441501450000012
condition 1 represents that t is 1 or
Figure FDA0002441501450000013
For intra-predicted frames or for the presence of one
Figure FDA0002441501450000014
Condition 2 represents
Figure FDA0002441501450000015
Is an intra-frame prediction block or at least comprises one intra-frame prediction sub-block;
Figure FDA0002441501450000016
representing an intermediate variable;
n is the number of code streams of the spliced panoramic video;
sum (variable | condition) represents summing the variables that satisfy the condition;
Figure FDA0002441501450000017
Figure FDA0002441501450000018
to represent
Figure FDA0002441501450000019
Line ii jj decoding block;
bkw and bkh respectively represent the column number and the row number of a frame of image in a unit of block after the frame of image is divided into blocks;
step J1: computing
Figure FDA00024415014500000110
Is determined as statistic, including
Figure FDA00024415014500000111
Figure FDA00024415014500000112
The calculation
Figure FDA00024415014500000113
The judgment statistical quantity is specifically as follows:
computing
Figure FDA00024415014500000114
Wherein the content of the first and second substances,
Figure FDA00024415014500000115
the luminance k distribution value of the t frame image of the nth spliced code stream is represented, k represents the luminance value, k is more than or equal to 0 and less than or equal to 255,
Figure FDA00024415014500000116
to represent
Figure FDA00024415014500000117
The luminance value of the ith row and jth column of pixels after decoding; width and height represent the length and width resolution of the image;
calculating a set of splice statistics parameters comprising
Figure FDA00024415014500000118
Figure FDA00024415014500000119
Figure FDA00024415014500000120
Figure FDA00024415014500000121
Figure FDA0002441501450000021
Wherein the content of the first and second substances,
Figure FDA0002441501450000022
to represent
Figure FDA0002441501450000023
The peak value parameter of (a); max (variable | condition) and min (variable | condition) respectively represent the maximum value and the minimum value of the variables satisfying the condition;
Figure FDA0002441501450000026
represents a third distribution parameter; arc max (variable | argument condition) represents the argument value corresponding to the maximum value of the variable,
Figure FDA0002441501450000027
expression solution
Figure FDA0002441501450000028
Then the corresponding k value at this time is obtained, namely the expression
Figure FDA0002441501450000029
A value of (d);
step J2: will be provided with
Figure FDA00024415014500000210
In descending order, is recorded as
Figure FDA00024415014500000211
The corresponding first, second and third distribution parameters are recorded as
Figure FDA00024415014500000212
Figure FDA00024415014500000213
The corresponding image is recorded as
Figure FDA00024415014500000214
Figure FDA00024415014500000215
Step J3: if (par)t1 and
Figure FDA00024415014500000216
) Or (par)t0 and part-1A global compensation mode is adopted during illumination compensation), and then the system enters the global compensation mode; otherwise, entering a local illumination compensation mode;
the global illumination compensation mode specifically includes:
step 50: if partWhen equal to 0, set up
Figure FDA00024415014500000217
Is composed of
Figure FDA00024415014500000218
Splicing the next frame of the code stream; otherwise, go directly to Step 51;
step 51: to pair
Figure FDA00024415014500000219
The image is subjected to illumination compensation;
Figure FDA00024415014500000220
Thres2determining a threshold value for the second determination,
Figure FDA00024415014500000221
min (variable 1, variable 2) represents the minimum value between variable 1 and variable 2;
Figure FDA00024415014500000222
stretching
Figure FDA00024415014500000223
Is any linear or non-linear monotonically increasing function; compression
Figure FDA00024415014500000224
Is any linear or non-linear monotonically increasing decreasing function;
Figure FDA00024415014500000225
show that
Figure FDA00024415014500000226
After the peak parameters are arranged in a descending order, the nth peak parameter is obtained;
step 52: to pair
Figure FDA00024415014500000227
Illumination compensation is carried out, and then the step J4 is carried out;
case1: when the image capturing position is known, it can be directly acquired
Figure FDA00024415014500000228
The first and second reference images of (1); then, the first and second reference images after illumination compensation are used for the image pair by using the method of Step41-Step42
Figure FDA0002441501450000031
Illumination compensation is carried out, and then the step J4 is carried out;
case 2: when the position relation of the spliced panoramic image sequence is unknown, the position relation can be firstly compensated by using the illumination which is not carried outThe compensated spliced panoramic image sequence is acquired by a method of Case2 in Step40
Figure FDA0002441501450000032
The first and second reference images of (1); then, the first and second reference images after illumination compensation are used for the image pair by using the method of Step41-Step42
Figure FDA0002441501450000033
Illumination compensation is carried out, and then the step J4 is carried out;
wherein, the local illumination compensation mode specifically comprises:
if partWhen 1, then pair
Figure FDA0002441501450000034
Illumination compensation is carried out; otherwise, set up
Figure FDA0002441501450000035
Is composed of
Figure FDA0002441501450000036
The next frame of the code stream is spliced, and then the code stream is paired
Figure FDA0002441501450000037
Illumination compensation is carried out;
Figure FDA0002441501450000038
show that
Figure FDA0002441501450000039
The Nth image after descending order arrangement;
Figure FDA00024415014500000310
a t-1 frame image representing the nth spliced code stream;
the method specifically comprises the following steps:
step 40: obtaining
Figure FDA00024415014500000311
First and second reference images; the first reference image is
Figure FDA00024415014500000312
The left stitched image and the second reference image are
Figure FDA00024415014500000313
Splicing images on the right side;
wherein the obtaining is
Figure FDA00024415014500000314
The first and second reference images are specifically:
case1: when the image capturing position is known, directly acquiring
Figure FDA00024415014500000315
The first and second reference images of (1);
case 2: when the position relation of the spliced panoramic image sequence is unknown, the first reference image and the second reference image are obtained by the following method:
if part1, then directly choose
Figure FDA00024415014500000316
The code stream corresponding to the first and second reference images is used as the next frame image
Figure FDA00024415014500000317
The first and second reference images of (1); otherwise, the first reference image is acquired by the following method:
acquiring a first reference image:
step A1: computing
Figure FDA00024415014500000318
Wherein std (variable | condition) represents that the mean square error of the variable satisfying the condition is solved, and int represents rounding operation;
Figure FDA00024415014500000319
to represent
Figure FDA00024415014500000320
The luminance value of the ith row and jth column of pixels after decoding;
Figure FDA00024415014500000321
representing the t frame image of the Nth spliced code stream, wherein the initial value of t is 1;
Figure FDA00024415014500000322
to represent
Figure FDA00024415014500000323
The luminance value of the ith row and the jth + width/2 column of pixels after decoding;
step A2: min _ K (diff)t nInt (N/2) is more than or equal to N and less than or equal to N), representing that diff is solvedt nK is not less than 1 and not more than 5;
step A3: setting the image set corresponding to the minimum K values in the step A2 as an alternative set, then carrying out image feature matching on the images in the alternative set to find out the final image
Figure FDA0002441501450000041
A first reference image of (a);
acquiring a second reference image:
step B1: computing
Figure FDA0002441501450000042
Step B2: min _ K (diff)t nInt (N/2) is more than or equal to N and less than or equal to N), representing that diff is solvedt nK is not less than 1 and not more than 5;
step B3: setting the image set corresponding to the minimum K values in the step B2 as an alternative set, then carrying out image feature matching on the images in the alternative set to find out the final image
Figure FDA0002441501450000043
A second reference image of (a);
step 41: calculating a compensation parameter refmaxMax (ref1_ y (i, j), ref2_ y | area condition 1);
zone condition 1:
the region of the ref1_ y (i, j) ∈ right side 1/2 of the first reference image or the region of the ref2_ y (i, j) ∈ left side 1/2 of the second reference image;
ref1_ y (i, j) represents the luminance value of the ith row and jth column pixel of the first reference image;
ref2_ y (i, j) represents the luminance value of the ith row and jth column pixel of the second reference image;
refminmin (ref1_ y (i, j), ref2_ y | area condition 2);
zone condition 2:
the region of the ref1_ y (i, j) ∈ right side 1/2 of the first reference image or the region of the ref2_ y (i, j) ∈ left side 1/2 of the second reference image;
Figure FDA0002441501450000044
wherein refmax、refminRepresents a compensation parameter 1 and a compensation parameter 2; max (variable | condition) and min (variable | condition) respectively represent the maximum value and the minimum value of the variables satisfying the condition;
Figure FDA0002441501450000045
representing the brightness k distribution value of the Nth image;
Figure FDA0002441501450000046
the brightness k distribution value, max, of the t frame image representing the Nth spliced code streamN、minNRepresents a compensation parameter 3, a compensation parameter 4;
step 42: using compensation parameters, pair
Figure FDA0002441501450000047
Image illuminationAnd (4) compensating, and then entering a step J4, specifically:
Figure FDA0002441501450000048
step J4: frame t image of N spliced code streams after illumination compensation
Figure FDA0002441501450000049
Figure FDA00024415014500000410
Carrying out conventional panoramic image splicing;
step J5: let t be t + 1;
step J6: if the code stream is spliced, ending; otherwise, re-enter step J0;
wherein the content of the first and second substances,
Figure FDA0002441501450000051
a t frame image representing the nth spliced code stream;
Figure FDA0002441501450000052
to represent
Figure FDA0002441501450000053
The peak value parameter of (a);
Figure FDA0002441501450000054
respectively representing a first distribution parameter, a second distribution parameter and a third distribution parameter; partA judgment parameter representing the t-th frame panorama; part-1A judgment parameter representing the t-1 th frame panorama; thres1Is a first decision threshold.
2. A panoramic video adaptive illumination compensation system, the system comprising:
a parameter judgment module for judging the parameter partIf the value is 1, the step J1 is carried out, otherwise, the step J3 is carried out; wherein the judgment parameter partThe calculation method comprises the following steps:
Figure FDA0002441501450000055
Figure FDA0002441501450000056
condition 1 represents that t is 1 or
Figure FDA0002441501450000057
For intra-predicted frames or (there is one)
Figure FDA0002441501450000058
);
Condition 2 represents
Figure FDA0002441501450000059
Is an intra-frame prediction block or at least comprises one intra-frame prediction sub-block;
Figure FDA00024415014500000510
representing an intermediate variable;
n is the number of code streams of the spliced panoramic video;
sum (variable | condition) represents summing the variables that satisfy the condition;
Figure FDA00024415014500000511
Figure FDA00024415014500000512
to represent
Figure FDA00024415014500000513
Line ii jj decoding block;
bkw and bkh respectively represent the column number and the row number of a frame of image in a unit of block after the frame of image is divided into blocks;
step J1: computing
Figure FDA00024415014500000514
Is determined as statistic, including
Figure FDA00024415014500000515
Figure FDA00024415014500000516
The calculation
Figure FDA00024415014500000517
The judgment statistical quantity is specifically as follows:
computing
Figure FDA00024415014500000518
Wherein the content of the first and second substances,
Figure FDA00024415014500000519
the luminance k distribution value of the t frame image of the nth spliced code stream is represented, k represents the luminance value, k is more than or equal to 0 and less than or equal to 255,
Figure FDA00024415014500000520
to represent
Figure FDA00024415014500000521
The luminance value of the ith row and jth column of pixels after decoding; width and height represent the length and width resolution of the image;
calculating a set of splice statistics parameters comprising
Figure FDA00024415014500000522
Figure FDA0002441501450000061
Figure FDA0002441501450000062
Figure FDA0002441501450000063
Figure FDA0002441501450000064
Figure FDA0002441501450000065
Wherein the content of the first and second substances,
Figure FDA0002441501450000066
to represent
Figure FDA0002441501450000067
The peak value parameter of (a); max (variable | condition) and min (variable | condition) respectively represent the maximum value and the minimum value of the variables satisfying the condition; arc max (variable | argument condition) represents the argument value corresponding to the maximum value of the variable,
Figure FDA0002441501450000068
expression solution
Figure FDA0002441501450000069
Then the corresponding k value at this time is obtained, namely the expression
Figure FDA00024415014500000610
A value of (d);
step J2: will be provided with
Figure FDA00024415014500000611
In descending order, is recorded as
Figure FDA00024415014500000612
Corresponding to the first, second and third distribution parametersIs marked as
Figure FDA00024415014500000613
Figure FDA00024415014500000614
The corresponding image is recorded as
Figure FDA00024415014500000615
Figure FDA00024415014500000616
Step J3: if (par)t1 and
Figure FDA00024415014500000617
) Or (par)t0 and part-1A global compensation mode is adopted during illumination compensation), and then the system enters the global compensation mode; otherwise, entering a local illumination compensation mode;
the global illumination compensation mode specifically includes:
step 50: if partWhen equal to 0, set up
Figure FDA00024415014500000618
Is composed of
Figure FDA00024415014500000619
Splicing the next frame of the code stream; otherwise, go directly to Step 51;
step 51: to pair
Figure FDA00024415014500000620
The image is subjected to illumination compensation;
Figure FDA00024415014500000621
Thres2in order to determine the threshold value for the second time,
Figure FDA00024415014500000622
min (variable 1, variable 2) represents the minimum value between variable 1 and variable 2;
Figure FDA00024415014500000623
stretching
Figure FDA00024415014500000624
Is any linear or non-linear monotonically increasing function; compression
Figure FDA00024415014500000625
Is any linear or non-linear monotonically increasing decreasing function;
Figure FDA00024415014500000626
show that
Figure FDA00024415014500000627
After the peak parameters are arranged in a descending order, the nth peak parameter is obtained;
step 52: to pair
Figure FDA0002441501450000071
Illumination compensation is carried out, and then the step J4 is carried out;
case1: when the image capturing position is known, it can be directly acquired
Figure FDA0002441501450000072
The first and second reference images of (1); then, the first and second reference images after illumination compensation are used for the image pair by using the method of Step41-Step42
Figure FDA0002441501450000073
Illumination compensation is carried out, and then the step J4 is carried out;
case 2: when the position relation of the spliced panoramic image sequence is unknown, the spliced panoramic image sequence without illumination compensation can be obtained by a method of Case2 in Step40Get
Figure FDA0002441501450000074
The first and second reference images of (1); then, the first and second reference images after illumination compensation are used for the image pair by using the method of Step41-Step42
Figure FDA0002441501450000075
Illumination compensation is carried out, and then the step J4 is carried out;
wherein, the local illumination compensation mode specifically comprises:
if partWhen 1, then pair
Figure FDA0002441501450000076
Illumination compensation is carried out; otherwise, set up
Figure FDA0002441501450000077
Is composed of
Figure FDA0002441501450000078
The next frame of the code stream is spliced, and then the code stream is paired
Figure FDA0002441501450000079
Illumination compensation is carried out;
Figure FDA00024415014500000710
show that
Figure FDA00024415014500000711
The Nth image after descending order arrangement;
Figure FDA00024415014500000712
a t-1 frame image representing the nth spliced code stream;
the method specifically comprises the following steps:
step 40: obtaining
Figure FDA00024415014500000713
First and second reference images; the first reference image is
Figure FDA00024415014500000714
The left stitched image and the second reference image are
Figure FDA00024415014500000715
Splicing images on the right side;
wherein the obtaining is
Figure FDA00024415014500000716
The first and second reference images are specifically:
case1: when the image capturing position is known, directly acquiring
Figure FDA00024415014500000717
The first and second reference images of (1);
case 2: when the position relation of the spliced panoramic image sequence is unknown, the first reference image and the second reference image are obtained by the following method:
if part1, then directly choose
Figure FDA00024415014500000718
The code stream corresponding to the first and second reference images is used as the next frame image
Figure FDA00024415014500000719
The first and second reference images of (1); otherwise, the first reference image is acquired by the following method:
acquiring a first reference image:
step A1: computing
Figure FDA00024415014500000720
Wherein std (variable | condition) represents that the mean square error of the variable satisfying the condition is solved, and int represents rounding operation;
Figure FDA00024415014500000721
to represent
Figure FDA00024415014500000722
The luminance value of the ith row and jth column of pixels after decoding;
Figure FDA00024415014500000723
representing the t frame image of the Nth spliced code stream, wherein the initial value of t is 1;
Figure FDA00024415014500000724
to represent
Figure FDA00024415014500000725
The luminance value of the ith row and the jth + width/2 column of pixels after decoding;
step A2: min _ K (diff)t nInt (N/2) is more than or equal to N and less than or equal to N), representing that diff is solvedt nK is not less than 1 and not more than 5;
step A3: setting the image set corresponding to the minimum K values in the step A2 as an alternative set, and then using well-known image feature matching to the images in the alternative set to find out the final image
Figure FDA0002441501450000081
A first reference image of (a);
acquiring a second reference image:
step B1: computing
Figure FDA0002441501450000082
Step B2: min _ K (diff)t nInt (N/2) is more than or equal to N and less than or equal to N), representing that diff is solvedt nK is not less than 1 and not more than 5;
step B3: setting the image set corresponding to the minimum K values in the step B2 as an alternative set, and then using the well-known image feature matching to the images in the alternative set to find out the final image
Figure FDA0002441501450000083
A second reference image of (a);
step 41: calculating a compensation parameter refmaxMax (ref1_ y (i, j), ref2_ y | area condition 1);
zone condition 1:
the region of the ref1_ y (i, j) ∈ right side 1/2 of the first reference image or the region of the ref2_ y (i, j) ∈ left side 1/2 of the second reference image;
ref1_ y (i, j) represents the luminance value of the ith row and jth column pixel of the first reference image;
ref2_ y (i, j) represents the luminance value of the ith row and jth column pixel of the second reference image;
refminmin (ref1_ y (i, j), ref2_ y | area condition 2);
zone condition 2:
the region of the right side 1/2 of the first reference image of ref1_ y (i, j) ∈ or the region of the left side 1/2 of the second reference image of ref2_ y (i, j) ∈
Figure FDA0002441501450000084
Wherein refmax、refminRepresents a compensation parameter 1 and a compensation parameter 2; max (variable | condition) and min (variable | condition) respectively represent the maximum value and the minimum value of the variables satisfying the condition;
Figure FDA0002441501450000085
representing the brightness k distribution value of the Nth image;
Figure FDA0002441501450000086
the brightness k distribution value, max, of the t frame image representing the Nth spliced code streamN、minNRepresents a compensation parameter 3, a compensation parameter 4;
step 42: using compensation parameters, pair
Figure FDA0002441501450000087
Performing illumination compensation on the image, and then entering step J4, specifically:
Figure FDA0002441501450000091
step J4: frame t image of N spliced code streams after illumination compensation
Figure FDA0002441501450000092
Figure FDA0002441501450000093
Carrying out conventional panoramic image splicing;
step J5: let t be t + 1;
step J6: if the code stream is spliced, ending; otherwise, re-enter step J0;
wherein the content of the first and second substances,
Figure FDA0002441501450000094
a t frame image representing the nth spliced code stream;
Figure FDA0002441501450000095
to represent
Figure FDA0002441501450000096
The peak value parameter of (a);
Figure FDA0002441501450000097
respectively representing a first distribution parameter, a second distribution parameter and a third distribution parameter; partA judgment parameter representing the t-th frame panorama; part-1A judgment parameter representing the t-1 th frame panorama; thres1Is a first decision threshold.
CN201611007942.8A 2016-11-16 2016-11-16 Panoramic video self-adaptive illumination compensation method and system Active CN106572312B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611007942.8A CN106572312B (en) 2016-11-16 2016-11-16 Panoramic video self-adaptive illumination compensation method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611007942.8A CN106572312B (en) 2016-11-16 2016-11-16 Panoramic video self-adaptive illumination compensation method and system

Publications (2)

Publication Number Publication Date
CN106572312A CN106572312A (en) 2017-04-19
CN106572312B true CN106572312B (en) 2020-08-04

Family

ID=58542160

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611007942.8A Active CN106572312B (en) 2016-11-16 2016-11-16 Panoramic video self-adaptive illumination compensation method and system

Country Status (1)

Country Link
CN (1) CN106572312B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109815905B (en) * 2019-01-24 2022-12-23 深圳市梦网视讯有限公司 Method and system for detecting face image by backlight source

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101123691A (en) * 2007-07-24 2008-02-13 哈尔滨工程大学 Self-adapted adjustment method for image brightness of high resolution and panorama vision system
JP2010056774A (en) * 2008-08-27 2010-03-11 Casio Comput Co Ltd Apparatus, method and program for processing image
CN101719989A (en) * 2009-11-30 2010-06-02 北京中星微电子有限公司 Method and system for backlight compensation
CN104023236A (en) * 2014-06-13 2014-09-03 深圳百科信息技术有限公司 Method and system of adjusting chromaticity balance and quantization parameters
CN105430376A (en) * 2015-11-12 2016-03-23 深圳进化动力数码科技有限公司 Method and device for detecting consistency of panoramic camera
CN105809626A (en) * 2016-03-08 2016-07-27 长春理工大学 Self-adaption light compensation video image splicing method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104184961A (en) * 2013-05-22 2014-12-03 辉达公司 Mobile device and system used for generating panoramic video

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101123691A (en) * 2007-07-24 2008-02-13 哈尔滨工程大学 Self-adapted adjustment method for image brightness of high resolution and panorama vision system
JP2010056774A (en) * 2008-08-27 2010-03-11 Casio Comput Co Ltd Apparatus, method and program for processing image
CN101719989A (en) * 2009-11-30 2010-06-02 北京中星微电子有限公司 Method and system for backlight compensation
CN104023236A (en) * 2014-06-13 2014-09-03 深圳百科信息技术有限公司 Method and system of adjusting chromaticity balance and quantization parameters
CN105430376A (en) * 2015-11-12 2016-03-23 深圳进化动力数码科技有限公司 Method and device for detecting consistency of panoramic camera
CN105809626A (en) * 2016-03-08 2016-07-27 长春理工大学 Self-adaption light compensation video image splicing method

Also Published As

Publication number Publication date
CN106572312A (en) 2017-04-19

Similar Documents

Publication Publication Date Title
US8582915B2 (en) Image enhancement for challenging lighting conditions
US7218777B2 (en) Flicker correction for moving picture
CN106462955B (en) Automatic video quality enhancement with temporal smoothing and user override
US20160094824A1 (en) Image processing method, image processing apparatus and electronic device
US11763431B2 (en) Scene-based image processing method, apparatus, smart terminal and storage medium
US20160100148A1 (en) Method and system of lens shading color correction using block matching
JP2008504750A5 (en)
US9165345B2 (en) Method and system for noise reduction in video systems
US10229481B2 (en) Automatic defective digital motion picture pixel detection
CN110620924A (en) Method and device for processing coded data, computer equipment and storage medium
CN105704398A (en) Video processing method
US9881358B2 (en) Method and system for adaptive pixel replacement
US8611423B2 (en) Determination of optimal frame types in video encoding
CN106683047B (en) Illumination compensation method and system for panoramic image
US20140022338A1 (en) Method for Producing a Panoramic Image on the Basis of a Video Sequence and Implementation Apparatus
CN106572312B (en) Panoramic video self-adaptive illumination compensation method and system
US10049436B1 (en) Adaptive denoising for real-time video on mobile devices
CN107481199B (en) Image defogging method and device, storage medium and mobile terminal
CN107423704B (en) Lip video positioning method and system based on skin color detection
US20210090413A1 (en) Using a skip block mask to reduce bitrate from a monitoring camera
JP2003061112A (en) Camerawork detector and camerawork detection method
CN109274970B (en) Rapid scene switching detection method and system
CN109815905B (en) Method and system for detecting face image by backlight source
KR101295782B1 (en) Color correction method and apparatus for stereoscopic image
JP5179433B2 (en) Noise reduction device, noise reduction method, and moving image playback device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
CB02 Change of applicant information
CB02 Change of applicant information

Address after: 518000 Guangdong city of Shenzhen province Nanshan District Guangdong streets high in the four Longtaili Technology Building Room 325 No. 30

Applicant after: Shenzhen Monternet encyclopedia Information Technology Co. Ltd.

Address before: The central Shenzhen city of Guangdong Province, 518057 Keyuan Road, Nanshan District science and Technology Park No. 15 Science Park Sinovac A Building 1 unit 403, No. 405 unit

Applicant before: BAC Information Technology Co., Ltd.

SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
CB02 Change of applicant information

Address after: 518000 Guangdong city of Shenzhen province Nanshan District Guangdong streets high in the four Longtaili Technology Building Room 325 No. 30

Applicant after: Shenzhen mengwang video Co., Ltd

Address before: 518000 Guangdong city of Shenzhen province Nanshan District Guangdong streets high in the four Longtaili Technology Building Room 325 No. 30

Applicant before: SHENZHEN MONTNETS ENCYCLOPEDIA INFORMATION TECHNOLOGY Co.,Ltd.

GR01 Patent grant
GR01 Patent grant