CN100423552C - Image treatment method and its device - Google Patents

Image treatment method and its device

Info

Publication number
CN100423552C
CN100423552C CNB2005101081562A CN200510108156A
Authority
CN
China
Prior art keywords
image
sharpness
motion vector
module
combination
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CNB2005101081562A
Other languages
Chinese (zh)
Other versions
CN1794790A (en)
Inventor
傅楸善
邱健平
陳景富
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Inventec Appliances Corp
Original Assignee
Inventec Appliances Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Inventec Appliances Corp filed Critical Inventec Appliances Corp
Priority to CNB2005101081562A priority Critical patent/CN100423552C/en
Publication of CN1794790A publication Critical patent/CN1794790A/en
Application granted granted Critical
Publication of CN100423552C publication Critical patent/CN100423552C/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The present invention discloses an image processing method and a device thereof, suitable for stabilizing images. The method comprises capturing at least two images, selecting the sharpest image and the second-sharpest image from them, and generating at least one first motion vector from the sharpest image and the second-sharpest image. Finally, the sharpest image and the second-sharpest image are combined according to the motion vector and a combination ratio to produce a combined image, thereby achieving image stabilization.

Description

Image processing method and device thereof
Technical field:
The present invention relates to an image processing method and a device thereof, and more particularly to a method and device for stabilizing images.
Background technology:
Digital still cameras are increasingly popular, and their designs trend toward smaller size and lighter weight. The lighter and handier a digital still camera becomes, the more easily involuntary hand shake during shooting blurs and degrades the captured image.
Refer to Fig. 1 and Fig. 2, which are schematic diagrams of image stabilization in the prior art. In Fig. 1, involuntary hand shake produces a shake displacement 13, so that the image of an object 17, originally formed at position A 14, moves to position A' 15; this blur displacement 16 makes the image blurred. In Fig. 2, a motion detecting device (not shown) calculates a compensation displacement 18 from the shake displacement 13 and sends it to a compensation device 11. The compensation device 11 moves the lens 12 according to the compensation displacement 18 so that the image of the object 17 returns to position A 14, thereby stabilizing the image. However, such motion detecting and compensation devices require complicated circuits and mechanisms, which increase the size and cost of the apparatus.
Therefore, based on extensive practical experience and dedicated study, the inventors propose in the present invention an image processing device and method that address the above problem.
Summary of the invention:
In view of the above problem, an object of the present invention is to provide an image processing method that achieves image stabilization.
To achieve the above object, the image processing device disclosed by the present invention comprises an image capture module, an image selection module, a motion vector generation module and an image combination module. The image capture module captures at least two images. The image selection module selects the sharpest image and the second-sharpest image from those images. The motion vector generation module generates at least one first motion vector from the sharpest image and the second-sharpest image. The image combination module combines the sharpest image and the second-sharpest image according to the motion vector and a combination ratio.
Another object of the present invention is to provide an image processing method comprising at least the following steps:
(a) capturing at least two images with an image capture module;
(b) selecting the sharpest image and the second-sharpest image from those images;
(c) generating at least one motion vector from the sharpest image and the second-sharpest image;
(d) combining the sharpest image and the second-sharpest image according to the motion vector and a combination ratio to produce a combined image.
Description of drawings:
Fig. 1 is a schematic diagram of image stabilization in the prior art;
Fig. 2 is a schematic diagram of image stabilization in the prior art;
Fig. 3 is a schematic diagram of the image processing device according to the present invention;
Fig. 4 is a schematic diagram of the image selection module according to the present invention;
Fig. 5 is a flow chart of the image selection module according to the present invention;
Fig. 6 is an example of horizontal-direction edge detection coefficients according to the present invention;
Fig. 7 is an example of vertical-direction edge detection coefficients according to the present invention;
Fig. 8 shows original image values according to the present invention;
Fig. 9 is a schematic diagram of the motion vector generation module according to the present invention;
Fig. 10 is a flow chart of the motion vector selection module according to the present invention;
Fig. 11 is a flow chart of the image combination module according to the present invention;
Fig. 12 is a schematic diagram of the combination grids according to the present invention;
Fig. 13 is a schematic diagram of the combination grids according to the present invention.
Brief description of the reference numerals:
11: compensation device; 12: lens; 13: hand-shake displacement;
14: imaging position; 15: imaging position; 16: blur displacement;
17: object; 18: compensation displacement; 21: image capture module;
22: image selection module; 23: motion vector generation module;
24: image combination module; 211: plurality of images;
221: sharpest image; 222: second-sharpest image;
231: first motion vector; 241: combined image; 31: storage module;
32: edge detection module; 33: image selection control module;
34: first buffer module; 35: second buffer module; 331: first register;
332: second register; S41~S49: process steps;
61: feature generation module; 62: feature comparison module;
63: motion vector selection module; 611: image feature;
621: second motion vector; S71~S79: process steps;
S81~S83: process steps; 91: width of a combination grid;
92: height of a combination grid; 93: width of the image;
94: height of the image; 95: combination grid with a motion vector;
96: combination grid without a motion vector;
951: combination grid with a motion vector;
952: combination grid with a motion vector;
961: combination grid without a motion vector;
962: combination grid without a motion vector.
Embodiments:
To provide a further understanding of the structural features of the present invention and the effects achieved, preferred embodiments are described in detail below with reference to the accompanying drawings:
Refer to Fig. 3, a schematic diagram of the image processing device of the present invention. The device comprises an image capture module 21, an image selection module 22, a motion vector generation module 23 and an image combination module 24. The image capture module 21 captures at least two images 211 and sends them to the image selection module 22. The image selection module 22 selects the sharpest image 221 and the second-sharpest image 222 from those images and sends both to the motion vector generation module 23. The motion vector generation module 23 generates at least one first motion vector 231 from the sharpest image 221 and the second-sharpest image 222. Finally, the image combination module 24 combines the sharpest image 221 and the second-sharpest image 222 according to the first motion vector 231 and a predefined combination ratio to produce a combined image 241.
Refer to Fig. 4 and Fig. 5, the schematic diagram and flow chart of the image selection module 22 shown in Fig. 3. The image selection module comprises a storage module 31, an edge detection module (edge detector) 32, a first buffer module 34, a second buffer module 35 and an image selection control module 33. The image selection control module 33 comprises a first register 331 and a second register 332; the first register 331 stores the current largest edge value, and the second register 332 stores the current second-largest edge value. The storage module 31 stores the captured images 211 and sends each unprocessed image to the edge detection module 32. The edge detection module 32 calculates the edge value of the image and sends it to the image selection control module 33. The image selection control module 33 compares this edge value with the current largest and second-largest edge values. If the edge value is larger than the current largest edge value, the content of the first register 331 is first moved to the second register 332 and the content of the first buffer module 34 is moved to the second buffer module 35; the new edge value is then stored in the first register 331 and the image is stored in the first buffer module 34. If the edge value is smaller than the current largest edge value but larger than the current second-largest edge value, the edge value is stored in the second register 332 and the image is stored in the second buffer module 35. After all images have been processed, the first buffer module 34 holds the sharpest image 221 and the second buffer module 35 holds the second-sharpest image 222.
Refer again to Fig. 5, the flow chart of the image selection module 22 shown in Fig. 3, which comprises the following steps (a code sketch of this selection loop follows the list):
(1) Set the value of the first register and the value of the second register to zero (S41).
(2) Read a still-unprocessed image from the storage module and send it to the edge detection module (S42).
(3) Calculate the edge value of this image with the edge detection module (S43).
(4) Determine whether this edge value is greater than the current largest edge value (S44).
(5) If so, move the content of the first register to the second register and the content of the first buffer module to the second buffer module, then store this edge value in the first register and this image in the first buffer module (S47).
(6) If not, determine whether this edge value is greater than the current second-largest edge value (S45).
(7) If so, store this edge value in the second register and this image in the second buffer module (S48).
(8) Determine whether all images have been processed (S46).
(9) If not, return to S42.
(10) If so, end the process (S49).
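The selection flow S41-S49 above can be summarized in a few lines of code. The following is a minimal sketch under the assumption that the images are NumPy arrays and that an edge-scoring function (such as the one sketched after the edge detection formulas below) is supplied; the function and variable names are illustrative and do not appear in the patent.

```python
def select_two_sharpest(frames, edge_score):
    """Return the sharpest and second-sharpest frames (images 221 and 222)."""
    best_score, second_score = 0.0, 0.0        # S41: both registers start at zero
    best_frame, second_frame = None, None
    for frame in frames:                        # S42: next unprocessed image
        score = edge_score(frame)               # S43: edge value of this image
        if score > best_score:                  # S44
            # S47: demote the previous best to second place, promote this frame
            second_score, second_frame = best_score, best_frame
            best_score, best_frame = score, frame
        elif score > second_score:              # S45 / S48
            second_score, second_frame = score, frame
    return best_frame, second_frame             # S46 / S49: all images processed
```

Passing the scoring function as a parameter keeps the selection logic independent of how the edge value of an image is defined.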
Refer to Fig. 6 to Fig. 8, which illustrate the edge detection module 32 shown in Fig. 4. Fig. 6 and Fig. 7 show two-dimensional edge detection coefficients, which usually comprise nine parameters each: Fig. 6 gives the horizontal-direction coefficients and Fig. 7 the vertical-direction coefficients. Fig. 8 shows original image values. Taking the calculation of the edge value of pixel E as an example, the procedure is as follows:
First, obtain the intensities of the pixels A, B, C, D, F, G, H and I neighboring pixel E, as shown in Fig. 8.
Then, the horizontal-direction edge value Gh of pixel E is calculated by the following formula:
Gh = (-1)*A + (-2)*B + (-1)*C + (0)*D + (0)*E + (0)*F + (1)*G + (2)*H + (1)*I
The vertical-direction edge value Gv of pixel E is calculated by the following formula:
Gv = (-1)*A + (0)*B + (1)*C + (-2)*D + (0)*E + (2)*F + (-1)*G + (0)*H + (1)*I
Finally, the edge value g of pixel E is calculated by the following formula:
g = √(Gv² + Gh²)
Applying the above formulas to every pixel in Fig. 8 yields the edge value of each pixel, and summing the edge values of all pixels yields the edge value of the whole image.
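As an illustration only, the per-pixel edge value and the whole-image edge value described above can be computed as in the following sketch, which assumes NumPy and SciPy and grayscale input; the coefficient matrices follow Figs. 6 and 7.

```python
import numpy as np
from scipy.ndimage import correlate

# 3x3 edge detection coefficients of Fig. 6 (horizontal) and Fig. 7 (vertical).
KH = np.array([[-1, -2, -1],
               [ 0,  0,  0],
               [ 1,  2,  1]], dtype=float)
KV = np.array([[-1,  0,  1],
               [-2,  0,  2],
               [-1,  0,  1]], dtype=float)

def edge_score(image):
    """Whole-image edge value: sum of per-pixel g = sqrt(Gv^2 + Gh^2)."""
    img = image.astype(float)
    gh = correlate(img, KH, mode="nearest")   # horizontal-direction edge value Gh
    gv = correlate(img, KV, mode="nearest")   # vertical-direction edge value Gv
    return float(np.sqrt(gv * gv + gh * gh).sum())
```

A larger score indicates a sharper image, which is how the image selection module ranks the captured frames.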
Refer to Fig. 9, the schematic diagram of the motion vector generation module 23 shown in Fig. 3. The motion vector generation module 23 comprises a feature generation module 61, a feature comparison module 62 and a motion vector selection module 63. The feature generation module 61 generates at least one image feature 611 from the sharpest image. This module builds a matrix from the horizontal and vertical difference values between pixels and uses the eigenvalues of this matrix together with a feature threshold to determine whether a pixel is an image feature 611. This feature generation method is documented in the open literature, for example C. Harris and M. Stephens, "A combined corner and edge detector," Proc. Alvey Vision Conference, pp. 147-151, 1988, and is not repeated here. Next, the feature comparison module 62 compares the sharpest image with the second-sharpest image according to a comparison method and the image features 611, and produces at least one second motion vector 621; the comparison method may be the least-squares-error method. Finally, the motion vector selection module 63 selects the first motion vectors 231 from the second motion vectors 621.
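The following sketch illustrates one way the feature generation and feature comparison described above could be realized, assuming NumPy and SciPy: corner-like features are taken where the smaller eigenvalue of the 2x2 gradient structure matrix (in the spirit of Harris and Stephens) exceeds a threshold, and each feature is matched by a least-squares (sum-of-squared-differences) search. The window sizes, search range and threshold are illustrative values, not values taken from the patent.

```python
import numpy as np
from scipy.ndimage import correlate, uniform_filter

def detect_features(image, threshold=1000.0, win=3):
    """Pixels whose structure-matrix smaller eigenvalue exceeds the feature threshold."""
    img = image.astype(float)
    gx = correlate(img, np.array([[-1.0, 0.0, 1.0]]), mode="nearest")      # horizontal difference
    gy = correlate(img, np.array([[-1.0], [0.0], [1.0]]), mode="nearest")  # vertical difference
    sxx = uniform_filter(gx * gx, win)        # entries of the 2x2 structure matrix,
    syy = uniform_filter(gy * gy, win)        # averaged over a small window
    sxy = uniform_filter(gx * gy, win)
    min_eig = (sxx + syy) / 2.0 - np.sqrt(((sxx - syy) / 2.0) ** 2 + sxy ** 2)
    ys, xs = np.where(min_eig > threshold)    # threshold depends on the image scale
    return list(zip(ys, xs))                  # image features 611

def match_feature(sharpest, second, y, x, patch=8, search=16):
    """Least-squares match of the patch around (y, x); returns one second motion vector."""
    h, w = sharpest.shape
    if y - patch < 0 or x - patch < 0 or y + patch >= h or x + patch >= w:
        return (0, 0)                          # feature too close to the border
    ref = sharpest[y - patch:y + patch + 1, x - patch:x + patch + 1].astype(float)
    best, best_err = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = second[y + dy - patch:y + dy + patch + 1,
                          x + dx - patch:x + dx + patch + 1].astype(float)
            if cand.shape != ref.shape:
                continue                       # displaced window falls outside the image
            err = float(((ref - cand) ** 2).sum())
            if err < best_err:
                best, best_err = (dy, dx), err
    return best
```

In this sketch, the set of second motion vectors 621 would be obtained as `[match_feature(sharpest, second, y, x) for y, x in detect_features(sharpest)]`.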
Refer now to Fig. 10, the flow chart of the motion vector selection module 63 of the present invention shown in Fig. 9. First, a length threshold and an angle threshold are set (S71). Next, all second motion vectors are sorted by length, and the median of this ordering is taken as the length-representative vector (S72); all second motion vectors are then sorted by angle, and the median of this ordering is taken as the angle-representative vector (S73). A second motion vector that has not yet been compared is selected (S74). It is determined whether the length difference between this second motion vector and the length-representative vector is less than the length threshold (S75); if not, another not-yet-compared second motion vector is selected (S74). If so, it is determined whether the angle difference between this second motion vector and the angle-representative vector is less than the angle threshold (S76); if so, this second motion vector is set as a first motion vector (S77); if not, another not-yet-compared second motion vector is selected (S74). Finally, it is determined whether all second motion vectors have been compared (S78); if so, the process ends, and if not, another not-yet-compared second motion vector is selected (S74).
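A minimal sketch of this selection step, assuming NumPy, is shown below. It uses the median length and median angle as the representatives of S72 and S73 and keeps only the second motion vectors close to both; the threshold values are illustrative, and angle wrap-around is ignored for brevity.

```python
import numpy as np

def select_first_vectors(second_vectors, len_threshold=3.0, ang_threshold=0.2):
    """Keep the second motion vectors whose length and angle lie near the medians."""
    vecs = np.asarray(second_vectors, dtype=float)   # shape (n, 2): rows are (dy, dx)
    lengths = np.hypot(vecs[:, 0], vecs[:, 1])
    angles = np.arctan2(vecs[:, 0], vecs[:, 1])
    rep_len = np.median(lengths)                     # S72: length representative
    rep_ang = np.median(angles)                      # S73: angle representative
    keep = (np.abs(lengths - rep_len) < len_threshold) & \
           (np.abs(angles - rep_ang) < ang_threshold)   # S75 and S76
    return vecs[keep]                                # the first motion vectors 231
```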
Refer to Fig. 11 to Fig. 13, the flow chart and schematic diagrams of the image combination module of the present invention shown in Fig. 3. Referring first to Fig. 11, the width and height of the combination grids are set (S81). Next, the motion vector of each combination grid is calculated (S82). Finally, the sharpest image and the second-sharpest image are combined according to a combination ratio to produce a combined image (S83). The combination ratio may be preset by the user, or it may be derived from the edge-response ratio of the sharpest image and the second-sharpest image and a proportional factor. For example, when the edge-value ratio of the sharpest image to the second-sharpest image is 40:60 and the proportional factor is 0.4, the combination ratio is (40 × 0.4) : (100 - 40 × 0.4) = 16:84; the combined image is therefore produced by combining the sharpest image and the second-sharpest image with a 16:84 combination ratio.
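The arithmetic of this example is easy to check; the sketch below, assuming NumPy, turns an edge-response ratio and a proportional factor into blending weights and applies them, ignoring the per-grid motion-vector alignment of step S82 for brevity. The function names are illustrative.

```python
import numpy as np

def combination_ratio(edge_sharpest, edge_second, factor=0.4):
    """E.g. an edge ratio of 40:60 with factor 0.4 gives weights 0.16 and 0.84."""
    share = 100.0 * edge_sharpest / (edge_sharpest + edge_second)   # 40 out of 100
    w_sharpest = share * factor                                     # 40 * 0.4 = 16
    return w_sharpest / 100.0, (100.0 - w_sharpest) / 100.0         # (0.16, 0.84)

def combine(sharpest, second, weights):
    """Weighted blend of the two images using the combination ratio."""
    w1, w2 = weights
    blended = w1 * sharpest.astype(float) + w2 * second.astype(float)
    return blended.astype(sharpest.dtype)
```

Here `combination_ratio(40, 60)` returns (0.16, 0.84), matching the 16:84 ratio of the example above.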
Refer now to Fig. 12, a schematic diagram of step S82 of the present invention shown in Fig. 11. Taking an image with a width 93 of 2048 pixels and a height 94 of 1536 pixels as an example, the width 91 of each combination grid is first set to 128 pixels and the height 92 to 128 pixels. If a grid contains at least one first motion vector, as does combination grid 95, the average of those first motion vectors is taken as the motion vector of that grid. If a grid contains no first motion vector, as with combination grid 96, it is marked Null and its motion vector is then derived from a nearby combination grid that has a motion vector. Referring further to Fig. 13, combination grid 961 obtains a motion vector of 20 from the nearby combination grid 951 that has a motion vector, and combination grid 962 obtains a motion vector of 15 from the nearby combination grid 952 that has a motion vector. In the end, every combination grid has a motion vector.
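A minimal sketch of step S82 under the same example (2048 × 1536 image, 128 × 128 grids), assuming NumPy, is given below. Grids containing first motion vectors take the average of those vectors; empty grids are then filled from the nearest grid (by Manhattan distance) that has a motion vector, which is one simple reading of "a nearby combination grid with a motion vector".

```python
import numpy as np

def grid_motion_vectors(first_vectors, positions, image_w=2048, image_h=1536, cell=128):
    """first_vectors: (n, 2) array of (dy, dx); positions: (n, 2) array of (y, x)."""
    gh, gw = image_h // cell, image_w // cell          # 12 x 16 combination grids
    sums = np.zeros((gh, gw, 2))
    counts = np.zeros((gh, gw))
    for (y, x), v in zip(positions, first_vectors):
        gy, gx = int(y) // cell, int(x) // cell
        sums[gy, gx] += v                              # accumulate vectors per grid
        counts[gy, gx] += 1
    grid = np.full((gh, gw, 2), np.nan)                # NaN plays the role of "Null"
    has = counts > 0
    grid[has] = sums[has] / counts[has][:, None]       # average within each grid
    filled = np.argwhere(has)
    for gy, gx in np.argwhere(~has):                   # fill empty grids from neighbors
        d = np.abs(filled - np.array([gy, gx])).sum(axis=1)
        grid[gy, gx] = grid[tuple(filled[d.argmin()])]
    return grid
```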
The above description is only a preferred embodiment of the present invention and is not intended to limit the scope of implementation of the present invention; all changes and modifications made according to the shapes, structures, features and spirit described in the claims of the present application shall be covered by the claims of the present invention.

Claims (16)

1. An image processing device, comprising:
an image capture module, for capturing at least two images;
an image selection module, for selecting the sharpest image and the second-sharpest image from those images;
a motion vector generation module, for generating at least one first motion vector from the sharpest image and the second-sharpest image; and
an image combination module, for combining the sharpest image and the second-sharpest image according to the motion vector and a combination ratio.
2. The image processing device according to claim 1, wherein the image selection module further comprises an edge detection module for calculating the edge value of each image.
3. The image processing device according to claim 1, wherein the motion vector generation module further comprises:
a feature generation module, for generating at least one image feature from the sharpest image;
a feature comparison module, for comparing the sharpest image with the second-sharpest image according to the image feature and generating at least one second motion vector; and
a motion vector selection module, for performing a selection on the second motion vector according to at least one threshold to produce the first motion vector.
4. The image processing device according to claim 3, wherein the feature comparison module is a least-squares-error module.
5. The image processing device according to claim 3, wherein the threshold is a length threshold.
6. The image processing device according to claim 3, wherein the threshold is an angle threshold.
7. The image processing device according to claim 1, wherein the combination ratio is the sharpness ratio of the sharpest image to the second-sharpest image.
8. The image processing device according to claim 1, wherein the combination ratio is preset by the user.
9. An image processing method, used in an image processing device, the method comprising:
(a) capturing at least two images;
(b) selecting the sharpest image and the second-sharpest image from those images;
(c) generating at least one first motion vector from the sharpest image and the second-sharpest image;
(d) combining the sharpest image and the second-sharpest image according to the first motion vector and a combination ratio.
10. The image processing method according to claim 9, wherein step (b) further comprises calculating the edge value of each image.
11. The image processing method according to claim 9, wherein step (c) further comprises:
(c1) generating at least one image feature from the sharpest image;
(c2) comparing the sharpest image with the second-sharpest image according to the image feature and generating at least one second motion vector;
(c3) performing a selection on the second motion vector according to at least one threshold to produce the first motion vector.
12. The image processing method according to claim 11, wherein step (c2) further comprises performing the comparison with a least-squares-error module.
13. The image processing method according to claim 11, wherein the threshold is a length threshold.
14. The image processing method according to claim 11, wherein the threshold is an angle threshold.
15. The image processing method according to claim 9, wherein the combination ratio is the sharpness ratio of the sharpest image to the second-sharpest image.
16. The image processing method according to claim 9, wherein the combination ratio is preset by the user.
CNB2005101081562A 2005-10-09 2005-10-09 Image treatment method and its device Active CN100423552C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNB2005101081562A CN100423552C (en) 2005-10-09 2005-10-09 Image treatment method and its device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CNB2005101081562A CN100423552C (en) 2005-10-09 2005-10-09 Image treatment method and its device

Publications (2)

Publication Number Publication Date
CN1794790A CN1794790A (en) 2006-06-28
CN100423552C true CN100423552C (en) 2008-10-01

Family

ID=36806028

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2005101081562A Active CN100423552C (en) 2005-10-09 2005-10-09 Image treatment method and its device

Country Status (1)

Country Link
CN (1) CN100423552C (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108344741B (en) * 2018-02-09 2021-03-16 辽宁翔舜科技有限公司 Automatic focal length identification and control method applied to automatic coal rock detection

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1372678A (en) * 2000-03-07 2002-10-02 皇家菲利浦电子有限公司 System and method for improving the sharpness of a video image
WO2004056089A2 (en) * 2002-12-13 2004-07-01 Qinetiq Limited Image stabilisation system and method
CN1656511A (en) * 2002-05-24 2005-08-17 皇家飞利浦电子股份有限公司 Unit for and method of calculating a sharpened edge


Also Published As

Publication number Publication date
CN1794790A (en) 2006-06-28

Similar Documents

Publication Publication Date Title
US20220222786A1 (en) Image processing method, smart device, and computer readable storage medium
CN106534616B (en) A kind of video image stabilization method and system based on characteristic matching and motion compensation
CN101601279B (en) Imaging device, imaging method, and program
CN111709407B (en) Method and device for improving video target detection performance in monitoring edge calculation
JP5777367B2 (en) Pattern identification device, pattern identification method and program
CN113286194A (en) Video processing method and device, electronic equipment and readable storage medium
CN106548185B (en) A kind of foreground area determines method and apparatus
JP2022510622A (en) Image processing model training methods, image processing methods, network equipment, and storage media
CN102388402A (en) Image processing device and method
CN103493473A (en) Image processing device, image processing method, image processing program, and recording medium
CN109064505B (en) Depth estimation method based on sliding window tensor extraction
KR20110053348A (en) System and method to generate depth data using edge detection
EP2084819A1 (en) Text detection on mobile communications devices
CN103503432A (en) Image processing device, image processing method, and image processing program
CN101065722A (en) Method of automatic navigation directed towards regions of interest of an image
CN101390381A (en) Blur detecting device, blur correcting device, imaging device, and blur detecting method
US20180183998A1 (en) Power reduction and performance improvement through selective sensor image downscaling
CN112084886B (en) Method and device for improving detection performance of neural network target detection
CN105100546A (en) Movement estimation method and device
CN108027496A (en) Focusing control apparatus, focusing control method, focusing control program, lens devices, camera device
CN111862169B (en) Target follow-up method and device, cradle head camera and storage medium
CN101184235B (en) Method and apparatus for implementing background image extraction from moving image
CN116703919A (en) Surface impurity detection method based on optimal transmission distance loss model
CN111144156B (en) Image data processing method and related device
CN116524312A (en) Infrared small target detection method based on attention fusion characteristic pyramid network

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant