CN105009169A - Systems and methods of suppressing sky regions in images - Google Patents

Systems and methods of suppressing sky regions in images

Info

Publication number
CN105009169A
Authority
CN
China
Prior art keywords
image
sky areas
dynamic range
infrared
sky
Prior art date
Legal status
Granted
Application number
CN201380073551.4A
Other languages
Chinese (zh)
Other versions
CN105009169B (en)
Inventor
N. Högasten
M. Nussmeier
E. A. Kurth
T. R. Hoelter
K. Strandemar
P. Boulanger
B. Sharp
Current Assignee
Teledyne Flir LLC
Original Assignee
Flir Systems Inc
Priority date
Filing date
Publication date
Priority claimed from US 14/099,818 (US 9,723,227 B2)
Priority claimed from US 14/101,245 (US 9,706,139 B2)
Priority claimed from US 14/101,258 (US 9,723,228 B2)
Application filed by Flir Systems Inc
Publication of CN105009169A
Application granted
Publication of CN105009169B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N 1/46 Colour picture communication systems
    • H04N 1/56 Processing of colour picture signals
    • H04N 1/60 Colour correction or control
    • H04N 1/62 Retouching, i.e. modification of isolated colours only or in isolated picture areas only
    • H04N 1/628 Memory colours, e.g. skin or sky
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/90 Dynamic range modification of images or parts thereof
    • G06T 5/94 Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N 1/46 Colour picture communication systems
    • H04N 1/56 Processing of colour picture signals
    • H04N 1/60 Colour correction or control
    • H04N 1/62 Retouching, i.e. modification of isolated colours only or in isolated picture areas only
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/741 Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/76 Circuitry for compensating brightness variation in the scene by influencing the image signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 Camera processing pipelines; Components thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/60 Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H04N 25/67 Noise processing, e.g. detecting, correcting, reducing or removing noise applied to fixed-pattern noise, e.g. non-uniformity of response
    • H04N 25/671 Noise processing, e.g. detecting, correcting, reducing or removing noise applied to fixed-pattern noise, e.g. non-uniformity of response for non-uniformity detection or correction
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/70 SSIS architectures; Circuits associated therewith
    • H04N 25/76 Addressed sensors, e.g. MOS or CMOS sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20004 Adaptive image processing
    • G06T 2207/20012 Locally adaptive

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Studio Devices (AREA)
  • Radiation Pyrometers (AREA)
  • Image Processing (AREA)

Abstract

Various techniques are provided for systems and methods to process images to reduce consumption of an available output dynamic range by the sky in images. For example, according to one or more embodiments of the disclosure, a region or area in images that may correspond to the sky may be identified based on the location of the horizon in the images. A distribution of irradiance levels in the identified sky region may be analyzed to determine a dynamic range attributable to the sky region. A transfer function that compresses the dynamic range attributable to the sky region may be generated and applied so that the sky in the images may be suppressed, thereby advantageously preserving more dynamic range for terrestrial objects and other objects of interest in the images.

Description

Systems and methods of suppressing sky regions in images
Cross-Reference to Related Applications
This application claims the benefit of U.S. Provisional Patent Application No. 61/745,440, filed December 21, 2012 and entitled "SYSTEMS AND METHODS OF SUPPRESSING SKY REGIONS IN IMAGES", which is hereby incorporated by reference in its entirety.
This application is a continuation-in-part of U.S. Patent Application No. 14/101,245, filed December 9, 2013 and entitled "LOW POWER AND SMALL FORM FACTOR INFRARED IMAGING", which is hereby incorporated by reference in its entirety.
This application is a continuation-in-part of U.S. Patent Application No. 14/099,818, filed December 6, 2013 and entitled "NON-UNIFORMITY CORRECTION TECHNIQUES FOR INFRARED IMAGING DEVICES", which is hereby incorporated by reference in its entirety.
This application is a continuation-in-part of U.S. Patent Application No. 14/101,258, filed December 9, 2013 and entitled "INFRARED CAMERA SYSTEM ARCHITECTURES", which is hereby incorporated by reference in its entirety.
This application claims the benefit of U.S. Provisional Patent Application No. 61/748,018, filed December 31, 2012 and entitled "COMPACT MULTI-SPECTRUM IMAGING WITH FUSION", which is hereby incorporated by reference in its entirety.
This application claims the benefit of U.S. Provisional Patent Application No. 61/792,582, filed March 15, 2013 and entitled "TIME SPACED INFRARED IMAGE ENHANCEMENT", which is hereby incorporated by reference in its entirety.
This application claims the benefit of U.S. Provisional Patent Application No. 61/793,952, filed March 15, 2013 and entitled "INFRARED IMAGING ENHANCEMENT WITH FUSION", which is hereby incorporated by reference in its entirety.
This application claims the benefit of U.S. Provisional Patent Application No. 61/746,069, filed December 26, 2012 and entitled "TIME SPACED INFRARED IMAGE ENHANCEMENT", which is hereby incorporated by reference in its entirety.
This application claims the benefit of U.S. Provisional Patent Application No. 61/746,074, filed December 26, 2012 and entitled "INFRARED IMAGING ENHANCEMENT WITH FUSION", which is hereby incorporated by reference in its entirety.
Technical Field
One or more embodiments of the invention relate generally to imaging systems and methods and, more particularly, for example, to systems and methods for processing images to optimize the dynamic range in images.
Background
In many imaging applications (e.g., surveillance camera applications, thermal imaging applications, and/or other applications of video cameras and imaging systems), users may be interested in observing terrestrial features and/or objects (e.g., roads, people, vehicles, buildings) and less interested in the sky above the ground or objects in it (e.g., birds, aircraft, clouds, treetops, and/or other objects). However, in images (e.g., still images and/or video frames) captured by conventional imaging systems, the sky (if present) typically consumes a large portion of the available dynamic range. This may make it difficult for users to discern or identify objects of interest in the captured images.
Although there are conventional dynamic range compression algorithms and automatic gain control (AGC) methods that adjust the dynamic range of images, these conventional methods cannot reliably suppress the sky in images (e.g., reduce the portion of the available dynamic range consumed by the sky). They are unsuccessful because, for example, the sky captured in images is typically non-uniform (e.g., it may contain a large amount of scene information) and/or typically exhibits different characteristics depending on atmospheric conditions, weather, sun angle, and/or other conditions.
Summary
Various techniques are provided for systems and methods to process images so as to reduce the portion of the available output dynamic range consumed by the sky in the images. For example, in accordance with one or more embodiments of the disclosure, a region or area in an image that may correspond to the sky may be identified based on the location of the horizon in the image. The distribution of irradiance levels in the identified sky region may be analyzed to determine a dynamic range attributable to the sky region. A transfer function that compresses the dynamic range attributable to the sky region may be generated and applied, so that the sky in the image may be suppressed, thereby advantageously preserving more of the dynamic range for terrestrial objects and other objects of interest in the image.
In one embodiment, a system includes: a memory adapted to store an image of a scene comprising a sky region and a ground region; and a processor in communication with the memory, the processor adapted to: identify the sky region in the image, analyze a distribution of pixel levels associated with the sky region in the image, determine, based on the distribution of pixel levels, a dynamic range attributable to the sky region, generate a transfer function that compresses the dynamic range attributable to the sky region, and apply the transfer function to at least a portion of the image.
In another embodiment, a method includes: receiving an image of a scene comprising a sky region and a ground region; identifying the sky region in the image; analyzing a distribution of pixel levels associated with the sky region in the image; determining, based on the distribution of pixel levels, a dynamic range attributable to the sky region; generating a transfer function that compresses the dynamic range attributable to the sky region; and applying the transfer function to at least a portion of the image.
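A minimal Python sketch of the sequence of steps recited above (identify a sky region, analyze the pixel-level distribution, determine the sky's dynamic range, and apply a compressing transfer function) is given below. It is an illustration only, not the claimed implementation: the horizon row is taken as an input rather than detected, and the 98% histogram band, the 10% output share for the sky, and the piecewise-linear transfer function are all assumed choices.

```python
import numpy as np

def suppress_sky(image, horizon_row, out_max=255.0):
    """Compress the dynamic range attributable to the sky in `image`.

    image       : 2D array of radiometric pixel levels (e.g., 14-bit counts)
    horizon_row : row index of the horizon; rows above it are treated as sky
    """
    sky = image[:horizon_row, :].astype(np.float64)

    # Analyze the distribution of pixel levels in the identified sky region;
    # take the band holding the central 98% of sky pixels as the dynamic
    # range attributable to the sky (an assumed heuristic).
    hist, edges = np.histogram(sky, bins=256)
    cdf = np.cumsum(hist) / sky.size
    sky_lo = edges[np.searchsorted(cdf, 0.01)]
    sky_hi = edges[np.searchsorted(cdf, 0.99)]

    # Build a piecewise-linear transfer function that squeezes the sky band
    # into a small share of the output range (10% here, an assumed value),
    # leaving the rest for other scene levels.
    lo, hi = float(image.min()), float(image.max())
    span_sky = 0.10 * out_max
    w_below = max(sky_lo - lo, 0.0)
    w_above = max(hi - sky_hi, 0.0)
    w_rest = max(w_below + w_above, 1e-9)
    span_below = (out_max - span_sky) * (w_below / w_rest)
    xp = [lo, sky_lo, sky_hi, hi]                           # input breakpoints
    fp = [0.0, span_below, span_below + span_sky, out_max]  # output levels

    # Apply the transfer function to at least a portion of the image
    # (here, the whole image).
    return np.interp(np.clip(image, lo, hi), xp, fp).astype(np.uint8)
```

In a long-wave infrared scene the sky band typically sits at the cold (low) end of the histogram, so under a mapping of this kind nearly all of the output range remains available for terrestrial radiance levels.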
The scope of the invention is defined by the claims, which are incorporated into this section by reference. A more complete understanding of embodiments of the invention will be afforded to those skilled in the art, as well as a realization of additional advantages thereof, by a consideration of the following detailed description of one or more embodiments. Reference will be made to the appended sheets of drawings, which will first be described briefly.
Brief Description of the Drawings
Fig. 1 illustrates an infrared imaging module configured to be implemented in a host device in accordance with an embodiment of the disclosure.
Fig. 2 illustrates an assembled infrared imaging module in accordance with an embodiment of the disclosure.
Fig. 3 illustrates an exploded view of an infrared imaging module juxtaposed over a socket in accordance with an embodiment of the disclosure.
Fig. 4 illustrates a block diagram of an infrared sensor assembly including an array of infrared sensors in accordance with an embodiment of the disclosure.
Fig. 5 illustrates a flow diagram of various operations to determine non-uniformity correction (NUC) terms in accordance with an embodiment of the disclosure.
Fig. 6 illustrates differences between neighboring pixels in accordance with an embodiment of the disclosure.
Fig. 7 illustrates a flat field correction technique in accordance with an embodiment of the disclosure.
Fig. 8 illustrates various image processing techniques of Fig. 5 and other operations applied in an image processing pipeline in accordance with an embodiment of the disclosure.
Fig. 9 illustrates a temporal noise reduction process in accordance with an embodiment of the disclosure.
Fig. 10 illustrates particular implementation details of several processes of the image processing pipeline of Fig. 8 in accordance with an embodiment of the disclosure.
Fig. 11 illustrates spatially correlated fixed pattern noise (FPN) in a neighborhood of pixels in accordance with an embodiment of the disclosure.
Fig. 12 illustrates a block diagram of another implementation of an infrared sensor assembly including an array of infrared sensors and a low-dropout regulator in accordance with an embodiment of the disclosure.
Fig. 13 illustrates a circuit diagram of a portion of the infrared sensor assembly of Fig. 12 in accordance with an embodiment of the disclosure.
Fig. 14 illustrates a block diagram of an imaging system, such as an infrared camera, for capturing and/or processing images in accordance with an embodiment of the disclosure.
Fig. 15 illustrates a flowchart of a process to suppress the sky in images in accordance with an embodiment of the disclosure.
Figs. 16A-16C illustrate various example histograms of infrared radiation levels in sky regions of images in accordance with various embodiments of the disclosure.
Fig. 17A illustrates an example gamma curve that may be applied as a transfer function to compress the dynamic range attributable to a sky region in an image in accordance with an embodiment of the disclosure.
Fig. 17B illustrates an example piecewise linear function that may be applied as a transfer function to compress the dynamic range attributable to a sky region in an image in accordance with an embodiment of the disclosure.
Fig. 18A illustrates another example of a gamma curve that may be applied as a transfer function to compress the dynamic range attributable to a sky region in an image in accordance with another embodiment of the disclosure.
Fig. 18B illustrates another example of a piecewise linear function that may be applied as a transfer function to compress the dynamic range attributable to a sky region in an image in accordance with another embodiment of the disclosure.
Fig. 19 illustrates a further example of a piecewise linear function that may be applied as a transfer function to compress the dynamic range attributable to a sky region in an image in accordance with another embodiment of the disclosure.
Fig. 20A illustrates an example screenshot of a thermal image before application of a transfer function to compress the dynamic range attributable to a sky region in the image in accordance with an embodiment of the disclosure.
Fig. 20B illustrates an example screenshot of a processed thermal image obtained by applying, to the thermal image of Fig. 20A, a transfer function that compresses the dynamic range attributable to the sky region in the image in accordance with an embodiment of the disclosure.
Embodiments of the invention and their advantages are best understood by referring to the detailed description that follows. It should be appreciated that like reference numerals are used to identify like elements illustrated in one or more of the figures.
Detailed Description
Fig. 1 illustrates an infrared imaging module 100 (e.g., an infrared camera or an infrared imaging device) configured to be implemented in a host device 102 in accordance with an embodiment of the disclosure. In one or more embodiments, infrared imaging module 100 may be implemented with a small form factor in accordance with wafer-level packaging techniques or other packaging techniques.
In one embodiment, infrared imaging module 100 may be configured to be implemented in a small portable host device 102, such as a mobile telephone, a tablet computing device, a laptop computing device, a personal digital assistant, a visible light camera, a music player, or any other appropriate mobile device. In this regard, infrared imaging module 100 may be used to provide infrared imaging features to host device 102. For example, infrared imaging module 100 may be configured to capture, process, and/or otherwise manage infrared images (e.g., also referred to as image frames) and provide such infrared images to host device 102 for use in any desired fashion (e.g., for further processing, to store in memory, to display, to use by various applications running on host device 102, to export to other devices, or other uses).
In various embodiments, infrared imaging module 100 may be configured to operate at low voltage levels and over a wide temperature range. For example, in one embodiment, infrared imaging module 100 may operate using a power supply of approximately 2.4 volts, 2.5 volts, 2.8 volts, or lower voltages, and may operate over a temperature range of approximately -20 °C to approximately +60 °C (e.g., providing a suitable dynamic range and performance over an environmental temperature range of approximately 80 °C). In one embodiment, by operating at low voltage levels, infrared imaging module 100 may experience reduced amounts of self-heating in comparison with other types of infrared imaging devices. As a result, infrared imaging module 100 may be operated with simplified measures to compensate for such self-heating.
As shown in Fig. 1, host device 102 may include a socket 104, a shutter 105, motion sensors 194, a processor 195, a memory 196, a display 197, and/or other components 198. Socket 104 may be configured to receive infrared imaging module 100 as identified by arrow 101. In this regard, Fig. 2 illustrates infrared imaging module 100 assembled in socket 104 in accordance with an embodiment of the disclosure.
Motion sensors 194 may be implemented by one or more accelerometers, gyroscopes, or other appropriate devices that may be used to detect movement of host device 102. Motion sensors 194 may be monitored by and provide information to processing module 160 or processor 195 to detect motion. In various embodiments, motion sensors 194 may be implemented as part of host device 102 (as shown in Fig. 1), infrared imaging module 100, or other devices attached to or otherwise interfaced with host device 102.
Processor 195 may be implemented as any appropriate processing device (e.g., a logic device, microcontroller, processor, application specific integrated circuit (ASIC), or other device) that may be used by host device 102 to execute appropriate instructions, such as software instructions stored in memory 196. Display 197 may be used to display captured and/or processed infrared images and/or other images, data, and information. Other components 198 may be used to implement any features of host device 102 as may be desired for various applications (e.g., clocks, temperature sensors, a visible light camera, or other components). In addition, a machine readable medium 193 may be provided for storing non-transitory instructions for loading into memory 196 and execution by processor 195.
In various embodiments, infrared imaging module 100 and socket 104 may be implemented for mass production to facilitate high volume applications, such as for implementation in mobile telephones or other devices (e.g., devices requiring small form factors). In one embodiment, the combination of infrared imaging module 100 and socket 104 may exhibit overall dimensions of approximately 8.5 mm by 8.5 mm by 5.9 mm while infrared imaging module 100 is installed in socket 104.
Fig. 3 illustrates an exploded view of infrared imaging module 100 juxtaposed over socket 104 in accordance with an embodiment of the disclosure. Infrared imaging module 100 may include a lens barrel 110, a housing 120, an infrared sensor assembly 128, a circuit board 170, a base 150, and a processing module 160.
Lens barrel 110 may at least partially enclose an optical element 180 (e.g., a lens), which is partially visible in Fig. 3 through an aperture 112 in lens barrel 110. Lens barrel 110 may include a substantially cylindrical extension 114 which may be used to interface lens barrel 110 with an aperture 122 in housing 120.
Infrared sensor assembly 128 may be implemented, for example, with a cap 130 (e.g., a lid) mounted on a substrate 140. Infrared sensor assembly 128 may include a plurality of infrared sensors 132 (e.g., infrared detectors) implemented in an array or other fashion on substrate 140 and covered by cap 130. For example, in one embodiment, infrared sensor assembly 128 may be implemented as a focal plane array (FPA). Such a focal plane array may be implemented, for example, as a vacuum package assembly (e.g., sealed by cap 130 and substrate 140). In one embodiment, infrared sensor assembly 128 may be implemented as a wafer-level package (e.g., infrared sensor assembly 128 may be singulated from a set of vacuum package assemblies provided on a wafer). In one embodiment, infrared sensor assembly 128 may be implemented to operate using a power supply of approximately 2.4 volts, 2.5 volts, 2.8 volts, or similar voltages.
Infrared sensors 132 may be configured to detect infrared radiation (e.g., infrared energy) from a target scene, including, for example, mid wave infrared (MWIR) bands, long wave infrared (LWIR) bands, and/or other thermal imaging bands as may be desired in particular implementations. In one embodiment, infrared sensor assembly 128 may be provided in accordance with wafer-level packaging techniques.
Infrared sensors 132 may be implemented, for example, as microbolometers, or other types of thermal imaging infrared sensors arranged in any desired array pattern to provide a plurality of pixels. In one embodiment, infrared sensors 132 may be implemented as vanadium oxide (VOx) detectors with a 17 micron pixel pitch. In various embodiments, arrays of approximately 32 by 32 infrared sensors 132, approximately 64 by 64 infrared sensors 132, approximately 80 by 64 infrared sensors 132, or other array sizes may be used.
Substrate 140 may include various circuitry, including, for example, a read out integrated circuit (ROIC) with dimensions less than approximately 5.5 mm by 5.5 mm in one embodiment. Substrate 140 may also include bond pads 142 that may be used to contact complementary connections positioned on inside surfaces of housing 120 when infrared imaging module 100 is assembled as shown in Fig. 3. In one embodiment, the ROIC may be implemented with low-dropout regulators (LDO) to perform voltage regulation, thereby reducing power supply noise introduced to infrared sensor assembly 128 and providing an improved power supply rejection ratio (PSRR). Moreover, by implementing the LDO with the ROIC (e.g., within a wafer-level package), less die area may be consumed and fewer discrete dies (or chips) may be needed.
Fig. 4 illustrates a block diagram of infrared sensor assembly 128 including an array of infrared sensors 132 in accordance with an embodiment of the disclosure. In the illustrated embodiment, infrared sensors 132 are provided as part of a unit cell array of a ROIC 402. ROIC 402 includes bias generation and timing control circuitry 404, column amplifiers 405, a column multiplexer 406, a row multiplexer 408, and an output amplifier 410. Image frames (i.e., thermal images) captured by infrared sensors 132 may be provided by output amplifier 410 to processing module 160, processor 195, and/or any other appropriate components to perform the various processing techniques described herein. Although an 8 by 8 array is shown in Fig. 4, any desired array configuration may be used in other embodiments. Further descriptions of ROICs and infrared sensors may be found in U.S. Patent No. 6,028,309 issued February 22, 2000, which is incorporated herein by reference in its entirety.
Infrared sensor assembly 128 may capture images (e.g., image frames) and provide such images from its ROIC at various rates. Processing module 160 may be used to perform appropriate processing of captured infrared images and may be implemented in accordance with any appropriate architecture. In one embodiment, processing module 160 may be implemented as an ASIC. In this regard, such an ASIC may be configured to perform image processing with high performance and/or high efficiency. In another embodiment, processing module 160 may be implemented with a general purpose central processing unit (CPU), which may be configured to execute appropriate software instructions to perform image processing, coordinate and perform image processing with various image processing blocks, coordinate interfacing between processing module 160 and host device 102, and/or perform other operations. In yet another embodiment, processing module 160 may be implemented with a field programmable gate array (FPGA). Processing module 160 may be implemented with other types of processing and/or logic circuits in other embodiments, as would be understood by one skilled in the art.
In these and other embodiments, processing module 160 may also be implemented with other appropriate components, such as volatile memory, non-volatile memory, and/or one or more interfaces (e.g., infrared detector interfaces, inter-integrated circuit (I2C) interfaces, mobile industry processor interfaces (MIPI), joint test action group (JTAG) interfaces (e.g., IEEE 1149.1 standard test access port and boundary-scan architecture), and/or other interfaces).
In some embodiments, infrared imaging module 100 may further include one or more actuators 199 which may be used to adjust the focus of infrared image frames captured by infrared sensor assembly 128. For example, actuators 199 may be used to move optical element 180, infrared sensors 132, and/or other components relative to each other to selectively focus and defocus infrared image frames in accordance with techniques described herein. Actuators 199 may be implemented in accordance with any type of motion-inducing apparatus or mechanism, and may be positioned at any location within or external to infrared imaging module 100 as appropriate for different applications.
When infrared imaging module 100 is assembled, housing 120 may substantially enclose infrared sensor assembly 128, base 150, and processing module 160. Housing 120 may facilitate connection of various components of infrared imaging module 100. For example, in one embodiment, housing 120 may provide electrical connections 126 to connect various components, as further described below.
Electrical connections 126 (e.g., conductive electrical paths, traces, or other types of connections) may be electrically connected with bond pads 142 when infrared imaging module 100 is assembled. In various embodiments, electrical connections 126 may be embedded in housing 120, provided on inside surfaces of housing 120, and/or otherwise provided by housing 120. Electrical connections 126 may terminate in connections 124 protruding from the bottom surface of housing 120 as shown in Fig. 3. Connections 124 may connect with circuit board 170 when infrared imaging module 100 is assembled (e.g., housing 120 may rest atop circuit board 170 in various embodiments). Processing module 160 may be electrically connected with circuit board 170 through appropriate electrical connections. As a result, infrared sensor assembly 128 may be electrically connected with processing module 160 through, for example, conductive electrical paths provided by bond pads 142, complementary connections on inside surfaces of housing 120, electrical connections 126 of housing 120, connections 124, and circuit board 170. Advantageously, such an arrangement may be implemented without requiring wire bonds between infrared sensor assembly 128 and processing module 160.
In various embodiments, electrical connections 126 in housing 120 may be made from any desired material (e.g., copper or any other appropriate conductive material). In one embodiment, electrical connections 126 may aid in dissipating heat from infrared imaging module 100.
Other connections may be used in other embodiments. For example, in one embodiment, sensor assembly 128 may be attached to processing module 160 through a ceramic board that connects to sensor assembly 128 by wire bonds and to processing module 160 by a ball grid array (BGA). In another embodiment, sensor assembly 128 may be mounted directly on a rigid flexible board and electrically connected with wire bonds, and processing module 160 may be mounted and connected to the rigid flexible board with wire bonds or a BGA.
The various implementations of infrared imaging module 100 and host device 102 set forth herein are provided for purposes of example rather than limitation. In this regard, any of the various techniques described herein may be applied to any infrared camera system, infrared imager, or other device for performing infrared/thermal imaging.
Substrate 140 of infrared sensor assembly 128 may be mounted on base 150. In various embodiments, base 150 (e.g., a pedestal) may be made, for example, of copper formed by metal injection molding (MIM) and provided with a black oxide or nickel-coated finish. In various embodiments, base 150 may be made of any desired material, such as, for example, zinc, aluminum, or magnesium, as desired for a given application, and may be formed by any desired applicable process, such as, for example, aluminum casting, MIM, or zinc rapid casting, as may be desired for particular applications. In various embodiments, base 150 may be implemented to provide structural support, various circuit paths, thermal heat sink properties, and other features where appropriate. In one embodiment, base 150 may be a multi-layer structure implemented at least in part using ceramic material.
In various embodiments, circuit board 170 may receive housing 120 and thus may physically support the various components of infrared imaging module 100. In various embodiments, circuit board 170 may be implemented as a printed circuit board (e.g., an FR4 circuit board or other types of circuit boards), a rigid or flexible interconnect (e.g., tape or other types of interconnects), a flexible circuit substrate, a flexible plastic substrate, or other appropriate structures. In various embodiments, base 150 may be implemented with the various features and attributes described for circuit board 170, and vice versa.
Socket 104 may include a cavity 106 configured to receive infrared imaging module 100 (e.g., as shown in the assembled view of Fig. 2). Infrared imaging module 100 and/or socket 104 may include appropriate tabs, arms, pins, fasteners, or any other appropriate engagement members, which may be used to secure infrared imaging module 100 to or within socket 104 using friction, tension, adhesion, and/or any other appropriate manner. Socket 104 may include engagement members 107 that may engage surfaces 109 of housing 120 when infrared imaging module 100 is inserted into cavity 106 of socket 104. Other types of engagement members may be used in other embodiments.
Infrared imaging module 100 may be electrically connected with socket 104 through appropriate electrical connections (e.g., contacts, pins, wires, or any other appropriate connections). For example, socket 104 may include electrical connections 108 which may contact corresponding electrical connections of infrared imaging module 100 (e.g., interconnect pads, contacts, or other electrical connections on side or bottom surfaces of circuit board 170, bond pads 142, other electrical connections on base 150, or other connections). Electrical connections 108 may be made from any desired material (e.g., copper or any other appropriate conductive material). In one embodiment, electrical connections 108 may be mechanically biased to press against electrical connections of infrared imaging module 100 when infrared imaging module 100 is inserted into cavity 106 of socket 104. In one embodiment, electrical connections 108 may at least partially secure infrared imaging module 100 in socket 104. Other types of electrical connections may be used in other embodiments.
Socket 104 may be electrically connected with host device 102 through similar types of electrical connections. For example, in one embodiment, host device 102 may include electrical connections (e.g., soldered connections, snap-in connections, or other connections) that connect with electrical connections 108 through apertures 190. In various embodiments, such electrical connections may be made to the sides and/or bottom of socket 104.
Various components of infrared imaging module 100 may be implemented with flip chip technology, which may be used to mount components directly to circuit boards without the additional clearances typically needed for wire bond connections. Flip chip connections may be used, as an example, to reduce the overall size of infrared imaging module 100 for use in compact small form factor applications. For example, in one embodiment, processing module 160 may be mounted to circuit board 170 using flip chip connections, and infrared imaging module 100 may be implemented with such flip chip configurations.
In various embodiments, infrared imaging module 100 and/or associated components may be implemented in accordance with various techniques (e.g., wafer-level packaging techniques) as set forth in U.S. Patent Application No. 12/844,124 filed July 27, 2010, and U.S. Provisional Patent Application No. 61/469,651 filed March 30, 2011, which are incorporated herein by reference in their entirety. Furthermore, in accordance with one or more embodiments, infrared imaging module 100 and/or associated components may be implemented, calibrated, tested, and/or used in accordance with various techniques, such as those set forth in U.S. Patent No. 7,470,902 issued December 30, 2008; U.S. Patent No. 6,028,309 issued February 22, 2000; U.S. Patent No. 6,812,465 issued November 2, 2004; U.S. Patent No. 7,034,301 issued April 25, 2006; U.S. Patent No. 7,679,048 issued March 16, 2010; U.S. Patent No. 7,470,904 issued December 30, 2008; U.S. Patent Application No. 12/202,880 filed September 2, 2008; and U.S. Patent Application No. 12/202,896 filed September 2, 2008, which are incorporated herein by reference in their entirety.
In some embodiments, host device 102 may include other components 198 such as a non-thermal camera (e.g., a visible light camera or other type of non-thermal imager). The non-thermal camera may be a small form factor imaging module or imaging device, and may, in some embodiments, be implemented in a manner similar to the various embodiments of infrared imaging module 100 disclosed herein, with one or more sensors and/or sensor arrays responsive to radiation in non-thermal spectrums (e.g., radiation in visible light wavelengths, ultraviolet wavelengths, and/or other non-thermal wavelengths). For example, in some embodiments, the non-thermal camera may be implemented with a charge-coupled device (CCD) sensor, an electron multiplying CCD (EMCCD) sensor, a complementary metal-oxide-semiconductor (CMOS) sensor, a scientific CMOS (sCMOS) sensor, or other filters and/or sensors.
In some embodiments, the non-thermal camera may be co-located with infrared imaging module 100 and oriented such that a field of view (FOV) of the non-thermal camera at least partially overlaps a FOV of infrared imaging module 100. In one example, infrared imaging module 100 and the non-thermal camera may be implemented as a dual sensor module sharing a common substrate in accordance with various techniques described in U.S. Provisional Patent Application No. 61/748,018 filed December 31, 2012, which is incorporated herein by reference.
For embodiments having such a non-thermal camera, various components (e.g., processor 195, processing module 160, and/or other processing components) may be configured to superimpose, fuse, blend, or otherwise combine infrared images (e.g., including thermal images) captured by infrared imaging module 100 and non-thermal images (e.g., including visible light images) captured by the non-thermal camera, whether captured at substantially the same time or at different times (e.g., time-spaced over hours, days, daytime versus nighttime, and/or otherwise).
In some embodiments, thermal and non-thermal images may be processed to generate combined images (e.g., with one or more processes performed on such images in some embodiments). For example, scene-based NUC processing may be performed (as further described herein), true color processing may be performed, and/or high contrast processing may be performed.
Regarding true color processing, thermal images may be blended with non-thermal images by, for example, blending a radiometric component of a thermal image with a corresponding component of a non-thermal image according to a blending parameter, which may be adjustable by a user and/or machine in some embodiments. For example, luminance or chrominance components of the thermal and non-thermal images may be combined according to the blending parameter. In one embodiment, such blending techniques may be referred to as true color infrared imagery. For example, in daytime imaging, a blended image may comprise a non-thermal color image, which includes a luminance component and a chrominance component, with its luminance value replaced by the luminance value from a thermal image. The use of the luminance data from the thermal image causes the intensity of the true non-thermal color image to brighten or dim based on the temperature of the object. As such, these blending techniques provide thermal imaging for daytime or visible light images.
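As a rough illustration of the luminance-replacement blending just described, the sketch below converts a visible-light image into luma/chroma components and moves its luminance toward the thermal image's luminance according to a blending parameter. The BT.601 color conversion is an assumed choice for illustration, not one specified by this disclosure.

```python
import numpy as np

def true_color_blend(visible_rgb, thermal_gray, blend=1.0):
    """visible_rgb: HxWx3 array in [0, 1]; thermal_gray: HxW array in [0, 1].
    blend=1.0 fully replaces the visible luminance with thermal luminance."""
    r, g, b = visible_rgb[..., 0], visible_rgb[..., 1], visible_rgb[..., 2]
    # BT.601 luma/chroma split of the non-thermal color image.
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = (b - y) * 0.564
    cr = (r - y) * 0.713
    # Blend the luminance component according to the blending parameter, so
    # intensity brightens or dims with the temperature of imaged objects.
    y2 = (1.0 - blend) * y + blend * thermal_gray
    # Back to RGB; the chrominance (scene colors) is preserved.
    r2 = y2 + 1.402 * cr
    g2 = y2 - 0.344 * cb - 0.714 * cr
    b2 = y2 + 1.772 * cb
    return np.clip(np.stack([r2, g2, b2], axis=-1), 0.0, 1.0)
```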
Regarding high contrast processing, high spatial frequency content may be obtained from one or more of the thermal and non-thermal images (e.g., by performing high pass filtering, difference imaging, and/or other techniques). A combined image may include a radiometric component of a thermal image and a blended component including infrared (e.g., thermal) characteristics of a scene blended with the high spatial frequency content according to a blending parameter, which may be adjustable by a user and/or machine in some embodiments. In some embodiments, high spatial frequency content from non-thermal images may be blended with thermal images by superimposing the high spatial frequency content onto the thermal images, where the high spatial frequency content replaces or overwrites those portions of the thermal images corresponding to where the high spatial frequency content exists. For example, the high spatial frequency content may include edges of objects depicted in images of a scene, but may not exist within the interiors of such objects. In such embodiments, blended image data may simply include the high spatial frequency content, which may subsequently be encoded into one or more components of the combined image.
For example, a radiometric component of a thermal image may be the chrominance component of the thermal image, and the high spatial frequency content may be derived from the luminance and/or chrominance components of a non-thermal image. In this embodiment, a combined image may include the radiometric component (e.g., the chrominance component of the thermal image) encoded into a chrominance component of the combined image, and the high spatial frequency content directly encoded (e.g., as blended image data but with no thermal image contribution) into a luminance component of the combined image. By doing so, a radiometric calibration of the radiometric component of the thermal image may be retained. In similar embodiments, blended image data may include the high spatial frequency content added to a luminance component of the thermal image, with the resulting blended data encoded into a luminance component of the resulting combined image.
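The sketch below shows one assumed realization of this high contrast combination: high spatial frequency content is extracted from the non-thermal image with a simple box-blur high-pass filter and encoded into the luminance of the combined image, while the thermal levels drive the chrominance. The filter choice and the cold-to-hot palette are illustrative assumptions, not the disclosed processing.

```python
import numpy as np

def box_blur(img, k=7):
    # Separable k-by-k mean filter (a simple low-pass), edge-padded.
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    kernel = np.ones(k) / k
    p = np.apply_along_axis(lambda row: np.convolve(row, kernel, mode="valid"), 1, p)
    p = np.apply_along_axis(lambda col: np.convolve(col, kernel, mode="valid"), 0, p)
    return p

def high_contrast_combine(thermal_gray, visible_gray, blend=0.6):
    """thermal_gray, visible_gray: HxW float arrays in [0, 1]."""
    # High spatial frequency content (edges) of the non-thermal image.
    hsf = visible_gray - box_blur(visible_gray)
    # Encode the edge content into the luminance of the combined image.
    y = np.clip(0.5 + blend * hsf, 0.0, 1.0)
    # Encode the thermal levels into the chrominance (cold -> blue,
    # hot -> red), keeping radiometric ordering out of the luminance.
    cb = 0.5 - 0.4 * (thermal_gray - 0.5)
    cr = 0.5 + 0.4 * (thermal_gray - 0.5)
    r = y + 1.402 * (cr - 0.5)
    g = y - 0.344 * (cb - 0.5) - 0.714 * (cr - 0.5)
    b = y + 1.772 * (cb - 0.5)
    return np.clip(np.stack([r, g, b], axis=-1), 0.0, 1.0)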
For example, any of the techniques disclosed in the following applications may be used in various embodiments: U.S. Patent Application No. 12/477,828 filed June 3, 2009; U.S. Patent Application No. 12/766,739 filed April 23, 2010; U.S. Patent Application No. 13/105,765 filed May 11, 2011; U.S. Patent Application No. 13/437,645 filed April 2, 2012; U.S. Provisional Patent Application No. 61/473,207 filed April 8, 2011; U.S. Provisional Patent Application No. 61/746,069 filed December 26, 2012; U.S. Provisional Patent Application No. 61/746,074 filed December 26, 2012; U.S. Provisional Patent Application No. 61/748,018 filed December 31, 2012; U.S. Provisional Patent Application No. 61/792,582 filed March 15, 2013; U.S. Provisional Patent Application No. 61/793,952 filed March 15, 2013; and International Patent Application No. PCT/EP2011/056432 filed April 21, 2011, all of which are incorporated herein by reference in their entirety. Any of the techniques described herein, or described in other applications or patents referenced herein, may be applied to any of the various thermal devices, non-thermal devices, and uses described herein.
Referring again to Fig. 1, in various embodiments, host device 102 may include a shutter 105. In this regard, shutter 105 may be selectively positioned over socket 104 (e.g., as identified by arrows 103) while infrared imaging module 100 is installed in the socket. In this regard, shutter 105 may be used, for example, to protect infrared imaging module 100 when not in use. Shutter 105 may also be used as a temperature reference as part of a calibration process (e.g., a NUC process or other calibration processes) for infrared imaging module 100, as would be understood by one skilled in the art.
In various embodiments, shutter 105 may be made from various materials such as, for example, polymers, glass, aluminum (e.g., painted or anodized), or other materials. In various embodiments, shutter 105 may include one or more coatings (e.g., a uniform blackbody coating or a reflective gold coating) to selectively filter electromagnetic radiation and/or adjust various optical properties of shutter 105.
In another embodiment, shutter 105 may be fixed in place to protect infrared imaging module 100 at all times. In this case, shutter 105 or a portion of shutter 105 may be made from appropriate materials (e.g., polymers, or infrared transmitting materials such as silicon, germanium, zinc selenide, or chalcogenide glasses) that do not substantially filter desired infrared wavelengths. In another embodiment, a shutter may be implemented as part of infrared imaging module 100 (e.g., within or as part of a lens barrel or other components of infrared imaging module 100), as would be understood by one skilled in the art.
Alternatively, in another embodiment, a shutter (e.g., shutter 105 or another type of external or internal shutter) need not be provided, and a NUC process or other type of calibration may instead be performed using shutterless techniques. In another embodiment, a NUC process or other type of calibration using shutterless techniques may be performed in combination with shutter-based techniques.
Infrared imaging module 100 and host device 102 may be implemented in accordance with any of the various techniques set forth in U.S. Provisional Patent Application No. 61/495,873 filed June 10, 2011; U.S. Provisional Patent Application No. 61/495,879 filed June 10, 2011; and U.S. Provisional Patent Application No. 61/495,888 filed June 10, 2011, which are incorporated herein by reference in their entirety.
In various embodiments, the components of host device 102 and/or infrared imaging module 100 may be implemented as a local system, or as a distributed system with components in communication with each other over wired and/or wireless networks. Accordingly, the various operations identified in this disclosure may be performed by local and/or remote components as may be desired in particular implementations.
Fig. 5 illustrates a flow diagram of various operations to determine NUC terms in accordance with an embodiment of the disclosure. In some embodiments, the operations of Fig. 5 may be performed by processing module 160 or processor 195 (both of which may also be generally referred to as a processor) operating on image frames captured by infrared sensors 132.
In block 505, infrared sensors 132 begin capturing image frames of a scene. Typically, the scene will be the real world environment in which host device 102 is currently located. In this regard, shutter 105 (if optionally provided) may be opened to permit the infrared imaging module to receive infrared radiation from the scene. Infrared sensors 132 may continue capturing image frames during all operations shown in Fig. 5. In this regard, the continuously captured image frames may be used for the various operations discussed further herein. In one embodiment, the captured image frames may be temporally filtered (e.g., in accordance with the process of block 826 further described herein with regard to Fig. 8) and processed by other terms (e.g., factory gain terms 812, factory offset terms 816, previously determined NUC terms 817, column FPN terms 820, and row FPN terms 824, further described herein with regard to Fig. 8) before being used in the operations shown in Fig. 5.
In block 510, a NUC process initiating event is detected. In one embodiment, the NUC process may be initiated in response to physical movement of host device 102. Such movement may be detected, for example, by motion sensors 194 polled by a processor. In one example, a user may move host device 102 in a particular manner, such as by intentionally waving host device 102 back and forth in an "erase" or "swipe" movement. In this regard, the user may move host device 102 in accordance with a predetermined speed and direction (velocity), such as an up and down, side to side, or other type of motion, to initiate the NUC process. In this example, the use of such movements may permit the user to intuitively operate host device 102 to simulate the "erasing" of noise in captured image frames.
In another example, a NUC process may be initiated by host device 102 if motion exceeding a threshold value has been detected (e.g., motion greater than expected for ordinary use). It is contemplated that any desired type of spatial translation of host device 102 may be used to initiate the NUC process.
In yet another example, a NUC process may be initiated by host device 102 if a minimum time has elapsed since a previously performed NUC process. In a further example, a NUC process may be initiated by host device 102 if infrared imaging module 100 has experienced a minimum temperature change since a previously performed NUC process. In a still further example, a NUC process may be continuously initiated and repeated.
In block 515, after a NUC process initiating event is detected, it is determined whether the NUC process should actually be performed. In this regard, the NUC process may be selectively initiated based on whether one or more additional conditions are met. For example, in one embodiment, the NUC process may not be performed unless a minimum time has elapsed since a previously performed NUC process. In another embodiment, the NUC process may not be performed unless infrared imaging module 100 has experienced a minimum temperature change since a previously performed NUC process. Other criteria or conditions may be used in other embodiments. If appropriate criteria or conditions have been met, the flow diagram continues to block 520. Otherwise, the flow diagram returns to block 505.
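The following small sketch, under assumed thresholds, illustrates the two-stage gating of blocks 510 and 515: an initiating event (motion, elapsed time, or temperature change) is detected first, and additional conditions then decide whether the NUC process actually runs. All threshold values are illustrative assumptions.

```python
MIN_INTERVAL_S = 60.0      # assumed minimum time between NUC processes
MIN_TEMP_DELTA_C = 1.0     # assumed minimum temperature change
MOTION_THRESHOLD = 1.5     # assumed motion magnitude threshold

class NucScheduler:
    def __init__(self):
        self.last_nuc_time = float("-inf")
        self.last_nuc_temp = None

    def initiating_event(self, motion_magnitude, now_s, temp_c):
        # Block 510: any of these may serve as a NUC initiating event.
        return (motion_magnitude > MOTION_THRESHOLD
                or now_s - self.last_nuc_time > MIN_INTERVAL_S
                or (self.last_nuc_temp is not None
                    and abs(temp_c - self.last_nuc_temp) > MIN_TEMP_DELTA_C))

    def should_perform(self, now_s, temp_c):
        # Block 515: additional conditions gate the actual NUC run; here
        # only the minimum-interval condition is checked.
        if now_s - self.last_nuc_time < MIN_INTERVAL_S:
            return False
        self.last_nuc_time, self.last_nuc_temp = now_s, temp_c
        return True
```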
In the NUC process, intentionally blurred image frames may be used to determine the NUC terms, which may then be applied to captured image frames to correct for FPN. As discussed, in one embodiment, the blurred image frames may be obtained by accumulating multiple image frames of a moving scene (e.g., captured while the scene and/or the thermal imager is in motion). In another embodiment, the blurred image frames may be obtained by defocusing an optical element or other component of the thermal imager.
Accordingly, block 520 provides a choice between the two approaches. If the motion-based approach is used, the flow diagram continues to block 525. If the defocus-based approach is used, the flow diagram continues to block 530.
Referring now to the motion-based approach, in block 525 motion is detected. For example, in one embodiment, motion may be detected based on the image frames captured by infrared sensors 132. In this regard, an appropriate motion detection process (e.g., an image registration process, a frame-to-frame difference calculation, or another appropriate process) may be applied to captured image frames to determine whether motion is present (e.g., whether static or moving image frames have been captured). For example, in one embodiment, it may be determined whether pixels, or the regions around the pixels, of consecutive image frames have changed by more than a user-defined amount (e.g., a percentage and/or threshold value). If at least a given percentage of pixels have changed by at least the user-defined amount, then motion will be detected with sufficient certainty to proceed to block 535; a sketch of such a test follows below.
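A one-function sketch of the frame-to-frame difference test just described; the change threshold and the required fraction of changed pixels are assumed, user-definable values.

```python
import numpy as np

def motion_detected(prev_frame, frame, level_delta=20, frac_changed=0.10):
    # A pixel has "changed" if its level moved by more than level_delta;
    # motion is declared when at least frac_changed of all pixels changed.
    changed = np.abs(frame.astype(np.int32) - prev_frame.astype(np.int32)) > level_delta
    return changed.mean() >= frac_changed
```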
In another embodiment, can determine motion on the basis of each pixel, wherein, only cumulative those demonstrate the pixel of significant change, to provide fuzzy graph picture frame.Such as, can arrange counter for each pixel, described counter is identical for ensureing the quantity of the pixel value that each pixel adds up, or is averaged pixel value for the quantity of the pixel value in fact added up according to each pixel.The motion based on image that can perform other types detects, and such as, performs and draws east (Radon) to convert.
In another embodiment, the data that can provide based on motion sensor 194 detect motion.In one embodiment, whether this motion detects to comprise and detects host apparatus 102 and move along relative to straight track in space.Such as, if host apparatus 102 just moves along relative to straight track, so following situation is possible: occur that some object in scene after imaging may fuzzy not (object such as, in scene be aimed at straight track or substantially moved along the direction being parallel to described straight track).Therefore, in this embodiment, when only having host apparatus 102 demonstrate motion or do not demonstrate motion but move along particular track, motion sensor 194 just can detect motion.
In another embodiment, both motion detection step and motion sensor 194 can be used.Therefore, use any one in these various embodiments, can determine scene at least partially and host apparatus 102 relative to each other between motion while (such as, this can be moved relative to scene by host apparatus 102, scene move relative to host apparatus 102 at least partially or above-mentioned two situations cause), whether capture each picture frame.
Can be expected that, detect the picture frame of motion can demonstrate some of the scene of catching secondary fuzzy (such as, the fuzzy thermographic image data relevant to scene), described secondary fuzzy be cause alternately because thermal time constant (such as, micro-radiation heat time constant) and the scene of infrared sensor 132 moves.
In block 535, the image frames for which motion was detected are accumulated. For example, if motion is detected for a continuous series of image frames, then the image frames of the series may be accumulated. As another example, if motion is detected for only some image frames, then the non-moving image frames may be skipped and not included in the accumulation. Thus, a continuous or discontinuous series of image frames may be selected for accumulation based on the detected motion.
In block 540, the accumulated image frames are averaged to provide a blurred image frame. Because the accumulated image frames were captured during motion, actual scene information is expected to vary between the image frames, thus causing the scene information to be further blurred in the resulting blurred image frame (block 545).
In contrast, FPN (e.g., caused by one or more components of infrared imaging module 100) will remain fixed over at least short periods of time and over at least limited changes in scene irradiance during motion. As a result, image frames captured close in time and space during motion will suffer from identical or at least very similar FPN. Thus, although scene information may change in consecutive image frames, the FPN will stay essentially constant. By averaging the multiple image frames captured during motion, the scene information will blur out, but the FPN will not. As a result, FPN will remain more clearly defined in the blurred image frame provided in block 545 than the scene information.
In one embodiment, 32 or more image frames are accumulated and averaged in blocks 535 and 540. However, any desired number of image frames may be used in other embodiments, but with generally decreasing correction accuracy as the number of frames decreases.
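A minimal sketch of the accumulate-and-average step of blocks 535 and 540, assuming the selected frames are numpy arrays; the 32-frame count follows the embodiment above. Scene content decorrelates across the moving frames and averages out, while FPN is common to every frame and survives.

```python
import numpy as np

def blurred_frame(frames):
    """Average a selected series of frames captured during motion:
    scene information blurs out, FPN remains."""
    stack = np.stack([f.astype(np.float32) for f in frames])
    return stack.mean(axis=0)

# e.g., accumulate 32 frames for which motion was detected:
# blurred = blurred_frame(moving_frames[:32])
```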
Referring now to the defocus-based approach, in block 530 a defocus operation is performed to intentionally defocus the image frames captured by infrared sensors 132. For example, in one embodiment, one or more actuators 199 may be used to adjust, move, or otherwise translate optical element 180, infrared sensor assembly 128, and/or other components of infrared imaging module 100 to cause infrared sensors 132 to capture a blurred (e.g., unfocused) image frame of the scene. Other non-actuator-based techniques are also contemplated for intentionally defocusing infrared image frames, such as, for example, manual (e.g., user-initiated) defocusing.
Although the scene may appear blurred in the image frame, FPN (e.g., caused by one or more components of infrared imaging module 100) will remain unaffected by the defocusing operation. As a result, a blurred image frame of the scene will be provided (block 545), with the FPN remaining more clearly defined in the blurred image than the scene information.
In the above discussion, the defocus-based approach has been described with regard to a single captured image frame. In another embodiment, the defocus-based approach may include accumulating multiple image frames while infrared imaging module 100 has been defocused, and averaging the defocused image frames to remove the effects of temporal noise and provide a blurred image frame in block 545.
Thus, it will be appreciated that a blurred image frame may be provided in block 545 by either the motion-based approach or the defocus-based approach. Because much of the scene information will be blurred by motion, defocusing, or both, the blurred image frame may effectively be considered a low-pass filtered version of the originally captured image frames with respect to scene information.
In block 550, the blurred image frame is processed to determine updated row and column FPN terms (e.g., if row and column FPN terms have not been previously determined, then the updated row and column FPN terms may be new row and column FPN terms in the first iteration of block 550). As used in this disclosure, the terms row and column may be used interchangeably depending on the orientation of infrared sensors 132 and/or other components of infrared imaging module 100.
In one embodiment, block 550 includes determining a spatial FPN correction term for each row of the blurred image frame (e.g., each row may have its own spatial FPN correction term), and also determining a spatial FPN correction term for each column of the blurred image frame (e.g., each column may have its own spatial FPN correction term). Such processing may be used to reduce the spatial and slowly varying (1/f) row and column FPN inherent in thermal imagers, caused, for example, by the 1/f noise characteristics of amplifiers in ROIC 402, which may manifest as vertical and horizontal stripes in image frames.
Advantageously, by determining spatial row and column FPN terms using the blurred image frame, the risk of mistaking vertical and horizontal objects in the actual imaged scene for row and column noise is reduced (e.g., real scene content is blurred while the FPN remains unblurred).
In one embodiment, row and column FPN terms may be determined by considering differences between neighboring pixels of the blurred image frame. For example, Fig. 6 illustrates differences between neighboring pixels in accordance with an embodiment of the disclosure. Specifically, in Fig. 6, a pixel 610 is compared to its 8 nearest horizontal neighbors: d0-d3 on one side and d4-d7 on the other side. Differences between the neighbor pixels can be averaged to obtain an estimate of the offset error of the illustrated group of pixels. An offset error may be calculated for each pixel in a row or column, and the resulting averages may be used to correct entire rows or columns.
To prevent real scene data from being interpreted as noise, upper and lower threshold values (thPix and -thPix) may be used. Pixel values falling outside this threshold range (pixels d1 and d4 in this example) are not used to obtain the offset error. In addition, these thresholds may limit the maximum amount of row and column FPN correction that is applied.
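An illustrative simplification of this step (it compares each pixel with the mean of its two immediate horizontal neighbors rather than the eight-neighbor comparison of Fig. 6, and wraps at the image edges for brevity); thPix is the threshold described above, and the value shown is a placeholder. Row FPN terms would be obtained analogously from vertical neighbors.

```python
import numpy as np

def column_fpn_terms(blurred, th_pix=5.0):
    """Estimate a per-column offset: compare each pixel with the mean of
    its horizontal neighbors, reject differences outside [-th_pix, th_pix]
    so scene edges are not absorbed, and average down each column."""
    img = blurred.astype(np.float32)
    left = np.roll(img, 1, axis=1)
    right = np.roll(img, -1, axis=1)
    offset = img - 0.5 * (left + right)   # per-pixel offset vs. horizontal neighbors
    valid = np.abs(offset) <= th_pix      # thPix guard against real scene content
    sums = np.where(valid, offset, 0.0).sum(axis=0)
    counts = np.maximum(valid.sum(axis=0), 1)
    return sums / counts                  # one offset estimate per column

# corrected = blurred - column_fpn_terms(blurred)[None, :]
```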
Further techniques for performing spatial row and column FPN correction processing are described in U.S. Patent Application No. 12/396,340 filed March 2, 2009, which is incorporated herein by reference in its entirety.
Referring again to Fig. 5, the updated row and column FPN terms determined in block 550 are stored (block 552) and applied (block 555) to the blurred image frame provided in block 545. After these terms are applied, some of the spatial row and column FPN in the blurred image frame may be reduced. However, because the terms are applied generally to rows and columns, additional FPN may remain, such as spatially uncorrelated FPN associated with pixel-to-pixel drift or other causes. Neighborhoods of spatially correlated FPN may also remain that are not directly associated with individual rows and columns. Accordingly, further processing may be performed to determine NUC terms, as described below.
In block 560, local contrast values (e.g., edge or gradient values, or absolute values, between adjacent pixels or small groups of pixels) in the blurred image frame are determined. If scene information in the blurred image frame includes contrasting areas that have not been significantly blurred (e.g., high-contrast edges in the original scene data), then such features may be identified by the contrast determination process of block 560.
For example, local contrast values in the blurred image frame may be calculated, or any other desired type of edge detection process may be applied to identify certain pixels in the blurred image as being part of an area of local contrast. Pixels marked in this manner may be considered to contain very high spatial frequency scene information that could otherwise be interpreted as FPN (e.g., such regions may correspond to portions of the scene that have not yet been sufficiently blurred). As such, these pixels may be excluded from the further determination of NUC terms. In one embodiment, such contrast detection processing may rely on a threshold higher than the expected contrast value associated with FPN (e.g., pixels exhibiting a contrast value above the threshold may be considered scene information, and those below the threshold may be considered to exhibit FPN).
In one embodiment, the contrast determination of block 560 may be performed on the blurred image frame after the row and column FPN terms have been applied to it (e.g., as shown in Fig. 5). In another embodiment, block 560 may be performed before block 550 to determine contrast before the row and column FPN terms are determined (e.g., to prevent scene-based contrast from contributing to that determination).
Following block 560, it is expected that any high spatial frequency content remaining in the blurred image frame may generally be attributed to spatially uncorrelated FPN. In this regard, following block 560, much of the other noise or the actual desired scene-based information has been removed or excluded from the blurred image frame due to: the intentional blurring of the image frames (e.g., by motion or defocusing in blocks 520 through 545), the application of the row and column FPN terms (block 555), and the contrast determination (block 560).
Thus, it can be expected that following block 560, any remaining high spatial frequency content (e.g., exhibited as areas of contrast or difference in the blurred image frame) is attributable to spatially uncorrelated FPN. Accordingly, in block 565, the blurred image frame is high-pass filtered. In one embodiment, this may include applying a high-pass filter to extract the high spatial frequency content from the blurred image frame. In another embodiment, this may include applying a low-pass filter to the blurred image frame and taking the difference between the low-pass filtered image frame and the unfiltered image frame to obtain the high spatial frequency content. In accordance with various embodiments of the present disclosure, a high-pass filter may be implemented by calculating the mean difference between a sensor signal (e.g., a pixel value) and its neighbors.
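A sketch of the second variant described above (high pass obtained as image minus low pass), assuming scipy is available; the kernel size is illustrative.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def high_pass(blurred, kernel=3):
    """High-pass filtering per block 565: subtract a low-pass (local mean)
    filtered copy of the blurred frame from the frame itself, leaving the
    high spatial frequency content attributed to uncorrelated FPN."""
    img = blurred.astype(np.float32)
    return img - uniform_filter(img, size=kernel)
```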
In block 570, a flat field correction process is performed on the high-pass filtered blurred image frame to determine updated NUC terms (e.g., if a NUC process has not previously been performed, then the updated NUC terms may be new NUC terms in the first iteration of block 570).
For example, Fig. 7 illustrates a flat field correction technique 700 in accordance with an embodiment of the disclosure. In Fig. 7, a NUC term may be determined for each pixel 710 of the blurred image frame using the values of its neighboring pixels 712 to 726. For each pixel 710, several gradients may be determined based on the absolute differences between the values of various adjacent pixels. For example, absolute differences may be determined between: pixels 712 and 714 (a left-to-right diagonal gradient), pixels 716 and 718 (a top-to-bottom vertical gradient), pixels 720 and 722 (a right-to-left diagonal gradient), and pixels 724 and 726 (a left-to-right horizontal gradient).
These absolute differences may be summed to provide a summed gradient for pixel 710. A weight value may be determined for pixel 710 that is inversely proportional to the summed gradient. This process may be performed for all pixels 710 of the blurred image frame until a weight value is provided for each pixel. For areas with low gradients (e.g., blurred areas or areas with low contrast), the weight value will be close to one. Conversely, for areas with high gradients, the weight value will be zero or close to zero. The update to the NUC term, as estimated by the high-pass filter, is multiplied by the weight value.
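An illustrative rendering of the gradient-weighting scheme of Fig. 7, taking as input the high-pass filtered blurred frame from block 565. The 1/(1+g) curve is one convenient inverse-proportional weighting, not necessarily the exact one used in the disclosure, and the negative sign reflects the assumption that the NUC term should cancel the residual FPN.

```python
import numpy as np

def ffc_weights(hp):
    """Sum four directional absolute gradients around each pixel and
    weight the NUC update inversely to that sum (low gradient -> ~1)."""
    h, w = hp.shape
    p = np.pad(hp.astype(np.float32), 1, mode='edge')
    g = np.zeros((h, w), dtype=np.float32)
    g += np.abs(p[0:h, 0:w]   - p[2:h+2, 2:w+2])   # diagonal gradient
    g += np.abs(p[0:h, 2:w+2] - p[2:h+2, 0:w])     # other diagonal gradient
    g += np.abs(p[0:h, 1:w+1] - p[2:h+2, 1:w+1])   # vertical gradient
    g += np.abs(p[1:h+1, 0:w] - p[1:h+1, 2:w+2])   # horizontal gradient
    return 1.0 / (1.0 + g)                          # illustrative inverse weighting

def nuc_update(hp):
    """Weighted NUC update estimated from the high-pass content."""
    return -hp * ffc_weights(hp)
```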
In one embodiment, the risk of introducing scene information into the NUC terms can be further reduced by applying a certain amount of temporal damping to the NUC term determination process. For example, a temporal damping factor λ between 0 and 1 may be chosen such that the new NUC term stored (NUC_NEW) is a weighted average of the old NUC term (NUC_OLD) and the estimated updated NUC term (NUC_UPDATE). In one embodiment, this can be expressed as: NUC_NEW = λ·NUC_OLD + (1 − λ)·(NUC_OLD + NUC_UPDATE).
Although the determination of NUC terms has been described with regard to gradients, local contrast values may be used instead where appropriate. Other techniques may also be used, such as standard deviation calculations. Other types of flat field correction processes may be performed to determine NUC terms, including, for example, the various processes described in U.S. Patent No. 6,028,309 issued February 22, 2000; U.S. Patent No. 6,812,465 issued November 2, 2004; and U.S. Patent Application No. 12/114,865 filed May 5, 2008. The foregoing documents are incorporated herein by reference in their entirety.
Referring again to Fig. 5, block 570 may include additional processing of the NUC terms. For example, in one embodiment, to preserve the mean of the scene signal, the sum of all NUC terms may be normalized to zero by subtracting the mean NUC term from each NUC term. Also in block 570, to prevent row and column noise from affecting the NUC terms, the mean value of each row and column may be subtracted from the NUC terms for each row and column. As a result, row and column FPN filters using the row and column FPN terms determined in block 550 may be better able to filter out row and column noise in further iterations (e.g., as shown in detail in Fig. 8) after the NUC terms are applied to captured images (e.g., in block 580, further described herein). In this regard, the row and column FPN filters may in general use more data to calculate the per-row and per-column offset coefficients (e.g., the row and column FPN terms) and may thus provide a more robust option for reducing spatially correlated FPN than the NUC terms, which are based on high-pass filtering to capture spatially uncorrelated noise.
In blocks 571-573, additional high-pass filtering and further determination of updated NUC terms may optionally be performed to remove spatially correlated FPN with lower spatial frequency than that previously removed by the row and column FPN terms. In this regard, some variability in infrared sensors 132 or other components of infrared imaging module 100 may produce spatially correlated FPN noise that cannot easily be modeled as row or column noise. Such spatially correlated FPN may include, for example, defects on the sensor assembly or clusters of infrared sensors 132 that respond to irradiance differently than their neighboring infrared sensors 132. In one embodiment, an offset correction may be used to reduce such spatially correlated FPN. If the amount of such spatially correlated FPN is significant, the noise may also be detected in the blurred image frame. Because this type of noise affects a neighborhood of pixels, a high-pass filter with a very small kernel may fail to detect the FPN within that neighborhood (e.g., all values used by the high-pass filter may be extracted from pixels near the affected pixels, and thus may all suffer from the same offset error). For example, if the high-pass filtering of block 565 is performed with a small kernel (e.g., considering only the immediately adjacent pixels that fall within the neighborhood of pixels affected by spatially correlated FPN), then broadly distributed spatially correlated FPN may not be detected.
For example, Fig. 11 illustrates spatially correlated FPN in a neighborhood of pixels in accordance with an embodiment of the disclosure. As shown in a sample image frame 1100, a neighborhood of pixels 1110 may exhibit spatially correlated FPN that is not precisely correlated with individual rows and columns and is distributed over a neighborhood of several pixels (e.g., a neighborhood of approximately 4 × 4 pixels in this example). The sample image frame 1100 also includes a set of pixels 1120 exhibiting substantially uniform response that are not used in filtering calculations, and a set of pixels 1130 that are used to estimate a low-pass value for the neighborhood of pixels 1110. In one embodiment, pixels 1130 may be a number of pixels divisible by two to facilitate efficient hardware or software calculations.
Referring again to Fig. 5, in blocks 571-573, additional high-pass filtering and further determination of updated NUC terms may optionally be performed to remove spatially correlated FPN such as that exhibited by pixels 1110. In block 571, the updated NUC terms determined in block 570 are applied to the blurred image frame. Thus, at this point, the blurred image frame will have been initially corrected for spatially correlated FPN (e.g., by application of the updated row and column FPN terms in block 555), and also initially corrected for spatially uncorrelated FPN (e.g., by application of the updated NUC terms in block 571).
In block 572, a further high-pass filter is applied with a larger kernel than the high-pass filter used in block 565, and further updated NUC terms may be determined in block 573. For example, to detect the spatially correlated FPN present in pixels 1110, the high-pass filter applied in block 572 may include data from a sufficiently large neighborhood of pixels such that differences can be determined between unaffected pixels (e.g., pixels 1120) and affected pixels (e.g., pixels 1110). For example, a low-pass filter with a large kernel can be used (e.g., an N × N kernel much greater than 3 × 3 pixels), and the result can be subtracted to perform the appropriate high-pass filtering.
In one embodiment, a sparse kernel may be used for computational efficiency, such that only a small number of neighboring pixels inside the N × N neighborhood are used. For any given high-pass filter operation using distant neighbors (e.g., a high-pass filter with a large kernel), there is a risk of modeling actual (potentially blurred) scene information as spatially correlated FPN. Accordingly, in one embodiment, the temporal damping factor λ may be set close to 1 for the updated NUC terms determined in block 573.
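A sketch of the large-kernel, sparse-sample variant of blocks 572-573, under stated assumptions: the low-pass estimate is built from only eight samples on a ring whose radius (illustrative here) is large enough to fall outside a small correlated-FPN neighborhood, and is then subtracted from the image.

```python
import numpy as np

def sparse_large_kernel_hp(img, radius=8):
    """Large-kernel high pass: estimate a low-pass value from a sparse
    set of distant neighbors (outside the correlated-FPN neighborhood)
    and subtract it from each pixel."""
    h, w = img.shape
    p = np.pad(img.astype(np.float32), radius, mode='edge')
    low = np.zeros((h, w), dtype=np.float32)
    # sample 8 ring points instead of the full N x N kernel
    offsets = [(-radius, 0), (radius, 0), (0, -radius), (0, radius),
               (-radius, -radius), (-radius, radius),
               (radius, -radius), (radius, radius)]
    for dy, dx in offsets:
        low += p[radius + dy:radius + dy + h, radius + dx:radius + dx + w]
    low /= len(offsets)
    return img.astype(np.float32) - low
```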
In various embodiments, blocks 571-573 may be repeated (e.g., cascaded) to iteratively perform high-pass filtering with increasing kernel sizes, thereby providing further updated NUC terms that further correct for spatially correlated FPN over neighborhoods of the desired size. In one embodiment, the decision to perform such iterations may be based on whether spatially correlated FPN has actually been removed by the updated NUC terms from the previous performance of blocks 571-573.
After blocks 571-573 are finished, a decision is made regarding whether to apply the updated NUC terms to captured image frames (block 574). For example, if the average of the absolute value of the NUC terms for the entire image frame is less than a minimum threshold value, or greater than a maximum threshold value, the NUC terms may be deemed spurious or unlikely to provide meaningful correction. Alternatively, thresholding criteria may be applied to individual pixels to determine which pixels receive updated NUC terms. In one embodiment, the threshold values may correspond to differences between the newly calculated NUC terms and the previously calculated NUC terms. In another embodiment, the threshold values may be independent of the previously calculated NUC terms. Other tests (e.g., spatial correlation tests) may be applied to determine whether the NUC terms should be applied.
If the NUC terms are deemed spurious or unlikely to provide meaningful correction, the flow diagram returns to block 505. Otherwise, the newly determined NUC terms are stored (block 575) to replace the previous NUC terms (e.g., determined by a previously performed iteration of Fig. 5) and applied (block 580) to captured image frames.
Fig. 8 illustrates various image processing techniques of Fig. 5 and other operations applied in an image processing pipeline 800 in accordance with an embodiment of the disclosure. In this regard, pipeline 800 identifies the various operations of Fig. 5 in the context of an overall iterative image processing scheme for correcting image frames provided by infrared imaging module 100. In some embodiments, pipeline 800 may be provided by processing module 160 or processor 195 (both also generally referred to as a processor) operating on image frames captured by infrared sensors 132.
Image frames captured by infrared sensors 132 may be provided to a frame averager 804 that integrates multiple image frames to provide image frames 802 with an improved signal-to-noise ratio. Frame averager 804 may be effectively provided by infrared sensors 132, ROIC 402, and other components of infrared sensor assembly 128 implemented to support high image capture rates. For example, in one embodiment, infrared sensor assembly 128 may capture infrared image frames at a frame rate of 240 Hz (e.g., 240 images per second). In this embodiment, such a high frame rate may be achieved, for example, by operating infrared sensor assembly 128 at relatively low voltages (e.g., compatible with mobile telephone voltages) and by using a relatively small array of infrared sensors 132 (e.g., a 64 × 64 infrared sensor array in one embodiment).
In one embodiment, the infrared image frames may be provided from infrared sensor assembly 128 to processing module 160 at a high frame rate (e.g., 240 Hz or other frame rates). In another embodiment, infrared sensor assembly 128 may integrate over longer time periods, or multiple time periods, to provide integrated (e.g., averaged) infrared image frames to processing module 160 at a lower frame rate (e.g., 30 Hz, 9 Hz, or other frame rates). Further details regarding implementations that may be used to provide high image capture rates may be found in U.S. Provisional Patent Application No. 61/495,879 filed June 10, 2011 and previously referenced herein.
Image frames 802 proceed through pipeline 800, where they are adjusted by various terms, temporally filtered, used to determine the various adjustment terms, and gain compensated.
In blocks 810 and 814, factory gain terms 812 and factory offset terms 816 are applied to image frames 802 to compensate, respectively, for gain and offset differences between the various infrared sensors 132 and/or other components of infrared imaging module 100 determined during manufacturing and testing.
In block 580, NUC terms 817 are applied to image frames 802 to correct for FPN as discussed above. In one embodiment, if NUC terms 817 have not yet been determined (e.g., before a NUC process has been initiated), then block 580 may not be performed, or initialization values may be used for NUC terms 817 that result in no alteration to the image data (e.g., the offset for every pixel would equal zero).
In blocks 818 and 822, column FPN terms 820 and row FPN terms 824, respectively, are applied to image frames 802. Column FPN terms 820 and row FPN terms 824 may be determined in accordance with block 550 as discussed above. In one embodiment, if column FPN terms 820 and row FPN terms 824 have not yet been determined (e.g., before a NUC process has been initiated), then blocks 818 and 822 may not be performed, or initialization values may be used for column FPN terms 820 and row FPN terms 824 that result in no alteration to the image data (e.g., the offset for every pixel would equal zero).
In block 826, temporal filtering is performed on image frames 802 in accordance with a temporal noise reduction (TNR) process. Fig. 9 illustrates a TNR process in accordance with an embodiment of the disclosure. In Fig. 9, a presently received image frame 802a and a previously temporally filtered image frame 802b are processed to determine a new temporally filtered image frame 802e. Image frames 802a and 802b include local neighborhoods of pixels 803a and 803b centered around pixels 805a and 805b, respectively. Neighborhoods 803a and 803b correspond to the same locations within image frames 802a and 802b and are subsets of the total pixels of image frames 802a and 802b. In the illustrated embodiment, neighborhoods 803a and 803b include areas of 5 × 5 pixels. Other neighborhood sizes may be used in other embodiments.
Differences between corresponding pixels of neighborhoods 803a and 803b are determined and averaged to provide an averaged delta value 805c for the location corresponding to pixels 805a and 805b. The averaged delta value 805c is used in block 807 to determine weight values to be applied to pixels 805a and 805b of image frames 802a and 802b.
In one embodiment, as shown in graph 809, the weight values determined in block 807 may be inversely proportional to averaged delta value 805c, such that the weight values drop rapidly toward zero when there are large differences between neighborhoods 803a and 803b. In this regard, large differences between neighborhoods 803a and 803b may indicate that changes have occurred within the scene (e.g., due to motion), and in one embodiment pixels 805a and 805b may be appropriately weighted to avoid introducing blur across frame-to-frame scene changes. Other associations between the weight values and averaged delta value 805c may be used in other embodiments.
The weight values determined in block 807 may be applied to pixels 805a and 805b to determine the value of the corresponding pixel 805e of image frame 802e (block 811). In this regard, pixel 805e may have a value that is a weighted average (or other combination) of pixels 805a and 805b, depending on averaged delta value 805c and the weight values determined in block 807.
For example, pixel 805e of the temporally filtered image frame 802e may be a weighted sum of pixels 805a and 805b of image frames 802a and 802b. If the average difference between pixels 805a and 805b is due to noise, then it may be expected that the average change between the neighborhoods around pixels 805a and 805b will be close to zero (e.g., corresponding to the average of uncorrelated changes). Under such circumstances, it may also be expected that the sum of the differences between the neighborhoods will be close to zero. In this case, pixel 805a of image frame 802a may be appropriately weighted to contribute to the value of pixel 805e.
However, if the sum of such differences is not zero (e.g., in one embodiment, even if differing from zero by only a small amount), then the changes may be interpreted as being attributable to motion rather than noise. Thus, motion may be detected based on the average change exhibited by the neighborhoods. In this case, a large weight may be applied to pixel 805a of image frame 802a, while a small weight is applied to pixel 805b of image frame 802b.
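An illustrative condensation of the TNR process of Fig. 9, assuming scipy; the 5 × 5 neighborhood follows the illustrated embodiment, while the weight curve and the scale constant are placeholders for the inverse relationship of graph 809.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def tnr(curr, prev_filtered, scale=10.0):
    """Blend the current frame with the previous filtered frame, with a
    per-pixel weight that falls off as the local (5x5) averaged
    frame-to-frame delta grows (i.e., as motion becomes likely)."""
    delta = np.abs(curr.astype(np.float32) - prev_filtered.astype(np.float32))
    avg_delta = uniform_filter(delta, size=5)   # averaged delta over the neighborhood
    w_prev = 1.0 / (1.0 + avg_delta / scale)    # illustrative inverse relationship
    return w_prev * prev_filtered + (1.0 - w_prev) * curr
```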
Other embodiments are also contemplated. For example, although averaged delta value 805c has been described as being determined based on neighborhoods 803a and 803b, in other embodiments averaged delta value 805c may be determined based on any desired criteria (e.g., based on individual pixels or other types of pixel groups made up of a series of pixels).
In the above embodiments, image frame 802a has been described as a presently received image frame and image frame 802b as a previously temporally filtered image frame. In another embodiment, image frames 802a and 802b may be first and second image frames captured by infrared imaging module 100 that have not been temporally filtered.
Fig. 10 illustrates further implementation details relating to the TNR process performed by block 826. As shown in Fig. 10, image frames 802a and 802b may be read into line buffers 1010a and 1010b, respectively, and image frame 802b (e.g., the previous image frame) may be stored in a frame buffer 1020 before being read into line buffer 1010b. In one embodiment, line buffers 1010a-b and frame buffer 1020 may be implemented by a block of random access memory (RAM) provided by any appropriate component of infrared imaging module 100 and/or host device 102.
Referring again to Fig. 8, image frame 802e may be passed to an automatic gain compensation block 828, which further processes image frame 802e to provide a result image frame 830 that may be used by host device 102 as desired.
Fig. 8 further illustrates the various operations that may be performed to determine the row and column FPN terms and NUC terms as discussed. In one embodiment, these operations may use image frames 802e as shown in Fig. 8. Because image frames 802e have already been temporally filtered, at least some temporal noise may have been removed and thus will not inadvertently affect the determination of row and column FPN terms 824 and 820 and NUC terms 817. In another embodiment, non-temporally-filtered image frames 802 may be used instead.
In Fig. 8, blocks 510, 515, and 520 of Fig. 5 are represented collectively. As discussed, a NUC process may be selectively initiated in response to various NUC process initiating events and performed based on various criteria or conditions. Also as discussed, the NUC process may be performed in accordance with a motion-based approach (blocks 525, 535, and 540) or a defocus-based approach (block 530) to provide a blurred image frame (block 545). Fig. 8 further illustrates the various additional blocks 550, 552, 555, 560, 565, 570, 571, 572, 573, and 575 previously discussed with regard to Fig. 5.
As shown in Fig. 8, row and column FPN terms 824 and 820 and NUC terms 817 may be determined and applied in an iterative fashion, such that updated terms are determined using image frames 802 to which the previous terms have already been applied. As a result, the overall process of Fig. 8 may repeatedly update and apply the terms to continuously reduce the noise in the image frames 830 to be used by host device 102.
Referring again to Fig. 10, further implementation details are illustrated for various blocks of Figs. 5 and 8 in relation to pipeline 800. For example, blocks 525, 535, and 540 are shown operating at the normal frame rate of the image frames 802 received by pipeline 800. In the embodiment shown in Fig. 10, the determination made in block 525 is represented as a decision diamond used to determine whether a given image frame 802 has sufficiently changed such that, if added to the other image frames, it will enhance the blur; it is therefore accumulated (block 535, represented by an arrow in this embodiment) and averaged (block 540).
Also in Fig. 10, the determination of column FPN terms 820 (block 550) is shown operating at an update rate that, in this example, is 1/32 of the sensor frame rate (e.g., the normal frame rate) due to the averaging performed in block 540. Other update rates may be used in other embodiments. Although only column FPN terms 820 are identified in Fig. 10, row FPN terms 824 may be implemented in the same fashion at the reduced frame rate.
Fig. 10 also illustrates further implementation details relating to the NUC determination process of block 570. In this regard, the blurred image frame may be read into a line buffer 1030 (e.g., implemented by a block of RAM provided by any appropriate component of infrared imaging module 100 and/or host device 102). The flat field correction technique 700 of Fig. 7 may then be performed on the blurred image frame.
In view of the present disclosure, it will be appreciated that the techniques described herein may be used to remove various types of FPN (e.g., including FPN of very high amplitude), such as spatially correlated row and column FPN and spatially uncorrelated FPN.
Other embodiments are also contemplated. For example, in one embodiment, the rate at which row and column FPN terms and/or NUC terms are updated may be inversely proportional to the estimated amount of blur in the blurred image frame and/or inversely proportional to the magnitude of the local contrast values (e.g., as determined in block 560).
In various embodiments, the described techniques provide advantages over traditional shutter-based noise correction techniques. For example, by using a shutterless process, no shutter (e.g., such as shutter 105) need be provided, permitting reductions in size, weight, cost, and mechanical complexity. The power and maximum voltage supplied to, or generated by, infrared imaging module 100 may also be reduced if no shutter needs to be mechanically operated. Reliability is improved by removing the shutter as a potential point of failure. A shutterless process also eliminates the potential image interruptions caused by the temporary blockage of the imaged scene by a shutter.
In addition, by intentionally correcting for noise using blurred image frames captured from a real-world scene (rather than the uniform scene provided by a shutter), noise correction may be performed on image frames having irradiance levels similar to those of the actual scenes desired to be imaged. This can improve the accuracy and effectiveness of the noise correction terms determined in accordance with the various described techniques.
As discussed, in various embodiments, infrared imaging module 100 may be configured to operate at low voltage levels. In particular, infrared imaging module 100 may be implemented with circuitry configured to operate at low power consumption and/or in accordance with other parameters that permit infrared imaging module 100 to be conveniently and effectively implemented in various types of host devices 102 (e.g., mobile devices and other devices).
For example, Fig. 12 illustrates a block diagram of another implementation of infrared sensor assembly 128, including infrared sensors 132 and a low-dropout regulator (LDO) 1220, in accordance with an embodiment of the disclosure. As shown, Fig. 12 also illustrates various components 1202, 1204, 1205, 1206, 1208, and 1210, which may be implemented in the same or a similar manner as the corresponding components previously described with regard to Fig. 4. Fig. 12 also illustrates bias correction circuitry 1212, which may be used to adjust one or more bias voltages provided to infrared sensors 132 (e.g., to compensate for temperature changes, self-heating, and/or other factors).
In some embodiments, LDO 1220 may be provided as part of infrared sensor assembly 128 (e.g., on the same chip and/or wafer-level package as the ROIC). For example, LDO 1220 may be provided as part of an FPA with infrared sensor assembly 128. As discussed, such implementations may reduce the power supply noise introduced to infrared sensor assembly 128 and thus provide an improved PSRR. In addition, by implementing the LDO with the ROIC, less die area is consumed and fewer discrete die (or chips) are needed.
LDO 1220 receives an input voltage provided by a power source 1230 over a supply line 1232. LDO 1220 provides an output voltage to various components of infrared sensor assembly 128 over supply lines 1222. In this regard, LDO 1220 may provide substantially identical regulated output voltages to the various components of infrared sensor assembly 128 in response to a single input voltage received from power source 1230, in accordance with various techniques such as those described in U.S. Patent Application No. 14/101,245 filed December 9, 2013 (incorporated herein by reference in its entirety).
For example, in some embodiments, power source 1230 may provide an input voltage in a range of approximately 2.8 v to approximately 11 v (e.g., approximately 2.8 v in one embodiment), and LDO 1220 may provide an output voltage in a range of approximately 1.5 v to approximately 2.8 v (e.g., approximately 2.8, 2.5, or 2.4 v, and/or lower voltages in various embodiments). In this regard, LDO 1220 may be used to provide a consistent regulated output voltage regardless of whether power source 1230 is implemented with a conventional voltage range of approximately 9 v to approximately 11 v or with a low voltage (e.g., approximately 2.8 v). Thus, although various ranges are provided for the input and output voltages, it is contemplated that the output voltage of LDO 1220 will remain fixed despite changes in the input voltage.
The implementation of LDO 1220 as part of infrared sensor assembly 128 provides various advantages over conventional power implementations for FPAs. For example, conventional FPAs typically rely on multiple power sources, each of which may be provided separately to the FPA and separately distributed to the various components of the FPA. By regulating the single power source 1230 with LDO 1220, appropriate voltages may be separately provided (e.g., to reduce possible noise) to all components of infrared sensor assembly 128 with reduced complexity. The use of LDO 1220 also allows infrared sensor assembly 128 to operate in a consistent manner even if the input voltage from power source 1230 changes (e.g., if the input voltage increases or decreases as a result of charging or discharging a battery or other type of device used for power source 1230).
The various components of infrared sensor assembly 128 shown in Fig. 12 may also be implemented to operate at lower voltages than conventional devices. For example, as discussed, LDO 1220 may be implemented to provide a low voltage (e.g., approximately 2.5 v). This contrasts sharply with the multiple higher voltages typically used to power conventional FPAs, such as: voltages of approximately 3.3 v to approximately 5 v used to power digital circuitry; voltages of approximately 3.3 v used to power analog circuitry; and voltages of approximately 9 v to approximately 11 v used to power loads. Also, in some embodiments, the use of LDO 1220 may reduce or eliminate the need for a separate negative reference voltage to be provided to infrared sensor assembly 128.
Additional aspects of the low-voltage operation of infrared sensor assembly 128 may be further understood with reference to Fig. 13. Fig. 13 illustrates a circuit diagram of a portion of the infrared sensor assembly 128 of Fig. 12 in accordance with an embodiment of the disclosure. In particular, Fig. 13 illustrates additional components of bias correction circuitry 1212 (e.g., components 1326, 1330, 1332, 1334, 1336, 1338, and 1341) connected to LDO 1220 and infrared sensors 132. For example, bias correction circuitry 1212 may be used to compensate for temperature-dependent changes in bias voltages in accordance with an embodiment of the present disclosure. The operation of these additional components may be further understood with reference to the similar components identified in U.S. Patent No. 7,679,048 issued March 16, 2010, which is hereby incorporated by reference in its entirety. Infrared sensor assembly 128 may also be implemented in accordance with the various components identified in U.S. Patent No. 6,812,465 issued November 2, 2004, which is hereby incorporated by reference in its entirety.
In various embodiments, all or part of bias correction circuitry 1212 may be implemented on a global array basis as shown in Fig. 13 (e.g., used collectively for all infrared sensors 132 in an array). In other embodiments, all or part of bias correction circuitry 1212 may be implemented on an individual sensor basis (e.g., duplicated in whole or in part for each infrared sensor 132). In some embodiments, bias correction circuitry 1212 and the other components of Fig. 13 may be implemented as part of ROIC 1202.
As shown in Fig. 13, LDO 1220 provides a load voltage Vload to bias correction circuitry 1212 along one of supply lines 1222. As discussed, in some embodiments Vload may be approximately 2.5 v, in contrast to the higher voltages of approximately 9 v to approximately 11 v that may be used as load voltages in traditional infrared imaging devices.
Based on Vload, bias correction circuitry 1212 provides a sensor bias voltage Vbolo at a node 1360. Vbolo may be distributed to one or more infrared sensors 132 through appropriate switching circuitry 1370 (e.g., represented by broken lines in Fig. 13). In some examples, switching circuitry 1370 may be implemented in accordance with the appropriate components identified in the previously cited U.S. Patent Nos. 6,812,465 and 7,679,048.
Each infrared sensor 132 includes a node 1350 that receives Vbolo through switching circuitry 1370, and another node 1352 that may be connected to ground, a substrate, and/or a negative reference voltage. In some embodiments, the voltage at node 1360 is substantially the same as the Vbolo provided at nodes 1350. In other embodiments, the voltage at node 1360 may be adjusted to compensate for possible voltage drops associated with switching circuitry 1370 and/or other factors.
Vbolo may be implemented with lower voltages than are typically used for traditional infrared sensor biasing. In one embodiment, Vbolo may be in a range of approximately 0.2 v to approximately 0.7 v. In another embodiment, Vbolo may be in a range of approximately 0.4 v to approximately 0.6 v. In yet another embodiment, Vbolo may be approximately 0.5 v. By contrast, traditional infrared sensors typically use bias voltages of approximately 1 v.
The use of a lower bias voltage for infrared sensors 132 in accordance with the present disclosure permits infrared sensor assembly 128 to exhibit significantly reduced power consumption in comparison with traditional infrared imaging devices. In particular, the power consumption of each infrared sensor 132 is reduced with the square of the bias voltage. Thus, a reduction in voltage (e.g., from 1.0 v down to 0.5 v) provides a significant reduction in power consumption, particularly when applied across the many infrared sensors 132 of an infrared sensor array. This reduction in power may also result in reduced self-heating of infrared sensor assembly 128.
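As a worked illustration of this quadratic relationship (assuming resistive dissipation of the form $P = V^2/R$ in each bolometer), halving the bias voltage from the traditional 1.0 v to 0.5 v reduces per-sensor bias power to one quarter:

$$\frac{P_{0.5\,\mathrm{v}}}{P_{1.0\,\mathrm{v}}} = \left(\frac{0.5\ \mathrm{v}}{1.0\ \mathrm{v}}\right)^{2} = 0.25$$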
In accordance with additional embodiments of the present disclosure, various techniques are provided for reducing the effects of noise in image frames provided by infrared imaging devices operating at low voltages. In this regard, when infrared sensor assembly 128 is operated at the low voltages described, noise, self-heating, and/or other phenomena may, if uncorrected, become more pronounced in the image frames provided by infrared sensor assembly 128.
For example, referring to Fig. 13, when LDO 1220 maintains Vload at a low voltage in the manner described herein, Vbolo will also be maintained at its corresponding low voltage, and the relative size of its output signals may be reduced. As a result, noise, self-heating, and/or other phenomena may have a greater effect on the smaller output signals read out from infrared sensors 132, resulting in variations (e.g., errors) in the output signals. If uncorrected, these variations may appear as noise in the image frames. Moreover, although low-voltage operation may reduce the overall magnitude of certain phenomena (e.g., self-heating), the smaller output signals may cause the remaining error sources (e.g., residual self-heating) to have a disproportionate effect on the output signals during low-voltage operation.
To compensate for such phenomena, infrared sensor assembly 128, infrared imaging module 100, and/or host device 102 may be implemented with various array sizes, frame rates, and/or frame averaging techniques. For example, as discussed, a variety of different array sizes is contemplated for infrared sensors 132. In some embodiments, infrared sensors 132 may be implemented with array sizes ranging from 32 × 32 to 160 × 120. Other example array sizes include 80 × 64, 80 × 60, 64 × 64, and 64 × 32. Any desired array size may be used.
Advantageously, when implemented with such relatively small array sizes, infrared sensor assembly 128 may provide image frames at relatively high frame rates without requiring significant changes to the ROIC and related circuitry. For example, in some embodiments, frame rates may range from approximately 120 Hz to approximately 480 Hz.
In some embodiments, the array size and the frame rate may be scaled relative to each other (e.g., in an inversely proportional manner or otherwise), such that larger arrays are implemented with lower frame rates and smaller arrays are implemented with higher frame rates. For example, in one example, an array of 160 × 120 may provide a frame rate of approximately 120 Hz. In another embodiment, an array of 80 × 60 may provide a correspondingly higher frame rate of approximately 240 Hz. Other frame rates are also contemplated.
By scaling the array size and the frame rate relative to each other, the particular readout timing of the rows and/or columns of the FPA may remain constant, regardless of the actual FPA size or frame rate. In one embodiment, the readout timing may be approximately 63 microseconds per row or column.
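A quick consistency check of the example figures (assuming row-sequential readout, one 63 microsecond slot per row) shows that the constant readout timing fits within the frame period in both cases:

$$120\ \text{rows} \times 63\,\mu\mathrm{s} \approx 7.6\ \mathrm{ms} < \tfrac{1}{120\ \mathrm{Hz}} \approx 8.3\ \mathrm{ms}, \qquad 60\ \text{rows} \times 63\,\mu\mathrm{s} \approx 3.8\ \mathrm{ms} < \tfrac{1}{240\ \mathrm{Hz}} \approx 4.2\ \mathrm{ms}$$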
As discussed previously with regard to Fig. 8, the image frames captured by infrared sensors 132 may be provided to a frame averager 804 that integrates multiple image frames to provide image frames 802 (e.g., processed image frames) having a lower frame rate (e.g., approximately 30 Hz, approximately 60 Hz, or other frame rates) and an improved signal-to-noise ratio. In particular, by averaging the high frame rate image frames provided by a relatively small FPA, the image noise attributable to low-voltage operation may be effectively averaged out and/or substantially reduced in image frames 802. Accordingly, infrared sensor assembly 128 may be operated at the relatively low voltages provided by LDO 1220 as discussed, without being subject to additional noise and related side effects in the resulting image frames 802 after processing by frame averager 804.
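To illustrate the noise benefit (assuming uncorrelated temporal noise, for which averaging N frames improves the signal-to-noise ratio by a factor of $\sqrt{N}$), averaging a 240 Hz stream down to 30 Hz gives:

$$N = \frac{240\ \mathrm{Hz}}{30\ \mathrm{Hz}} = 8, \qquad \sqrt{8} \approx 2.8\times\ \text{SNR improvement}$$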
Other embodiments are also contemplated. For example, although a single array of infrared sensors 132 is illustrated, it is contemplated that multiple such arrays may be used together to provide higher-resolution image frames (e.g., a scene may be imaged across multiple such arrays). Such arrays may be provided in multiple infrared sensor assemblies 128 and/or in the same infrared sensor assembly 128. Each such array may be operated at low voltages as described, and each such array may also be provided with associated ROIC circuitry such that each array may still operate at a relatively high frame rate. The high frame rate image frames provided by such arrays may be averaged by shared or dedicated frame averagers 804 to reduce and/or eliminate the noise associated with low-voltage operation. As a result, high-resolution thermal images may be obtained while still operating at low voltages.
In various embodiments, infrared sensor assembly 128 may be implemented with appropriate dimensions to permit infrared imaging module 100 to be used with a small form factor socket 104 (e.g., a socket for mobile devices). For example, in some embodiments, infrared sensor assembly 128 may be implemented with a chip size in a range of approximately 4.0 mm × approximately 4.0 mm to approximately 5.5 mm × approximately 5.5 mm (e.g., approximately 4.0 mm × approximately 5.5 mm in one embodiment). Infrared sensor assembly 128 may be implemented with such sizes or other appropriate sizes to permit use with a socket 104 implemented in various sizes, such as: 8.5 mm × 8.5 mm, 8.5 mm × 5.9 mm, 6.0 mm × 6.0 mm, 5.5 mm × 5.5 mm, 4.5 mm × 4.5 mm, and/or other socket sizes such as, for example, those identified in Table 1 of U.S. Provisional Patent Application No. 61/495,873 filed June 10, 2011, which is incorporated herein by reference in its entirety.
Fig. 14 illustrates a block diagram of an imaging system 1400 (e.g., an infrared camera) for capturing and/or processing images (e.g., digital still images or video) in accordance with an embodiment of the disclosure. For example, as further described below, various embodiments of imaging system 1400 may be adapted to perform various operations of a process to suppress sky regions in images. In various embodiments, imaging system 1400 may include an image sensor device 1402, a processor 1406, a memory 1408, an orientation sensor 1410, a display 1412, and/or a network interface 1414. In various embodiments, the components of imaging system 1400 may be implemented in the same or a similar manner as the corresponding components of host device 102 of Fig. 1. Moreover, the components of imaging system 1400 may be configured to perform the various NUC processes described above.
In some embodiments, the various components of imaging system 1400 may be distributed and in communication with one another over a network 1420. In this regard, network interface 1414 may be configured to facilitate wired and/or wireless communication among the various components of imaging system 1400 over network 1420. In some embodiments, at least a portion of imaging system 1400 may be implemented with a remote device 1416 (e.g., a conventional digital video recorder (DVR), an image processing computer, a network camera host device, and/or other devices) that communicates with the various components of imaging system 1400 over network 1420 via appropriate portions of network interface 1414. That is, for example, all or part of processor 1406, all or part of memory 1408, all or part of display 1412, and/or all or part of network interface 1414 may be implemented or replicated at remote device 1416. In one embodiment, for example, remote device 1416 (e.g., a network camera host device) may be adapted to receive images captured by a remotely located image sensor device 1402 (e.g., a surveillance camera) and to perform various image processing operations on the received images to suppress sky regions in the images, as described below. It will be appreciated that many other distributed implementations of imaging system 1400 are also possible without departing from the scope and spirit of the disclosure.
In various embodiments, image sensor device 1402 may be adapted to capture images of a scene (e.g., digital still images or video frames). In some embodiments, image sensor device 1402 may be implemented with the infrared imaging module 100 (e.g., a thermal imager or infrared camera) described above, comprising infrared sensor assembly 128 (e.g., an FPA), or with other suitable infrared imaging devices adapted to capture thermal images. Thus, in one embodiment, image sensor device 1402 may include the LDO 1220 adapted to provide a regulated voltage to the FPA and the bias circuitry 1212 adapted to provide a bias voltage to the infrared sensors 132 of the FPA, as described above. In one or more embodiments, the thermal images captured by image sensor device 1402 may comprise pixels having pixel values that represent infrared radiation levels associated with corresponding locations in the scene.
In other embodiments, image sensor device 1402 may alternatively or additionally be implemented with an imaging sensor array or module adapted to capture non-thermal images of a scene (e.g., images of electromagnetic radiation in visible light wavelengths, near-infrared (NIR) wavelengths, short-wave infrared (SWIR) wavelengths, ultraviolet (UV) wavelengths, and/or other non-thermal wavelengths). In these embodiments, conventional charge-coupled device (CCD) sensors, complementary metal-oxide-semiconductor (CMOS) sensors, electron-multiplying CCD (EMCCD) sensors, scientific-grade CMOS (sCMOS) sensors, or other sensors may additionally or alternatively be used to implement image sensor device 1402. In one or more embodiments, the non-thermal images captured at image sensor device 1402 may be visible light images comprising pixels having pixel values that represent radiation levels of radiation in the visible spectrum (e.g., luminance, brightness, or other values of visual intensity).
In some embodiments, image sensor device 1402 may include optical elements 1403 (e.g., lenses, prisms, optical fibers, and/or other elements) adapted to collect, direct, and/or focus visible light, infrared radiation, and/or radiation in other spectra from the scene onto the imaging sensor array of image sensor device 1402. In various embodiments, optical elements 1403 may also define a field of view (FOV) 1404 of image sensor device 1402. For example, the portion of a scene 1440 falling within FOV 1404 may be captured as an image 1442 by image sensor device 1402. In some embodiments, optical elements 1403 may include adjustable optical elements, such as zoom lenses. Optical elements 1403 according to these embodiments may be adjusted to change the FOV 1404 that optical elements 1403 provide. In one embodiment, optical elements 1403 may include a position sensor (e.g., a position encoder, a potentiometer, and/or other suitable components) adapted to provide feedback information relating to the adjustment of optical elements 1403 (e.g., the focal length set on a zoom lens).
In some embodiments, image sensor device 1402 may include an infrared imaging module (e.g., a thermal imager) and a non-thermal imaging module (e.g., a visible light camera), which may be implemented and used in accordance with any of the various techniques referenced herein. For example, the infrared and non-thermal imaging modules may be implemented as substantially co-located infrared and non-thermal cameras oriented such that the FOV of the non-thermal camera at least partially overlaps the FOV of the corresponding infrared imaging module. In these embodiments, the images and/or image data captured by these devices may be superimposed, fused, blended, or otherwise combined as described herein. Implementations of image sensor device 1402 with more than two imaging modules are also contemplated.
In some embodiments, image sensor device 1402 may optionally be mounted on, or otherwise positioned at, a location using an adjustable mount 1405, such that image sensor device 1402 can be panned or tilted to adjust FOV 1404 toward a desired orientation. For example, adjustable mount 1405 may be adapted to fix or otherwise attach a housing enclosing image sensor device 1402 and/or other components of imaging system 1400 to a suitable structure or location (e.g., a wall). In some embodiments, adjustable mount 1405 may include position sensors (e.g., position encoders, potentiometers, and/or other suitable components) adapted to detect rotation, translation, and/or other changes in the orientation of image sensor device 1402 relative to its installed position. In some embodiments, adjustable mount 1405 may include one or more drive mechanisms (e.g., actuators, motors, and/or other suitable devices) adapted to pan, tilt, or otherwise change the orientation of the attached image sensor device 1402 or imaging system 1400 in response to command signals (e.g., from remote device 1416).
The example scene 1440 shown in Fig. 14 may be a view including a sky and the ground. Thus, for example, a captured image 1442 of such a scene 1440 may include a sky region 1444 and a ground region 1446, which may be divided at the approximate location of a horizon 1448 in image 1442. In some embodiments, sky region 1444 may be identified as a region or range allowing some margin 1450 about horizon 1448, as further described herein. As may be appreciated, if the FOV 1404 of image sensor device 1402 changes, the locations of sky region 1444, ground region 1446, and horizon 1448 in the captured image 1442 may change.
As described in the processor 195 of composition graphs 1, any suitable treatment facility can be utilized to implement processor 1406.In some embodiments, processor 1406 described herein at least partially or certain operations may be embodied as the part of image sensor device 1402, such as, the part place of processing module 160 that describes of composition graphs 1 hereinbefore.Processor 1406 can be suitable for the miscellaneous part of camera chain 1400 alternately with communicate, to perform operation described herein and process.
In various embodiments, processor 1406 can be suitable for the image 1442 receiving the scene 1444 of being caught by image sensor device 1402, and on the image 1442 received, perform various image forming process operation to suppress the sky in image 1442, as further described herein.In some embodiments, processor 1406 can be suitable for accessing the image 1442 that the image 1442 be stored in storer 1408 is caught with reception.In other embodiments, processor 1406 can be suitable for receiving image 1442 and being stored in storer 1408 by image 1442.In some embodiments, as above, after perform various image forming process operation on the image 1442 received, processor 1406 can be suitable for the image processed to be stored in storer 1408.
As discussed above, for embodiments of imaging system 1400 having both an infrared imaging module and a non-thermal imaging module, processor 1406 may be configured to superimpose, fuse, blend, or otherwise combine infrared images (e.g., including thermal images) captured by the infrared imaging module and non-thermal images (e.g., including visible light images) captured by the non-thermal imaging module, to obtain enhanced infrared images with improved resolution and/or contrast.
In various embodiments, processor 1406 may include various hardware and/or software modules adapted to perform the various image processing operations for suppressing the sky in image 1442, as further described herein. In some embodiments, all or part of the software modules may be machine-executable software instructions stored on machine-readable medium 193, and these software instructions may be downloaded or otherwise transferred from machine-readable medium 193 to imaging system 1400 (e.g., into memory 1408) for execution by processor 1406. For example, the machine-executable software instructions may be executed by processor 1406 to perform the various operations of process 1500 described below. In some embodiments, processor 1406 may include hardware logic (e.g., implemented using circuits and other electronic components) configured to perform the various operations of process 1500 below. In some embodiments, certain operations of process 1500 may be performed by the hardware logic of processor 1406, while other operations of process 1500 may be performed by executing software instructions.
Memory 1408 may include one or more memory or electro-mechanical storage devices and associated logic (e.g., implemented in hardware, software, or a combination thereof) for storing and accessing data and information in the one or more memory or electro-mechanical storage devices. The one or more memory or electro-mechanical storage devices may include various types of volatile and non-volatile memory and storage, such as hard disk drives, flash memory, RAM (random access memory), EEPROM (electrically erasable read-only memory), and other devices for storing data and information. In various embodiments, memory 1408 may be adapted to store and access images (e.g., image 1442) to be processed by imaging system 1400, where the images are captured by image sensor device 1402, received via network interface 1414, and/or otherwise obtained. In some embodiments, processor 1406 may be adapted to execute software modules stored in memory 1408 to perform the various operations described herein.
Orientation sensor 1410 may be fixed relative to image sensor device 1402 to detect the orientation of image sensor device 1402. For example, orientation sensor 1410 may be attached to or otherwise mounted on image sensor device 1402 or other suitable components of imaging system 1400, to obtain measurements of the tilt, rotation, elevation, or other orientation of image sensor device 1402 relative to a reference point (e.g., relative to the ground plane or horizon of the earth). In various embodiments, orientation sensor 1410 may be implemented using gyroscopes, accelerometers, inclinometers, altimeters, inertial measurement units (IMUs), and/or other suitable sensors. For example, using the orientation measurements obtained by orientation sensor 1410, processor 1406 may be adapted to calculate or otherwise determine the approximate location of horizon 1448 in image 1442.
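By way of illustration only, and not as part of the original disclosure, the following minimal Python sketch shows one way a processor such as processor 1406 might estimate the horizon row from a pitch measurement, assuming an idealized pinhole camera; the function and parameter names are hypothetical and lens distortion is ignored.

```python
import numpy as np

def horizon_row(pitch_deg, vfov_deg, image_height):
    """Estimate the image row of the horizon from camera pitch.

    Assumes an idealized pinhole camera with vertical field of view
    vfov_deg, pitched up by pitch_deg relative to the earth's ground
    plane (an illustrative simplification).
    """
    # Focal length in pixels implied by the vertical FOV.
    f_px = (image_height / 2.0) / np.tan(np.radians(vfov_deg) / 2.0)
    # Pitching the camera up moves the horizon down in the image
    # (rows are numbered from the top).
    offset_px = f_px * np.tan(np.radians(pitch_deg))
    row = image_height / 2.0 + offset_px
    return int(np.clip(row, 0, image_height - 1))
```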
In some embodiments, orientation sensor 1410 may also include a global positioning system (GPS) receiver and/or an electronic compass. The GPS receiver and/or electronic compass may be implemented using suitable chipsets or electronic modules adapted for incorporation into miniaturized electronics to provide GPS receiver operations, electronic compass operations, or both. The GPS receiver and/or electronic compass may be used to obtain geographic location information relevant to imaging system 1400.
In some embodiments, imaging system 1400 may include a display 1412, which may be implemented as an image display (e.g., a liquid crystal display (LCD)) or various other types of known video displays or monitors. In some embodiments, processor 1406 may be adapted to display image data (e.g., captured and/or processed images) and other information on display 1412. In this regard, display 1412 may include display logic (e.g., hardware circuits and/or software programs) that may be used by processor 1406 to display image data and information. In some embodiments, display 1412 may additionally or alternatively be implemented on remote device 1416 and adapted to receive image data and information via network interface 1414, so as to display images to users remote from image sensor device 1402 and/or processor 1406.
In some embodiments, imaging system 1400 may include a network interface 1414, which may be adapted to handle, manage, or otherwise facilitate wired and/or wireless communication between imaging system 1400 and external devices, as well as among the various components of imaging system 1400 (e.g., including components implemented in remote device 1416), over network 1420 and/or other connections. In this regard, network interface 1414 according to some embodiments may be adapted to communicate using various wireless and/or wired network interfaces, protocols, and standards. In one embodiment, network interface 1414 may include a wireless communication component (e.g., based on the IEEE 802.11 WiFi standard, the Bluetooth™ standard, the ZigBee™ standard, or other suitable short-range wireless communication standards), a wireless broadband component (e.g., based on WiMax technology), a mobile cellular component, a wireless satellite component, or other suitable wireless communication components. In one embodiment, network interface 1414 may be adapted to interact with wired networks via wired communication components, such as a power line modem (e.g., supporting the HomePlug™ standard), a digital subscriber line (DSL) modem, a public switched telephone network (PSTN) modem, an Ethernet interface, a cable modem, and/or other suitable components for wired communication. In one embodiment, network interface 1414 may be configured for proprietary wired communication protocols and interfaces, and/or for proprietary wireless communication protocols and interfaces based on radio frequency (RF), microwave frequency (MWF), infrared frequency (IRF), and/or other suitable wireless transmission technologies.
In some embodiments, network interface 1414 may also be adapted to support various interfaces, protocols, and/or standards for surveillance cameras and systems, such as the Open Network Video Interface Forum (ONVIF) standard and/or other standards. Thus, for example, imaging system 1400 may, via network interface 1414, control and/or receive images from other ONVIF-compliant external cameras, transmit captured and/or processed images to an ONVIF-compliant central monitoring station, and/or receive commands and configurations from an ONVIF-compliant computer.
As mentioned above, the various components of imaging system 1400 may be combined, duplicated, and/or omitted as desired for particular applications of imaging system 1400. In one example, processor 1406 may be combined with image sensor device 1402, memory 1408, display 1412, and/or network interface 1414. In another example, a portion of processor 1406 may be combined with or implemented on image sensor device 1402, with only certain functions of processor 1406 performed by circuitry (e.g., a processor, microprocessor, logic device, microcontroller, etc.) within image sensor device 1402. In another example, adjustable mount 1405 may be omitted in fixed-mount or handheld camera applications of imaging system 1400. In yet another example, network interface 1414 and remote device 1416 may be omitted in standalone camera applications of imaging system 1400.
FIG. 15 illustrates a process 1500 for suppressing the sky in images according to an embodiment of the disclosure. For example, all or part of process 1500 may be performed by one or more embodiments of imaging system 1400 of FIG. 14 to suppress the sky in a captured image (e.g., to reduce the available output dynamic range consumed by the sky in the image), so that a larger dynamic range may be preserved for ground objects and other objects of interest in the captured image. Although process 1500 may be described below with respect to imaging system 1400 as an example, it should be appreciated that other imaging systems or camera systems (e.g., including visible light cameras) may be adapted or configured to perform all or part of process 1500.
At block 1502, an image of a scene (e.g., a digital still image or a frame of digital video) may be captured using an image sensor device. For example, the scene, and thus the image captured of it, may include a portion of sky and a portion of ground. By way of example, image 1442 including sky region 1444 and ground region 1446 may be captured of scene 1440 using image sensor device 1402. In some embodiments, image 1442 may be a non-thermal image (e.g., a visible light image) of scene 1440 captured using image sensor device 1402 implemented with a non-thermal imaging module (e.g., a visible light camera having a CMOS- or CCD-based sensor).
In some embodiments, image 1442 may be a thermal image captured using image sensor device 1402 implemented, for example, with one or more embodiments of infrared imaging module 100 including infrared sensor assembly 128 (e.g., an FPA) described above. In this regard, in some embodiments, capturing a thermal image at block 1502 may involve performing NUC operations on the thermal image by obtaining NUC terms and applying them to the thermal image, as described above in connection with FIGS. 5-11 and elsewhere herein. In some embodiments, capturing a thermal image at block 1502 may involve using LDO 1220 to regulate the voltage supplied to the FPA and/or using bias circuit 1212 to bias infrared sensors 132 of the FPA, as described above in connection with bias circuit 1212 of FIGS. 12 and 13.
In various embodiments, for example, as is typical in digital imaging, captured image 1442 may be represented using a plurality of pixels, each of which has a pixel value that may represent (e.g., be associated with, but not necessarily proportional to) the radiation level (e.g., the power of incident radiation) associated with a corresponding location in the scene. Alternatively, in some embodiments, the radiation levels associated with the scene may be converted into pixel values by various conventional automatic gain control (AGC) methods, conventional dynamic range compression algorithms, conventional exposure control methods, or other conventional methods, so that the range of radiation levels associated with the scene is fitted to the range of available pixel values.
Thus, for embodiments in which captured image 1442 may be a thermal image of scene 1440, the pixel values may represent the radiation levels of incident infrared radiation from corresponding locations in scene 1440, which in turn may be related to the temperatures associated with those locations. For embodiments in which captured image 1442 may be a visible light image of scene 1440, the pixel values may represent the radiation levels (also referred to as intensity levels) of incident visible light from corresponding locations in scene 1440. For color visible light images, there may be three or more channels of pixel values (e.g., red, green, and blue channels, or other suitable color channels) according to one or more embodiments.
In some embodiments, such as embodiments of image sensor device 1402 implemented using an infrared imaging module and a non-thermal imaging module, both a thermal image and a non-thermal image may be captured at block 1502.
If desired for particular applications of these embodiments, block 1502 may also include superimposing, fusing, blending, or otherwise combining the thermal and non-thermal images to obtain the enhanced thermal images discussed above. These combined images may have enhanced resolution and/or contrast while retaining at least some thermal information (e.g., radiometric temperature information) from the captured thermal images. In these embodiments, where applicable, the operations of process 1500 may be performed on, or using, the combined images. That is, the sky region in a combined image may be suppressed to further enhance the combined image.
Alternatively, the captured thermal and non-thermal images may be combined after performing the operations of process 1500 to suppress the sky region. For example, blocks 1504-1512 of process 1500 may be performed on, or using, the captured thermal image and/or the captured non-thermal image to suppress the sky region in one or both of them, and the images may subsequently be combined to enhance resolution and/or contrast.
In some embodiments, the various operations of block 1502 may be omitted from process 1500. For example, if image 1442 is a previously captured and/or stored image of scene 1440 provided to imaging system 1400, the operations of block 1502 may involve receiving (e.g., via network interface 1414) that image 1442 rather than capturing it. Thus, some embodiments of process 1500 may be used to perform various image processing operations on already-captured images to suppress the sky in those images, without using image sensor device 1402 to capture images.
At block 1504, a region or extent (e.g., a group of pixels) corresponding to the sky in the captured image (e.g., including a previously captured, stored image) may be identified. For example, sky region 1444 in captured image 1442 may be identified. In various embodiments, the operations of block 1504 for identifying the sky region may include determining the location of the horizon (e.g., horizon 1448) in the image. Based on the horizon location, the captured image may be divided into a sky region (e.g., sky region 1444) above the horizon and a ground region (e.g., ground region 1446) at or below the horizon. In some embodiments, a margin (e.g., margin 1450) may be added above the horizon in identifying the sky region, so as to reduce misclassification of ground objects extending above the horizon as belonging to the sky. In some embodiments, margin 1450 may be adjusted adaptively (e.g., based on image analysis). In other embodiments, margin 1450 may be specified (e.g., specified by a user).
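As a non-authoritative illustration of the division described above, the sketch below splits an image into sky and ground pixel sets given a horizon row and a margin; the names are hypothetical.

```python
def split_sky_ground(image, horizon_row, margin_px=0):
    """Split a 2D image array into sky and ground regions at the horizon.

    A margin above the horizon (cf. margin 1450) shifts the sky
    boundary upward so that ground objects extending above the
    horizon are less likely to be misclassified as sky.
    """
    sky_limit = max(horizon_row - margin_px, 0)
    sky = image[:sky_limit, :]     # rows above the margin boundary
    ground = image[sky_limit:, :]  # rows at or below the boundary
    return sky, ground
```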
The location of the horizon in the captured image may be determined in various ways. In some embodiments, the horizon location in the image may be determined using orientation sensor 1410 (e.g., including a gyroscope, inclinometer, altimeter, or other sensor for determining orientation relative to the horizon or other reference points), which may be fixed relative to image sensor device 1402 and/or calibrated so as to detect and provide the relative position of the horizon within FOV 1404 of image sensor device 1402. For example, the horizon location in the image may be calculated or otherwise determined from one or more tilt angles of image sensor device 1402 relative to the ground plane of the earth, as detected by orientation sensor 1410.
In some embodiments, the horizon location in the captured image may be determined based on a baseline horizon position for particular applications of image sensor device 1402 or imaging system 1400. For example, in applications in which image sensor device 1402 and/or imaging system 1400 may be mounted or otherwise positioned at a certain location or mounting point (e.g., as a surveillance camera, vehicle-mounted camera, or ship-mounted camera), a baseline position of the horizon relative to the given mounting position may be indicated or otherwise provided (e.g., as pixel coordinates entered by a user).
In this regard, for example, for fixed-mount applications in which the camera angle and position remain substantially constant after installation, the indicated baseline position may be used as the location of the horizon in captured images. In another example application, image sensor device 1402 and/or imaging system 1400 may be mounted using adjustable mount 1405 (e.g., a pan and tilt base) having a position encoder, potentiometer, or other suitable component adapted to provide position feedback information. For example, this position feedback information may include the rotation, translation, and/or other orientation changes of image sensor device 1402 and/or imaging system 1400 relative to the mounting position. In such applications, the position feedback information may be used to calculate an offset, or otherwise adjust, for any change in the horizon location in the image relative to the baseline position.
Similarly, in another example, image sensor device 1402 may include an optical element 1403 (e.g., a zoom lens), which may be adjustable and may include a position sensor (e.g., a position encoder, potentiometer, and/or other suitable component) adapted to provide feedback information relating to adjustments of the optical element (e.g., the focal length set on the zoom lens). This feedback information may be used to adjust for any change in the horizon location in the image relative to the baseline position caused by changes in FOV 1404 provided by optical element 1403.
In some embodiments, the location of the horizon in the captured image may be determined by performing image analysis operations for detecting the horizon in the captured image. Various suitable algorithms may be used to detect the horizon in captured images according to one or more embodiments. For example, the image analysis operations may include edge detection algorithms and/or line detection algorithms, suitably adapted to detect one or more lines that may correspond to the horizon in the captured image (e.g., using the Hough transform or other suitable methods).
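A rough sketch of such an image-analysis approach, not taken from this disclosure, is given below using edge detection followed by a probabilistic Hough transform via OpenCV; the thresholds and angle tolerance are illustrative assumptions.

```python
import cv2
import numpy as np

def detect_horizon_row(gray_image):
    """Detect a candidate horizon as the longest near-horizontal line."""
    edges = cv2.Canny(gray_image, 50, 150)  # illustrative thresholds
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=100,
                            minLineLength=gray_image.shape[1] // 3,
                            maxLineGap=20)
    if lines is None:
        return None
    # Keep lines within about 5 degrees of horizontal.
    def near_horizontal(line):
        x1, y1, x2, y2 = line[0]
        return abs(np.degrees(np.arctan2(y2 - y1, x2 - x1))) < 5
    candidates = [l for l in lines if near_horizontal(l)]
    if not candidates:
        return None
    # Take the longest candidate as the horizon estimate.
    x1, y1, x2, y2 = max(candidates,
                         key=lambda l: np.hypot(l[0][2] - l[0][0],
                                                l[0][3] - l[0][1]))[0]
    return int((y1 + y2) / 2)
```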
At block 1506, the distribution of radiation levels in the identified sky region (e.g., sky region 1444) may be analyzed. For example, the statistical properties of the pixel values (e.g., values representing radiation levels) associated with the sky region of the image may be analyzed. In a particular example according to some embodiments, a histogram of these pixel values may be obtained at block 1506. In some embodiments, such a histogram may be obtained by collecting and aggregating over multiple image frames, so that a more accurate histogram of the radiation levels associated with the sky region may be constructed. For example, in applications where a large margin of error may exist in identifying the sky region (e.g., due to variability in the horizon location, such as for vehicle-mounted or ship-mounted cameras), the number of samples (image frames) to be aggregated may be increased accordingly. According to some embodiments, other statistical properties, such as the mean, median, standard deviation, and/or other properties, may additionally or alternatively be collected.
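One possible implementation of the multi-frame aggregation described above, offered here only as an assumption of how it might look in practice, accumulates per-frame histograms of the sky-region pixel values; a 16-bit value range is assumed for thermal data.

```python
import numpy as np

def accumulate_sky_histogram(frames, horizon_rows, margin_px=0,
                             bins=256, value_range=(0, 65535)):
    """Accumulate a histogram of sky-region pixel values over frames."""
    hist = np.zeros(bins, dtype=np.int64)
    for frame, h_row in zip(frames, horizon_rows):
        sky = frame[:max(h_row - margin_px, 0), :]  # sky rows only
        counts, _ = np.histogram(sky, bins=bins, range=value_range)
        hist += counts
    return hist
```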
At block 1508, the dynamic range belonging to the sky region may be determined based on the analysis of the radiation level distribution. In various embodiments, the dynamic range belonging to the sky region may be a range of radiation levels that is dominant in, or otherwise representative of, the sky region, where this range may be identified by the various operations of block 1508. The various operations of block 1508 may be better understood with reference to FIGS. 16A-16C, which illustrate example histograms of infrared radiation levels (e.g., represented by pixel values) in sky regions according to various embodiments of the disclosure. For example, the example histograms of FIGS. 16A-16C may be obtained according to one or more embodiments of the distribution analysis operations of block 1506. Although block 1508 may be described below in connection with infrared radiation levels in thermal images, it should be appreciated that the various operations of block 1508 may be applied to radiation levels, luminance levels, or other attributes related to the intensity of pixels in visible light images.
In the example histogram of FIG. 16A, a range 1602A of infrared radiation levels that are dominant (e.g., frequently observed) in the sky region may be identified according to various embodiments. In various embodiments, range 1602A may be determined at block 1508 using various statistical criteria or statistical grouping techniques. For example, in some embodiments, a range of radiation levels whose frequency of occurrence is above an absolute threshold or a relative threshold (e.g., a percentage of some reference value) may be selected as range 1602A. In some embodiments, the distribution of radiation levels in the ground region may also be analyzed to determine range 1602A. For example, in some embodiments, the absolute or relative threshold for selecting range 1602A may be based at least in part on the frequencies, the mean of the frequencies, the mean radiation level, and/or other statistical properties of the radiation level distribution in the ground region.
Range 1602A in the example histogram of FIG. 16A corresponds to lower infrared radiation levels, which may indicate a sky region much colder than the ground region. Although the sky is much colder than the ground in many cases, there may also be situations in which the sky is much warmer than the ground. For example, an overcast sky in a winter climate may be warmer than the ground. FIG. 16B shows an example histogram obtained under conditions in which the sky may be warmer than the ground. In this example histogram, the radiation levels dominant in the sky region (e.g., range 1602B determined by one or more of the operations of block 1508 described above) may correspond to higher infrared radiation levels. In this regard, for some embodiments, it may also be determined at block 1508 whether the radiation levels in the sky region are substantially lower or substantially higher than the radiation levels in the ground region (e.g., whether the sky region is substantially colder or warmer in embodiments using thermal images). For example, in one embodiment, range 1602A or range 1602B may be compared against the distribution of radiation levels in the ground region, and based on this comparison, a cold sky condition (or, for visible light images, a night sky condition) or a warm sky condition (or, for visible light images, a bright sky condition) may be indicated. In some embodiments, the indication of a cold (dark) or warm (bright) sky condition may be used to generate a suitable transformation function, as further described herein.
In some cases, more than one range of radiation levels may be identified using the various operations of block 1508 described above for various embodiments. In the example histogram of FIG. 16C, two separate ranges 1602C and 1604C may be identified using the various operations of block 1508 described above. For example, the radiation levels in range 1602C may belong to a clear portion of the sky, while the radiation levels in range 1604C may belong to clouds, the sun, and/or other objects in the sky. Thus, in some embodiments, the operations of block 1508 may include selecting only one or some of the identified ranges as the dynamic range belonging to the sky region.
More specifically, in some embodiments, only the lowest identified range (e.g., in a cold sky condition) or the highest identified range (e.g., in a warm sky condition) may be selected as the dynamic range belonging to the sky region. In the example histogram of FIG. 16C, range 1602C is the lowest identified range (e.g., the range covering lower radiation levels), and thus range 1602C, rather than range 1604C, may be selected as the dynamic range belonging to the sky region according to some embodiments. In other embodiments, one or more of the identified ranges may alternatively or additionally be selected based on other criteria, such as the total frequency count of the identified ranges (e.g., the range with the highest total frequency may be selected), the corresponding frequencies of the radiation levels in the ground region (e.g., the range with the lowest total frequency of occurrence in the ground region may be selected), and/or other suitable criteria.
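As one hedged reading of the selection criteria above, the sketch below thresholds the histogram at a fraction of its peak, groups the surviving bins into contiguous ranges, and keeps the lowest range for a cold sky or the highest for a warm sky; the threshold fraction is an illustrative assumption.

```python
def dominant_sky_range(hist, cold_sky=True, rel_threshold=0.25):
    """Pick the dominant bin range from a sky-region histogram.

    Bins whose count exceeds rel_threshold * max(hist) are grouped
    into contiguous ranges (cf. ranges 1602C and 1604C); the lowest
    range is kept for a cold sky, the highest for a warm sky.
    """
    peak = max(hist)
    above = [count > rel_threshold * peak for count in hist]
    ranges, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            ranges.append((start, i - 1))
            start = None
    if start is not None:
        ranges.append((start, len(hist) - 1))
    if not ranges:
        return None
    return ranges[0] if cold_sky else ranges[-1]
```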
At block 1510, a transformation function may be generated that compresses (e.g., maps to a narrower range) the dynamic range belonging to the sky region. As understood by those skilled in the art, a transformation function may be used to map input signals or levels to desired output signals or levels. For example, a transformation function may be used to map input pixel values (e.g., in a captured image) to output pixel values to generate a desired output image.
In various embodiments, the transformation function that may be generated at block 1510 may be adapted to map the dynamic range belonging to the sky region in the captured image to a narrower range in the output image. In some embodiments, the generated transformation function may be a nonlinear function adapted to map the dynamic range belonging to the sky region to a narrower output dynamic range. In one embodiment, the nonlinear transformation function may be a gamma curve (also referred to as a gamma transformation function) with an exponent gamma (γ). For example, the gamma curve may be expressed as a power function:
Output = Input^γ (Equation 1),
where the exponent gamma (γ) may be selected such that the dynamic range belonging to the sky region is mapped to a narrower output dynamic range. As may be appreciated, gamma curves with the same or similar characteristics may be expressed using equations other than Equation 1. It may also be appreciated from this disclosure that various other nonlinear functions may be suitable as transformation functions. Non-limiting examples of such nonlinear functions include sinusoidal functions (e.g., sine or cosine functions), nonlinear polynomial functions, power functions, or other suitable nonlinear functions. In other embodiments, the generated transformation function may be a piecewise linear function having two or more segments and adapted to map the dynamic range belonging to the sky region to a narrower output dynamic range.
The generation of the transformation function at block 1510 may be better understood with reference to FIGS. 17A-19, which illustrate examples of various transformation functions that may be generated according to various embodiments of the disclosure. In FIGS. 17A-19, the input levels and output levels are both normalized to 1 and assigned to the x-axis and y-axis, respectively. FIG. 17A shows an example gamma curve 1700A according to an embodiment of the disclosure, generated as a transformation function for compressing the dynamic range belonging to the sky region. In this example, input range 1704 may correspond to the dynamic range belonging to the sky region in the captured image, and input range 1706 may correspond to the dynamic range belonging to the ground region and/or other objects in the captured image. FIG. 17A shows an example input range 1704 covering lower input levels, which may indicate that the sky region may be colder (or, if the captured image is a visible light image, darker) than the ground region, as discussed above.
In various embodiments, gamma curve 1700A may be generated with an exponent gamma such that input range 1704 is mapped to an output range 1708A narrower than input range 1704. For example, in some embodiments, the exponent gamma may be selected such that input range 1704 is mapped to at most a certain predetermined percentage of the output dynamic range (e.g., approximately 20% in FIG. 17A). In some embodiments, the exponent gamma may be selected such that the ratio between output range 1708A and input range 1704 does not exceed a certain ratio (e.g., a predetermined slope represented by line 1712). In these embodiments, a suitable exponent gamma may be obtained (e.g., calculated or otherwise derived) from the selected mapping according to one embodiment. For example, if gamma curve 1700A is to be generated for a cold (dark) sky condition (e.g., as in the example shown in FIG. 17A), an exponent gamma greater than 1 may be obtained.
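As a worked example, consistent with but not quoted from this disclosure: if the sky occupies normalized input levels [0, u] and is to be mapped into at most a fraction p of the output range, Equation 1 gives u^γ = p, so γ = ln(p)/ln(u). With u = 0.6 and p = 0.2, as in the example of FIG. 17A, γ = ln 0.2 / ln 0.6 ≈ 3.15, which is indeed greater than 1.

```python
import numpy as np

def gamma_for_cold_sky(sky_upper, out_fraction=0.2):
    """Exponent gamma mapping input range [0, sky_upper] into at most
    out_fraction of the normalized output range (cold sky case)."""
    return np.log(out_fraction) / np.log(sky_upper)

gamma = gamma_for_cold_sky(0.6)  # ~3.15, greater than 1 as expected
```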
In FIG. 17A, for example, gamma curve 1700A generated according to one or more embodiments may map input range 1704, occupying approximately 60% of the input dynamic range, to output range 1708A, occupying approximately 20% of the output dynamic range. Correspondingly, input range 1706, occupying only approximately 40% of the input dynamic range, may be expanded to a range 1710A that may span approximately 80% of the output dynamic range. Thus, for example, gamma curve 1700A may advantageously expand the output dynamic range available for ground objects and other objects of interest, so that ground objects or other objects of interest may be more easily distinguished and identified by users viewing the output image generated by applying gamma curve 1700A to the captured image.
FIG. 17B shows an example piecewise linear function 1700B according to another embodiment of the disclosure, generated as a transformation function for compressing the dynamic range belonging to the sky region. In various embodiments, piecewise linear function 1700B may include multiple segments. For example, piecewise linear function 1700B may include segments 1701 and 1702 as shown in FIG. 17B, where segment 1701 may be adapted to map input range 1704 (e.g., the dynamic range belonging to the sky region) to a narrower output range 1708B, and segment 1702 may be adapted to map input range 1706 (e.g., the dynamic range belonging to the ground region and/or other objects in the captured image) to an expanded output range 1710B.
As shown, according to various embodiments, segment 1701 (e.g., the segment mapping the dynamic range belonging to the sky region in the captured image) may be generated to suppress the dynamic range belonging to the sky region, and segment 1702 may be generated to stretch the dynamic range belonging to the ground region and/or other objects in the captured image. For example, in some embodiments, the slope of segment 1701 may be selected such that input range 1704 is mapped to at most a certain predetermined percentage of the output dynamic range (e.g., approximately 20% in FIG. 17B). In some embodiments, the slope of segment 1701 may be selected so as not to exceed a predetermined slope (e.g., the slope of line 1712). Thus, for example, piecewise linear function 1700B generated according to one or more embodiments may suppress input range 1704 (spanning approximately 60% of the input dynamic range) to output range 1708B occupying only approximately 20% of the output dynamic range, while stretching input range 1706 (spanning approximately 40% of the input dynamic range) to output range 1710B that may occupy approximately 80% of the output dynamic range, as shown in the example in FIG. 17B.
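A minimal sketch of such a two-segment function, using the breakpoints of FIG. 17B's example (about 60% of the input compressed into about 20% of the output) as illustrative defaults, might look as follows; the names are hypothetical.

```python
import numpy as np

def piecewise_two_segment(x, sky_upper=0.6, sky_out=0.2):
    """Two-segment piecewise linear transform for a cold (low-level) sky.

    Inputs in [0, sky_upper] are compressed into [0, sky_out]; the
    remaining inputs are stretched over the rest of the output range.
    x is assumed normalized to [0, 1].
    """
    slope_sky = sky_out / sky_upper                     # < 1: compress sky
    slope_ground = (1.0 - sky_out) / (1.0 - sky_upper)  # > 1: stretch ground
    return np.where(x <= sky_upper,
                    slope_sky * x,
                    sky_out + slope_ground * (x - sky_upper))
```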
FIG. 18A shows another example gamma curve 1800A according to another embodiment of the disclosure, generated as a transformation function for suppressing the dynamic range belonging to the sky region. In this example, input range 1804 (covering higher input levels) may correspond to the dynamic range belonging to the sky region in the captured image, and input range 1806 may correspond to the dynamic range belonging to the ground region and/or other objects in the captured image. Input ranges 1804 and 1806 may thus correspond to a warm sky condition (or, for visible light images, a bright sky condition). Accordingly, gamma curve 1800A in various embodiments may be generated to compress input range 1804 belonging to a sky region warmer than the ground, whereas gamma curve 1700A of FIG. 17A described above may be generated to compress input range 1704 belonging to a sky region colder than the ground.
In various embodiments, gamma curve 1800A may be generated with an exponent gamma such that input range 1804 is mapped to an output range 1808A narrower than input range 1804, in a manner similar to the generation of gamma curve 1700A. However, because gamma curve 1800A may be generated to suppress a sky region warmer than the ground, the exponent gamma may be less than 1, so that the shape of gamma curve 1800A may be suited to suppressing a higher input range (e.g., input range 1804). In some embodiments, the exponent gamma may be selected such that input range 1804 is mapped to at most a certain predetermined percentage of the output dynamic range (e.g., approximately 20% in FIG. 18A). In some embodiments, the exponent gamma may be selected such that the ratio between output range 1808A and input range 1804 does not exceed a certain ratio (e.g., a predetermined slope represented by line 1812). In the example shown in FIG. 18A, gamma curve 1800A generated according to one or more embodiments may map input range 1804 (spanning approximately 60% of the input dynamic range) to output range 1808A occupying only approximately 20% of the output dynamic range, while stretching input range 1806 (spanning approximately 40% of the input dynamic range) to output range 1810A that may occupy approximately 80% of the output dynamic range.
FIG. 18B shows another example piecewise linear function 1800B according to another embodiment of the disclosure, generated as a transformation function for compressing the dynamic range belonging to the sky region. In various embodiments, piecewise linear function 1800B may be adapted to compress the dynamic range belonging to the sky region in a warm sky condition, but may be generated in a manner similar to piecewise linear function 1700B of FIG. 17B. For example, piecewise linear function 1800B generated according to one or more embodiments may include a segment 1801 that may suppress input range 1804 (e.g., spanning approximately 60% of the input dynamic range) to output range 1808B (e.g., occupying approximately 20% of the output dynamic range), and a segment 1802 that may stretch input range 1806 (e.g., spanning approximately 40% of the input dynamic range) to output range 1810B (e.g., expanded to approximately 80% of the output dynamic range), as shown in FIG. 18B. In various embodiments, to compress input range 1804, the slope of segment 1801 may be limited to at most a predetermined slope (e.g., the slope of line 1812) and/or limited so as to map input range 1804 to at most a certain predetermined percentage of the output dynamic range (e.g., approximately 20% in FIG. 18B).
FIG. 19 shows another example piecewise linear function 1900 according to another embodiment of the disclosure, generated as a transformation function for compressing the dynamic range belonging to the sky region. In various embodiments, piecewise linear function 1900 may include segments 1901, 1902, and 1903, where segment 1901 may be adapted to map input range 1904 (e.g., the dynamic range belonging to the sky region) to a narrower output range 1908. In various embodiments, segments 1902 and 1903 may be adapted to map input ranges 1906 and 1907 (which together may correspond to the dynamic range belonging to the ground region and/or other objects in the captured image) to expanded output ranges 1910 and 1911, respectively.
That is, for example, the illustrated piecewise linear function 1900 may be adapted for cases in which two separate input ranges (e.g., input ranges 1906 and 1907) may together correspond to the dynamic range belonging to the ground region, while compressing the dynamic range belonging to the sky region (e.g., input range 1904), but may be generated in a manner similar to piecewise linear functions 1700B and 1800B. Thus, according to various embodiments, to compress input range 1904, the slope of segment 1901 may be limited to at most a predetermined slope (e.g., the slope of line 1912) and/or limited so as to map input range 1904 to at most a certain predetermined percentage of the output dynamic range (e.g., approximately 20% in FIG. 19). In this regard, although piecewise linear functions 1700B, 1800B, and 1900 having two or three segments have been described above, piecewise linear functions 1700B, 1800B, or 1900 may be suitably modified to include any number of segments, so as to accommodate various other patterns of input range distributions within the scope and spirit of the disclosure.
Turning now to block 1512 of process 1500, the generated transformation function may be applied to the captured image to produce a processed (e.g., sky-suppressed) image. For example, in various embodiments, the pixel values of the pixels in the captured image may be mapped to, or otherwise converted into, new pixel values in the processed image according to the various embodiments of the transformation function generated at block 1510. As discussed, applying the generated transformation function to the captured image may advantageously reduce the available output dynamic range occupied by the sky in the image, so that more dynamic range may be made available for ground objects or other objects of interest in the captured image.
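One plausible way to apply the generated transformation, offered as an assumption rather than the disclosure's own implementation, is to precompute a lookup table over all possible pixel values and index the image through it:

```python
import numpy as np

def apply_transform(image, transform, bit_depth=16):
    """Apply a normalized transform to an integer image via a LUT.

    The image is assumed to hold unsigned integers below 2**bit_depth.
    """
    levels = 2 ** bit_depth
    x = np.linspace(0.0, 1.0, levels)  # normalized input levels
    lut = np.round(transform(x) * (levels - 1)).astype(image.dtype)
    return lut[image]  # map every pixel value through the table

# Example: suppress a cold sky with the gamma curve of Equation 1.
# processed = apply_transform(thermal_image, lambda x: x ** 3.15)
```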
FIGS. 20A-20B show examples of applying a transformation function of one or more embodiments to a captured image, according to embodiments of the disclosure. More specifically, FIG. 20A is an example screenshot of a captured thermal image, and FIG. 20B is an example screenshot of a processed (e.g., sky-suppressed) thermal image, converted or otherwise obtained from the captured image by applying a transformation function generated according to one or more embodiments. In the captured image of FIG. 20A, it may be difficult to identify or distinguish an object of interest 2002 (e.g., a person), for example because the contrast between object of interest 2002 and other ground objects or the ground is insufficient. In the processed image of FIG. 20B, the dynamic range occupied by the sky region may be reduced, and the dynamic range available for the ground and ground objects (e.g., including object of interest 2002) may be correspondingly expanded, which may allow object of interest 2002 to be more easily distinguished or identified.
Thus, various embodiments of the methods and systems disclosed herein may advantageously suppress the sky in images (e.g., reduce the available output dynamic range occupied by the sky in images) to preserve more dynamic range for ground objects or other objects of interest in images, for example by: identifying the sky region in an image, analyzing the radiation distribution in the sky region, determining the dynamic range belonging to the sky region, and generating and applying a transformation function that compresses the dynamic range belonging to the sky region in the image. Thus, for example, the various methods and systems disclosed herein may be included in, or implemented as, various devices and systems that capture and/or process still and/or video images (e.g., including thermal images) to advantageously allow users to more easily identify or recognize ground objects in images.
Where applicable, the various embodiments provided by the disclosure may be implemented using hardware, software, or combinations of hardware and software. Also where applicable, the various hardware components and/or software components set forth herein may be combined into composite components comprising software, hardware, and/or both without departing from the spirit of the disclosure. Where applicable, the various hardware components and/or software components set forth herein may be separated into sub-components comprising software, hardware, or both without departing from the spirit of the disclosure. In addition, where applicable, it is contemplated that software components may be implemented as hardware components, and vice versa.
Software in accordance with the disclosure, such as non-transitory instructions, program code, and/or data, may be stored on one or more non-transitory machine-readable media. It is also contemplated that software identified herein may be implemented using one or more general-purpose or special-purpose computers and/or computer systems, networked and/or otherwise. Where applicable, the ordering of various steps described herein may be changed, combined into composite steps, and/or separated into sub-steps to provide the functionality described herein.
The embodiments described above illustrate but do not limit the invention. It should also be understood that numerous modifications and variations are possible in accordance with the principles of the invention. Accordingly, the scope of the invention is defined only by the following claims.

Claims (20)

1. A system, comprising:
a memory adapted to store an image of a scene comprising a sky region and a ground region; and
a processor adapted to communicate with the memory, the processor adapted to:
identify the sky region in the image,
analyze a distribution of pixel levels associated with the sky region in the image,
determine a dynamic range belonging to the sky region based on the distribution of pixel levels,
generate a transformation function that compresses the dynamic range belonging to the sky region, and
apply the transformation function to the image.
2. The system of claim 1, wherein:
the transformation function is a nonlinear gamma curve; and
the processor is adapted to determine an exponent of the gamma curve such that the gamma curve compresses the dynamic range belonging to the sky region.
3. The system of claim 1, wherein:
the transformation function is a piecewise linear function having a plurality of linear segments; and
the processor is adapted to determine slopes of the linear segments such that the piecewise linear function compresses the dynamic range belonging to the sky region.
4. The system of claim 1, wherein the processor is adapted to:
determine a location of a horizon in the image; and
identify the sky region in the image based on the location of the horizon.
5. The system of claim 4, wherein the processor is adapted to analyze the image to detect the location of the horizon.
6. The system of claim 1, further comprising an image sensor device adapted to capture the image of the scene and provide the captured image to the memory and/or the processor.
7. The system of claim 6, further comprising an orientation sensor adapted to detect an orientation of the image sensor device relative to a horizon, wherein the processor is adapted to:
determine a location of the horizon in the image based on the orientation; and
identify the sky region in the image based on the location of the horizon.
8. The system of claim 6, wherein:
the image is a thermal image of the scene; and
the image sensor device is a thermal imager comprising a focal plane array (FPA) of infrared sensors and adapted to capture the thermal image.
9. The system of claim 8, wherein the FPA comprises:
a low-dropout regulator (LDO) integrated with the FPA and adapted to provide a regulated voltage in response to an external supply voltage; and
a bias circuit adapted to provide a bias voltage to the infrared sensors in response to the regulated voltage from the LDO.
10. The system of claim 8, wherein:
the thermal imager is adapted to capture an intentionally blurred image of the scene, the blurred image comprising noise introduced by the thermal imager; and
the processor is further adapted to:
determine a plurality of non-uniformity correction (NUC) terms using the blurred image to reduce at least a portion of the noise, and
apply the NUC terms to the thermal image.
11. A method, comprising:
receiving an image of a scene comprising a sky region and a ground region;
identifying the sky region in the image;
analyzing a distribution of pixel levels associated with the sky region in the image;
determining a dynamic range belonging to the sky region based on the distribution of pixel levels;
generating a transformation function that compresses the dynamic range belonging to the sky region; and
applying the transformation function to the image.
12. The method of claim 11, wherein:
the transformation function is a nonlinear gamma curve; and
generating the transformation function comprises determining an exponent of the gamma curve such that the gamma curve compresses the dynamic range belonging to the sky region.
13. The method of claim 11, wherein:
the transformation function is a piecewise linear function having a plurality of linear segments; and
generating the transformation function comprises determining slopes of the linear segments such that the piecewise linear function compresses the dynamic range belonging to the sky region.
14. The method of claim 11, further comprising determining a location of a horizon in the image, wherein the sky region in the image is identified based on the location of the horizon.
15. The method of claim 14, wherein determining the location of the horizon comprises analyzing the image to detect the horizon.
16. The method of claim 11, further comprising capturing the image of the scene using an image sensor device.
17. The method of claim 16, further comprising:
detecting an orientation of the image sensor device relative to a horizon using an orientation sensor fixed relative to the image sensor device; and
determining a location of the horizon in the image based on the orientation, wherein the sky region in the image is identified based on the location of the horizon.
18. The method of claim 16, wherein:
the image sensor device is a thermal imager comprising a focal plane array (FPA) of infrared sensors; and
capturing the image comprises capturing a thermal image of the scene using the thermal imager.
19. The method of claim 18, further comprising:
receiving an external supply voltage at the FPA;
providing a regulated voltage in response to the external supply voltage using a low-dropout regulator (LDO) integrated with the FPA; and
providing a bias voltage from a bias circuit of the FPA to the infrared sensors of the FPA, the bias voltage provided in response to the regulated voltage from the LDO.
20. The method of claim 18, further comprising:
capturing an intentionally blurred image of the scene, the blurred image comprising noise introduced by the thermal imager;
determining a plurality of non-uniformity correction (NUC) terms using the blurred image to reduce at least a portion of the noise; and
applying the NUC terms to the thermal image.
CN201380073551.4A 2012-12-21 2013-12-20 Systems and methods of suppressing sky regions in images Active CN105009169B (en)

Applications Claiming Priority (19)

Application Number Priority Date Filing Date Title
US201261745440P 2012-12-21 2012-12-21
US61/745,440 2012-12-21
US201261746074P 2012-12-26 2012-12-26
US201261746069P 2012-12-26 2012-12-26
US61/746,069 2012-12-26
US61/746,074 2012-12-26
US201261748018P 2012-12-31 2012-12-31
US61/748,018 2012-12-31
US201361792582P 2013-03-15 2013-03-15
US201361793952P 2013-03-15 2013-03-15
US61/793,952 2013-03-15
US61/792,582 2013-03-15
US14/099,818 US9723227B2 (en) 2011-06-10 2013-12-06 Non-uniformity correction techniques for infrared imaging devices
US14/099,818 2013-12-06
US14/101,245 2013-12-09
US14/101,245 US9706139B2 (en) 2011-06-10 2013-12-09 Low power and small form factor infrared imaging
US14/101,258 2013-12-09
US14/101,258 US9723228B2 (en) 2011-06-10 2013-12-09 Infrared camera system architectures
PCT/US2013/077265 WO2014100741A2 (en) 2012-12-21 2013-12-20 Systems and methods of suppressing sky regions in images

Publications (2)

Publication Number Publication Date
CN105009169A true CN105009169A (en) 2015-10-28
CN105009169B CN105009169B (en) 2018-03-09

Family

ID=50979413

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201380073551.4A Active CN105009169B (en) 2012-12-21 2013-12-20 System and method for suppressing the sky areas in image

Country Status (2)

Country Link
CN (1) CN105009169B (en)
WO (1) WO2014100741A2 (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6028309A (en) 1997-02-11 2000-02-22 Indigo Systems Corporation Methods and circuitry for correcting temperature-induced errors in microbolometer focal plane array
US7034301B2 (en) 2002-02-27 2006-04-25 Indigo Systems Corporation Microbolometer focal plane array systems and methods
US6812465B2 (en) 2002-02-27 2004-11-02 Indigo Systems Corporation Microbolometer focal plane array methods and circuitry
US7470902B1 (en) 2006-03-20 2008-12-30 Flir Systems, Inc. Infrared camera electronic architectures
US7470904B1 (en) 2006-03-20 2008-12-30 Flir Systems, Inc. Infrared camera packaging
US7679048B1 (en) 2008-04-18 2010-03-16 Flir Systems, Inc. Systems and methods for selecting microbolometers within microbolometer focal plane arrays
WO2018227003A1 (en) 2017-06-08 2018-12-13 Superior Energy Services, Llc Deep set safety valve

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060078215A1 (en) * 2004-10-12 2006-04-13 Eastman Kodak Company Image processing based on direction of gravity
EP1939809A1 (en) * 2005-09-07 2008-07-02 Pioneer Corporation Scene analysis device and method
US20110279673A1 (en) * 2007-11-28 2011-11-17 Flir Systems, Inc. Maritime controls systems and methods
US20120213411A1 (en) * 2009-11-05 2012-08-23 Nec Corporation Image target identification device, image target identification method, and image target identification program
WO2012170949A2 (en) * 2011-06-10 2012-12-13 Flir Systems, Inc. Non-uniformity correction techniques for infrared imaging devices

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Francesco Branchitta et al., "Dynamic range compression and contrast enhancement in IR imaging systems," Proc. SPIE 6737, Electro-Optical and Infrared Systems: Technology and Applications IV *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109387828A (en) * 2017-08-11 2019-02-26 波音公司 Automatic detection and avoidance system
CN109387828B (en) * 2017-08-11 2024-02-23 波音公司 Automatic detection and avoidance system
WO2020154911A1 (en) * 2019-01-29 2020-08-06 SZ DJI Technology Co., Ltd. Sky determination in environment detection for mobile platforms, and associated systems and methods
CN111447355A (en) * 2019-10-23 2020-07-24 泰州市赛得机电设备有限公司 Automatic white balance value big data adjusting system
CN111738946A (en) * 2020-06-16 2020-10-02 新疆大学 Method and device for enhancing sand-dust degraded image
CN111738946B (en) * 2020-06-16 2022-04-08 新疆大学 Method and device for enhancing sand-dust degraded image
CN115660944A (en) * 2022-10-27 2023-01-31 深圳市大头兄弟科技有限公司 Dynamic method, device and equipment for static picture and storage medium
CN115660944B (en) * 2022-10-27 2023-06-30 深圳市闪剪智能科技有限公司 Method, device, equipment and storage medium for dynamic state of static picture

Also Published As

Publication number Publication date
WO2014100741A2 (en) 2014-06-26
CN105009169B (en) 2018-03-09
WO2014100741A3 (en) 2014-09-04

Similar Documents

Publication Publication Date Title
US9819880B2 (en) Systems and methods of suppressing sky regions in images
US20220343598A1 (en) System and methods for improved aerial mapping with aerial vehicles
US10110838B2 (en) Multifunctional sky camera system for total sky imaging and spectral radiance measurement
US11445131B2 (en) Imager with array of multiple infrared imaging modules
US20180198960A1 (en) Device attachment with infrared imaging sensor
CN205157051U (en) Infrared sensor package
CN105191288B Anomalous pixel detection
US10169666B2 (en) Image-assisted remote control vehicle systems and methods
CN104782116B Row and column noise reduction in thermal images
US11100618B2 (en) Systems and methods for reducing low-frequency non-uniformity in images
CN104995909A (en) Time spaced infrared image enhancement
CN103748867A (en) Low power consumption and small form factor infrared imaging
CN105027557A (en) Techniques to compensate for calibration drifts in infrared imaging devices
US20150332441A1 (en) Selective image correction for infrared imaging devices
CN105009169A (en) Systems and methods of suppressing sky regions in images
US20160074724A1 (en) Thermal-assisted golf rangefinder systems and methods
CN105765967A (en) Using second camera to adjust settings of first camera
CN104519328A (en) Image processing device, image capturing apparatus, and image processing method
CN110869744B (en) Information processing apparatus, information processing method, program, and information processing system
CN205080731U System for a remote control vehicle
JP7074126B2 (en) Image processing equipment, growth survey image creation system and program
CN205157061U (en) Infrared sensor module and infrared imaging equipment
US9538076B2 (en) Image processing devices for suppressing color fringe, and image sensor modules and electronic devices including the same
CN204991709U System with an infrared imager having an integrated metal layer
CN104981905A (en) Abnormal clock rate detection in imaging sensor arrays

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant