US20150097975A1 - Integrating image frames - Google Patents
- Publication number
- US20150097975A1 (application US14/048,898)
- Authority
- US
- United States
- Prior art keywords
- image
- motion
- pixel
- image frames
- frame
- Prior art date
- Legal status
- Granted
Classifications
- G06T5/50 — Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G06T5/70 — Denoising; Smoothing
- G06T5/73 — Deblurring; Sharpening
- H04N23/66 — Remote control of cameras or camera parts, e.g. by remote control devices
- H04N23/81 — Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
- H04N23/6811 — Motion detection based on the image signal
- H04N23/6845 — Vibration or motion blur correction performed by controlling the image sensor readout, e.g. by combination of a plurality of images sequentially taken
- G06T2207/20201 — Motion blur correction (indexing scheme for image analysis or image enhancement)
- Legacy codes: H04N5/217; G06T5/003
Definitions
- One approach to integrating image frames is a method that can be executed by one or more processors.
- the method includes receiving, for each pixel in at least one image frame of a plurality of image frames, a value representative of a sensor measurement.
- the method also includes calculating an average difference of the value representative of the sensor measurement over a subset of the plurality of image frames.
- the method further includes detecting motion in at least one pixel of the image frame in the plurality of image frames based on the calculated average difference of the value representative of the sensor measurement over the subset of the plurality of image frames.
- the detected motion is of an object captured in the plurality of image frames.
- the method includes generating an integrated image frame wherein each pixel having detected motion is integrated by an amount less than that of those pixels for which motion is not detected.
- the amount of frame integration is based on contrast levels, expected rates of motion, and noise characteristics of sensor input image data.
- calculating the average difference includes normalizing a difference of the value representative of the sensor measurement of a pixel of the plurality of image frames by performing a mean absolute deviation calculation of at least one pixel between at least two image frames of the plurality of image frames.
- the method can include detecting motion by detecting initial motion in the at least one pixel of the image frame in the plurality of image frames if the difference is greater than a scale factor times the average difference and the average difference is greater than a noise threshold.
- the scale factor can be based on a statistical dispersion of data resulting from normalizing the difference of the value representative of the sensor measurement of a pixel of the plurality of image frames and sensor noise, registration accuracy, and changes in the image from frame-to-frame such as rotation, scale and perspective.
- the noise threshold can be based on measured image noise and a type of sensor providing the sensor measurement.
- the method can further include assigning a persistence count for each of the at least one pixel of the image frame for which initial motion is detected.
- the persistence count can be based on a typical vehicle size and speed, and a range of an image sensor to a target.
- the method can also include reducing false detections of motion by applying a noise filter to each of the at least one pixel of the image frame for which motion is detected. Additionally, the method can include applying a morphological spreading filter to those pixels for which motion is still detected.
- the method can also include thresholding the output of the morphological spreading filter, where the thresholding provides a final identification of motion in the at least one pixel of the image frame in the plurality of image frames.
- the system includes a memory and one or more processors.
- the system also includes an image integration processor.
- the image integration processor using the one or more processors, is configured to receive for each pixel in at least one image frame a value representative of a sensor measurement.
- the image integration processor using the one or more processors, is configured to calculate an average difference of the value representative of the sensor measurement over a subset of the plurality of image frames.
- the image integration processor using the one or more processors, is also configured to detect motion in at least one pixel of the image frame in the plurality of image frames based on the calculated average difference of the value representative of the sensor measurement over the subset of the plurality of image frames. The detected motion is of an object captured in the plurality of image frames.
- the image integration processor using the one or more processors, is configured to generate an integrated image frame wherein each pixel having detected motion is integrated by an amount less than that of those pixels for which motion is not detected, wherein the amount of frame integration is based on contrast levels, expected rates of motion, and noise characteristics of sensor input image data.
- the image integration module using the one or more processors, can be further configured to calculate the average difference by normalizing a difference of the value representative of the sensor measurement of a pixel of the plurality of image frames by performing a mean absolute deviation calculation of at least one pixel between at least two image frames of the plurality of image frames.
- the image integration module using the one or more processors, can be further configured to detect initial motion in the at least one pixel of the image frame in the plurality of image frames if the difference is greater than a scale factor times the average difference and the average difference is greater than a noise threshold.
- the scale factor can be based on a statistical dispersion of data resulting from normalizing the difference of the value representative of the sensor measurement of a pixel of the plurality of image frames and sensor noise, registration accuracy, and changes in the image from frame-to-frame such as rotation, scale and perspective.
- the noise threshold can be based on measured image noise and a type of sensor providing the sensor measurement.
- the image integration module using the one or more processors, can also be configured to assign a persistence count for each of the at least one pixel of the image frame for which initial motion is detected.
- the persistence count can be based on a typical vehicle size and speed, and a range of an image sensor to a target.
- the image integration module using the one or more processors, can be further configured to reduce false detections of motion by applying a noise filter to each of the at least one pixel of the image frame for which motion is detected.
- the image integration module using the one or more processors, can also be configured to apply a morphological spreading filter to those pixels for which motion is still detected.
- the image integration module using the one or more processors, can be further configured to threshold the output of the morphological spreading filter, wherein results of the threshold provide a final identification of motion in the at least one pixel of the image frame in the plurality of image frames.
- the non-transitory computer-readable medium has computer readable program codes embodied thereon for integrating image frames, the computer readable program codes including instructions that, when executed by one or more processors, cause the processor to receive for each pixel in at least one image frame a value representative of a sensor measurement.
- the instructions also cause the processor to calculate an average difference of the value representative of the sensor measurement over a subset of the plurality of image frames.
- the instructions cause the processor to detect motion in at least one pixel of the image frame in the plurality of image frames based on the calculated average difference of the value representative of the sensor measurement over the subset of the plurality of image frames.
- the detected motion is of an object captured in the plurality of image frames. Additionally, the instructions cause the processor to generate an integrated image frame wherein each pixel having detected motion is integrated by an amount less than that of those pixels for which motion is not detected.
- the amount of frame integration is based on contrast levels, expected rates of motion, and noise characteristics of sensor input image data.
- FIG. 1 is a diagram of an exemplary image acquisition environment in accordance with an example embodiment of the present disclosure.
- FIG. 2 is a diagram of another exemplary image acquisition environment in accordance with an example embodiment of the present disclosure.
- FIG. 3 is a block diagram of an exemplary image system in accordance with an example embodiment of the present disclosure.
- FIG. 4A illustrates an image frame in which certain pixels are erroneously identified as including motion in accordance with an example embodiment of the present disclosure.
- FIG. 4B illustrates an image frame for which erroneous detections of motion are removed by using an averaging metric in accordance with an example embodiment of the present disclosure.
- FIG. 5A illustrates an image frame that includes a blurred trailing end of a moving object in accordance with an example embodiment of the present disclosure.
- FIG. 5B illustrates an image frame that utilizes a persistence count to assist in identifying movement in a trailing end of the moving object in accordance with an example embodiment of the present disclosure.
- FIG. 6A illustrates an image frame that includes a group of low contrast pixels corresponding to a moving object for which motion is not detected in accordance with an example embodiment of the present disclosure.
- FIG. 6B illustrates an image frame that correctly identifies motion in the low contrast pixels in accordance with an example embodiment of the present disclosure.
- FIG. 7 is a flow diagram of a method for integrating image frames in accordance with an example embodiment of the present invention.
- FIG. 1 is a diagram of an exemplary image acquisition environment 100.
- the environment 100 includes an image console 110, an image system 120, and a camera platform 130.
- An image operator 115 views a plurality of objects utilizing the image console 110.
- the plurality of objects includes a tank 132, a car 134, a tanker trailer 136, and a truck 138.
- the camera platform 130 (e.g., optical camera platform, infrared camera platform, etc.) communicates data (e.g., digital representation of an image of the tank 132, optical representation of an image of the car 134, etc.) to the image system 120.
- the image system 120 analyzes the received video and/or image frames to integrate the image frames (e.g., a single snapshot of the image at a specified time, a sub-part of the image, etc.). For example, a one second video of an object includes ten image frames of the object. In this example, the image system 120 integrates the ten image frames to form an integrated image frame with enhanced resolution of the object. The image system 120 can store the integrated image for further analysis and/or transmit the integrated image to the image console 110 for viewing and/or analysis by the image operator 115 .
- Although FIG. 1 illustrates a single camera platform 130, the environment 100 can utilize any number of camera platforms (e.g., ten camera platforms, one hundred camera platforms, etc.).
- the image system 120 can receive images from any number of camera platforms for the same object or different objects (as illustrated in Table 1).
- the single camera platform 130 can include a plurality of cameras and/or other types of image capture devices (e.g., motion sensor, environmental sensor, heat sensor, etc.).
- Table 1 illustrates exemplary image information received by a plurality of camera platforms and transmitted to the image system 120 .
- FIG. 2 is a diagram of another exemplary image acquisition environment 200.
- the image acquisition environment 200 illustrates images at two time frames A 210a and B 210b.
- at time frame A 210a, a camera platform 212 and a tank 214 are at a first position (e.g., the physical location of the camera platform 212 is in square 3⁄4 and the physical location of the tank 214 is in square 3⁄8), and the camera platform 212 receives an image frame A 216a of the tank 214.
- at time frame B 210b, the camera platform 212 and the tank 214 are at a second position (e.g., the physical location of the camera platform 212 is in square 3⁄4 and the physical location of the tank 214 is in square 9⁄8), and the camera platform 212 receives an image frame B 216b of the tank 214.
- the tank 214 moved from the first position to the second position at a velocity Vt 224.
- the camera platform 212 can transmit the image frames A 216a and B 216b to an image system (not shown) for processing (e.g., integration of the image frames into an integrated image frame).
- Table 2 illustrates exemplary image information received by the camera platform 212 .
- FIG. 3 is a block diagram of an exemplary image system 310.
- the image system 310 includes a communication module 311, an image sharpness module 312, an image noise module 313, an image jitter module 314, an image integration module 315, an input device 391, an output device 392, a display device 393, a processor 394, and a storage device 395.
- the modules and devices described herein can, for example, utilize the processor 394 to execute computer executable instructions and/or include a processor to execute computer executable instructions (e.g., an encryption processing unit, a field programmable gate array processing unit, etc.).
- the image system 310 can include, for example, other modules, devices, and/or processors known in the art and/or varieties of the illustrated modules, devices, and/or processors.
- the communication module 311 receives the images (e.g., from a camera platform, from an intermediate image processing device, from a storage device, etc.).
- the communication module 311 communicates information to/from the image system 310 .
- the communication module 311 can receive, for example, information associated with a camera platform.
- the information associated with the camera platform can be associated with a data signal (e.g., data signal from a camera platform, processed data signal from a camera platform, data signal from a motion sensor, data signal from a global positioning system, data signal from a location system, etc.).
- the image sharpness module 312 determines a sharpness metric for each image frame in a plurality of image frames.
- the sharpness metric is indicative of at least one of edge content and an edge size of the image frame.
- the image sharpness module 312 separates one or more horizontal edge components (e.g., a row within the image frame, set of rows within the image frame, etc.) and one or more vertical edge components (e.g., a column within the image frame, set of columns within the image frame, etc.) in each image frame in the plurality of image frames.
- the image sharpness module 312 interpolates the one or more horizontal components and the one or more vertical components for each image frame in the plurality of image frames (e.g., to achieve sub-pixel alignment, to maximize sub-pixel alignment, etc.).
- the image sharpness module 312 correlates the interpolated horizontal and vertical components to determine a pixel shift for each image frame in the plurality of image frames.
- the pixel shift is indicative of at least one of the edge content and the edge size (e.g., number of edges, size of edges, etc.) of the respective image frame.
- the pixel shift can be, for example, utilized by the image sharpness module 312 to generate an edge map for each image frame.
- the image sharpness module 312 generates an edge map for each image frame in the plurality of image frames.
- the edge map includes pixel values indicative of an edge and a non-edge.
- the image sharpness module 312 combines the pixel values in the edge map for each image frame in the plurality of image frames to form the sharpness metric.
- the image sharpness module 312 utilizes an edge detector/extractor (e.g., Sobel detector, Canny edge extractor, etc.) to generate an edge map for the image frame.
- the edge map can, for example, include pixel values of one for edges and pixel values of zero for non-edges.
- the image sharpness module 312 can, for example, sum the pixel values to generate the sharpness metric for each image frame.
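The edge-map construction described above can be sketched in a few lines of NumPy. This is a simplified stand-in: it uses central-difference gradients rather than a full Sobel or Canny operator, and the `edge_threshold` constant is an assumed tuning value, not one given in the text.

```python
import numpy as np

def sharpness_metric(frame, edge_threshold=1.0):
    """Sum of binary edge-map pixels as a sharpness metric."""
    frame = frame.astype(float)
    # Central-difference gradients (a simplified stand-in for a Sobel operator).
    gx = np.zeros_like(frame)
    gy = np.zeros_like(frame)
    gx[:, 1:-1] = frame[:, 2:] - frame[:, :-2]
    gy[1:-1, :] = frame[2:, :] - frame[:-2, :]
    magnitude = np.hypot(gx, gy)
    # Edge map: pixel value one for edges, zero for non-edges.
    edge_map = (magnitude > edge_threshold).astype(int)
    return edge_map.sum()
```

A frame with strong edges yields a larger metric than a flat frame, matching the intent of summing edge-map pixel values per frame.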
- the image noise module 313 determines a noise metric for each image frame in the plurality of image frames.
- the noise metric is indicative of random variations in brightness or color in the image frame.
- the image jitter module 314 determines a jitter metric for each image frame in the plurality of image frames.
- the jitter metric is indicative of spatial shifts between the image frame and other image frames in the plurality of image frames.
- the image jitter module 314 can, for example, utilize frame to frame registration to measure the spatial shifts between image frames due to unintended motion.
- the image jitter module 314 can, for example, utilize a Fitts algorithm, correlation, and/or any other type of jitter processing technique to determine the jitter metric.
- the image jitter module 314 can, for example, utilize a Kalman filter to measure the spatial shifts between image frames due to intended motion (e.g., image pan commands, image zoom commands, etc.).
- the image frame integration module 315 integrates the one or more image frames of the plurality of image frames based on input received from the communication module 311 , image sharpness module 312 , image noise module 313 , and the image jitter module 314 .
- the image frame integration module 315 receives, for each pixel in at least one image frame of the plurality of image frames, a value representative of a sensor measurement.
- sensor measurements from each pixel can include the difference in intensity between the pixel in the current frame and those from previous frames, after registering the frames to correct for the displacement of the input images.
- statistical measurements are made and compared to thresholds indicating the persistence of the intensity differences over multiple frames. The combined information on intensity differences and their persistence is used to identify which pixels are in motion within the image.
- frame integration causes moving objects to appear blurred or smeared. In order to prevent such blurring or smearing, the image integration module 315 detects whether or not motion exists in each of the pixels of each of the image frames.
- when the image frame integration module 315 detects such motion, it generates an integrated image frame wherein each pixel having detected motion is integrated by an amount less than that of those pixels for which motion is not detected.
- the amount of integration for those pixels for which motion is detected is based on contrast levels, expected rates of motion, and noise characteristics of sensor input image data. For instance, pixels that are determined to be in motion are integrated less than those that are determined to be stationary, reducing the motion blur on the pixels associated with moving objects, while still allowing the application of heavy integration and its associated noise reduction benefits on the otherwise stationary portions of the image.
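The selective integration described above can be sketched as a per-pixel recursive average whose history weight depends on the motion mask. The `alpha_static` and `alpha_moving` values are illustrative placeholders; the text ties the actual amounts to contrast levels, expected rates of motion, and sensor noise characteristics without giving numbers.

```python
import numpy as np

def integrate_frame(running_avg, new_frame, motion_mask,
                    alpha_static=0.9, alpha_moving=0.2):
    """Per-pixel recursive integration: pixels flagged as moving receive
    less integration (a lower history weight) than stationary pixels,
    reducing motion blur while keeping heavy noise-reducing integration
    on the static background."""
    alpha = np.where(motion_mask, alpha_moving, alpha_static)
    return alpha * running_avg + (1.0 - alpha) * new_frame
```

With these placeholder weights, a moving pixel tracks the incoming frame closely while a stationary pixel changes slowly.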
- FIG. 4A illustrates an image frame 400a in which certain pixels are erroneously identified as including motion.
- the white pixels of the image frame 400 a are pixels for which motion is detected.
- the white color of pixels 405a along the lower edge of the depicted road indicates that the pixels 405a include motion.
- pixels 410 include an object in motion (e.g., a vehicle). An indication that the pixels 410 include motion is accurate.
- the image frame integration module 315 reduces erroneous detection of motion by using an averaging metric to track pixel difference over multiple frames.
- the image frame integration module 315 uses registered frame differencing to detect motion. For instance, the image frame integration module 315 calculates an average difference of the value representative of the sensor measurement of the plurality of images or a subset of the plurality of images.
- the image frame integration module 315 can calculate the average difference by normalizing a difference of the value representative of the sensor measurement of a pixel of the plurality of image frames by performing a mean absolute deviation calculation of at least one pixel between at least two image frames of the plurality of image frames.
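A minimal sketch of that normalization, assuming the "average difference" is the mean absolute registered frame difference per pixel and the mean absolute deviation (MAD) is taken across the same frame stack; the text does not pin these details down.

```python
import numpy as np

def normalized_avg_difference(frames, reference):
    """Average per-pixel difference from a registered reference frame,
    normalized by the mean absolute deviation across the frame stack."""
    frames = np.asarray(frames, dtype=float)
    diffs = frames - reference                  # registered frame differencing
    avg_diff = np.abs(diffs).mean(axis=0)       # average difference per pixel
    # Mean absolute deviation of each pixel over the stack.
    mad = np.abs(frames - frames.mean(axis=0)).mean(axis=0)
    return avg_diff / np.maximum(mad, 1e-6)     # normalize, avoid divide-by-zero
```

Pixels that are identical to the reference across all frames score zero, while pixels with large sustained differences score high relative to their own frame-to-frame variability.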
- FIG. 4B illustrates an image frame 400b for which erroneous detections of motion are removed by using the averaging metric.
- the erroneous detection of motion in pixels 405 is no longer present.
- the image frame integration module 315 detects motion in a pixel of an image frame if the difference of the value representative of the sensor measurement is greater than a scale factor times the average difference and the average difference is greater than a noise threshold.
- the scale factor is based on a statistical dispersion of data resulting from normalizing the difference of the value representative of the sensor measurement of a pixel of the plurality of image frames and sensor noise, registration accuracy, and changes in the image from frame-to-frame such as rotation, scale and perspective.
- the noise threshold is based on measured image noise and a type of sensor providing the sensor measurement.
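Reading the detection criterion as "the current frame difference exceeds both a scale factor times the running average difference and a fixed noise floor" (one plausible interpretation of the text), the test can be sketched per pixel as below. The constants are illustrative; the patent derives them from statistical dispersion, sensor noise, registration accuracy, and frame-to-frame image changes.

```python
import numpy as np

def detect_initial_motion(current_diff, avg_diff, scale_factor=3.0,
                          noise_threshold=2.0):
    """Flag pixels whose current difference exceeds both a scaled
    historical average difference and a fixed noise floor."""
    return (current_diff > scale_factor * avg_diff) & (current_diff > noise_threshold)
```

A pixel with a large transient difference fires; pixels whose differences stay within a few multiples of their historical average, or below the noise floor, do not.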
- FIG. 5A illustrates an image frame 500a that includes a blurred trailing end 515a of a moving object.
- the image frame integration module 315 assigns a persistence count for each pixel for which motion is detected. In particular, the persistence count is set when the difference is greater than the noise threshold and greater than a scale factor times the average difference.
- the persistence count is based on a typical vehicle's size and speed, and range of an image sensor to the target (e.g., the vehicle).
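A sketch of the persistence mechanism, assuming a simple countdown: the count is reset where initial motion fires and decremented elsewhere, and a pixel remains flagged as moving while its count is positive. The `persistence_frames` value is a placeholder for the constant the text derives from vehicle size, speed, and sensor range.

```python
import numpy as np

def update_persistence(count, initial_motion, persistence_frames=5):
    """Reset the count where initial motion fires this frame, otherwise
    decay it by one; a pixel stays 'in motion' while its count is positive,
    which keeps the trailing end of a moving object flagged."""
    count = np.where(initial_motion, persistence_frames, np.maximum(count - 1, 0))
    still_moving = count > 0
    return count, still_moving
```

Calling the update on successive frames shows a pixel remaining flagged for several frames after its last detection, which is what recovers the blurred trailing end in FIG. 5B.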
- FIG. 5B illustrates an image frame 500b.
- the image frame integration module 315 utilizes the persistence count to assist in identifying movement in a trailing end of the moving object (i.e., pixels which are white).
- the image frame integration module 315 can also utilize a noise filter to further reduce false detections of movement.
- the image frame integration module 315 assumes that a true moving object should be made up of a cluster of moving pixels.
- the image frame integration module 315 determines if a certain percentage of pixels within a predetermined region (e.g., a 9×9 pixel region) around the pixel identified as including motion also include motion. The percentage of pixels is predetermined based on image and sensor characteristics, and the actual percentage is tuned for each application.
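That cluster test can be sketched as a sliding-window count over the motion mask. The 9×9 window comes from the example above; the minimum fraction of moving neighbours is an assumed tuning value, standing in for the application-tuned percentage.

```python
import numpy as np

def cluster_noise_filter(motion_mask, region=9, min_fraction=0.2):
    """Keep a motion pixel only if at least min_fraction of the pixels in
    the surrounding region x region window are also flagged as moving."""
    mask = motion_mask.astype(float)
    pad = region // 2
    padded = np.pad(mask, pad, mode="constant")
    h, w = mask.shape
    # Sliding-window sum via explicit shifts (clear, if not the fastest way).
    counts = np.zeros_like(mask)
    for dy in range(region):
        for dx in range(region):
            counts += padded[dy:dy + h, dx:dx + w]
    fraction = counts / (region * region)
    return motion_mask & (fraction >= min_fraction)
```

An isolated false detection is removed because almost none of its neighbours are flagged, while a pixel inside a dense cluster of moving pixels survives.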
- the image frame integration module 315 utilizes a morphological spreading filter. As stated above, the image frame integration module 315 assumes that a true moving object should not be a collection of disconnected discrete pixels, and should be made up of a cluster of pixels for which motion is detected. This spreading filter is applied to fill in and extend the group of motion pixels associated with the moving object and create a uniform motion mask defining the boundaries of the moving object within the frame.
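A sketch of the spreading step using iterated 3×3 binary dilation; with a binary mask, the thresholding of the filter output mentioned in the text becomes implicit. The iteration count is an assumed tuning parameter.

```python
import numpy as np

def spread_motion(motion_mask, iterations=2):
    """Iterated 3x3 binary dilation: fill gaps in and extend the cluster
    of motion pixels into a uniform mask covering the moving object."""
    mask = motion_mask.copy()
    for _ in range(iterations):
        padded = np.pad(mask, 1, mode="constant")
        h, w = mask.shape
        out = np.zeros_like(mask)
        # OR each of the nine 3x3 neighbourhood shifts into the output.
        for dy in range(3):
            for dx in range(3):
                out |= padded[dy:dy + h, dx:dx + w]
        mask = out
    return mask
```

Two nearby motion pixels separated by a small gap end up connected after spreading, yielding the filled-in mask used for the low-contrast pixels in FIG. 6B.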
- FIG. 6A illustrates an image frame 600a that includes a group of low contrast pixels 630 corresponding to a moving object for which motion is not detected (e.g., the pixels are not white and are in grayscale).
- the image frame integration module 315 utilizes a morphological spreading filter to correctly identify movement in such pixels.
- FIG. 6B illustrates an image frame 600b that correctly identifies motion in the low contrast pixels 630.
- the image frame integration module 315 then integrates each pixel in each of the plurality of image frames by an amount based on whether the pixel includes motion.
- the display device 393 displays information associated with the image system 310 (e.g., status information, configuration information, etc.).
- the processor 394 executes the operating system and/or any other computer executable instructions for the image system 310 (e.g., executes applications, etc.).
- the memory 395 stores the images (e.g., actual image, processed image, etc.), the integrated image frames, and/or any other data associated with the image system 310 .
- the memory 395 can store image information and/or any other data associated with the image system 310 .
- the memory 395 can include a plurality of storage devices and/or the image system 310 can include a plurality of storage devices (e.g., an image storage device, an integrated image storage device, etc.).
- the memory 395 can include, for example, long-term storage (e.g., a hard drive, a tape storage device, flash memory, etc.), short-term storage (e.g., a random access memory, a graphics memory, etc.), and/or any other type of computer readable storage.
- FIG. 7 is a flow diagram of a method 700 for integrating image frames in accordance with an example embodiment of the present invention.
- the method 700 begins at 705 .
- the method includes receiving for each pixel in at least one image frame a value representative of a sensor measurement.
- the method includes calculating an average difference of the value representative of the sensor measurement over a subset of the plurality of image frames.
- the method includes detecting motion in at least one pixel of the image frame in the plurality of image frames based on the calculated average difference of the value representative of the sensor measurement over the subset of the plurality of image frames. The detected motion is of an object captured in the plurality of image frames.
- the method includes generating an integrated image frame wherein each pixel having detected motion is integrated by an amount less than that of those pixels for which motion is not detected.
- the amount of frame integration is based on contrast levels, expected rates of motion, and noise characteristics of sensor input image data.
- the above-described systems and methods can be implemented in digital electronic circuitry, in computer hardware, firmware, and/or software.
- the implementation can be as a computer program product.
- the implementation can, for example, be in a machine-readable storage device, for execution by, or to control the operation of, data processing apparatus.
- the implementation can, for example, be a programmable processor, a computer, and/or multiple computers.
- Further example embodiments of the present disclosure may be configured using a computer program product; for example, controls may be programmed in software for implementing example embodiments of the present disclosure. Further example embodiments of the present disclosure may include a non-transitory computer readable medium containing instructions that may be executed by a processor, and, when executed, cause the processor to complete methods described herein. It should be understood that elements of the block and flow diagrams described herein may be implemented in software, hardware, firmware, or other similar implementations determined in the future. In addition, the elements of the block and flow diagrams described herein may be combined or divided in any manner in software, hardware, or firmware. If implemented in software, the software may be written in any language that can support the example embodiments disclosed herein.
- the software may be stored in any form of computer readable medium, such as random access memory (RAM), read only memory (ROM), compact disk read only memory (CD-ROM), and so forth.
- a general purpose or application specific processor loads and executes software in a manner well understood in the art.
- the block and flow diagrams may include more or fewer elements, be arranged or oriented differently, or be represented differently. It should be understood that implementation may dictate the block, flow, and/or network diagrams and the number of block and flow diagrams illustrating the execution of embodiments of the present disclosure.
- a computer program can be written in any form of programming language, including compiled and/or interpreted languages, and the computer program can be deployed in any form, including as a stand-alone program or as a subroutine, element, and/or other unit suitable for use in a computing environment.
- a computer program can be deployed to be executed on one computer or on multiple computers at one site.
- Method steps can be performed by one or more programmable processors executing a computer program to perform functions of the present disclosure by operating on input data and generating output. Method steps can also be performed by, and an apparatus can be implemented as, special purpose logic circuitry.
- the circuitry can, for example, be an FPGA (field programmable gate array) and/or an ASIC (application-specific integrated circuit).
- Subroutines and software agents can refer to portions of the computer program, the processor, the special circuitry, software, and/or hardware that implement that functionality.
- processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer.
- a processor receives instructions and data from a read-only memory or a random access memory or both.
- the essential elements of a computer are a processor for executing instructions and one or more memory devices for storing instructions and data.
- a computer can include, and/or can be operatively coupled to receive data from and/or transfer data to, one or more mass storage devices for storing data (e.g., magnetic disks, magneto-optical disks, or optical disks).
- Data transmission and instructions can also occur over a communications network.
- Information carriers suitable for embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices.
- the information carriers can, for example, be EPROM, EEPROM, flash memory devices, magnetic disks, internal hard disks, removable disks, magneto-optical disks, CD-ROM, and/or DVD-ROM disks.
- the processor and the memory can be supplemented by, and/or incorporated in, special purpose logic circuitry.
- the above described techniques can be implemented on a computer having a display device.
- the display device can, for example, be a cathode ray tube (CRT) and/or a liquid crystal display (LCD) monitor.
- the interaction with a user can, for example, be a display of information to the user and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer (e.g., interact with a user interface element).
- Other kinds of devices can be used to provide for interaction with a user.
- the feedback provided to the user can, for example, be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback).
- Input from the user can, for example, be received in any form, including acoustic, speech, and/or tactile input.
- the above described techniques can be implemented in a distributed computing system that includes a back-end component.
- the back-end component can, for example, be a data server, a middleware component, and/or an application server.
- the above described techniques can be implemented in a distributed computing system that includes a front-end component.
- the front-end component can, for example, be a client computer having a graphical user interface, a Web browser through which a user can interact with an example implementation, and/or other graphical user interfaces for a transmitting device.
- the components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network (LAN), a wide area network (WAN), the Internet, wired networks, and/or wireless networks.
- the system can include clients and servers.
- a client and a server are generally remote from each other and typically interact through a communication network.
- the relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
- Packet-based networks can include, for example, the Internet, a carrier internet protocol (IP) network (e.g., local area network (LAN), wide area network (WAN), campus area network (CAN), metropolitan area network (MAN), home area network (HAN)), a private IP network, an IP private branch exchange (IPBX), a wireless network (e.g., radio access network (RAN), 802.11 network, 802.16 network, general packet radio service (GPRS) network, HiperLAN), and/or other packet-based networks.
- Circuit-based networks can include, for example, the public switched telephone network (PSTN), a private branch exchange (PBX), a wireless network (e.g., RAN, bluetooth, code-division multiple access (CDMA) network, time division multiple access (TDMA) network, global system for mobile communications (GSM) network), and/or other circuit-based networks.
- the transmitting device can include, for example, a computer, a computer with a browser device, a telephone, an IP phone, a mobile device (e.g., cellular phone, personal digital assistant (PDA) device, laptop computer, electronic mail device), and/or other communication devices.
- the browser device includes, for example, a computer (e.g., desktop computer, laptop computer) with a world wide web browser (e.g., Microsoft® Internet Explorer® available from Microsoft Corporation, Mozilla® Firefox available from Mozilla Corporation).
- the mobile computing device includes, for example, a Blackberry®.
- Comprise, include, and/or plural forms of each are open ended and include the listed parts and can include additional parts that are not listed. And/or is open ended and includes one or more of the listed parts and combinations of the listed parts.
Abstract
Description
- Generally, the display of images is degraded by motion blur and noise. Prior attempts at image frame integration reduce noise and increase sensor performance. However, the results of such prior attempts are often still degraded by motion blur. Thus, a need exists in the art for improved image frame integration.
- One approach to integrating image frames is a method that can be executed by one or more processors. The method includes receiving, for each pixel in at least one image frame of a plurality of image frames, a value representative of a sensor measurement. The method also includes calculating an average difference of the value representative of the sensor measurement over a subset of the plurality of image frames. The method further includes detecting motion in at least one pixel of the image frame in the plurality of image frames based on the calculated average difference of the value representative of the sensor measurement over the subset of the plurality of image frames. The detected motion is of an object captured in the plurality of image frames. In addition, the method includes generating an integrated image frame wherein each pixel having detected motion is integrated by an amount less than that of those pixels for which motion is not detected. The amount of frame integration is based on contrast levels, expected rates of motion, and noise characteristics of sensor input image data.
- In an example, calculating the average difference includes normalizing a difference of the value representative of the sensor measurement of a pixel of the plurality of image frames by performing a mean absolute deviation calculation of at least one pixel between at least two image frames of the plurality of image frames. The method can include detecting motion by detecting initial motion in the at least one pixel of the image frame in the plurality of image frames if the average difference is greater than a scale factor times the mean absolute deviation and the average difference is greater than a noise threshold.
- The scale factor can be based on a statistical dispersion of data resulting from normalizing the difference of the value representative of the sensor measurement of a pixel of the plurality of image frames, as well as sensor noise, registration accuracy, and changes in the image from frame to frame such as rotation, scale, and perspective. The noise threshold can be based on measured image noise and a type of sensor providing the sensor measurement.
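One reading of this detection rule at a single pixel might look like the following sketch; the threshold values are illustrative assumptions, not values from the patent.

```python
import numpy as np

def detect_motion(pixel_history, current, scale=2.0, noise_threshold=3.0):
    """Flag motion at a single pixel from its recent history.

    The deviation of the current value from the mean of the pixel's
    history is compared against the pixel's mean absolute deviation
    scaled by `scale`, and against a noise floor derived from the
    sensor's measured noise.
    """
    history = np.asarray(pixel_history, dtype=np.float64)
    avg = history.mean()
    mad = np.mean(np.abs(history - avg))   # mean absolute deviation
    diff = abs(current - avg)
    return bool(diff > scale * mad and diff > noise_threshold)
```

The mean absolute deviation serves as a robust spread estimate, so a pixel is flagged only when its change stands out against both its own recent variability and the sensor noise floor.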
- The method can further include assigning a persistence count for each of the at least one pixel of the image frame for which initial motion is detected. The persistence count can be based on a typical vehicle size and speed, and a range of an image sensor to a target. The method can also include reducing false detections of motion by applying a noise filter to each of the at least one pixel of the image frame for which motion is detected. Additionally, the method can include applying a morphological spreading filter to those pixels for which motion is still detected. The method can also include thresholding the output of the morphological spreading filter, where the thresholding provides a final identification of motion in the at least one pixel of the image frame in the plurality of image frames.
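The persistence count described above might be maintained as a per-pixel counter, as in this sketch; the counter value and update rule are illustrative assumptions.

```python
import numpy as np

def update_persistence(counts, motion_mask, persistence=5):
    """Refresh per-pixel persistence counters from a new motion mask.

    A pixel flagged as moving has its counter reset to `persistence`
    (which would be chosen from expected object size and speed and the
    sensor-to-target range); other pixels count down toward zero. A
    pixel is treated as moving while its counter is positive, keeping
    an object's trailing edge flagged for a few frames after the raw
    detection fades.
    """
    counts = np.where(motion_mask, persistence, np.maximum(counts - 1, 0))
    return counts, counts > 0
```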
- Another approach to integrating image frames is a system. The system includes a memory and one or more processors. The system also includes an image integration processor. The image integration processor, using the one or more processors, is configured to receive for each pixel in at least one image frame a value representative of a sensor measurement. The image integration processor, using the one or more processors, is configured to calculate an average difference of the value representative of the sensor measurement over a subset of the plurality of image frames. The image integration processor, using the one or more processors, is also configured to detect motion in at least one pixel of the image frame in the plurality of image frames based on the calculated average difference of the value representative of the sensor measurement over the subset of the plurality of image frames. The detected motion is of an object captured in the plurality of image frames. In addition, the image integration processor, using the one or more processors, is configured to generate an integrated image frame wherein each pixel having detected motion is integrated by an amount less than that of those pixels for which motion is not detected, wherein the amount of frame integration is based on contrast levels, expected rates of motion, and noise characteristics of sensor input image data.
- The image integration module, using the one or more processors, can be further configured to calculate the average difference by normalizing a difference of the value representative of the sensor measurement of a pixel of the plurality of image frames by performing a mean absolute deviation calculation of at least one pixel between at least two image frames of the plurality of image frames.
- The image integration module, using the one or more processors, can be further configured to detect initial motion in the at least one pixel of the image frame in the plurality of image frames if the average difference is greater than a scale factor times the mean absolute deviation and the average difference is greater than a noise threshold. The scale factor can be based on a statistical dispersion of data resulting from normalizing the difference of the value representative of the sensor measurement of a pixel of the plurality of image frames, as well as sensor noise, registration accuracy, and changes in the image from frame to frame such as rotation, scale, and perspective. The noise threshold can be based on measured image noise and a type of sensor providing the sensor measurement.
- The image integration module, using the one or more processors, can also be configured to assign a persistence count for each of the at least one pixel of the image frame for which initial motion is detected. The persistence count can be based on a typical vehicle size and speed, and a range of an image sensor to a target.
- The image integration module, using the one or more processors, can be further configured to reduce false detections of motion by applying a noise filter to each of the at least one pixel of the image frame for which motion is detected. The image integration module, using the one or more processors, can also be configured to apply a morphological spreading filter to those pixels for which motion is still detected.
- Further, the image integration module, using the one or more processors, can be further configured to threshold the output of the morphological spreading filter, wherein results of the threshold provide a final identification of motion in the at least one pixel of the image frame in the plurality of image frames.
- Another approach to integrating image frames is a non-transitory computer readable medium. The non-transitory computer-readable medium has computer readable program codes embodied thereon for integrating image frames, the computer readable program codes including instructions that, when executed by one or more processors, cause the processor to receive for each pixel in at least one image frame a value representative of a sensor measurement. The instructions also cause the processor to calculate an average difference of the value representative of the sensor measurement over a subset of the plurality of image frames. Further, the instructions cause the processor to detect motion in at least one pixel of the image frame in the plurality of image frames based on the calculated average difference of the value representative of the sensor measurement over the subset of the plurality of image frames. The detected motion is of an object captured in the plurality of image frames. Additionally, the instructions cause the processor to generate an integrated image frame wherein each pixel having detected motion is integrated by an amount less than that of those pixels for which motion is not detected. The amount of frame integration is based on contrast levels, expected rates of motion, and noise characteristics of sensor input image data.
- The foregoing and other objects, features and advantages will be apparent from the following more particular description of the embodiments, as illustrated in the accompanying drawings in which like reference characters refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the embodiments.
- FIG. 1 is a diagram of an exemplary image acquisition environment in accordance with an example embodiment of the present disclosure.
- FIG. 2 is a diagram of another exemplary image acquisition environment in accordance with an example embodiment of the present disclosure.
- FIG. 3 is a block diagram of an exemplary image system in accordance with an example embodiment of the present disclosure.
- FIG. 4A illustrates an image frame in which certain pixels are erroneously identified as including motion in accordance with an example embodiment of the present disclosure.
- FIG. 4B illustrates an image frame for which erroneous detections of motion are removed by using an averaging metric in accordance with an example embodiment of the present disclosure.
- FIG. 5A illustrates an image frame that includes a blurred trailing end of a moving object in accordance with an example embodiment of the present disclosure.
- FIG. 5B illustrates an image frame that utilizes a persistence count to assist in identifying movement in a trailing end of the moving object in accordance with an example embodiment of the present disclosure.
- FIG. 6A illustrates an image frame that includes a group of low contrast pixels corresponding to a moving object for which motion is not detected in accordance with an example embodiment of the present disclosure.
- FIG. 6B illustrates an image frame that correctly identifies motion in the low contrast pixels in accordance with an example embodiment of the present disclosure.
- FIG. 7 is a flow diagram of a method for integrating image frames in accordance with an example embodiment of the present invention.
FIG. 1 is a diagram of an exemplary image acquisition environment 100. The environment 100 includes an image console 110, an image system 120, and a camera platform 130. An image operator 115 views a plurality of objects utilizing the image console 110. The plurality of objects includes a tank 132, a car 134, a tanker trailer 136, and a truck 138. The camera platform 130 (e.g., optical camera platform, infrared camera platform, etc.) receives optical waves, infrared waves, and/or any other type of waveform to form an image of one or more of the plurality of objects. The camera platform 130 communicates data (e.g., digital representation of an image of the tank 132, optical representation of an image of the car 134, etc.) to the image system 120. - The
image system 120 analyzes the received video and/or image frames to integrate the image frames (e.g., a single snapshot of the image at a specified time, a sub-part of the image, etc.). For example, a one-second video of an object includes ten image frames of the object. In this example, the image system 120 integrates the ten image frames to form an integrated image frame with enhanced resolution of the object. The image system 120 can store the integrated image for further analysis and/or transmit the integrated image to the image console 110 for viewing and/or analysis by the image operator 115. - Although
FIG. 1 illustrates a single camera platform 130, the environment 100 can utilize any number of camera platforms (e.g., ten camera platforms, one hundred camera platforms, etc.). For example, the image system 120 can receive images from any number of camera platforms for the same object or different objects (as illustrated in Table 1). In other examples, the single camera platform 130 can include a plurality of cameras and/or other types of image capture devices (e.g., motion sensor, environmental sensor, heat sensor, etc.). - Table 1 illustrates exemplary image information received by a plurality of camera platforms and transmitted to the
image system 120. -
TABLE 1
Exemplary Image Information
Image  Object    Camera              Time
A34    Tank 142  Airplane BZ         03:32.21
A35    Tank 142  Airplane BZ         03:32.22
B34    Tank 142  Border Wall RT      03:32.22
C56    Tank 142  Command Carrier GH  03:32.21
D32    Tank 142  Satellite CB        03:32.20
-
FIG. 2 is a diagram of another exemplary image acquisition environment 200. The image acquisition environment 200 illustrates images at two time frames A 210a and B 210b. At time A 210a, a camera platform 212 and a tank 214 are at a first position (e.g., the physical location of the camera platform 212 is in square 3×4 and the physical location of the tank 214 is in square 3×8), and the camera platform 212 receives an image frame A 216a of the tank 214. At time B 210b, the camera platform 212 and the tank 214 are at a second position (e.g., the physical location of the camera platform 212 is in square 3×4 and the physical location of the tank 214 is in square 9×8), and the camera platform 212 receives an image frame B 216b of the tank 214. During the time period between time A 210a and time B 210b, the tank 214 moved from the first position to the second position at a velocity Vt 224. The camera platform 212 can transmit the image frames A 216a and B 216b to an image system (not shown) for processing (e.g., integration of the image frames into an integrated image frame). - Table 2 illustrates exemplary image information received by the
camera platform 212. -
TABLE 2
Exemplary Image Information
Image                             Object    Camera               Time
Image Frame A 216a (Time A 210a)  Tank 214  Camera Platform 212  05:32.21
Image Frame B 216b (Time B 210b)  Tank 214  Camera Platform 212  05:32.22
Image Frame C                     Tank 214  Camera Platform 212  05:32.23
Image Frame D                     Tank 214  Camera Platform 212  05:32.24
-
FIG. 3 is a block diagram of an exemplary image system 310. The image system 310 includes a communication module 311, an image sharpness module 312, an image noise module 313, an image jitter module 314, an image integration module 315, an input device 391, an output device 392, a display device 393, a processor 394, and a storage device 395. The modules and devices described herein can, for example, utilize the processor 394 to execute computer executable instructions and/or include a processor to execute computer executable instructions (e.g., an encryption processing unit, a field programmable gate array processing unit, etc.). It should be understood that the image system 310 can include, for example, other modules, devices, and/or processors known in the art and/or varieties of the illustrated modules, devices, and/or processors. - The
communication module 311 receives the images (e.g., from a camera platform, from an intermediate image processing device, from a storage device, etc.). The communication module 311 communicates information to/from the image system 310. The communication module 311 can receive, for example, information associated with a camera platform. The information associated with the camera platform can be associated with a data signal (e.g., data signal from a camera platform, processed data signal from a camera platform, data signal from a motion sensor, data signal from a global positioning system, data signal from a location system, etc.). - The
image sharpness module 312 determines a sharpness metric for each image frame in a plurality of image frames. The sharpness metric is indicative of at least one of edge content and an edge size of the image frame. - In other examples, the
image sharpness module 312 separates one or more horizontal edge components (e.g., a row within the image frame, set of rows within the image frame, etc.) and one or more vertical edge components (e.g., a column within the image frame, set of columns within the image frame, etc.) in each image frame in the plurality of image frames. In some examples, the image sharpness module 312 interpolates the one or more horizontal components and the one or more vertical components for each image frame in the plurality of image frames (e.g., to achieve sub-pixel alignment, to maximize sub-pixel alignment, etc.). In other examples, the image sharpness module 312 correlates the interpolated horizontal and vertical components to determine a pixel shift for each image frame in the plurality of image frames. The pixel shift is indicative of at least one of the edge content and the edge size (e.g., number of edges, size of edges, etc.) of the respective image frame. The pixel shift can be, for example, utilized by the image sharpness module 312 to generate an edge map for each image frame. - In some examples, the
image sharpness module 312 generates an edge map for each image frame in the plurality of image frames. The edge map includes pixel values indicative of an edge and a non-edge. In other examples, the image sharpness module 312 combines the pixel values in the edge map for each image frame in the plurality of image frames to form the sharpness metric. - In other examples, the
image sharpness module 312 utilizes an edge detector/extractor (e.g., Sobel detector, Canny edge extractor, etc.) to generate an edge map for the image frame. The edge map can, for example, include pixel values of one for edges and pixel values of zero for non-edges. The image sharpness module 312 can, for example, sum the pixel values to generate the sharpness metric for each image frame. - The
image noise module 313 determines a noise metric for each image frame in the plurality of image frames. The noise metric is indicative of random variations in brightness or color in the image frame. - The
image jitter module 314 determines a jitter metric for each image frame in the plurality of image frames. The jitter metric is indicative of spatial shifts between the image frame and other image frames in the plurality of image frames. The image jitter module 314 can, for example, utilize frame-to-frame registration to measure the spatial shifts between image frames due to unintended motion. The image jitter module 314 can, for example, utilize a Fitts algorithm, correlation, and/or any other type of jitter processing technique to determine the jitter metric. The image jitter module 314 can, for example, utilize a Kalman filter to measure the spatial shifts between image frames due to intended motion (e.g., image pan commands, image zoom commands, etc.). - The image
frame integration module 315 integrates the one or more image frames of the plurality of image frames based on input received from the communication module 311, the image sharpness module 312, the image noise module 313, and the image jitter module 314. - In particular, the image
frame integration module 315 receives, for each pixel in at least one image frame of the plurality of image frames, a value representative of a sensor measurement. For example, sensor measurements from each pixel can include the difference in intensity between the pixel in the current frame and those from previous frames, after registering the frames to correct for the displacement of the input images. Additionally, statistical measurements are made and compared to thresholds indicating the persistence of the intensity differences over multiple frames. The combined information on intensity differences and their persistence is used to identify which pixels are in motion within the image. In general, frame integration causes moving objects to appear blurred or smeared. In order to prevent such blurring or smearing, the image integration module 315 detects whether or not motion exists in each of the pixels of each of the image frames. Once the image frame integration module 315 detects such motion, the image frame integration module 315 generates an integrated image frame wherein each pixel having detected motion is integrated by an amount less than that of those pixels for which motion is not detected. In an example, the amount of integration for those pixels for which motion is detected is based on contrast levels, expected rates of motion, and noise characteristics of sensor input image data. For instance, pixels that are determined to be in motion are integrated less than those that are determined to be stationary, reducing the motion blur on the pixels associated with moving objects, while still allowing the application of heavy integration and its associated noise reduction benefits on the otherwise stationary portions of the image. - In some situations, motion may be erroneously detected in some pixels. Such detection errors can be caused by, for example, misregistration of image frames, scale changes, rotations, perspective changes, and interlace effects.
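The per-pixel weighting applied by the integration module might be expressed as a single blend step, as in this sketch; the alpha values are illustrative assumptions, not values from the patent.

```python
import numpy as np

def blend_step(accumulated, frame, motion_mask,
               alpha_static=0.9, alpha_motion=0.2):
    # Stationary pixels keep a long history (heavy integration);
    # moving pixels are dominated by the newest frame (light integration).
    alpha = np.where(motion_mask, alpha_motion, alpha_static)
    return alpha * accumulated + (1.0 - alpha) * frame
```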
FIG. 4A illustrates an image frame 400a in which certain pixels are erroneously identified as including motion. In particular, the white pixels of the image frame 400a are pixels for which motion is detected. For example, the white color of pixels 405a along the lower edge of the depicted road indicates that the pixels 405a include motion. However, such an indication is clearly erroneous because the pixels 405a do not include an object in motion. In contrast, pixels 410 include an object in motion (e.g., a vehicle). An indication that the pixels 410 include motion is accurate. - Referring back to
FIG. 3, the image frame integration module 315 reduces erroneous detection of motion by using an averaging metric to track pixel differences over multiple frames. In an example, the image frame integration module 315 uses registered frame differencing to detect motion. For instance, the image frame integration module 315 calculates an average difference of the value representative of the sensor measurement over the plurality of images or a subset of the plurality of images. The image frame integration module 315 can calculate the average difference by normalizing a difference of the value representative of the sensor measurement of a pixel of the plurality of image frames by performing a mean absolute deviation calculation of at least one pixel between at least two image frames of the plurality of image frames. -
FIG. 4B illustrates an image frame 400b for which erroneous detections of motion are removed by using the averaging metric. In particular, the erroneous detection of motion in pixels 405a is no longer present. - Referring back to
FIG. 3, the image frame integration module 315 detects motion in a pixel of an image frame if the average difference of the value representative of the sensor measurement is greater than a scale factor times the mean absolute deviation and the average difference is greater than a noise threshold. The scale factor is based on a statistical dispersion of data resulting from normalizing the difference of the value representative of the sensor measurement of a pixel of the plurality of image frames, as well as sensor noise, registration accuracy, and changes in the image from frame to frame such as rotation, scale, and perspective. The noise threshold is based on measured image noise and the type of sensor providing the sensor measurement. - For moving objects, a trailing end of the moving object tends to blur. This occurs because the average difference of the value representative of the sensor measurements for pixels corresponding to the trailing end increases as the moving object reveals the stationary background image. This effect causes the pixels at the trailing end of a moving object to be misidentified as pixels associated with motion.
FIG. 5A illustrates an image frame 500a that includes a blurred trailing end 515a of a moving object. In order to compensate for such blurring, the image frame integration module 315 assigns a persistent count to each pixel for which motion is detected. In particular, the persistent count is set when the difference is greater than the noise threshold and greater than a scale factor times the average difference. The persistent count is based on a typical vehicle's size and speed, and on the range from the image sensor to the target (e.g., the vehicle). FIG. 5B illustrates an image frame 500b. As illustrated, the image frame integration module 315 utilizes the persistent count to assist in identifying movement in the trailing end of the moving object (i.e., the pixels which are white). - In addition, the image
frame integration module 315 can also utilize a noise filter to further reduce false detections of movement. In particular, the image frame integration module 315 assumes that a true moving object should be made up of a cluster of moving pixels. Thus, for each pixel that is identified as including motion, the image frame integration module 315 determines whether a certain percentage of pixels within a predetermined region (e.g., a 9×9 pixel region) around that pixel also include motion. The percentage of pixels is predetermined based on image and sensor characteristics, and the actual percentage is tuned for each application. - In some situations, lower contrast pixels in an image frame corresponding to a moving object may not be identified as including motion because the pixel difference calculations do not exceed the input sensor noise level thresholds. In order to prevent such a situation, the image
frame integration module 315 utilizes a morphological spreading filter. As stated above, the image frame integration module 315 assumes that a true moving object should not be a collection of disconnected discrete pixels, but should be made up of a cluster of pixels for which motion is detected. The spreading filter is applied to fill in and extend the group of motion pixels associated with the moving object and to create a uniform motion mask defining the boundaries of the moving object within the frame. -
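The three clean-up stages just described (the persistent count for trailing edges, the cluster-based noise filter, and the morphological spreading filter) can be sketched together as follows. Function names, the 9×9 window, and all parameter defaults are illustrative assumptions; the description does not specify exact algorithms:

```python
import numpy as np

def update_persistence(persist, motion_mask, persist_frames=5):
    """Reset a per-pixel counter where motion is detected, otherwise
    decay it by one; a pixel stays 'moving' while its counter is
    positive, which keeps the trailing end of a mover from dropping
    out prematurely. `persist_frames` would be tuned from vehicle
    size/speed and sensor-to-target range (value here is illustrative)."""
    persist = np.where(motion_mask, persist_frames,
                       np.maximum(np.asarray(persist) - 1, 0))
    return persist, persist > 0

def cluster_filter(motion_mask, size=9, min_fraction=0.2):
    """Reject isolated motion pixels: keep a pixel only if at least
    `min_fraction` of the size x size window around it is also moving,
    since a true mover should form a cluster, not isolated specks."""
    mask = np.asarray(motion_mask, dtype=bool)
    r = size // 2
    out = np.zeros_like(mask)
    for i, j in zip(*np.nonzero(mask)):
        window = mask[max(i - r, 0):i + r + 1, max(j - r, 0):j + r + 1]
        if window.mean() >= min_fraction:
            out[i, j] = True
    return out

def morphological_spread(motion_mask, iterations=2):
    """Plain binary dilation standing in for the morphological
    spreading filter: grow the mask so low-contrast pixels adjacent to
    detected motion are absorbed into one uniform motion mask."""
    mask = np.asarray(motion_mask, dtype=bool)
    for _ in range(iterations):
        grown = mask.copy()
        grown[1:, :] |= mask[:-1, :]   # spread downward
        grown[:-1, :] |= mask[1:, :]   # spread upward
        grown[:, 1:] |= mask[:, :-1]   # spread rightward
        grown[:, :-1] |= mask[:, 1:]   # spread leftward
        mask = grown
    return mask
```

In this sketch the stages would run in sequence each frame: threshold detection, persistence update, cluster filtering, then spreading into a final motion mask.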
FIG. 6A illustrates an image frame 600a that includes a group of low contrast pixels 630 corresponding to a moving object for which motion is not detected (e.g., the pixels are not white and are in grayscale). In order to prevent such a scenario, the image frame integration module 315 utilizes a morphological spreading filter to correctly identify movement in such pixels. FIG. 6B illustrates an image frame 600b that correctly identifies motion in the low contrast pixels 630. The image frame integration module 315 then integrates each pixel in each of the plurality of image frames by an amount based on whether the pixel includes motion. - Referring back to
FIG. 3, the display device 393 displays information associated with the image system 310 (e.g., status information, configuration information, etc.). The processor 394 executes the operating system and/or any other computer executable instructions for the image system 310 (e.g., executes applications, etc.). - The
memory 395 stores the images (e.g., actual image, processed image, etc.), the integrated image frames, and/or any other data associated with the image system 310. The memory 395 can include a plurality of storage devices and/or the image system 310 can include a plurality of storage devices (e.g., an image storage device, an integrated image storage device, etc.). The memory 395 can include, for example, long-term storage (e.g., a hard drive, a tape storage device, flash memory, etc.), short-term storage (e.g., a random access memory, a graphics memory, etc.), and/or any other type of computer readable storage. -
FIG. 7 is a flow diagram of a method 700 for integrating image frames in accordance with an example embodiment of the present invention. The method 700 begins at 705. At 710, the method includes receiving, for each pixel in at least one image frame, a value representative of a sensor measurement. At 715, the method includes calculating an average difference of the value representative of the sensor measurement over a subset of the plurality of image frames. At 720, the method includes detecting motion in at least one pixel of an image frame in the plurality of image frames based on the calculated average difference of the value representative of the sensor measurement over the subset of the plurality of image frames. The detected motion is of an object captured in the plurality of image frames. At 725, the method includes generating an integrated image frame wherein each pixel having detected motion is integrated by an amount less than that of the pixels for which motion is not detected. The amount of frame integration is based on contrast levels, expected rates of motion, and noise characteristics of the sensor input image data. At 730, the method ends. - The above-described systems and methods can be implemented in digital electronic circuitry, in computer hardware, firmware, and/or software. The implementation can be as a computer program product. The implementation can, for example, be in a machine-readable storage device, for execution by, or to control the operation of, data processing apparatus. The implementation can, for example, be carried out by a programmable processor, a computer, and/or multiple computers.
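The motion-dependent integration step of the method 700 (at 725) might be sketched as a choice between two averaging windows, with heavier integration where no motion is detected. The window lengths and the function name are illustrative assumptions; the patent ties the actual integration amounts to contrast, expected motion rates, and sensor noise:

```python
import numpy as np

def integrate_frames(frames, motion_mask, moving_n=2, static_n=8):
    """Integrate each pixel over fewer frames where motion was detected
    (a short window preserves the mover) and over more frames elsewhere
    (a long window suppresses noise). The window lengths 2 and 8 are
    illustrative placeholders.

    `frames` is an (N, H, W) stack of registered frames, most recent
    last; `motion_mask` is an (H, W) boolean motion map."""
    frames = np.asarray(frames, dtype=float)
    short = frames[-moving_n:].mean(axis=0)   # light integration
    long_ = frames[-static_n:].mean(axis=0)   # heavy integration
    return np.where(motion_mask, short, long_)
```

A moving pixel thus tracks its recent values closely, while a static pixel benefits from the full noise reduction of the longer average.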
- Further example embodiments of the present disclosure may be configured using a computer program product; for example, controls may be programmed in software for implementing example embodiments of the present disclosure. Further example embodiments of the present disclosure may include a non-transitory computer readable medium containing instructions that may be executed by a processor and, when executed, cause the processor to complete the methods described herein. It should be understood that elements of the block and flow diagrams described herein may be implemented in software, hardware, firmware, or other similar implementation determined in the future. In addition, the elements of the block and flow diagrams described herein may be combined or divided in any manner in software, hardware, or firmware. If implemented in software, the software may be written in any language that can support the example embodiments disclosed herein. The software may be stored in any form of computer readable medium, such as random access memory (RAM), read only memory (ROM), compact disk read only memory (CD-ROM), and so forth. In operation, a general purpose or application specific processor loads and executes software in a manner well understood in the art. It should be understood further that the block and flow diagrams may include more or fewer elements, be arranged or oriented differently, or be represented differently. It should be understood that the implementation may dictate the block, flow, and/or network diagrams and the number of block and flow diagrams illustrating the execution of embodiments of the present disclosure.
- A computer program can be written in any form of programming language, including compiled and/or interpreted languages, and the computer program can be deployed in any form, including as a stand-alone program or as a subroutine, element, and/or other unit suitable for use in a computing environment. A computer program can be deployed to be executed on one computer or on multiple computers at one site.
- Method steps can be performed by one or more programmable processors executing a computer program to perform functions of the present disclosure by operating on input data and generating output. Method steps can also be performed by, and an apparatus can be implemented as, special purpose logic circuitry. The circuitry can, for example, be an FPGA (field programmable gate array) and/or an ASIC (application-specific integrated circuit). Subroutines and software agents can refer to portions of the computer program, the processor, the special circuitry, software, and/or hardware that implement that functionality.
- Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor receives instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for executing instructions and one or more memory devices for storing instructions and data. Generally, a computer can include, or be operatively coupled to receive data from and/or transfer data to, one or more mass storage devices for storing data (e.g., magnetic, magneto-optical, or optical disks).
- Data transmission and instructions can also occur over a communications network. Information carriers suitable for embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices. The information carriers can, for example, be EPROM, EEPROM, flash memory devices, magnetic disks, internal hard disks, removable disks, magneto-optical disks, CD-ROM, and/or DVD-ROM disks. The processor and the memory can be supplemented by, and/or incorporated in, special purpose logic circuitry.
- To provide for interaction with a user, the above described techniques can be implemented on a computer having a display device. The display device can, for example, be a cathode ray tube (CRT) and/or a liquid crystal display (LCD) monitor. The interaction with a user can, for example, be a display of information to the user together with a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer (e.g., interact with a user interface element). Other kinds of devices can be used to provide for interaction with a user. For example, feedback can be provided to the user in any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback). Input from the user can, for example, be received in any form, including acoustic, speech, and/or tactile input.
- The above described techniques can be implemented in a distributed computing system that includes a back-end component. The back-end component can, for example, be a data server, a middleware component, and/or an application server. The above described techniques can be implemented in a distributed computing system that includes a front-end component. The front-end component can, for example, be a client computer having a graphical user interface, a Web browser through which a user can interact with an example implementation, and/or other graphical user interfaces for a transmitting device. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network (LAN), a wide area network (WAN), the Internet, wired networks, and/or wireless networks.
- The system can include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
- Packet-based networks can include, for example, the Internet, a carrier internet protocol (IP) network (e.g., local area network (LAN), wide area network (WAN), campus area network (CAN), metropolitan area network (MAN), home area network (HAN)), a private IP network, an IP private branch exchange (IPBX), a wireless network (e.g., radio access network (RAN), 802.11 network, 802.16 network, general packet radio service (GPRS) network, HiperLAN), and/or other packet-based networks. Circuit-based networks can include, for example, the public switched telephone network (PSTN), a private branch exchange (PBX), a wireless network (e.g., RAN, bluetooth, code-division multiple access (CDMA) network, time division multiple access (TDMA) network, global system for mobile communications (GSM) network), and/or other circuit-based networks.
- The transmitting device can include, for example, a computer, a computer with a browser device, a telephone, an IP phone, a mobile device (e.g., cellular phone, personal digital assistant (PDA) device, laptop computer, electronic mail device), and/or other communication devices. The browser device includes, for example, a computer (e.g., desktop computer, laptop computer) with a world wide web browser (e.g., Microsoft® Internet Explorer® available from Microsoft Corporation, Mozilla® Firefox available from Mozilla Corporation). The mobile computing device includes, for example, a Blackberry®.
- Comprise, include, and/or plural forms of each are open ended and include the listed parts and can include additional parts that are not listed. And/or is open ended and includes one or more of the listed parts and combinations of the listed parts.
- One skilled in the art will realize the present disclosure may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The foregoing embodiments are therefore to be considered in all respects illustrative rather than limiting of the present disclosure described herein. Scope of the present disclosure is thus indicated by the appended claims, rather than by the foregoing description, and all changes that come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.
Claims (21)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/048,898 US9232119B2 (en) | 2013-10-08 | 2013-10-08 | Integrating image frames |
EP14758015.3A EP3055988B1 (en) | 2013-10-08 | 2014-08-07 | Integrating image frames |
PCT/US2014/050046 WO2015053849A1 (en) | 2013-10-08 | 2014-08-07 | Integrating image frames |
IL244955A IL244955A (en) | 2013-10-08 | 2016-04-06 | Integrating image frames |
Publications (2)
Publication Number | Publication Date |
---|---|
US20150097975A1 true US20150097975A1 (en) | 2015-04-09 |
US9232119B2 US9232119B2 (en) | 2016-01-05 |
Family
ID=51429371
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/048,898 Active 2034-01-05 US9232119B2 (en) | 2013-10-08 | 2013-10-08 | Integrating image frames |
Country Status (4)
Country | Link |
---|---|
US (1) | US9232119B2 (en) |
EP (1) | EP3055988B1 (en) |
IL (1) | IL244955A (en) |
WO (1) | WO2015053849A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10397504B2 (en) | 2017-02-13 | 2019-08-27 | Kidde Technologies, Inc. | Correcting lag in imaging devices |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4679086A (en) | 1986-02-24 | 1987-07-07 | The United States Of America As Represented By The Secretary Of The Air Force | Motion sensitive frame integration |
GB9214218D0 (en) | 1992-07-03 | 1992-08-12 | Snell & Wilcox Ltd | Motion compensated video processing |
US7388603B2 (en) | 2003-06-10 | 2008-06-17 | Raytheon Company | Method and imaging system with intelligent frame integration |
EP2063390B1 (en) | 2006-09-14 | 2016-08-03 | Fujitsu Limited | Image processing device and its program |
EP2153407A1 (en) | 2007-05-02 | 2010-02-17 | Agency for Science, Technology and Research | Motion compensated image averaging |
US20100157079A1 (en) | 2008-12-19 | 2010-06-24 | Qualcomm Incorporated | System and method to selectively combine images |
US8497914B2 (en) | 2009-08-10 | 2013-07-30 | Wisconsin Alumni Research Foundation | Vision system and method for motion adaptive integration of image frames |
TR201005076A2 (en) | 2010-06-23 | 2012-01-23 | Vestel Elektroni̇k Sanayi̇ Ve Ti̇caret Anoni̇m Şi̇rketi̇@ | Method and apparatus for estimating motion vectors for image frame interpolation |
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120114235A1 (en) * | 2010-11-10 | 2012-05-10 | Raytheon Company | Integrating image frames |
US8374453B2 (en) * | 2010-11-10 | 2013-02-12 | Raytheon Company | Integrating image frames |
Cited By (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10319232B2 (en) * | 2014-03-03 | 2019-06-11 | Inrix Inc. | Traffic flow rates |
US9685078B2 (en) * | 2014-03-03 | 2017-06-20 | Inrix Inc. | Traffic flow rates |
US20170287327A1 (en) * | 2014-03-03 | 2017-10-05 | Inrix Inc. | Traffic flow rates |
US20170076594A1 (en) * | 2014-03-03 | 2017-03-16 | Inrix Inc., | Traffic flow rates |
US20180089839A1 (en) * | 2015-03-16 | 2018-03-29 | Nokia Technologies Oy | Moving object detection based on motion blur |
WO2017139198A1 (en) * | 2016-02-08 | 2017-08-17 | Cree, Inc. | Image analysis techniques |
DE112017000705B4 (en) | 2016-02-08 | 2024-06-20 | Cree Lighting USA LLC (n.d.Ges.d. Staates Delaware) | Image analysis techniques |
US11856059B2 (en) | 2016-02-08 | 2023-12-26 | Ideal Industries Lighting Llc | Lighting fixture with enhanced security |
US10192316B2 (en) | 2016-02-08 | 2019-01-29 | Cree, Inc. | Modular lighting fixture |
US10251245B2 (en) | 2016-02-08 | 2019-04-02 | Cree, Inc. | Automatic mapping of devices in a distributed lighting network |
US10306738B2 (en) | 2016-02-08 | 2019-05-28 | Cree, Inc. | Image analysis techniques |
US11209138B2 (en) | 2017-01-30 | 2021-12-28 | Ideal Industries Lighting Llc | Skylight fixture emulating natural exterior light |
US10465869B2 (en) | 2017-01-30 | 2019-11-05 | Ideal Industries Lighting Llc | Skylight fixture |
US10781984B2 (en) | 2017-01-30 | 2020-09-22 | Ideal Industries Lighting Llc | Skylight Fixture |
US10451229B2 (en) | 2017-01-30 | 2019-10-22 | Ideal Industries Lighting Llc | Skylight fixture |
US9894740B1 (en) | 2017-06-13 | 2018-02-13 | Cree, Inc. | Intelligent lighting module for a lighting fixture |
US10264657B2 (en) | 2017-06-13 | 2019-04-16 | Cree, Inc. | Intelligent lighting module for a lighting fixture |
US10830400B2 (en) | 2018-02-08 | 2020-11-10 | Ideal Industries Lighting Llc | Environmental simulation for indoor spaces |
US11373359B2 (en) | 2018-03-17 | 2022-06-28 | Nvidia Corporation | Reflection denoising in ray-tracing applications |
US12026822B2 (en) | 2018-03-17 | 2024-07-02 | Nvidia Corporation | Shadow denoising in ray-tracing applications |
US10991215B2 (en) | 2018-03-20 | 2021-04-27 | Ideal Industries Lighting Llc | Intelligent signage |
CN108600622A (en) * | 2018-04-12 | 2018-09-28 | 联想(北京)有限公司 | A kind of method and device of video stabilization |
US11688042B2 (en) | 2018-08-14 | 2023-06-27 | Nvidia Corporation | Filtering render data using multiple iterations for a filter direction |
US11113792B2 (en) * | 2018-08-14 | 2021-09-07 | Nvidia Corporation | Temporal-spatial denoising in ray-tracing applications |
CN112868047A (en) * | 2018-08-14 | 2021-05-28 | 辉达公司 | Spatiotemporal denoising in ray tracing applications |
JP2020050261A (en) * | 2018-09-28 | 2020-04-02 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd | Information processing device, flight control instruction method, program, and recording medium |
CN111885281A (en) * | 2019-05-02 | 2020-11-03 | 顶级公司 | Image processing |
US11419201B2 (en) | 2019-10-28 | 2022-08-16 | Ideal Industries Lighting Llc | Systems and methods for providing dynamic lighting |
Also Published As
Publication number | Publication date |
---|---|
EP3055988B1 (en) | 2021-05-19 |
US9232119B2 (en) | 2016-01-05 |
WO2015053849A1 (en) | 2015-04-16 |
IL244955A (en) | 2016-10-31 |
IL244955A0 (en) | 2016-05-31 |
EP3055988A1 (en) | 2016-08-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9232119B2 (en) | Integrating image frames | |
US8054881B2 (en) | Video stabilization in real-time using computationally efficient corner detection and correspondence | |
EP2640057B1 (en) | Image processing device, image processing method and program | |
US9734399B2 (en) | Context-aware object detection in aerial photographs/videos using travel path metadata | |
EP2640059A1 (en) | Image processing device, image processing method and program | |
EP2858037B1 (en) | Moving object detector | |
KR101621370B1 (en) | Method and Apparatus for detecting lane of road | |
CN111340749B (en) | Image quality detection method, device, equipment and storage medium | |
EP3593322B1 (en) | Method of detecting moving objects from a temporal sequence of images | |
US20200265592A1 (en) | Three-frame difference target acquisition and tracking using overlapping target images | |
CN111225145A (en) | Real-time image detection analysis and tracking method | |
US8374453B2 (en) | Integrating image frames | |
CN113936458A (en) | Method, device, equipment and medium for judging congestion of expressway | |
JP5304064B2 (en) | Speed measuring device and speed measuring method | |
CN113344906B (en) | Camera evaluation method and device in vehicle-road cooperation, road side equipment and cloud control platform | |
US9648211B2 (en) | Automatic video synchronization via analysis in the spatiotemporal domain | |
Jatoth et al. | Performance analysis of Alpha Beta filter, kalman filter and meanshift for object tracking in video sequences | |
WO2013172924A1 (en) | Motion detection through stereo rectification | |
JP6184447B2 (en) | Estimation apparatus and estimation program | |
EP3754971B1 (en) | Updating a fixed pattern noise matrix | |
CN114730453A (en) | Method for detecting a movement state of a vehicle | |
CN103310434A (en) | Static sign detection method | |
US9141204B2 (en) | Dynamic scale for mouse sensor runaway detection | |
KR102315523B1 (en) | Method of and system for eliminating background of image | |
JP5501988B2 (en) | Captured image correction device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: RAYTHEON COMPANY, MASSACHUSETTS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NASH, STEPHEN R.;LEDDY, CHRISTOPHER A.;DO, HOANG K.;SIGNING DATES FROM 20131002 TO 20131003;REEL/FRAME:031367/0523 |
|
AS | Assignment |
Owner name: NAVY, DEPARTMENT OF THE, MARYLAND Free format text: CONFIRMATORY LICENSE;ASSIGNOR:RAYTHEON COMPANY;REEL/FRAME:033200/0144 Effective date: 20140417 |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |