WO2010102135A1 - Temporally aligned exposure bracketing for high dynamic range imaging - Google Patents

Temporally aligned exposure bracketing for high dynamic range imaging

Info

Publication number
WO2010102135A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
exposure
prism
temporally aligned
aligned images
Application number
PCT/US2010/026250
Other languages
French (fr)
Inventor
Paul A. Wagner
Original Assignee
Wagner Paul A
Application filed by Wagner Paul A filed Critical Wagner Paul A
Priority to AU2010221241A priority Critical patent/AU2010221241A1/en
Priority to EP10749347A priority patent/EP2404209A4/en
Publication of WO2010102135A1 publication Critical patent/WO2010102135A1/en

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/10Beam splitting or combining systems
    • G02B27/1066Beam splitting or combining systems for enhancing image performance, like resolution, pixel numbers, dual magnifications or dynamic range, by tiling, slicing or overlapping fields of view
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/10Beam splitting or combining systems
    • G02B27/14Beam splitting or combining systems operating by reflection only
    • G02B27/144Beam splitting or combining systems operating by reflection only using partially transparent surfaces without spectral selectivity
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/02Bodies
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B7/00Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
    • G03B7/08Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
    • G03B7/091Digital circuits
    • G03B7/093Digital circuits for control of exposure time
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/741Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/743Bracketing, i.e. taking a series of images with varying exposure conditions

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Studio Devices (AREA)
  • Color Television Image Signal Generators (AREA)

Abstract

The invention provides an optical imaging system for temporally aligning bracketed exposures of a single image, the system comprising a light aperture, a prism and an image capturing device, where the prism is capable of splitting an incoming image from the light aperture into at least two temporally aligned images, and where the image capturing device captures the temporally aligned images at different levels of exposure.

Description

TEMPORALLY ALIGNED EXPOSURE BRACKETING FOR HIGH DYNAMIC RANGE IMAGING
REFERENCE TO RELATED APPLICATION
[0001] This application claims the benefit of United States Provisional Patent Application No. 61/157,494, filed March 4, 2009, the complete disclosure of which is incorporated herein, in the entirety.
COPYRIGHT NOTICE
[0002] A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the United States Patent and Trademark Office patent files and records, but otherwise reserves all other copyright rights.
BACKGROUND OF THE INVENTION Field of the Invention
[0003] This invention relates generally to imaging systems, and more particularly, to imaging systems that provide varying exposures for production of high dynamic range images. Description of Related Art
[0004] High dynamic range imaging (HDRI) is a term applied in image processing, computer graphics and photography, and generally relates to systems or techniques for providing a greater dynamic range of exposures. HDRI is most commonly employed in situations where the range between light and dark areas is great, such that a normal exposure, or even a digitally enhanced exposure, is not adequate to resolve all of the image area.
[0005] HDRI manipulates images and exposures to accurately represent the wide range of intensity levels found in real scenes, from direct sunlight to shadows. With HDRI, the user employs multiple exposures and bracketing with photo merging, to get greater detail throughout the tonal range.
[0006] More particularly, HDRI processing involves merging several exposures of a given scene into a, typically, 32-bit HDRI source file, which is then "tone mapped" to produce an image in which adjustments of qualities of light and contrast are applied locally to the HDRI source image.
[0007] HDRI images are best captured originally in a digital format with a much higher bit depth than the current generation of digital imaging devices provides. Current devices are built around an 8-bit per channel architecture, meaning that both the cameras and output displays have a maximum tonal range of 8 bits per RGB color channel.
[0008] HDRI formats are typically 32 bits per channel. A few next generation cameras and displays are capable of handling this kind of imagery natively. It will probably be quite a few years until HDRI displays become common, but HDRI cameras and acquisition techniques are already emerging.
[0009] HDRI images are typically tone-mapped back to 8-bits per channel, essentially compressing the extended information into the smaller dynamic range. This is typically done automatically with a variety of existing software algorithms, or manually with artistic input through programs like Adobe Photoshop.
[0010] So in a typical workflow for HDRI the artist first captures the HDRI image, and then the image is tone-mapped back to the desired output device such as ink on paper, an 8-bit RGB monitor, or even a 32-bit HDRI monitor (requiring no tone mapping).
[0011] The real challenge with HDRI is not the file formats or the computer algorithms to tone map them to 8-bit displays. Those challenges have already been largely met. For example, OpenEXR is a robust open source HDRI format developed by Industrial Light and Magic. The hardest part of capturing HDR images is the physical devices used to capture the imagery. So far only two ways of capturing HDR images are available.
[0012] The first is to use exotic high end cameras with special imaging chips (CMOS or CCD) like the Spheron HDR. Both CCD (charge-coupled device) and CMOS (complementary metal-oxide-semiconductor) image sensors convert light into electrons, though CMOS sensors are much less expensive to manufacture than CCD sensors. These types of cameras are typically used by professionals in controlled environments for the primary purpose of creating spherical photos to illuminate computer generated images (another important use of HDRI). They are not point and shoot cameras and are not capable of motion photography.
[0013] The second is shooting multiple varying exposures in rapid succession (known as exposure bracketing), then combining those images, taking the highlights from the underexposed images, mid tones from the normally exposed images, and shadows from the overexposed images, to create a composite HDR image that retains massive detail in the highlights and shadows where normal cameras would lose detail.
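For illustration only, the following minimal Python sketch shows the general flavor of this conventional bracket-and-merge step, assuming three already captured 8-bit frames held as NumPy arrays; the function name and weighting scheme are illustrative assumptions, not a method disclosed in this application.

import numpy as np

def naive_bracket_composite(under, normal, over):
    # Blend three exposures, favoring each frame where it is best exposed.
    stack = [img.astype(np.float64) / 255.0 for img in (under, normal, over)]
    # Weight each pixel by its closeness to mid-gray in that frame, so highlights
    # come mostly from the underexposed frame and shadows from the overexposed one.
    weights = [np.exp(-((img - 0.5) ** 2) / (2 * 0.2 ** 2)) for img in stack]
    total = np.sum(weights, axis=0) + 1e-8
    fused = sum(w * img for w, img in zip(weights, stack)) / total
    return (fused * 255).astype(np.uint8)

If the three frames are captured one after another, any subject motion between them appears directly in the fused result, which is the ghosting problem discussed in the following paragraphs.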
[0014] Both of these techniques have substantial disadvantages. The second technique can be done with conventional hardware, but it is time consuming and takes substantial expertise to pull off. In addition, because the images are not temporally aligned, meaning they were taken one after another at different moments in time, there can be changes in the scene that produce artifacts when the HDRI software attempts to eliminate or synthesize the objects in motion across the frame. An example would be a car moving through the frame.
[0015] Even a slight movement of the camera between exposures will be noticeable in the resulting combined image. Moving objects will be "ghosted" in the HDRI image. As such this technique is totally useless for motion photography and can only be used with substantial success in still photography applications.
[0016] For this reason, exposure bracketed HDRI is typically restricted to still subjects, and any animals, cars, pedestrians, moving leaves or litter, clouds, etc., in fact anything that is shifting within the frame will preclude HDRI, or at the very least lead to unhappy results.
[0017] Further, producing HDRI from multiple images can be a time consuming and frustrating task. HDRI requires multiple, huge files, multiple steps, and typically specialized and complicated software.
[0018] The first technique is very expensive and requires exotic hardware or sophisticated electronic and software systems. While imaging chips are moving ever forward in sensitivity and dynamic range, they still do not produce the dramatic results that the first technique of changing exposures does. In addition, these special cameras are not capable of shooting higher frame rates required to shoot motion pictures. These products are used for narrow specialized purposes.
[0019] Proposed solutions to the problems associated with the second technique are reflected in various published patents at the United States Patent and Trademark Office. For example, United States Patent Application No. 20060221209, to McGuire, et al., published October 5, 2006, teaches an apparatus and method for acquiring and combining images of a scene with multiple optical characteristics at multiple resolutions. Disclosed therein is a camera system that acquires multiple optical characteristics at multiple resolutions of a scene. The camera system includes multiple optical elements arranged as a tree having multiple nodes connected by edges. The system employs filters at the end of the chain, and lenses are placed in front of each of the sensors, creating additional sources of optical distortion.
[0020] United States Patent Application No. 20070126918, to Lee, published June 7, 2007, discloses cameras that can provide improved images by combining several shots of a scene taken with different exposure and focus levels. In addition, cameras are provided which have pixel-wise exposure control means, so that high quality images are obtained for a scene with a high level of contrast. The system is complicated, and employs light reducing filters to create exposures of varying intensity. Much of the light is lost, reducing clarity and introducing sources of distortion and noise to the images.
[0021] United States Patent Application No. 20080149812, to Ward, et al., published June 26, 2008, discloses an electronic camera comprising two or more image sensor arrays. At least one of the image sensor arrays has a high dynamic range. The camera also comprises a shutter for selectively allowing light to reach the two or more image sensor arrays, readout circuitry for selectively reading out pixel data from the image sensor arrays, and a controller configured to control the shutter and the readout circuitry. The controller comprises a processor and a memory having computer-readable code embodied therein which, when executed by the processor, causes the controller to open the shutter for an image capture period to allow the two or more image sensor arrays to capture pixel data, and read out pixel data from the two or more image sensor arrays. This is essentially a totally digital solution to the problem of controlling exposure levels for different images for high dynamic range processing.
[0022] Finally, United States Patent Application No. 20070177004, to Kolehmainen, et al., published August 2, 2007, is directed to an image creating method and imaging device comprising at least two image capturing apparatus, each apparatus being arranged to produce an image. The apparatus is configured to utilize at least a portion of the images produced with different image capturing apparatus with each other to produce an image with an enhanced image quality. Multiple lenses are required to implement this method, which is expensive and creates parallax and optic imagery distortions with each lens addition.
[0023] None of the prior approaches has been able to provide a simple means for capturing multiple images that overcomes the difficulties of temporal misalignment and that is simply and quickly resolved into a high dynamic range image.
[0024] What is needed is an inexpensive solution that can be easily integrated into products with conventional form factors. This solution would ideally be easy to use, compact, and able to shoot at high frame rates with no introduction of temporal alignment problems and associated artifacts. SUMMARY OF THE INVENTION
[0025] By this invention is provided an optical imaging system for temporally aligning bracketed exposures of a single image, the system comprising a light aperture, a prism and an image capturing device, where the prism is capable of splitting an incoming image from the light aperture into at least two temporally aligned images, and where the image capturing device captures the temporally aligned images at different levels of exposure.
[0026] In one embodiment of the invention, the prism splits the intensity of said incoming image to achieve a desired EV output interval between temporally aligned images.
[0027] In a different embodiment, the capturing device further comprises image detection sensors, and the ISO of the sensors is adjusted to achieve a desired EV output interval between said images.
[0028] In another aspect of the invention, the system comprises an image processing device connected to said image capturing device.
[0029] In one embodiment, the image processing device comprises a computer processor.
[0030] In a different embodiment, the device further comprises a tone-mapping processor.
[0031] In a different aspect, the system comprises an eyepiece for viewing the image to be captured by the lens.
[0032] In a still further aspect, the system comprises a digital readout monitor.
[0033] In another embodiment, the prism is capable of splitting the image into three or more levels of exposure.
[0034] In a different embodiment, the three levels of exposure are about 14%, about 29% and about 57%, respectively, of the exposure level of the original image.
[0035] In a different embodiment, the three levels of exposure are about 5%, about 19% and about 76%, respectively, of the exposure level of the original image.
[0036] In a different embodiment, the three levels of exposure are about 1 %, about 11% and about 88%, respectively, of the exposure level of the original image.
[0037] In a still different embodiment, the prism is capable of splitting the image into four or more levels of exposure.
[0038] In another embodiment, the prism is capable of splitting the image into five or more levels of exposure.
[0039] In a different aspect, the invention provides a method for temporally aligning bracketed exposures of a single image, the method comprising the steps of a) using a prism to split an incoming image from a light aperture into at least two temporally aligned images, and b) using an image capturing device to capture the temporally aligned images at different levels of exposure.
[0040] These and other features and advantages of this invention are described in, or are apparent from, the following detailed description of various exemplary embodiments of the apparatus and methods according to this invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0041] A more complete understanding of the present invention and the attendant features and advantages thereof may be had by reference to the following detailed description when considered in conjunction with the accompanying drawings wherein:
[0042] Figure 1 shows a diagrammatic view of the system produced according to the invention, demonstrating how variations to exposure intervals may be achieved using different combinations of prism splits and sensor sensitivity settings.
[0043] Figure 2 shows a diagrammatic view of the system of Figure 1, further showing additional components of the system for processing the images.
[0044] Figure 3 shows a perspective drawing of a two-way prism that could be utilized with the invention.
[0045] Figure 4 shows a perspective drawing of a three-way prism that could be utilized with the invention.
[0046] Figure 5 shows a perspective drawing of a four-way prism that could be utilized with the invention.
[0047] Figure 6 shows a perspective drawing of a five-way prism that could be utilized with the invention.
DETAILED DESCRIPTION OF THE INVENTION [0048] The optical imaging system of the present invention provides an improvement to high dynamic range imaging, and assemblies therefor, that allows temporally aligned exposure bracketing. The system is simple, elegant, leverages existing technologies, allows for motion capture with no temporal distortion, and is relatively inexpensive to implement.
[0049] The present optical imaging system allows the user to capture light with confidence that the under and over exposed regions in the image will be imaged properly. The user simply captures all the available light with an image capturing device, and determines later how to map that information to the output device. With the optical imaging system the user can create stunning imagery that is otherwise impossible to capture, even with the most sophisticated of the current generation of normal photography equipment.
[0050] Before the present invention is described in greater detail, it is to be understood that this invention is not limited to particular embodiments described, as such may, of course, vary. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only, and is not intended to be limiting, since the scope of the present invention will be limited only by the appended claims.
[0051] Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Although any materials similar or equivalent to those described herein can also be used in the practice or testing of the present invention, the preferred materials are now described.
[0052] All publications and patents cited in this specification are herein incorporated by reference as if each individual publication or patent were specifically and individually indicated to be incorporated by reference and are incorporated herein by reference to disclose and describe the materials in connection with which the publications are cited. The citation of any publication is for its disclosure prior to the filing date and should not be construed as an admission that the present invention is not entitled to antedate such publication by virtue of prior invention. Further, the dates of publication provided may be different from the actual publication dates which may need to be independently confirmed.
[0053] It must be noted that as used herein and in the appended claims, the singular forms "a," "an", and "the" include plural referents unless the context clearly dictates otherwise. It is further noted that the claims may be drafted to exclude any optional element. As such, this statement is intended to serve as antecedent basis for use of such exclusive terminology as "solely," "only" and the like in connection with the recitation of claim elements, or use of a "negative" limitation.
[0054] As will be apparent to those of skill in the art upon reading this disclosure, each of the individual embodiments described and illustrated herein has discrete components and features which may be readily separated from or combined with the features of any of the other several embodiments without departing from the scope or spirit of the present invention.
[0055] For example, although the foregoing drawings and references refer to color images and processors, the system and methods work equally well for black and white (grayscale) images and sensors. For instance, some applications for scientific or industrial use may prefer grayscale imagery.
[0056] Further, while unusual in present day camera art, it is possible to build an imaging apparatus without a primary lens (i.e., a pinhole camera or a slit scanner). These applications are more likely in industrial or scientific applications. The invention can easily be adapted for designs that don't include a front end lens, but rather a simple aperture or the like.
[0057] Generally speaking, the systems and methods utilize a prism to split full spectrum brackets onto several image detecting sensors of an image capturing device. The system eliminates exotic image sensors as a necessary feature. The system allows multiple exposures to be taken from existing commodity sensors simultaneously by simply dividing the incoming light for an image into multiple, different levels of exposure for the same image.
[0058] The temporally aligned imaging system can be analogized to Technicolor. Before color film stock was developed, Hollywood was in search of a way to shoot films in color. Technicolor, Inc. was the first company to develop a way to create color pictures from black and white film stock. It utilized three rolls of black and white film exposed simultaneously through a special set of beam splitters with red, green, and blue filters on them.
[0059] Simply put, each black and white film negative recorded just the red, green, or blue information. This process was done in reverse with a projector that ran all three rolls of film simultaneously with the correct color filter in front of each. When the images are aligned properly, a full color picture is realized.
[0060] As better color film stocks emerged, this process fell out of favor, until video cameras emerged. In the early days of video, color sensors were not very sharp, and had difficulty producing high resolution images, or good color saturation and reproduction. Black and white sensors were far sharper and had a higher dynamic range. So the Technicolor principle of using three image sensors and a beam splitter to feed each an identical simultaneous image was dusted off and put into use for a new generation of imaging products. Three black and white CCDs were used with a new and vastly improved beam splitter called a trichroic prism.
[0061] This technique is used to this day in professional level video cameras, sometimes referred to as 3CCD sensor cameras. The three red, green and blue sensors not only allow for sharper, more saturated colors but also help enhance the dynamic range of the images they help create. But just as better color film stocks helped to usher out the era of the Technicolor process, better CMOS and CCD sensors are ushering out the era of 3CCD sensor systems in favor of full color single sensor systems. In fact some of the highest end professional cameras, like the lineup from RED Digital Cinema Camera Company as well as every professional digital SLR, use only one full color sensor. It is quite apparent that sensor technology has progressed to the point where a single color sensor can replace and even outperform 3CCD sensor systems.
[0062] In one aspect, the temporally aligned exposure bracketing system employs trichroic prisms adapted to split the entire spectrum to each of multiple full color sensors, at different exposure levels, rather than splitting out the spectrum into different colors.
[0063] The system allows a color neutral change in the amount, rather than the spectrum, of light going to each sensor, by the application of such prisms for the temporal alignment of images for HDRI. By "color neutral", it is meant that while the temporally aligned images created by the prism may vary in intensity between themselves, or between themselves and the incoming image, they are not substantially different from one another in color spectrum, i.e., the prism creates split images that are similar in color spectrum, or spectrally neutral, even if differing substantially in intensity.
[0064] All of the commonly understood color separation prism layouts may also be used for neutral separation. In reference now to Figure 1, the system 10 comprises an optical imaging system having an aperture 20 for capturing incoming light 30. Internal to the system is a neutral prism 100 that is used to reflect the captured light to generate a color-neutral separation.
[0065] In Figure 1, the neutral film prism 100 is depicted as a three-way prism that splits the light to three separate full color image sensors 101, 102 and 103. Various means can be employed to adjust the EV (Exposure Value, commonly referred to as a "stop") up and down with the intensity split, and a camera can then capture the images simultaneously. In Figure 1, two consecutive neutral films 104 and 105 are used, the first capturing 57.1429% (4/7) of the light and the second capturing 33.33% (1/3) of the remaining light. The neutral prism thus fractionates a captured image into three temporally aligned exposures 106, 107 and 108 that have relative light intensities of 1/7, 2/7 and 4/7 of the incoming light.
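By way of illustration only, the split arithmetic just described can be checked with the following short Python sketch, which assumes an ideal, lossless prism in which each neutral film takes a fixed fraction of the light reaching it and passes the rest to the next film; the function name is illustrative rather than part of the disclosure.

from fractions import Fraction

def cascade_split(film_fractions):
    # Fraction of the incoming light delivered to each sensor in turn.
    remaining = Fraction(1)
    outputs = []
    for f in film_fractions:
        outputs.append(remaining * f)   # light taken off at this neutral film
        remaining *= (1 - f)            # light passed on to the next film
    outputs.append(remaining)           # the last sensor receives what is left
    return outputs

print(cascade_split([Fraction(4, 7), Fraction(1, 3)]))
# -> [Fraction(4, 7), Fraction(1, 7), Fraction(2, 7)], i.e. the 4/7, 1/7 and 2/7
#    (equivalently 1/7, 2/7, 4/7) exposures described in paragraph [0065].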
[0066] The film coatings 104 and 105 for the prism 100 may be of any of numerous coatings known to the art and capable of achieving a color neutral split, or separation, of the image, by reflection of the incoming light 30. Two examples of such spectrally neutral films include a thin film metallic coating, typically aluminum or silver, with or without a set of dielectric layers, and a set of dielectric layers consisting of high and low refractive index materials with the thin film stack designed to reflect a certain percentage of the incident light over the visible wavelength range. These and related types of thin film coatings 104 and 105 shall be termed "spectrally neutral film" or, alternatively, "neutral film."
[0067] The following table provides a demonstration for calculating the percentages for such a system, using a prism for splitting a captured image into temporally aligned exposures 106, 107 and 108 at levels of 14.2857%, 28.5714% and 57.1429%, respectively. Table 1
[Table 1: image not reproduced in text]
[0068] Thus, with color image sensors that do not need the RGB color split, the prism is harnessed for the purpose of splitting out different exposures of the same image, that are temporally aligned (taken at the same moment).
[0069] Various means can be employed to adjust the EV (Exposure Value, commonly referred to as a "stop") up and down with the intensity spectrum that would allow a camera to capture the images simultaneously. For instance, this can be accomplished by splitting the incoming light into different intensities directly in the prism, adjusting the ISO sensitivity in the sensors or some combination of the two.
[0070] At one extreme, the system could split the light intensity in the prism 100 into equal amounts of roughly 33% each and then adjust the ISO of the sensors 101, 102 and 103 respectively to achieve different EV output intervals. At another extreme, the system could split the light intensity within the prism 100 into the desired EV intervals for the light 106, 107 and 108. Thus, even while leaving the ISO of the sensors the same, the desired different EV output intervals are achieved for the recorded images. Any combination between these two extremes may be more or less desirable for various applications.
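As a rough illustration of these two extremes, the following Python sketch compares the relative exposure, in stops, of the three sensors under each approach; the ISO values and the assumption that one stop corresponds to a factor of two in light or in sensitivity are illustrative, not prescribed by the invention.

import math

def exposure_stops(light_fraction, iso, base_iso=100):
    # Relative exposure of one sensor, in stops; larger means brighter.
    return math.log2(light_fraction) + math.log2(iso / base_iso)

# Extreme 1: the prism splits the light equally and the ISO supplies the bracket.
equal_split = [exposure_stops(1/3, iso) for iso in (25, 100, 400)]

# Extreme 2: the prism supplies the bracket directly and the ISO is left alone.
prism_split = [exposure_stops(f, 100) for f in (1/7, 2/7, 4/7)]

print([round(s, 2) for s in equal_split])   # spaced about 2 EV apart
print([round(s, 2) for s in prism_split])   # spaced about 1 EV apart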
[0071] Figure 2 illustrates some additional components of a system 10. In Figure 2 is seen the deployment of a tone mapping processor 110 and an HDRI processor 120 that are used for combining the images. One processing chip is used to combine the three images in real time into an HDRI image, and another chip is used to complete the tone mapping. These functions can also be combined into a single processing chip.
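In software terms, the work of the HDRI processor 120 and the tone mapping processor 110 could resemble the following minimal Python sketch; this is an illustrative assumption about one possible implementation, not a description of the actual processing chips.

import numpy as np

def merge_to_hdr(frames, light_fractions):
    # Estimate scene radiance from frames captured at known fractions of the light.
    acc = np.zeros(frames[0].shape, dtype=np.float32)
    wsum = np.zeros_like(acc)
    for frame, frac in zip(frames, light_fractions):
        f = frame.astype(np.float32) / 255.0
        w = 1.0 - np.abs(f - 0.5) * 2.0          # trust mid-range pixels the most
        acc += w * (f / frac)                     # rescale back to full-light units
        wsum += w
    return acc / np.maximum(wsum, 1e-6)           # 32-bit HDR radiance estimate

def tone_map(hdr):
    # Simple global (Reinhard-style) operator compressing the estimate to 8 bits.
    mapped = hdr / (1.0 + hdr)
    return (np.clip(mapped, 0.0, 1.0) * 255).astype(np.uint8)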
[0072] Systems for controlling the action of the lens and associated hardware, including light responsive software controllers, are well known to the art.
[0073] In addition, the individual sensors could benefit from some tuning for their respective exposure levels to reduce noise and other artifacts associated with under and over exposure, in ways known to the art.
[0074] A high quality standard camera lens 140 can be used with the system 10 to gather and focus light from the light aperture.
[0075] The system 10 also will typically include an eyepiece and/or monitor 150 for aligning the images for capture from the lens onto the sensors.
[0076] Additional features of the system typically would include mass storage for either the 8 bit tone mapped data 160, or the raw 32 bit HDRI data 170. Other HDRI formats are known, for instance 16 bit and 14 bit formats, though the standard is evolving toward the higher 32 bit format.
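A trivial Python sketch of this storage split, assuming the merged data from the previous sketch and using NumPy's own file format purely for illustration (a production system would more likely write an HDR container such as OpenEXR):

import numpy as np

hdr = np.random.rand(1080, 1920, 3).astype(np.float32)   # stand-in 32 bit HDR data
np.save("frame_0001_hdr32.npy", hdr)                      # raw 32 bit HDRI data 170
np.save("frame_0001_tone8.npy",                           # 8 bit tone mapped data 160
        (np.clip(hdr / (1.0 + hdr), 0.0, 1.0) * 255).astype(np.uint8))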
[0077] The ISO is a function of how sensitive the sensor or film is to light. The exposure generated by a particular aperture, shutter speed, and sensitivity combination can be represented by its exposure value "EV". Zero EV is defined by the combination of an aperture of f/1 and a shutter speed of 1 s at ISO 100.
[0078] The term "exposure value" is used to represent shutter speed and aperture combinations only. An exposure value which takes into account the ISO sensitivity is called "Light Value" or LV and represents the luminance of the scene. For the sake of simplicity, as is the case in this patent, Light Value is often referred to as "exposure value", grouping aperture, shutter speed and sensitivity in one familiar variable. This is because in a digital camera it is as easy to change sensitivity as it is to change aperture and shutter speed.
[0079] Each time the amount of light collected by the sensor is halved (e.g., by doubling the shutter speed or by halving the aperture area), the EV will increase by 1. For instance, 6 EV represents half the amount of light as 5 EV.
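The following short Python sketch works through this arithmetic using the standard definition EV = log2(N^2 / t) at ISO 100, where N is the f-number and t is the shutter time in seconds; applying it to the prism split of Figure 1 is an illustration added here, not part of the original disclosure.

import math

def ev(f_number, shutter_s):
    return math.log2(f_number ** 2 / shutter_s)

print(ev(1.0, 1.0))   # 0.0 -> the 0 EV reference point (f/1, 1 s, ISO 100)
print(ev(1.0, 0.5))   # 1.0 -> half the light collected, one stop higher

# EV offsets produced purely by the 1/7, 2/7 and 4/7 prism split of Figure 1
# (less light reaching a sensor corresponds to a larger EV offset):
offsets = [-math.log2(frac) for frac in (1/7, 2/7, 4/7)]
print([round(o, 2) for o in offsets])   # about 2.81, 1.81, 0.81 -> 1 EV apart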
[0080] Table 2 shows the additional variations possible for adjusting output intervals on top of the prismatic split, for +/- 3EV, +/- 2EV and +/- 1EV. Table 2
[Table 2: image not reproduced in text]
[0081] The various exposure intervals can be modified or enhanced by using different combinations of prism splits with sensor sensitivity settings. This is accomplished by using differential exposure values (EV) to amplify the differences created by the prismatic split at the level of the sensors.
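For illustration, the sketch below shows how ISO tables of this kind can be generated: given the fraction of light a sensor receives from the prism and the final relative exposure wanted for that sensor, it computes the ISO setting that supplies the difference. The base ISO of 100 and the chosen +/- 3EV target are assumptions for the example and are not copied from Tables 2, 4 or 6.

import math

def iso_for_sensor(light_fraction, target_stops, base_iso=100):
    # ISO needed so that (prism split + ISO gain) lands on target_stops,
    # where stops measure relative exposure and larger means brighter.
    prism_stops = math.log2(light_fraction)
    return base_iso * 2 ** (target_stops - prism_stops)

# A 1/7, 2/7, 4/7 split pushed out to a +/- 3 EV bracket around the middle sensor:
middle = math.log2(2 / 7)
for frac, target in zip((1/7, 2/7, 4/7), (middle - 3, middle, middle + 3)):
    print(round(iso_for_sensor(frac, target)))   # 25, 100, 400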
[0082] Table 3 shows results for a system produced according to the invention as shown in Figures 1 and 2, but deploying a prism with two splits of light 104 and 105 corresponding to 76.1905% (16/21) followed by 20.00% (1/5) of the remaining light. This is used for splitting a captured image into temporally aligned exposures 106, 107 and 108 at levels of 76.1905%, 19.0476% and 4.7619%, respectively. Table 3
[Table 3: image not reproduced in text]
[0083] Table 4 shows the variations to exposure intervals obtained using different combinations of prism splits and sensor sensitivity settings of +/- 3EV, +/- 2EV and +/- 1EV. Table 4 lists the various ISO settings for each sensor used to produce alternative EV output intervals on top of the prism split found in Table 3. Table 4
[Table 4: image not reproduced in text]
[0084] Table 5 shows the results for a system produced according to the invention as depicted in Figures 1 and 2, but showing a prism with two splits of light 104 and 105 corresponding to 87.6712% (64/73) followed by 11.11% (1/9) of the remaining light. This is used for splitting a captured image into temporally aligned exposures 106, 107 and 108 at levels of 87.6712%, 10.9589% and 1.3699%, respectively. Table 5
[Table 5: image not reproduced in text; the recoverable portion gives the neutral film values:]
neutral film percent    neutral film ratio
87.6712%                64/73
11.1111%                1/9
[0085] Table 6 shows the settings for a system configured for the Table 5 percentages, where variations to exposure intervals are shown using different combinations of prism splits and sensor sensitivity settings of +/- 3EV, +/- 2EV and +/- 1EV. Table 6
[Table 6: image not reproduced in text]
[0086] The system depicted in Figures 1 and 2, and through Tables 1 through 6, exemplifies a wide range of exposure levels that can be achieved, but is not exhaustive by any means. These are intended as examples only, and even more possibilities exist, including narrower or greater exposure ranges and other configurations of prism splits combined with sensor sensitivity settings.
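One way to see where the split percentages of Tables 1, 3 and 5 come from, offered purely as an illustration and not as a limitation of the disclosure, is the following Python sketch: for a desired spacing in stops between sensors, each sensor's share is proportional to a power of two, and the consecutive neutral film fractions follow from the cascade shown earlier.

from fractions import Fraction

def shares(n_sensors, spacing_stops):
    # Fraction of incoming light per sensor for a fixed EV spacing between sensors.
    weights = [Fraction(2) ** (spacing_stops * k) for k in range(n_sensors)]
    total = sum(weights)
    return [w / total for w in weights]

for spacing in (1, 2, 3):
    print(spacing, "EV apart:", shares(3, spacing))
# 1 EV apart: 1/7, 2/7, 4/7      (Table 1: about 14.3%, 28.6%, 57.1%)
# 2 EV apart: 1/21, 4/21, 16/21  (Table 3: about 4.8%, 19.0%, 76.2%)
# 3 EV apart: 1/73, 8/73, 64/73  (Table 5: about 1.4%, 11.0%, 87.7%)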
[0087] Further, while the use of a three-way prism is demonstrated in Figures 1 and 2, other neutral prism configurations could be utilized. Figures 3 through 6 demonstrate two-way, three-way, four-way and five-way neutral prism configurations, respectively.
[0088] Use of different prism splits will be desirable for different applications. In a very minimal configuration a two-way configuration could work (Figure 3), although not as well for some applications. However, a two-way neutral prism likely represents the least expensive implementation of the device, and may well be used in consumer versions of many products for the cost savings.
[0089] On the other hand, in some scientific or professional applications, the greater control from more elaborate splits possible from the four-way and five-way neutral prism splits shown in Figures 5 and 6 may be desired.
[0090] While this invention has been described in conjunction with the specific embodiments outlined above, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, the preferred embodiments of the invention, as set forth above, are intended to be illustrative, not limiting. Various changes may be made without departing from the spirit and scope of this invention.

Claims

What is claimed is: 1. An optical imaging system for temporally aligning bracketed exposures of a single image, said system comprising a light aperture, a prism and an image capturing device, wherein said prism is capable of splitting an incoming image from said light aperture into at least two color neutral, temporally aligned images, whereby said image capturing device captures said temporally aligned images at different levels of exposure.
2. The system of Claim 1 wherein said prism splits the intensity of said incoming image to achieve a desired EV output interval between said temporally aligned images.
3. The system of Claim 1 wherein said image capturing device further comprises image detection sensors for said temporally aligned images.
4. The system of Claim 3 wherein the ISO of said sensors is adjusted to achieve a desired EV output interval between said images.
5. The system of Claim 1 wherein said prism further comprises at least one neutral film coating.
6. The system of Claim 1 further comprising an image processing device connected to said image capturing device.
7. The system of Claim 6 wherein said image processing device comprises a computer processor.
8. The system of Claim 7 further comprising a tone-mapping processor.
9. The system of Claim 8 wherein the image processing device and tone-mapping processor are contained on a single integrated circuit.
10. The system of Claim 1 further comprising a digital readout monitor.
11. The system of Claim 1 further comprising a lens associated with said aperture.
12. The system of Claim 1 further comprising an eyepiece for viewing said incoming image.
13. The system of Claim 1 wherein said prism is capable of splitting said image into at least three temporally aligned images having different levels of exposure.
14. The system of Claim 13 wherein said three levels of exposure are about 14%, about 29% and about 57%, respectively, of the intensity of said incoming image.
15. The system of Claim 13 wherein said three levels of exposure are about 5%, about 19% and about 76%, respectively, of the intensity of said incoming image.
16. The system of Claim 13 wherein said three levels of exposure are about 1%, about 11% and about 88%, respectively, of the intensity of said incoming image.
17. The system of Claim 1 wherein said prism is capable of splitting said image into at least four temporally aligned images having different levels of exposure.
18. The system of Claim 1 wherein said prism is capable of splitting said image into at least five temporally aligned images having different levels of exposure.
19. A method for temporally aligning bracketed exposures of a single image, said method comprising the steps of a) using a prism to split an incoming image from a light aperture into at least two temporally aligned images, and b) using an image capturing device to capture said temporally aligned images at different levels of exposure, wherein said prism produces a color neutral split of said temporally aligned images.
20. The method of Claim 19 wherein said prism splits the intensity of said incoming image to achieve a desired EV output interval between said temporally aligned images.
21. The method of Claim 19 wherein said image capturing device further comprises image detection sensors for said temporally aligned images.
22. The method of Claim 21 wherein the ISO of said sensors is adjusted to achieve a desired EV output interval between said images.
23. The method of Claim 19 wherein said prism is capable of splitting said image into at least three temporally aligned images having different levels of exposure.
24. The method of Claim 23 wherein said three levels of exposure are about 14%, about 29% and about 57%, respectively, of the intensity of said incoming image.
25. The method of Claim 23 wherein said three levels of exposure are about 5%, about 19% and about 76%, respectively, of the intensity of said incoming image.
26. The method of Claim 23 wherein said three levels of exposure are about 1%, about 11% and about 88%, respectively, of the intensity of said incoming image.
27. The method of Claim 19 wherein said prism is capable of splitting said image into at least four temporally aligned images having different levels of exposure.
28. The method of Claim 19 wherein said prism is capable of splitting said image into at least five temporally aligned images having different levels of exposure.
PCT/US2010/026250 2009-03-04 2010-03-04 Temporally aligned exposure bracketing for high dynamic range imaging WO2010102135A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU2010221241A AU2010221241A1 (en) 2009-03-04 2010-03-04 Temporally aligned exposure bracketing for high dynamic range imaging
EP10749347A EP2404209A4 (en) 2009-03-04 2010-03-04 Temporally aligned exposure bracketing for high dynamic range imaging

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US15749409P 2009-03-04 2009-03-04
US61/157,494 2009-03-04

Publications (1)

Publication Number Publication Date
WO2010102135A1 true WO2010102135A1 (en) 2010-09-10

Family

ID=42677914

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2010/026250 WO2010102135A1 (en) 2009-03-04 2010-03-04 Temporally aligned exposure bracketing for high dynamic range imaging

Country Status (5)

Country Link
US (3) US20100225783A1 (en)
EP (1) EP2404209A4 (en)
KR (1) KR20120073159A (en)
AU (1) AU2010221241A1 (en)
WO (1) WO2010102135A1 (en)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9077910B2 (en) 2011-04-06 2015-07-07 Dolby Laboratories Licensing Corporation Multi-field CCD capture for HDR imaging
DE102012009151B4 (en) * 2012-05-08 2018-01-18 Steinbichler Optotechnik Gmbh Method and apparatus for detecting an intensity-modulated optical radiation field
US9245348B2 (en) 2012-06-15 2016-01-26 Microsoft Technology Licensing, Llc Determining a maximum inscribed size of a rectangle
CN105992944B (en) * 2014-02-17 2020-03-17 伊顿有限公司 Oxygen sensor comprising a large-diameter optical fibre whose tip is coated
US20150369565A1 (en) * 2014-06-20 2015-12-24 Matthew Flint Kepler Optical Device Having a Light Separation Element
US10277771B1 (en) * 2014-08-21 2019-04-30 Oliver Markus Haynold Floating-point camera
US20160205291A1 (en) * 2015-01-09 2016-07-14 PathPartner Technology Consulting Pvt. Ltd. System and Method for Minimizing Motion Artifacts During the Fusion of an Image Bracket Based On Preview Frame Analysis
TWI594635B (en) * 2016-01-14 2017-08-01 瑞昱半導體股份有限公司 Method for generating target gain value of wide dynamic range operation
US10257393B2 (en) * 2016-02-12 2019-04-09 Contrast, Inc. Devices and methods for high dynamic range video
EP3414890B1 (en) * 2016-02-12 2023-08-09 Contrast, Inc. Devices and methods for high dynamic range video
US10264196B2 (en) 2016-02-12 2019-04-16 Contrast, Inc. Systems and methods for HDR video capture with a mobile device
CN106060351B (en) * 2016-06-29 2020-07-24 联想(北京)有限公司 Image processing apparatus and image processing method
US9979906B2 (en) 2016-08-03 2018-05-22 Waymo Llc Beam split extended dynamic range image capture system
WO2018031441A1 (en) 2016-08-09 2018-02-15 Contrast, Inc. Real-time hdr video for vehicle control
WO2019014057A1 (en) 2017-07-10 2019-01-17 Contrast, Inc. Stereoscopic camera
US10951888B2 (en) 2018-06-04 2021-03-16 Contrast, Inc. Compressed high dynamic range video
JP7287676B2 (en) * 2020-04-17 2023-06-06 i-PRO株式会社 3-plate camera and 4-plate camera
EP4002833B1 (en) * 2020-11-17 2023-02-15 Axis AB Method and electronic device for increased dynamic range of an image
EP4125267A1 (en) * 2021-07-29 2023-02-01 Koninklijke Philips N.V. An image sensing system


Family Cites Families (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4672196A (en) * 1984-02-02 1987-06-09 Canino Lawrence S Method and apparatus for measuring properties of thin materials using polarized light
WO1992008322A1 (en) * 1990-10-30 1992-05-14 Simco/Ramic Corporation Color line scan video camera for inspection system
US5309243A (en) * 1992-06-10 1994-05-03 Eastman Kodak Company Method and apparatus for extending the dynamic range of an electronic imaging system
US6204881B1 (en) * 1993-10-10 2001-03-20 Canon Kabushiki Kaisha Image data processing apparatus which can combine a plurality of images at different exposures into an image with a wider dynamic range
US5801773A (en) * 1993-10-29 1998-09-01 Canon Kabushiki Kaisha Image data processing apparatus for processing combined image signals in order to extend dynamic range
JPH1198418A (en) * 1997-09-24 1999-04-09 Toyota Central Res & Dev Lab Inc Image pickup device
US6529640B1 (en) * 1998-06-09 2003-03-04 Nikon Corporation Image processing apparatus
US6864916B1 (en) * 1999-06-04 2005-03-08 The Trustees Of Columbia University In The City Of New York Apparatus and method for high dynamic range imaging using spatially varying exposures
JP2001157109A (en) * 1999-11-24 2001-06-08 Nikon Corp Electronic camera and recording medium for image data processing
US7084905B1 (en) * 2000-02-23 2006-08-01 The Trustees Of Columbia University In The City Of New York Method and apparatus for obtaining high dynamic range images
WO2002085000A1 (en) * 2001-04-13 2002-10-24 The Trustees Of Columbia University In The City Of New York Method and apparatus for recording a sequence of images using a moving optical element
EP1271935A1 (en) * 2001-06-29 2003-01-02 Kappa opto-electronics GmbH Apparatus for taking digital images with two simultaneously controlled image sensors
JP2003270518A (en) * 2002-03-13 2003-09-25 Fuji Photo Optical Co Ltd Autofocus system
AU2003220595A1 (en) * 2002-03-27 2003-10-13 The Trustees Of Columbia University In The City Of New York Imaging method and system
JP4097980B2 (en) * 2002-04-23 2008-06-11 オリンパス株式会社 Image synthesizer
US7362365B1 (en) * 2002-06-26 2008-04-22 Pixim, Inc. Digital image capture having an ultra-high dynamic range
US20040008267A1 (en) * 2002-07-11 2004-01-15 Eastman Kodak Company Method and apparatus for generating images used in extended range image composition
US20040130649A1 (en) * 2003-01-03 2004-07-08 Chulhee Lee Cameras
US6879731B2 (en) * 2003-04-29 2005-04-12 Microsoft Corporation System and process for generating high dynamic range video
US7492375B2 (en) * 2003-11-14 2009-02-17 Microsoft Corporation High dynamic range image viewing on low dynamic range displays
JP4353151B2 (en) * 2004-09-08 2009-10-28 セイコーエプソン株式会社 projector
US20060082692A1 (en) * 2004-10-15 2006-04-20 Seiko Epson Corporation Image display device and projector
US20060221209A1 (en) * 2005-03-29 2006-10-05 Mcguire Morgan Apparatus and method for acquiring and combining images of a scene with multiple optical characteristics at multiple resolutions
JP2007264339A (en) * 2006-03-29 2007-10-11 Seiko Epson Corp Modulating device and projector
US20070177004A1 (en) * 2006-06-08 2007-08-02 Timo Kolehmainen Image creating method and imaging device
US7551359B2 (en) * 2006-09-14 2009-06-23 3M Innovative Properties Company Beam splitter apparatus and system
US8242426B2 (en) * 2006-12-12 2012-08-14 Dolby Laboratories Licensing Corporation Electronic camera having multiple sensors for capturing high dynamic range images and related methods
US8542408B2 (en) * 2006-12-29 2013-09-24 Texas Instruments Incorporated High dynamic range display systems
JP4941285B2 (en) * 2007-02-20 2012-05-30 セイコーエプソン株式会社 Imaging apparatus, imaging system, imaging method, and image processing apparatus
US8441732B2 (en) * 2008-03-28 2013-05-14 Michael D. Tocci Whole beam image splitting system

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6720997B1 (en) * 1997-12-26 2004-04-13 Minolta Co., Ltd. Image generating apparatus
US20070035646A1 (en) * 1999-07-08 2007-02-15 Olympus Corporation Image pickup device and image pickup optical system
US20070189758A1 (en) * 2006-02-14 2007-08-16 Nikon Corporation Camera and method of imaging
US20080180749A1 (en) * 2007-01-25 2008-07-31 Hewlett-Packard Development Company, L.P. Image processing system and method
US20080211941A1 (en) * 2007-03-01 2008-09-04 Deever Aaron T Digital camera using multiple image sensors to provide improved temporal sampling
US20080303927A1 (en) * 2007-06-06 2008-12-11 Arnold & Richter Cine Technik Gmbh & Co. Betriebs Kg Digital motion picture camera with two image sensors
US20090231465A1 (en) * 2008-03-11 2009-09-17 Fujifilm Corporation Image pick-up apparatus, image pick-up method and recording medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2404209A4 *

Also Published As

Publication number Publication date
US10511785B2 (en) 2019-12-17
AU2010221241A1 (en) 2011-10-27
KR20120073159A (en) 2012-07-04
EP2404209A1 (en) 2012-01-11
US20100225783A1 (en) 2010-09-09
US20200084363A1 (en) 2020-03-12
US20150029361A1 (en) 2015-01-29
EP2404209A4 (en) 2012-10-17

Similar Documents

Publication Publication Date Title
US20200084363A1 (en) Temporally aligned exposure bracketing for high dynamic range imaging
AU2017217929B2 (en) Combined HDR/LDR video streaming
JP4826028B2 (en) Electronic camera
KR101265358B1 (en) Method of controlling an action, such as a sharpness modification, using a colour digital image
EP3429189B1 (en) Dual image capture processing
US9681026B2 (en) System and method for lens shading compensation
JP6351903B1 (en) Image processing apparatus, image processing method, and photographing apparatus
EP3349433B1 (en) Control system, imaging device, and program
US7940311B2 (en) Multi-exposure pattern for enhancing dynamic range of images
US8325268B2 (en) Image processing apparatus and photographing apparatus
US8698924B2 (en) Tone mapping for low-light video frame enhancement
CN108712608A (en) Terminal device image pickup method and device
AU2017217833B2 (en) Devices and methods for high dynamic range video
KR20150109177A (en) Photographing apparatus, method for controlling the same, and computer-readable recording medium
TW201813371A (en) Ghost artifact removal system and method
JP2010220207A (en) Image processing apparatus and image processing program
JP5681589B2 (en) Imaging apparatus and image processing method
JP2018023077A (en) Video camera imaging device
JP2014179920A (en) Imaging apparatus, control method thereof, program, and storage medium
WO2016165967A1 (en) Image acquisition method and apparatus
US8334912B2 (en) Image processing apparatus, imaging apparatus, image processing method, and computer readable recording medium storing image processing program
JP2015080157A (en) Image processing device, image processing method and program
JP2015119436A (en) Imaging apparatus
JP2007134777A (en) Imaging apparatus
JPH07131799A (en) Image pickup device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 10749347; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
ENP Entry into the national phase
    Ref document number: 20117023227; Country of ref document: KR; Kind code of ref document: A
WWE Wipo information: entry into national phase
    Ref document number: 2010749347; Country of ref document: EP
ENP Entry into the national phase
    Ref document number: 2010221241; Country of ref document: AU; Date of ref document: 20100304; Kind code of ref document: A