CN109191441B - Image processing method, apparatus, system and storage medium - Google Patents


Publication number
CN109191441B
Authority
CN
China
Prior art keywords
image
light field
image processing
field camera
camera
Prior art date
Legal status
Active
Application number
CN201810987724.8A
Other languages
Chinese (zh)
Other versions
CN109191441A (en)
Inventor
李家豪
尤毅
吴旻烨
邢自然
石志儒
Current Assignee
Yaoke Intelligent Technology Shanghai Co ltd
Original Assignee
Yaoke Intelligent Technology Shanghai Co ltd
Priority date
Filing date
Publication date
Application filed by Yaoke Intelligent Technology Shanghai Co ltd filed Critical Yaoke Intelligent Technology Shanghai Co ltd
Priority to CN201810987724.8A
Publication of CN109191441A
Application granted
Publication of CN109191441B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/73 Deblurring; Sharpening
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery
    • G06T 7/55 Depth or shape recovery from multiple images
    • G06T 7/557 Depth or shape recovery from multiple images from light fields, e.g. from plenoptic cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/97 Determining parameters from multiple pictures

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

According to the image processing method, apparatus, system, and storage medium provided by the invention, a first image set captured by a light field camera array is received and one of the cameras is selected as the standard camera. Each image in the first image set is then projected into the image physical coordinate system of the standard camera according to the internal and external parameters calibrated for each camera, yielding a second image set. The focal planes of the images in the second image set are then uniformly focused onto an arbitrary plane behind the bright surface, yielding a third image set. Finally, a difference calculation over the images produces the finished image. The invention can solve the highlight problem that arises when taking a picture and restore the scene with the highlight removed.

Description

Image processing method, apparatus, system and storage medium
Technical Field
The present invention relates to the field of image processing, and more particularly to an image processing method, apparatus, system, and storage medium.
Background
In daily shooting, people often encounter the following problem: if glass surfaces appear in a scene, such as a vehicle windshield, a glass show window, or eyeglasses, the captured picture often contains a region of very high brightness, so that details of the objects behind it cannot be recovered. Such a region appears in the image as an area of brighter pixel tones, and this is commonly called the highlight problem. Many methods exist for removing highlights, but their results are unsatisfactory: they often fail to clearly restore the object behind the highlight region and still lose detail or texture information. Therefore, there is a need for a device that can capture a scene at least partially free of highlight problems, together with a technique that can effectively remove highlight regions from the picture and efficiently restore the scene occluded by the light in post-processing.
Disclosure of Invention
In view of the above shortcomings of the prior art, it is an object of the present invention to provide an image processing method, apparatus, system, and storage medium that solve the problem of highlights occurring in captured pictures and restore a clear, highlight-free scene.
To achieve the above and other related objects, the present invention provides an image processing method, including: receiving a first image set captured by a light field camera array, any one of whose cameras is selected as the standard camera, where each light field camera in the array is fitted with a polarizer, each camera and its polarizer form a mounting angle, and the mounting angles differ from camera to camera; projecting each image in the first image set into the image physical coordinate system of the standard camera according to the internal and external parameters calibrated for each camera, to obtain a second image set; uniformly focusing the focal plane of each image in the second image set onto an arbitrary plane behind the bright surface, to obtain a third image set; and performing a difference calculation over the images in the third image set to obtain the finished image.
In an embodiment of the present invention, the method for projecting the images in the first image set into the image physical coordinate system corresponding to the standard camera according to the internal parameters and the external parameters calibrated by the cameras includes: converting the image physical coordinates corresponding to each camera into camera coordinates according to the internal parameters calibrated by each camera; converting the camera coordinates corresponding to each camera into a world coordinate system according to the external parameters calibrated by each camera; and projecting the images in the first image set into an image physical coordinate system corresponding to the standard camera according to the corresponding relation of the cameras in the world coordinate system.
In an embodiment of the present invention, the method for focusing to any plane behind the bright surface in a unified manner includes: judging whether a focal plane corresponding to each image in the second image set is parallel to the light field camera array plane; if yes, focusing a focal plane parallel to the light field camera array plane to any plane behind the bright surface; if not, adjusting a focal plane which is not parallel to the light field camera array plane to be parallel to the light field camera array plane, and focusing to any plane behind the bright surface.
In an embodiment of the invention, the method for adjusting a focal plane that is not parallel to the light field camera array plane includes: selecting a point on that focal plane and translating the plane until the point coincides with the origin of the light field camera array; rotating the focal plane until it is parallel to the array plane; and translating the focal plane back.
In an embodiment of the present invention, the difference calculation includes: taking any two images from the refocused images and subtracting their pixel values at each pixel coordinate to obtain a first difference image; comparing the pixel value at each coordinate of the first difference image with a preset threshold, marking it invalid if it is below the threshold and valid otherwise; subtracting, at each pixel coordinate, the first difference image from the minuend of the two images to obtain a second difference image, whose value at coordinates marked valid is that difference and at coordinates marked invalid is the pixel value of the minuend; putting the second difference image back into the third image set and repeating the above steps with any two images; and obtaining the finished image when one image remains in the third image set.
To achieve the above and other related objects, the present invention provides a computer-readable storage medium having stored thereon an image processing program which, when executed by a processor, implements the image processing method.
To achieve the above and other related objects, the present invention provides an image processing apparatus comprising: a communicator, a processor, and a memory; the communicator is in communication connection with an external device; the memory is used for storing an image processing program; the processor runs an image processing program to realize the image processing method.
To achieve the above and other related objects, the present invention provides a light field camera array comprising: a plurality of light field cameras; the plurality of light field cameras form an array according to a certain arrangement mode; each light field camera is provided with a polarizer, an installation angle is arranged between each light field camera and the polarizer installed on the light field camera, and the installation angles among the light field cameras are different.
In an embodiment of the invention, the polarizer is a circular polarizer.
In an embodiment of the present invention, an installation angle between each light field camera and the polarizer installed therein is sequentially decreased from outside to inside according to a certain ratio, and the installation angles are distributed in a range of 0 to 180 degrees.
In an embodiment of the present invention, the included angle between the plane of each light field camera's lens and the bright surface satisfies the Brewster angle range.
To achieve the above and other related objects, the present invention provides an image processing system comprising: the image processing device and the light field camera array; the image processing device is in communication connection with the light field camera array to implement the image processing method.
As described above, according to the image processing method, apparatus, system, and storage medium of the present invention, a first image set captured by a light field camera array is received and one of the cameras is selected as the standard camera; each image in the first image set is projected into the image physical coordinate system of the standard camera according to the internal and external parameters calibrated for each camera, to obtain a second image set; the focal plane of each image in the second image set is uniformly focused onto an arbitrary plane behind the bright surface, to obtain a third image set; and a difference calculation over the images yields the finished image. This has the following beneficial effects:
It solves the highlight problem in captured pictures and clearly restores the scene with the highlight removed.
Drawings
Fig. 1 is a flowchart illustrating an image processing method according to an embodiment of the invention.
FIG. 2 is a diagram of an image processing apparatus according to an embodiment of the present invention.
FIG. 3 is a schematic view of a light field camera array and a scene thereof according to an embodiment of the invention.
Fig. 4 is a schematic structural diagram of each camera and the polarizer mounted thereon according to an embodiment of the invention.
FIG. 5 is a diagram of an image processing system according to an embodiment of the present invention.
Description of the element reference numerals
Method steps S101 to S104
200 image processing apparatus
201 memory
202 processor
203 communicator
300 light field camera array
301 light field camera
401 light field camera
402 polarizer
501 image processing apparatus
502 light field camera array
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention. It is to be noted that the features in the following embodiments and examples may be combined with each other without conflict.
It should be noted that the drawings provided in the following embodiments are only for illustrating the basic idea of the present invention, and the components related to the present invention are only shown in the drawings rather than drawn according to the number, shape and size of the components in actual implementation, and the type, quantity and proportion of the components in actual implementation may be changed freely, and the layout of the components may be more complicated.
As shown in fig. 1, a flow chart of an image processing method according to an embodiment of the invention is shown, the image processing method includes:
step S101: receiving a first set of images captured by a light field camera array, and optionally one of the cameras being a standard camera; each light field camera in the light field camera array is provided with a polarizer, an installation angle is arranged between each light field camera and the polarizer installed on the light field camera, and the installation angles among the light field cameras are different.
In an embodiment of the present invention, angles between each camera and the polarizer mounted thereon are sequentially increased from outside to inside according to a certain ratio, and the angles are distributed in a range of 0 to 180 degrees.
The mounting angle between each light field camera and its polarizer is different, and the angles vary sequentially in a fixed proportion, for example increasing sequentially from outside to inside; the aim is for the cameras to receive polarized light from as many different directions as possible.
For example, a camera at the center of the light field camera array is selected as camera number 0 and the angle of its polarizer is set to 0°. If the light field camera array has n cameras, the mounting angles of the polarizers of the surrounding cameras are set to 180/n°, 180/n × 2°, 180/n × 3°, ... in order of their distance from camera number 0. As a result, the intensity of the polarized light shows a different brightness change in each camera.
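One possible reading of this angle-assignment scheme can be sketched as follows (an illustrative Python sketch, not from the patent; the function name and the use of Euclidean distance to order the cameras are assumptions):

```python
import numpy as np

def polarizer_angles(positions, center_index=0):
    """Assign a polarizer mounting angle to each camera so the angles
    span 0-180 degrees in steps of 180/n, ordered by distance from the
    reference (center) camera, which gets 0 degrees."""
    positions = np.asarray(positions, dtype=float)
    n = len(positions)
    # Rank cameras by distance from the center camera.
    dists = np.linalg.norm(positions - positions[center_index], axis=1)
    order = np.argsort(dists)
    angles = np.empty(n)
    for rank, cam in enumerate(order):
        angles[cam] = 180.0 / n * rank  # center camera (rank 0) gets 0 degrees
    return angles

# 5 cameras: one at the center, four around it
pos = [(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1)]
print(polarizer_angles(pos))  # [0., 36., 72., 108., 144.]
```

With n = 5 the step is 36°, so the five polarizers sample five distinct orientations across the half-turn, which is the stated goal of receiving differently polarized light in each camera.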
Step S102: and respectively projecting the images in the first image set into an image physical coordinate system corresponding to the standard camera according to the internal parameters and the external parameters calibrated by the cameras to obtain a second image set.
In an embodiment of the present invention, the calibrated internal parameters form an internal parameter matrix as follows:
K = \begin{bmatrix} f_x & s & x_0 \\ 0 & f_y & y_0 \\ 0 & 0 & 1 \end{bmatrix}
where K is the internal parameter matrix; fx and fy are the normalized focal lengths on the x-axis and y-axis, respectively; f is the focal length of the camera; dx and dy are the physical length represented by one pixel in the x and y directions, i.e., the actual physical size of a pixel; x0 and y0 are the offsets, in pixels, between the pixel coordinates of the image center and those of the image origin in the horizontal and vertical directions; and s is a skew factor (coordinate-axis inclination parameter), generally 0.
For example, for a camera with focal length 35 mm, highest resolution 4256 × 2832, and sensor size 36.0 × 23.9 mm:
According to this information, the corresponding internal parameters of the camera are: x0 = 4256/2 = 2128, y0 = 2832/2 = 1416, dx = 36.0/4256, dy = 23.9/2832, fx = f/dx ≈ 4137.8, and fy = f/dy ≈ 4147.3.
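The worked example above can be reproduced numerically (an illustrative Python sketch; the variable names are not from the patent):

```python
import numpy as np

# Worked example from the text: f = 35 mm, highest resolution 4256 x 2832,
# sensor size 36.0 x 23.9 mm; the skew s is taken as 0.
f, w_px, h_px, w_mm, h_mm = 35.0, 4256, 2832, 36.0, 23.9

dx, dy = w_mm / w_px, h_mm / h_px   # physical size of one pixel (mm)
fx, fy = f / dx, f / dy             # normalized focal lengths (pixels)
x0, y0 = w_px / 2, h_px / 2         # principal point at the image center

K = np.array([[fx, 0.0, x0],        # internal parameter matrix
              [0.0, fy, y0],
              [0.0, 0.0, 1.0]])
print(round(fx, 1), round(fy, 1))   # 4137.8 4147.3
```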
In an embodiment of the present invention, the external parameter matrix formed by the calibrated external parameters is: [ R T ].
Where R is a rotation matrix representing the number of degrees the image is rotated about the x, y, z axes. E.g. rotation around the x-axis:
R_x(\theta) = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\theta & -\sin\theta \\ 0 & \sin\theta & \cos\theta \end{bmatrix}
T is the translation matrix, representing the distance the image is translated along the x, y, and z axes. For example:
T = \begin{bmatrix} t_x \\ t_y \\ t_z \end{bmatrix}
It should be noted that calibrating the internal and external parameters of the cameras is the process of obtaining the final projection matrix, which involves four basic coordinate systems: the world coordinate system, the camera coordinate system, the image physical coordinate system (imaging-plane coordinate system), and the pixel coordinate system.
In an embodiment of the present invention, the method for projecting the images in the first image set into the image physical coordinate system corresponding to the standard camera according to the internal parameters and the external parameters calibrated by the cameras includes: converting the image physical coordinates corresponding to each camera into camera coordinates according to the internal parameters calibrated by each camera; converting the camera coordinates corresponding to each camera into a world coordinate system according to the external parameters calibrated by each camera; and projecting the images in the first image set into an image physical coordinate system corresponding to the standard camera according to the corresponding relation of the cameras in the world coordinate system.
For example, assuming that camera No. 1 is a standard camera, the transformation formula for projecting the image captured by camera No. m to the image physical coordinate system corresponding to the standard camera is:
d' \begin{bmatrix} x_m' \\ y_m' \\ w' \end{bmatrix} = K_1 \begin{bmatrix} R_1 & T_1 \end{bmatrix} \begin{bmatrix} R_m & T_m \\ \mathbf{0}^\top & 1 \end{bmatrix}^{-1} \begin{bmatrix} d\,K_m^{-1} \begin{bmatrix} x_m \\ y_m \\ 1 \end{bmatrix} \\ 1 \end{bmatrix}
where x_m, y_m are the two-dimensional coordinates in the image physical coordinate system of camera m, and d on the right side of the equation is the depth of the point (x_m, y_m) in the view of camera m; in practice, however, depth information cannot be derived from a two-dimensional image, so it can only be assumed that all points share the same depth value.
d' on the left side of the equation is the depth value after projection, which differs from point to point. x_m', y_m' are the projected plane coordinates, and w' is a normalization parameter.
In an embodiment of the present invention, after the projection by the above formula, all the images are in the same coordinate system for refocusing in the light field rendering.
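Under the single-shared-depth assumption described above, the projection into the standard camera's image physical coordinate system can be sketched as follows (illustrative Python; the function name and the extrinsic convention X_cam = R·X_world + T are assumptions, not from the patent):

```python
import numpy as np

def reproject_to_standard(pts, K_m, R_m, T_m, K_1, R_1, T_1, depth):
    """Project pixel coordinates from camera m into the image physical
    coordinate system of the standard camera (camera No. 1), assuming all
    points share the same depth d. pts: (N, 2) pixel coordinates in camera m."""
    pts_h = np.column_stack([pts, np.ones(len(pts))])      # homogeneous pixels
    # Pixels -> camera-m coordinates at the assumed depth d.
    X_cam_m = depth * (np.linalg.inv(K_m) @ pts_h.T)
    # Camera m -> world, inverting the extrinsics X_cam = R @ X_world + T.
    X_world = np.linalg.inv(R_m) @ (X_cam_m - T_m.reshape(3, 1))
    # World -> standard camera -> pixels; the last component is d' * w'.
    proj = (K_1 @ (R_1 @ X_world + T_1.reshape(3, 1))).T
    return proj[:, :2] / proj[:, 2:3]                      # normalize by w'
```

Applied to every camera's image, this places the whole set in the standard camera's coordinate system, ready for refocusing.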
Step S103: and focusing the focal plane corresponding to each image in the second image set to any plane behind the bright surface in a unified manner to obtain a third image set.
In an embodiment of the invention, the bright surface represents an object plane in which a high-brightness region or a highlight region is formed in a shot scene, such as a front window glass in which the highlight region is formed, or a show window glass surface in which the highlight region is formed.
In an embodiment of the present invention, the method for focusing to any plane behind the bright surface in a unified manner includes: judging whether a focal plane corresponding to each image in the second image set is parallel to the light field camera array plane; if yes, focusing to any plane behind the bright surface; if not, selecting a point on the focal plane which is not parallel to the light field camera array plane, and translating the focal plane to coincide with the light field camera array origin; rotating the focal plane to be parallel to the light field camera array plane; moving back the focal plane; focusing to any plane behind the bright surface.
Refocusing changes the assumed depth during light field processing, thereby changing the actual focusing depth and the virtual camera position, so as to capture as much image information of the desired object as possible. Concretely, corresponding pixels are shifted to remove their relative displacement and are then superimposed to obtain a newly focused image. Refocusing requires all images to be in the same coordinate system.
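This shift-and-superimpose step can be sketched as the classic shift-and-add scheme (illustrative Python; the integer pixel shifts via np.roll and the disparity model baseline/depth are simplifying assumptions, not the patent's exact method):

```python
import numpy as np

def shift_and_add(images, baselines, depth):
    """Refocus a set of aligned light-field views to a chosen depth:
    shift each view by a disparity proportional to its camera baseline
    divided by the focus depth, then average the shifted views.
    images: list of (H, W) arrays; baselines: list of (bx, by) offsets."""
    acc = np.zeros_like(images[0], dtype=float)
    for img, (bx, by) in zip(images, baselines):
        # Disparity shrinks as the focus depth grows.
        sx, sy = int(round(bx / depth)), int(round(by / depth))
        acc += np.roll(np.roll(img, sy, axis=0), sx, axis=1)
    return acc / len(images)
```

Objects at the chosen depth align across views and stay sharp, while objects at other depths are averaged into blur, which is what allows focusing behind the bright surface.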
In an embodiment of the invention, the focusing depth can be changed according to the set focusing depth value under the condition that the focal plane is parallel to the light field camera array plane.
However, when the focal plane is not parallel to the light field camera array plane, it must first be adjusted to be parallel. The adjustment described above can be expressed, in homogeneous coordinates, as the composite transform
\begin{bmatrix} X' \\ Y' \\ Z' \\ 1 \end{bmatrix} = T_{\mathrm{back}}\, R\, T_{\mathrm{origin}} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}
where X, Y, Z are the true coordinates in the world coordinate system, T_{\mathrm{origin}} translates the selected point on the focal plane to the light field camera array origin, R rotates the focal plane parallel to the array plane, and T_{\mathrm{back}} translates it back.
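The translate-rotate-translate adjustment of a tilted focal plane can be sketched as a homogeneous transform (illustrative Python; the function name and the use of Rodrigues' formula to build the rotation are assumptions):

```python
import numpy as np

def align_focal_plane(point_on_plane, plane_normal):
    """Build the 4x4 homogeneous transform that (1) translates a chosen
    point on a tilted focal plane to the array origin, (2) rotates the
    plane normal onto the array's optical axis (z), and (3) translates
    back -- the three-step adjustment described in the text."""
    p = np.asarray(point_on_plane, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)
    z = np.array([0.0, 0.0, 1.0])
    # Rotation taking n onto z (Rodrigues' formula); assumes n != -z.
    v, c = np.cross(n, z), float(n @ z)
    if np.allclose(v, 0):
        R = np.eye(3)
    else:
        vx = np.array([[0, -v[2], v[1]],
                       [v[2], 0, -v[0]],
                       [-v[1], v[0], 0]])
        R = np.eye(3) + vx + vx @ vx / (1 + c)
    T_origin = np.eye(4); T_origin[:3, 3] = -p   # move the point to the origin
    T_back = np.eye(4); T_back[:3, 3] = p        # move it back afterwards
    Rot = np.eye(4); Rot[:3, :3] = R
    return T_back @ Rot @ T_origin
```

The selected point is a fixed point of the resulting transform, and the plane normal ends up along the optical axis, i.e., the plane becomes parallel to the array plane.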
Step S104: and obtaining a finished image by calculating the difference value of each image in the third image set.
In an embodiment of the present invention, the difference calculation includes: taking any two images from the refocused images and subtracting their pixel values at each pixel coordinate to obtain a first difference image; comparing the pixel value at each coordinate of the first difference image with a preset threshold, marking it invalid if it is below the threshold and valid otherwise; subtracting, at each pixel coordinate, the first difference image from the minuend of the two images to obtain a second difference image, whose value at coordinates marked valid is that difference and at coordinates marked invalid is the pixel value of the minuend; putting the second difference image back into the third image set and repeating the above steps with any two images; and obtaining the finished image when one image remains in the third image set.
In an embodiment of the present invention, the precondition for the difference calculation is that the images lie in the same image physical coordinate system. The images coincide under the same coordinates, and the image physical coordinate system corresponds to a unified image pixel coordinate system containing the pixel coordinates of every image.
Each pixel coordinate holds a pixel composed of the values of the three channels r, g, and b. Any two images are taken and their pixel values at each pixel coordinate are subtracted one by one; the subtraction is performed on the corresponding r, g, and b channel values of the two images, yielding the per-channel difference for each pixel and hence a difference image.
For example, take any two images im_1 and im_2 from the refocused images and subtract their pixel values at the same pixel coordinates to obtain the first difference image im_d1, i.e., im_d1 = im_1 − im_2.
Compare each pixel value of the first difference image with a preset threshold e; if the value is below the threshold, mark it invalid, otherwise mark it valid.
Subtract the first difference image im_d1 from the minuend im_1 at the same pixel coordinates to obtain the second difference image im_d2.
Put the second difference image im_d2 back into the third image set and repeat the above steps with any two images.
When one image remains in the third image set, the finished image im_result, i.e., the image with the highlight removed, is obtained.
In an embodiment of the present invention, since the two images that are taken are not put back into the image set and only the second difference image im_d2 is returned to the third image set, the number of images in the third image set gradually decreases; the termination condition is therefore that all images in the set have been processed, i.e., only one image remains after the last step.
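The iterative difference calculation can be sketched as follows (a minimal Python sketch assuming a scalar threshold e applied uniformly to all channels; the function and variable names are illustrative, not from the patent):

```python
import numpy as np

def remove_highlight(images, e=0.0):
    """Iteratively difference the refocused images: take two images, form
    im_d1 = im_1 - im_2, keep im_1 - im_d1 where the difference reaches the
    threshold e ('valid') and fall back to im_1's pixels where it does not
    ('invalid'), then put the result back until one image remains."""
    stack = [np.asarray(im, dtype=float) for im in images]
    while len(stack) > 1:
        im1, im2 = stack.pop(), stack.pop()   # take any two images
        d1 = im1 - im2                        # first difference image
        valid = d1 >= e                       # below the threshold -> invalid
        d2 = np.where(valid, im1 - d1, im1)   # second difference image
        stack.append(d2)                      # put it back into the set
    return stack[0]                           # the finished image
```

Note that im_1 − im_d1 equals im_2, so at valid coordinates the result takes the darker pixel, consistent with the text's observation that the finished image has an overall reduced pixel value.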
It should be noted that the preset threshold e also includes three channels, and the thresholds of the three channels in one pixel point are not necessarily related to each other, and need to be correspondingly adjusted according to the illumination, the reflection intensity, and the white balance of the actual scene.
For example, if the preset threshold e is 0, the difference calculation subtracts pairs of different images so that, at each pixel coordinate, the lower of the pixel values is retained, finally producing a finished image with an overall reduced pixel value.
In the difference calculation, the preset threshold e acts more like a sensitivity coefficient: the larger e is, the less sensitive the calculation. For example, in a scene with good illumination and strong reflections, even the non-highlight part of the bright surface is very bright; if e is set low, i.e., sensitive, the difference calculation may fail to remove the intended highlight region effectively and accurately, and may instead remove non-highlight regions because of their high brightness, degrading the whole image.
Therefore, generally speaking, scenes with good lighting conditions and strong light reflection require larger preset threshold values.
To achieve the above and other related objects, the present invention provides a computer-readable storage medium, on which an image processing program is stored, the program, when executed by a processor, implements an image processing method according to an embodiment of the present invention as shown in fig. 1.
The computer-readable storage medium, as will be appreciated by one of ordinary skill in the art: all or part of the steps for implementing the above method embodiments may be performed by hardware associated with a computer program. The aforementioned image processing program may be stored in a computer-readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process, such that the instructions executed on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transitory media) such as modulated data signals and carrier waves.
As shown in fig. 2, which shows a schematic diagram of an image processing apparatus in an embodiment of the present invention, the image processing apparatus 200 includes: a communicator 203, a processor 202, and a memory 201; the communicator 203 is in communication connection with an external device; the memory 201 is used for storing an image processing program; the processor 202 runs an image processing program to implement the image processing method in one embodiment of the invention as shown in fig. 1.
The communicator 203 is used to implement a communication link between the database access device and other devices (e.g., client, read-write library, and read-only library), which may be any suitable combination of one or more wired and/or wireless networks. For example, the communication means may include any one or more of the internet, an intranet, a Wide Area Network (WAN), a Local Area Network (LAN), a wireless network, a Digital Subscriber Line (DSL) network, a frame relay network, an Asynchronous Transfer Mode (ATM) network, a Virtual Private Network (VPN), and/or any other suitable communication network.
The memory 201 may include a Random Access Memory (RAM), and may also include a non-volatile memory (non-volatile memory), such as at least one disk memory.
The processor 202 may be a general-purpose processor, including a central processing unit (CPU) or a network processor (NP); it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
Referring to FIG. 3, a schematic diagram of a light field camera array in one embodiment of the invention is shown. As shown, the light field camera array 300 includes a plurality of light field cameras 301, which form an array in a certain arrangement.
In an embodiment of the present invention, the light field cameras 301 are preferably RGB light field cameras.
RGB is an industry color standard: a wide range of colors is obtained by varying the three color channels of red (R), green (G), and blue (B) and superimposing them on one another. RGB denotes the colors of the red, green, and blue channels; this standard covers almost all colors perceivable by human vision and is one of the most widely used color systems at present.
In an embodiment of the present invention, the light field camera array 300 is provided with a rigid support to ensure that the relative positions of the light field cameras 301 do not shift after setup.
In an embodiment of the present invention, the arrangement of the light field cameras 301 in the light field camera array 300 is flexible: they may be arranged in an M×N matrix, an M×N honeycomb, or a circular pattern, i.e., one light field camera at the center with the other light field cameras arranged around it to form an n-ring light field camera array 300.
It should be noted that the distance between every two light field cameras 301 may affect the final highlight-removal effect. If the spacing between the light field cameras 301 is large, objects whose depth does not lie on the focal plane may be blurred during light field reconstruction, but the adaptability to light rays of different angles is better; a dense camera arrangement makes objects off the focal plane clearer, thereby expanding the usable data range. The spacing of the light field cameras 301 is determined by the particular use scenario; generally speaking, for outdoor sunlight, the smallest practical camera spacing should be selected.
In an embodiment of the invention, the included angle between the plane of each light field camera 301 lens and the bright surface satisfies the Brewster angle range.
It should be noted that requiring the included angle between the plane of each light field camera 301 lens and the bright surface to satisfy the Brewster angle range ensures that most of the light entering the camera has polarization characteristics.
Although the selected Brewster angle is subject to some uncertainty in an embodiment of the present invention, the included angle between the plane of each light field camera 301 lens and the bright surface should be kept consistent. This can be achieved by adjusting the angle between the light field camera array 300 as a whole and the bright surface.
The Brewster angle is also called the polarizing angle: when incident natural light strikes an interface at this angle, the reflected light is linearly polarized and perpendicular to the refracted light. When a light ray enters a medium from air, the tangent of the Brewster angle equals the refractive index n of the medium. Since the refractive index of a medium depends on the wavelength of light, the magnitude of the Brewster angle for a given medium also depends on the wavelength.
For example, for an optical glass with a refractive index of 1.4 to 1.9, the calculated Brewster angle is about 54 to 62 degrees.
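As a quick check of the figures above, the Brewster angle follows directly from tan θ_B = n; the snippet below is a generic illustration, not code from the patent:

```python
import math

def brewster_angle_deg(n: float) -> float:
    """Brewster (polarizing) angle, in degrees, for light entering a
    medium of refractive index n from air: tan(theta_B) = n."""
    return math.degrees(math.atan(n))

# Optical glass spans roughly n = 1.4 to 1.9:
low = brewster_angle_deg(1.4)   # ~54.5 degrees
high = brewster_angle_deg(1.9)  # ~62.2 degrees
```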
Fig. 4 shows a schematic structural diagram of each camera and the polarizer mounted thereon according to an embodiment of the present invention. As shown, each of the light field cameras 401 is mounted with a polarizer 402.
In an embodiment of the present invention, the polarizer 402 is preferably a circular polarizer.
In general, polarizers are classified into linear polarizers and circular polarizers. Both can meet the basic requirements of photography, but a linear polarizer interferes with metering and autofocus and is therefore unsuitable for highly automated digital cameras. Since existing digital cameras all provide autofocus and automatic exposure, a linear polarizer may make automatic exposure inaccurate and autofocus unreliable, whereas a circular polarizer works well on such cameras; hence the polarizer 402 is preferably a circular polarizer in this embodiment of the present invention.
Note that asymmetry of the vibration direction with respect to the propagation direction is called polarization, and only transverse waves can exhibit it. The phenomenon in which the spatial distribution of the electric-vector vibration of a light wave loses symmetry with respect to the propagation direction is called the polarization of light. A light wave may contain transverse vibrations in all possible directions but with unequal amplitudes in different directions; when the amplitude has a maximum and a minimum in two mutually perpendicular directions, the light is referred to as partially polarized. Natural light and partially polarized light are in fact composed of many linearly polarized components with different vibration directions.
In an embodiment of the present invention, an installation angle is provided between each of the light field cameras 401 and the polarizer 402 installed therein, and the installation angles between the light field cameras 401 are different.
In an embodiment of the present invention, the installation angle between each light field camera 401 and the polarizer 402 mounted on it decreases from outside to inside in a certain proportion, with the installation angles distributed over the range of 0 to 180 degrees.
The installation angles between the light field cameras 401 and the polarizers 402 mounted on them differ from one another and vary in sequence in a certain proportion (e.g., decreasing from outside to inside), so that polarized light from as many different directions as possible is received.
For example, the light field camera 401 at the center of the light field camera array is selected as camera number 0, and the mounting angle of its polarizer is set to 0°. If the light field camera array has n lenses, the mounting angles of the polarizers of the surrounding light field cameras are set in sequence, according to their distance from camera number 0, to 180/n°, 180/n × 2°, 180/n × 3°, and so on. As a result, the intensity of the polarized light recorded by different cameras exhibits different brightness variations.
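The numbering scheme above can be sketched as follows; the function name and the use of planar (x, y) camera positions are illustrative assumptions, not part of the patent:

```python
import math

def polarizer_angles(positions, center=0):
    """Assign polarizer mounting angles spanning 0-180 degrees.

    `positions` is a list of (x, y) camera positions; the camera at
    index `center` is camera number 0 and receives 0 degrees, and the
    remaining cameras receive 180/n, 180/n*2, ... degrees in order of
    their distance from camera number 0.
    """
    n = len(positions)
    cx, cy = positions[center]
    # rank all cameras by distance from the reference camera
    ranked = sorted(range(n),
                    key=lambda i: math.hypot(positions[i][0] - cx,
                                             positions[i][1] - cy))
    return {cam: 180.0 / n * rank for rank, cam in enumerate(ranked)}
```

With three cameras, for instance, the angles come out as 0°, 60°, and 120°, each camera's polarizer rotated further as it sits farther from the center.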
As shown in fig. 5, a schematic diagram of an image processing system according to an embodiment of the invention is shown, including the image processing apparatus 501 of FIG. 2 and the light field camera array 502 of FIG. 3; the image processing device is communicatively connected to the light field camera array to implement the image processing method described in fig. 1.
In an embodiment of the present invention, the image processing apparatus 501 is the same as the image processing apparatus of the embodiment shown in fig. 2, and the light field camera array 502 is the same as the light field camera array of the embodiment shown in fig. 3; they are therefore not described again here.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
It is to be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
In summary, according to the image processing method, apparatus, system, and storage medium of the present invention, a first image set captured by a light field camera array is received and one of the cameras is selected as a standard camera; each image in the first image set is then projected into the image physical coordinate system of the standard camera according to the internal and external parameters calibrated for each camera, yielding a second image set; the focal planes of the images in the second image set are then uniformly refocused to an arbitrary plane behind the bright surface, yielding a third image set; finally, difference calculations are performed on the images to obtain the finished image.
The invention can solve the problem of highlights generated when taking a picture, and clearly restore the scene with the highlights removed.
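The final difference step can be sketched roughly as follows. This is a minimal NumPy illustration inferred from the wording of claim 1; the function name, the threshold value, and the choice of keeping the minuend's value at invalid pixels are assumptions, not reference code from the patent:

```python
import numpy as np

def difference_step(third_set, threshold=10):
    """Iteratively reduce the third image set to one finished image.

    Two images are taken and subtracted to form a first difference
    image; pixels where the difference magnitude is below `threshold`
    are marked invalid. The second difference image keeps
    minuend - difference (i.e. the subtrahend's value) where valid,
    and the minuend's own value where invalid; it is put back into
    the set until a single image remains.
    """
    imgs = [img.astype(np.int32) for img in third_set]
    while len(imgs) > 1:
        a = imgs.pop()              # minuend
        b = imgs.pop()              # subtrahend
        diff = a - b                # first difference image
        valid = np.abs(diff) >= threshold
        second = np.where(valid, a - diff, a)
        imgs.append(second)         # put second difference image back
    return np.clip(imgs[0], 0, 255).astype(np.uint8)
```

Intuitively, wherever two refocused views disagree strongly (a specular highlight visible in one view but not the other), the highlighted value is replaced; wherever they agree, the pixel is left alone.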
The foregoing embodiments merely illustrate the principles and utility of the present invention and are not intended to limit it. Any person skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the present invention. Accordingly, all equivalent modifications or changes made by those skilled in the art without departing from the spirit and technical scope of the present invention shall be covered by the claims of the present invention.

Claims (11)

1. An image processing method, characterized in that the method comprises:
receiving a first set of images captured by a light field camera array, wherein any one of the cameras is selected as a standard camera; each light field camera in the light field camera array is provided with a polarizer, an installation angle is set between each light field camera and the polarizer mounted on it, and the installation angles differ among the light field cameras;
respectively projecting each image in the first image set into an image physical coordinate system corresponding to the standard camera according to the internal parameters and the external parameters calibrated by each camera to obtain a second image set;
focusing the focal plane corresponding to each image in the second image set to any plane behind the bright surface in a unified manner to obtain a third image set;
calculating differences among the images in the third image set to obtain a finished image, wherein: two images are selected from the refocused images and their pixel values are subtracted at the same pixel coordinates to obtain a first difference image; the pixel value at each pixel coordinate of the first difference image is compared with a preset threshold and marked invalid if smaller than the threshold, and valid otherwise; the pixel values of the minuend image of the two images and of the first difference image are subtracted at the same pixel coordinates to obtain a second difference image, such that, for the part marked valid in the first difference image, the value is the difference between the pixel values of the minuend image and the first difference image at the same pixel coordinate, and, for the part marked invalid in the difference image, the value is the pixel value at that pixel coordinate in the minuend image; the second difference image is put back into the third image set and the above steps are repeated with any two images; when one image remains in the third image set, it is the finished image.
2. The image processing method according to claim 1, wherein the method of projecting the images in the first image set into the image physical coordinate system corresponding to the standard camera according to the internal parameters and the external parameters calibrated by the cameras comprises:
converting the image physical coordinates corresponding to each camera into camera coordinates according to the internal parameters calibrated by each camera;
converting the camera coordinates corresponding to each camera into a world coordinate system according to the external parameters calibrated by each camera;
and projecting the images in the first image set into an image physical coordinate system corresponding to the standard camera according to the corresponding relation of the cameras in the world coordinate system.
3. The image processing method according to claim 1, wherein the method for focusing uniformly to any plane behind the bright surface comprises:
judging whether a focal plane corresponding to each image in the second image set is parallel to the light field camera array plane;
if yes, focusing a focal plane parallel to the light field camera array plane to any plane behind the bright surface;
if not, adjusting a focal plane which is not parallel to the light field camera array plane to be parallel to the light field camera array plane, and focusing to any plane behind the bright surface.
4. The method of image processing according to claim 3, wherein the method of adjusting a focal plane that is non-parallel to the light field camera array plane to be parallel to the light field camera array plane comprises:
selecting one point on the focal plane which is not parallel to the light field camera array plane, and translating the focal plane until the point is coincided with the original point of the light field camera array;
rotating the focal plane to be parallel to the light field camera array plane;
moving back to the focal plane.
5. A computer-readable storage medium on which an image processing program is stored, characterized in that the program realizes the image processing method of any one of claims 1 to 4 when executed by a processor.
6. An image processing apparatus characterized by comprising: a communicator, a processor, and a memory;
the communicator is in communication connection with an external device; the memory is used for storing an image processing program; the processor executes an image processing program to realize the image processing method of any one of claims 1 to 4.
7. An image processing system, comprising: the image processing apparatus of claim 6, and a light field camera array; the image processing device is communicatively connected to the light field camera array to implement the image processing method of any of claims 1 to 4.
8. The image processing system of claim 7, wherein the light field camera array comprises: a plurality of light field cameras; the plurality of light field cameras form an array according to a certain arrangement mode; each light field camera is provided with a polarizer, an installation angle is arranged between each light field camera and the polarizer installed on the light field camera, and the installation angles among the light field cameras are different.
9. The image processing system of claim 8, wherein the polarizer is a circular polarizer.
10. The image processing system of claim 8, wherein the installation angles provided between each light field camera and the polarizer installed thereon are gradually decreased from outside to inside according to a certain proportion, and the installation angles are distributed in the range of 0-180 degrees.
11. The image processing system of claim 8, wherein the included angle between the plane where the lens of each light field camera is located and the bright surface respectively satisfies the Brewster angle range.
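The coordinate chain of claim 2 (image physical coordinates → camera coordinates → world coordinates → standard camera image) can be sketched as a pinhole projection. The function, the matrix names, and the assumption of a known per-pixel depth are illustrative, not from the patent:

```python
import numpy as np

def project_to_standard(pixel, depth, K, R, t, K_std, R_std, t_std):
    """Map a pixel from one calibrated camera into the standard
    camera's image plane, given the pixel's depth.

    K is the 3x3 intrinsic matrix; R (3x3) and t (3,) are the
    extrinsics mapping world points into camera coordinates.
    """
    u, v = pixel
    # image physical coordinates -> camera coordinates (intrinsics)
    p_cam = depth * (np.linalg.inv(K) @ np.array([u, v, 1.0]))
    # camera coordinates -> world coordinates (inverse extrinsics)
    p_world = R.T @ (p_cam - t)
    # world -> standard camera -> standard image plane (perspective divide)
    q = K_std @ (R_std @ p_world + t_std)
    return q[:2] / q[2]
```

As a sanity check, projecting through identical intrinsics and identity extrinsics returns the original pixel, since the two cameras then coincide.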
CN201810987724.8A 2018-08-28 2018-08-28 Image processing method, apparatus, system and storage medium Active CN109191441B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810987724.8A CN109191441B (en) 2018-08-28 2018-08-28 Image processing method, apparatus, system and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810987724.8A CN109191441B (en) 2018-08-28 2018-08-28 Image processing method, apparatus, system and storage medium

Publications (2)

Publication Number Publication Date
CN109191441A CN109191441A (en) 2019-01-11
CN109191441B true CN109191441B (en) 2022-03-01

Family

ID=64916493

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810987724.8A Active CN109191441B (en) 2018-08-28 2018-08-28 Image processing method, apparatus, system and storage medium

Country Status (1)

Country Link
CN (1) CN109191441B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112132773B (en) * 2019-06-24 2024-04-12 曜科智能科技(上海)有限公司 Method, device, equipment and storage medium for detecting riveting point defect of aircraft head cover
CN113706398A (en) * 2020-05-22 2021-11-26 西北工业大学 Device and method for generating high dynamic image in motion scene

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101968453A (en) * 2009-12-01 2011-02-09 北京理工大学 Polarization detection method and device for white and colorless foreign matters in cotton
US8345144B1 (en) * 2009-07-15 2013-01-01 Adobe Systems Incorporated Methods and apparatus for rich image capture with focused plenoptic cameras
CN108028892A (en) * 2015-09-30 2018-05-11 索尼公司 Information acquisition equipment and information obtaining method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9300865B2 (en) * 2014-01-24 2016-03-29 Goodrich Corporation Random imaging

Also Published As

Publication number Publication date
CN109191441A (en) 2019-01-11

Similar Documents

Publication Publication Date Title
CN110111262B (en) Projector projection distortion correction method and device and projector
CN110336987B (en) Projector distortion correction method and device and projector
CN107607040B (en) Three-dimensional scanning measurement device and method suitable for strong reflection surface
JP4077869B2 (en) Light source estimation device, light source estimation system, light source estimation method, image resolution increasing device, and image resolution increasing method
Bimber et al. Multifocal projection: A multiprojector technique for increasing focal depth
US9338447B1 (en) Calibrating devices by selecting images having a target having fiducial features
CN113554575B (en) High-reflection object surface highlight removing method based on polarization principle
KR100681320B1 (en) Method for modelling three dimensional shape of objects using level set solutions on partial difference equation derived from helmholtz reciprocity condition
KR20170005009A (en) Generation and use of a 3d radon image
JP2008016918A (en) Image processor, image processing system, and image processing method
CN114697623B (en) Projection plane selection and projection image correction method, device, projector and medium
WO2024148994A1 (en) Highly reflective surface three-dimensional reconstruction method and apparatus based on polarized structured light camera
JP6683307B2 (en) Optimal spherical image acquisition method using multiple cameras
CN109191441B (en) Image processing method, apparatus, system and storage medium
US9124797B2 (en) Image enhancement via lens simulation
CN111080669A (en) Image reflection separation method and device
JPWO2016175044A1 (en) Image processing apparatus and image processing method
Gu et al. Omni-nerf: neural radiance field from 360 image captures
Hach et al. Cinematic bokeh rendering for real scenes
JP2018005542A (en) Image processing device, imaging apparatus, image processing method, image processing program, and storage medium
CN109325912B (en) Reflection separation method based on polarized light field and calibration splicing system
Einabadi et al. Discrete Light Source Estimation from Light Probes for Photorealistic Rendering.
WO2016175043A1 (en) Image processing device and image processing method
Ntregka et al. Photogrammetric exploitation of hdr images for cultural heritage documentation
Goesele et al. Building a Photo Studio for Measurement Purposes.

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant