CN102113021A - Image processing device and pseudo-3D image creation device


Info

Publication number
CN102113021A
CN102113021A, CN102113021B (application CN2009801306539A)
Authority
CN
China
Prior art keywords
polarisation
image
normal
sky
polarization
Prior art date
Legal status
Granted
Application number
CN2009801306539A
Other languages
Chinese (zh)
Other versions
CN102113021B (en)
Inventor
金森克洋 (Katsuhiro Kanamori)
甲本亚矢子 (Ayako Komoto)
佐藤智 (Satoshi Sato)
Current Assignee
Panasonic Holdings Corp
Original Assignee
Matsushita Electric Industrial Co Ltd
Priority date
Filing date
Publication date
Application filed by Matsushita Electric Industrial Co Ltd
Publication of CN102113021A
Application granted
Publication of CN102113021B
Legal status: Active
Anticipated expiration


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/50 - Depth or shape recovery
    • G06T7/55 - Depth or shape recovery from multiple images
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 - Image signal generators
    • H04N13/204 - Image signal generators using stereoscopic image cameras
    • H04N13/25 - Image signal generators using stereoscopic image cameras using two or more image sensors with different characteristics other than in their location or field of view, e.g. having different resolutions or colour pickup characteristics; using image signals from one sensor to control the characteristics of another sensor
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 - Image signal generators
    • H04N13/261 - Image signal generators with monoscopic-to-stereoscopic image conversion
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10024 - Color image

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Processing Or Creating Images (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
  • Image Processing (AREA)

Abstract

The present invention provides an image processing device and a pseudo-3D image creation device comprising a color polarization image acquisition unit (201), a whole-sky polarization map acquisition unit (202), a weather determination unit (203), a fine-weather sky region separation unit (204), a cloudy-weather normal estimation unit (207), and a pseudo-3D conversion unit (208). With this configuration, the image processing device takes the polarization state of the outdoor sky into account, acquires polarization information, estimates the surface normals of the object surfaces in the 2D image, and creates a surface normal image. Using this, the device divides the objects in the image into regions, extracts 3D information, and creates a perspective-transformed image, thereby producing a pseudo-3D image.

Description

Image processing apparatus and pseudo-3D image generation apparatus
Technical field
The present invention relates to an image processing apparatus and a pseudo-3D image generation apparatus. In particular, it relates to a pseudo-3D image generation apparatus that, from an ordinary two-dimensional still image or moving image captured outdoors (an image that by itself carries no depth information), estimates the three-dimensional shape of the scene and generates a pseudo-3D image.
Background Art
To view a two-dimensional still image as a pseudo-3D image from an arbitrarily changed viewpoint, depth information, which is what normally conveys three-dimensionality, must first be obtained by some means. This requires, for example, a dedicated rangefinder or a so-called active sensing system that varies the light source. Such equipment is difficult to mount on an ordinary consumer camera.
Methods have therefore been studied that apply various assumptions to a captured scene image, estimate its three-dimensional structure, and realize image synthesis and virtual viewpoint movement. Although the information needed to recover the three-dimensional structure cannot be estimated exactly from a general scene, such methods can simulate the structure reasonably well.
The method disclosed in Patent Document 1 targets scenes with typical depth, such as a road stretching toward the far distance. It composites three basic kinds of scene structure models, assigns the resulting depth model to the two-dimensional image, and synthesizes a pseudo-3D view.
The method disclosed in Non-Patent Document 1 targets general photographic scenes. From a single input color image, it estimates vanishing points using the color information of each pixel, the texture of small image regions, positional information within the image, and the parallel line edges of artificial surfaces such as buildings in the scene. From this information it identifies the regions appearing in the scene, for example the sky, the ground, surfaces such as walls that stand perpendicular to the ground, and vegetation. Then, using the detected normal information of the ground and building surfaces, it performs a simple 3D conversion, turning the photograph into a pop-up-book style 3D model from which images seen from arbitrary viewpoints can be synthesized. By learning from typical scene images, this estimation is considered to be quite accurate.
However, in the technique of Patent Document 1, the 3D information is extracted not so much from the image itself as from basic predefined shapes. As a result, the scenes to which it applies are limited, so the technique lacks generality and practicality. The technique of Non-Patent Document 1 does extract its information from the image, but it has the following problems:
1) It cannot correctly estimate the normal of a slope that is inclined with respect to the ground.
2) It cannot correctly estimate normals for faces other than rectangular ones.
3) It may misjudge surfaces whose color information is close to that of the sky or the ground.
All of these problems stem from the inability to estimate surface normals directly from a two-dimensional image. Known ways of obtaining surface normal information for a two-dimensional image include using an existing laser rangefinder, or obtaining distance by stereo photography. However, neither is practical when the object is far away, for example a distant castle.
One technique that can nevertheless extract surface normals even for distant objects by a passive method uses polarization information. The technique of Patent Document 2 uses completely diffuse illumination: the surroundings of the subject are illuminated over 360 degrees and photographed with a camera that can acquire the polarization information of the image, and the shape of the subject is obtained from it. If the refractive index of the subject is assumed to be known, the shapes of objects that are normally difficult to measure, such as transparent bodies, can be obtained.
The key points of Patent Document 2 are that the subject is a specular reflection object whose reflected light is polarized according to the Fresnel laws, and that the entire surface enclosing the subject is fully illuminated so that specular reflection occurs over the whole subject. A further advantage of the technique of Patent Document 2 is that shape estimation does not first measure distance, for example with a rangefinder or by stereo measurement, and then infer normals; the surface normals are obtained directly, so no distance measurement is needed.
In a daytime outdoor environment, the subject, although far away, is illuminated by the surrounding sky, and in fine weather it mostly exhibits specular reflection; this situation is therefore very similar to that addressed by the present technique.
In addition, Non-Patent Document 1 discloses a technique for forming a pseudo-3D image from a two-dimensional image. Reports on sky polarization can be found in Non-Patent Documents 2 and 3, and Patent Document 3 discloses a conventional example of a polarization imaging device.
Patent Document 1: Japanese Laid-Open Patent Publication No. 2006-186510
Patent Document 2: Japanese Laid-Open Patent Publication No. H11-211433
Patent Document 3: Japanese Laid-Open Patent Publication No. 2007-86720
Non-Patent Document 1: "Automatic photo pop-up", Derek Hoiem et al., ACM SIGGRAPH 2005
Non-Patent Document 2: "Polarization Analysis of the Skylight Caused by Rayleigh Scattering and Sun Orientation Estimation using Fisheye-Lens Camera", Daisuke Miyazaki et al., IEICE Technical Report, Pattern Recognition and Media Understanding (PRMU), Vol. 108, No. 198, pp. 25-32, 2008
Non-Patent Document 3: The Journal of Experimental Biology 204, 2933-2942 (2001)
When the technique of Patent Document 2 is applied outdoors, the following problems arise.
To estimate surface normals from the polarization information of specularly reflected light, it is not enough that the subject be illuminated from all around; the illumination must also be unpolarized (randomly polarized). Patent Document 2 therefore uses a very special apparatus in which the subject is completely enclosed in an optically perfect diffusing sphere and illuminated from outside it.
Outdoor illumination in fine weather, by contrast, consists of the direct sunlight of the sun (parallel light) and the surface illumination of the blue sky. Sunlight is unpolarized, whereas the blue sky is polarized. At photography time, specular reflection on the subject is usually produced not by the direct sunlight of the sun but by the polarized light of the surrounding blue sky. In this situation, the prior art cannot be used.
Summary of the invention
An image processing apparatus according to the present invention comprises: a polarization image acquisition unit that acquires a polarization image having polarization information for a plurality of pixels; a subject normal estimation unit that estimates surface normals of an outdoor subject from the polarization information of the polarization image; and a whole-sky polarization map acquisition unit that acquires a whole-sky polarization map expressing the relation between positions in the whole sky and the polarization information at those positions. The subject normal estimation unit uses the whole-sky polarization map to obtain, from the polarization information, the polarization state of the specularly reflected light at the subject surface, and thereby estimates the surface normals of the subject.
Another image processing apparatus according to the present invention comprises: an image acquisition unit that acquires a luminance image having luminance information for a plurality of pixels and a polarization image having polarization information for those pixels; a weather determination unit that judges the weather to be in a cloudy state or a fine state; and a subject normal estimation unit that obtains the polarization state of the specularly reflected light produced at the surface of the outdoor subject and estimates the surface normals of the subject from that polarization state by different methods according to the weather state decided by the weather determination unit.
In a preferred embodiment, the image acquisition unit acquires the luminance image for a plurality of different colors.
In a preferred embodiment, the weather determination unit decides the weather state from the degree of polarization of the sky, or from the area of the sky region whose degree of polarization is at or above a reference level.
In a preferred embodiment, the weather determination unit judges the weather to be in a cloudy state when the degree of polarization of the sky is below a prescribed reference level, and in a fine state when the degree of polarization is at or above that reference level.
In a preferred embodiment, the weather determination unit judges partly fine weather, in which part of the sky has clouds, to be a fine state.
In a preferred embodiment, the weather determination unit obtains information indicating the weather state from outside and decides the weather state from it.
In a preferred embodiment, the apparatus includes a whole-sky polarization map acquisition unit that acquires a whole-sky polarization map expressing the relation between positions in the whole sky and the polarization states at those positions. When the weather determination unit judges the weather state to be fine, the whole-sky polarization map is used to estimate the surface normals of the subject.
In a preferred embodiment, the apparatus includes a cloudy-weather normal estimation unit that performs normal estimation based on specular reflection polarization, and a fine-weather normal estimation unit that performs both geometry-based normal estimation and normal estimation based on specular reflection polarization. When performing geometry-based normal estimation, the fine-weather normal estimation unit can use the relation, expressed by the whole-sky polarization map, between positions in the whole sky and the polarization states at those positions.
In a preferred embodiment, the whole-sky polarization map acquisition unit obtains a polarization image of the whole sky using a wide-angle lens.
In a preferred embodiment, the whole-sky polarization map acquisition unit obtains the data of the whole-sky polarization map from outside.
In a preferred embodiment, the apparatus includes a fine-weather sky region separation unit that separates the sky region from the image in fine weather, and a cloudy-weather sky region separation unit that separates the sky region from the image in cloudy weather. The operation or output of the fine-weather and cloudy-weather sky region separation units is selectively switched according to the output of the weather determination unit.
In a preferred embodiment, the image acquisition unit includes: a simultaneous color-and-polarization acquisition unit in which, within same-color pixels of a single-chip color image sensor having a color mosaic filter, a plurality of polarizers whose polarization transmission planes have different angles are arranged adjacently; a polarization information processing unit that approximates, for each color, the observed luminances obtained through the plurality of polarizers by a sinusoidal function, averages the resulting approximation parameters across the colors, and obtains integrated polarization information; and a color information processing unit that averages the plurality of observed luminances to generate average color luminances. The image acquisition unit outputs (i) a color image and (ii) degree-of-polarization information and a polarization phase image based on the polarization image.
In a preferred embodiment, (i) geometry-based estimation is adopted when the incident angle of the light source is smaller than a prescribed value, and (ii) normal estimation based on specular reflection polarization is adopted when the degree of polarization of the light source is smaller than a prescribed value.
In a preferred embodiment, when the weather state is judged to be cloudy, normals are estimated from the polarization phase and degree of polarization of the specularly reflected light. When a plurality of estimated normal vectors exist around the viewing vector, the normal vector that points upward with respect to the horizontal plane containing the viewing vector is selected; when a plurality of estimated normal vectors exist within the plane containing the viewing vector and the incident ray, the normal vector whose incident angle is smaller than the Brewster angle is selected.
A pseudo-3D image generation apparatus according to the present invention includes: a plane extraction unit that, from the subject surface normals estimated by any of the image processing apparatuses described above, extracts planes perpendicular to those surface normals; and a pseudo-3D conversion unit that applies a viewpoint transformation based on the extracted planes and generates a scene image as seen from another viewpoint.
In a preferred embodiment, the pseudo-3D conversion unit estimates the world coordinates of the vertices of the planes extracted by the plane extraction unit.
An image processing method according to the present invention comprises the steps of: acquiring a polarization image of an outdoor scene; acquiring a whole-sky polarization map; and judging the weather state. The method further comprises an estimation step of detecting, from the polarization image, the polarization state of the specularly reflected light at the surface of the outdoor subject and estimating the surface normals of the subject by different methods according to the weather state.
In a preferred embodiment, when the weather state is judged to be fine, normal estimation is performed using two kinds of normals, namely geometry-based normals and normals based on specular reflection polarization.
In a preferred embodiment, the smaller the incident angle of the light source, the higher the reliability given to the geometry-based normal estimate, and the smaller the degree of polarization of the light source, the higher the reliability given to the estimate based on specular reflection polarization; the normal with the higher reliability is finally adopted.
In a preferred embodiment, when the weather state is judged to be cloudy, normals are estimated from the polarization phase and degree of polarization of the specularly reflected light. When a plurality of estimated normal vectors exist around the viewing vector, the normal vector that points upward with respect to the horizontal plane containing the viewing vector is selected. When a plurality of estimated normal vectors exist within the plane containing the viewing vector and the incident ray, the normal vector whose incident angle is smaller than the Brewster angle is selected.
A pseudo-3D image generation method according to the present invention comprises the steps of: acquiring a polarization image of an outdoor scene; estimating surface normals of the outdoor subject from the polarization information of the polarization image; extracting, from the estimated surface normals of the subject, planes perpendicular to those surface normals; and applying a viewpoint transformation to generate a scene image as seen from another viewpoint.
In a preferred embodiment, the method further comprises a step of estimating the world coordinates of the vertices of the extracted planes.
According to the present invention, an image processing method and apparatus can be provided which, when estimating the three-dimensional shape of an outdoor subject far from the camera, for example a distant castle, for which existing rangefinders and stereo photography are ineffective, can use the polarization information of the sky to estimate the surface normals of the subject and thereby its three-dimensional shape. It also becomes possible to recognize and process, for example, the sky, the ground, wall surfaces and roofs. Furthermore, by using the image processing of the present invention, a pseudo-3D conversion apparatus that realizes virtual stereoscopic viewing can also be provided.
Description of drawings
FIG. 1A is a diagram illustrating specular reflection on an outdoor subject.
FIG. 1B is another diagram illustrating specular reflection on an outdoor subject.
FIG. 1C is an external view of an embodiment of the image processing apparatus according to the present invention.
FIG. 2 is a block diagram of the image processing apparatus according to the present invention.
FIG. 3(A) is a block diagram showing the configuration of the color polarization image acquisition unit of an embodiment of the present invention, and FIG. 3(B) shows the configuration of the whole-sky polarization map acquisition unit.
FIG. 4(A) is a perspective view of the configuration of the simultaneous color-and-polarization acquisition unit, FIG. 4(B) is a view of part of its imaging surface seen from directly above along the optical axis, and FIG. 4(C) is an enlarged view of part of FIG. 4(B).
FIGS. 5(A) to (C) are schematic diagrams showing the wavelength characteristics of the B, G and R polarization pixels, respectively.
FIG. 6 is a graph showing the luminances of light transmitted through polarizers with four different polarization main-axis directions.
FIGS. 7(A) to (C) show three kinds of images obtained by the color polarization image acquisition unit when a building scene is captured.
FIGS. 8(A) to (C) are schematic diagrams explaining the images of FIGS. 7(A) to (C), respectively.
FIGS. 9(A) and (B) show two kinds of images obtained by the whole-sky polarization map acquisition unit.
FIG. 10A is a flowchart explaining one example of the operation of the weather determination unit.
FIG. 10B is a flowchart explaining another example of the operation of the weather determination unit.
FIG. 11 is a block diagram explaining the fine-weather sky region separation unit.
FIGS. 12(A) to (H) show a result (successful example) of separating the sky region from a real scene using polarization and luminance.
FIGS. 13(A) to (F) show a result (failure example) of separating the sky region from a real scene using polarization and luminance.
FIGS. 14(A) to (D) show a result (successful example) of separating the sky region from a real scene using hue and luminance.
FIG. 15 is a block diagram explaining the cloudy-weather sky region separation unit.
FIG. 16 is a schematic diagram of a scene image after the sky region has been separated in fine or cloudy weather.
FIGS. 17(A) and (B) are diagrams explaining the difference in the polarization of specular reflection on the subject between fine and cloudy weather.
FIG. 18 is a graph showing the relation between the Fresnel reflectance and the incident angle for specular reflection.
FIG. 19 is a flowchart for realizing the fine-weather normal estimation unit.
FIG. 20A is a polarization phase diagram for normal estimation based on geometry.
FIG. 20B is a view of a subject image captured with the subject lens.
FIG. 20C is a view of a sky image captured with the wide-angle lens that photographs the sky.
FIG. 21 is a diagram explaining the process of computing a normal from geometry.
FIG. 22 is a diagram of geometry-based normals and their reliabilities.
FIG. 23 is a polarization phase diagram for normal estimation based on specular reflection polarization.
FIG. 24 is a graph explaining the relational expression between the incident angle α and the degree of polarization ρ.
FIG. 25 is a diagram of normals based on specular reflection polarization and their reliabilities.
FIG. 26 is a diagram of the normal estimation result after reliability evaluation.
FIG. 27 is a flowchart of normal estimation performed in cloudy weather.
FIG. 28 is a flowchart showing the processing flow of the pseudo-3D conversion unit.
FIG. 29 is a diagram showing the relation between the world coordinate system and the camera coordinate system.
FIG. 30 is a diagram showing the pseudo-3D effect after viewpoint transformation.
FIG. 31 is a block diagram of another embodiment of the image processing apparatus according to the present invention.
FIG. 32 is a block diagram of a further embodiment of the image processing apparatus according to the present invention.
FIG. 33 is a block diagram of the image processing apparatus according to the second embodiment of the present invention.
FIG. 34 is a block diagram showing the configuration of the monochrome polarization image acquisition unit of the second embodiment of the present invention.
FIG. 35 is a block diagram explaining the fine-weather sky region separation unit of the second embodiment of the present invention.
FIG. 36 is a block diagram explaining the cloudy-weather sky region separation unit of the second embodiment of the present invention.
FIG. 37 is a block diagram showing the configuration of the cloudy-weather sky region separation unit 3304.
In the drawings: 200: polarization information acquisition unit; 201: color polarization image acquisition unit; 202: whole-sky polarization map acquisition unit; 203: weather determination unit; 204: fine-weather sky region separation unit; 205: cloudy-weather sky region separation unit; 206: fine-weather normal estimation unit; 207: cloudy-weather normal estimation unit; 208: pseudo-3D conversion unit; 210: normal estimation unit.
Embodiment
It is known that polarization phenomena outdoors arise almost exclusively from specular reflection. The present invention uses the polarization state of the specularly reflected light produced at the surface of a subject when it is illuminated by the sky, and estimates the surface normals of the subject from it. However, since the polarization state of the sky differs between fine and cloudy weather, a preferred embodiment separates the two cases and processes them differently.
FIG. 1A schematically shows the following situation: light emitted from a certain position 103a in the outdoor sky strikes a certain position 1600a on the surface of a subject 1600 (for example a house), changes direction by specular reflection, and enters the camera 100. The normal N at the position 1600a of the subject surface bisects the angle between the vector L pointing from the position 1600a toward the sky position 103a and the viewing vector V of the camera 100. Therefore, as shown in FIG. 1A, when the incident angle of the light is θ, the exit angle is also θ.
FIG. 1B schematically shows the situation in which light from another position 103b in the outdoor sky strikes another position 1600b on the surface of the subject 1600, changes direction by specular reflection, and enters the camera 100. Since the normal N changes with the position on the subject surface, the position in the sky (103a, 103b) from which the light entering the camera originates also changes.
In the examples shown in FIGS. 1A and 1B, the normal N is parallel to the plane of the drawing. Of course, since the normal N changes with the shape and position of the subject surface, it is not always parallel to the drawing. To obtain the normal N, not only the angle θ shown in FIG. 1A but also the rotation angle ψ around the viewing vector V of the camera must be determined.
In fine weather the sky is polarized, so the method shown in Patent Document 2 cannot be used; that is, the surface normal of the subject cannot be estimated in that way from the polarization produced by specular reflection at the subject surface. Nevertheless, in fine weather specular reflection does occur at the subject surface, so under the condition of a perfect mirror surface, the polarization state at a particular position in the sky appears unchanged on the corresponding part of the subject. For example, in FIG. 1B, the polarization state of the sky at the sky position 103b is reflected at the position 1600b on the surface of the subject 1600. Therefore, by using the correspondence between the polarization information in a separately acquired whole-sky polarization map and the polarization information appearing in the polarization image, it is possible to determine from which position in the sky the light entering the camera was emitted. Once this position is known, the surface normal N of the subject can be determined geometrically, as in FIGS. 1A and 1B.
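As a concrete illustration of this geometric step, the following sketch (Python with NumPy; a minimal sketch under the assumption of ideal mirror reflection, with illustrative function and variable names not taken from the patent) computes the normal N as the bisector of the direction L toward the identified sky position and the viewing direction V toward the camera, together with the incident angle θ.

```python
import numpy as np

def normal_from_sky_and_view(L, V):
    """Half-vector construction for specular reflection (cf. FIGS. 1A and 1B).
    L: unit vector from the surface point toward the sky position,
    V: unit vector from the surface point toward the camera.
    Returns the surface normal N and the incident angle theta (= exit angle)."""
    L = L / np.linalg.norm(L)
    V = V / np.linalg.norm(V)
    N = L + V
    N /= np.linalg.norm(N)                                # N bisects L and V
    theta = np.arccos(np.clip(np.dot(N, V), -1.0, 1.0))   # incident = exit angle
    return N, theta

# Example: sky light arriving from the upper left, camera along the +z direction
N, theta = normal_from_sky_and_view(np.array([-0.3, 0.8, 0.5]),
                                    np.array([0.0, 0.0, 1.0]))
print(N, np.degrees(theta))
```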
Here, the case of partly fine weather, in which part of the sky has clouds, is described. The preferred embodiment of the present invention treats this partly fine state entirely as fine weather. In this processing, the present invention decides, according to reliability, whether to adopt the normal based on specular reflection polarization, which presupposes a cloudy sky, or the geometry-based normal, which presupposes a fine sky.
If the sky is completely overcast and uniformly covered by cloud, it can be regarded as unpolarized illumination (hemispherical illumination). In that case, the same method as described in Patent Document 2 can be used to estimate the surface normals of the subject.
According to the present invention, the surface normals of the subject in a two-dimensional image can thus be estimated from polarization information obtained by photographing an outdoor subject. Using the normal information estimated in this way, three-dimensional information (depth information) can then be extracted from the two-dimensional image to generate a pseudo-3D image.
For the purposes of the present invention, the implementation can take the form not only of an image processing method and a pseudo-3D image generation method, but also of an image generation apparatus provided with means for carrying out each processing step, a computer program that causes a computer to execute each step, and a computer-readable storage medium, such as a CD-ROM, on which the program is stored.
An image processing apparatus of the present invention comprises a polarization image acquisition unit that acquires a polarization image, and a subject normal estimation unit that estimates the surface normals of an outdoor subject from the polarization information obtained from the polarization image.
In this specification, a "polarization image" means an image composed of a plurality of pixels each carrying polarization information within a two-dimensional image of the subject. That is, a polarization image is an image in which each of the pixels constituting it is expressed by the polarization information of that pixel. Polarization information includes the degree of polarization and the polarization phase (polarization angle). Unless otherwise specified, "polarization image" is therefore a collective term for the "degree-of-polarization image" and the "polarization phase image". A degree-of-polarization image is a two-dimensional representation of the degree of polarization of each pixel, and a polarization phase image is a two-dimensional representation of the polarization phase of each pixel. The magnitude (numerical value) of the degree of polarization or polarization phase of each pixel can be expressed by the luminance or hue of that pixel. In the drawings of the present application, these magnitudes are expressed by luminance.
An "image" in this specification need not be something recognizable by human vision; it is a state in which pixels are arranged two-dimensionally. That is, the term "image" is sometimes used to denote the arrangement (image data) of the information (numerical values) of each pixel composing the image, such as luminance, degree of polarization or polarization phase.
The image processing apparatus of the present invention includes a subject normal estimation unit for estimating the surface normals of an outdoor subject from the polarization information obtained from the polarization image.
In a preferred embodiment, the image processing apparatus includes a whole-sky polarization map acquisition unit for acquiring a whole-sky polarization map, which expresses the relation between positions in the whole sky and the polarization information at those positions. The image processing apparatus estimates the surface normals of the subject according to the whole-sky polarization map.
A preferred image processing apparatus of the present invention includes a weather determination unit that judges whether the weather is in a cloudy state, in which the degree of polarization is below a reference level, or in a fine state, in which the degree of polarization is at or above that reference level. The image processing apparatus of the present invention can measure the polarization state of the specularly reflected light at the surface of the outdoor subject and estimate the surface normals of the subject by different methods according to the weather state decided by the weather determination unit.
Embodiments of the present invention are described below with reference to the drawings.
(Embodiment 1)
First, the first embodiment of the image processing apparatus of the present invention is described.
The present embodiment is an image processing apparatus in the form of a camera; its external configuration is shown in FIG. 1C. The image processing apparatus (camera 100) of FIG. 1C includes: a subject lens unit 101 for acquiring the polarization image and color image information of the subject by photography; a wide-angle lens unit 102, provided on top of the camera, for acquiring the polarization image information of the sky by photography; and a level indicator 103 such as a spirit level. The subject is photographed with the camera 100 held level, the direction indicated by the level indicator 103 horizontal, and the wide-angle lens unit 102 pointing vertically upward.
The configuration of the present embodiment is described in more detail below with reference to FIG. 2, which is a functional block diagram of the image processing apparatus of the present embodiment.
The image processing apparatus of FIG. 2 has a polarization information acquisition unit 200, which includes a color polarization image acquisition unit 201 and a whole-sky polarization map acquisition unit 202.
The color polarization image acquisition unit 201 acquires, through the lens unit 101 of FIG. 1C, the degree-of-polarization image ρ, the polarization phase image Φ and the color image C of the subject. The whole-sky polarization map acquisition unit 202 is provided on top of the camera and acquires the polarization image information of the sky through the wide-angle lens unit 102 of FIG. 1C. Here, a "whole-sky polarization map" is a map expressing the sky polarization information at a plurality of positions (points) over the entire sky.
The image processing apparatus of the present embodiment also includes a normal estimation unit 210 and a pseudo-3D conversion unit 208. The normal estimation unit 210 includes a weather determination unit 203, a fine-weather sky region separation unit 204, a cloudy-weather sky region separation unit 205, a fine-weather normal estimation unit 206 and a cloudy-weather normal estimation unit 207.
In the present embodiment, the polarization state of the specularly reflected light at the surface of the outdoor subject is measured, and the surface normals of the subject are estimated by different methods according to the weather state. The weather determination unit 203 decides the weather state from the degree of polarization of the sky; it judges whether the weather was fine or cloudy when the scene was captured. Two examples are given below. In one example, the weather is judged to be a cloudy state when the degree of polarization of the sky is below a reference level, and a fine state when it is at or above that level. In the other example, the weather is judged to be a cloudy state when the region of the sky whose degree of polarization is below the reference level occupies at least a certain area of the whole sky, and a fine state otherwise. The cloudy state of the present invention can be regarded as a completely overcast state in which the whole sky is covered by cloud, so the fine state includes states in which part of the sky has clouds.
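To make the two decision rules concrete, the following sketch is a minimal illustration only; the threshold values, the area fraction and the function name are assumptions made for illustration and are not specified in the patent text.

```python
import numpy as np

def judge_weather(sky_rho, rho_ref=0.1, area_ref=0.5, use_area_rule=False):
    """sky_rho: per-pixel degree of polarization of the sky region (0..1).
    Rule 1: cloudy if the mean degree of polarization is below rho_ref.
    Rule 2: cloudy if the fraction of sky pixels below rho_ref is at least area_ref."""
    sky_rho = np.asarray(sky_rho, dtype=float)
    if use_area_rule:
        low_fraction = np.mean(sky_rho < rho_ref)   # area fraction of low-polarization sky
        return "cloudy" if low_fraction >= area_ref else "fine"
    return "cloudy" if sky_rho.mean() < rho_ref else "fine"

# Example with synthetic degree-of-polarization values
print(judge_weather(np.random.uniform(0.0, 0.4, size=1000)))
```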
The fine-weather sky region separation unit 204 receives ρ, Φ and C; in fine weather it separates the sky region and the subject region in the image, and outputs the fine-weather subject degree-of-polarization image ρfo and the fine-weather subject polarization phase image Φfo, from which the sky region has been removed. Likewise, in cloudy weather the cloudy-weather sky region separation unit 205 separates the sky region and the subject region, and generates the cloudy-weather subject degree-of-polarization image ρco and the cloudy-weather subject polarization phase image Φco, from which the sky region has been removed.
In fine weather, the fine-weather normal estimation unit 206 uses the whole-sky polarization map and estimates the normal map N from the information of the degree-of-polarization image and the polarization phase image; in cloudy weather, the cloudy-weather normal estimation unit 207 estimates the normal map N from the information of the degree-of-polarization image and the polarization phase image.
The pseudo-3D conversion unit 208 uses the normal map N obtained for fine or cloudy weather to apply pseudo-3D conversion to the color image C.
FIG. 3(A) is a block diagram showing the internal configuration of the polarization information acquisition unit 200, that is, the color polarization image acquisition unit 201 and the whole-sky polarization map acquisition unit 202. As shown in the figure, the polarization information acquisition unit 200 of the present embodiment includes: the subject lens unit 101 for photographing the subject, the wide-angle lens unit 102 provided facing directly upward on the camera, a movable mirror 303, a drive mechanism 304, a color polarization image sensor (color-and-polarization acquisition unit) 305, a polarization information processing unit 306 and a color information processing unit 307.
In the state shown in FIG. 3(A), light 310 from the sky passes through the wide-angle lens unit 102, is reflected by the movable mirror 303, and reaches the color polarization image sensor 305. During this shot, therefore, the color polarization image sensor 305 photographs the whole sky. The sky region can be captured over a wide field with, for example, a fisheye camera, so the polarization state of the sky (the whole sky) can be observed.
The polarization information processing unit 306 obtains, from the output of the color polarization image sensor 305, the whole-sky polarization map expressing the polarization state of the sky. A concrete method of obtaining a whole-sky polarization map is described in Non-Patent Document 2. In Non-Patent Document 2, a fisheye lens with F3.5 and a focal length of 8 mm is mounted facing upward on a digital camera, with a polarizer placed in front of it. By manually rotating the transmission axis of the polarizer to 0, 45 and 90 degrees, the sky polarization pattern over a 130-degree range of the whole sky is obtained. The present embodiment performs essentially the same processing, but instead of a rotating polarizer it uses a polarization image sensor, so the polarization map can be obtained in real time. The advantage is that no artifacts are produced by the movement of clouds and the like during polarizer rotation.
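As one way to index such a map by sky position, the following sketch maps a fisheye image pixel to a unit sky direction. The equidistant projection model, the image center and the focal length in pixels are assumptions made here for illustration; this passage of the patent states only that a wide-angle (fisheye) lens is used.

```python
import numpy as np

def fisheye_pixel_to_sky_direction(u, v, cx, cy, f_pix):
    """Map a fisheye image pixel (u, v) to a unit sky direction, assuming an
    ideal equidistant projection (radius = f_pix * zenith angle)."""
    dx, dy = u - cx, v - cy
    r = np.hypot(dx, dy)
    zenith = r / f_pix                    # angle from the zenith (image center)
    azimuth = np.arctan2(dy, dx)          # angle around the horizon circle
    return np.array([np.sin(zenith) * np.cos(azimuth),
                     np.sin(zenith) * np.sin(azimuth),
                     np.cos(zenith)])     # z axis pointing toward the zenith
```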
Next, the drive mechanism 304 rotates the mirror 303 upward (arrow 312); in this state, as shown in FIG. 3(B), light 311 from the subject lens unit 101 can enter the color polarization image sensor 305. When the subject is photographed, the polarization information processing unit 306 and the color information processing unit 307 thus operate. The order of photographing the subject and the whole sky may also be reversed.
The operation of the color polarization image acquisition unit 201 is described below, taking photography of the subject as an example. The operation of the whole-sky polarization map acquisition unit 202 is the same as far as the polarization-related processing is concerned.
Preferably, the outdoor scene image and the scene polarization image are acquired simultaneously or within a short time interval. Since clouds may be moved by the wind, a real-time acquisition scheme is preferable. The technique of rotating a polarizer and capturing a plurality of images to obtain a polarization image is not suitable outdoors, so a real-time polarization camera must be used.
Patent Document 3 discloses a technique for obtaining a monochrome image and a polarization image simultaneously in real time. In this technique, to obtain the luminance image and the partial polarization image of the subject at the same time, patterned polarizers with several different polarization main axes (polarization transmission axes) are arranged spatially on the image sensor. As the patterned polarizer, photonic crystals or structured birefringent wave-plate arrays are used. These techniques, however, still cannot acquire a color image and a polarization image simultaneously.
In contrast, with the configuration of FIG. 3(A), a color image of the subject can be acquired in real time and polarization images can be acquired at the same time, and two kinds of polarization image (the degree-of-polarization image ρ and the polarization phase image Φ) are output.
Incident light enters the color polarization image sensor 305. From this incident light, the color polarization image sensor 305 can acquire both color moving-image information and polarization image information in real time. The signals representing the color moving-image information and the polarization image information output by the color polarization image sensor 305 are input to the polarization information processing unit 306 and the color information processing unit 307, respectively. These two units apply various kinds of processing to the signals and output the color image C, the degree-of-polarization image ρ and the polarization phase image Φ.
FIG. 4(A) is a schematic view of the basic configuration of the color polarization image sensor 305. In the illustrated example, a color filter and a patterned polarizer are stacked in front of the image sensor pixels. Incident light passes through the color filter and the patterned polarizer, reaches the image sensor, and its luminance is observed at each image sensor pixel. In this way, according to the present embodiment, color image information and polarization image information can be acquired simultaneously with a color-mosaic single-chip color image sensor.
FIG. 4(B) is a view of part of the imaging surface of the color polarization image sensor 305 seen from directly above along the optical axis. For simplicity, only 16 pixels (4 × 4) of the imaging surface are shown. The four illustrated rectangular regions 401 to 404 correspond to the four Bayer-type color mosaic filters provided on the pixel cells. The rectangular region 404 is a B (blue) filter region covering pixel cells B1 to B4. B (blue) patterned polarizers whose polarization main axes differ from one another are closely attached to the pixel cells B1 to B4. Here, the "polarization main axis" is the axis parallel to the polarization plane (transmission plane) of the light transmitted through the polarizer. In the present embodiment, polarizer sub-units (fine polarizers) whose transmission planes have different angles are arranged adjacently within pixels of the same color. Specifically, four kinds of polarizer sub-units with mutually different transmission plane directions are arranged within the same-color pixel group of each of R (red), G (green) and B (blue). One polarizer sub-unit corresponds to one fine polarization pixel. In the figure, each polarization pixel is labeled with a symbol such as G1.
FIG. 4(C) shows the polarization main axes assigned to the four fine polarization pixels covered by the B (blue) polarizers. In FIG. 4(B), the straight line drawn on each fine polarization pixel indicates the polarization main-axis direction of its fine polarizer. In this example, the four fine polarization pixels have different polarization main-axis angles, namely ψi = 0°, 45°, 90° and 135°.
Four G (green) polarizers are closely attached to the pixels of each of the rectangular regions 402 and 403, and four R (red) polarizers are attached to the pixels of the rectangular region 401. In the figure, the position indicated by reference numeral 405 is a virtual pixel position, representing the pixel location assigned to the four pixels regarded as a whole in this imaging system. As shown in the figure, the patterned polarizer of each of the rectangular regions 401 to 403 is likewise divided into four parts with different polarization main axes.
Thus, the present embodiment has the feature that every color pixel contains a plurality of fine polarization pixels with different polarization main axes. The color mosaic arrangement itself is arbitrary. In the following description, each fine polarization pixel is called a "polarization pixel".
FIGS. 5(A) to (C) are graphs schematically showing the wavelength characteristics of the B (blue), G (green) and R (red) polarization pixels, respectively. The vertical axis of each graph is the transmitted light intensity and the horizontal axis is the wavelength. Each of the B, G and R polarization pixels has the polarization property of transmitting the TM (transverse magnetic) wave and reflecting (not transmitting) the TE (transverse electric) wave within its respective B, G or R wavelength band. The TM wave is the wave whose magnetic field component is transverse to the plane of incidence, and the TE wave is the wave whose electric field component is transverse to the plane of incidence.
FIG. 5(A) shows the polarization characteristics 502 and 503 of the B (blue) polarization pixel and the transmission characteristic 501 of the B (blue) color filter. The polarization characteristics 502 and 503 represent the transmittances of the TM wave and the TE wave, respectively.
FIG. 5(B) shows the polarization characteristics 505 and 506 of the G polarization pixel and the transmission characteristic 504 of the G color filter. The polarization characteristics 505 and 506 represent the transmittances of the TM wave and the TE wave, respectively.
FIG. 5(C) shows the polarization characteristics 508 and 509 of the R polarization pixel and the transmission characteristic 507 of the R color filter. The polarization characteristics 508 and 509 represent the transmittances of the TM wave and the TE wave, respectively.
The characteristics shown in FIGS. 5(A) to (C) can be realized, for example, with the photonic crystals described in Patent Document 3. In the case of a photonic crystal, light whose electric-field vibration plane is parallel to the grooves formed on the crystal surface becomes the TE wave, and light whose electric-field vibration plane is perpendicular to the grooves (whose magnetic-field vibration plane is parallel to the grooves) becomes the TM wave.
As shown in FIGS. 5(A) to (C), the point of the present embodiment is that a patterned polarizer exhibiting polarization separation characteristics is used in each of the B, G and R transmission wavelength bands.
When monochrome image luminance and polarization filters are used, the wavelength band in which the polarization separation characteristic appears need not be optimized; when polarization information is acquired for each color pixel, however, the color separation characteristics and the polarization separation characteristics must be made consistent with each other.
In this specification, the characteristic of a polarization pixel is denoted by the combination of one of the four numerals "1, 2, 3, 4" and one of the three letters "R, G, B" (for example "R1" or "G1"); the numeral indicates the polarization main-axis direction of the polarization pixel and the letter distinguishes the color. Polarization pixels R1 and G1 have the same numeral, so their polarization main axes have the same direction, but their RGB letters differ, so they are different polarization pixels and the light they transmit lies in different wavelength bands. The present embodiment realizes this arrangement of polarization pixels by the combination of the color filter and the patterned polarizer shown in FIG. 4(A).
The processing of the polarization information processing unit 306 of FIG. 3 is described below with reference to FIG. 6. FIG. 6 shows the luminances 601 to 604 of light transmitted through polarizers with four different polarization main-axis directions (ψi = 0°, 45°, 90°, 135°). Here, when the rotation angle of the polarization main axis is ψi, the observed luminance is Ii, where i is an integer from 1 to N and N is the number of samples. In the example shown in FIG. 6, N = 4, so i = 1, 2, 3, 4. FIG. 6 shows the luminances 601 to 604 corresponding to the four pixel samples (ψi, Ii).
The relation between the polarization main-axis angle ψi and the luminances 601 to 604 is expressed by a sinusoidal curve. In FIG. 6, the four luminance points 601 to 604 lie on a single sinusoid. When the sinusoid is determined from a larger number of observed luminances, some of the observed luminances may deviate slightly from the curve.
The "polarization information" in this specification means the information on the amplitude and phase of this sinusoid, that is, the dependence of luminance on the polarization main-axis angle.
In the actual processing, the luminance values of the four pixels inside each same-color region 401 to 404 shown in FIG. 4(A) are sampled, and the reflected luminance I corresponding to the main-axis angle ψ of the patterned polarizer is approximated by the following expression.
[Equation 1]
$I(\psi) = A \cdot \sin 2(\psi - B) + C$    (Formula 1)
Here, as shown in FIG. 6, A, B and C are constants that represent the amplitude, the phase and the mean value of the polarization luminance curve, respectively. (Formula 1) can be expanded as follows.
[Equation 2]
$I(\psi) = a \cdot \sin 2\psi + b \cdot \cos 2\psi + C$    (Formula 2)
where A and B are expressed by (Formula 3) and (Formula 4) below, respectively.
[Equation 3]
$A = \sqrt{a^2 + b^2}, \quad \sin(-2B) = \frac{b}{\sqrt{a^2 + b^2}}, \quad \cos(-2B) = \frac{a}{\sqrt{a^2 + b^2}}$    (Formula 3)
[Equation 4]
$B = -\frac{1}{2}\tan^{-1}\left(\frac{b}{a}\right)$    (Formula 4)
The relation between the luminance I and the polarization main-axis angle ψ can thus be approximated by the sinusoid of (Formula 1) by finding the A, B and C that minimize (Formula 5) below.
[Equation 5]
$f(a, b, C) = \sum_{i=1}^{N}\left(I_i - a \sin 2\psi_i - b \cos 2\psi_i - C\right)^2$    (Formula 5)
By the above processing, the three parameters A, B and C of the sinusoidal approximation are determined for one color. From them, the degree-of-polarization image representing the degree of polarization ρ and the polarization phase image representing the polarization phase Φ are obtained. The degree of polarization ρ represents how strongly the light at the pixel is polarized, and the polarization phase Φ represents the direction orthogonal to the main-axis angle of the partial polarization of the light at the pixel, that is, the phase at which the sinusoidal polarization luminance takes its minimum value. According to the Fresnel reflection theory, in specular reflection this is the plane (the plane of incidence) that contains the surface normal of the subject. Note that the main-axis angles of polarization at 0 and 180° (π) are equivalent. The values ρ and Φ (0 ≤ Φ ≤ π) are computed with (Formula 6) and (Formula 7) below, respectively.
[Equation 6]
$\rho = \frac{I_{\max} - I_{\min}}{I_{\max} + I_{\min}} = \frac{A}{C} = \frac{A}{\bar{I}}$    (Formula 6)
[Equation 7]
$\phi = \frac{3\pi}{4} + B$    (Formula 7)
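As a worked illustration of the fit of (Formula 1) to (Formula 7), the following sketch estimates a, b and C by least squares from the four polarization-pixel luminances of one color and derives A, B, ρ and φ. It assumes the four main-axis angles 0°, 45°, 90° and 135° described above; the function name and the sample values are illustrative only.

```python
import numpy as np

def fit_polarization(I, psi_deg=(0, 45, 90, 135)):
    """Fit I(psi) = a*sin(2*psi) + b*cos(2*psi) + C (Formula 2) by least squares
    (Formula 5), then derive the amplitude A, phase B, degree of polarization rho
    (Formula 6) and polarization phase phi (Formula 7)."""
    psi = np.radians(np.asarray(psi_deg, dtype=float))
    I = np.asarray(I, dtype=float)
    M = np.column_stack([np.sin(2 * psi), np.cos(2 * psi), np.ones_like(psi)])
    (a, b, C), *_ = np.linalg.lstsq(M, I, rcond=None)
    A = np.hypot(a, b)                       # Formula 3
    B = -0.5 * np.arctan2(b, a)              # Formula 4, atan2 keeps the quadrant
    rho = A / C                              # Formula 6: A divided by the mean luminance
    phi = (0.75 * np.pi + B) % np.pi         # Formula 7, wrapped into [0, pi)
    return rho, phi, C                       # C is the mean luminance I-bar

# Example: four observed luminances of one color (arbitrary values)
rho, phi, mean_I = fit_polarization([120.0, 180.0, 140.0, 80.0])
print(rho, np.degrees(phi), mean_I)
```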
The patterned polarizer of the present embodiment may also be a photonic crystal, a film-type polarization element, a wire-grid element, or a polarization element based on another principle.
The processing of the color information processing unit 307 shown in FIG. 3 is described next. The color information processing unit 307 computes the color luminances from the information output by the color polarization image sensor 305. The luminance of the light transmitted through a polarizer differs from the original luminance of the light before it enters the polarizer. Under unpolarized illumination, theoretically, the average of the observed luminances over all polarization main axes of the polarizer corresponds to the original luminance of the light before entering the polarizer. If the observed luminance at the pixel of polarization pixel R1 is denoted IR1, the color luminances can be computed with (Formula 8) below.
[Equation 8]
$\bar{I}_R = (I_{R1} + I_{R2} + I_{R3} + I_{R4}) / 4$
$\bar{I}_G = (I_{G1} + I_{G2} + I_{G3} + I_{G4}) / 4$
$\bar{I}_B = (I_{B1} + I_{B2} + I_{B3} + I_{B4}) / 4$    (Formula 8)
By obtaining the luminance for each polarization pixel in this way, an ordinary color mosaic image can be generated. The mosaic image is then converted into a color image in which every pixel has RGB values, yielding the color image C. This conversion can be realized with a known interpolation technique, for example Bayer mosaic interpolation.
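As a small illustration of (Formula 8), the sketch below averages the four polarization pixels of each color in one Bayer group to obtain the mean color luminances at the virtual pixel position. The 4 × 4 block layout assumed here (R group top left, G groups top right and bottom left, B group bottom right) is chosen for illustration and may differ from the actual sensor layout of FIG. 4(B).

```python
import numpy as np

def average_color_luminances(block4x4):
    """block4x4: 4 x 4 luminance block covering one Bayer group, with each
    2 x 2 quadrant holding the four polarization pixels of one color
    (layout assumed for illustration). Returns (I_R, I_G, I_B) per Formula 8."""
    b = np.asarray(block4x4, dtype=float)
    I_R = b[0:2, 0:2].mean()    # (I_R1 + I_R2 + I_R3 + I_R4) / 4
    I_G = b[0:2, 2:4].mean()    # one of the two G groups, as in Formula 8
    I_B = b[2:4, 2:4].mean()    # (I_B1 + I_B2 + I_B3 + I_B4) / 4
    return I_R, I_G, I_B
```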
In the color image C, the degree-of-polarization image ρ, and the polarization phase image Φ, the luminance and polarization information of each pixel are obtained from the four polarization pixels shown in Fig. 4(B). Each luminance and polarization value can therefore be regarded as representing the virtual pixel point 405 at the center of those four polarization pixels shown in Fig. 4(B). As a result, the resolution of both the color image and the polarization images is reduced to 1/2 vertically × 1/2 horizontally of the resolution of the original single-chip color image sensor. For this reason, it is preferable to make the pixel count of the image sensor as large as possible.
Figs. 7(A) to (C) show the three kinds of images (degree-of-polarization image ρ, polarization phase image Φ, and color image C) obtained by the color polarization image acquisition unit 201 when photographing a scene containing a distant building. The degree-of-polarization image ρ of Fig. 7(A) shows the strength of polarization: the whiter (brighter) the image, the higher the degree of polarization. The polarization phase image Φ of Fig. 7(B) shows the direction (angle) of the polarization phase as a numerical value. The polarization phase is expressed by mapping values of 0 to 180 degrees to luminance. Since the polarization phase is periodic with a period of 180 degrees, note that the white and black phase angles in the polarization image are continuous at 0 and 180 degrees. The color image C of Fig. 7(C) is an ordinary RGB luminance image. In practice, the color image is formed by obtaining the R, G, and B luminance images and combining them. For convenience, Fig. 7(C) shows only the luminance of the color image (a monochrome image).
The periphery of Figs. 7(A) to (C) is circular because the aperture of the standard viewing angle is circular. As these figures show, the processing described below assumes that the scene is photographed with the horizon kept level in the picture.
Figs. 8(A) to (C) are schematic diagrams of the same scene images. In the degree-of-polarization image ρ of Fig. 8(A), the white roof portion 801 shows the highest degree of polarization. In the polarization phase image Φ of Fig. 8(B), the phase angle of the polarization is shown by the polarization phase vector representation 803. The color image C of Fig. 8(C) shows the color luminance values.
Figs. 9(A) and (B) schematically show the two kinds of images (degree-of-polarization map and polarization phase map) obtained by the all-sky polarization map acquisition unit 202 at the time of photography. The all-sky degree-of-polarization map of Fig. 9(A) is an all-sky image showing the degree of polarization over the entire sky. The center point 901 indicates the zenith of the sky; the peripheral circumference 902 indicates the horizon; the subject 903 connected to the periphery indicates the subject to be photographed; and the dotted line 905 indicates the region of low degree of polarization around the sun 904. The dashed region 906 indicates the camera field of view when the subject is photographed.
On the other hand, the group of curves 905 to 912 on the polarization phase image of Fig. 9(B) carries the polarization phase vector representation 803 at each pixel. The positional relationship of this all-sky polarization phase image is the same as in Fig. 9(A). The angle of the polarization phase vector is uniquely defined as the angle with respect to the local horizon line 902 within the camera field of view 906. In this way, the polarization phase angles of the polarization phase vector representation of Fig. 8(B) above and that of Fig. 9(B) can be compared with each other. This property is essential for the later discussion of the polarization phase.
In the degree-of-polarization image of Fig. 9(A), the degree of polarization is low only around the sun 904, in the region indicated by the dotted line 905, and is higher near the horizon. In other words, the degree of polarization is not uniform over the whole sky. This is the typical situation on a clear day. Although the degree of polarization of the sky changes with the altitude of the sun, the fact that it forms a non-uniform pattern does not change. That is, on a clear day, the sky illumination is a light source distributed over the whole sky in which non-polarized regions and variously polarized regions are formed. Therefore, given the "all-sky polarization maps" shown in Figs. 9(A) and (B), the degree of polarization and the polarization phase of the light (illumination light) striking the subject can be determined from the corresponding sky position (light source position). Moreover, when specular reflection occurs on a subject on the ground, only the light reflected toward the camera observation viewpoint is observed, so the sky light source position is determined geometrically by the camera observation viewpoint and the subject position. In other words, only light whose polarization state corresponds to that geometrically determined sky light source position can strike the subject and cause the observed specular reflection.
The degree of polarization in the all-sky polarization map decreases markedly over the whole sky on a cloudy day. This is because the Rayleigh scattering by oxygen molecules and the like, which polarizes the blue sky on a clear day, is replaced by Mie scattering from water vapor and the like on a cloudy day. As a result, on a cloudy day, even if regions of blue sky remain under the clouds, almost the entire sky falls into a uniform state of extremely low polarization.
Fig. 10A is a flowchart showing one example of the operation of the weather determination unit 203 of Fig. 2.
In this flow, if the mean value ρave of the degree-of-polarization information of the all-sky polarization map is larger than a threshold ρw, the weather is judged to be clear; if it is smaller than ρw, the weather is judged to be cloudy. Because what is observed is not a photographed image of some viewing angle of the scene but the whole sky, the weather judgment is more accurate. The clear/cloudy determination result for each of the images ρ, Φ, and C is output to the fine-weather sky region separation unit 204 and the cloudy-weather sky region separation unit 205.
Fig. 10B is a flowchart showing another example of the operation of the weather determination unit 203 of Fig. 2.
In this flow, first the mean value ave and the standard deviation σ of the degree-of-polarization information of the all-sky polarization map are obtained. In addition, the area S of the region of the all-sky polarization map whose degree of polarization ρ lies within ave ± 3σ is calculated. It is then judged whether the mean value ave and the standard deviation σ of the degree-of-polarization information are smaller than thresholds AV and ST, respectively.
If the mean value ave and the standard deviation σ of the degree-of-polarization information are both smaller than the thresholds AV and ST, and the area S of the region whose degree of polarization lies within ave ± 3σ is smaller than a constant AT, the weather is judged to be a (completely) cloudy state; otherwise it is judged to be clear.
This decision method is stricter in judging cloudiness. When the weather is judged cloudy, the whole sky is almost completely covered by clouds and the luminance is uniform, which is roughly the same as the prior-art state of being completely covered by a diffusing globe. Because what is observed is not a photographed image of some viewing angle of the scene but the whole sky, the weather judgment is more accurate. The clear/cloudy determination result for each of the images ρ, Φ, and C is output to the fine-weather sky region separation unit 204 and the cloudy-weather sky region separation unit 205.
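The two decision flows of Figs. 10A and 10B might be sketched as follows (Python; the thresholds ρw, AV, ST, and AT are illustrative placeholders, and the area S is approximated here by a pixel fraction).

```python
import numpy as np

def judge_weather(sky_dop, rho_w=0.1, av=0.08, st=0.05, at=0.6):
    """Weather decision sketch. `sky_dop` holds the degree-of-polarization
    values of the all-sky polarization map (sky pixels only)."""
    ave, sigma = sky_dop.mean(), sky_dop.std()

    # Fig. 10A: simple test on the mean degree of polarization
    simple = "fine" if ave > rho_w else "cloudy"

    # Fig. 10B: stricter test, also checking how much of the sky lies
    # within ave +/- 3*sigma (a fraction is used here in place of area S)
    frac = np.mean(np.abs(sky_dop - ave) <= 3 * sigma)
    strict = "cloudy" if (ave < av and sigma < st and frac < at) else "fine"
    return simple, strict

print(judge_weather(np.random.rand(1000) * 0.4))
```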
Next, the fine-weather sky region separation unit 204 and the cloudy-weather sky region separation unit 205 of Fig. 2 are described.
When the scene image contains sky, only the subject is the target of the pseudo-3D processing, so the two sky region separation units 204 and 205 separate the sky region from the scene images of Fig. 7 and Fig. 8.
Figure 11 is a block diagram explaining the fine-weather sky region separation unit 204.
The fine-weather sky region separation unit 204 receives the degree-of-polarization image ρ and the color image C, and outputs the subject degree-of-polarization image ρfo and the subject polarization phase image Φfo in which the sky region has been separated from the scene.
The degree-of-polarization binarization unit 1101 binarizes the degree-of-polarization image ρ using a threshold Tρ. The luminance conversion unit 1102 converts the color image C into a luminance image Y. The luminance binarization units 1103 and 1104 binarize the luminance image produced by the luminance conversion unit 1102 using thresholds TC1 and TC2. The hue error conversion unit 1105 applies an HSV conversion to the color image C and generates a hue error image that expresses the hue deviation from the sky hue. The hue binarization unit 1106 thresholds the hue error image so as to extract only the sky-hue region. The operation unit 1107 performs an AND (logical product) operation between the polarization image binarized by the degree-of-polarization binarization unit 1101 and the luminance image binarized by the luminance binarization unit 1103. The operation unit 1108 performs an AND operation between the luminance binarized by the luminance binarization unit 1104 and the specific hue binarized by the hue binarization unit 1106 using the threshold TH.
The subject mask selection unit 1110 decides, according to the result of the degree-of-polarization determination unit 1109, which of the following masks to adopt: (i) the first blue-sky region mask 1111 generated from the degree of polarization and the luminance; or (ii) the second blue-sky region mask 1112 generated from the hue similarity and the luminance.
The operation units 1113 and 1114 apply a logical AND between the output subject mask image 1115 and the degree-of-polarization image ρ and the polarization phase image Φ, generating the fine-weather subject degree-of-polarization image ρfo and the fine-weather subject polarization phase image Φfo.
An existing method for detecting the blue-sky region searches the color image for regions whose hue is close to blue and which are smooth. However, using color information raises the following problems: (i) the hue information varies greatly, for example at sunset the sky is tinted from blue to magenta or red; and (ii) when a building on the ground is blue or white, the actual sky or clouds cannot be distinguished from it.
It is therefore preferable not to rely on color information, which varies greatly because of these physical factors, and to detect the sky using only luminance information. For such sky region detection one may assume, for example, that the brightest region in the scene image is the sky. Experiments show that a method based on this assumption gives reasonably good results on cloudy days and at sunset, but on clear days the specular reflection from buildings on the ground is often brighter than the sky, so good results cannot be obtained. This is because the specular reflection on smooth artificial surfaces caused by the overall illumination of the blue sky, rather than the regular reflection of direct sunlight, is particularly strong.
Therefore, the present embodiment also uses the scene degree of polarization, in addition to the scene luminance, to detect the blue-sky region. It exploits the fact that the degree of polarization of the sky near the horizon is very high in clear daytime.
Non-patent document 3 reports the polarization state of the whole sky recorded every hour over the twelve hours from morning (sunrise) to evening (sunset). According to that document, except toward the east and west in the morning and evening, the degree of polarization of the sky near the horizon is strong at almost all times. Experiments show that in most cases this sky polarization is stronger than that of distant views such as mountain ranges and of artificial objects such as buildings on the ground. Detecting the sky region from the degree of polarization can therefore be an effective sky detection method. The roofs and glass of buildings on the ground also show strong polarization; to remove the polarization caused by such buildings, it suffices to use the degree of polarization and the luminance together when generating the detection mask. However, for the western sky in the morning and the eastern sky in the evening, both the degree of polarization and the luminance are low, so this method is not applicable. In that case, the sky is detected using color hue and luminance.
Next, the operation of the fine-weather sky region separation unit 204 shown in Figure 11 is described using Figs. 12(A) to (H), which show an actual scene image. In the following description the imaging range of the scene image is a circular region; the reason is that the camera apparatus used in the experiment exhibited lens vignetting, and the scene image can essentially be regarded as a rectangular image.
Fig. 12(A) is the degree-of-polarization image ρ of the scene image. After the processing of the degree-of-polarization binarization unit 1101 (Tρ = 0.14), the degree-of-polarization image ρ becomes the image of Fig. 12(B). The binarization threshold is determined from the histogram of the degree of polarization. In this scene, the sky region and landscape elements such as structures on the ground divide into regions of higher and lower degree of polarization, forming a bimodal distribution. The midpoint between the two peaks of the degree-of-polarization histogram is taken as the threshold. In the figure, the cloud region on the right side of the building is also removed because its degree of polarization is low. Only the black camera support platform at the bottom, whose polarization is relatively strong, remains and cannot be removed.
Fig. 12(C) is the luminance image obtained from the color image of the scene by the luminance conversion unit 1102. Binarizing this luminance image in the luminance binarization unit 1103 (TC1 = 0.4) yields the image of Fig. 12(D). In this scene the luminance of the blue sky and the luminance of the building are almost equal, so it is difficult to separate the two by luminance. Even so, dark parts such as the camera support platform can be removed by setting the threshold appropriately.
When the above two mask images are processed by the operation unit 1107, the partly low-polarization cloud region is removed as shown in Fig. 12(E), and only the blue-sky region is isolated. Fig. 12(F) is a schematic diagram showing this mask image as an ordinary rectangular image; it corresponds to the first blue-sky region mask 1111 of Figure 11.
Finally, in the operation units 1113 and 1114, the logical AND of this mask with the degree-of-polarization image and the polarization phase image is computed, generating the subject degree-of-polarization image of Fig. 12(G) and the subject polarization phase image of Fig. 12(H).
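A minimal sketch of this mask construction follows (assuming the ρ and luminance images are NumPy arrays normalized to [0, 1]; the thresholds follow the example values quoted in the text).

```python
import numpy as np

def fine_sky_mask(dop, luminance, t_rho=0.14, tc1=0.4):
    """Sketch of the first blue-sky mask 1111: AND of the binarized
    degree-of-polarization image and the binarized luminance image
    (Fig. 11, units 1101 / 1103 / 1107)."""
    high_pol = dop >= t_rho          # degree-of-polarization binarization
    bright = luminance >= tc1        # luminance binarization
    return high_pol & bright         # logical AND -> sky candidate region

def separate_subject(dop, phase, sky_mask):
    """Keep only non-sky pixels in the polarization images
    (operation units 1113 / 1114), producing rho_fo and phi_fo."""
    subject = ~sky_mask
    return np.where(subject, dop, 0.0), np.where(subject, phase, 0.0)

dop = np.random.rand(240, 320)
lum = np.random.rand(240, 320)
phase = np.random.rand(240, 320) * np.pi
rho_fo, phi_fo = separate_subject(dop, phase, fine_sky_mask(dop, lum))
```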
Next, a case in which the above method is not applicable is described using the scene of the eastern sky photographed at dusk in Figure 13. Figs. 13(A) to (D) are, respectively, the scene degree-of-polarization image ρ, the binarization result of the scene degree-of-polarization image, the scene luminance image C, and the binarization result of the scene luminance image. In the end, the mask image shown in Fig. 13(E) is obtained, which shows that detection of the blue-sky region has failed.
The reason for the failure is that in the scene polarization image of Fig. 13(A) both the degree of polarization and the luminance of the sky are low. In such a case, therefore, the degree-of-polarization determination unit 1109 evaluates the average degree of polarization from the degree-of-polarization histogram of the scene image; if the average degree of polarization is below a prescribed threshold (Tρ1 = 0.1), the above method is not used and the method based on color hue and luminance is adopted instead. This processing is described below using Figure 11 and Figure 14.
First, the hue error conversion unit 1105 of Figure 11 obtains the hue-angle error, which expresses the difference between the blue hue angle taken as the sky hue and the hue angle of the color image C; the color image C is thereby converted into a hue error image.
The reason the sky hue is set to blue here is that the processing using the color image described above is limited to the case where both the degree of polarization and the luminance of the sky are low; this rests on the assumption that the sky in question is either the western sky in the morning or the eastern sky at dusk, so its color can be regarded as blue.
Let the hue angle (0° to 360°) of typical sky blue be Hsky (= 254°), and let the hue angle of the input scene be Htest. Using the well-known conversion (RGB_to_H) from the RGB color space to the hue H of the HSV (hue, saturation, value) space, and regarding the hue angle as periodic with a period of 360°, the hue error ΔH is expressed by the following formula.
[numerical expression 9]
$H_{test} = \mathrm{RGB\_to\_H}(R, G, B)$
$H_{\min} = \min(H_{test}, H_{sky})$
$H_{\max} = \max(H_{test}, H_{sky})$
$\Delta H = \min(H_{\max} - H_{\min},\; H_{\min} + 360^\circ - H_{\max})$ (formula 9)
The hue binarization unit 1106 of Figure 11 thresholds this hue error ΔH, obtaining blue-sky candidates in the color image C.
Fig. 14(A) shows the hue error image obtained by applying the hue error conversion unit 1105 of Figure 11 to the same scene image as Figure 13. Fig. 14(B) shows the mask image obtained by hue binarization of this hue error image in the hue binarization unit 1106 (TH = 220°). The image of Fig. 14(C) is the result of binarizing the luminance with the threshold (TC2 = 0.29) in the luminance binarization unit 1104 of Figure 11, and the image of Fig. 14(D) is the mask image computed by the operation unit 1108 of Figure 11. In this case, this mask image is adopted by the subject mask selection unit 1110, and the fine-weather sky region image is generated.
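A small sketch of the hue-error path of (formula 9) follows; the hue-error threshold th_hue below is an illustrative value, not one taken from the text.

```python
import colorsys

def hue_error(r, g, b, h_sky=254.0):
    """DeltaH of (formula 9): circular distance between the pixel hue and
    the assumed sky-blue hue Hsky = 254 degrees. RGB values are in [0, 1]."""
    h_test = colorsys.rgb_to_hsv(r, g, b)[0] * 360.0   # RGB_to_H
    h_min, h_max = min(h_test, h_sky), max(h_test, h_sky)
    return min(h_max - h_min, h_min + 360.0 - h_max)

def is_sky_candidate(r, g, b, y, th_hue=40.0, tc2=0.29):
    """Blue-sky candidate test combining the hue mask (unit 1106) with the
    luminance mask (unit 1104, TC2 = 0.29). th_hue is an assumed threshold."""
    return hue_error(r, g, b) < th_hue and y >= tc2

print(is_sky_candidate(0.35, 0.55, 0.95, 0.6))
```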
Figure 15 is a block diagram showing the configuration of the cloudy-weather sky region separation unit 205.
Experiments show that on a cloudy day the brightest region in the scene is usually the sky. Therefore, the color image C is converted into a luminance image by the luminance conversion unit 1102, and the luminance image is binarized by the luminance binarization unit 1103 to form a mask. By taking the logical AND of this mask with the degree-of-polarization image and the polarization phase image in the operation units 1113 and 1114, the cloudy-weather subject degree-of-polarization image ρco and the cloudy-weather subject polarization phase image Φco are generated, respectively.
Figure 16 shows the scene image after the sky region has been separated on a clear or cloudy day, leaving the set of subject surfaces. What is carried out next is the estimation of the normal of each surface of the building. In this scene, the building consists of the roof R1 indicated by reference numeral 1601, the roof R2 indicated by reference numeral 1602, the wall surface B1 facing the camera indicated by reference numeral 1603, the side wall surface B2 indicated by reference numeral 1604, and the glass portion W on the wall surface B1 indicated by reference numeral 1605; they are assumed to stand on the ground surface G indicated by reference numeral 1606.
Next, with reference to Figs. 17(A) and (B), the relation between the polarization of the sky on clear and cloudy days and the specular reflection on the subject is described.
Experiments show that, on a clear day, the result of specular reflection on an outdoor subject surface depends very strongly on the polarization state of the incident light. The arrow 1701 of Fig. 17(A) indicates the polarization direction (polarization principal-axis direction, parallel to the plane of vibration) and magnitude of the incident light, and the rectangle 1702 indicates the change in the energy reflectance of the P wave and the S wave on the subject surface. For a refractive index n = 1.4, the energy reflectances of the P wave and the S wave behave as shown in Figure 18, with the S-wave reflectance always higher than that of the P wave. Figure 18 is a graph of the Fresnel reflectance of specular reflection as a function of incident angle.
Here, because the incident light is linearly polarized light having only one plane of vibration, the light reflected from this surface is polarized light whose polarization phase angle is slightly rotated from that of the incident polarization, that is, light polarized in the direction indicated by the arrow 1703. The principal-axis direction of the reflected light, however, does not rotate and remains in the S direction. In other words, when polarized light illuminates the subject, there is no direct relation between the phase of the polarization after specular reflection and the surface normal, so normal information cannot be obtained from the polarization. This is completely different from the case of non-polarized illumination described later, and is the biggest problem when the outdoor sky is used as illumination. Even so, there are cases in which the polarization information can still be used reliably.
As shown in Figure 18, the P-wave and S-wave reflectances become equal, for example, in the region 1801 where the incident angle is near 0 degrees (incidence from directly above) and in the region 1802 where the incident angle is near 90 degrees. Polarized light specularly reflected at the angles of these regions 1801 and 1802 therefore does not undergo the change of polarization phase described above; a state of complete mirror reflection is formed, so the reflected light reproduces the polarization state of the sky as it is. In other words, for surfaces that receive light from directly behind the camera or head-on, the polarization state of the sky is reproduced as it is and can be used as normal information.
On the other hand, the incident light on a cloudy day is non-polarized; the non-polarized state indicated by the circle 1704 of Fig. 17(B) is represented as a superposition of waves of roughly equal amplitude whose planes of vibration point in many different directions. When this light (non-polarized illumination) is reflected on the subject surface, a difference arises between the energy reflectances of the P wave and the S wave. After being modulated as indicated by the rectangle 1702, the light becomes elliptical partially polarized light 1705. At this time the principal-axis direction of the partial polarization is the S direction, and the subject surface normal is contained in the corresponding plane. Consequently, under non-polarized illumination, the specular reflection on the surface of an outdoor subject on a cloudy day allows the subject surface normal information to be obtained from the polarization phase of the reflected light.
From the above considerations, the following normal estimation processes can be established for clear and cloudy weather.
Figure 19 is a flowchart showing the operation of the fine-weather normal estimation unit 206. Fig. 20A is an explanatory diagram using a schematic scene image.
First, in step S1901, the degree of polarization and the polarization phase of a given pixel of the subject are obtained. In step S1902, it is judged whether the degree of polarization is lower than a fixed value ρ1. If the degree of polarization is lower than ρ1, this pixel probably does not receive direct illumination but rather ambient light that has undergone multiple reflections. Experiments show that when ambient light is received, a surface near the ground facing the camera is most likely to be receiving it. Therefore, this pixel is treated as a pixel of a surface near the ground that faces the camera, and its normal is set accordingly. Such a surface corresponds, for example, to the wall surface B1 facing the camera in Fig. 20A. The "×" marks in Fig. 20A indicate extremely low degrees of polarization.
In step S1903, based on the sky polarization map of Fig. 9, the sky is searched for a location whose polarization phase is similar to the polarization phase of this pixel, to find the light source position of the light that causes the specular reflection at this pixel. Here, the reference for expressing the polarization phase angle is, as described above, the local horizon line 902 of the camera field of view 906 shown in Figs. 9(A) and (B). Accordingly, when the polarization phase angle Φsky (0° ≤ Φsky ≤ 180°) at the sky position coordinates (θL, ΦL), taken about the sky center 901 of Fig. 9(B), is determined with respect to this reference, and the polarization phase angle Φobj (0° ≤ Φobj ≤ 180°) of the subject photographed by the camera is obtained with respect to the horizon reference, the sky position coordinates (θL, ΦL) at which DiffΦ of (formula 10), described later, becomes minimum are found.
Using this method, the G and W regions of Fig. 20A are found. First, suppose it has been found that the polarization phase angle of the G region 1606 (with respect to the horizon) is very close to the polarization phase angle of the polarization at the sky region 2002. In that case, the angle (θL, ΦL) of this sky region 2002 in the camera coordinate system is recorded.
Note here that light from the sky behind the camera can strike surfaces nearly facing the camera, so the W region 1605 of Fig. 20A reflects, at an incident angle of about 0 degrees, the light whose source is the sky region 2003 behind the camera. At the incident angles of the region 1801 of Figure 18, specular reflection occurs on a surface facing the camera, so the polarization phase of the sky appears mirrored left to right about the horizon reference. In the example of Fig. 20A, the light source position found for the light incident on the W region 1605 is the sky region 2004, whose polarization phase direction is the mirror image of the polarization phase direction of the W region 1605.
The camera of the present embodiment includes the subject lens unit 101 and the wide-angle lens unit 102 shown in Fig. 1C; since the two have different coordinate systems, a few additional steps are needed when searching the polarization phase.
Fig. 20B shows the subject image photographed with the subject lens. To determine the angle of the polarization axis, the camera horizontal reference line 2010 must be horizontal, that is, parallel to the ground horizon. This can be achieved by using the level indicator 103; there is no need to photograph the horizon with the camera. The polarization phase Φobj of the light reflected from the subject is the angle with respect to the camera reference line 2010 shown in Fig. 20B. If this reference is disturbed, the angular reference of the polarization phase becomes unclear and the search cannot be performed.
Fig. 20C is the sky image photographed with the wide-angle lens that captures the sky. The coordinate conversion performed when searching the polarization phase of the whole sky is described below with reference to Fig. 20C.
The direction 2021 is the direction in which the subject lens of the camera points. The polarization phase angles of the whole sky are all angles with respect to the reference line 2022. With the reference line 2022 as the boundary, the sky along the direction 2021 can be divided into two regions: the "sky region in front of the camera" 2023 and the "sky region behind the camera" 2024.
From the above, the polarization phase 2011, referenced to the local horizon line 902, can be extracted for the camera field of view 906 from the image photographed with the subject lens. Because the local horizon line 902 is parallel to the whole-sky polarization phase-angle reference line 2022, the sky polarization phase angle Φsky of the "sky region in front of the camera" 2023 (region 2025) can be used as it is when that region is searched.
Next, when the "sky region behind the camera" 2024 is searched, the angles need only be converted to their mirror images with respect to the reference line 2022 before the search. Considering that the polarization phase angle originally lies in the range 0° to 180°, DiffΦ is expressed by the following (formula 10).
[numerical expression 10]
1) when searching the sky region in front of the camera:
$\mathrm{Diff}\phi = |\phi_{sky} - \phi_{obj}|$
2) when searching the sky region behind the camera:
$\mathrm{Diff}\phi = |(180^\circ - \phi_{sky}) - \phi_{obj}|$ (formula 10)
In this way, it suffices to search for the sky position that minimizes DiffΦ of (formula 10).
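The search for the sky position minimizing DiffΦ of (formula 10) might look as follows (a sketch that assumes the sky phase map is given as a flat array of samples with a per-sample front/behind flag; both are assumptions made for illustration).

```python
import numpy as np

def find_light_source(phi_obj_deg, sky_phase_deg, in_front_of_camera):
    """Search the all-sky polarization phase map for the sky sample whose
    phase best matches the subject phase phi_obj (formula 10).
    `sky_phase_deg`       : phase angles of the sky samples (horizon reference)
    `in_front_of_camera`  : boolean per sample, True for region 2023."""
    # Phases behind the camera are mirrored about reference line 2022
    candidate = np.where(in_front_of_camera, sky_phase_deg,
                         180.0 - sky_phase_deg)
    diff = np.abs(candidate - phi_obj_deg)
    return int(np.argmin(diff))        # index of the best-matching sky sample

sky_phase = np.random.rand(360) * 180.0
front = np.arange(360) < 180           # illustrative front/back split
print(find_light_source(35.0, sky_phase, front))
```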
In step S1905, from the sky light source position found in this way, the pixel position of the subject, and the observation viewpoint of the camera, the normal at the subject pixel position is calculated geometrically. Figure 21 is an explanatory diagram of this processing. In the camera coordinate system, once the vector L (2102) corresponding to the sky light source position coordinates (θL, ΦL) 2101 and the line-of-sight vector V (2103) of the camera are determined, the normal N (2104) is obtained by bisecting these two vectors.
In this specification, the method of estimating the normal in this way is called "geometry-based normal estimation", while the method of estimating the normal from the polarization produced by specular reflection of non-polarized light on the subject surface is called "specular-polarization-based normal estimation".
Referring again to Figure 19.
In step S1906, a reliability is assigned to the geometrically obtained normal as a function of the incident angle θ. This is because, as described above, even if the polarization of the subject is similar to that of the sky, the polarization of the sky is faithfully reproduced by specular reflection only near incident angles of 0 degrees or 90 degrees; near θ = 45 degrees it cannot be expected to be reproduced correctly. The reliability Conf is therefore expressed, for example, by the following formula.
[numerical expression 11]
$\mathrm{Conf} = \min(\theta,\; 90^\circ - \theta)$ (formula 11)
With the reliability Conf defined in this way, the closer its value is to 0, the higher the reliability, and the closer it is to 45, the lower the reliability.
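A compact sketch of steps S1905 and S1906, that is, the geometric normal obtained by bisecting L and V and the reliability of (formula 11), is given below; the vectors are illustrative only.

```python
import numpy as np

def geometric_normal(L, V):
    """Step S1905: the normal N bisects the light-source vector L and the
    line-of-sight vector V (both taken as pointing away from the surface)."""
    L = L / np.linalg.norm(L)
    V = V / np.linalg.norm(V)
    N = L + V
    return N / np.linalg.norm(N)

def reliability(N, V):
    """Step S1906 / (formula 11): Conf = min(theta, 90 - theta), where theta
    is the angle between the normal and the viewing direction."""
    cos_t = np.clip(np.dot(N, V / np.linalg.norm(V)), -1.0, 1.0)
    theta = np.degrees(np.arccos(cos_t))
    return min(theta, 90.0 - theta)

L = np.array([0.2, 0.9, -0.4])   # illustrative sky light-source direction
V = np.array([0.0, 0.1, -1.0])   # illustrative viewing direction
N = geometric_normal(L, V)
print(N, reliability(N, V))
```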
Figure 22 schematically shows the state in which the incident angle θ and the reliability Conf have been assigned to the geometrically obtained normals. The regions with higher reliability (a Conf value of 0 here) are the G region 2201 of the ground surface, the region 2202 of the wall surface B1 facing the camera, and the W region of the window glass, which likewise faces the camera.
In step S1907 of Figure 19, normals are calculated from the specular reflection of light that is not polarized (non-polarized light). In general, even on a clear day there are non-polarized sky regions caused by partial cloud cover. In that case, the surface normal can be estimated from the polarization state of the specularly reflected light. Normal estimation from the polarization of this specular reflection is carried out by the same method used to obtain normals on a cloudy day.
Refer to Figure 23, which shows the polarization phases observed in the scene image.
The polarization phase (the angle in the image plane) at the points on the subject surface indicated by the two arrows in Figure 23 represents, as in (formula 7), the direction orthogonal to the principal axis of the partial polarization of the reflected light. According to the Fresnel reflection theory, when specular reflection occurs, the normal of the subject is contained in the plane parallel to the direction indicated by this phase angle. From this, one of the two degrees of freedom (ψN, θN) of the normal direction can be obtained. That is, the direction indicated by the polarization phase at each part of the subject shown in Figure 23 itself gives one degree of freedom (ψN) of the normal. However, the phase direction has a 180-degree ambiguity. This ambiguity can be resolved by assuming, as shown in Figure 21, that inclined surfaces such as the roofs of the outdoor subject must face upward toward the camera.
The other degree of freedom θN can be determined, following the method of patent document 1, from the relation between the incident angle θ and the degree of polarization ρ. This is because, when the refractive index of the subject is n, the following relation holds between the incident angle θ and the degree of polarization ρ.
[numerical expression 12]
$\rho = \dfrac{F_S - F_P}{F_S + F_P} = \dfrac{2 \sin\theta \tan\theta \sqrt{n^2 - \sin^2\theta}}{n^2 - \sin^2\theta + \sin^2\theta \tan^2\theta}$ (formula 12)
Figure 24 is a graph of this relation, with the incident angle θ on the horizontal axis and the degree of polarization ρ on the vertical axis. After the degree of polarization ρ is obtained by polarization observation, n is assumed to be a typical material constant, for example 1.4, and θ is obtained. The problem is that two candidate solutions θ1 and θ2 are obtained, separated by the Brewster angle θB indicated by reference numeral 2402. In the present embodiment, therefore, the candidate solutions are restricted to the range below the Brewster angle (the range indicated by the arrow 2401), and the single candidate θ1 is selected from this range. This is because, when photographing an outdoor building, illumination from the background sky striking its wall surfaces at incident angles of 60 to 70 degrees or more is probably in the complete-mirror state of Figure 18, so the geometry-based normal is very likely to be adopted for such surfaces. After θ has been estimated, the normal vector is determined using the line-of-sight vector V of the camera.
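The inversion of (formula 12) restricted to angles below the Brewster angle can be sketched numerically as follows (n = 1.4 as assumed in the text; the measured ρ is illustrative).

```python
import numpy as np

def dop_from_angle(theta_deg, n=1.4):
    """Degree of polarization of specular reflection (formula 12)."""
    t = np.deg2rad(theta_deg)
    s, tan = np.sin(t), np.tan(t)
    return 2.0 * s * tan * np.sqrt(n**2 - s**2) / (n**2 - s**2 + (s * tan)**2)

def incident_angle_from_dop(rho, n=1.4):
    """Invert (formula 12) numerically, restricting the search to angles
    below the Brewster angle (arrow 2401) so that a single solution
    theta1 is returned."""
    brewster = np.degrees(np.arctan(n))
    thetas = np.linspace(0.0, brewster, 2000)
    return thetas[np.argmin(np.abs(dop_from_angle(thetas, n) - rho))]

print(incident_angle_from_dop(0.3))   # illustrative measured rho
```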
In step S1908, the reliability of the normal just obtained from the specular-reflection polarization is calculated. This reliability can be obtained by checking how non-polarized the incident light is, since non-polarized incidence is the prerequisite for using specular-reflection polarization. Specifically, the light source vector is obtained from the normal vector and the line-of-sight vector, and the degree of polarization ρ at the corresponding light source position is looked up in the all-sky polarization map.
[numerical expression 13]
Conf=ρ (formula 13)
Figure 25 shows the state in which the two degrees of freedom of the specular-polarization-based normals have been fixed, together with the reliabilities assigned to them.
In step S1909, from the two kinds of normal candidates, namely the normal candidates shown in Figure 22 and those shown in Figure 25, the normal with the higher reliability (Conf value closer to 0) is adopted according to the respective evaluation criteria. The geometry-based normal or the specular-polarization-based normal adopted in this way is then assigned to the pixel position.
This processing is done, for example, by eliminating the normals 2204 and 2205 of Figure 22 whose Conf is 10 or more, and eliminating the normals 2501 and 2502 of Figure 25 whose Conf is 0.5 or more.
Figure 26 shows an example of the normals after the reliability evaluation. As this example shows, apart from the B1 region, the geometry-based normals were adopted for the G and W regions, while the specular-polarization-based normals were adopted for the remaining regions R1, R2, and B2.
Figure 27 is a flowchart of the normal estimation performed on a cloudy day. The flow is the same as the specular-polarization-based normal estimation of the fine-weather normal estimation unit 206. On a cloudy day, however, the whole sky is non-polarized illumination; because of this premise, there is no need to calculate reliabilities and the like (step S1909 of Figure 19).
The flow of Figure 27 is described below.
First, in step S2701, the degree of polarization and the polarization phase at the position on the subject are obtained. Next, in step S2702, it is judged whether the degree of polarization is lower than a fixed value ρ1. If the degree of polarization is lower than ρ1, the flow proceeds to step S2703, where this pixel is treated as a pixel of a surface near the ground facing the camera and its normal is set accordingly. If the degree of polarization is ρ1 or more, the flow proceeds to step S2704, and the normal is calculated from the specular-reflection polarization by the method described above. In step S2705, the obtained surface normal is stored in memory.
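A per-pixel sketch of this cloudy-weather flow is shown below (the threshold ρ1, the return encoding, and the helper reused from the (formula 12) sketch above are illustrative assumptions).

```python
import numpy as np

def dop_from_angle(theta_deg, n=1.4):
    """Degree of polarization of specular reflection (formula 12);
    same helper as in the sketch after (formula 12)."""
    t = np.deg2rad(theta_deg)
    s, tan = np.sin(t), np.tan(t)
    return 2.0 * s * tan * np.sqrt(n**2 - s**2) / (n**2 - s**2 + (s * tan)**2)

def cloudy_normal(rho, phi_deg, rho1=0.05, n=1.4):
    """One-pixel sketch of the Figure 27 flow (steps S2701 to S2705)."""
    if rho < rho1:
        # S2703: low polarization -> near-ground surface facing the camera
        return ("facing-camera", None, None)
    # S2704: specular-polarization estimation under non-polarized sky light:
    # phi gives the in-plane degree of freedom psi_N, and the incident angle
    # theta_N comes from inverting formula 12 below the Brewster angle.
    brewster = np.degrees(np.arctan(n))
    thetas = np.linspace(0.0, brewster, 2000)
    theta_n = thetas[np.argmin(np.abs(dop_from_angle(thetas, n) - rho))]
    return ("specular-polarization", phi_deg, theta_n)

print(cloudy_normal(0.02, 80.0))
print(cloudy_normal(0.35, 80.0))
```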
Next, the processing in the pseudo-3D conversion unit 208, which uses the normal information obtained by the above processing to perform the pseudo-3D conversion, is described.
Figure 28 is a flowchart showing the processing flow of the pseudo-3D conversion unit 208. In step S2801 the normal image of the subject is obtained, and in step S2802 similar normals are grouped together to form faces. This corresponds to identifying the regions G, W, B1, B2, R1, and R2 of Figure 16. In step S2803 the vertices of each face are detected and their two-dimensional coordinates (u, v) on the image are extracted. In step S2804 the camera parameters are obtained. The camera parameters consist of intrinsic and extrinsic parameters. If the lens distortion of the camera is ignored, the intrinsic parameters are the focal length f and the optical-axis center (u0, v0). The extrinsic parameters consist of the translation matrix and the rotation matrix between the camera coordinate system and the world coordinate system. Methods for obtaining the intrinsic and extrinsic camera parameters are standard in computer vision, so the details are omitted.
The intrinsic parameters are stored in the camera in advance. The extrinsic parameters depend on the photographing method; here, for simplicity, the optical axis is assumed to be parallel to the ground surface and to face the scene squarely, that is, the pitch angle and roll angle of the optical axis are 0.
Figure 29 illustrates the relation between the world coordinate system and the camera coordinate system in this photographing state, showing the ground plane 2901, the image plane 2902, the optical-axis center 2903, and a face 2904 of the subject. The camera coordinate system is the coordinate system (xc, yc, zc) with the viewpoint C as origin. The world coordinate system is the coordinate system (Xw, Yw, Zw) with an arbitrary point on the ground as origin. The face 2904 of the subject has a normal N, and the points P1W and P2W on its bottom edge project onto the camera image as P1 and P2.
In step S2805 the ground plane is expressed by the following formula.
[numerical expression 14]
Yw=0 (formula 14)
Here, h denotes the offset along the Y-axis between the camera coordinate system and the world coordinate system on the ground plane, that is, the height of the camera above the ground.
In step S2806, the world coordinates P1W and P2W of the intersection points P1 and P2 of the ground plane with the bottom edge of the subject are calculated. Taking P1 as an example, its image coordinates (u, v) and its camera coordinates (xc, yc, zc) satisfy the following relation.
[numerical expression 15]
$X_c = \dfrac{u - u_0}{f} Z_c, \quad Y_c = \dfrac{v - v_0}{f} Z_c$ (formula 15)
The relation between the camera coordinate system and the world coordinate system is expressed by the following (formula 16).
[numerical expression 16]
$X_c = X_W, \quad Y_c = Y_W - h, \quad Z_c = Z_W$ (formula 16)
From the above, the world coordinates of P1W are given by the intersection of the line 2905 with the ground plane 2901, expressed by the following formula.
[numerical expression 17]
$X_W = \dfrac{-h (u - u_0)}{v - v_0}, \quad Y_W = 0, \quad Z_W = \dfrac{-h f}{v - v_0}$ (formula 17)
In step S2807, the world coordinate positions of the vertices such as P2 shown in Fig. 30(A) are calculated. This calculation proceeds as above: first the intersections P1, P2, and P3 with the ground plane are determined; then a plane is defined using the face normal N; and, as described above, it suffices to compute the intersection of that plane with the ray from the camera viewpoint origin through the corresponding coordinate on the camera image plane.
In step S2808, a new camera viewpoint position Cnew is set, the viewpoint conversion is performed, and the pseudo-3D effect is realized. This processing is realized by establishing new camera extrinsic parameters, that is, a new relational expression between the world coordinate system and the camera coordinate system in place of (formula 16), and projecting each of the vertices P1 to P7 in the world coordinate system onto the camera image plane using the camera projection of (formula 15). After the viewpoint conversion, the image of Fig. 30(A) becomes an image like Fig. 30(B).
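Steps S2806 to S2808 can be sketched as follows, using (formulas 15) to (17) with the sign conventions as written; the camera parameters and the new viewpoint are illustrative, and the image y-axis is assumed to point upward so that ground pixels have v < v0.

```python
import numpy as np

def ground_point(u, v, f, u0, v0, h):
    """World coordinates of the ground-plane point seen at pixel (u, v),
    per (formulas 14-17), for a camera at height h with a level optical axis."""
    Xw = -h * (u - u0) / (v - v0)
    Zw = -h * f / (v - v0)
    return np.array([Xw, 0.0, Zw])

def project(Pw, R, t, f, u0, v0):
    """Step S2808: project a world vertex with new extrinsics (R, t),
    i.e. Pc = R @ Pw + t, followed by the pinhole model of (formula 15)."""
    Pc = R @ Pw + t
    return f * Pc[0] / Pc[2] + u0, f * Pc[1] / Pc[2] + v0

f, u0, v0, h = 800.0, 320.0, 240.0, 1.6          # illustrative parameters
P1w = ground_point(250.0, 100.0, f, u0, v0, h)   # vertex P1 on the ground

# The original pose corresponds to R = I, t = (0, -h, 0) as in (formula 16);
# a new viewpoint Cnew is expressed by a different (R, t), e.g. a small yaw.
a = np.deg2rad(10.0)
R_new = np.array([[np.cos(a), 0.0, np.sin(a)],
                  [0.0, 1.0, 0.0],
                  [-np.sin(a), 0.0, np.cos(a)]])
t_new = np.array([0.5, -h, 0.0])
print(P1w, project(P1w, R_new, t_new, f, u0, v0))
```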
In addition, the all-sky polarization map acquisition unit need not have a mechanism for actually measuring the polarization state of the sky; it may access a database of all-sky polarization maps based on the photographing time and the latitude of the photographing location, obtain the necessary all-sky polarization map from the database, and use it together with camera orientation information obtained from a GPS device. In this case, the all-sky polarization map acquisition unit 202 does not need to be arranged in the polarization information acquisition unit 200.
In the embodiment shown in Figure 31, the all-sky polarization map acquisition unit 202 notifies the database 3100 of the photographing time and the latitude of the photographing location and obtains the corresponding all-sky polarization map from the database 3100. The photographing direction of the camera corresponding to the all-sky polarization map is obtained from GPS.
The image processing apparatus may itself hold the database 3100 in a storage device, or, as shown in Figure 32, it may access an external database (not shown) through the communication device 3110. In addition, the weather determination unit 203 may obtain, through the communication device 3110 of Figure 32, information from outside indicating the current weather (fair or rainy), and use this information to switch the normal estimation method instead of judging the weather from the output of the color polarization image acquisition unit 201.
(embodiment 2)
The second embodiment of the image processing apparatus according to the present invention is described below.
The image processing apparatus of the first embodiment above has a color polarization image capturing unit and can obtain luminance information for different colors; however, image processing according to the present invention does not necessarily require a color image. For example, the image processing of the present invention can also be realized using the luminance image of a monochrome image.
Figure 33 is a configuration diagram of the image processing apparatus of the present embodiment. The configuration of Figure 33 differs from that of the first embodiment in that the image processing apparatus has a polarization information acquisition unit 3300 comprising a polarization image acquisition unit 3301 and an all-sky polarization map acquisition unit 3302, and in that the operation of the fine-weather sky region separation unit 3303 and the cloudy-weather sky region separation unit 3304 included in the normal estimation unit 210 differs from that of embodiment 1.
Figure 34 shows the configuration of the polarization information acquisition unit 3300. The polarization information acquisition unit 3300 has a polarization luminance image sensor 3401 in place of the color polarization image sensor. The polarization luminance image sensor 3401 obtains the luminance image and the polarization image of the scene in real time. For this, the technology disclosed in patent document 3 (a patterned polarizer), for example, can be used.
Figure 35 shows a configuration example of such a polarization luminance image sensor 3401. In the illustrated example, a narrow-band color filter and a patterned polarizer are stacked and placed in front of the image sensor pixels. The incident light passes through the narrow-band color filter and the patterned polarizer and reaches the image sensor, so monochrome image luminance can be observed at each image sensor pixel. In addition, to select the wavelength band in which the patterned polarizer operates, the narrow-band color filter has a transmission band of, for example, 500 to 550 nm.
In the present embodiment, the four pixels whose patterned polarizers have polarization transmission angles of 0, 45, 90, and 135 degrees are treated together as one pixel. By averaging the luminances of these pixels, the same processing applied to the RGB colors in (formula 8) can be carried out, and the luminance of the monochrome image is calculated.
In the image information processing unit 3402 shown in Figure 34, the luminance image Y is generated by this processing. The processing performed by the polarization information processing unit 306 is identical to the processing described with reference to (formula 1) to (formula 6). The patterned polarizer of the present embodiment may also be a photonic crystal, a film-type polarizing element, a wire-grid element, or a polarizing element based on other principles.
Figure 36 is a block diagram of a configuration example of the fine-weather sky region separation unit 3303. The fine-weather sky region separation unit 3303 receives the degree-of-polarization image ρ and the luminance image Y, and outputs the subject degree-of-polarization image ρfo and the subject polarization phase image Φfo in which the sky region has been separated from the scene. The degree-of-polarization binarization unit 1101 binarizes the degree-of-polarization image ρ using the threshold Tρ. The luminance binarization units 1103 and 1104 binarize the luminance image using the thresholds TC1 and TC2. The operation unit 1107 performs an AND (logical product) operation between the polarization image binarized by the degree-of-polarization binarization unit 1101 and the luminance image binarized by the luminance binarization unit 1103.
The subject mask selection unit 1110 decides, according to the result of the degree-of-polarization determination unit 1109, which of the following masks to adopt: (i) the first blue-sky region mask 1111 generated from the degree of polarization and the luminance; or (ii) the second blue-sky region mask 3601 generated from the luminance.
The operation units 1113 and 1114 apply a logical AND between the output subject mask image 1115 and the degree-of-polarization image ρ and the polarization phase image Φ, generating the fine-weather subject degree-of-polarization image ρfo and the fine-weather subject polarization phase image Φfo.
Because this fine-weather sky region separation unit 3303 uses only the monochrome luminance image, no hue information is available, unlike the first embodiment, which uses a color image. As a result, for the eastern sky at dusk and the western sky in the early morning, where both the degree of polarization and the luminance are low, there is less information and detection may be difficult. In most applications, however, outdoor photography during the daytime is by far the most common case, so this is not a serious problem.
Figure 37 is a block diagram of the configuration of the cloudy-weather sky region separation unit 3304. On a cloudy day the brightest region in the scene is usually the sky. Therefore, the luminance image is binarized by the luminance binarization unit 1103 to form a mask. By taking the logical AND of this mask with the degree-of-polarization image and the polarization phase image in the operation units 1113 and 1114, the cloudy-weather subject degree-of-polarization image ρco and the cloudy-weather subject polarization phase image Φco are generated, respectively.
The fine-weather normal estimation unit 206, the cloudy-weather normal estimation unit 207, and the pseudo-3D conversion unit 208 of the present embodiment are identical in configuration and operation to those of the first embodiment, so their description is omitted.
Industrial applicability
The present invention can estimate a 3D shape from an ordinary two-dimensional still image or moving image taken outdoors, that is, from an image that provides no depth information, and can generate a pseudo-stereo image by performing viewpoint conversion. In particular, it uses polarization information obtained from outdoor sky illumination and thereby acquires shape passively, so it is well suited to obtaining the shape of castles or distant subjects and to viewing them stereoscopically. It can be widely used in consumer cameras, movies, ITS, surveillance cameras, the construction field, and map-information applications for outdoor buildings.

Claims (23)

1. An image processing apparatus, characterized by comprising:
a polarization image acquisition unit that obtains a polarization image having polarization information for a plurality of pixels;
a subject normal estimation unit that estimates a surface normal of an outdoor subject from the polarization information of the polarization image; and
an all-sky polarization map acquisition unit that obtains an all-sky polarization map representing a relation between positions in the whole sky and the polarization information at the positions,
wherein the subject normal estimation unit uses the all-sky polarization map to obtain, from the polarization information, the polarization state of specularly reflected light on the subject surface, and thereby estimates the surface normal of the subject.
2. An image processing apparatus, characterized by comprising:
an image acquisition unit that obtains a luminance image having luminance information for a plurality of pixels and a polarization image having polarization information for the plurality of pixels;
a weather determination unit that judges the weather state to be a cloudy state or a clear state; and
a subject normal estimation unit that obtains, from the polarization information, the polarization state of specularly reflected light produced on an outdoor subject surface, and estimates the surface normal of the subject by different methods according to the weather state decided by the weather determination unit.
3. The image processing apparatus according to claim 2, characterized in that
the image acquisition unit obtains the luminance image for a plurality of different colors.
4. The image processing apparatus according to claim 2, characterized in that
the weather determination unit decides the weather state from the degree of polarization of the sky or from the area of the region whose degree of polarization is at or above a reference level.
5. The image processing apparatus according to claim 4, characterized in that
the weather determination unit judges the weather state to be a cloudy state in which the degree of polarization of the sky is below a prescribed reference level, or a clear state in which the degree of polarization is at or above the reference level.
6. The image processing apparatus according to claim 4, characterized in that
the weather determination unit judges a partly clear sky in which part of the sky has clouds to be a clear state.
7. The image processing apparatus according to claim 2, characterized in that
the weather determination unit obtains information indicating the weather state from outside and decides the weather state.
8. The image processing apparatus according to claim 2, characterized by
comprising an all-sky polarization map acquisition unit that obtains an all-sky polarization map representing a relation between positions in the whole sky and the polarization states at the positions,
wherein, when the weather determination unit judges the weather state to be a clear state, the surface normal of the subject is estimated using the all-sky polarization map.
9. The image processing apparatus according to claim 8, characterized by comprising:
a cloudy-weather normal estimation unit that performs normal estimation based on specular-reflection polarization; and
a fine-weather normal estimation unit that performs geometry-based normal estimation and normal estimation based on specular-reflection polarization,
wherein the fine-weather normal estimation unit, when performing the geometry-based normal estimation, uses the relation between the positions in the whole sky and the polarization states at the positions represented by the all-sky polarization map.
10. The image processing apparatus according to claim 8, characterized in that
the all-sky polarization map acquisition unit obtains a polarization image of the whole sky using a wide-angle lens.
11. The image processing apparatus according to claim 8, characterized in that
the all-sky polarization map acquisition unit obtains the data of the all-sky polarization map from outside.
12. image processing apparatus according to claim 2 is characterized in that,
Possess: fine day dummy section separated part, when fine the sky dummy section is separated from described image; With
Cloudy day day dummy section separated part was separated the sky dummy section during cloudy day from described image,
According to the output of described weather detection unit, selectivity is switched the action or the output of fine day dummy section separated part and cloudy day dummy section separated part.
13. The image processing apparatus according to any one of claims 2 to 12, wherein
the image acquisition section comprises:
a simultaneous color-polarization acquisition section in which, within each same-color pixel of a single-chip color imaging element having a color mosaic filter, a plurality of polarization sub-pixels whose polarization transmission planes have mutually different angles are arranged adjacently;
a polarization information processing section that approximates, for each acquired color, the observed brightnesses of the plurality of polarization sub-pixels by a sinusoidal function, averages the obtained approximation parameters across the colors, and thereby obtains integrated polarization information; and
a color information processing section that averages the plurality of acquired observed brightnesses to generate an average color brightness,
and wherein the image acquisition section outputs (i) a color image, and (ii) a degree-of-polarization image and a polarization phase image based on the polarization information.
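The processing in claim 13 amounts to fitting a sinusoid of the form I(θ) ≈ a + b·cos 2θ + c·sin 2θ to the brightnesses observed through polarizers at different angles, averaging the fitted parameters over the colors, and reading off the degree of polarization sqrt(b² + c²)/a and the polarization phase ½·atan2(c, b). A minimal sketch, assuming four polarizer angles per color and hypothetical function and parameter names:

```python
# Illustrative sketch: per-color sinusoid fit of polarization brightness samples,
# parameter averaging across colors, and derivation of degree of polarization and phase.
import numpy as np

def fit_polarization(angles_deg, brightness_per_color):
    """angles_deg: polarizer transmission angles, e.g. [0, 45, 90, 135].
    brightness_per_color: array of shape (n_colors, n_angles) of observed brightness.
    Returns (average_brightness, degree_of_polarization, phase_rad)."""
    theta = np.deg2rad(np.asarray(angles_deg, dtype=float))
    design = np.stack([np.ones_like(theta), np.cos(2 * theta), np.sin(2 * theta)], axis=1)
    # Least-squares fit of I(theta) = a + b*cos(2*theta) + c*sin(2*theta) per color.
    params = np.array([np.linalg.lstsq(design, y, rcond=None)[0]
                       for y in np.asarray(brightness_per_color, dtype=float)])
    a, b, c = params.mean(axis=0)          # average the fitted parameters across colors
    amplitude = np.hypot(b, c)
    dop = amplitude / a if a > 0 else 0.0  # degree of polarization = (Imax - Imin) / (Imax + Imin)
    phase = 0.5 * np.arctan2(c, b)         # polarization phase
    return a, dop, phase
```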
14. The image processing apparatus according to claim 9, wherein
(i) when the incident angle of the light source is smaller than a predetermined value, the geometry-based normal estimation is adopted; and
(ii) when the degree of polarization of the light source is smaller than a predetermined value, the normal estimation based on specular reflection polarization is adopted.
15. The image processing apparatus according to claim 2, wherein,
when the weather state is determined to be cloudy, the normal is estimated from the polarization phase and the degree of polarization of the specular reflected light,
when a plurality of estimated normal vectors exist around the line-of-sight vector, the normal vector oriented upward with respect to the horizontal plane containing the line-of-sight vector is selected, and
when a plurality of estimated normal vectors exist within the plane containing the line-of-sight vector and the incident ray, the normal vector whose incident angle is smaller than the Brewster angle is selected.
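A minimal sketch of the two selection rules in claim 15, assuming a coordinate frame whose z axis points upward and a hypothetical refractive index of 1.5 for the Brewster-angle test (neither is specified by the claim):

```python
# Illustrative sketch: disambiguate candidate surface normals recovered from specular
# polarization under a cloudy sky, following the two selection rules of claim 15.
import numpy as np

def select_normal(candidates, incident_dir, refractive_index=1.5):
    """candidates: list of unit normal vectors (numpy arrays), the ambiguous solutions.
    incident_dir: unit vector of the incident ray (pointing toward the surface).
    Assumed convention: the z axis points upward."""
    brewster = np.arctan(refractive_index)   # Brewster angle for the assumed surface
    up = np.array([0.0, 0.0, 1.0])

    def incident_angle(n):
        # Angle between the surface normal and the direction back toward the source.
        return np.arccos(np.clip(np.dot(-np.asarray(incident_dir), n), -1.0, 1.0))

    # Rule 1: prefer normals that point upward relative to the horizontal plane.
    upward = [n for n in candidates if np.dot(n, up) > 0.0]
    pool = upward if upward else list(candidates)
    # Rule 2: prefer normals whose incident angle is below the Brewster angle.
    below = [n for n in pool if incident_angle(n) < brewster]
    pool = below if below else pool
    # Tie-break by the smallest incident angle.
    return min(pool, key=incident_angle)
```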
16. A pseudo-stereoscopic image generation device comprising:
a plane extraction section that extracts a plane perpendicular to the surface normal of the subject estimated by the image processing apparatus according to any one of claims 1 to 13; and
a pseudo-stereoscopic conversion section that performs viewpoint conversion based on the plane extracted by the plane extraction section, to generate a scene image viewed from another viewpoint.
17. The pseudo-stereoscopic image generation device according to claim 16, wherein
the pseudo-stereoscopic conversion section estimates the world coordinates of the vertices of the plane extracted by the plane extraction section.
18. An image processing method comprising the steps of:
acquiring a polarization image of an outdoor scene;
acquiring an all-sky polarization map; and
determining the weather state,
the method further comprising an estimation step of detecting, from the polarization image, the polarization state of the specular reflected light on the surface of an outdoor subject, and estimating the normal to the surface of the subject by different methods depending on the weather state.
19. The image processing method according to claim 18, wherein,
when the weather state is determined to be fine, the normal estimation is performed using two kinds of normals, namely a geometry-based normal and a normal based on specular reflection polarization.
20. The image processing method according to claim 19, wherein
the reliability of the geometry-based normal is increased when the incident angle of the light source is small,
the reliability of the normal based on specular reflection polarization is increased when the degree of polarization of the light source is small, and
the normal having the higher reliability is finally adopted.
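A minimal sketch of the reliability rule in claim 20; the specific reliability functions below (cosine of the incident angle, and one minus the light-source degree of polarization) are assumptions chosen only to reproduce the stated tendencies, not values given by the patent:

```python
# Illustrative sketch: pick between the two normal estimates of claim 19 using the
# reliability tendencies stated in claim 20 (reliability functions are hypothetical).
import numpy as np

def choose_normal(n_geometry, n_specular, incident_angle_rad, source_dop):
    """n_geometry, n_specular: the two candidate normal vectors.
    incident_angle_rad: incident angle of the light source.
    source_dop: degree of polarization of the light source, in [0, 1]."""
    reliability_geometry = np.cos(incident_angle_rad)  # higher when the incident angle is small
    reliability_specular = 1.0 - source_dop            # higher when the source polarization is low
    return n_geometry if reliability_geometry >= reliability_specular else n_specular
```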
21. The image processing method according to claim 18, wherein,
when the weather state is determined to be cloudy, the normal is estimated from the polarization phase and the degree of polarization of the specular reflected light,
when a plurality of estimated normal vectors exist around the line-of-sight vector, the normal vector oriented upward with respect to the horizontal plane containing the line-of-sight vector is selected, and
when a plurality of estimated normal vectors exist within the plane containing the line-of-sight vector and the incident ray, the normal vector whose incident angle is smaller than the Brewster angle is selected.
22. A pseudo-stereoscopic image generation method comprising the steps of:
acquiring a polarization image of an outdoor scene;
estimating the normal to the surface of an outdoor subject from the polarization information contained in the polarization image;
extracting a plane perpendicular to the estimated surface normal of the subject; and
performing viewpoint conversion to generate a scene image viewed from another viewpoint.
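The viewpoint conversion of claim 22 can be illustrated, under assumptions the claim does not make (a calibrated pinhole camera and a plane written as nᵀX + d = 0 in the source camera frame), by the plane-induced homography H = K(R − t·nᵀ/d)K⁻¹ that warps the extracted plane into a virtual view:

```python
# Illustrative sketch: warp an extracted planar region to a new viewpoint with the
# plane-induced homography (conventions assumed, not taken from the patent).
import numpy as np
import cv2

def warp_plane_to_new_view(image, K, R, t, n, d):
    """image: source image; K: 3x3 camera intrinsics.
    (R, t): rotation and translation taking source-camera coordinates to the
    virtual-camera frame; (n, d): plane parameters, with points X on the plane
    satisfying n.X + d = 0 in the source camera frame."""
    n = np.asarray(n, dtype=float).reshape(3, 1)
    t = np.asarray(t, dtype=float).reshape(3, 1)
    H = K @ (R - (t @ n.T) / d) @ np.linalg.inv(K)   # plane-induced homography
    h, w = image.shape[:2]
    return cv2.warpPerspective(image, H, (w, h))
```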
23. The pseudo-stereoscopic image generation method according to claim 22, further comprising
a step of estimating the world coordinates of the vertices of the extracted plane.
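For claim 23, a minimal sketch (same assumed conventions as the previous sketch) of estimating a vertex's 3D coordinates by intersecting its viewing ray with the extracted plane:

```python
# Illustrative sketch: back-project an image vertex onto the plane n.X + d = 0.
import numpy as np

def backproject_vertex(pixel, K, n, d):
    """pixel: (u, v) image coordinates of a vertex; K: 3x3 camera intrinsics;
    (n, d): plane parameters in the camera frame. Returns the 3D point on the plane."""
    ray = np.linalg.inv(K) @ np.array([pixel[0], pixel[1], 1.0])
    lam = -d / float(np.dot(n, ray))   # ray-plane intersection parameter
    return lam * ray
```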
CN2009801306539A 2008-12-25 2009-12-16 Image processing device and pseudo-3d image creation device Active CN102113021B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2008331028 2008-12-25
JP2008-331028 2008-12-25
PCT/JP2009/006928 WO2010073547A1 (en) 2008-12-25 2009-12-16 Image processing device and pseudo-3d image creation device

Publications (2)

Publication Number Publication Date
CN102113021A true CN102113021A (en) 2011-06-29
CN102113021B CN102113021B (en) 2013-11-27

Family

ID=42287202

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2009801306539A Active CN102113021B (en) 2008-12-25 2009-12-16 Image processing device and pseudo-3d image creation device

Country Status (4)

Country Link
US (1) US8654179B2 (en)
JP (2) JP4563513B2 (en)
CN (1) CN102113021B (en)
WO (1) WO2010073547A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108028022A (en) * 2015-09-30 2018-05-11 索尼公司 Image processing apparatus, image processing method and vehicle control system

Families Citing this family (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IL201110A (en) 2009-09-22 2014-08-31 Vorotec Ltd Apparatus and method for navigation
EP2539853B1 (en) * 2010-02-25 2019-01-09 Lirhot Systems Ltd. Light filter with varying polarization angles and processing algorithm
US8760517B2 (en) 2010-09-27 2014-06-24 Apple Inc. Polarized images for security
JP5681555B2 (en) * 2011-04-27 2015-03-11 パナソニックIpマネジメント株式会社 Glossy appearance inspection device, program
US20120287031A1 (en) 2011-05-12 2012-11-15 Apple Inc. Presence sensing
KR101660215B1 (en) 2011-05-12 2016-09-26 애플 인크. Presence sensing
HUP1100482A2 (en) 2011-09-05 2013-04-29 Eotvos Lorand Tudomanyegyetem Method for cloud base height measuring and device for polarization measuring
US8923567B2 (en) * 2011-12-19 2014-12-30 General Electric Company Apparatus and method for predicting solar irradiance variation
US8750566B2 (en) 2012-02-23 2014-06-10 General Electric Company Apparatus and method for spatially relating views of sky images acquired at spaced apart locations
US9562764B2 (en) 2012-07-23 2017-02-07 Trimble Inc. Use of a sky polarization sensor for absolute orientation determination in position determining systems
JP6417666B2 (en) 2013-05-15 2018-11-07 株式会社リコー Image processing system
DE102013109005A1 (en) * 2013-08-20 2015-02-26 Khs Gmbh Device and method for identifying codes under film
WO2015108591A2 (en) * 2013-10-22 2015-07-23 Polaris Sensor Technologies Inc. Sky polarization and sun sensor system and method
JP6576354B2 (en) * 2014-01-23 2019-09-18 パフォーマンス エスケー8 ホールディング インコーポレイテッド System and method for board body manufacture
JP6393583B2 (en) * 2014-10-30 2018-09-19 株式会社ディスコ Protective film detection apparatus and protective film detection method
US10586114B2 (en) * 2015-01-13 2020-03-10 Vivint, Inc. Enhanced doorbell camera interactions
TWI571649B (en) * 2015-12-03 2017-02-21 財團法人金屬工業研究發展中心 A scanning device and method for establishing an outline image of an object
JP6812206B2 (en) * 2016-01-20 2021-01-13 キヤノン株式会社 Measurement system, information processing device, information processing method and program
JP6662745B2 (en) 2016-10-04 2020-03-11 株式会社ソニー・インタラクティブエンタテインメント Imaging device, information processing system, and polarization image processing method
EP3660449A4 (en) * 2017-07-26 2020-08-12 Sony Corporation Information processing device, information processing method, and program
US20200211275A1 (en) * 2017-08-30 2020-07-02 Sony Corporation Information processing device, information processing method, and recording medium
US10817594B2 (en) 2017-09-28 2020-10-27 Apple Inc. Wearable electronic device having a light field camera usable to perform bioauthentication from a dorsal side of a forearm near a wrist
JP6892603B2 (en) 2017-12-07 2021-06-23 富士通株式会社 Distance measuring device, distance measuring method and distance measuring program
CN108387206B (en) * 2018-01-23 2020-03-17 北京航空航天大学 Carrier three-dimensional attitude acquisition method based on horizon and polarized light
US11676245B2 (en) * 2018-05-24 2023-06-13 Sony Corporation Information processing apparatus and method for processing information
JP7256368B2 (en) * 2019-02-06 2023-04-12 ミツミ電機株式会社 ranging camera
US11004253B2 (en) * 2019-02-21 2021-05-11 Electronic Arts Inc. Systems and methods for texture-space ray tracing of transparent and translucent objects
CN110458960B (en) * 2019-06-26 2023-04-18 西安电子科技大学 Polarization-based three-dimensional reconstruction method for colored object
KR102396001B1 (en) * 2019-11-20 2022-05-11 주식회사 에이앤디시스템 Method for calculating amount of clouds through photographing whole sky
US20230342963A1 (en) * 2019-12-13 2023-10-26 Sony Group Corporation Imaging device, information processing device, imaging method, and information processing method
IL279275A (en) * 2020-12-06 2022-07-01 Elbit Systems C4I And Cyber Ltd Device, systems and methods for scene image acquisition
JPWO2022259457A1 (en) * 2021-06-10 2022-12-15
CN113532419A (en) * 2021-06-23 2021-10-22 合肥工业大学 Sky polarization mode information acquisition method and device, electronic equipment and storage medium
CN113935096A (en) * 2021-10-26 2022-01-14 浙江中诚工程管理科技有限公司 Method and system for monitoring foundation pit deformation in real time
CN114993295B (en) * 2022-08-08 2022-10-25 中国人民解放军国防科技大学 Autonomous navigation method based on polarization orientation error compensation

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11211433A (en) 1998-01-30 1999-08-06 Toppan Printing Co Ltd Measuring method and measuring system of surface form of molding
JP2004117478A (en) 2002-09-24 2004-04-15 Fuji Photo Film Co Ltd Camera
US20050104976A1 (en) * 2003-11-17 2005-05-19 Kevin Currans System and method for applying inference information to digital camera metadata to identify digital picture content
JP4214528B2 (en) 2004-12-27 2009-01-28 日本ビクター株式会社 Pseudo stereoscopic image generation apparatus, pseudo stereoscopic image generation program, and pseudo stereoscopic image display system
JP4974543B2 (en) 2005-08-23 2012-07-11 株式会社フォトニックラティス Polarization imaging device
JP3921496B1 (en) 2006-03-15 2007-05-30 松下電器産業株式会社 Image conversion method and image conversion apparatus
JP4139853B2 (en) * 2006-08-31 2008-08-27 松下電器産業株式会社 Image processing apparatus, image processing method, and image processing program
US7792367B2 (en) * 2007-02-13 2010-09-07 Panasonic Corporation System, method and apparatus for image processing and image format
JP2008218799A (en) * 2007-03-06 2008-09-18 Topcon Corp Surface inspection method and surface inspection device
WO2008131478A1 (en) * 2007-04-26 2008-11-06 Vinertech Pty Ltd Collection methods and devices
JP4235252B2 (en) 2007-05-31 2009-03-11 パナソニック株式会社 Image processing device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108028022A (en) * 2015-09-30 2018-05-11 索尼公司 Image processing apparatus, image processing method and vehicle control system
CN108028022B (en) * 2015-09-30 2021-06-15 索尼公司 Image processing apparatus, image processing method, and vehicle control system

Also Published As

Publication number Publication date
JP5276059B2 (en) 2013-08-28
JP4563513B2 (en) 2010-10-13
WO2010073547A1 (en) 2010-07-01
US8654179B2 (en) 2014-02-18
JP2010279044A (en) 2010-12-09
US20110050854A1 (en) 2011-03-03
JPWO2010073547A1 (en) 2012-06-07
CN102113021B (en) 2013-11-27

Similar Documents

Publication Publication Date Title
CN102113021B (en) Image processing device and pseudo-3d image creation device
CN102177719B (en) Apparatus for detecting direction of image pickup device and moving body comprising same
CA3157194C (en) Systems and methods for augmentation of sensor systems and imaging systems with polarization
CN101542232B (en) Normal information generating device and normal information generating method
CN104330074B (en) Intelligent surveying and mapping platform and realizing method thereof
CN101542233B (en) Image processing device and image processing method
CN101960859B (en) Image processing method, image processing device, image synthesis method, and image synthesis device
CN105006021B (en) A kind of Color Mapping Approach and device being applicable to quickly put cloud three-dimensional reconstruction
Lalonde et al. What do the sun and the sky tell us about the camera?
Ackermann et al. Photometric stereo for outdoor webcams
Ikeuchi et al. Digitally archiving cultural objects
Berger et al. Depth from stereo polarization in specular scenes for urban robotics
CN102017601A (en) Image processing apparatus, image division program and image synthesising method
Fradkin et al. Building detection from multiple aerial images in dense urban areas
Watanabe et al. Detecting changes of buildings from aerial images using shadow and shading model
Borrmann Multi-modal 3D mapping-Combining 3D point clouds with thermal and color information
Frohlich et al. Region based fusion of 3D and 2D visual data for Cultural Heritage objects
CN107730584A (en) A kind of panoramic space constructing system based on the technology of taking photo by plane
Borrmann Multi-modal 3D Mapping
Ikeuchi et al. E-Heritage
Zhao et al. Depth Recovery With Large-Area Data Loss Guided by Polarization Cues for Time-of-Flight Imaging
Rodriguez Calibration and 3D vision with a color-polarimetric camera
Tukora Virtual sculpture project
Bosch Alay et al. Omnidirectional Underwater Camera Design and Calibration

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant