CN118176459A - Optical device and imaging device - Google Patents


Info

Publication number
CN118176459A
Authority
CN
China
Prior art keywords: image, optical, light, imaging, optical system
Legal status
Pending
Application number
CN202280071853.7A
Other languages
Chinese (zh)
Inventor
菅原俊
林佑介
Current Assignee
Kyocera Corp
Original Assignee
Kyocera Corp
Priority date
Filing date
Publication date
Priority claimed from JP2022132755A external-priority patent/JP7331222B2/en
Application filed by Kyocera Corp filed Critical Kyocera Corp
Priority claimed from PCT/JP2022/038280 external-priority patent/WO2023074401A1/en
Publication of CN118176459A publication Critical patent/CN118176459A/en


Landscapes

  • Studio Devices (AREA)

Abstract

The photographing device has a photographing optical system and an optical element. The photographing optical system forms an image of incident first light in a predetermined region. The optical element guides second light to the predetermined region. For the second light, the angle formed between the optical axis of the photographing optical system and the principal ray incident on the photographing optical system differs from that of the first light.

Description

Optical device and imaging device
Cross-reference to related applications
The present application claims priority from Japanese Patent Application No. 2021-175961 filed in Japan on October 27, 2021, and Japanese Patent Application No. 2022-132755 filed in Japan on August 23, 2022, the entire disclosures of which are incorporated herein by reference.
Technical Field
The present disclosure relates to an optical device and an imaging device.
Background
An imaging optical system that forms an image of an observation target has various physical characteristics such as focal length and angle of view. When the focal length is longer, a magnified image of the observation target is formed, so detailed, in other words magnified, optical information about a distant observation target can be obtained. The wider the angle of view, the wider the range over which optical information about the observation target can be obtained. However, the focal length and the angle of view are in a trade-off relationship: the angle of view narrows as the focal length becomes longer, and widens as the focal length becomes shorter.
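As a concrete illustration of this trade-off (a standard geometric relation, not part of the original disclosure): for a rectilinear lens focused at infinity, the angle of view θ over a sensor dimension d is θ = 2·arctan(d/(2f)). With an assumed d = 6 mm, doubling the focal length f from 4 mm to 8 mm narrows θ from roughly 74° to roughly 41°, which is the narrowing described above.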
Therefore, the focal length is adjusted so that the optical information desired in a given situation can be obtained. For example, the focal length is adjusted by displacing a zoom lens included in the photographing optical system. Alternatively, the focal length is adjusted by switching among a plurality of single-focus lenses (see Patent Documents 1 and 2).
Prior art literature
Patent literature
Patent document 1: japanese patent laid-open No. 11-311832
Patent document 2: japanese patent application laid-open No. 2004-279556
Disclosure of Invention
An optical device according to a first aspect includes:
an optical system that forms an image of incident first light in a predetermined region; and
an optical element that guides, to the predetermined region, second light for which the angle formed between the optical axis of the optical system and the principal ray incident on the optical system differs from that of the first light.
An imaging device according to a second aspect includes:
an optical device including: an optical system that forms an image of incident first light in a predetermined region; and an optical element that guides, to the predetermined region, second light for which the angle formed between the optical axis of the optical system and the principal ray incident on the optical system differs from that of the first light; and
an imaging element disposed so that the predetermined region overlaps a light receiving region of the imaging element.
Drawings
Fig. 1 is a block diagram showing a schematic configuration of an imaging device according to a first embodiment.
Fig. 2 is a view of the imaging device when viewed from a direction perpendicular to the optical axis in order to show a modification of the optical element of fig. 1.
Fig. 3 is a view of the imaging device when viewed from a direction perpendicular to the optical axis in order to show another modification of the optical element of fig. 1.
Fig. 4 is a view of the imaging device when viewed from a direction perpendicular to the optical axis in order to show still another modification of the optical element of fig. 1.
Fig. 5 is a view of the imaging device when viewed from a direction perpendicular to the optical axis in order to show still another modification of the optical element of fig. 1.
Fig. 6 is a view of the imaging device when viewed from a direction perpendicular to the optical axis in order to show still another modification of the optical element of fig. 1.
Fig. 7 is a view of the imaging device when the imaging device is viewed from the normal direction of the light receiving region in order to show still another modification of the optical element of fig. 1.
Fig. 8 is a view of the imaging device when the imaging device is viewed from the normal direction of the light receiving region in order to show still another modification of the optical element of fig. 1.
Fig. 9 is a diagram for explaining physical characteristics of the imaging element and the optical system in fig. 1.
Fig. 10 is a conceptual diagram illustrating an image reaching the light receiving area of fig. 1.
Fig. 11 is a view of the imaging device when the imaging device is viewed from the normal direction of the light receiving region in order to show still another modification of the optical element of fig. 1.
Fig. 12 is a view of the imaging device when the imaging device is viewed from the normal direction of the light receiving region in order to show still another modification of the optical element of fig. 1.
Fig. 13 is a view of the imaging device when the imaging device is viewed from the normal direction of the light receiving region in order to show still another modification of the optical element of fig. 1.
Fig. 14 is a view of the imaging device when the imaging device is viewed from the normal direction of the light receiving region in order to show still another modification of the optical element of fig. 1.
Fig. 15 is a view of the imaging device when the imaging device is viewed from the normal direction of the light receiving region in order to show still another modification of the optical element of fig. 1.
Fig. 16 is a view of the imaging device when the imaging device is viewed from the normal direction of the light receiving region in order to show still another modification of the optical element of fig. 1.
Fig. 17 is a conceptual diagram illustrating a state in which superimposed images reaching the light receiving area of fig. 1 are formed.
Fig. 18 is a conceptual diagram for explaining a process of generating a restored image from an overlapped image by the controller of fig. 1.
Fig. 19 is a flowchart for explaining a ranging process performed by the controller of fig. 1.
Fig. 20 is a block diagram showing a schematic configuration of an imaging device according to the second embodiment.
Fig. 21 is a conceptual diagram for explaining an image component reaching the light receiving area of fig. 20.
Fig. 22 is a conceptual diagram for explaining an image component reaching a light receiving region in the modification of fig. 20.
Fig. 23 is a conceptual diagram for explaining an image component reaching a light receiving region in another modification of fig. 20.
Fig. 24 is a block diagram showing a schematic configuration of an imaging device according to the third embodiment.
Fig. 25 is a block diagram for explaining a pixel structure in the imaging element of fig. 24.
Detailed Description
Embodiments of the present disclosure will be described below with reference to the drawings. Among the constituent elements shown in the following drawings, the same constituent elements are denoted by the same reference numerals.
As shown in fig. 1, an imaging device 10 according to a first embodiment of the present disclosure is configured to include an optical device 21 and an imaging element 12. The imaging device 10 may also be configured to include a controller 14. The optical device 21 is configured to include a photographing optical system (optical system) 11 and an optical element 13.
The photographing optical system 11 forms an image of an incident subject light beam. The photographing optical system 11 forms an image of the incident first light in a predetermined region pa. The first light may be light emitted from an object point located within the angle of view of the photographing optical system 11 alone. The predetermined region pa may be, for example, a virtual plane or curved surface in three-dimensional space whose center intersects the optical axis ox of the photographing optical system 11. Hereinafter, the angle of view of the photographing optical system 11 alone, in other words the angle of view of the photographing optical system 11 in a configuration not including the optical element 13, is also referred to as the direct angle of view. The photographing optical system 11 is constituted by optical elements that, by themselves, in other words without the optical element 13, image light beams emitted from object points at different positions at different image points. The optical elements constituting the photographing optical system 11 are, for example, lenses, mirrors, diaphragms, and the like.
The photographing optical system 11 may be a non-image-side-telecentric optical system. In other words, the angle of the principal ray of any light beam passing through the photographing optical system 11 with respect to the optical axis may be greater than 0°. Alternatively, the photographing optical system 11 may be an image-side telecentric optical system.
The optical element 13 guides the second light incident on the photographing optical system 11 to the predetermined region pa. For the second light, the angle formed between the optical axis ox of the photographing optical system 11 and the principal ray incident on the photographing optical system 11 differs from that of the first light. The second light may be light emitted from an object point located outside the angle of view of the photographing optical system 11 alone, in other words outside the direct angle of view. Therefore, the angle formed by the principal ray of the second light and the optical axis ox may be larger than the angle formed by the principal ray of the first light and the optical axis ox. The principal ray may be any of a ray passing through the center of the aperture stop of the photographing optical system 11, a ray passing through the center of the entrance pupil of the photographing optical system 11, and the central ray of the light beam emitted from any object point and incident on the photographing optical system 11. The optical element 13 can image the second light passing through the photographing optical system 11 in the predetermined region pa.
The optical element 13 may be a mirror that reflects the second light and guides it to the prescribed area pa. The reflecting surface of the reflecting mirror may be parallel to the optical axis ox of the photographing optical system 11. Or the reflecting surface of the mirror may be non-parallel to the optical axis ox.
As shown in fig. 2, the reflecting surface of the reflecting mirror may be inclined with respect to the optical axis ox so as to open outward on the photographing optical system 11 side. With this outward-tilted arrangement, the angle of view of the optical device 21 as a whole can be made wider than in a configuration in which the reflecting surface of the reflecting mirror is parallel to the optical axis ox.
In the outward-tilted configuration, the optical device 21 may have a first lens 22 for adjusting the optical path length disposed between the photographing optical system 11 and the optical element 13 serving as the reflecting mirror. By disposing the first lens 22, the shift of the focal position from the predetermined region pa caused by the longer optical path length can be reduced compared with a configuration in which the reflecting surface of the reflecting mirror is parallel to the optical axis ox. In a configuration in which the reflecting mirror, like a plane mirror, has a surface parallel to a direction perpendicular to the optical axis ox, the first lens 22 may be a cylindrical lens. The first lens 22 may be located, with respect to the optical axis ox, outside the principal ray of the first light that passes through the outer edge of the exit pupil of the photographing optical system 11 and reaches the predetermined region pa.
In the outward-tilted configuration, as shown in fig. 3, a prism 23 may be provided in the optical device 21. The second light is reflected by the optical element 13 serving as the reflecting mirror, further reflected by the prism 23, and guided to the predetermined region pa. By providing the prism 23, the tilt angle between the optical axis ox and the reflecting mirror in the outward-tilted configuration can be made larger.
In the outward-tilted configuration, the optical element 13 serving as the reflecting mirror may be, for example, a plane mirror, a curved mirror, a DMD (digital micromirror device), or a Fresnel mirror.
As shown in fig. 4, the reflecting surface of the reflecting mirror may be inclined with respect to the optical axis ox so as to lean inward on the image plane side of the photographing optical system 11. With this inward-tilted arrangement, the optical device 21 as a whole can be made smaller than in a configuration in which the reflecting surface of the reflecting mirror is parallel to the optical axis ox.
In the inward-tilted configuration, the optical element 13 serving as the reflecting mirror may be, for example, a plane mirror, a curved mirror as shown in fig. 5, a DMD as shown in fig. 6, or a Fresnel mirror.
The reflecting surface of the reflecting mirror may be parallel to any one side of a rectangular light receiving area ra of the imaging element 12 described later. Alternatively, as shown in fig. 7, the reflecting surface of the reflecting mirror may intersect with any one side of the light receiving area ra. In a configuration in which the reflecting surface of the reflecting mirror intersects with any one side of the light receiving region ra, separation accuracy by an image separation model described later can be improved. In the structure in which the reflecting surface of the reflecting mirror intersects with either side of the light receiving region ra, the optical element 13 is preferably arranged such that the overlapping region between the region sandwiched between two straight lines extending perpendicularly from both ends of the optical element 13 as the reflecting mirror and the light receiving region ra is maximized when viewed from the normal direction of the light receiving region ra.
The mirror is located outside the exit pupil of the photographing optical system 11 when viewed from the direction of the optical axis ox of the photographing optical system 11. More specifically, the reflecting mirror may be disposed with respect to the photographing optical system 11 so that the reflecting surface is positioned outside the exit pupil. Or the mirror may be located inside the exit pupil as viewed from the direction of the optical axis ox. In particular, in a structure in which the light receiving region ra is smaller than the pupil diameter, the mirror may be located inside the exit pupil.
The mirror may comprise a plurality of planar mirrors. Two plane mirrors belonging to at least one group of the plurality of plane mirrors may be located at positions where the reflection surfaces face each other and are parallel. Alternatively, the plurality of plane mirrors are two plane mirrors, and may be located at positions where the reflection surfaces are perpendicular to each other, as shown in fig. 8. Furthermore, two plane mirrors whose reflection surfaces are perpendicular to each other may be parallel to two sides of the rectangular light receiving area ra, respectively, which are perpendicular to each other. The outer edge of the light receiving region ra of the imaging element 12 and the planar mirror may be closely attached to each other in the normal direction of the planar mirror. Or the outer edges of the planar mirror and the light receiving area ra may be non-closely spaced in the normal direction of the planar mirror.
As shown in fig. 9, the reflecting surfaces of the two mutually parallel plane mirrors are each at an equal distance H from the optical axis ox. The two mutually parallel plane mirrors, the photographing optical system 11, and the imaging element 12 may be designed and arranged so as to satisfy CRA ≤ tan⁻¹(H/B). CRA is the angle, with respect to the optical axis ox, of the principal ray in the photographing optical system 11 of the light beam emitted from an object point pp located at an angle twice the direct angle of view. B is the back focus of the photographing optical system 11.
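As a numerical illustration of the condition above, a simple check can be written as follows; the values of H, B, and CRA used here are arbitrary assumptions for illustration and are not taken from the disclosure.

```python
import math

def satisfies_mirror_condition(cra_deg: float, h: float, back_focus: float) -> bool:
    """Check CRA <= arctan(H / B), the condition stated above for avoiding a
    triple overlap of the reflected components from the two parallel plane mirrors."""
    return math.radians(cra_deg) <= math.atan(h / back_focus)

# Assumed values: mirror spacing H = 3 mm from the optical axis, back focus B = 6 mm,
# so the limit is arctan(3/6), roughly 26.6 degrees.
print(satisfies_mirror_condition(cra_deg=20.0, h=3.0, back_focus=6.0))  # True
print(satisfies_mirror_condition(cra_deg=30.0, h=3.0, back_focus=6.0))  # False
```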
As will be described later, by combining the arrangement of the imaging element 12 in the imaging device 10 and the configuration described above, as shown in fig. 10, the first image component im1 corresponding to the object point in the direct view angle, in other words, the object point emitting the first light, reaches the light receiving region ra of the imaging element 12 without passing through the optical element 13. More specifically, the first image component im1 corresponding to the object point in the direct view angle corresponds to the subject image located in the direct view angle. In addition, the second image component im2 corresponding to the object point outside the direct view angle, in other words, the object point emitting the second light, is inverted via the optical element 13 to reach the light receiving region ra. More specifically, the second image component im2 corresponding to the object point outside the direct view angle corresponds to the subject image located outside the direct view angle.
In the above description, the optical element 13 is a mirror having a surface parallel to the direction perpendicular to the optical axis ox, but may be a mirror having a curved surface when viewed from the optical axis ox. For example, as shown in fig. 11, the optical element 13 may be a set of curved mirrors provided on a set of opposite sides of the rectangular light receiving area ra when viewed from the normal direction of the light receiving area ra. The curved mirror may be parallel to the normal direction of the light receiving area ra. Alternatively, as shown in fig. 12, the optical element 13 may be a mirror having a circular curved surface with a rectangular light receiving area ra inside, when viewed from the normal direction of the light receiving area ra. Alternatively, as shown in fig. 13, the optical element 13 may be a mirror having an elliptical curved surface with a rectangular light receiving region ra built therein when viewed from the normal direction of the light receiving region ra. In a configuration in which the light receiving region ra is rectangular other than square, a mirror having an elliptical curved surface is preferable. Alternatively, as shown in fig. 14, the optical element 13 may be a mirror having a circular curved surface in which the rectangular light receiving area ra is built when viewed from the normal direction of the light receiving area ra. Alternatively, as shown in fig. 15, the optical element 13 may be a mirror having an elliptical curved surface in which the rectangular light receiving region ra is disposed when viewed from the normal direction of the light receiving region ra. In the configuration in which the optical element 13 is a mirror having a curved surface with a rectangular light receiving area ra, a gap between the light receiving area ra and the mirror can be eliminated when viewed from the normal direction of the light receiving area ra. In such a configuration, by excluding the gap, the continuity of optical information in the superimposed image described later can be improved as compared with a configuration having the gap.
The imaging element 12 captures an image formed in the light receiving region ra. The imaging element 12 may be disposed in the imaging device 10 so that the predetermined region pa of the optical device 21 overlaps the light receiving region ra. Therefore, the light receiving region ra of the imaging element 12 can correspond to the direct angle of view. The direct angle of view may be the angle of view equivalent to the range of object points imaged in the light receiving region ra without the optical element 13. At least part of the first light, that is, of the light beam incident on the photographing optical system 11 from within the direct angle of view of the photographing optical system 11, can be imaged in the light receiving region ra. In addition, at least part of the second light, that is, of the light beam incident on the photographing optical system 11 from outside the direct angle of view of the photographing optical system 11 and passing via the optical element 13, can be imaged in the light receiving region ra.
The imaging element 12 can capture an image with visible light or with invisible light such as infrared light and ultraviolet light. The imaging element 12 is, for example, a CCD (Charge-Coupled Device) image sensor, a CMOS (Complementary Metal-Oxide-Semiconductor) image sensor, or the like. The imaging element 12 may be a color image sensor. In other words, the plurality of pixels arranged in the light receiving region ra of the imaging element 12 may be covered with, for example, RGB color filters distributed uniformly over the light receiving region ra. The imaging element 12 generates an image signal corresponding to the image received by capturing it. The imaging element 12 may generate image signals at a predetermined frame rate such as 30 fps.
In the photographing element 12, the outer edge of the light receiving region ra on the side where the optical element 13 is provided may be located outside the outer edge of the exit pupil of the photographing optical system 11. The outer side of the outer edge of the exit pupil is positioned outside with respect to the optical axis ox of the photographing optical system 11. As described above, the light receiving area ra may be rectangular.
In the imaging device 10, a plurality of imaging elements 12 may be provided. In the structure in which a plurality of imaging elements 12 are provided, as shown in fig. 16, an optical element 13 may be provided between two imaging elements 12 adjacent to each other. By providing the optical element 13 between the two imaging elements 12 adjacent to each other, the subject light flux imaged in the gap generated between the light receiving regions ra of the two adjacent imaging elements 12 in the configuration in which the optical element 13 is not provided can be imaged in at least one of the imaging elements 12.
In addition, with the above-described configuration, as shown in fig. 17, the first image component im1 and the second image component im2, which is inverted in the configuration in which the optical element 13 is a mirror, overlap in the light receiving region ra. Therefore, the imaging element 12 captures a superimposed image olim in which the first image component im1 and the inverted second image component im2 overlap.
The controller 14 is configured to include at least one processor, at least one dedicated circuit, or a combination thereof. The processor is a general-purpose processor such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit), or a special-purpose processor dedicated to specific processing. The dedicated circuit may be, for example, an FPGA (Field-Programmable Gate Array), an ASIC (Application-Specific Integrated Circuit), or the like. The controller 14 may perform image processing on the image signal acquired from the imaging element 12.
As shown in fig. 18, the controller 14 may perform image processing that separates the superimposed image olim corresponding to the image signal into the first image component im1 and the second image component im2. The controller 14 separates the superimposed image olim by applying an image processing method such as independent component analysis, a wavelet method, or an image separation model. The image separation model is, for example, a model constructed by creating superimposed images in which a plurality of images are superimposed and learning with the plurality of images as the correct answers for the superimposed images. The image separation model may be a Pix2Pix model that generates a pair of images reflecting their relationship by having a generator that generates images, such as an encoder-decoder model, compete with a discriminator that determines whether a generated image is fake. The controller 14 may generate a restored image rcim by combining the separated first image component im1 and second image component im2.
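A minimal sketch of such an encoder-decoder separation network is shown below; it is an assumption for illustration, not the model actually used in the disclosure, and the layer sizes, the L1 loss, and the random tensors standing in for training data are arbitrary.

```python
import torch
import torch.nn as nn

class SeparationNet(nn.Module):
    """Toy encoder-decoder: one superimposed RGB image in, two RGB image components out."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 6, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        y = self.decoder(self.encoder(x))
        return y[:, :3], y[:, 3:]  # estimated first and second image components

model = SeparationNet()
overlapped = torch.rand(1, 3, 128, 128)  # stand-in for a superimposed image olim
target1, target2 = torch.rand(1, 3, 128, 128), torch.rand(1, 3, 128, 128)
im1_hat, im2_hat = model(overlapped)
loss = nn.functional.l1_loss(im1_hat, target1) + nn.functional.l1_loss(im2_hat, target2)
loss.backward()  # one illustrative training step
```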
The controller 14 can perform ranging of subjects imaged around the imaging device 10 using the restored image rcim. The controller 14 uses the restored image rcim to perform ranging based on, for example, DFD (Depth from Defocus). The controller 14 can also perform ranging using the restored image rcim by a motion parallax method (SLAM: Simultaneous Localization and Mapping, motion stereo), a separation model based on deep learning, a foot position distance measurement method, or the like. The foot position distance measurement method is a method of calculating three-dimensional coordinates on the premise that the lower end of a subject image is located on the ground.
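The foot position distance measurement method mentioned above can be sketched with a pinhole ground-plane model as follows; the level, forward-facing camera geometry and all numerical parameters are assumptions for illustration.

```python
def foot_point_distance(v_foot: float, fy: float, cy: float, cam_height: float) -> float:
    """Distance along a flat ground plane to a subject whose image foot point lies at
    pixel row v_foot, for a level pinhole camera mounted cam_height above the ground.
    The foot point must lie below the principal point row cy."""
    if v_foot <= cy:
        raise ValueError("foot point must lie below the principal point")
    return cam_height * fy / (v_foot - cy)

# Assumed intrinsics: focal length fy = 1200 px, principal point row cy = 540 px,
# camera height 1.2 m; a foot point at row 780 then maps to a distance of 6.0 m.
print(foot_point_distance(v_foot=780.0, fy=1200.0, cy=540.0, cam_height=1.2))
```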
The controller 14 may generate a distance image based on the distances corresponding to the respective addresses of the restored image rcim. The distance image is an image in which the pixel value of each pixel corresponds to the distance. The controller 14 may provide the range image to an external device.
Next, the ranging process performed by the controller 14 in the present embodiment will be described with reference to the flowchart of fig. 19. The ranging process starts each time an image signal is acquired from the photographing element 12.
In step S100, the controller 14 separates the second image component im2 from the superimposed image olim corresponding to the acquired image signal. After the separation, the process advances to step S101.
In step S101, the controller 14 generates a first image component im1 by subtracting the second image component im2 separated in step S100 from the superimposed image olim. After the generation of the first image component im1, the process advances to step S102.
In step S102, the controller 14 generates a restored image rcim by combining the second image component im2 separated in step S100 and the first image component im1 generated in step S101. After the restored image rcim is generated, the process advances to step S103.
In step S103, the controller 14 measures the distance for each subject visualized in the restored image rcim using the restored image rcim generated in step S102. After ranging, the process advances to step S104.
In step S104, the controller 14 generates a distance image based on the distance calculated in step S103 and the position of the restored image rcim corresponding to the distance. In addition, the controller 14 supplies the distance image to an external device. After the distance image is generated, the ranging process ends.
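The flow of steps S100 to S104 can be sketched as follows. The separation and per-pixel ranging steps are passed in as callables because they stand for the image separation model and the DFD or foot position methods described above (hypothetical stand-ins, not APIs from the disclosure); only the subtraction, recombination, and distance-image assembly are shown concretely, and the side-by-side stitching used for the restored image is an assumption.

```python
import numpy as np

def ranging_pipeline(overlapped, separate_second, measure_distance):
    """overlapped: HxWx3 float image in [0, 1].
    separate_second: callable returning the (mirror-inverted) second image component.
    measure_distance: callable mapping the restored image to a per-pixel distance map in metres."""
    im2 = separate_second(overlapped)                         # S100: separate the second component
    im1 = np.clip(overlapped - im2, 0.0, 1.0)                 # S101: first component by subtraction
    restored = np.concatenate([im1, np.flipud(im2)], axis=0)  # S102: undo inversion and stitch
    distances = measure_distance(restored)                    # S103: range each visualized subject
    return (distances * 1000).astype(np.uint16)               # S104: distance image in millimetres

# Toy usage with dummy callables standing in for the separation model and the ranging method.
dummy = np.random.rand(64, 64, 3)
distance_image = ranging_pipeline(dummy, lambda x: 0.3 * x, lambda r: np.full(r.shape[:2], 2.5))
print(distance_image.shape, distance_image.dtype)  # (128, 64) uint16
```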
The optical device 21 of the first embodiment having the above-described configuration includes: the photographing optical system 11 that forms an image of the incident first light in the predetermined region pa; and the optical element 13 that guides, to the predetermined region pa, the second light for which the angle formed between the optical axis ox of the photographing optical system 11 and the principal ray incident on the photographing optical system 11 differs from that of the first light. With this configuration, the optical device 21 can employ a photographing optical system 11 having a relatively long focal length while guiding, to the predetermined region pa, an image including optical information over a range of angle of view wider than that corresponding to the focal length. Therefore, the optical device 21 can generate optical information that is both wide-ranging and magnified.
In the imaging device 10 according to the first embodiment, the optical element 13 is a mirror that reflects the second light and guides the second light to the predetermined area pa, and the reflecting surface of the mirror is parallel to the optical axis ox and either side of the rectangular light receiving area ra of the imaging element 12. With this configuration, in the imaging device 10, the direction of distortion of the image reaching the light receiving area ra via the mirror is one direction or less. Therefore, the imaging device 10 can reduce the load of image processing for removing distortion of the reflected light component due to the mirror included in the captured image, and thus can improve the reproducibility of the reflected light component.
In the imaging device 10 according to the first embodiment, the reflecting mirror includes a plurality of plane mirrors, and the reflecting surfaces of at least one pair of plane mirrors among the plurality of plane mirrors are parallel to each other. With this configuration, the imaging device 10 can acquire optical information over a range widened from the direct angle of view to both sides about the optical axis ox.
In the imaging device 10 according to the first embodiment, the reflecting surfaces of the two mutually parallel plane mirrors are at an equal distance H from the optical axis ox, and the angle CRA of the principal ray of the light beam from the object point pp at an angle twice the direct angle of view satisfies CRA ≤ tan⁻¹(H/B). With this configuration, in the imaging device 10, a triple overlap of image components caused by the reflected component from one plane mirror overlapping the reflected component from the other plane mirror can be prevented, and the accuracy of separating the image components in the subsequent image processing can be improved.
In the imaging device 10 according to the first embodiment, the planar mirror is in close contact with the outer edge of the light receiving region ra of the imaging element 12 in the normal direction of the planar mirror. In the normal direction, if a gap is generated between the planar mirror and the light receiving region ra of the imaging element 12, optical information of the subject imaged at the gap is lost. In such a case, the imaging device 10 having the above-described configuration prevents loss of optical information.
In the imaging device 10 according to the first embodiment, the mirror is positioned outside the exit pupil of the imaging optical system 11 when viewed from the optical axis ox direction. With this configuration, the imaging device 10 can make the light flux passing through the vicinity of the end portion of the exit pupil incident on the mirror. Therefore, the photographing device 10 can reduce a decrease in light quantity caused by vignetting of a part of the light beam passing near the exit pupil.
In the imaging device 10 according to the first embodiment, the angle of the principal ray of any light flux in the imaging optical system 11 with respect to the optical axis ox is larger than 0 °. With this configuration, in the imaging device 10, the imaging optical system 11 is not an image-side telecentric optical system, and therefore, a light flux from an object point having an angle larger than the direct view angle can be made incident on the optical element 13. Therefore, the imaging device 10 can reliably generate optical information in a range that is widened from the direct angle of view.
The imaging device 10 according to the first embodiment includes the controller 14, and the controller 14 separates the image corresponding to the image signal into the first image component im1 corresponding to object points within the direct angle of view and the second image component im2 corresponding to object points outside the direct angle of view. With this configuration, the imaging device 10 can generate an image in which the overlap of the superimposed image olim, in which a plurality of image components overlap, has been eliminated.
Next, an imaging device according to a second embodiment of the present disclosure will be described. In the second embodiment, the structure of an optical element and separation processing performed by a controller are different from those of the first embodiment. The second embodiment will be described below focusing on differences from the first embodiment. In addition, the same reference numerals are given to the parts having the same configuration as in the first embodiment.
As shown in fig. 20, the imaging device 100 of the second embodiment is configured to include an imaging optical system 11, an imaging element 12, and an optical element 130, similarly to the first embodiment. The camera 100 may also be configured to include a controller 14. The imaging optical system 11 and the imaging element 12 in the second embodiment have the same configuration and functions as those in the first embodiment. The structure of the controller 14 in the second embodiment is the same as that in the first embodiment.
In the second embodiment, the optical element 130 images at least a part of the light beam incident on the photographing optical system 11 from outside the direct angle of view of the photographing optical system 11 in the light receiving region ra of the photographing element 12, similarly to the first embodiment. In the second embodiment, the optical element 130 is different from the first embodiment in that an incident light beam is optically processed and emitted.
The optical processing is, for example, changing the wavelength band of the incident light beam. Specifically, the optical element 130 attenuates, in the incident light beam, light in the wavelength band corresponding to some of the plurality of color filters covering the imaging element 12. Therefore, the optical element 130 images light of wavelength bands other than the attenuated band in the light receiving region ra.
The optical element 130 may be a mirror that reflects light of a wavelength band different from that of the attenuated color. The optical element 130 attenuates R light and reflects GB light, for example.
According to the above configuration, as shown in fig. 21, in the second embodiment, the first R image component im1R, the first G image component im1G, and the first B image component im1B corresponding to the object point in the direct view angle and corresponding to the colors of all the color filters reach the light receiving region ra without passing through the optical element 130. The second G image component im2G and the second B image component im2B corresponding to the object point outside the direct view angle and to the color other than the color component after the attenuation reach the light receiving area ra via the optical element 130.
Alternatively, the optical processing is, for example, to impart to the incident light beam a pattern corresponding to a light-dark difference that depends on the incidence position. Specifically, as shown in fig. 22, the optical element 130 has a surface in which first regions 190 and second regions 200 are distributed, for example, in a square-grid pattern. The first region 190 attenuates the brightness of the incident light at a first attenuation rate and emits the attenuated light. The first attenuation rate is greater than 0% and less than 100%. The second region 200 attenuates the brightness of the incident light at a second attenuation rate and emits the attenuated light. The second attenuation rate is 0% or more and less than the first attenuation rate. Therefore, the optical element 130 imparts to the incident light beam a light-dark difference corresponding to the pattern of the first regions 190 and the second regions 200, and forms an image in the light receiving region ra.
Alternatively, the optical processing is, for example, to impart distortion to the image formed by the incident light beam in the light receiving region ra. Specifically, as shown in fig. 23, the optical element 130 is a mirror having a cylindrical curved surface whose axis is a line parallel to the optical axis, and it reflects the incident light beam so as to form an image with distortion in the light receiving region ra. More specifically, the optical element 130 forms an image with distortion that is magnified in the direction connecting the two ends of the arc of the cross section of the mirror taken along a plane perpendicular to the optical axis.
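For the light-dark pattern variant described above, the effect of the optical element on an incident image can be simulated with a simple checkerboard attenuation mask; the block size is an arbitrary assumption, and the two attenuation rates follow the 70%/30% values used in Example 3 below.

```python
import numpy as np

def apply_checker_attenuation(image, block=16, rate_first=0.7, rate_second=0.3):
    """Attenuate brightness with a square checkerboard: first regions transmit
    (1 - rate_first) of the light, second regions transmit (1 - rate_second)."""
    h, w = image.shape[:2]
    yy, xx = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    first_region = ((yy // block) + (xx // block)) % 2 == 0
    gain = np.where(first_region, 1.0 - rate_first, 1.0 - rate_second)
    return image * gain[..., None]

patterned = apply_checker_attenuation(np.random.rand(64, 64, 3))
print(patterned.shape)  # (64, 64, 3)
```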
In the second embodiment, the controller 14 can perform image processing of separating the superimposed image olim corresponding to the image signal into the first image component im1 and the second image component im2 by image processing similarly to the first embodiment. In the second embodiment, the controller 14 separates the superimposed image olim by an image processing method using an image separation model.
The image separation model in the second embodiment will be described below. The image separation model is a model constructed by generating superimposed images of a first image, to which no optical processing has been applied, and a second image, different from the first image, to which optical processing corresponding to the optical element 130 has been applied, and by learning with the first image and the second image as the correct answers for the superimposed images.
In the configuration in which the optical processing changes the wavelength band, the first image is the RGB image components of an arbitrary image. In this configuration, the second image is the GB image components of an image different from that arbitrary image. In the configuration in which the optical processing changes the wavelength band, the R image component of an arbitrary image can also be used for learning in addition to the superimposed image.
In a configuration in which the optical processing is an additional shading pattern, the first image is an arbitrary image. In this configuration, the second image is an image in which the brightness of an image different from the arbitrary image is changed in the light-dark difference pattern of the optical element 130.
In the configuration in which the optical processing is distortion imparting, the first image is an arbitrary image. In this configuration, the second image is an image obtained by reflecting an image different from the arbitrary image with a mirror having the same curved surface as the optical element 130.
In the second embodiment, the controller 14 can generate the restored image rcim by combining the separated first image component im1 and second image component im2, as in the first embodiment. In the second embodiment, the controller 14 can perform ranging of the subject imaged around the imaging device 100 using the restored image rcim, as in the first embodiment. In the second embodiment, the controller 14 may generate a distance image based on the distances corresponding to the respective addresses of the restored image rcim and provide the distance image to the external device, as in the first embodiment.
The optical device 210 of the second embodiment having the above-described configuration also includes: the photographing optical system 11 that forms an image of the incident first light in the predetermined region pa; and the optical element 130 that guides, to the predetermined region pa, the second light for which the angle formed between the optical axis ox of the photographing optical system 11 and the principal ray incident on the photographing optical system 11 differs from that of the first light. Therefore, the imaging device 100 can also generate optical information that is both wide-ranging and magnified.
In the imaging device 100 according to the second embodiment, the optical element 130 is a mirror that reflects the second light and forms an image in the predetermined area pa, and the reflecting surface of the mirror is parallel to the optical axis ox and either side of the rectangular light receiving area ra of the imaging element 12. Therefore, the imaging device 100 can reduce the load of image processing for removing distortion of the reflected light component due to the mirror included in the captured image, and can thus improve the reproducibility of the reflected light component.
In the imaging device 100 according to the second embodiment, the reflecting mirror includes a plurality of plane mirrors, and the reflecting surfaces of at least one pair of plane mirrors among the plurality of plane mirrors are parallel to each other. Therefore, the imaging device 100 can acquire optical information over a range widened from the direct angle of view to both sides about the optical axis ox.
In the imaging device 100 according to the second embodiment, the reflecting surfaces of the two mutually parallel plane mirrors are at an equal distance H from the optical axis ox, and the angle CRA of the principal ray of the light beam from the object point pp at an angle twice the direct angle of view satisfies CRA ≤ tan⁻¹(H/B). Therefore, in the imaging device 100 as well, a triple overlap of image components caused by the reflected component from one plane mirror overlapping the reflected component from the other plane mirror can be prevented, and the accuracy of separating the image components in the subsequent image processing can be improved.
In the imaging device 100 according to the second embodiment, the planar mirror is in close contact with the outer edge of the light receiving region ra of the imaging element 12 in the normal direction of the planar mirror. Therefore, the imaging device 100 can also prevent loss of optical information.
In the imaging device 100 according to the second embodiment, the mirror is positioned outside the exit pupil of the imaging optical system 11 when viewed from the optical axis ox direction. Therefore, the photographing device 100 can also reduce a decrease in the amount of light caused by vignetting of a part of the light flux passing near the exit pupil.
In the imaging device 100 according to the second embodiment, the angle of the principal ray of any light flux in the imaging optical system 11 with respect to the optical axis ox is larger than 0 °. Therefore, the imaging device 100 can also reliably generate optical information in a range that is widened from the direct angle of view.
The imaging device 100 according to the second embodiment includes the controller 14, and the controller 14 separates the image corresponding to the image signal into the first image component im1 corresponding to object points within the direct angle of view and the second image component im2 corresponding to object points outside the direct angle of view. Therefore, the imaging device 100 can also generate an image in which the overlap of the superimposed image olim, in which a plurality of image components overlap, has been eliminated.
In the imaging device 100 according to the second embodiment, the optical element 130 performs optical processing on the light beam incident on the optical element 130 and emits the light beam. With this configuration, the imaging device 100 imparts optical characteristics corresponding to the optical processing to the separated image components. Therefore, the imaging device 100 can construct an image separation model that performs learning to improve separation accuracy. Therefore, in the imaging device 100, the restoration accuracy of the restored image can be improved.
Next, an imaging device according to a third embodiment of the present disclosure will be described. In the third embodiment, the configuration of the imaging element and the separation process performed by the controller are different from those of the first embodiment. The third embodiment will be described below focusing on differences from the first embodiment. In addition, the same reference numerals are given to the parts having the same configuration as in the first embodiment.
As shown in fig. 24, the imaging device 101 of the third embodiment is configured to include an imaging optical system 11, an imaging element 121, and an optical element 13, similarly to the first embodiment. The camera 101 may also be configured to include a controller 14. The configuration and functions of the photographing optical system 11 and the optical element 13 in the third embodiment are the same as those in the first embodiment. The structure of the controller 14 in the third embodiment is the same as that in the first embodiment.
In the third embodiment, the imaging element 121, as in the first embodiment, captures an image formed in the light receiving region ra via the photographing optical system 11. The imaging element 121 can capture an image with visible light or with invisible light such as infrared light and ultraviolet light, as in the first embodiment. The imaging element 121 may be a color image sensor. The imaging element 121 generates an image signal corresponding to the image received by capturing it, as in the first embodiment. The imaging element 121 can generate image signals at a predetermined frame rate such as 30 fps, as in the first embodiment. In the imaging element 121, as in the first embodiment, the outer edge of the light receiving region ra on the side where the optical element 13 is provided may be located outside the outer edge of the exit pupil of the photographing optical system 11. The light receiving region ra may be rectangular, as in the first embodiment.
In the third embodiment, unlike the first embodiment, the imaging element 121 may be a dual-pixel image sensor. As shown in fig. 25, the imaging element 121 as a dual-pixel image sensor is an image sensor having a structure in which a first PD (photodiode) 171 and a second PD 181 are provided in each pixel 161 covered by a microlens 151, and, depending on the incident direction of a light beam, the light beam can be made incident on only one of the PDs. For example, in each pixel 161, only a light beam from a direction inclined toward the optical axis ox side can be incident on the first PD 171, and only a light beam from a direction inclined toward the optical element 13 side can be incident on the second PD 181.
With the above-described configuration, the first image component im1 corresponding to object points within the direct angle of view reaches the first PD 171 in the light receiving region ra without passing through the optical element 13. In addition, the second image component im2 corresponding to object points outside the direct angle of view is inverted via the optical element 13 and reaches the second PD 181 in the light receiving region ra.
In the third embodiment, the controller 14 can perform image processing that separates the superimposed image olim corresponding to the image signal into the first image component im1 and the second image component im2, as in the first embodiment. In the third embodiment, the controller 14 may generate the first image component im1 based only on the signals generated by the first PDs 171. A signal generated based only on the first PD 171 means a signal generated without using the second PD 181, and may include, for example, a signal that shares a synchronization signal or the like with the output of the second PD 181 but is otherwise unrelated to it. In addition, in the third embodiment, the controller 14 may generate the inverted second image component im2 based only on the signals generated by the second PDs 181. The meaning of a signal generated based only on the second PD 181 is analogous to that of a signal generated based only on the first PD 171.
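A minimal sketch of how the two sub-images might be taken out of a dual-pixel raw frame is shown below; the interleaved first-PD/second-PD layout and the flip used to undo the mirror inversion are assumptions for illustration, not a specific sensor's readout format.

```python
import numpy as np

def split_dual_pixel(raw):
    """raw: H x 2W array in which the first-PD and second-PD samples of each pixel
    are interleaved along the last axis (an assumed layout)."""
    first = raw[:, 0::2]             # signals of the first PDs only -> first image component
    second = raw[:, 1::2]            # signals of the second PDs only -> inverted second component
    return first, np.flipud(second)  # undo the assumed mirror inversion of the second component

raw_frame = np.random.rand(480, 2 * 640)
im1, im2 = split_dual_pixel(raw_frame)
print(im1.shape, im2.shape)  # (480, 640) (480, 640)
```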
In the third embodiment, the controller 14 can generate the restored image rcim by combining the separated first image component im1 and second image component im2, as in the first embodiment. In the third embodiment, the controller 14 can perform ranging of the subject imaged around the imaging device 101 using the restored image rcim, as in the first embodiment. In the third embodiment, the controller 14 generates a distance image based on the distances corresponding to the respective addresses of the restored image rcim, and supplies the distance image to the external device, as in the first embodiment.
In the imaging device 101 of the third embodiment having the above-described configuration, the optical element 13 is also a mirror that reflects the second light and guides the second light into the predetermined area pa, and the reflecting surface of the mirror is parallel to the optical axis ox and either side of the rectangular light receiving area ra of the imaging element 121. Therefore, the imaging device 101 can also reduce the load of image processing for removing distortion of the reflected light component due to the mirror included in the captured image, and thus can improve the reproducibility of the reflected light component.
In the imaging device 101 according to the third embodiment, the reflecting mirror includes a plurality of plane mirrors, and the reflecting surfaces of at least one pair of plane mirrors among the plurality of plane mirrors are parallel to each other. Therefore, the imaging device 101 can acquire optical information over a range widened from the direct angle of view to both sides about the optical axis ox.
In the imaging device 101 according to the third embodiment, the reflecting surfaces of the two mutually parallel plane mirrors are at an equal distance H from the optical axis ox, and the angle CRA of the principal ray of the light beam from the object point pp at an angle twice the direct angle of view satisfies CRA ≤ tan⁻¹(H/B). Therefore, in the imaging device 101 as well, a triple overlap of image components caused by the reflected component from one plane mirror overlapping the reflected component from the other plane mirror can be prevented, and the accuracy of separating the image components in the subsequent image processing can be improved.
In the imaging device 101 according to the third embodiment, the planar mirror is in close contact with the outer edge of the light receiving region ra of the imaging element 121 in the normal direction of the planar mirror. Thus, the photographing device 101 prevents loss of optical information.
In the imaging device 101 according to the third embodiment, the mirror is positioned outside the exit pupil of the imaging optical system 11 when viewed from the direction of the optical axis ox. Therefore, the photographing device 101 can also reduce a decrease in light quantity caused by vignetting of a part of the light flux passing near the exit pupil.
In the imaging device 101 according to the third embodiment, the angle of the principal ray of any light flux in the imaging optical system 11 with respect to the optical axis ox is larger than 0 °. Therefore, the imaging device 101 can also reliably generate optical information in a range that is widened from the direct angle of view.
The imaging device 101 according to the third embodiment includes the controller 14, and the controller 14 separates the image corresponding to the image signal into the first image component im1 corresponding to object points within the direct angle of view and the second image component im2 corresponding to object points outside the direct angle of view. Therefore, the imaging device 101 can also generate an image in which the overlap of the superimposed image olim, in which a plurality of image components overlap, has been eliminated.
Examples
Images of a driving space were prepared as a dataset of images for learning. From each image of the dataset, partial images were cut out, and sets of superimposed images in which one partial image was superimposed on another were created. The image separation model of Example 1 was constructed by training an image separation model that separates superimposed images, using 15000 of the created sets as training data. Using the remaining sets as data for confirmation, the superimposed images were separated by the image separation model of Example 1. The separated images were compared with the partial images used in the superimposed images, and PSNR (peak signal-to-noise ratio) and SSIM (structural similarity) were calculated. For the image separation model of Example 1, PSNR and SSIM were 20.54 and 0.55, respectively.
Using the images of the same dataset as in Example 1, partial images were cut out, and sets of superimposed images in which the GB image components of one partial image were superimposed on the RGB image components of another partial image were created. The image separation model of Example 2 was constructed by training an image separation model that separates superimposed images, using X1 of the created sets as training data. Using the remaining sets as data for confirmation, the superimposed images were separated by the image separation model of Example 2. The separated images were compared with the partial images used in the superimposed images, and PSNR and SSIM were calculated. For the image separation model of Example 2, PSNR and SSIM were 27.40 and 0.92, respectively.
Using the images of the same dataset as in Example 1, partial images were cut out, and sets of superimposed images were created in which the brightness of one partial image was attenuated with a light-dark pattern of a square checkerboard corresponding to an angle of view of 25° and then superimposed on another partial image. The attenuation rate of one region of the light-dark pattern was 70%, and the attenuation rate of the other region was 30%. The image separation model of Example 3 was constructed by training an image separation model that separates superimposed images, using X1 of the created sets as training data. Using the remaining sets as data for confirmation, the superimposed images were separated by the image separation model of Example 3. The separated images were compared with the partial images used in the superimposed images, and PSNR and SSIM were calculated. For the image separation model of Example 3, PSNR and SSIM were 24.48 and 0.88, respectively.
Using the images of the same dataset as in Example 1, partial images were cut out, and sets of superimposed images were created in which one partial image was distorted so as to reproduce distortion aberration and then superimposed on another partial image. The image separation model of Example 4 was constructed by training an image separation model that separates superimposed images, using 15000 of the created sets as training data. Using the remaining sets as data for confirmation, the superimposed images were separated by the image separation model of Example 4. The separated images were compared with the partial images used in the superimposed images, and PSNR and SSIM were calculated. For the image separation model of Example 4, PSNR and SSIM were 25.26 and 0.90, respectively.
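The data creation and evaluation used in the examples can be sketched as follows; the averaging used for the superposition, the image sizes, the noise standing in for an imperfect separation result, and the use of scikit-image metrics are assumptions made for illustration.

```python
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

rng = np.random.default_rng(0)

# Stand-ins for two cut-out partial images, with values in [0, 1].
img_a = rng.random((128, 128, 3))
img_b = rng.random((128, 128, 3))

# Create a superimposed image by simple averaging (an assumed superposition rule);
# this would be the training input for the image separation model.
overlapped = 0.5 * img_a + 0.5 * img_b

# Stand-in for a separation result: an imperfect estimate of the partial image img_a.
estimate_a = np.clip(img_a + 0.05 * rng.standard_normal((128, 128, 3)), 0.0, 1.0)

# Compare the separated estimate with the partial image used in the superposition.
psnr = peak_signal_noise_ratio(img_a, estimate_a, data_range=1.0)
ssim = structural_similarity(img_a, estimate_a, channel_axis=-1, data_range=1.0)
print(f"PSNR={psnr:.2f} dB, SSIM={ssim:.3f}")
```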
In one embodiment, (1) an optical device, wherein,
The device comprises:
an optical system for imaging the incident first light in a predetermined area; and
an optical element that guides, to the predetermined area, second light for which the angle formed between the optical axis of the optical system and the principal ray incident on the optical system differs from that of the first light.
(2) The optical device according to the above (1), wherein,
The angle formed by the chief ray of the second light and the optical axis is larger than the angle formed by the chief ray of the first light and the optical axis.
(3) The optical device according to the above (1) or (2), wherein,
The optical element is a mirror that reflects the second light and guides the second light to the prescribed region.
(4) A photographing apparatus, wherein,
The device comprises:
the optical device of (1) or (2) above; and
an imaging element disposed so that the predetermined region overlaps a light receiving region of the imaging element.
(5) The camera according to the above (4), wherein,
The optical element is a mirror that reflects the second light and guides the second light to the prescribed region.
(6) The camera according to the above (5), wherein,
The reflecting surface of the reflecting mirror is parallel to the optical axis of the optical system.
(7) The camera according to the above (5) or (6), wherein,
The reflecting surface of the reflecting mirror is parallel to any one side of the rectangular light receiving area of the imaging element.
(8) The camera according to any one of the above (5) to (7), wherein,
The mirror comprises a plurality of planar mirrors,
The reflecting surfaces of at least one group of two plane reflecting mirrors in the plurality of plane reflecting mirrors are parallel to each other.
(9) The camera according to the above (8), wherein,
The distances H between the optical axis of the imaging element and the reflecting surfaces of the two mutually parallel plane mirrors are equal to each other and, with the back focus of the optical system denoted by B, the angle CRA, with respect to the optical axis, of the chief ray of a light beam from an object point at 2 times the direct field angle of the imaging optical system corresponding to the light receiving area of the imaging element satisfies CRA ≤ tan⁻¹(H/B) (a numerical sketch of this condition is given after this list of embodiments).
(10) The camera according to the above (8) or (9), wherein,
The outer edges of the planar mirror and the light receiving region of the imaging element are in close contact with each other in the normal direction of the planar mirror.
(11) The camera according to any one of the above (5) to (10), wherein,
The mirror is located outside an exit pupil of the optical system when viewed from an optical axis direction of the optical system.
(12) The camera according to any one of the above (5) to (11), wherein,
The angle of the chief ray of any light beam in the optical system is greater than 0 ° with respect to the optical axis.
(13) The camera according to any one of the above (4) to (12), wherein,
The device also comprises:
and a controller that separates an image corresponding to an image signal generated by imaging with the imaging element into a first image component corresponding to object points within a direct field angle of the optical system and a second image component corresponding to object points outside the direct field angle, the direct field angle corresponding to the light receiving region of the imaging element.
(14) The camera according to any one of the above (4) to (13), wherein,
The optical element performs optical processing on a light beam incident on the optical element and emits the light beam.
(15) The camera according to the above (14), wherein,
The optical processing is to change the wavelength band of the incident light beam.
(16) The camera according to the above (14), wherein,
The optical processing is to impart, to the incident light beam, a pattern of light-dark differences corresponding to the incidence position.
(17) The camera according to the above (14), wherein,
The optical processing is to impart distortion to an image formed by an incident light beam in the light receiving region.
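As a reference for embodiment (9) above, the following is a minimal numerical sketch of the condition CRA ≤ tan⁻¹(H/B); the values of H, B, and CRA are illustrative assumptions and are not taken from the disclosure.

# Numerical check of the condition CRA <= arctan(H / B) from embodiment (9)
# (the values below are illustrative assumptions only).
import math

H_mm = 2.0       # distance from the optical axis to each of the parallel mirror surfaces
B_mm = 4.0       # back focus of the imaging optical system
cra_deg = 25.0   # chief-ray angle of a beam from an object point at 2 times the direct field angle

limit_deg = math.degrees(math.atan(H_mm / B_mm))  # about 26.6 degrees for these values
print(f"CRA limit = {limit_deg:.1f} deg, condition satisfied: {cra_deg <= limit_deg}")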
Although embodiments of the imaging devices 10, 100, and 101 and of the imaging method using them have been described above, an embodiment of the present disclosure may also take the form of a method or a program for implementing the device, or of a storage medium (e.g., an optical disc, a magneto-optical disc, a CD-ROM, a CD-R, a CD-RW, a magnetic tape, a hard disk, or a memory card) on which the program is recorded.
The form in which the program is implemented is not limited to an application program, such as object code compiled by a compiler or program code executed by an interpreter, and may be, for example, a program module incorporated into an operating system. The program need not be configured so that all of its processing is performed only by the CPU on the control board. The program may be configured so that part or all of it is executed, as necessary, by another processing unit mounted on an expansion board or expansion unit attached to the board.
The drawings that illustrate embodiments of the present disclosure are schematic. The dimensional ratios and the like on the drawings do not necessarily coincide with the actual dimensional ratios.
While embodiments of the present disclosure have been described based on the drawings and examples, it should be noted that various modifications or changes can be made by those skilled in the art in light of the present disclosure. Accordingly, it should be noted that such modifications or variations are included within the scope of the present disclosure. For example, functions and the like included in each structural part and the like can be rearranged so as not to be logically contradictory, and a plurality of structural parts and the like can be combined or divided.
All of the structural elements described in the present disclosure and/or all of the steps of all of the disclosed methods or processes can be combined in any combination, except combinations in which these features are mutually exclusive. Further, unless explicitly stated otherwise, each feature described in the present disclosure may be replaced by an alternative feature serving the same, an equivalent, or a similar purpose. Thus, unless expressly stated otherwise, each feature disclosed is only one example of a generic series of identical or equivalent features.
Further, the embodiment according to the present disclosure is not limited to any specific structure of the above-described embodiment. Embodiments according to the present disclosure may be extended to all new features described in the present disclosure or combinations thereof, or all new methods, process steps, or combinations thereof.
In the present disclosure, recitations such as "first" and "second" are identifiers for distinguishing structures. For structures distinguished by the recitations "first" and "second" in the present disclosure, the numbers in those identifiers can be exchanged. For example, the first image component can exchange the identifiers "first" and "second" with the second image component. The identifiers are exchanged simultaneously. The structures remain distinguished after the exchange of identifiers. The identifiers may also be deleted. Structures whose identifiers have been deleted are distinguished by reference numerals. The recitation of identifiers such as "first" and "second" in the present disclosure alone shall not be used as a basis for interpreting the order of the structures or for assuming the existence of an identifier with a smaller number.
Description of the reference numerals
10, 100, 101 Imaging device
11 Imaging optical system
12, 121 Imaging element
13, 130 Optical element
14 Controller
151 Microlens
161 Pixels
171 First PD (Photo Diode)
181 Second PD
190 First region
200 Second region
21, 210, 211 Optical device
22 First lens
23 Prism
CRA Angle, with respect to the optical axis, of the chief ray in the imaging optical system of a light beam emitted from an object point at 2 times the direct field angle
Im1 first image component
Im1B first B picture component
Im1G first G image component
Im1R first R image component
Im2 second image component
Im2B second B picture component
Im2G second G image component
Olim superimposed images
Ox optical axis
Pa Predetermined region
Rcim Restored image
PP Object point at 2 times the direct field angle
Ra light receiving area

Claims (17)

1. An optical device, wherein,
The device comprises:
an optical system for imaging the incident first light in a predetermined area; and
and an optical element that guides the second light to the predetermined region, wherein an angle formed between the optical axis of the optical system and a chief ray of the second light incident on the optical system is different from that of the first light.
2. The optical device according to claim 1, wherein,
The angle formed by the chief ray of the second light and the optical axis is larger than the angle formed by the chief ray of the first light and the optical axis.
3. The optical device according to claim 1 or 2, wherein,
The optical element is a mirror that reflects the second light and guides the second light to the prescribed region.
4. A photographing apparatus, wherein,
The device comprises:
The optical device of claim 1 or 2; and
an imaging element disposed so that the predetermined region overlaps a light receiving region of the imaging element.
5. The photographing device as claimed in claim 4, wherein,
The optical element is a mirror that reflects the second light and guides the second light to the prescribed region.
6. The photographing device of claim 5, wherein,
The reflecting surface of the reflecting mirror is parallel to the optical axis of the optical system.
7. The photographing device as claimed in claim 5 or 6, wherein,
The reflecting surface of the reflecting mirror is parallel to any one side of the rectangular light receiving area of the imaging element.
8. The photographing device according to any of claims 5 to 7, wherein,
The mirror comprises a plurality of planar mirrors,
The reflecting surfaces of at least one group of two plane reflecting mirrors in the plurality of plane reflecting mirrors are parallel to each other.
9. The photographing device as claimed in claim 8, wherein,
The distances H between the optical axis of the imaging element and the reflecting surfaces of the two mutually parallel plane mirrors are equal to each other and, with the back focus of the optical system denoted by B, the angle CRA, with respect to the optical axis, of the chief ray of a light beam from an object point at 2 times the direct field angle of the imaging optical system corresponding to the light receiving area of the imaging element satisfies CRA ≤ tan⁻¹(H/B).
10. The photographing device of claim 8 or 9, wherein,
The outer edges of the planar mirror and the light receiving region of the imaging element are in close contact with each other in the normal direction of the planar mirror.
11. The photographing device according to any of claims 5 to 10, wherein,
The mirror is located outside an exit pupil of the optical system when viewed from an optical axis direction of the optical system.
12. The photographing device of any of claims 5 to 11, wherein,
The angle of the chief ray of any light beam in the optical system is greater than 0 ° with respect to the optical axis.
13. The photographing device according to any of claims 4 to 12, wherein,
The device also comprises:
and a controller that separates an image corresponding to an image signal generated by imaging with the imaging element into a first image component corresponding to object points within a direct field angle of the optical system and a second image component corresponding to object points outside the direct field angle, the direct field angle corresponding to the light receiving region of the imaging element.
14. The photographing device according to any of claims 4 to 13, wherein,
The optical element performs optical processing on a light beam incident on the optical element and emits the light beam.
15. The photographing device of claim 14, wherein,
The optical processing is to change the wavelength band of the incident light beam.
16. The photographing device of claim 14, wherein,
The optical processing is to impart, to the incident light beam, a pattern of light-dark differences corresponding to the incidence position.
17. The photographing device of claim 14, wherein,
The optical processing is to impart distortion to an image of an incident light beam imaged in the light receiving region.
CN202280071853.7A 2021-10-27 2022-10-13 Optical device and imaging device Pending CN118176459A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2021-175961 2021-10-27
JP2022132755A JP7331222B2 (en) 2021-10-27 2022-08-23 Optical device and imaging device
JP2022-132755 2022-08-23
PCT/JP2022/038280 WO2023074401A1 (en) 2021-10-27 2022-10-13 Optical device and imaging device

Publications (1)

Publication Number Publication Date
CN118176459A true CN118176459A (en) 2024-06-11

Family

ID=91347129

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280071853.7A Pending CN118176459A (en) 2021-10-27 2022-10-13 Optical device and imaging device

Country Status (1)

Country Link
CN (1) CN118176459A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination