JP2009204991A - Compound-eye imaging apparatus - Google Patents

Compound-eye imaging apparatus

Info

Publication number
JP2009204991A
Authority
JP
Japan
Prior art keywords
subject
image
optical element
compound
imaging apparatus
Prior art date
Legal status
Withdrawn
Application number
JP2008048483A
Other languages
Japanese (ja)
Inventor
Jun Tanida
Hiroyuki Tanabe
Takashi Toyoda
Yoshizumi Nakao
Yasuo Masaki
Current Assignee
Funai Electric Co Ltd
Osaka University NUC
Original Assignee
Funai Electric Co Ltd
Osaka University NUC
Priority date
Filing date
Publication date
Application filed by Funai Electric Co Ltd, Osaka University NUC filed Critical Funai Electric Co Ltd
Priority to JP2008048483A
Publication of JP2009204991A


Landscapes

  • Cameras In General (AREA)
  • Dental Tools And Instruments Or Auxiliary Dental Instruments (AREA)
  • Endoscopes (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

PROBLEM TO BE SOLVED: To provide a compound-eye imaging apparatus that can readily be advanced into a narrow space and can easily acquire the accurate three-dimensional shape of a subject, even a subject with uniform, patternless surfaces.

SOLUTION: The compound-eye imaging apparatus 1 comprises an imaging apparatus body 2, made compact enough to advance into the human oral cavity, and an image processing device 3 for processing the captured images. The imaging apparatus body 2 comprises an optical lens array 6 formed by arranging a plurality of optical lenses L11, L12, ..., L33 on the same plane; a solid-state imaging element 7 that captures single-eye images k11, k12, ..., k33; a diffractive optical element 8 attached to the optical lens array 6; and a laser light source device 10 that emits a laser beam 9 onto the diffractive optical element 8. The diffractive optical element 8 reflects the incident laser beam 9 by means of minute irregularities formed on its surface so as to project many bright spots (a pattern image) in a grid onto the surface of the subject. The image processing device 3 calculates the distance between the imaging apparatus body 2 and each imaged bright spot so as to estimate the three-dimensional shape of the subject.

COPYRIGHT: (C) 2009, JPO&INPIT

Description

The present invention relates to a compound-eye imaging apparatus.

Conventionally, compound-eye imaging apparatuses are known that comprise an optical lens array in which a plurality of minute optical lenses are arranged vertically and horizontally, and a solid-state imaging element that captures the single-eye images of a subject formed by the individual optical lenses of the array. Also known is an imaging apparatus that can measure the height of a minute object by combining such a compound-eye imaging apparatus with a multi-slit that can be moved in minute steps (see, for example, Patent Document 1).

Meanwhile, as an apparatus for acquiring three-dimensional information of an object, a three-dimensional image acquisition apparatus is known that comprises a stripe-pattern projection device, an imaging device that captures the object onto which the stripe-pattern projection device projects different kinds of stripe patterns, and a distance calculation device that calculates the distance between the imaging device and the object based on the captured images (see, for example, Patent Document 2).
Patent Document 1: JP 2000-180139 A; Patent Document 2: JP 2006-64454 A

In dental practice, the inside of the oral cavity is sometimes imaged in order to judge the degree of gum swelling objectively. An imaging apparatus (camera) for this purpose should be able to enter a person's narrow oral cavity easily and, in addition, obtain three-dimensional information of the complicated shapes inside it. In this respect, a compound-eye imaging apparatus seemed immediately applicable: its main body can easily be made small (for example, about 10 mm square), and it can generate a distance image from a plurality of single-eye images to obtain three-dimensional information of the subject.

However, when the inventors actually imaged the gums inside the oral cavity with a compound-eye imaging apparatus, it turned out that effective three-dimensional information could not be obtained, for the following reasons. To generate a distance image, the same point on the subject must be identified in each single-eye image by matching identical pattern regions across the images; but the surface of the gums is a uniform pink with no pattern by which a position could be identified, so an accurate distance image could not be generated. Moreover, it is difficult to illuminate the narrow oral cavity with uniform illuminance using a simple lighting device, so clear single-eye images could not be obtained.

It is conceivable to project various stripe patterns onto the gum surface by applying the three-dimensional image acquisition apparatus described in Patent Document 2, but that would require a separate stripe-pattern projection device, among other things, making the whole apparatus too large to be configured for imaging in a narrow space such as a person's oral cavity.

The present invention therefore solves the above problems, and its object is to provide a compound-eye imaging apparatus that can easily be advanced into and operated in a narrow space such as a person's oral cavity, and that can easily acquire an accurate three-dimensional shape even of a subject having a uniform, patternless surface.

To achieve the above object, the invention of claim 1 is a compound-eye imaging apparatus having an optical lens array in which a plurality of optical lenses for condensing light from a subject are arranged on the same plane, and a solid-state imaging element that captures the single-eye images formed by the respective optical lenses, characterized in that a diffractive optical element is provided on the optical lens array, which optically converts coherent light incident from outside and emits it toward the subject.

The invention of claim 2 is the compound-eye imaging apparatus of claim 1, characterized in that the diffractive optical element reflects coherent light incident from outside, and the reflected light projects a predetermined pattern image onto the subject.

The invention of claim 3 is the compound-eye imaging apparatus of claim 1, characterized in that the diffractive optical element transmits coherent light incident from outside, and the transmitted light projects a predetermined pattern image onto the subject.

The invention of claim 4 is the compound-eye imaging apparatus of claim 2 or claim 3, characterized by further comprising an image processing apparatus that calculates the three-dimensional shape of the subject from the single-eye images, captured by the solid-state imaging element, of the subject onto which the predetermined pattern image is projected.

The invention of claim 5 is the compound-eye imaging apparatus of claim 1, characterized in that the diffractive optical element converts coherent light incident from outside into light with a uniform intensity distribution, and this light illuminates the subject.

The invention of claim 6 is the compound-eye imaging apparatus of claim 5, characterized by further comprising an image processing apparatus that calculates the three-dimensional shape of the subject from the single-eye images of the subject illuminated by the light with a uniform intensity distribution, using the photometric stereo method.

According to the invention of claim 1, since a diffractive optical element is merely provided on the optical lens array, the apparatus main body does not become large, and it can easily be advanced into and operated in a narrow space such as a person's oral cavity. In addition, since the coherent light incident from outside is optically converted by the diffractive optical element and emitted toward the subject, the subject is properly illuminated, and an accurate three-dimensional shape can easily be acquired from the plurality of single-eye images captured by the solid-state imaging element.

According to the invention of claim 2, the diffractive optical element reflects the coherent light and projects a predetermined pattern image onto the subject, so an accurate three-dimensional shape can easily be acquired.

According to the invention of claim 3, the diffractive optical element transmits the coherent light and projects a predetermined pattern image onto the subject, so an accurate three-dimensional shape can easily be acquired.

According to the invention of claim 4, the image processing apparatus can easily acquire an accurate three-dimensional shape based on the pattern image projected onto the subject.

According to the invention of claim 5, the subject is illuminated by the light with a uniform intensity distribution converted by the diffractive optical element, so an accurate three-dimensional shape can easily be acquired.

According to the invention of claim 6, the image processing apparatus can easily acquire an accurate three-dimensional shape from the single-eye images of the subject illuminated by the uniform light.

(First Embodiment)
A compound-eye imaging apparatus according to the first embodiment of the present invention will be described with reference to FIGS. 1 to 3. The compound-eye imaging apparatus 1 of this embodiment is configured as an apparatus for measuring the degree of gum swelling in dental practice and, as shown in FIGS. 1 and 2, comprises an imaging apparatus body 2 that condenses and captures light from the subject (the gums), and an image processing apparatus 3 that calculates the three-dimensional shape of the subject from the image information (a plurality of single-eye images) captured by the imaging apparatus body 2. The image processing apparatus 3 is composed of a microcomputer, is connected to the imaging apparatus body 2 via a connection cable 4, and is placed on the floor or the like.

The imaging apparatus body 2 comprises an optical lens array 6 in which nine optical lenses L11, L12, ..., L33 that condense light from the subject are supported on a common lens holder 5; a plate-shaped solid-state imaging element 7 that captures the single-eye images k11, k12, ..., k33 formed by the respective optical lenses; a diffractive optical element 8 bonded to an empty space on the upper surface of the lens holder 5; and a laser light source device 10 that irradiates the diffractive optical element 8 with laser light (coherent light) 9. As shown in FIG. 2, the imaging apparatus body 2 is made small enough (for example, the optical lens array 6 is about 10 mm square) that an operator such as a dentist can bring it close to the subject's oral cavity or insert it into the oral cavity.

As shown in FIG. 1, the laser light source device 10 comprises a laser light source 11 composed of a semiconductor laser element, and an optical fiber 12 that guides the laser light 9 emitted from the laser light source 11 to a point obliquely above the diffractive optical element 8.

The diffractive optical element 8 is a thin quartz substrate with fine irregularities formed on its surface; it reflects the laser light 9 incident from a predetermined direction and, by holographic action, projects a large number of bright spots p (a pattern image) arranged in a vertical and horizontal grid onto the surface of the subject (gums) H (see FIG. 2). The surface relief of the diffractive optical element 8 is designed so that the direction in which each bright spot p is projected onto the subject H forms a predetermined angle, set in advance, with respect to the imaging apparatus body 2.

The diffractive optical element 8 may be bonded with adhesive at a predetermined position on the lens holder 5, or may be embedded and fixed in a recess formed in advance at a predetermined position of the lens holder 5. Alternatively, it may be formed directly on the surface of the lens holder 5 using an etching technique.

As described above, in this embodiment, when the operator brings the imaging apparatus body 2 close to the subject (gums) H and takes an image while the laser light source device 10 irradiates the diffractive optical element 8 with the laser light 9, a large number of bright spots p, whose projection directions are set in advance, are projected onto the surface of the subject H, and their image is captured by the solid-state imaging element 7 as the single-eye images k11, k12, ..., k33. The image processing apparatus 3 then calculates the three-dimensional shape of the subject H from the captured single-eye images k11, k12, ..., k33 based on the principle of triangulation.

Next, the method by which the image processing apparatus 3 calculates the three-dimensional shape of the subject H will be described with reference to FIG. 3. FIG. 3 shows the positional relationship, in the plane V (see FIG. 1) that contains one bright spot p projected on the subject H and the diffractive optical element 8 and is perpendicular to the optical lens array 6, of the laser light 9c reflected from the diffractive optical element 8, the optical lenses L31, L22, and L13, and the solid-state imaging element 7.

The surface position (degree of swelling) of the subject (gums) H corresponds to the distance Z between the bright spot p and the optical lens array 6 and is unknown. On the other hand, the following are known: the spacing f between the optical lens L22, which condenses the reflected light 9r from the bright spot p, and the solid-state imaging element 7; the horizontal distance B between the center of the optical lens L22 and the diffractive optical element 8; the vertical distance h between the center of the optical lens L22 and the surface of the diffractive optical element 8; and the emission angle θ of the light 9c that is reflected by the diffractive optical element 8 and forms the bright spot p. The position of the image pa of the bright spot p formed on the solid-state imaging element 7 can be obtained from the position of the pixel g at which the image pa is captured; let du be the distance of the bright-spot image pa from the center of the optical lens L22.

The known values (f, B, h, θ) are stored in advance in the image processing apparatus 3 for each bright spot p and for each optical lens L11, L12, ..., L33, and the distance du is measured from the position of the bright-spot image pa in the captured single-eye images k11, k12, ..., k33. Based on these values (f, B, h, θ, du), the image processing apparatus 3 calculates the distance Z between the bright spot p and the optical lens array 6 from the following equation stored in the apparatus:
Z = (B × f × tanθ + f × h) / (f − du × tanθ) ... (A)

The derivation of this equation is as follows. In the plane of FIG. 3 (the same plane as plane V in FIG. 1), take the horizontal direction as the x axis and the vertical direction as the z axis, with the origin (x = 0, z = 0) at the center of the optical lens L22 that condenses the reflected light 9r from the bright spot p. If the intercept of the light 9c on the z axis is a, the ray is expressed as z = tanθ × x + a (Equation 1). Expressing the intercept a in terms of h, B, and θ gives a = h + B × tanθ (Equation 2). Substituting Equation 2 into Equation 1 yields z = tanθ × x + h + B × tanθ (Equation 3). Equation 3 is the function representing the light 9c.

On the other hand, the function representing the reflected light 9r traveling from the bright spot p toward the solid-state imaging element 7 passes through the point (x = −du, z = −f), so z = (f/du) × x (Equation 4). Rearranging Equation 4 gives x = (du/f) × z (Equation 5). Substituting Equation 5 into Equation 3 yields z = tanθ × {(du/f) × z} + h + B × tanθ (Equation 6). Solving Equation 6 for z gives the formula (A) for the distance Z above.
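As a numerical sanity check, formula (A) can be implemented directly and cross-checked by intersecting the two rays of Equations 3 and 4. The following is an illustrative sketch, not part of the patent; the function names and sample values are ours, all lengths are assumed to be in the same unit, and θ is in radians:

```python
import math

def bright_spot_distance(f, B, h, theta, du):
    """Distance Z between the bright spot p and the lens array, per formula (A).

    f: lens-to-sensor spacing; B: horizontal lens-center-to-DOE distance;
    h: vertical lens-center-to-DOE distance; theta: emission angle of ray 9c;
    du: offset of the bright-spot image pa from the lens center.
    """
    t = math.tan(theta)
    return (B * f * t + f * h) / (f - du * t)

def intersect_rays(f, B, h, theta, du):
    """Cross-check: intersect ray 9c (z = tanθ·x + h + B·tanθ, Equation 3)
    with reflected ray 9r (z = (f/du)·x, Equation 4) and return the z value,
    which should equal Z from formula (A)."""
    t = math.tan(theta)
    # tanθ·x + h + B·tanθ = (f/du)·x  =>  x = du·(h + B·tanθ) / (f − du·tanθ)
    x = du * (h + B * t) / (f - du * t)
    return (f / du) * x
```

For sample values such as f = 2, B = 4, h = 1, θ = 0.5 rad, du = 0.3, both routes give the same Z, as the derivation requires.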

By the above procedure, the image processing apparatus 3 calculates the distance Z between the bright spot p and the imaging apparatus body 2 for each bright spot p projected on the surface of the subject H. Since the position of each optical lens L11, L12, ..., L33 on the optical lens array 6 (its position in the xy plane of FIG. 1) is known, the three-dimensional information of each bright spot p can be calculated, and the surface shape (degree of swelling) of the subject (gums) H can be estimated. The bright spots p projected on the subject H can be arranged in various layouts (for example, radially) besides the vertical and horizontal grid shown in FIG. 2; by spacing the bright spots p more densely, the three-dimensional shape of the subject's surface can be estimated more precisely.
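The per-spot three-dimensional reconstruction can be pictured as follows: applying the pinhole relation of Equation 5 (x = (du/f) × z) to both sensor axes, the distance Z and the known lens-center position on the array yield the spot's position. This sketch, including the two-axis extension, the xy convention, and the names, is our illustration under that assumption, not a procedure stated in the patent:

```python
def bright_spot_xyz(lens_x, lens_y, f, Z, du, dv):
    """Reconstruct an illustrative 3-D position of a bright spot p.

    lens_x, lens_y: known position of the imaging lens on the array
    (the xy plane of FIG. 1); f: lens-to-sensor spacing; Z: distance from
    formula (A); du, dv: offsets of the bright-spot image pa from the
    lens center along the two sensor axes.
    Assumes the pinhole relation x = (du/f)*z on each axis (cf. Equation 5).
    """
    return (lens_x + du * Z / f, lens_y + dv * Z / f, Z)
```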

When the subject H is the gums, a part corresponding to a raised apex (for example, part m in FIG. 3) may produce specular reflection. In that case, the reflected light 9r from the specular portion has high energy and saturates the pixels g of the solid-state imaging element 7, so the positions of bright spots p near the specular portion (the positions of their images pa) cannot be identified (the distance du cannot be measured), and the distance Z between those bright spots p and the imaging apparatus body 2 may become impossible to calculate. In this embodiment, however, the optical lenses L11, L12, ..., L33 each condense light from the subject H at a different angle, so the position at which the specular portion appears differs from one single-eye image k11, k12, ..., k33 to another. Therefore, even when the distance Z of a given bright spot p cannot be calculated in a particular single-eye image (for example, k11), the distance Z of the same bright spot p can be calculated from another single-eye image (for example, k33).

Furthermore, when the subject H is the gums, the data acquired as described above at the time of a specific patient's dental treatment (the three-dimensional information of each bright spot p) can be saved to a storage medium such as a memory, and by comparing it with data acquired again after the treatment is completed, the effect of the treatment (the reduction in gum swelling) can easily be confirmed. When comparing the three-dimensional information, the change can be measured more accurately by first registering the two data sets on the tooth portions, which are unaffected by swelling, and then comparing the gum portions.
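A minimal sketch of this before/after comparison: the two data sets are first aligned on bright spots falling on the tooth region (assumed rigid and unaffected by swelling), here by a simple mean vertical offset, and the gum-region spots are then differenced. This is our illustration of the idea, not the patent's registration procedure:

```python
def swelling_change(before, after, tooth_idx, gum_idx):
    """Compare two sets of bright-spot heights (Z values) from the same
    spot layout, captured before and after treatment.

    before, after: lists of Z per bright spot; tooth_idx: indices of spots
    on the (rigid) tooth region used for alignment; gum_idx: indices of
    gum-region spots to compare.
    Returns the per-spot height change on the gums after removing the
    mean offset measured on the teeth."""
    offset = sum(after[i] - before[i] for i in tooth_idx) / len(tooth_idx)
    return [after[i] - before[i] - offset for i in gum_idx]
```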

(Second Embodiment)
Next, a compound-eye imaging apparatus 1 according to the second embodiment will be described with reference to FIG. 4. The compound-eye imaging apparatus 1 of this embodiment has substantially the same configuration as the first embodiment; identical components are given the same reference numerals and their description is omitted. The differences are that the diffractive optical element 28 provided on the optical lens array 6 is a transmissive element, and that the optical path of the laser light 9 emitted from the laser light source device 20 to the diffractive optical element 28 is provided in the space between the optical lens array 6 and the solid-state imaging element 7.

The diffractive optical element 28 of this embodiment is, much like that of the first embodiment, a transparent quartz substrate with fine irregularities formed on its surface; it transmits the laser light 9 incident on its lower surface, converts it into light 9c that forms a large number of bright spots (a pattern image) arranged in a vertical and horizontal grid, and emits it toward the surface of the subject (gums) H.

The diffractive optical element 28 may be fitted and fixed into an opening made in advance at a predetermined position of the lens holder 5, or may be embedded and fixed in a recess formed in advance at a predetermined position of the lens holder 5. In the latter case, the lens holder 5 is either formed of a light-transmitting material, or a through hole that passes the laser light 9 from below is formed in the central part of the recess.

The laser light source device 20 of this embodiment comprises a laser light source 21 composed of a semiconductor laser element, and a prism 22 that guides the laser light 9 emitted from the laser light source 21 to the lower surface of the diffractive optical element 28. The prism 22 is placed between the optical lens array 6 and the solid-state imaging element 7, at a position out of the path of the light that is condensed by the optical lenses L11, L12, ..., L33 and travels to the solid-state imaging element 7. Since the optical path of the laser light 9 from the laser light source 21 to the diffractive optical element 28 is thus formed in the space between the optical lens array 6 and the solid-state imaging element 7, the laser light 9 emitted from the laser light source 21 does not affect the paths of the light condensed by the optical lenses L11, L12, ..., L33 (for example, the light 9r) on the subject-H side of the optical lens array 6.

Like the first embodiment, the compound-eye imaging apparatus 1 of this embodiment has a compact imaging apparatus body 2, so an operator such as a dentist can easily bring it close to the subject's oral cavity or insert it into the oral cavity. Since a large number of bright spots p are projected onto the subject (gums) H at the time of imaging, the image processing apparatus 3 can calculate the three-dimensional information of each bright spot p by the same procedure as in the first embodiment, and the three-dimensional shape of the subject (gums) H can be estimated accurately.

In this embodiment, moreover, the optical path of the laser light 9 from the laser light source 21 to the diffractive optical element 28 is, as described above, formed inside the imaging apparatus body 2 and cannot affect other light on the subject-H side of the optical lens array 6. The apparatus can therefore be configured, as shown in FIG. 5, with two laser light source devices 20a and 20b that emit laser light 9a and 9b of different wavelengths, and two diffractive optical elements 28a and 28b corresponding to the respective laser light source devices.

In this case, the bright spots projected onto the surface of the subject H by the transmitted light from the diffractive optical elements 28a and 28b differ in color, so by setting the positions of the bright spots projected by the two elements to alternate, the density of measurable bright spots can easily be increased, and the three-dimensional shape of the subject H can be estimated more accurately. For example, suppose the laser light 9a is red and the laser light 9b is blue: red bright spots pr and blue bright spots pb are then projected alternately on the surface of the subject H as shown in FIG. 6. The spacing d between a red bright spot pr and a blue bright spot pb can be made smaller than the spacing (resolution) at which the image processing apparatus 3 can distinguish bright spots p of the same color, so the number of bright spots p that can be projected per unit area can easily be increased.

The laser light source devices 20a and 20b may emit the laser beams 9a and 9b simultaneously or at times offset by a predetermined interval. When red and blue are emitted simultaneously, mounting a red filter on some of the individual eyes and a blue filter on the remaining individual eyes allows the position of each bright spot to be detected by the individual eyes carrying the corresponding color filter, even when a red and a blue bright spot overlap because of the shape of the subject; this improves the resolution and shortens the measurement time. When the emission times are offset, the image processing device 3 first calculates the distance Z between the subject H and the imaging apparatus main body 2 at the positions of the red bright spots pr in FIG. 6, and then calculates the distance Z at the positions of the blue bright spots pb so as to interpolate between the positions already measured.
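The offset-timing case above amounts to merging two sets of depth samples into one denser profile. A minimal sketch under that reading, with invented positions and distances purely for illustration:

```python
# Hypothetical sketch of the two-pass measurement: distances Z measured
# at the red bright spots are combined with those measured at the
# interleaved blue bright spots, giving a depth profile with twice the
# sampling density.  All coordinates and Z values are invented.

def merge_depth_samples(red, blue):
    """red, blue: lists of (position, distance_Z) tuples.
    Returns a single profile sorted by position."""
    return sorted(red + blue, key=lambda s: s[0])

red_spots  = [(0.0, 10.2), (2.0, 10.8), (4.0, 11.1)]  # measured first
blue_spots = [(1.0, 10.5), (3.0, 11.0)]               # measured second
profile = merge_depth_samples(red_spots, blue_spots)
print(profile)
```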

(Third Embodiment)

Next, a compound-eye imaging apparatus 1 according to a third embodiment will be described with reference to FIG. 7. The compound-eye imaging apparatus 1 of this embodiment has substantially the same configuration as that shown in FIG. 5 for the second embodiment. It differs in three respects: the laser beams 9a and 9b emitted by the laser light source devices 20a and 20b have the same wavelength; the light that the diffractive optical elements 28a and 28b produce from the laser beams is not light that forms bright spots p but light with a uniform intensity distribution; and the image processing device 3 calculates the three-dimensional shape using the photometric stereo method. The minute surface irregularities of the diffractive optical elements 28a and 28b of this embodiment are designed so that the emitted light 9c has a uniform intensity (illuminance).

Specifically, as shown in FIG. 7, the light 9c emitted by each diffractive optical element 28a, 28b is the superposition of minute spherical waves mw produced by the surface irregularities of the element, so its intensity distribution is uniform. On the subject H illuminated by this uniform-intensity light 9c, shadows form that depend only on the direction of the illuminating light 9c, regardless of the distance between the subject H and each diffractive optical element 28a, 28b. The image processing device 3 of this embodiment calculates the three-dimensional shape of the subject H from the density of the shadows thus formed on the subject H, using the known photometric stereo method.

For example, among the shadows produced by the light 9c emitted by the diffractive optical element 28a, shadow regions Hfa of equal density form on surfaces that have the same inclination angle with respect to the diffractive optical element 28a, while a darker shadow region Hfb forms on a surface with a larger inclination angle. Because the shadow density produced by illumination of a given intensity is uniquely determined by the inclination angle of the illuminated surface, the image processing device 3 can measure the shadow density in the single-eye image k22 of the subject H illuminated by the light 9c from the diffractive optical element 28a, and thereby calculate the inclination angle, with respect to the diffractive optical element 28a, of the small surface patch at each position on the surface of the subject H.
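The one-to-one relation between shading and inclination that this step relies on can be sketched for an idealized Lambertian surface, where observed intensity falls off as the cosine of the tilt. The incident intensity `I0` and the measured values are invented illustration data, not values from the patent:

```python
# Minimal sketch: under uniform illumination of known intensity, the
# observed shading of a small Lambertian patch determines its tilt
# angle uniquely (I_obs = I0 * cos(theta), so theta = acos(I_obs/I0)).

import math

def tilt_angle_deg(observed: float, incident: float) -> float:
    """Tilt of a Lambertian patch relative to the illumination
    direction, recovered from observed vs. incident intensity."""
    return math.degrees(math.acos(observed / incident))

I0 = 100.0
print(round(tilt_angle_deg(100.0, I0), 1))  # fully lit patch: 0.0 deg
print(round(tilt_angle_deg(50.0, I0), 1))   # darker shadow: 60.0 deg
```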

Similarly, the image processing device 3 illuminates the subject H with uniform intensity using the light 9c from the diffractive optical element 28b and captures a single-eye image k22. From the shadow densities in that image it calculates the inclination angle, with respect to the diffractive optical element 28b, of the small surface patch at each position on the surface of the subject H, and by combining these with the already calculated inclination angles with respect to the diffractive optical element 28a, it can estimate the three-dimensional shape of the subject H.

It is preferable to arrange diffractive optical elements at three locations on the optical lens array 6, illuminate the subject H with uniform-intensity light 9c from three directions, calculate the inclination angles of the small surface patches of the subject H from the single-eye images captured under each illumination direction, and combine them to estimate the three-dimensional shape of the subject H. However, when the shapes of individual subjects do not differ greatly, as when the subject H is a gum region, a meaningful estimate of the shape of the subject H can be obtained by combining the data (the inclination angles of the small surface patches) calculated from illumination in two directions as described above.
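The combination step above is the classic photometric stereo computation: given per-pixel intensities under several known illumination directions, the surface normal follows by least squares. This is a generic sketch of that technique under a Lambertian assumption, not the patent's specific algorithm; the light directions and the synthetic pixel are invented:

```python
# Hedged sketch of classic photometric stereo: intensities I measured
# under known unit light directions L give g = albedo * normal via
# least squares; normalizing g yields the surface normal.

import numpy as np

def estimate_normal(L: np.ndarray, I: np.ndarray) -> np.ndarray:
    """L: (k, 3) unit light directions; I: (k,) observed intensities.
    Returns the unit surface normal (Lambertian assumption)."""
    g, *_ = np.linalg.lstsq(L, I, rcond=None)  # g = albedo * normal
    return g / np.linalg.norm(g)

L = np.array([[0.0, 0.0, 1.0],
              [0.6, 0.0, 0.8],
              [0.0, 0.6, 0.8]])         # three illumination directions
I = L @ np.array([0.0, 0.0, 1.0])       # synthetic pixel: a flat patch
print(estimate_normal(L, I))            # recovers approximately [0, 0, 1]
```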

Furthermore, when judging the degree of gum swelling before and after dental treatment as described above, a meaningful estimate (whether the swelling of the gum has grown or shrunk) can also be obtained by a simpler method: illuminating the gum with the uniform-intensity light 9c from a single diffractive optical element 8 provided on the optical lens array 6 and calculating the inclination angles of the gum surface from the captured single-eye image k22.

As in the first embodiment, even when part of the gum exhibits specular reflection, a single-eye image in which that same part is free of specular reflection can be found among the plurality of single-eye images, and the image data of that part of that single-eye image can be used.

FIG. 1 is a diagram showing the configuration of a compound-eye imaging apparatus according to the first embodiment of the present invention.
FIG. 2 is a diagram showing how the imaging apparatus main body of the compound-eye imaging apparatus is used close to a subject.
FIG. 3 is a diagram showing the optical path of the laser light, and the arrangement of the optical lenses and the solid-state imaging element, when capturing the bright spots projected on the surface of the subject in the compound-eye imaging apparatus.
FIG. 4 is a diagram showing the configuration of a compound-eye imaging apparatus according to the second embodiment of the present invention.
FIG. 5 is a diagram showing a modification of the compound-eye imaging apparatus.
FIG. 6 is a diagram showing the arrangement of bright spots in the modification of the compound-eye imaging apparatus.
FIG. 7 is a diagram showing the light emitted from the diffractive optical elements and the shadows formed on the subject in the compound-eye imaging apparatus according to the third embodiment of the present invention.

Explanation of Symbols

1 Compound-eye imaging apparatus
3 Image processing device
6 Optical lens array
7 Solid-state imaging element
8 Diffractive optical element
9, 9a, 9b Laser light (coherent light)
28, 28a, 28b Diffractive optical element
H Subject
L11, L12, ..., L33 Optical lens
k11, k12, ..., k33 Single-eye image
p, pb, pr Bright spot (pattern image)

Claims (6)

1. A compound-eye imaging apparatus comprising: an optical lens array in which a plurality of optical lenses that condense light from a subject are arranged on the same plane; and a solid-state imaging element that captures the single-eye images respectively formed by the optical lenses, wherein a diffractive optical element that optically converts coherent light incident from the outside and emits it toward the subject is provided on the optical lens array.
2. The compound-eye imaging apparatus according to claim 1, wherein the diffractive optical element reflects the coherent light incident from the outside, and the reflected light projects a predetermined pattern image on the subject.
3. The compound-eye imaging apparatus according to claim 1, wherein the diffractive optical element transmits the coherent light incident from the outside, and the transmitted light projects a predetermined pattern image on the subject.
4. The compound-eye imaging apparatus according to claim 2 or claim 3, further comprising an image processing device that calculates the three-dimensional shape of the subject from a single-eye image, captured by the solid-state imaging element, of the subject on which the predetermined pattern image is projected.
5. The compound-eye imaging apparatus according to claim 1, wherein the diffractive optical element converts the coherent light incident from the outside into light having a uniform intensity distribution, and this light illuminates the subject.
6. The compound-eye imaging apparatus according to claim 5, further comprising an image processing device that calculates the three-dimensional shape of the subject, using a photometric stereo method, from a single-eye image of the subject illuminated with the light having a uniform intensity distribution.
JP2008048483A 2008-02-28 2008-02-28 Compound-eye imaging apparatus Withdrawn JP2009204991A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2008048483A JP2009204991A (en) 2008-02-28 2008-02-28 Compound-eye imaging apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2008048483A JP2009204991A (en) 2008-02-28 2008-02-28 Compound-eye imaging apparatus

Publications (1)

Publication Number Publication Date
JP2009204991A true JP2009204991A (en) 2009-09-10

Family

ID=41147310

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2008048483A Withdrawn JP2009204991A (en) 2008-02-28 2008-02-28 Compound-eye imaging apparatus

Country Status (1)

Country Link
JP (1) JP2009204991A (en)

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010069301A (en) * 2008-09-18 2010-04-02 Steinbichler Optotechnik Gmbh Device for determining three-dimensional coordinate of object, tooth in particular
JP2012059268A (en) * 2010-09-10 2012-03-22 Dimensional Photonics International Inc Data capturing method for three-dimensional imaging
WO2012120729A1 (en) * 2011-03-10 2012-09-13 三洋電機株式会社 Information acquiring apparatus, and object detecting apparatus having information acquiring apparatus mounted therein
WO2012124208A1 (en) * 2011-03-16 2012-09-20 三洋電機株式会社 Light-emitting device, information acquisition device, and object detection device mounted therewith
WO2012132086A1 (en) * 2011-03-25 2012-10-04 三洋電機株式会社 Information acquisition device and object detection device having information acquisition device mounted therein
WO2012133081A1 (en) * 2011-03-29 2012-10-04 三洋電機株式会社 Object detection device and information acquisition device
WO2012137674A1 (en) * 2011-04-08 2012-10-11 三洋電機株式会社 Information acquisition device, projection device, and object detection device
WO2012176623A1 (en) * 2011-06-24 2012-12-27 三洋電機株式会社 Object-detecting device and information-acquiring device
WO2013015145A1 (en) * 2011-07-22 2013-01-31 三洋電機株式会社 Information acquiring apparatus and object detecting apparatus
WO2013031447A1 (en) * 2011-08-26 2013-03-07 三洋電機株式会社 Object detection device and information acquisition device
WO2013031448A1 (en) * 2011-08-26 2013-03-07 三洋電機株式会社 Object detection device and information acquisition device
JP2013538592A (en) * 2010-06-08 2013-10-17 デュレ,フランソワ Time-dependent three-dimensional measuring device using color optical impression
WO2015188286A1 (en) * 2014-06-11 2015-12-17 Quarz Partners Ag Measuring apparatus and method for three-dimensional measurement of an oral cavity
WO2017002388A1 (en) * 2015-06-30 2017-01-05 オリンパス株式会社 Image processing device, ranging system, and endoscope system
WO2017006574A1 (en) * 2015-07-03 2017-01-12 オリンパス株式会社 Image processing device, image determination system, and endoscope system
JP2017518147A (en) * 2014-03-28 2017-07-06 インテュイティブ サージカル オペレーションズ, インコーポレイテッド Quantitative three-dimensional imaging of surgical scenes
US10334227B2 (en) 2014-03-28 2019-06-25 Intuitive Surgical Operations, Inc. Quantitative three-dimensional imaging of surgical scenes from multiport perspectives
US10350009B2 (en) 2014-03-28 2019-07-16 Intuitive Surgical Operations, Inc. Quantitative three-dimensional imaging and printing of surgical implants
US10555788B2 (en) 2014-03-28 2020-02-11 Intuitive Surgical Operations, Inc. Surgical system with haptic feedback based upon quantitative three-dimensional imaging
US11266465B2 (en) 2014-03-28 2022-03-08 Intuitive Surgical Operations, Inc. Quantitative three-dimensional visualization of instruments in a field of view

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010069301A (en) * 2008-09-18 2010-04-02 Steinbichler Optotechnik Gmbh Device for determining three-dimensional coordinate of object, tooth in particular
JP2013538592A (en) * 2010-06-08 2013-10-17 デュレ,フランソワ Time-dependent three-dimensional measuring device using color optical impression
JP2012059268A (en) * 2010-09-10 2012-03-22 Dimensional Photonics International Inc Data capturing method for three-dimensional imaging
WO2012120729A1 (en) * 2011-03-10 2012-09-13 三洋電機株式会社 Information acquiring apparatus, and object detecting apparatus having information acquiring apparatus mounted therein
WO2012124208A1 (en) * 2011-03-16 2012-09-20 三洋電機株式会社 Light-emitting device, information acquisition device, and object detection device mounted therewith
CN102822622A (en) * 2011-03-25 2012-12-12 三洋电机株式会社 Object detecting device and information acquiring device
WO2012132086A1 (en) * 2011-03-25 2012-10-04 三洋電機株式会社 Information acquisition device and object detection device having information acquisition device mounted therein
JP5174285B1 (en) * 2011-03-25 2013-04-03 三洋電機株式会社 Information acquisition device and object detection device equipped with information acquisition device
CN102812414A (en) * 2011-03-29 2012-12-05 三洋电机株式会社 Object detecting device and information acquiring device
JP5106710B2 (en) * 2011-03-29 2012-12-26 三洋電機株式会社 Object detection device and information acquisition device
WO2012133081A1 (en) * 2011-03-29 2012-10-04 三洋電機株式会社 Object detection device and information acquisition device
WO2012137674A1 (en) * 2011-04-08 2012-10-11 三洋電機株式会社 Information acquisition device, projection device, and object detection device
WO2012176623A1 (en) * 2011-06-24 2012-12-27 三洋電機株式会社 Object-detecting device and information-acquiring device
WO2013015145A1 (en) * 2011-07-22 2013-01-31 三洋電機株式会社 Information acquiring apparatus and object detecting apparatus
WO2013031447A1 (en) * 2011-08-26 2013-03-07 三洋電機株式会社 Object detection device and information acquisition device
WO2013031448A1 (en) * 2011-08-26 2013-03-07 三洋電機株式会社 Object detection device and information acquisition device
US10334227B2 (en) 2014-03-28 2019-06-25 Intuitive Surgical Operations, Inc. Quantitative three-dimensional imaging of surgical scenes from multiport perspectives
US10368054B2 (en) 2014-03-28 2019-07-30 Intuitive Surgical Operations, Inc. Quantitative three-dimensional imaging of surgical scenes
US11304771B2 (en) 2014-03-28 2022-04-19 Intuitive Surgical Operations, Inc. Surgical system with haptic feedback based upon quantitative three-dimensional imaging
US11266465B2 (en) 2014-03-28 2022-03-08 Intuitive Surgical Operations, Inc. Quantitative three-dimensional visualization of instruments in a field of view
JP2017518147A (en) * 2014-03-28 2017-07-06 インテュイティブ サージカル オペレーションズ, インコーポレイテッド Quantitative three-dimensional imaging of surgical scenes
US10555788B2 (en) 2014-03-28 2020-02-11 Intuitive Surgical Operations, Inc. Surgical system with haptic feedback based upon quantitative three-dimensional imaging
US10350009B2 (en) 2014-03-28 2019-07-16 Intuitive Surgical Operations, Inc. Quantitative three-dimensional imaging and printing of surgical implants
WO2015188286A1 (en) * 2014-06-11 2015-12-17 Quarz Partners Ag Measuring apparatus and method for three-dimensional measurement of an oral cavity
CN107072498A (en) * 2015-06-30 2017-08-18 奥林巴斯株式会社 Image processing apparatus, range-measurement system and endoscopic system
WO2017002388A1 (en) * 2015-06-30 2017-01-05 オリンパス株式会社 Image processing device, ranging system, and endoscope system
JP6064106B1 (en) * 2015-06-30 2017-01-18 オリンパス株式会社 Image processing apparatus, capsule endoscope system, and endoscope system
CN107529969A (en) * 2015-07-03 2018-01-02 奥林巴斯株式会社 Image processing apparatus, image discriminating system and endoscopic system
JPWO2017006574A1 (en) * 2015-07-03 2017-07-06 オリンパス株式会社 Image processing apparatus and endoscope system
WO2017006574A1 (en) * 2015-07-03 2017-01-12 オリンパス株式会社 Image processing device, image determination system, and endoscope system

Similar Documents

Publication Publication Date Title
JP2009204991A (en) Compound-eye imaging apparatus
US10260869B2 (en) Chromatic confocal system
US11629954B2 (en) Intraoral scanner with fixed focal position and/or motion tracking
US20210298605A1 (en) Intraoral scanner
JP6779199B2 (en) Equipment for dental confocal imaging
US9539070B2 (en) Method and device for carrying out optical pickup
US7259871B2 (en) Apparatus and method for rapid and precise scanning of three-dimensional occlusal profile of dental cast
JP5189287B2 (en) Dental laser digitizer system
JP2010507079A (en) Apparatus and method for non-contact detection of 3D contours
JP2016528972A (en) System, method, and computer program for 3D contour data collection and caries detection
JP2010246899A (en) Dental surface imaging using polarized fringe projection
JP2004053532A (en) Optical shape measuring device
KR101269128B1 (en) Surface roughness measurement apparatus and method having intermediate view generator
JP3848586B2 (en) Surface inspection device
JPH01209415A (en) Endoscope device with measuring function
TW201633370A (en) Exposure device
DK2428162T3 (en) Method of recording data for three-dimensional imaging of intra-oral cavities
Kagawa et al. A compact shape-measurement module based on a thin compound-eye camera with multiwavelength diffractive pattern projection for intraoral diagnosis
JP2021060214A (en) Measuring apparatus and measuring method
JP2013083509A (en) Shape measuring apparatus and adjusting method for shape measuring apparatus

Legal Events

Date Code Title Description
A300 Withdrawal of application because of no request for examination

Free format text: JAPANESE INTERMEDIATE CODE: A300

Effective date: 20110510