JPWO2020018931A5 - Systems and methods for multi-modal sensing of depth in vision systems for automated surgical robots - Google Patents

Info

Publication number
JPWO2020018931A5
JPWO2020018931A5
Authority
JP
Japan
Prior art keywords
depth
weight
markers
image
item
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2021526407A
Other languages
Japanese (ja)
Other versions
JP2021531482A (en)
JP7297891B2 (en)
Publication date
Application filed
Priority claimed from PCT/US2019/042647 (WO2020018931A1)
Publication of JP2021531482A
Publication of JPWO2020018931A5
Application granted
Publication of JP7297891B2
Legal status: Active (current)
Anticipated expiration

Links

Description

種々の実施形態では、近位端および遠位端を有する、内視鏡と、内視鏡の遠位端に光学的に結合される、結像デバイスと、それを用いて具現化されるプログラム命令を有する、コンピュータ可読記憶媒体を備える、コンピューティングノードとを含む、統合された外科手術デバイスが、提供される。プログラム命令は、コンピューティングノードのプロセッサによって、プロセッサに、画像が記録される、方法を実施させるように実行可能である。画像は、物体と、物体上に配置される、第1の複数のマーカと、物体上に配置される、第2の複数のマーカと、物体上に配置される、第3の複数のマーカとを含む。第1の深度は、画像および第1の複数のマーカを使用して算出される。第2の深度は、画像および第2の複数のマーカを使用して算出される。第3の深度は、画像および第3の複数のマーカを使用して算出される。第1の加重は、第1の深度に割り当てられ、第2の加重は、第2の深度に割り当てられ、第3の加重は、第3の深度に割り当てられる。加重平均深度は、第1の深度、第2の深度、第3の深度、第1の加重、第2の加重、および第3の加重に基づいて算出される。
本発明は、例えば、以下の項目を提供する。
(項目1)
方法であって、
画像を記録することであって、前記画像は、物体と、前記物体上に配置される第1の複数のマーカと、前記物体上に配置される第2の複数のマーカと、前記物体上に配置される第3の複数のマーカとを備える、ことと、
前記画像および前記第1の複数のマーカを使用して、第1の深度を算出することと、
前記画像および前記第2の複数のマーカを使用して、第2の深度を算出することと、
前記画像および前記第3の複数のマーカを使用して、第3の深度を算出することと、
第1の加重を前記第1の深度に割り当て、第2の加重を前記第2の深度に割り当て、第3の加重を前記第3の深度に割り当てることと、
前記第1の深度、第2の深度、第3の深度、第1の加重、第2の加重、および第3の加重に基づいて、加重平均深度を算出することと
を含む、方法。
(項目2)
前記画像を記録することは、1つ以上のデジタルカメラを用いて実施される、項目1に記載の方法。
(項目3)
前記1つ以上のデジタルカメラは、立体視カメラシステムを備える、項目2に記載の方法。
(項目4)
前記1つ以上のデジタルカメラは、プレノプティックカメラを備える、項目2または3に記載の方法。
(項目5)
前記第1の複数のマーカは、基点マーカを備える、前記項目のいずれかに記載の方法。
(項目6)
前記基点マーカは、液体インクを備える、項目5に記載の方法。
(項目7)
前記画像を記録することは、
構造化された光源から前記物体の表面上に構造化された光パターンをオーバーレイすることと、
前記構造化された光パターンを前記物体上に記録することと、
前記構造化された光パターンの幾何学的再構築を算出することと
を含む、前記項目のいずれかに記載の方法。
(項目8)
前記画像を記録することは、
光源から前記物体の表面上に光パターンをオーバーレイすることと、
前記光パターンの第1の画像を第1の場所における第1のカメラを用いて記録することと、
前記光パターンの第2の画像を前記第1の場所から所定の距離だけ離れた第2の場所における第2のカメラを用いて記録することと、
前記第1の画像と前記第2の画像との間の不均衡値を算出することと
を含む、前記項目のいずれかに記載の方法。
(項目9)
前記第3の複数のマーカは、前記物体の表面に適用される造影剤を備える、前記項目のいずれかに記載の方法。
(項目10)
前記造影剤は、霧状液体染料である、項目9に記載の方法。
(項目11)
前記第3の加重は、前記第1の加重および前記第2の加重を上回る、項目1-10のいずれか1項に記載の方法。
(項目12)
前記第2の加重は、前記第1の加重および前記第3の加重を上回る、項目1-10のいずれか1項に記載の方法。
(項目13)
前記第1の加重は、前記第2の加重および前記第3の加重を上回る、項目1-10のいずれか1項に記載の方法。
(項目14)
前記第1の加重は、前記第2の加重および前記第3の加重に等しい、項目1-10のいずれか1項に記載の方法。
(項目15)
システムであって、
結像デバイスと、
コンピュータ可読記憶媒体を備えるコンピューティングノードであって、前記コンピュータ可読記憶媒体は、それとともに具現化されるプログラム命令を有し、前記プログラム命令は、前記コンピューティングノードのプロセッサに、
画像を記録することであって、前記画像は、物体と、前記物体上に配置される第1の複数のマーカと、前記物体上に配置される第2の複数のマーカと、前記物体上に配置される第3の複数のマーカとを備える、ことと、
前記画像および前記第1の複数のマーカを使用して、第1の深度を算出することと、
前記画像および前記第2の複数のマーカを使用して、第2の深度を算出することと、
前記画像および前記第3の複数のマーカを使用して、第3の深度を算出することと、
第1の加重を前記第1の深度に割り当て、第2の加重を前記第2の深度に割り当て、第3の加重を前記第3の深度に割り当てることと、
前記第1の深度、第2の深度、第3の深度、第1の加重、第2の加重、および第3の加重に基づいて、加重平均深度を算出することと
を含む方法を実施させるように、前記プロセッサによって実行可能である、コンピューティングノードと
を備える、システム。
(項目16)
前記結像デバイスは、1つ以上のデジタルカメラを備える、項目15に記載のシステム。
(項目17)
前記1つ以上のデジタルカメラは、立体視カメラシステムを備える、項目16に記載のシステム。
(項目18)
前記1つ以上のデジタルカメラは、プレノプティックカメラを備える、項目16または17に記載のシステム。
(項目19)
前記第1の複数のマーカは、基点マーカを備える、項目15-18のいずれか1項に記載のシステム。
(項目20)
前記基点マーカは、液体インクを備える、項目19に記載のシステム。
(項目21)
構造化された光パターンを前記物体の表面上に投影するように構成される構造化された光源をさらに備える、項目15-20のいずれか1項に記載のシステム。
(項目22)
前記画像を記録することは、
前記構造化された光源から前記物体の表面上に前記構造化された光パターンをオーバーレイすることと、
前記構造化された光パターンを前記物体上に記録することと、
前記構造化された光パターンの幾何学的再構築を算出することと
を含む、項目21に記載のシステム。
(項目23)
前記画像を記録することは、
光源から前記物体の表面上に光パターンをオーバーレイすることと、
前記光パターンの第1の画像を第1の場所における第1のカメラを用いて記録することと、
前記光パターンの第2の画像を前記第1の場所から所定の距離だけ離れた第2の場所における第2のカメラを用いて記録することと、
前記第1の画像と前記第2の画像との間の不均衡値を算出することと
を含む、項目15-22のいずれか1項に記載のシステム。
(項目24)
前記第3の複数のマーカは、前記物体の表面に適用される造影剤を備える、項目15-23のいずれか1項に記載のシステム。
(項目25)
前記造影剤は、霧状液体染料である、項目24に記載のシステム。
(項目26)
前記第3の加重は、前記第1の加重および前記第2の加重を上回る、項目15-25のいずれか1項に記載のシステム。
(項目27)
前記第2の加重は、前記第1の加重および前記第3の加重を上回る、項目15-25のいずれか1項に記載のシステム。
(項目28)
前記第1の加重は、前記第2の加重および前記第3の加重を上回る、項目15-25のいずれか1項に記載のシステム。
(項目29)
前記第1の加重は、前記第2の加重および前記第3の加重に等しい、項目15-25のいずれか1項に記載のシステム。
(項目30)
近位および遠位端を有する内視鏡をさらに備え、前記結像デバイスは、前記近位端に配置される、項目15-29のいずれか1項に記載のシステム。
(項目31)
コンピュータ可読記憶媒体を備えるコンピュータプログラム製品であって、前記コンピュータ可読記憶媒体は、それとともに具現化されるプログラム命令を有し、前記プログラム命令は、プロセッサに、
画像を記録することであって、前記画像は、物体と、前記物体上に配置される第1の複数のマーカと、前記物体上に配置される第2の複数のマーカと、前記物体上に配置される第3の複数のマーカとを備える、ことと、
前記画像および前記第1の複数のマーカを使用して、第1の深度を算出することと、
前記画像および前記第2の複数のマーカを使用して、第2の深度を算出することと、
前記画像および前記第3の複数のマーカを使用して、第3の深度を算出することと、
第1の加重を前記第1の深度に割り当て、第2の加重を前記第2の深度に割り当て、第3の加重を前記第3の深度に割り当てることと、
前記第1の深度、第2の深度、第3の深度、第1の加重、第2の加重、および第3の加重に基づいて、加重平均深度を算出することと
を含む方法を実施させるように、前記プロセッサによって実行可能である、コンピュータプログラム製品。
(項目32)
前記画像を記録することは、1つ以上のデジタルカメラを用いて実施される、項目31に記載のコンピュータプログラム製品。
(項目33)
前記1つ以上のデジタルカメラは、立体視カメラシステムを備える、項目32に記載のコンピュータプログラム製品。
(項目34)
前記1つ以上のデジタルカメラは、プレノプティックカメラを備える、項目32または33に記載のコンピュータプログラム製品。
(項目35)
前記第1の複数のマーカは、基点マーカを備える、項目31-34のいずれか1項に記載のコンピュータプログラム製品。
(項目36)
前記基点マーカは、液体インクを備える、項目35に記載のコンピュータプログラム製品。
(項目37)
前記画像を記録することは、
構造化された光源から前記物体の表面上に構造化された光パターンをオーバーレイすることと、
前記構造化された光パターンを前記物体上に記録することと、
前記構造化された光パターンの幾何学的再構築を算出することと
を含む、項目31-36のいずれか1項に記載のコンピュータプログラム製品。
(項目38)
前記画像を記録することは、
光源から前記物体の表面上に光パターンをオーバーレイすることと、
前記光パターンの第1の画像を第1の場所における第1のカメラを用いて記録することと、
前記光パターンの第2の画像を前記第1の場所から所定の距離だけ離れた第2の場所における第2のカメラを用いて記録することと、
前記第1の画像と前記第2の画像との間の不均衡値を算出することと
を含む、項目31-37のいずれか1項に記載のコンピュータプログラム製品。
(項目39)
前記第3の複数のマーカは、前記物体の表面に適用される造影剤を備える、項目31-38のいずれか1項に記載のコンピュータプログラム製品。
(項目40)
前記造影剤は、霧状液体染料である、項目39に記載のコンピュータプログラム製品。
(項目41)
前記第3の加重は、前記第1の加重および前記第2の加重を上回る、項目31-40のいずれか1項に記載のコンピュータプログラム製品。
(項目42)
前記第2の加重は、前記第1の加重および前記第3の加重を上回る、項目31-40のいずれか1項に記載のコンピュータプログラム製品。
(項目43)
前記第1の加重は、前記第2の加重および前記第3の加重を上回る、項目31-40のいずれか1項に記載のコンピュータプログラム製品。
(項目44)
前記第1の加重は、前記第2の加重および前記第3の加重に等しい、項目31-40のいずれか1項に記載のコンピュータプログラム製品。
(項目45)
方法であって、
画像を記録することであって、前記画像は、物体と、前記物体上に配置される第1の複数のマーカと、前記物体上に配置される第2の複数のマーカとを備える、ことと、
前記画像および前記第1の複数のマーカを使用して、第1の深度を算出することと、
前記画像および前記第2の複数のマーカを使用して、第2の深度を算出することと、
第1の加重を前記第1の深度に割り当て、第2の加重を前記第2の深度に割り当てることと、
前記第1の深度、第2の深度、第1の加重、および第2の加重に基づいて、加重平均深度を算出することと
を含む、方法。
(項目46)
システムであって、
結像デバイスと、
コンピュータ可読記憶媒体を備えるコンピューティングノードであって、前記コンピュータ可読記憶媒体は、それとともに具現化されるプログラム命令を有し、前記プログラム命令は、前記コンピューティングノードのプロセッサに、
画像を記録することであって、前記画像は、物体と、前記物体上に配置される第1の複数のマーカと、前記物体上に配置される第2の複数のマーカとを備える、ことと、
前記画像および前記第1の複数のマーカを使用して、第1の深度を算出することと、
前記画像および前記第2の複数のマーカを使用して、第2の深度を算出することと、
第1の加重を前記第1の深度に割り当て、第2の加重を前記第2の深度に割り当てることと、
前記第1の深度、第2の深度、第1の加重、および第2の加重に基づいて、加重平均深度を算出することと
を含む方法を実施させるように、前記プロセッサによって実行可能である、コンピューティングノードと
を備える、システム。
(項目47)
コンピュータ可読記憶媒体を備えるコンピュータプログラム製品であって、前記コンピュータ可読記憶媒体は、それとともに具現化されるプログラム命令を有し、前記プログラム命令は、プロセッサに、
画像を記録することであって、前記画像は、物体と、前記物体上に配置される第1の複数のマーカと、前記物体上に配置される第2の複数のマーカとを備える、ことと、
前記画像および前記第1の複数のマーカを使用して、第1の深度を算出することと、
前記画像および前記第2の複数のマーカを使用して、第2の深度を算出することと、
第1の加重を前記第1の深度に割り当て、第2の加重を前記第2の深度に割り当てることと、
前記第1の深度、第2の深度、第1の加重、および第2の加重に基づいて、加重平均深度を算出することと
を含む方法を実施させるように、前記プロセッサによって実行可能である、コンピュータプログラム製品。
(項目48)
統合された外科手術デバイスであって、
近位端および遠位端を有する内視鏡と、
前記内視鏡の遠位端に光学的に結合される結像デバイスと、
コンピュータ可読記憶媒体を備えるコンピューティングノードであって、前記コンピュータ可読記憶媒体は、それとともに具現化されるプログラム命令を有し、前記プログラム命令は、前記コンピューティングノードのプロセッサに、
画像を記録することであって、前記画像は、物体と、前記物体上に配置される第1の複数のマーカと、前記物体上に配置される第2の複数のマーカと、前記物体上に配置される第3の複数のマーカとを備える、ことと、
前記画像および前記第1の複数のマーカを使用して、第1の深度を算出することと、
前記画像および前記第2の複数のマーカを使用して、第2の深度を算出することと、
前記画像および前記第3の複数のマーカを使用して、第3の深度を算出することと、
第1の加重を前記第1の深度に割り当て、第2の加重を前記第2の深度に割り当て、第3の加重を前記第3の深度に割り当てることと、
前記第1の深度、第2の深度、第3の深度、第1の加重、第2の加重、および第3の加重に基づいて、加重平均深度を算出することと
を含む方法を実施させるように、前記プロセッサによって実行可能である、コンピューティングノードと
を備える、統合された外科手術デバイス。
(項目49)
統合された外科手術デバイスであって、
近位端および遠位端を有する内視鏡と、
前記内視鏡の遠位端に光学的に結合される結像デバイスと、
コンピュータ可読記憶媒体を備えるコンピューティングノードであって、前記コンピュータ可読記憶媒体は、それとともに具現化されるプログラム命令を有し、前記プログラム命令は、前記コンピューティングノードのプロセッサに、
画像を記録することであって、前記画像は、物体と、前記物体上に配置される第1の複数のマーカと、前記物体上に配置される第2の複数のマーカとを備える、ことと、
前記画像および前記第1の複数のマーカを使用して、第1の深度を算出することと、
前記画像および前記第2の複数のマーカを使用して、第2の深度を算出することと、
第1の加重を前記第1の深度に割り当て、第2の加重を前記第2の深度に割り当てることと、
前記第1の深度、第2の深度、第1の加重、および第2の加重に基づいて、加重平均深度を算出することと
を含む方法を実施させるように、前記プロセッサによって実行可能である、コンピューティングノードと
を備える、統合された外科手術デバイス。
In various embodiments, an integrated surgical device is provided that includes an endoscope having a proximal end and a distal end, an imaging device optically coupled to the distal end of the endoscope, and a computing node comprising a computer-readable storage medium having program instructions embodied therewith. The program instructions are executable by a processor of the computing node to cause the processor to perform a method in which an image is recorded. The image includes an object, a first plurality of markers disposed on the object, a second plurality of markers disposed on the object, and a third plurality of markers disposed on the object. A first depth is calculated using the image and the first plurality of markers. A second depth is calculated using the image and the second plurality of markers. A third depth is calculated using the image and the third plurality of markers. A first weight is assigned to the first depth, a second weight is assigned to the second depth, and a third weight is assigned to the third depth. A weighted average depth is calculated based on the first depth, the second depth, the third depth, the first weight, the second weight, and the third weight.
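To make the weighted-averaging step concrete, the following is a minimal illustrative sketch (not part of the disclosure) of how three marker-derived depth estimates might be fused with scalar weights; the array shapes, function name, and normalization step are assumptions made for this example.

    import numpy as np

    def weighted_average_depth(depths, weights):
        """Fuse depth estimates from several marker sets into one depth map.

        depths  : list of (H, W) arrays, e.g. fiducial-, structured-light-, and
                  contrast-agent-derived depth estimates of the same scene.
        weights : list of scalar weights, one per depth estimate.
        """
        depths = np.stack(depths)                     # shape (n, H, W)
        weights = np.asarray(weights, dtype=float)    # shape (n,)
        weights = weights / weights.sum()             # normalize so the weights sum to 1
        return np.tensordot(weights, depths, axes=1)  # weighted average, shape (H, W)

    # Example: three estimates, with the third (e.g. contrast-agent-based) weighted highest.
    d1, d2, d3 = (np.full((4, 4), v) for v in (10.0, 11.0, 12.0))
    fused = weighted_average_depth([d1, d2, d3], weights=[0.2, 0.3, 0.5])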
The present invention provides, for example, the following items.
(Item 1)
a method,
recording an image, the image comprising an object, a first plurality of markers disposed on the object, a second plurality of markers disposed on the object, and a third plurality of markers disposed on the object;
calculating a first depth using the image and the first plurality of markers;
calculating a second depth using the image and the second plurality of markers;
calculating a third depth using the image and the third plurality of markers;
assigning a first weight to the first depth, a second weight to the second depth, and a third weight to the third depth;
calculating a weighted average depth based on the first depth, the second depth, the third depth, the first weight, the second weight, and the third weight;
A method, including
(Item 2)
The method of item 1, wherein recording the image is performed using one or more digital cameras.
(Item 3)
3. The method of item 2, wherein the one or more digital cameras comprise a stereoscopic camera system.
(Item 4)
Method according to item 2 or 3, wherein the one or more digital cameras comprise plenoptic cameras.
(Item 5)
The method of any of the preceding items, wherein the first plurality of markers comprises a fiducial marker.
(Item 6)
6. The method of item 5, wherein the fiducial marker comprises liquid ink.
(Item 7)
Recording the image includes:
overlaying a structured light pattern from a structured light source onto the surface of the object;
recording the structured light pattern on the object;
calculating a geometric reconstruction of the structured light pattern;
The method of any of the preceding items, comprising
(Item 8)
Recording the image includes:
overlaying a light pattern from a light source onto the surface of the object;
recording a first image of the light pattern with a first camera at a first location;
recording a second image of the light pattern with a second camera at a second location a predetermined distance from the first location;
calculating a disparity value between the first image and the second image;
The method of any of the preceding items, comprising
(Item 9)
The method of any of the preceding items, wherein the third plurality of markers comprises a contrast agent applied to the surface of the object.
(Item 10)
10. The method of item 9, wherein the contrast agent is a nebulized liquid dye.
(Item 11)
11. The method of any one of items 1-10, wherein the third weighting exceeds the first weighting and the second weighting.
(Item 12)
11. The method of any one of items 1-10, wherein the second weighting exceeds the first weighting and the third weighting.
(Item 13)
11. The method of any one of items 1-10, wherein the first weighting exceeds the second weighting and the third weighting.
(Item 14)
11. The method of any one of items 1-10, wherein the first weight is equal to the second weight and the third weight.
(Item 15)
a system,
an imaging device;
A computing node comprising a computer-readable storage medium having program instructions embodied therewith, said program instructions causing a processor of said computing node to:
recording an image, the image comprising an object, a first plurality of markers disposed on the object, a second plurality of markers disposed on the object, and a third plurality of markers disposed on the object;
calculating a first depth using the image and the first plurality of markers;
calculating a second depth using the image and the second plurality of markers;
calculating a third depth using the image and the third plurality of markers;
assigning a first weight to the first depth, a second weight to the second depth, and a third weight to the third depth;
calculating a weighted average depth based on the first depth, the second depth, the third depth, the first weight, the second weight, and the third weight;
a computing node executable by said processor to cause it to perform a method comprising
A system comprising:
(Item 16)
16. The system of item 15, wherein the imaging device comprises one or more digital cameras.
(Item 17)
17. The system of item 16, wherein the one or more digital cameras comprise a stereoscopic camera system.
(Item 18)
18. System according to item 16 or 17, wherein the one or more digital cameras comprise plenoptic cameras.
(Item 19)
19. The system of any one of items 15-18, wherein the first plurality of markers comprises fiducial markers.
(Item 20)
20. The system of item 19, wherein the fiducial marker comprises liquid ink.
(Item 21)
21. The system of any one of items 15-20, further comprising a structured light source configured to project a structured light pattern onto the surface of the object.
(Item 22)
Recording the image includes:
overlaying the structured light pattern from the structured light source onto the surface of the object;
recording the structured light pattern on the object;
calculating a geometric reconstruction of the structured light pattern;
22. The system of item 21, comprising:
(Item 23)
Recording the image includes:
overlaying a light pattern from a light source onto the surface of the object;
recording a first image of the light pattern with a first camera at a first location;
recording a second image of the light pattern with a second camera at a second location a predetermined distance from the first location;
calculating a disparity value between the first image and the second image;
23. The system of any one of items 15-22, comprising:
(Item 24)
24. The system of any one of items 15-23, wherein the third plurality of markers comprises a contrast agent applied to the surface of the object.
(Item 25)
25. The system of item 24, wherein the contrast agent is a nebulized liquid dye.
(Item 26)
26. The system of any one of items 15-25, wherein said third weighting exceeds said first weighting and said second weighting.
(Item 27)
26. The system of any one of items 15-25, wherein said second weighting exceeds said first weighting and said third weighting.
(Item 28)
26. The system of any one of items 15-25, wherein the first weighting exceeds the second weighting and the third weighting.
(Item 29)
26. The system of any one of items 15-25, wherein the first weight is equal to the second weight and the third weight.
(Item 30)
30. The system of any one of items 15-29, further comprising an endoscope having proximal and distal ends, wherein the imaging device is located at the proximal end.
(Item 31)
A computer program product comprising a computer-readable storage medium having program instructions embodied therewith, the program instructions causing a processor to:
recording an image, the image comprising an object, a first plurality of markers disposed on the object, a second plurality of markers disposed on the object, and a third plurality of markers disposed on the object;
calculating a first depth using the image and the first plurality of markers;
calculating a second depth using the image and the second plurality of markers;
calculating a third depth using the image and the third plurality of markers;
assigning a first weight to the first depth, a second weight to the second depth, and a third weight to the third depth;
calculating a weighted average depth based on the first depth, the second depth, the third depth, the first weight, the second weight, and the third weight;
A computer program product executable by said processor to cause it to perform a method comprising:
(Item 32)
32. The computer program product of item 31, wherein recording the image is performed using one or more digital cameras.
(Item 33)
33. The computer program product of item 32, wherein the one or more digital cameras comprise a stereoscopic camera system.
(Item 34)
34. Computer program product according to item 32 or 33, wherein the one or more digital cameras comprise plenoptic cameras.
(Item 35)
35. The computer program product of any one of items 31-34, wherein the first plurality of markers comprises fiducial markers.
(Item 36)
36. The computer program product of item 35, wherein the fiducial marker comprises liquid ink.
(Item 37)
Recording the image includes:
overlaying a structured light pattern from a structured light source onto the surface of the object;
recording the structured light pattern on the object;
calculating a geometric reconstruction of the structured light pattern;
37. The computer program product of any one of items 31-36, comprising:
(Item 38)
Recording the image includes:
overlaying a light pattern from a light source onto the surface of the object;
recording a first image of the light pattern with a first camera at a first location;
recording a second image of the light pattern with a second camera at a second location a predetermined distance from the first location;
calculating a disparity value between the first image and the second image;
38. The computer program product of any one of items 31-37, comprising:
(Item 39)
39. The computer program product of any one of items 31-38, wherein the third plurality of markers comprises a contrast agent applied to the surface of the object.
(Item 40)
40. The computer program product of item 39, wherein the contrast agent is a nebulized liquid dye.
(Item 41)
41. The computer program product of any one of items 31-40, wherein said third weighting exceeds said first weighting and said second weighting.
(Item 42)
41. The computer program product of any one of items 31-40, wherein said second weighting exceeds said first weighting and said third weighting.
(Item 43)
41. The computer program product of any one of items 31-40, wherein the first weighting exceeds the second weighting and the third weighting.
(Item 44)
41. The computer program product of any one of items 31-40, wherein the first weight is equal to the second weight and the third weight.
(Item 45)
a method,
recording an image, the image comprising an object, a first plurality of markers positioned on the object, and a second plurality of markers positioned on the object;
calculating a first depth using the image and the first plurality of markers;
calculating a second depth using the image and the second plurality of markers;
assigning a first weight to the first depth and a second weight to the second depth;
calculating a weighted average depth based on the first depth, the second depth, the first weight, and the second weight;
A method, including
(Item 46)
a system,
an imaging device;
A computing node comprising a computer-readable storage medium having program instructions embodied therewith, said program instructions causing a processor of said computing node to:
recording an image, the image comprising an object, a first plurality of markers positioned on the object, and a second plurality of markers positioned on the object;
calculating a first depth using the image and the first plurality of markers;
calculating a second depth using the image and the second plurality of markers;
assigning a first weight to the first depth and a second weight to the second depth;
calculating a weighted average depth based on the first depth, the second depth, the first weight, and the second weight;
a computing node executable by said processor to cause it to perform a method comprising
A system comprising:
(Item 47)
A computer program product comprising a computer-readable storage medium having program instructions embodied therewith, the program instructions causing a processor to:
recording an image, the image comprising an object, a first plurality of markers positioned on the object, and a second plurality of markers positioned on the object;
calculating a first depth using the image and the first plurality of markers;
calculating a second depth using the image and the second plurality of markers;
assigning a first weight to the first depth and a second weight to the second depth;
calculating a weighted average depth based on the first depth, the second depth, the first weight, and the second weight;
A computer program product executable by said processor to cause it to perform a method comprising:
(Item 48)
An integrated surgical device comprising:
an endoscope having a proximal end and a distal end;
an imaging device optically coupled to the distal end of the endoscope;
A computing node comprising a computer-readable storage medium having program instructions embodied therewith, said program instructions causing a processor of said computing node to:
recording an image, the image comprising an object, a first plurality of markers disposed on the object, a second plurality of markers disposed on the object, and a third plurality of markers disposed on the object;
calculating a first depth using the image and the first plurality of markers;
calculating a second depth using the image and the second plurality of markers;
calculating a third depth using the image and the third plurality of markers;
assigning a first weight to the first depth, a second weight to the second depth, and a third weight to the third depth;
calculating a weighted average depth based on the first depth, the second depth, the third depth, the first weight, the second weight, and the third weight;
a computing node executable by said processor to cause it to perform a method comprising
An integrated surgical device comprising:
(Item 49)
An integrated surgical device comprising:
an endoscope having a proximal end and a distal end;
an imaging device optically coupled to the distal end of the endoscope;
A computing node comprising a computer-readable storage medium having program instructions embodied therewith, said program instructions causing a processor of said computing node to:
recording an image, the image comprising an object, a first plurality of markers positioned on the object, and a second plurality of markers positioned on the object;
calculating a first depth using the image and the first plurality of markers;
calculating a second depth using the image and the second plurality of markers;
assigning a first weight to the first depth and a second weight to the second depth;
calculating a weighted average depth based on the first depth, the second depth, the first weight, and the second weight;
a computing node executable by said processor to cause it to perform a method comprising
An integrated surgical device comprising:
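Items 8, 23, and 38 above derive depth from the disparity between two images of a projected light pattern recorded by cameras a predetermined distance apart. A minimal sketch of the disparity-to-depth conversion for a rectified camera pair follows; the focal length and baseline values are illustrative assumptions, not values taken from the disclosure.

    import numpy as np

    def disparity_to_depth(disparity_px, focal_length_px, baseline_m, eps=1e-6):
        """Convert a stereo disparity map to depth for a rectified camera pair.

        For two cameras separated by a known baseline B and having focal length f
        (in pixels), depth Z = f * B / d, where d is the per-pixel disparity
        between the first and second images of the projected light pattern.
        """
        return focal_length_px * baseline_m / np.maximum(disparity_px, eps)

    # Illustrative usage: a uniform 25-pixel disparity with an assumed 900 px focal
    # length and 60 mm baseline yields a depth of about 2.16 m everywhere.
    disparity = np.full((480, 640), 25.0)
    depth_map = disparity_to_depth(disparity, focal_length_px=900.0, baseline_m=0.06)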

Claims (20)

1. A method of depth sensing, comprising:
(a) recording one or more images of an object within a surgical scene, the one or more images comprising a plurality of markers on the object;
(b) (i) calculating a first depth using the one or more images and a first set of markers of the plurality of markers, (ii) calculating a second depth using the one or more images and a second set of markers of the plurality of markers, and (iii) calculating a third depth using a third set of markers of the plurality of markers;
(c) (i) assigning a first weight to the first depth, (ii) assigning a second weight to the second depth, and (iii) assigning a third weight to the third depth; and
(d) calculating a weighted average depth of one or more portions of the object based at least in part on the first depth, the first weight, the second depth, the second weight, the third depth, and the third weight.
2. The method of claim 1, wherein (a) comprises using one or more imaging devices to record the one or more images.
3. The method of claim 2, wherein the one or more imaging devices comprise a stereoscopic camera, an infrared camera, a bright-field camera, a plenoptic camera, or a structured light detection unit.
4. The method of claim 2, wherein the one or more imaging devices comprise different types of imaging devices.
5. The method of claim 2, wherein at least one of the first weight and the second weight corresponds to an accuracy or precision of the one or more imaging devices.
6. The method of claim 1, wherein the first set of markers comprises one or more fiducial markers.
7. The method of claim 6, wherein the one or more fiducial markers comprise liquid ink applied to the object.
8. The method of claim 1, wherein the second set of markers comprises one or more optical markers.
9. The method of claim 8, wherein (a) further comprises (i) projecting a structured light pattern comprising the one or more optical markers onto a surface of the object, and (ii) recording the structured light pattern projected onto the surface of the object.
10. The method of claim 9, further comprising calculating a geometric reconstruction of the surface of the object by using at least part of the projected structured light pattern.
11. The method of claim 1, further comprising (i) recording a first image of the object using a first camera and recording a second image of the object using a second camera spaced apart from the first camera, (ii) calculating a disparity value between the first image and the second image, and (iii) calculating or updating the depth of the one or more portions of the object by using the disparity value.
12. The method of claim 1, wherein at least one of the first weight and the second weight is parameterized at the pixel level of the one or more images.
13. The method of claim 12, wherein at least one of the first weight and the second weight is parameterized using a parametric function, the parametric function comprising a continuous function, a discontinuous function, a piecewise function, a linear function, or an exponential function.
14. The method of claim 1, wherein the third weight is greater than or equal to the first weight or the second weight.
15. The method of claim 1, wherein the second weight is greater than or equal to the first weight or the third weight.
16. The method of claim 1, wherein the first weight is greater than or equal to the second weight or the third weight.
17. The method of claim 1, wherein the plurality of markers comprises a contrast agent applied to the surface of the object.
18. The method of claim 17, wherein the contrast agent comprises a nebulized dye.
19. The method of claim 1, wherein the object comprises a biological material, a tissue, an organ, an internal body structure, or an external body structure.
20. The method of claim 1, wherein the first depth, the second depth, or the third depth is calculated by cross-referencing one or more detected positions for each of the first, second, and third sets of markers against a known reference, the known reference comprising a known size or a known shape.
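Claims 12 and 13 allow the weights to vary per pixel according to a parametric function. The sketch below illustrates one such scheme, assuming each marker-derived depth map comes with a per-pixel confidence score and using an exponential weighting function; the confidence model and the parameter alpha are assumptions made for illustration only.

    import numpy as np

    def pixelwise_weighted_depth(depth_maps, confidences, alpha=4.0):
        """Fuse several depth maps using per-pixel weights.

        depth_maps  : list of (H, W) arrays, one per marker set.
        confidences : list of (H, W) arrays with values in [0, 1].
        The weight of each estimate at each pixel is exp(alpha * confidence),
        an exponential parametric function, normalized to sum to 1 per pixel.
        """
        depths = np.stack(depth_maps)                  # (n, H, W)
        weights = np.exp(alpha * np.stack(confidences))
        weights /= weights.sum(axis=0, keepdims=True)  # per-pixel normalization
        return (weights * depths).sum(axis=0)          # (H, W) weighted average depth

    # Illustrative usage with three synthetic estimates and confidences.
    H, W = 8, 8
    depths = [np.full((H, W), z) for z in (1.20, 1.25, 1.28)]
    confs = [np.full((H, W), c) for c in (0.6, 0.8, 0.9)]
    fused = pixelwise_weighted_depth(depths, confs)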

JP2021526407A 2018-07-19 2019-07-19 Systems and Methods for Multimodal Sensing of Depth in Vision Systems for Automated Surgical Robots Active JP7297891B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201862700700P 2018-07-19 2018-07-19
US62/700,700 2018-07-19
PCT/US2019/042647 WO2020018931A1 (en) 2018-07-19 2019-07-19 Systems and methods for multi-modal sensing of depth in vision systems for automated surgical robots

Publications (3)

Publication Number Publication Date
JP2021531482A (en) 2021-11-18
JPWO2020018931A5 (en) 2022-07-27
JP7297891B2 (en) 2023-06-26

Family

ID=69163745

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2021526407A Active JP7297891B2 (en) 2018-07-19 2019-07-19 Systems and Methods for Multimodal Sensing of Depth in Vision Systems for Automated Surgical Robots

Country Status (7)

Country Link
US (2) US11179218B2 (en)
EP (1) EP3824621A4 (en)
JP (1) JP7297891B2 (en)
KR (1) KR102545980B1 (en)
CN (1) CN112740666A (en)
CA (1) CA3106823A1 (en)
WO (1) WO2020018931A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7076447B2 (en) 2016-11-24 2022-05-27 ユニヴァーシティ オブ ワシントン Light field capture and rendering for head-mounted displays
EP3824621A4 (en) 2018-07-19 2022-04-27 Activ Surgical, Inc. Systems and methods for multi-modal sensing of depth in vision systems for automated surgical robots

Family Cites Families (286)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5603318A (en) 1992-04-21 1997-02-18 University Of Utah Research Foundation Apparatus and method for photogrammetric surgical localization
GB9515311D0 (en) 1995-07-26 1995-09-20 3D Scanners Ltd Stripe scanners and methods of scanning
DE19636354A1 (en) 1996-09-02 1998-03-05 Ruedger Dipl Ing Rubbert Method and device for performing optical recordings
US6081322A (en) 1997-10-16 2000-06-27 Research Foundation Of State Of New York NIR clinical opti-scan system
US6373963B1 (en) 1998-02-05 2002-04-16 Textile/Clothing Technology Corporation Systems, methods and computer program for measuring the surface contour of an object
DE19815201A1 (en) 1998-04-04 1999-10-07 Link Johann & Ernst Gmbh & Co Measuring arrangement for detecting dimensions of test specimens, preferably of hollow bodies, in particular of bores in workpieces, and methods for measuring such dimensions
WO1999058930A1 (en) 1998-05-14 1999-11-18 Metacreations Corporation Structured-light, triangulation-based three-dimensional digitizer
DE19829278C1 (en) 1998-06-30 2000-02-03 Sirona Dental Systems Gmbh 3-D camera for the detection of surface structures, especially for dental purposes
US6879324B1 (en) 1998-07-14 2005-04-12 Microsoft Corporation Regional progressive meshes
IL125659A (en) 1998-08-05 2002-09-12 Cadent Ltd Method and apparatus for imaging three-dimensional structure
DE19837932C2 (en) 1998-08-20 2000-09-07 Bioshape Ag Method and device for determining the surface shape of biological tissue
US7068825B2 (en) 1999-03-08 2006-06-27 Orametrix, Inc. Scanning system and calibration method for capturing precise three-dimensional information of objects
US7107116B2 (en) 1999-03-29 2006-09-12 Genex Technologies, Inc. Diffuse optical tomography system and method of use
US7099732B2 (en) 1999-03-29 2006-08-29 Genex Technologies, Inc. Sanitary sleeve or tip for intra-oral three-dimensional camera
US6503195B1 (en) 1999-05-24 2003-01-07 University Of North Carolina At Chapel Hill Methods and systems for real-time structured light depth extraction and endoscope using real-time structured light depth extraction
US6563105B2 (en) 1999-06-08 2003-05-13 University Of Washington Image acquisition with depth enhancement
CA2278108C (en) 1999-07-20 2008-01-29 The University Of Western Ontario Three-dimensional measurement method and apparatus
US7224384B1 (en) 1999-09-08 2007-05-29 3Dv Systems Ltd. 3D imaging system
US7006236B2 (en) 2002-05-22 2006-02-28 Canesta, Inc. Method and apparatus for approximating depth of an object's placement onto a monitored region with applications to virtual interface devices
DE50000335D1 (en) 2000-04-05 2002-09-05 Brainlab Ag Referencing a patient in a medical navigation system using illuminated light points
US6564086B2 (en) 2000-05-03 2003-05-13 Rocky Mountain Biosystems, Inc. Prosthesis and method of making
US6850872B1 (en) 2000-08-30 2005-02-01 Microsoft Corporation Facial image processing methods and systems
JP2002164066A (en) 2000-11-22 2002-06-07 Mitsubishi Heavy Ind Ltd Stacked heat exchanger
JP2002345733A (en) 2001-05-29 2002-12-03 Fuji Photo Film Co Ltd Imaging device
JP2003075137A (en) 2001-09-04 2003-03-12 Minolta Co Ltd Photographing system and imaging device used therefor and three-dimensional measuring auxiliary unit
JP3962588B2 (en) 2002-01-07 2007-08-22 キヤノン株式会社 3D image processing method, 3D image processing apparatus, 3D image processing system, and 3D image processing program
TW567693B (en) 2002-04-19 2003-12-21 Infopivot Technologies Inc Method for solving unavailability of Internet services using floating IP
JP4054222B2 (en) 2002-06-05 2008-02-27 オリンパス株式会社 Light source device for endoscope device
US7385708B2 (en) 2002-06-07 2008-06-10 The University Of North Carolina At Chapel Hill Methods and systems for laser based real-time structured light depth extraction
US20110015518A1 (en) 2002-06-13 2011-01-20 Martin Schmidt Method and instrument for surgical navigation
US7277599B2 (en) * 2002-09-23 2007-10-02 Regents Of The University Of Minnesota System and method for three-dimensional video imaging using a single camera
US6977732B2 (en) 2002-12-26 2005-12-20 National Taiwan University Miniature three-dimensional contour scanner
DE10304111B4 (en) 2003-01-31 2011-04-28 Sirona Dental Systems Gmbh Recording method for an image of a recording object
JP4798945B2 (en) 2003-03-05 2011-10-19 トヨタ自動車株式会社 Imaging device
AU2004223469B2 (en) 2003-03-24 2009-07-30 D4D Technologies, Llc Laser digitizer system for dental applications
WO2004109601A1 (en) 2003-05-30 2004-12-16 Dreamworks Rendering of soft shadows using depth maps
US7450783B2 (en) 2003-09-12 2008-11-11 Biopticon Corporation Methods and systems for measuring the size and volume of features on live tissues
US20050096515A1 (en) 2003-10-23 2005-05-05 Geng Z. J. Three-dimensional surface image guided adaptive therapy system
US7951073B2 (en) 2004-01-21 2011-05-31 Boston Scientific Limited Endoscopic device having spray mechanism and related methods of use
US7330577B2 (en) 2004-01-27 2008-02-12 Densys Ltd. Three-dimensional modeling of the oral cavity by projecting a two-dimensional array of random patterns
WO2005076198A1 (en) 2004-02-09 2005-08-18 Cheol-Gwon Kang Device for measuring 3d shape using irregular pattern and method for the same
DE102004008164B3 (en) 2004-02-11 2005-10-13 Karl Storz Gmbh & Co. Kg Method and device for creating at least a section of a virtual 3D model of a body interior
US20050253849A1 (en) 2004-05-13 2005-11-17 Pixar Custom spline interpolation
US7698068B2 (en) 2004-06-17 2010-04-13 Cadent Ltd. Method for providing data associated with the intraoral cavity
JP4589048B2 (en) 2004-08-04 2010-12-01 オリンパス株式会社 Capsule endoscope
US7961912B2 (en) 2004-10-14 2011-06-14 Stevick Glen R Method and apparatus for dynamic space-time imaging system
US7620209B2 (en) 2004-10-14 2009-11-17 Stevick Glen R Method and apparatus for dynamic space-time imaging system
US20080312540A1 (en) 2004-12-08 2008-12-18 Vasilis Ntziachristos System and Method for Normalized Flourescence or Bioluminescence Imaging
US8027710B1 (en) 2005-01-28 2011-09-27 Patrick Dannan Imaging system for endoscopic surgery
JP5001286B2 (en) 2005-10-11 2012-08-15 プライム センス リミティド Object reconstruction method and system
CA2625775A1 (en) 2005-10-14 2007-04-19 Applied Research Associates Nz Limited A method of monitoring a surface feature and apparatus therefor
KR100752758B1 (en) 2005-10-19 2007-08-29 (주) 인텍플러스 Apparatus and method for measuring image
US20070115484A1 (en) 2005-10-24 2007-05-24 Peisen Huang 3d shape measurement system and method including fast three-step phase shifting, error compensation and calibration
US7898651B2 (en) 2005-10-24 2011-03-01 General Electric Company Methods and apparatus for inspecting an object
US7489408B2 (en) 2005-11-15 2009-02-10 General Electric Company Optical edge break gage
PL1969307T3 (en) 2005-11-28 2010-12-31 3Shape As Coded structured light
DE102005060312A1 (en) 2005-12-16 2007-06-28 Siemens Ag Scanning device for optical scanning of surfaces
DE102006004583A1 (en) 2006-02-01 2007-08-09 Siemens Ag Optical scanner, to give images of marked human and animal tissue, has a distance sensor with a control unit to set the focus of the infra red light source for the image detector
JP5044126B2 (en) 2006-02-23 2012-10-10 オリンパス株式会社 Endoscope observation apparatus and operation method of endoscope for image formation
WO2007109678A2 (en) 2006-03-20 2007-09-27 Baylor College Of Medicine Method and system for non-contact fluorescence optical tomography with patterned illumination
JP4864511B2 (en) 2006-03-31 2012-02-01 富士フイルム株式会社 Electronic endoscope apparatus and program
US7435217B2 (en) 2006-04-17 2008-10-14 Microvision, Inc. Scanned beam imagers and endoscopes with positionable light collector
WO2007139187A1 (en) 2006-05-31 2007-12-06 National University Corporation Chiba University Three-dimensional image forming device, three-dimensional image forming method and program
EP1862115B1 (en) 2006-05-31 2009-03-18 BrainLAB AG Registration with emitting marking elements
US8125648B2 (en) * 2006-06-05 2012-02-28 Board Of Regents, The University Of Texas System Polarization-sensitive spectral interferometry
US20110057930A1 (en) 2006-07-26 2011-03-10 Inneroptic Technology Inc. System and method of using high-speed, high-resolution depth extraction to provide three-dimensional imagery for endoscopy
US20080123910A1 (en) 2006-09-19 2008-05-29 Bracco Imaging Spa Method and system for providing accuracy evaluation of image guided surgery
US20080107305A1 (en) 2006-11-02 2008-05-08 Northern Digital Inc. Integrated mapping system
US8326020B2 (en) 2007-02-28 2012-12-04 Sungkyunkwan University Foundation Structural light based depth imaging method and system using signal separation coding, and error correction thereof
US7995798B2 (en) 2007-10-15 2011-08-09 Given Imaging Ltd. Device, system and method for estimating the size of an object in a body lumen
DE102007054906B4 (en) 2007-11-15 2011-07-28 Sirona Dental Systems GmbH, 64625 Method for optical measurement of the three-dimensional geometry of objects
ES2372515B2 (en) 2008-01-15 2012-10-16 Universidad De La Laguna CHAMBER FOR THE REAL-TIME ACQUISITION OF THE VISUAL INFORMATION OF THREE-DIMENSIONAL SCENES.
US9072445B2 (en) 2008-01-24 2015-07-07 Lifeguard Surgical Systems Inc. Common bile duct surgical imaging system
US9094675B2 (en) * 2008-02-29 2015-07-28 Disney Enterprises Inc. Processing image data from multiple cameras for motion pictures
US7821649B2 (en) 2008-03-05 2010-10-26 Ge Inspection Technologies, Lp Fringe projection system and method for a probe suitable for phase-shift analysis
JP2009240621A (en) 2008-03-31 2009-10-22 Hoya Corp Endoscope apparatus
EP2286368B1 (en) 2008-05-06 2013-09-04 Flashscan3d, Llc System and method for structured light illumination with frame subwindows
US20100113921A1 (en) 2008-06-02 2010-05-06 Uti Limited Partnership Systems and Methods for Object Surface Estimation
DE102008040947B4 (en) 2008-08-01 2014-02-06 Sirona Dental Systems Gmbh 3D dental camera for detecting surface structures of a test object by means of triangulation
CN102113309B (en) 2008-08-03 2013-11-06 微软国际控股私有有限公司 Rolling camera system
US8406859B2 (en) 2008-08-10 2013-03-26 Board Of Regents, The University Of Texas System Digital light processing hyperspectral imaging apparatus
US9282926B2 (en) 2008-12-18 2016-03-15 Sirona Dental Systems Gmbh Camera for recording surface structures, such as for dental purposes
KR101526866B1 (en) 2009-01-21 2015-06-10 삼성전자주식회사 Method of filtering depth noise using depth information and apparatus for enabling the method
US8295546B2 (en) 2009-01-30 2012-10-23 Microsoft Corporation Pose tracking pipeline
US9091623B2 (en) * 2009-02-16 2015-07-28 Satake Usa, Inc. System to determine product characteristics, counts, and per unit weight details
WO2010096453A1 (en) 2009-02-17 2010-08-26 Board Of Regents, The University Of Texas System Methods of producing laser speckle contrast images
WO2010096447A2 (en) 2009-02-17 2010-08-26 Board Of Regents, The University Of Texas System Quantitative imaging with multi-exposure speckle imaging (mesi)
US20100265557A1 (en) 2009-04-21 2010-10-21 Jesper Sallander Optical Systems Configured to Generate More Closely Spaced Light Beams and Pattern Generators Including the Same
US9135502B2 (en) 2009-05-11 2015-09-15 Universitat Zu Lubeck Method for the real-time-capable, computer-assisted analysis of an image sequence containing a variable pose
US7763841B1 (en) 2009-05-27 2010-07-27 Microsoft Corporation Optical component for a depth sensor
JP5361592B2 (en) 2009-07-24 2013-12-04 オリンパス株式会社 Endoscope apparatus, measurement method, and program
KR20110018696A (en) 2009-08-18 2011-02-24 주식회사 이턴 Apparatus and method for processing 3d image
US8264536B2 (en) 2009-08-25 2012-09-11 Microsoft Corporation Depth-sensitive imaging via polarization-state mapping
DE112010003417A5 (en) 2009-08-27 2012-08-16 Naviswiss AB ENDOSCOPE AND METHOD FOR THE USE THEREOF
US8723118B2 (en) 2009-10-01 2014-05-13 Microsoft Corporation Imager for constructing color and depth images
US20110080471A1 (en) 2009-10-06 2011-04-07 Iowa State University Research Foundation, Inc. Hybrid method for 3D shape measurement
US10045882B2 (en) 2009-10-30 2018-08-14 The Johns Hopkins University Surgical instrument and systems with integrated optical sensor
US20110123098A1 (en) 2009-11-24 2011-05-26 Maurice Moshe Ernst System and a Method for Three-dimensional Modeling of a Three-dimensional Scene Features with a Cooling System
US20120206587A1 (en) 2009-12-04 2012-08-16 Orscan Technologies Ltd System and method for scanning a human body
US8320621B2 (en) 2009-12-21 2012-11-27 Microsoft Corporation Depth projector system with integrated VCSEL array
EP2523621B1 (en) 2010-01-13 2016-09-28 Koninklijke Philips N.V. Image integration based registration and navigation for endoscopic surgery
US8723923B2 (en) 2010-01-14 2014-05-13 Alces Technology Structured light system
EP2359745A1 (en) 2010-02-12 2011-08-24 Helmholtz Zentrum München Deutsches Forschungszentrum für Gesundheit und Umwelt (GmbH) Method and device for multi-spectral photonic imaging
EP2533679B1 (en) 2010-02-12 2017-01-11 Koninklijke Philips N.V. Laser enhanced reconstruction of 3d surface
US8872824B1 (en) 2010-03-03 2014-10-28 Nvidia Corporation System, method, and computer program product for performing shadowing utilizing shadow maps and ray tracing
US8279418B2 (en) 2010-03-17 2012-10-02 Microsoft Corporation Raster scanning for depth detection
WO2011134083A1 (en) 2010-04-28 2011-11-03 Ryerson University System and methods for intraoperative guidance feedback
US8330804B2 (en) 2010-05-12 2012-12-11 Microsoft Corporation Scanned-beam depth mapping to 2D image
US9557574B2 (en) 2010-06-08 2017-01-31 Microsoft Technology Licensing, Llc Depth illumination and detection optics
US8558873B2 (en) 2010-06-16 2013-10-15 Microsoft Corporation Use of wavefront coding to create a depth image
US20130253313A1 (en) 2010-08-02 2013-09-26 The Johns Hopkins University Autofocusing endoscope and system
US9833145B2 (en) 2010-08-11 2017-12-05 Snu R&Db Foundation Method for simultaneously detecting fluorescence and raman signals for multiple fluorescence and raman signal targets, and medical imaging device for simultaneously detecting multiple targets using the method
US20120056982A1 (en) * 2010-09-08 2012-03-08 Microsoft Corporation Depth camera based on structured light and stereo vision
US9157733B2 (en) 2010-09-10 2015-10-13 Dimensional Photonics International, Inc. Method of data acquisition for three-dimensional imaging
AU2011305543A1 (en) 2010-09-21 2013-04-11 The Johns Hopkins University Optical sensing system for cochlear implant surgery
US8760517B2 (en) 2010-09-27 2014-06-24 Apple Inc. Polarized images for security
CN102008282B (en) 2010-10-29 2012-08-08 深圳大学 Number stamp intraoral scanner and oral cavity internal surface topography image real-time reconstructing system
US9345389B2 (en) 2010-11-12 2016-05-24 Emory University Additional systems and methods for providing real-time anatomical guidance in a diagnostic or therapeutic procedure
US9506749B2 (en) 2010-11-15 2016-11-29 Seikowave, Inc. Structured light 3-D measurement module and system for illuminating an area-under-test using a fixed-pattern optic
US9599461B2 (en) 2010-11-16 2017-03-21 Ectoscan Systems, Llc Surface data acquisition, storage, and assessment system
US8649024B2 (en) 2010-12-03 2014-02-11 Zygo Corporation Non-contact surface characterization using modulated illumination
US8803952B2 (en) 2010-12-20 2014-08-12 Microsoft Corporation Plural detector time-of-flight depth mapping
US20120165681A1 (en) 2010-12-23 2012-06-28 Tyco Healthcare Group Lp Delineating Skin or Surface Lesions
US9226673B2 (en) 2011-01-10 2016-01-05 East Carolina University Methods, systems and computer program products for non-invasive determination of blood flow distribution using speckle imaging techniques and hemodynamic modeling
EP2663222B1 (en) 2011-01-10 2021-10-27 East Carolina University Methods and systems for non-invasive determination of blood flow distribution using speckle imaging techniques and hemodynamic modeling
US8817046B2 (en) * 2011-04-21 2014-08-26 Microsoft Corporation Color channels and optical markers
EP2689708B1 (en) 2011-04-27 2016-10-19 Olympus Corporation Endoscopic apparatus and measurement method
JP5841353B2 (en) 2011-05-24 2016-01-13 オリンパス株式会社 Endoscope apparatus and image acquisition method
JP5846763B2 (en) 2011-05-24 2016-01-20 オリンパス株式会社 Endoscope device
JP5830270B2 (en) 2011-05-24 2015-12-09 オリンパス株式会社 Endoscope apparatus and measuring method
WO2012167201A1 (en) 2011-06-01 2012-12-06 Digital Light Innovations System and method for hyperspectral illumination
WO2012170963A1 (en) 2011-06-08 2012-12-13 Digital Light Innovations System and method for hyperspectral imaging
US9001190B2 (en) 2011-07-05 2015-04-07 Microsoft Technology Licensing, Llc Computer vision system and method using a depth sensor
KR20130011141A (en) 2011-07-20 2013-01-30 삼성전자주식회사 Endoscope and endoscope system
US9444981B2 (en) 2011-07-26 2016-09-13 Seikowave, Inc. Portable structured light measurement module/apparatus with pattern shifting device incorporating a fixed-pattern optic for illuminating a subject-under-test
US8672838B2 (en) 2011-08-12 2014-03-18 Intuitive Surgical Operations, Inc. Image capture unit in a surgical instrument
US8784301B2 (en) 2011-08-12 2014-07-22 Intuitive Surgical Operations, Inc. Image capture unit and method with an extended depth of field
US8764633B2 (en) 2011-08-12 2014-07-01 Intuitive Surgical Operations, Inc. Feature differentiation image capture unit and method in a surgical instrument
US9254103B2 (en) 2011-08-15 2016-02-09 The Trustees Of Dartmouth College Operative microscope having diffuse optical imaging system with tomographic image reconstruction and superposition in field of view
US9491441B2 (en) 2011-08-30 2016-11-08 Microsoft Technology Licensing, Llc Method to extend laser depth map range
JP5926909B2 (en) * 2011-09-07 2016-05-25 オリンパス株式会社 Fluorescence observation equipment
US9142025B2 (en) 2011-10-05 2015-09-22 Electronics And Telecommunications Research Institute Method and apparatus for obtaining depth information using optical pattern
WO2013058978A1 (en) 2011-10-17 2013-04-25 Kimmel Zebadiah M Method and apparatus for sizing and fitting an individual for apparel, accessories, or prosthetics
US8605993B2 (en) * 2011-11-21 2013-12-10 Robo-team Ltd. Methods and systems of merging depth data from a plurality of disparity maps
DE102011119608B4 (en) 2011-11-29 2021-07-29 Karl Storz Se & Co. Kg Device and method for endoscopic 3D data acquisition
US11510600B2 (en) 2012-01-04 2022-11-29 The Trustees Of Dartmouth College Method and apparatus for quantitative and depth resolved hyperspectral fluorescence and reflectance imaging for surgical guidance
KR102011169B1 (en) 2012-03-05 2019-08-14 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 Generation of depth images based upon light falloff
JP5654511B2 (en) 2012-03-14 2015-01-14 富士フイルム株式会社 Endoscope system, processor device for endoscope system, and method for operating endoscope system
WO2013136620A1 (en) 2012-03-14 2013-09-19 独立行政法人産業技術総合研究所 Phase distribution analysis method and device for fringe image using high-dimensional brightness information, and program therefor
JP5995484B2 (en) 2012-03-30 2016-09-21 キヤノン株式会社 Three-dimensional shape measuring apparatus, three-dimensional shape measuring method, and program
JP2015523102A (en) * 2012-04-16 2015-08-13 チルドレンズ ナショナル メディカル センターChildren’S National Medical Center Dual-mode stereo imaging system for tracking and control in surgical and interventional procedures
US20140194747A1 (en) 2012-05-01 2014-07-10 Empire Technology Development Llc Infrared scanner and projector to indicate cancerous cells
US20130296712A1 (en) 2012-05-03 2013-11-07 Covidien Lp Integrated non-contact dimensional metrology tool
JP5930531B2 (en) 2012-05-24 2016-06-08 三菱電機エンジニアリング株式会社 Imaging apparatus and imaging method
US9674436B2 (en) 2012-06-18 2017-06-06 Microsoft Technology Licensing, Llc Selective imaging zones of an imaging sensor
US9471864B2 (en) 2012-06-22 2016-10-18 Microsoft Technology Licensing, Llc Encoding data in depth patterns
US9220570B2 (en) 2012-06-29 2015-12-29 Children's National Medical Center Automated surgical and interventional procedures
US8896594B2 (en) 2012-06-30 2014-11-25 Microsoft Corporation Depth sensing with depth-adaptive illumination
JP6005278B2 (en) 2012-07-25 2016-10-12 シーメンス アクチエンゲゼルシヤフトSiemens Aktiengesellschaft Color coding for 3D measurements, especially on transmissive and scattering surfaces
US20140031665A1 (en) 2012-07-25 2014-01-30 Covidien Lp Telecentric Scale Projection System for Real-Time In-Situ Surgical Metrology
CN104125794B (en) 2012-08-07 2016-06-22 奥林巴斯株式会社 Sweep type endoscope apparatus
US9297889B2 (en) 2012-08-14 2016-03-29 Microsoft Technology Licensing, Llc Illumination light projection for a depth camera
US9696427B2 (en) 2012-08-14 2017-07-04 Microsoft Technology Licensing, Llc Wide angle depth detection
US9057784B2 (en) 2012-08-14 2015-06-16 Microsoft Technology Licensing, Llc Illumination light shaping for a depth camera
US20140092281A1 (en) 2012-09-28 2014-04-03 Pelican Imaging Corporation Generating Images from Light Fields Utilizing Virtual Viewpoints
US20150238276A1 (en) 2012-09-30 2015-08-27 M.S.T. Medical Surgery Technologies Ltd. Device and method for assisting laparoscopic surgery - directing and maneuvering articulating tool
US9070194B2 (en) 2012-10-25 2015-06-30 Microsoft Technology Licensing, Llc Planar surface detection
EP2912405B1 (en) 2012-10-29 2017-10-18 7D Surgical Inc. Integrated illumination and optical surface topology detection system and methods of use thereof
DE102012021185A1 (en) 2012-10-30 2014-04-30 Smart Optics Sensortechnik Gmbh Method for 3D optical measurement of teeth with reduced point-spread function
US9304603B2 (en) 2012-11-12 2016-04-05 Microsoft Technology Licensing, Llc Remote control using depth camera
KR101918030B1 (en) * 2012-12-20 2018-11-14 삼성전자주식회사 Method and apparatus for rendering hybrid multi-view
CA2900268C (en) 2013-02-04 2021-04-06 D4D Technologies, Llc Intra-oral scanning device with illumination frames interspersed with image frames
ES2804681T3 (en) 2013-02-04 2021-02-09 Childrens Nat Medical Ct Hybrid Control Surgical Robotic System
US9962244B2 (en) 2013-02-13 2018-05-08 3Shape A/S Focus scanning apparatus recording color
CN104036226B (en) 2013-03-04 2017-06-27 联想(北京)有限公司 A kind of object information acquisition method and electronic equipment
US9351643B2 (en) 2013-03-12 2016-05-31 Covidien Lp Systems and methods for optical measurement for in-situ surgical applications
US9375844B2 (en) 2013-03-15 2016-06-28 Intuitive Surgical Operations, Inc. Geometrically appropriate tool selection assistance for determined work site dimensions
US20140307055A1 (en) 2013-04-15 2014-10-16 Microsoft Corporation Intensity-modulated light pattern for active stereo
US9294758B2 (en) 2013-04-18 2016-03-22 Microsoft Technology Licensing, Llc Determining depth data for a captured image
EP2800055A1 (en) 2013-04-30 2014-11-05 3DDynamics Bvba Method and system for generating a 3D model
US9074868B2 (en) 2013-05-13 2015-07-07 General Electric Company Automated borescope measurement tip accuracy test
US9729860B2 (en) 2013-05-24 2017-08-08 Microsoft Technology Licensing, Llc Indirect reflection suppression in depth imaging
US9274047B2 (en) 2013-05-24 2016-03-01 Massachusetts Institute Of Technology Methods and apparatus for imaging of occluded objects
JP6446357B2 (en) 2013-05-30 2018-12-26 株式会社ニコン Imaging system
US9344619B2 (en) * 2013-08-30 2016-05-17 Qualcomm Incorporated Method and apparatus for generating an all-in-focus image
DE102013016752A1 (en) 2013-09-03 2015-03-05 Universität Stuttgart Method and arrangement for robust one-shot interferometry, in particular also for optical coherence tomography according to the Spatial Domain Approach (SD-OCT)
US9462253B2 (en) 2013-09-23 2016-10-04 Microsoft Technology Licensing, Llc Optical modules that reduce speckle contrast and diffraction artifacts
US20150086956A1 (en) 2013-09-23 2015-03-26 Eric Savitsky System and method for co-registration and navigation of three-dimensional ultrasound and alternative radiographic data sets
US9799117B2 (en) 2013-09-30 2017-10-24 Lenovo (Beijing) Co., Ltd. Method for processing data and apparatus thereof
US9443310B2 (en) 2013-10-09 2016-09-13 Microsoft Technology Licensing, Llc Illumination modules that emit structured light
CA2931529C (en) 2013-11-27 2022-08-23 Children's National Medical Center 3d corrected imaging
US10469827B2 (en) 2013-12-27 2019-11-05 Sony Corporation Image processing device and image processing method
US20160309068A1 (en) 2014-01-06 2016-10-20 The Regents Of The University Of California Spatial frequency domain imaging using custom patterns
WO2015105780A1 (en) 2014-01-07 2015-07-16 The Regents Of The University Of California Method for extraction of spatial frequency information for quantitative tissue imaging
WO2015105360A1 (en) 2014-01-10 2015-07-16 주식회사 고영테크놀러지 Device and method for measuring three-dimensional shape
US9720506B2 (en) 2014-01-14 2017-08-01 Microsoft Technology Licensing, Llc 3D silhouette sensing system
US9524582B2 (en) 2014-01-28 2016-12-20 Siemens Healthcare Gmbh Method and system for constructing personalized avatars using a parameterized deformable mesh
US10485425B2 (en) 2014-02-04 2019-11-26 The Trustees Of Dartmouth College Apparatus and methods for structured light scatteroscopy
EP3108447B1 (en) 2014-02-17 2020-03-25 Children's National Medical Center Method and system for providing recommendation for optimal execution of surgical procedures
DE102014002514B4 (en) 2014-02-21 2015-10-29 Universität Stuttgart Device and method for multi- or hyperspectral imaging and/or for distance and/or 2-D or 3-D profile measurement of an object by means of spectrometry
US9380224B2 (en) 2014-02-28 2016-06-28 Microsoft Technology Licensing, Llc Depth sensing using an infrared camera
JP6535020B2 (en) 2014-03-02 2019-06-26 V.T.M. (Virtual Tape Measure) Technologies Ltd. System for measuring 3D distance and dimensions of visible objects in endoscopic images
DE102014204243A1 (en) 2014-03-07 2015-09-10 Siemens Aktiengesellschaft Endoscope with depth determination
DE102014204244A1 (en) 2014-03-07 2015-09-10 Siemens Aktiengesellschaft Endoscope with depth determination
US11116383B2 (en) 2014-04-02 2021-09-14 Asensus Surgical Europe S.à.R.L. Articulated structured light based-laparoscope
DE102014207022A1 (en) 2014-04-11 2015-10-29 Siemens Aktiengesellschaft Depth determination of a surface of a test object
DE102014210938A1 (en) 2014-06-06 2015-12-17 Siemens Aktiengesellschaft Method for controlling a medical device and control system for a medical device
US20150377613A1 (en) 2014-06-30 2015-12-31 Samsung Electronics Co., Ltd. Systems and methods for reconstructing 3d surfaces of tubular lumens
US9261356B2 (en) 2014-07-03 2016-02-16 Align Technology, Inc. Confocal surface topography measurement with fixed focal positions
US9439568B2 (en) 2014-07-03 2016-09-13 Align Technology, Inc. Apparatus and method for measuring surface topography optically
US9261358B2 (en) 2014-07-03 2016-02-16 Align Technology, Inc. Chromatic confocal system
US10398294B2 (en) 2014-07-24 2019-09-03 Z Square Ltd. Illumination sources for multicore fiber endoscopes
CN105509639B (en) 2014-09-24 2019-01-01 General Electric Company Measuring system and measuring method for measuring geometric features
US10039439B2 (en) 2014-09-30 2018-08-07 Fujifilm Corporation Endoscope system and method for operating the same
CN107405094A (en) 2014-10-14 2017-11-28 East Carolina University Methods, systems and computer program products for visualizing anatomical structures and blood flow and perfusion physiology using imaging techniques
JP2017534378A (en) 2014-10-14 2017-11-24 East Carolina University Method, system, and computer program product for determining hemodynamic parameters using signals obtained by multispectral imaging of blood flow and perfusion
US11553844B2 (en) 2014-10-14 2023-01-17 East Carolina University Methods, systems and computer program products for calculating MetaKG signals for regions having multiple sets of optical characteristics
US20160128553A1 (en) 2014-11-07 2016-05-12 Zheng Jason Geng Intra- Abdominal Lightfield 3D Endoscope and Method of Making the Same
JP6432770B2 (en) 2014-11-12 2018-12-05 Sony Corporation Image processing apparatus, image processing method, and program
US9841496B2 (en) 2014-11-21 2017-12-12 Microsoft Technology Licensing, Llc Multiple pattern illumination optics for time of flight system
US9638801B2 (en) 2014-11-24 2017-05-02 Mitsubishi Electric Research Laboratories, Inc Depth sensing using optical pulses and fixed coded aperture
US9330464B1 (en) 2014-12-12 2016-05-03 Microsoft Technology Licensing, Llc Depth camera feedback
KR101671649B1 (en) 2014-12-22 2016-11-01 장석준 Method and System for 3D manipulated image combined physical data and clothing data
US9958758B2 (en) 2015-01-21 2018-05-01 Microsoft Technology Licensing, Llc Multiple exposure structured light pattern
EP3250104B1 (en) 2015-01-28 2019-03-06 Brainlab AG Light point identification method
US9817159B2 (en) 2015-01-31 2017-11-14 Microsoft Technology Licensing, Llc Structured light pattern generation
CN107431800A (en) 2015-02-12 2017-12-01 NextVR Inc. Method and apparatus for carrying out environmental measurements and/or for using such measurements
US9953428B2 (en) 2015-03-03 2018-04-24 Microsoft Technology Licensing, Llc Digital camera unit with simultaneous structured and unstructured illumination
KR102376954B1 (en) 2015-03-06 2022-03-21 Samsung Electronics Co., Ltd. Method for irradiating light for capturing iris and device thereof
US9955140B2 (en) 2015-03-11 2018-04-24 Microsoft Technology Licensing, Llc Distinguishing foreground and background with infrared imaging
KR102306539B1 (en) 2015-03-12 2021-09-29 Samsung Electronics Co., Ltd. Method and device for irradiating light used to capture iris
US10058256B2 (en) 2015-03-20 2018-08-28 East Carolina University Multi-spectral laser imaging (MSLI) methods and systems for blood flow and perfusion imaging and quantification
US10390718B2 (en) 2015-03-20 2019-08-27 East Carolina University Multi-spectral physiologic visualization (MSPV) using laser imaging methods and systems for blood flow and perfusion imaging and quantification in an endoscopic design
DE102015206511B4 (en) 2015-04-13 2023-10-19 Siemens Healthcare Gmbh Determination of a clear spatial relationship between a medical device and another object
US9690984B2 (en) 2015-04-14 2017-06-27 Microsoft Technology Licensing, Llc Two-dimensional infrared depth sensing
US20170059305A1 (en) 2015-08-25 2017-03-02 Lytro, Inc. Active illumination for enhanced depth map generation
US10132616B2 (en) 2015-04-20 2018-11-20 Samsung Electronics Co., Ltd. CMOS image sensor for 2D imaging and depth measurement with ambient light rejection
US10356392B2 (en) 2015-05-28 2019-07-16 University College Cork - National University of Ireland, Cork Coded access optical sensor
EP3318173A4 (en) 2015-06-30 2019-04-17 Olympus Corporation Image processing device, ranging system, and endoscope system
WO2017006574A1 (en) 2015-07-03 2017-01-12 Olympus Corporation Image processing device, image determination system, and endoscope system
CA2987058C (en) 2015-07-13 2019-03-19 Synaptive Medical (Barbados) Inc. System and method for providing a contour video with a 3d surface in a medical navigation system
US9958585B2 (en) 2015-08-17 2018-05-01 Microsoft Technology Licensing, Llc Computer vision depth sensing at video rate using depth from defocus
JP6456550B2 (en) * 2015-09-01 2019-01-23 Koninklijke Philips N.V. Device for displaying medical image data of a body part
CA3038648A1 (en) 2015-09-28 2017-04-06 Montefiore Medical Center Methods and devices for intraoperative viewing of patient 3d surface images
DK178899B1 (en) 2015-10-09 2017-05-08 3Dintegrated Aps A depiction system
WO2017059870A1 (en) 2015-10-09 2017-04-13 3Dintegrated Aps A laparoscopic tool system for minimally invasive surgery
US9955861B2 (en) 2015-10-16 2018-05-01 Ricoh Company, Ltd. Construction of an individual eye model using a plenoptic camera
US10402992B2 (en) 2015-10-16 2019-09-03 Capsovision Inc. Method and apparatus for endoscope with distance measuring for object scaling
CA3003596A1 (en) 2015-10-31 2017-05-04 Children's National Medical Center Soft surgical tools
US10828125B2 (en) * 2015-11-03 2020-11-10 Synaptive Medical (Barbados) Inc. Dual zoom and dual field-of-view microscope
WO2017143427A1 (en) 2016-02-25 2017-08-31 Synaptive Medical (Barbados) Inc. System and method for scope based depth map acquisition
CN111329551A (en) 2016-03-12 2020-06-26 P. K. Lang Augmented reality guidance for spinal and joint surgery
EP4394703A2 (en) 2016-03-14 2024-07-03 Mohamed R. Mahfouz Method of designing a dynamic patient-specific orthopedic implant
EP3220351A1 (en) 2016-03-14 2017-09-20 Thomson Licensing Method and device for processing lightfield data
US20170280970A1 (en) 2016-03-31 2017-10-05 Covidien Lp Thoracic endoscope for surface scanning
WO2017180097A1 (en) 2016-04-12 2017-10-19 Siemens Aktiengesellschaft Deformable registration of intra and preoperative inputs using generative mixture models and biomechanical deformation
TWI567693B (en) * 2016-05-17 2017-01-21 Wistron Corporation Method and system for generating depth information
US10375330B2 (en) 2016-05-27 2019-08-06 Verily Life Sciences Llc Systems and methods for surface topography acquisition using laser speckle
US20170366773A1 (en) 2016-06-21 2017-12-21 Siemens Aktiengesellschaft Projection in endoscopic medical imaging
TWI597042B (en) 2016-06-29 2017-09-01 Endoscope with distance measuring function and distance measuring method
US10217235B2 (en) 2016-07-11 2019-02-26 Nri R&D Patent Licensing, Llc Advanced lensless light-field imaging systems and methods for enabling a wide range of entirely new applications
IL292427B2 (en) 2016-07-25 2023-05-01 Magic Leap Inc Imaging modification, display and visualization using augmented and virtual reality eyewear
US9947099B2 (en) 2016-07-27 2018-04-17 Microsoft Technology Licensing, Llc Reflectivity map estimate from dot based structured light systems
US20180042466A1 (en) 2016-08-12 2018-02-15 The Johns Hopkins University Compact endoscope design for three-dimensional surgical guidance
WO2018085797A1 (en) 2016-11-04 2018-05-11 Aquifi, Inc. System and method for portable active 3d scanning
US11998282B2 (en) * 2016-12-16 2024-06-04 Intuitive Surgical Operations, Inc. Systems and methods for teleoperated control of an imaging instrument
US11129681B2 (en) 2017-02-22 2021-09-28 Orthosoft Ulc Bone and tool tracking in robotized computer-assisted surgery
US10485629B2 (en) 2017-02-24 2019-11-26 Sony Olympus Medical Solutions Inc. Endoscope device
US10572720B2 (en) 2017-03-01 2020-02-25 Sony Corporation Virtual reality-based apparatus and method to generate a three dimensional (3D) human face model using image and depth data
CN108694740A (en) 2017-03-06 2018-10-23 Sony Corporation Information processing apparatus, information processing method and user equipment
CA2960528C (en) * 2017-03-08 2018-03-20 Synaptive Medical (Barbados) Inc. A depth-encoded fiducial marker for intraoperative surgical registration
US10262453B2 (en) 2017-03-24 2019-04-16 Siemens Healthcare Gmbh Virtual shadows for enhanced depth perception
WO2019045971A1 (en) 2017-08-28 2019-03-07 East Carolina University Multi-spectral physiologic visualization (mspv) using laser imaging methods and systems for blood flow and perfusion imaging and quantification in an endoscopic design
US11278220B2 (en) 2018-06-08 2022-03-22 East Carolina University Determining peripheral oxygen saturation (SpO2) and hemoglobin concentration using multi-spectral laser imaging (MSLI) methods and systems
CN112513617A (en) 2018-06-28 2021-03-16 Children's National Medical Center Method and system for dye-free visualization of blood flow and tissue perfusion in laparoscopic surgery
US11850002B2 (en) * 2018-07-16 2023-12-26 International Business Machines Corporation Three-dimensional model for surgical planning
EP3824621A4 (en) 2018-07-19 2022-04-27 Activ Surgical, Inc. Systems and methods for multi-modal sensing of depth in vision systems for automated surgical robots
EP3847660A2 (en) 2018-09-05 2021-07-14 East Carolina University Systems for detecting vascular and arterial disease in asymptomatic patients and related methods
US11125861B2 (en) 2018-10-05 2021-09-21 Zoox, Inc. Mesh validation
US10823855B2 (en) 2018-11-19 2020-11-03 Fca Us Llc Traffic recognition and adaptive ground removal based on LIDAR point cloud statistics
CN113906479A (en) 2018-12-28 2022-01-07 Activ Surgical, Inc. Generating synthetic three-dimensional imagery from local depth maps
JP2022525113A (en) 2019-03-26 2022-05-11 East Carolina University Near-infrared fluorescence imaging and related systems and computer program products for blood flow and perfusion visualization
CN114828725A (en) * 2019-08-16 2022-07-29 Body Vision Medical Ltd. Devices for interventional and surgical procedures and methods of use thereof
EP4135615A1 (en) 2020-04-17 2023-02-22 Activ Surgical, Inc. Systems and methods for enhancing medical images
NL2026240B1 (en) 2020-08-07 2022-04-08 Limis Dev B V Device for coupling coherent light into an endoscopic system
NL2026505B1 (en) 2020-09-18 2022-05-23 Limis Dev B V Motion-compensated laser speckle contrast imaging

Similar Documents

Publication | Publication Date | Title
CN104837436B (en) Motion compensation in 3-D scanning
US5967979A (en) Method and apparatus for photogrammetric assessment of biological tissue
US20190191141A1 (en) Motion blur compensation
JP5647778B2 (en) Device for determining the three-dimensional coordinates of an object, in particular a tooth
US20170243374A1 (en) Calibration device, calibration method, optical device, image-capturing device, projection device, measuring system, and measuring method
JP2015128242A (en) Image projection device and calibration method of the same
KR20130032368A (en) Three-dimensional measurement apparatus, three-dimensional measurement method, and storage medium
US20200005454A1 (en) Method for measuring a dental object
US10052079B2 (en) Method for producing an X-ray image
JP6475312B1 (en) Optical tracking system and optical tracking method
US7049594B2 (en) Position sensing sensor, method and system
JP6293122B2 (en) Method for measuring dental conditions
JP3237414B2 (en) Stereo camera calibration device
US20220117689A1 (en) Systems and methods for multi-modal sensing of depth in vision systems for automated surgical robots
JP5487946B2 (en) Camera image correction method, camera apparatus, and coordinate transformation parameter determination apparatus
JPWO2020018931A5 (en)
JP2003164431A (en) Blood flow velocity measuring apparatus
JP3408237B2 (en) Shape measuring device
Comlekciler et al. Artificial 3-D contactless measurement in orthognathic surgery with binocular stereo vision
Zhao et al. Calibration and correction of lens distortion for two-dimensional digital speckle correlation measurement
JP3370049B2 (en) Shape measuring device
KR102307919B1 (en) Pose estimation method of bendable interventional medical device using single-view x-ray image
JP3042773B2 (en) 3D motion analyzer
WO2020031659A1 (en) Position and attitude estimation system, position and attitude estimation apparatus, and position and attitude estimation method
Hsu et al. A distortion correction method for endoscope images based on calibration patterns and a simple mathematic model for optical lens